QUOC TRAN-DINH
+ Department of Statistics and Operations Research
+ Adjunct Member of the Department of Computer Science (2020-present)
Address: 333 Hanes Hall CB #3260
The University of North Carolina at Chapel Hill     
Chapel Hill, NC 27599
E-mail: quoctdATemailDOTuncDOTedu
Phone: +1-919-843-6023
Name in native language: Trần Đình Quốc (Quốc is my first name).

About me

I am an associate professor in the Department of Statistics and Operations Research at the University of North Carolina at Chapel Hill. Previously, I was a postdoctoral researcher at the Laboratory for Information and Inference Systems (LIONS), École Polytechnique Fédérale de Lausanne (EPFL), Switzerland, from November 2012 to June 2015. I completed my Ph.D. in Optimization in Engineering in November 2012 at the Department of Electrical Engineering (ESAT) and the Optimization in Engineering Center (OPTEC), KU Leuven, Belgium, under the supervision of Prof. Moritz Diehl.

News


  • May 2023: My ONR grant has been recommended for an award. Thanks to ONR for supporting my research over the last few years. It has been a while since this page was last updated.
  • October 30, 2020: Our paper A new homotopy proximal variable-metric framework for composite convex minimization has been accepted for publication in Mathematics of Operations Research (joint work with Ling Liang and Kim-Chuan Toh, NUS, Singapore).
  • October 12, 2020: Our paper A Hybrid Stochastic Optimization Framework for Stochastic Composite Nonconvex Optimization has been accepted for publication in Mathematical Programming, Series A.
  • September 29, 2020: Our paper Hybrid Variance-Reduced SGD Algorithms for Nonconvex-Concave Minimax Problems (Q. Tran-Dinh, Deyi Liu, and Lam M. Nguyen) has been accepted for presentation at NeurIPS 2020.
  • September 29, 2020: I now serve as an Associate Editor of Computational Optimization and Applications (COAP).
  • September 10, 2020: I have been appointed to an adjunct position in the Department of Computer Science.
  • August 1, 2020: Our paper Non-Stationary First-Order Primal-Dual Algorithms with Fast Convergence Rates has been accepted for publication in SIAM Journal on Optimization.
  • June 1, 2020: Our paper Stochastic Gauss-Newton Algorithms for Nonconvex Compositional Optimization (joint work with N. H. Pham and L. M. Nguyen) has been accepted for presentation at ICML 2020.
  • May 5, 2020: Our paper ProxSARAH: An efficient algorithmic framework for stochastic composite nonconvex optimization has been accepted for publication in the Journal of Machine Learning Research.
  • February 3, 2020: Our paper (with my student and colleague) has been accepted for publication in Computational Optimization and Applications (COAP).
  • January 28, 2020: A standard ONR grant has been awarded for three years.
  • January 7, 2020: Our paper A Hybrid Stochastic Policy Gradient Algorithm for Reinforcement Learning (N. H. Pham, L. M. Nguyen, D. T. Phan, H. P. Nguyen, M. van Dijk, and Q. Tran-Dinh) has been accepted for publication in the proceedings of the 23rd International Conference on Artificial Intelligence and Statistics (AISTATS 2020), Italy, 2020.
  • January 6, 2020: I am teaching STOR 614: Linear Programming and Extensions this Spring. The course covers the theory, solution methods, and applications of LPs, QPs, second-order cone programming, semidefinite programming, and convex programming. If you are interested, please take this course. The tentative syllabus can be found HERE.
  • December 23, 2019: Our paper Transferring Optimality Across Data Distributions via Homotopy Methods (Matilde Gargiani, Andrea Zanelli, Quoc Tran-Dinh, Moritz Diehl, and Frank Hutter) has been accepted for publication in the proceedings of the Eighth International Conference on Learning Representations (ICLR 2020), Ethiopia, April 2020.
  • August 3-8, 2019: I am attending ICCOPT 2019 in Berlin. Together with I. Necoara, we are organizing a session, “Recent advances in first-order methods for constrained convex optimization and related problems,” with four parts running from Monday to Tuesday and 12 talks (9 invited and 3 contributed). Please join us!
  • I will be taking a research leave in Fall 2019 and plan to visit some universities in the US and Europe. I am happy to connect with researchers who share my research interests. I will also be a SAMSI fellow in the Deep Learning Program in Fall 2019.
  • March 15, 2019: We have uploaded a new preprint, “Non-Stationary First-Order Primal-Dual Algorithms with Fast NonErgodic Convergence Rates,” to arXiv (https://arxiv.org/pdf/1903.05282.pdf). This paper concerns convergence rates faster than O(1/k) for primal-dual methods. Any feedback is highly appreciated.
  • February 15, 2019: Together with my student, I have completed a manuscript: ProxSARAH: An Efficient Algorithmic Framework for Stochastic Composite Nonconvex Optimization. This is joint work with Lam M. Nguyen and Dzung T. Phan at IBM Watson. The preprint can be found at https://arxiv.org/pdf/1902.05679.pdf. Any comments and suggestions are highly appreciated.
  • August 21, 2018: My first Ph.D. student, Tianxiao Sun (co-advised with Shu), successfully defended his Ph.D. on August 20, 2018. He will be joining Lowe’s in September. I am still looking for a new, motivated Ph.D. student to work with me in the area of numerical optimization: theory and algorithms (with possible applications in machine learning and data analysis).
  • April 30, 2018: Our paper “Generalized self-concordant functions: A recipe for Newton-type methods” has been accepted for publication in Mathematical Programming. The preprint version is on arxiv.org (https://arxiv.org/pdf/1703.04599.pdf [PDF]).
  • April 1, 2018: In Fall 2018, I am teaching a special topics course: Selected Numerical Methods for Modern Optimization in Data Analysis. The tentative syllabus can be found HERE. Please join us! Some lecture notes are posted online HERE.
  • March 30, 2018: Our paper “Self-concordant inclusions: A unified framework for path-following generalized Newton-type algorithms” has been published in Mathematical Programming. The preprint can be found at https://arxiv.org/pdf/1707.07403.pdf.
  • November 30, 2017: I have completed a manuscript: “Proximal Alternating Penalty Algorithms for Nonsmooth Constrained Convex Optimization”. This paper develops first-order methods for solving general constrained convex optimization problems. The MATLAB code is available at https://github.com/quoctd/PAPA-1.0. If you find any bugs, please let me know.
  • September 19, 2017: After almost two years, our paper A single-phase, proximal path-following framework has been accepted for publication in Mathematics of Operations Research.
  • September 4, 2017: Our conference paper “Smooth Primal-Dual Coordinate Descent Algorithms for Nonsmooth Convex Optimization” has been accepted at NIPS 2017 in Long Beach, CA. I am planning to attend in December.
  • September 1, 2017: Ahmet Alacaoglu, a Ph.D. student from LIONS, EPFL, Switzerland, is visiting me for an internship. He will be here until September 30, 2017, and is working on our joint project.
  • August 10, 2017: I have just completed a paper: “A new inexact homotopy proximal Newton algorithm for scalar parametric composite convex minimization”. It deals with an interesting topic: homotopy second-order methods. I have not posted this paper on arXiv, but if you are interested, please let me know.
  • August 7, 2017: Prof. Ion Necoara, a Fulbright scholar from the University Politehnica of Bucharest, Romania, is currently visiting me at UNC-Chapel Hill. He will be with our department until the end of November 2017.
  • July 27, 2017: After a long time, our paper “A smooth primal-dual optimization framework for nonsmooth composite convex minimization” has been accepted for publication in SIAM Journal on Optimization. The preprint can be found at https://arxiv.org/abs/1507.06243 [pdf].
  • July 26, 2017: I gave a talk on “Exploiting analytical structures in convex optimization applications” at the IBM Watson Research Center.
  • July 15, 2017: With my student, Tianxiao Sun, and my colleague, Shu Lu, we have posted our paper “Self-concordant inclusions: A unified framework for path-following generalized Newton-type algorithms” on arXiv (https://arxiv.org/pdf/1707.07403.pdf). This paper was completed a year ago and is currently under revision [pdf].
  • March 14, 2017: Together with my student, Tianxiao Sun, we have posted a new paper “Generalized self-concordant functions: A recipe for Newton-type methods” on arxiv.org (https://arxiv.org/pdf/1703.04599.pdf [PDF]).
  • In Spring 2017, I am teaching a special topics course for graduate students in our department:
    • Title: Selected topics in modern convex optimization: theory, algorithms and applications.
    • The tentative syllabus is available [HERE].
