- August 3-8, 2019: I am attending ICCOPT 2019 in Berlin. Together with I. Necoara, I am organizing a session, “Recent advances in first-order methods for constrained convex optimization and related problems,” with four parts running from Monday to Tuesday and 12 talks (9 invited and 3 contributed). Please join us!
- I will be on research leave in Fall 2019 and plan to visit several universities in the US and Europe. I am happy to connect with researchers who share my research interests. I will also be a SAMSI fellow within the Deep Learning Program in Fall 2019.
- March 15, 2019: We have uploaded a new preprint, “Non-Stationary First-Order Primal-Dual Algorithms with Fast NonErgodic Convergence Rates,” to arXiv (https://arxiv.org/pdf/1903.05282.pdf). This paper is concerned with convergence rates of primal-dual methods faster than O(1/k). Any feedback is highly appreciated.
- February 15, 2019: My student and I have completed a manuscript, “ProxSARAH: An Efficient Algorithmic Framework for Stochastic Composite Nonconvex Optimization.” This is joint work with Lam M. Nguyen and Dzung T. Phan at IBM Watson. The preprint can be found at https://arxiv.org/pdf/1902.05679.pdf. Any comments and suggestions are highly appreciated.
- August 21, 2018: My first student (co-advised with Shu), Tianxiao Sun, successfully defended his Ph.D. on August 20, 2018. He will start working at Lowe’s in September. I am still looking for a well-motivated Ph.D. student to work with me in the area of numerical optimization: theory and algorithms, with possible applications in machine learning and data analysis.
- April 30, 2018: Our paper “Generalized self-concordant functions: A recipe for Newton-type methods” has been accepted for publication in Mathematical Programming. The preprint is available on arXiv (https://arxiv.org/pdf/1703.04599.pdf).
- April 1, 2018: In Fall 2018, I am teaching a special topics course, Selected Numerical Methods for Modern Optimization in Data Analysis. The tentative syllabus can be found HERE, and some lecture notes are posted online HERE. Please join us!
- March 30, 2018: Our paper “Self-concordant inclusions: A unified framework for path-following generalized Newton-type algorithms” has been published in Mathematical Programming. The preprint can be found at https://arxiv.org/pdf/1707.07403.pdf.
- November 30, 2017: I have completed a manuscript, “Proximal Alternating Penalty Algorithms for Nonsmooth Constrained Convex Optimization.” This paper develops first-order methods for solving general constrained convex optimization problems. The MATLAB code is available at https://github.com/quoctd/PAPA-1.0. If you find any bugs, please let me know.
- September 19, 2017: After almost two years, our paper “A single-phase, proximal path-following framework” has been accepted for publication in Mathematics of Operations Research.
- September 4, 2017: Our conference paper “Smooth Primal-Dual Coordinate Descent Algorithms for Nonsmooth Convex Optimization” has been accepted at NIPS 2017 in Long Beach, CA. I plan to attend in December.
- September 1, 2017: Ahmet Alacaoglu, a Ph.D. student from LIONS, EPFL, Switzerland, is visiting me for an internship. He will be here until September 30, 2017, working on our joint project.
- August 10, 2017: I have just completed a paper, “A new inexact homotopy proximal Newton algorithm for scalar parametric composite convex minimization.” It deals with an interesting topic: homotopy second-order methods. I have not yet posted this paper on arXiv, but if you are interested, please let me know.
- August 7, 2017: Prof. Ion Necoara, a Fulbright scholar from the University Politehnica of Bucharest, Romania, is currently visiting me at UNC-Chapel Hill. He will be with our department until the end of November 2017.
- July 27, 2017: After a long time, our paper “A smooth primal-dual optimization framework for nonsmooth composite convex minimization” has been accepted for publication in SIAM Journal on Optimization. The preprint can be found at https://arxiv.org/abs/1507.06243.
- July 26, 2017: I gave a talk on “Exploiting analytical structures in convex optimization applications” at the IBM Watson Research Center.
- July 15, 2017: With my student, Tianxiao Sun, and my colleague, Shu Lu, we have posted our paper “Self-concordant inclusions: A unified framework for path-following generalized Newton-type algorithms” on arXiv (https://arxiv.org/pdf/1707.07403.pdf). This paper was completed a year ago and is under revision.
- March 14, 2017: Together with my student, Tianxiao Sun, we have posted a new paper, “Generalized self-concordant functions: A recipe for Newton-type methods,” on arXiv (https://arxiv.org/pdf/1703.04599.pdf).
- In Spring 2017, I am teaching a special topics course for graduate students in our department:
- Title: Selected topics in modern convex optimization: theory, algorithms, and applications.
- The tentative syllabus is available HERE.
QUOC TRAN-DINH
Department of Statistics and Operations Research
333 Hanes Hall, CB #3260
The University of North Carolina at Chapel Hill
Chapel Hill, NC 27599
E-mail: quoctdATemailDOTuncDOTedu
Phone: +1-919-843-6023
I am an assistant professor in the Department of Statistics and Operations Research at the University of North Carolina at Chapel Hill, which I joined in July 2015. I was a postdoctoral researcher at the Laboratory for Information and Inference Systems (LIONS), École Polytechnique Fédérale de Lausanne (EPFL), Switzerland, from November 2012 to June 2015. I completed my Ph.D. in Optimization in Engineering in November 2012 at the Department of Electrical Engineering (ESAT) and the Optimization in Engineering Center (OPTEC), KU Leuven, under the supervision of Prof. Moritz Diehl.
My research is on numerical optimization: theory, algorithms, and applications. I have worked on equilibrium problems and variational inequalities; sequential convex programming (SCP) for nonlinear optimization, with applications in model predictive control, optimal control, and static output feedback control; and first-order and second-order decomposition methods for large-scale convex optimization. My current research focuses on efficient methods for convex optimization and matrix optimization, with applications in signal/image processing, statistics, and machine learning.