**(New) Our recent software and computer codes can be found at** https://github.com/orgs/unc-optimization/dashboard

**ProxSGD – A Python package of stochastic algorithms for nonconvex composite optimization**

**ProxSGD** is a set of optimization routines for solving nonconvex composite optimization problems. It consists of several methods such as proximal SGD, proximal SVRG, proximal SARAH, and many other variants.

- It is implemented in Python using **TensorFlow**.
- It can be downloaded **HERE**.
- References:
  - N. H. Pham, L. M. Nguyen, D. T. Phan, and Q. Tran-Dinh, *ProxSARAH: An Efficient Algorithmic Framework for Stochastic Composite Nonconvex Optimization*, arXiv preprint arXiv:1902.05679, 2019.
  - Q. Tran-Dinh, N. H. Pham, D. T. Phan, and L. M. Nguyen, *Hybrid Stochastic Gradient Descent Algorithms for Stochastic Nonconvex Optimization*, arXiv preprint arXiv:1905.05920, 2019.


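The proximal stochastic gradient step that these methods build on can be sketched in a few lines of NumPy. This is only an illustrative sketch, not the package's actual API: the quadratic loss, the ℓ₁-regularizer, and all function names below are assumptions chosen for the example.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_sgd(A, b, lam=0.1, eta=0.01, batch=8, iters=500, seed=0):
    """Proximal SGD for min_x (1/2n)||Ax - b||^2 + lam * ||x||_1."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(iters):
        idx = rng.choice(n, size=batch, replace=False)   # sample a mini-batch
        grad = A[idx].T @ (A[idx] @ x - b[idx]) / batch  # stochastic gradient of f
        x = soft_threshold(x - eta * grad, eta * lam)    # gradient step, then prox of g
    return x
```

Variance-reduced variants such as proximal SVRG and proximal SARAH replace the plain stochastic gradient `grad` with a recursively corrected estimator but keep the same proximal step.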
**Sieve-SDP – A Preprocessing Tool for Semidefinite Programming**

**Sieve-SDP** is a MATLAB code to preprocess SDPs based on a simple algorithm developed in "Y. Zhu, G. Pataki, and Q. Tran-Dinh: *Sieve-SDP: a simple facial reduction algorithm to preprocess semidefinite programs*, https://arxiv.org/abs/1710.08954". **Sieve-SDP** can be downloaded from https://github.com/unc-optimization/SieveSDP.

**PAPA – Proximal Alternating Penalty Algorithms**

**PAPA-v1.0** is a MATLAB software package for solving constrained convex optimization problems of the form

min_{x, y} { f(x) + g(y) : Ax + By = c, x ∈ X },

where f and g are two convex functions, A and B are linear operators, and X is a simple, nonempty, closed, and convex set.

Here, we assume that f and g are proximally tractable, i.e., their proximal operators prox_f and prox_g are efficient to compute (see below). **PAPA** aims at solving this constrained convex optimization problem for any convex functions f and g whose proximal operators are provided.
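To make the setting concrete, the sketch below (in Python rather than the package's MATLAB) solves a toy instance with f(x) = ‖x‖₁, g(y) = ½‖y − b‖², and the constraint x − y = 0 by alternating exact minimization of a quadratic penalty with an increasing parameter ρ. This is only a caricature of the penalty idea, not PAPA's actual update rule, and all names here are illustrative.

```python
import numpy as np

def soft_threshold(z, t):
    """Closed-form proximal operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def penalty_alternating(b, n_stages=13, inner=100):
    """Alternately minimize ||x||_1 + 0.5*||y - b||^2 + (rho/2)*||x - y||^2
    over x and over y, doubling rho after each stage."""
    x = np.zeros_like(b)
    y = b.copy()
    for k in range(n_stages):
        rho = 2.0 ** k
        for _ in range(inner):
            x = soft_threshold(y, 1.0 / rho)  # exact minimization in x
            y = (b + rho * x) / (1.0 + rho)   # exact minimization in y
    return x, y
```

As ρ grows, x and y are forced together, and the iterates approach the solution of min ‖x‖₁ + ½‖x − b‖², i.e., the soft-thresholding of b.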

**PAPA** is developed by **Quoc Tran-Dinh** at the Department of Statistics and Operations Research, University of North Carolina at Chapel Hill (UNC), North Carolina.

**Download:** **PAPA-v1.0** can be downloaded **HERE**.

**References:** The theory and algorithms implemented in **PAPA** can be found in:

- Q. Tran-Dinh: Proximal Alternating Penalty Algorithms for Nonsmooth Constrained Convex Optimization, Manuscript, STOR-UNC, Nov. 2017.


**LMAOPT – Low-Rank Matrix Approximation Optimization**

**LMAOPT-v1.0** is a MATLAB code collection for solving three special cases of the following low-rank matrix optimization problem:

min_{U ∈ R^{p×r}, V ∈ R^{q×r}} φ(A(U Vᵀ) − b),

where φ is a proper, closed, and convex function from R^m to R ∪ {+∞}, A is a linear operator from R^{p×q} to R^m, and b is a given observed vector. Here, we are more interested in the case r ≪ min{p, q}. Currently, we provide the code to solve three special cases of the above problem:

- **Quadratic loss:** φ(z) = (1/2)‖z‖²;
- **Quadratic loss and symmetric case:** φ(z) = (1/2)‖z‖² and U = V; and
- **Nonsmooth objective loss with tractably proximal operators:** for instance, φ(z) = ‖z‖₁.
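For the quadratic-loss case, and taking the linear operator A to be the identity (an assumption made only to keep the example small; LMAOPT itself handles general linear operators with Gauss-Newton-type schemes), a plain alternating least-squares sketch in Python illustrates the bilinear structure:

```python
import numpy as np

def als_lowrank(M, r, iters=200, reg=1e-10):
    """Alternating least squares for min_{U,V} 0.5*||U V^T - M||_F^2,
    i.e., the quadratic-loss case with the identity operator."""
    rng = np.random.default_rng(0)
    p, q = M.shape
    U = rng.standard_normal((p, r))
    V = rng.standard_normal((q, r))
    I = reg * np.eye(r)  # tiny ridge term to keep the solves well-posed
    for _ in range(iters):
        U = M @ V @ np.linalg.inv(V.T @ V + I)    # fix V, least squares in U
        V = M.T @ U @ np.linalg.inv(U.T @ U + I)  # fix U, least squares in V
    return U, V
```

Each half-step is a linear least-squares problem because the objective is quadratic in one factor when the other is fixed; this is also why the symmetric case U = V requires separate treatment.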

**LMAOPT** is implemented by **Quoc Tran-Dinh** at the Department of Statistics and Operations Research (STAT&OR), The University of North Carolina at Chapel Hill (UNC). This is a joint work with Zheqi Zhang at STAT&OR, UNC.

**Download:** **LMAOPT-v1.0** can be downloaded **HERE**.

For further information, please read **Readme3.txt**.

**References:** The theory and algorithms implemented in **LMAOPT** can be found in the following manuscript: Q. Tran-Dinh and Z. Zhang: **Extended Gauss-Newton and Gauss-Newton ADMM algorithms for low-rank matrix optimization**. *STAT&OR Tech. Report UNC-REPORT-2016.a*, 2016. Preprint: http://arxiv.org/abs/1606.03358.

**DECOPT: Decomposition OPTimization**

**DECOPT-v1.0** is a MATLAB software package for solving constrained convex optimization problems of the form

min_{x, z} { f(x) + g(z) : Ax − z = b, l ≤ x ≤ u },

where f and g are two convex functions, A is a linear operator, and l, u are the lower and upper bounds of x.

Here, we assume that f and g are proximally tractable, i.e., their proximal operators prox_f and prox_g are efficient to compute (see below). **DECOPT** aims at solving this constrained convex optimization problem for any convex functions f and g whose proximal operators are provided.

The following special cases have been customized in **DECOPT**:

- **basis pursuit;**
- **the ℓ₁-unconstrained LASSO problem;**
- **the ℓ₁-convex problem;**
- **the square-root LASSO problem;**
- **the ℓ₁/ℓ₂ basis pursuit denoising (BPDN) problem;** and
- **the ℓ₁/ℓ₂-constrained LASSO problem.**

Here, λ is a penalty parameter, δ is a given noise-level parameter, b is the observed measurement vector, and A is a linear operator.
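As a concrete illustration of the first special case, basis pursuit (min ‖x‖₁ subject to Ax = b) can be rewritten as a linear program by splitting x into positive and negative parts. The sketch below uses SciPy's generic LP solver rather than DECOPT itself, and the problem data in the usage example are made up purely for illustration:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """Solve min ||x||_1 s.t. Ax = b via the LP reformulation
    x = u - v with u, v >= 0 and objective sum(u) + sum(v)."""
    m, n = A.shape
    c = np.ones(2 * n)         # the l1-norm becomes a linear objective in (u, v)
    A_eq = np.hstack([A, -A])  # encodes A(u - v) = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
    u, v = res.x[:n], res.x[n:]
    return u - v
```

The LP route is handy for cross-checking a first-order solver on small instances, although it does not scale to the large problems DECOPT targets.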

**DECOPT** was developed by **Quoc Tran-Dinh** while he was with the Laboratory for Information and Inference Systems (LIONS), EPFL, Lausanne, Switzerland.

This is a joint work with **V. Cevher** at EPFL/LIONS.

**Download:** **DECOPT-v1.0** can be downloaded **HERE**.

For further information, please read **Readme2.txt**.

**References:** The theory and algorithms implemented in **DECOPT** can be found in:

- Q. Tran-Dinh and V. Cevher: Constrained convex minimization via model-based excessive gap. *Proceedings of the Annual Conference on Neural Information Processing Systems* (NIPS), Montreal, Canada, 2014. Preprint: http://papers.nips.cc/paper/5574-constrained-convex-minimization-via-model-based-excessive-gap.
- Q. Tran-Dinh and V. Cevher: A Primal-Dual Algorithmic Framework for Constrained Convex Minimization. *LIONS Tech. Report EPFL-REPORT-199844*, 2014.

**SCOPT: Self-Concordant OPTimization**

**SCOPT-v1.0**, a **MATLAB package for Composite Self-concordant (like) Minimization**, is a MATLAB implementation of the proximal-gradient, proximal quasi-Newton, proximal Newton, and path-following interior-point algorithms for solving composite convex minimization problems involving self-concordant and self-concordant-like cost functions:

min_x { F(x) := f(x) + g(x) },

where f is a self-concordant or self-concordant-like function, and g is a possibly nonsmooth and convex function. The function g is usually assumed to have a proximal operator that is "efficient" to compute (i.e., available in closed form or by polynomial-time algorithms). If the proximal operator of g can be computed efficiently, then we say that g is *tractably proximal*.

**Definition:** Briefly, a convex function f is called self-concordant (with parameter M ≥ 0) if |φ'''(t)| ≤ M φ''(t)^{3/2}, where φ(t) := f(x + tv) for any x in the domain of f and any direction v.

**Examples:** f(X) = −log det(X) (for a symmetric and positive definite matrix X), f(x) = −log(x), and f(x) = −∑_{i=1}^m log(bᵢ − aᵢᵀx) are all self-concordant.

**SCOPT** is developed by **Quoc Tran-Dinh** at the Laboratory for Information and Inference Systems (LIONS), EPFL, Lausanne, Switzerland.
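As a quick sanity check of the definition (assuming the standard parameterization with M = 2), the function f(x) = −log(x) satisfies the self-concordance inequality with equality, since f''(x) = 1/x² and f'''(x) = −2/x³:

```python
import numpy as np

# For f(x) = -log(x):  |f'''(x)| = 2/x^3  and  2 * f''(x)^(3/2) = 2/x^3,
# so the bound |f'''| <= 2 * (f'')^(3/2) holds with equality on x > 0.
xs = np.linspace(0.1, 10.0, 100)
f2 = 1.0 / xs**2   # second derivative of -log(x)
f3 = -2.0 / xs**3  # third derivative of -log(x)
assert np.allclose(np.abs(f3), 2.0 * f2**1.5)
```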

This is a joint work with **A. Kyrillidis** and **V. Cevher** at EPFL/LIONS. **SCOPT** consists of several algorithms customized for solving the following convex optimization problems (but not limited to those).

**Sparse Inverse Covariance Estimation:** An instance of this problem class is the graphical LASSO problem

min_{X ≻ 0} { trace(ΣX) − log det(X) + λ‖vec(X)‖₁ },

where Σ is the empirical covariance matrix and λ is a penalty parameter.

*Reference:* J. Friedman, T. Hastie, and R. Tibshirani. Sparse Inverse Covariance Estimation with the graphical LASSO. *Biostatistics*, vol. 9, no. 3, pp. 432–441, 2007.

**Sparse Poisson Intensity Reconstruction:** As an example, this problem has the following form:

min_{x ≥ 0} ∑_{i=1}^m ( aᵢᵀx − yᵢ log(aᵢᵀx) ) + λ‖x‖_TV,

where y is the vector of observed counts, λ is a penalty parameter, and ‖·‖_TV is the TV (total-variation) norm.

*Reference:* Z. T. Harmany, R. F. Marcia, and R. M. Willett. This is SPIRAL-TAP: Sparse Poisson Intensity Reconstruction Algorithms – Theory and Practice. *IEEE Trans. Image Processing*, vol. 21, no. 3, pp. 1084–1096, 2012.

**Heteroscedastic LASSO:** This problem is formulated as an ℓ₁-penalized negative log-likelihood minimization for linear regression with an unknown noise level, where λ is a penalty parameter.

*Reference:* N. Städler, P. Bühlmann, and S. van de Geer. ℓ₁-penalization for mixture regression models. *TEST*, vol. 19, no. 2, pp. 209–256, 2010.

**Sparse logistic and multinomial logistic regression:** The binary case can be written in the following mathematical form:

min_w (1/N) ∑_{i=1}^N log(1 + exp(−yᵢ wᵀxᵢ)) + λ‖w‖₁,

where xᵢ and yᵢ are input data and λ is a penalty parameter.

*Reference:* B. Krishnapuram, M. Figueiredo, L. Carin, and A. Hartemink. Sparse multinomial logistic regression: Fast algorithms and generalization bounds. *IEEE Trans. Pattern Analysis and Machine Intelligence (PAMI)*, vol. 27, pp. 957–968, 2005.

In addition to the above-mentioned problems, **SCOPT** also contains codes that can be customized to solve the following constrained convex optimization problem:

min_x { g(x) : x ∈ Ω },

where g is a convex function with a tractable proximity operator, and Ω is a nonempty, closed, and convex set endowed with a self-concordant barrier f (i.e., f is self-concordant on the interior of Ω with a finite barrier parameter).

*Example:* The clustering problem in unsupervised learning can be reformulated as the above constrained convex problem using the max-norm. *Reference:* A. Jalali and N. Srebro. Clustering using max-norm constrained optimization. *In Proc. of the International Conference on Machine Learning* (ICML 2012), 2012, pp. 1–17.

**Download:** **SCOPT-v1.0** can be downloaded **HERE**. For further information, please read **Readme.txt**.

**References:** The **theory** and **algorithms** implemented in **SCOPT** can be found in:

- Q. Tran-Dinh, A. Kyrillidis, and V. Cevher: A proximal Newton framework for composite minimization: Graph learning without Cholesky decomposition and matrix inversions. *Proceedings of the 30th International Conference on Machine Learning* (ICML), JMLR W&CP 28(2): 271–279, 2013. Preprint: https://arxiv.org/abs/1301.1459.
- Q. Tran-Dinh, A. Kyrillidis, and V. Cevher: Composite Self-concordant Minimization. *Journal of Machine Learning Research*, 2014. Preprint: https://arxiv.org/abs/1308.2867.
- Q. Tran-Dinh, A. Kyrillidis, and V. Cevher: An inexact proximal path-following algorithm for constrained convex minimization. *SIAM J. Optimization*, vol. 24, no. 4, pp. 1718–1745, 2014. Preprint: http://arxiv.org/abs/1311.1756.


**BMIsolver – Optimization involving Bilinear Matrix Inequalities**

**BMIsolver-v1.1 – a MATLAB software package for BMI optimization**

**BMIsolver** is a MATLAB package for solving optimization problems with BMI (**bilinear matrix inequality**) constraints arising in feedback controller design (spectral abscissa, H2, H-infinity, and mixed H2/H-infinity optimization problems). It is implemented based on the methods proposed in [1] and [2], which is a joint work with **S. Gumussoy** and **W. Michiels**. The code is developed by **Quoc Tran Dinh** at the Department of Electrical Engineering (ESAT/SCD) and the Optimization in Engineering Center (OPTEC), KU Leuven, Belgium (under the supervision of Prof. **M. Diehl**). The current version requires **YALMIP** as a modeling language. We recommend **SeDuMi** as an SDP solver.

**Download BMIsolver**

- A recently updated version (v1.1) can be found **HERE**.
- Third-party packages:
  - **YALMIP** – a MATLAB modeling language toolbox. It can be downloaded at http://users.isy.liu.se/johanl/yalmip/.
  - **SeDuMi** – an SDP solver. It can be downloaded at http://sedumi.ie.lehigh.edu/.
  - **COMPLeib** – a collection of test problems. It can be downloaded at http://www.compleib.de/.

**References**

- [1] Q. Tran Dinh, S. Gumussoy, W. Michiels, and M. Diehl. Combining Convex-Concave Decompositions and Linearization Approaches for solving BMIs, with Application to Static Output Feedback. Tech. Report, July 2011. [pdf]
- [2] Q. Tran Dinh, W. Michiels, and M. Diehl. An inner convex approximation algorithm for BMI optimization and applications in control. Tech. Report, December 2011. [pdf]

**SCPCVX: Sequential Convex Programming – CVX Interface**

**SCPCVX** – an interface for Sequential Convex Programming methods (SCP) using the CVX package in MATLAB. It can be downloaded HERE.

**Useful software tools and codes in optimization**

I have been using many open-source as well as commercial software packages for my research. I find that they are useful and valuable. Here are some of them.

- ACADO Toolkit – A Toolkit for Automatic Control and Dynamic Optimization (open-source), Boris Houska and Hans Joachim Ferreau (for C++ and MATLAB).

- IPOPT – Sparse Nonlinear Optimization (open-source).
- CasADi – A minimalistic computer algebra system with automatic differentiation (open-source), Joel Andersson (for C++ and Python).
- qpOASES – Parametric Quadratic Programming for MPC (open-source), Hans Joachim Ferreau.
- TFOCS – Templates for first-order conic solvers, Stephen Becker.
- CVX – Disciplined Convex Programming, Michael Grant.
- Some of many other software tools that I have been using: SDPT3, SDPNAL+, SeDuMi, SDPA, Mosek, CPLEX, etc.