Software: OptSuite
OptSuite is a collection of optimization software composed of multiple, relatively independent modules. It covers research directions that range from low-level algebraic libraries, convex optimization, nonlinear programming, manifold optimization, and integer programming to the integration of machine learning and optimization. Each module targets a specific class of problems and has its own code repository, algorithm implementations, and usage interfaces. Users can select individual modules to match their research needs, or combine several modules to build more complex solution workflows.
The following is a list of packages in OptSuite:
Convex Optimization
SSNCVX, A Semi-Smooth Newton method for Convex optimization problems, 2025 — code: “SSNCVX”
SDPDAL, A Decomposition-based Augmented Lagrangian method for low-rank Semi-Definite Programming, 2023 — code: “SDPDAL”
SSNLP, A Semi-smooth Newton Method for Linear Programming (with Y. Liu), 2017 — code: “SSNLP”
LMaFit, A MATLAB solver for low-rank matrix fitting (with Y. Zhang & W. Yin), 2010 — code: “LMaFit”
FPC_AS, A MATLAB solver for L1 regularization problems (with W. Yin), 2008 — code: “FPC_AS”
Formalization
Learn to Optimize
O2O, ODE-based learning to optimize, 2024 — code: “O2O”
MCPG, A Monte Carlo Policy Gradient Method for binary optimization (with C. Chen, R. Chen, T. Li), 2023 — code: “MCPG”
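To illustrate the Monte Carlo policy-gradient idea behind binary-optimization solvers like MCPG (a bare REINFORCE sketch with a mean baseline; the actual package uses a considerably more elaborate scheme, and all names below are illustrative): parameterize a product Bernoulli policy over {0,1}^n and descend the expected objective using only function evaluations.

```python
import numpy as np

def policy_gradient_binary(f, n, iters=300, samples=64, lr=0.1, seed=0):
    """Minimize E_{x ~ Bernoulli(p)}[f(x)] over binary x via REINFORCE."""
    rng = np.random.default_rng(seed)
    theta = np.zeros(n)                      # logits; p = sigmoid(theta)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-theta))
        x = (rng.random((samples, n)) < p).astype(float)
        vals = np.array([f(xi) for xi in x])
        baseline = vals.mean()               # variance-reduction baseline
        # Score function of the Bernoulli policy: d log p(x) / d theta = x - p.
        grad = ((vals - baseline)[:, None] * (x - p)).mean(axis=0)
        theta -= lr * grad                   # descend the expected objective
    return (theta > 0).astype(int)           # most probable configuration

# Minimize the Hamming-type distance to a target binary pattern.
target = np.array([1, 0, 1, 1, 0, 0, 1, 0])
f = lambda x: float(np.sum((x - target) ** 2))
x_best = policy_gradient_binary(f, n=8)
```

Because the gradient is estimated purely from sampled objective values, the same loop applies to nonsmooth or combinatorial objectives where no derivative exists.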
LLM Modeling
LLM Training and Deep Learning
RSO, A memory efficient Randomized Subspace Optimization method for training large language models, 2025 — code: “RSO”
LOZO, Enhancing Zeroth-order fine-tuning for language models with low-rank structures, 2024 — code: “LOZO”
SENG, A Sketchy Empirical Natural Gradient (SENG) method for solving large-scale deep learning problems, 2021 — code: “SENG”
NGPlus, A new second-order optimizer for deep learning, 2021 — code: “NGPlus”
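The memory-efficient idea behind zeroth-order fine-tuning methods such as LOZO can be sketched with a plain two-point gradient estimator (LOZO additionally restricts the perturbation to a low-rank subspace; this generic sketch does not, and its names are illustrative, not the package's API):

```python
import numpy as np

def zo_sgd(f, x0, eps=1e-3, lr=0.05, iters=400, seed=0):
    """Zeroth-order SGD: optimize f using only function values."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    for _ in range(iters):
        z = rng.standard_normal(x.shape)     # random perturbation direction
        # Central difference approximates the directional derivative <grad f, z>.
        g = (f(x + eps * z) - f(x - eps * z)) / (2 * eps)
        x -= lr * g * z                      # step along the estimated gradient
    return x

# Minimize a smooth quadratic without ever forming its gradient.
f = lambda x: np.sum((x - 3.0) ** 2)
x_min = zo_sgd(f, np.zeros(5))
```

Only two forward evaluations are needed per step and no gradient buffer is stored, which is the source of the memory savings in zeroth-order fine-tuning.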
Manifold Optimization
RNGD, Riemannian Natural Gradient Descent methods, 2023 — code: “RNGD”
ASQN, A Structured Quasi-Newton method for solving optimization with orthogonality constraint, 2019 — code: “ASQN”
RBR, A Row-By-Row Method for Community Detection (with J. Zhang & H. Liu), 2017 — code: “RBR”
ARNT, Adaptive Regularized Newton Method for Riemannian Optimization (with J. Hu), 2017 — code: “ARNT”
OptM, Optimization with Orthogonality Constraints (with W. Yin), 2010 — code
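A minimal sketch of optimization with orthogonality constraints, the setting addressed by packages like OptM and ARNT (this is plain Riemannian gradient ascent with a QR retraction, not those packages' algorithms; names are illustrative): maximize tr(X^T A X) over X^T X = I, whose maximizer spans the dominant eigenspace.

```python
import numpy as np

def stiefel_ascent(A, p, iters=200, lr=0.01, seed=0):
    """Maximize tr(X^T A X) subject to X^T X = I by retraction-based ascent."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    X, _ = np.linalg.qr(rng.standard_normal((n, p)))    # random feasible start
    for _ in range(iters):
        G = 2 * A @ X                                   # Euclidean gradient
        # Project onto the tangent space of the Stiefel manifold:
        # R = G - X * sym(X^T G).
        XtG = X.T @ G
        R = G - X @ (XtG + XtG.T) / 2
        # Retract back onto the manifold via a QR factorization.
        X, _ = np.linalg.qr(X + lr * R)
    return X

# Recover the dominant 2-dimensional eigenspace of a diagonal test matrix.
A = np.diag([5.0, 4.0] + [0.0] * 18)
X = stiefel_ascent(A, p=2)
```

The QR retraction keeps every iterate exactly feasible, so no penalty or projection-back heuristics are needed.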
Nonlinear Programming
Numerical Linear Algebra
PFOpt, A polynomial-filtered subspace extraction for low-rank optimization (with H. Liu), 2019 — code: “PFOpt”
Arrabit, Augmented Rayleigh-Ritz (ARR) And Block Iteration for large-scale eigenpair computation (with Y. Zhang), 2015 — code
LMSVD, Limited Memory Block Krylov Subspace Optimization for Computing Dominant Singular Value Decompositions (with X. Liu & Y. Zhang), 2012 — code
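The subspace ideas refined in Arrabit and LMSVD can be illustrated with a bare-bones block power iteration followed by a Rayleigh-Ritz projection (a textbook sketch under simplified assumptions, not either package's actual algorithm; names are illustrative):

```python
import numpy as np

def block_rayleigh_ritz(A, k, iters=300, seed=0):
    """Approximate the k dominant eigenpairs of a symmetric matrix A."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    Q, _ = np.linalg.qr(rng.standard_normal((n, k)))
    for _ in range(iters):
        Q, _ = np.linalg.qr(A @ Q)           # block power step + orthonormalize
    # Rayleigh-Ritz: solve the small k-by-k projected eigenproblem.
    T = Q.T @ A @ Q
    w, V = np.linalg.eigh(T)                 # ascending eigenvalues
    return w[::-1], (Q @ V)[:, ::-1]         # return in descending order

# Diagonal test matrix with spectrum 1, 2, ..., 30.
A = np.diag(np.arange(1.0, 31.0))
w, U = block_rayleigh_ritz(A, k=3)
```

The Rayleigh-Ritz step extracts eigenvalue estimates from the subspace far more accurately than reading them off the power iterates directly, which is why subspace methods build on it.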
Miscellaneous
Demo codes for education
H. Liu, J. Hu, Y. Li, Z. Wen, Optimization: Model, Algorithm and Theory (in Chinese), Higher Education Press, ISBN 978-7-04-055035-1
(Click this link to access the detailed code and annotations)