Software: OptSuite
OptSuite is a collection of optimization software composed of relatively independent modules, covering research directions that range from low-level algebraic libraries, convex optimization, nonlinear programming, manifold optimization, and integer programming to the integration of machine learning and optimization. Each module targets a specific class of problems and has its own code repository, algorithm implementations, and usage interfaces. Users can select individual modules according to their research needs, or combine several modules to build more complex solving workflows.
The following is a list of packages in OptSuite:
Convex Optimization
SSNCVX, A Semi-Smooth Newton method for Convex optimization problems, 2025 — code: “SSNCVX”
SDPDAL, A Decomposition-based Augmented Lagrangian method for low-rank Semi-Definite Programming, 2023 — code: “SDPDAL”
SSNLP, A Semi-smooth Newton Method for Linear Programming (with Y. Liu), 2017 — code: “SSNLP”
LMaFit, A MATLAB solver for low-rank matrix fitting (with Y. Zhang & W. Yin), 2010 — code: “LMaFit”
FPC_AS, A MATLAB solver for L1 regularization problems (with W. Yin), 2008 — code: “FPC_AS”
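As an illustration of the problem classes above, FPC_AS targets ℓ1-regularized least squares and LMaFit targets low-rank matrix fitting on observed entries. Schematically (the notation here is illustrative, not taken verbatim from the solvers' documentation):

```latex
% l1-regularized least squares (FPC_AS), regularization weight \mu > 0
\min_{x \in \mathbb{R}^n} \; \mu \|x\|_1 + \tfrac{1}{2} \|Ax - b\|_2^2

% low-rank matrix fitting on the observed index set \Omega (LMaFit)
\min_{X,\, Y} \; \tfrac{1}{2} \|P_\Omega(XY - M)\|_F^2
```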
Formalization
optlib, A collection of codes in Lean for the formalization of mathematical optimization, ongoing — code: “optlib”
ReasBook, A Lean 4 project for formalizing mathematics from textbooks and research papers, ongoing — code: “ReasBook”
AMBER, A benchmark for applied mathematics in Lean 4, 2025 — code: “AMBER”
SITA, A framework that automates the formalization of mathematical theorems in Lean by bridging abstract structures and their concrete instances, 2025 — code: “SITA”
M2F, A framework for translating textbook- and paper-level mathematics into Lean projects that pass machine verification at scale, 2026 — code: “M2F”
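The projects above formalize far richer material; as a minimal, self-contained illustration of the verification style (a toy statement, not taken from optlib or ReasBook), here is a one-line Lean 4 example using Mathlib:

```lean
import Mathlib

/-- Toy example in the spirit of optimization formalization:
    squares of real numbers are nonnegative, i.e. `0` is a
    global minimizer of `fun x => x ^ 2` over `ℝ`. -/
example (x : ℝ) : 0 ≤ x ^ 2 := sq_nonneg x
```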
Learn to Optimize
O2O, ODE-based learning to optimize, 2024 — code: “O2O”
MCPG, A Monte Carlo Policy Gradient Method for binary optimization (with C. Chen, R. Chen, T. Li), 2023 — code: “MCPG”
LMask, A learning framework that utilizes dynamic masking to generate high-quality feasible solutions for constrained routing problems, 2025 — code: “LMask”
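To convey the flavor of learning-based binary optimization in this group (e.g., the policy-gradient idea behind MCPG), here is a much-simplified sketch: a REINFORCE-style Monte Carlo policy gradient that learns independent Bernoulli probabilities to maximize a toy binary objective. All hyperparameters and the objective are illustrative assumptions, not MCPG's actual algorithm.

```python
import math
import random

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

def objective(x):
    # toy binary objective: number of ones (maximized at the all-ones string)
    return float(sum(x))

random.seed(1)
n = 8
theta = [0.0] * n   # logits of independent Bernoulli policies
baseline = 0.0      # running-average reward, a standard variance reducer
for step in range(2000):
    p = [sigmoid(t) for t in theta]
    x = [1 if random.random() < pi else 0 for pi in p]
    r = objective(x)
    baseline = 0.9 * baseline + 0.1 * r
    # REINFORCE: grad of log-prob of x_i under Bernoulli(p_i) is x_i - p_i
    theta = [t + 0.1 * (r - baseline) * (xi - pi)
             for t, xi, pi in zip(theta, x, p)]

probs = [sigmoid(t) for t in theta]  # probabilities drift toward 1
```

The running baseline subtracts the average reward from each sample, which leaves the gradient estimate unbiased while sharply reducing its variance.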
LLM Modeling
OptMATH, A Scalable Bidirectional Data Synthesis Framework for Optimization Modeling, 2025 — code: “OptMATH”
MIPLIB-NL, A dataset of large-scale natural language optimization problems derived and reverse-engineered from the MIPLIB 2017 collection, 2026 — code: “MIPLIB-NL”
LLM Training and Deep Learning
RSO, A memory efficient Randomized Subspace Optimization method for training large language models, 2025 — code: “RSO”
LOZO, Enhancing Zeroth-order fine-tuning for language models with low-rank structures, 2024 — code: “LOZO”
SENG, A Sketchy Empirical Natural Gradient (SENG) method for solving large-scale deep learning problems, 2021 — code: “SENG”
NGPlus, A new second-order optimizer for deep learning, 2021 — code: “NGPlus”
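Zeroth-order methods such as LOZO build on gradient estimates computed from function values alone. A minimal sketch of that building block, assuming a generic two-point (SPSA-style) estimator on a toy quadratic rather than LOZO's actual low-rank scheme:

```python
import random

def f(x):
    # toy smooth objective: squared Euclidean norm
    return sum(v * v for v in x)

def zo_gradient(f, x, mu=1e-3):
    """Two-point zeroth-order gradient estimate along one random Gaussian
    direction: (f(x + mu*u) - f(x - mu*u)) / (2*mu) * u."""
    u = [random.gauss(0.0, 1.0) for _ in x]
    fp = f([xi + mu * ui for xi, ui in zip(x, u)])
    fm = f([xi - mu * ui for xi, ui in zip(x, u)])
    scale = (fp - fm) / (2.0 * mu)
    return [scale * ui for ui in u]

random.seed(0)
x = [1.0, -2.0, 0.5]
for _ in range(200):
    g = zo_gradient(f, x)
    x = [xi - 0.05 * gi for xi, gi in zip(x, g)]
# after 200 steps the loss has dropped far below its initial value of 5.25
```

Only two function evaluations are needed per step, which is what makes such estimators attractive when back-propagation through a large model is too memory-hungry.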
Manifold Optimization
RNGD, Riemannian Natural Gradient Descent methods, 2023 — code: “RNGD”
ASQN, A Structured Quasi-Newton method for solving optimization problems with orthogonality constraints, 2019 — code: “ASQN”
RBR, A Row-By-Row Method for Community Detection (with J. Zhang & H. Liu), 2017 — code: “RBR”
ARNT, Adaptive Regularized Newton Method for Riemannian Optimization (with J. Hu), 2017 — code: “ARNT”
OptM, Optimization with Orthogonality Constraints (with W. Yin), 2010 — code
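Several of the packages above (OptM, ASQN, ARNT) address optimization with orthogonality constraints, whose canonical form is

```latex
\min_{X \in \mathbb{R}^{n \times p}} \; f(X)
\quad \text{s.t.} \quad X^\top X = I_p ,
```

i.e., minimization of a smooth function over the Stiefel manifold.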
Nonlinear Programming
Numerical Linear Algebra
PFOpt, A polynomial-filtered subspace extraction for low-rank optimization (with H. Liu), 2019 — code: “PFOpt”
Arrabit, Augmented Rayleigh-Ritz (ARR) And Block Iteration for large-scale eigenpair computation (with Y. Zhang), 2015 — code
LMSVD, Limited Memory Block Krylov Subspace Optimization for Computing Dominant Singular Value Decompositions (with X. Liu & Y. Zhang), 2012 — code
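The packages above implement sophisticated block-Krylov and subspace methods for large-scale eigenpair and singular value computations; the simplest ancestor of these ideas is plain power iteration, sketched here on a tiny symmetric matrix (a didactic toy, not the packages' actual algorithms):

```python
import math

def power_iteration(A, iters=500):
    """Plain power iteration for the dominant eigenpair of a symmetric
    matrix A, given as a list of rows."""
    n = len(A)
    v = [1.0] + [0.0] * (n - 1)  # arbitrary nonzero starting vector
    lam = 0.0
    for _ in range(iters):
        # multiply by A, then normalize
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
        # Rayleigh quotient estimate of the dominant eigenvalue
        lam = sum(v[i] * A[i][j] * v[j]
                  for i in range(n) for j in range(n))
    return lam, v

A = [[2.0, 1.0], [1.0, 2.0]]  # eigenvalues 3 and 1
lam, v = power_iteration(A)
print(round(lam, 6))  # → 3.0
```

The iterate converges to the eigenvector of the largest-magnitude eigenvalue at a rate governed by the eigenvalue gap; block and Krylov variants (as in Arrabit and LMSVD) accelerate this and compute many eigenpairs at once.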
Miscellaneous
Demo codes for education
H. Liu, J. Hu, Y. Li, Z. Wen, Optimization: Model, Algorithm and Theory (in Chinese), Higher Education Press, ISBN 978-7-04-055035-1
(Click this link for the detailed code and annotations.)