PETSc is a widely used, highly performant library with a Python wrapper (https://petsc.org/release/), but I suspect that, given the problems you are encountering in SciPy, simply swapping the solver library won't have a large effect. Newton-Krylov methods are at their core pretty similar across different libraries, modulo efficiency and options. Convergence issues are resolved by better preconditioning and better globalization methods. Without more information on your particular problem, no specific advice can be given. The problem you linked had horrible conditioning and no preconditioner, so it is no surprise it failed to converge. Newton-Krylov methods are powerful tools, but they are tools you must understand and tailor to your specific problem to efficiently obtain accurate solutions.
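To make the preconditioning point concrete, here is a minimal sketch using `scipy.optimize.newton_krylov` on a toy 1-D nonlinear Poisson problem; the problem, grid size, and tolerances are illustrative assumptions, not the asker's actual system. The dominant linear part is factorized once and handed to the inner Krylov solver via `inner_M`:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla
from scipy.optimize import newton_krylov

# Toy problem (assumed for illustration): -u'' + u**3 = 1 on (0, 1),
# u(0) = u(1) = 0, discretized on a uniform interior grid of n points.
n = 50
h = 1.0 / (n + 1)
L = sp.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n)) / h**2  # discrete u''

def F(u):
    return -L @ u + u**3 - 1.0  # nonlinear residual

# Preconditioner: factorize the dominant linear part (-L) once and let the
# inner Krylov iterations apply its inverse through inner_M.
M = spla.LinearOperator((n, n), matvec=spla.factorized(sp.csc_matrix(-L)))

u_sol = newton_krylov(F, np.zeros(n), inner_M=M, f_tol=1e-7)
print(np.linalg.norm(F(u_sol)))  # residual norm at the computed solution
```

Without `inner_M`, the inner GMRES iterations see the raw, badly conditioned operator (entries grow like 1/h²); with the factorized preconditioner, each Newton step typically needs only a handful of inner iterations.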

Here are some related posts you may find helpful:

When is Newton-Krylov not an appropriate solver?

Why is Newton's method not converging?

Why is my iterative linear solver not converging?

Answer from whpowell96 on Stack Exchange
🌐
SciPy
docs.scipy.org › doc › scipy › reference › optimize.html
Optimization and root finding (scipy.optimize) — SciPy v1.17.0 Manual
SciPy optimize provides functions for minimizing (or maximizing) objective functions, possibly subject to constraints. It includes solvers for nonlinear problems (with support for both local and global optimization algorithms), linear programming, ...
🌐
SciPy
docs.scipy.org › doc › scipy › tutorial › optimize.html
Optimization (scipy.optimize) — SciPy v1.17.0 Manual
The solution can, however, be found using one of the large-scale solvers, for example krylov, broyden2, or anderson. These use what is known as the inexact Newton method, which instead of computing the Jacobian matrix exactly, forms an approximation for it. ...
Discussions

optimization - Is there a high quality nonlinear programming solver for Python? - Computational Science Stack Exchange
Currently I use MATLAB's Optimization Toolbox (specifically, fmincon() with algorithm='sqp'), which is quite effective. However, most of my code is in Python, and I'd love to do the optimization in Python as well. Is there an NLP solver with Python bindings that can compete with fmincon()? More on scicomp.stackexchange.com
🌐 scicomp.stackexchange.com
November 30, 2011
Is scipy.optimize a good replacement for Excel Solver’s constraint-based optimization?
I tried scipy.optimize, and then I tried implementing a Newton-Raphson algorithm myself using NumPy. The NumPy version was faster by 8000 times for my use case (80 seconds vs. 0.01). More on reddit.com
🌐 r/learnpython
June 18, 2025
Python optimization with a solver - Stack Overflow
I have a question about solvers. I've read a lot about them, including here on Stack Overflow, but I still have some questions about their mode of operation. I want to start with a simple scipy.optimize. More on stackoverflow.com
🌐 stackoverflow.com
Optimization Libraries
Optimization is one of the topics Python is pretty hot for. Often, the optimization code is written in C/C++ with the Python bindings being the preferred usage. There is scipy.optimize (see the minimize function), which is usually the jumping-off point. CVXOPT = "Convex Opt" -- (local) solvers like Nelder-Mead, BFGS, etc., for solving convex problems. For nonconvex problems (i.e., ones with multiple minima) you need a global solver (simulated annealing, Metropolis-Hastings, etc.). Pyswarms gives you particle swarm optimizers. Mystic is quite similar to scipy.optimize.minimize, with a very different interface. More on reddit.com
🌐 r/Python
March 25, 2019
🌐
Gurobi Optimization
gurobi.com › solutions › gurobi-optimizer
Gurobi Optimizer | Gurobi
Our global support team includes PhD mathematicians and experienced practitioners who understand optimization at the deepest level. They'll help you diagnose solver behavior, improve model performance, and get unstuck fast. Over 1,500 companies—including SAP, Air France, and the NFL—rely on Gurobi to power the decisions that keep their operations running. Gurobi integrates easily with modern analytics and development environments. With flexible APIs—including a widely used Python API—teams can build and deploy optimization models directly within applications, services, and data pipelines.
🌐
DataCamp
datacamp.com › tutorial › optimization-in-python
Optimization in Python: Techniques, Packages, and Best Practices | DataCamp
August 31, 2024 - Popular Python packages for numerical optimization include SciPy (for general-purpose optimization), CVXPY (for convex optimization), Pyomo (for flexible modeling), and powerful solvers like Gurobi and CPLEX, which are suited for large-scale industry applications.
🌐
Real Python
realpython.com › linear-programming-python
Hands-On Linear Programming: Optimization With Python – Real Python
June 16, 2023 - PuLP is a Python linear programming API for defining problems and invoking external solvers. SciPy is straightforward to set up. Once you install it, you’ll have everything you need to start. Its subpackage scipy.optimize can be used for both linear and nonlinear optimization.
Top answer
1 of 16

fmincon(), as you mentioned, employs several strategies that are well-known in nonlinear optimization that attempt to find a local minimum without much regard for whether the global optimum has been found. If you're okay with this, then I think you have phrased the question correctly (nonlinear optimization).

The best package I'm aware of for general nonlinear optimization is IPOPT[1]. Apparently Matthew Xu maintains a set of Python bindings to IPOPT, so this might be somewhere to start.

UPDATE: the currently maintained Python bindings for IPOPT seem to be ipyopt

[1]: Andreas Wachter is a personal friend, so I may be a bit biased.

2 of 16

I work in a lab that does global optimization of mixed-integer and non-convex problems. My experience with open-source optimization solvers has been that even the better ones, typically written in a compiled language, fare poorly compared to commercial optimization packages.

If you can formulate your problem as an explicit system of equations and need a free solver, your best bet is probably IPOPT, as Aron said. Other free solvers can be found on the COIN-OR web site. To my knowledge, the nonlinear solvers do not have Python bindings provided by the developers; any bindings you find would be third-party. In order to obtain good solutions, you would also have to wrap any nonlinear, convex solver you found in appropriate stochastic global optimization heuristics, or in a deterministic global optimization algorithm such as branch-and-bound. Alternatively, you could use Bonmin or Couenne, both of which are deterministic non-convex optimization solvers that perform serviceably well compared to the state-of-the-art solver, BARON.

If you can purchase a commercial optimization solver, you might consider looking at the GAMS modeling language, which includes several nonlinear optimization solvers. Of particular mention are the interfaces to the solvers CONOPT, SNOPT, and BARON. (CONOPT and SNOPT are convex solvers.) A kludgey solution that I've used in the past is to use the Fortran (or Matlab) language bindings to GAMS to write a GAMS file and call GAMS from Fortran (or Matlab) to calculate the solution of an optimization problem. GAMS has Python language bindings, and a very responsive support staff willing to help out if there's any trouble. (Disclaimer: I have no affiliation with GAMS, but my lab does own a GAMS license.) The commercial solvers should be no worse than fmincon; in fact, I'd be surprised if they weren't a lot better. If your problems are sufficiently small in size, then you may not even need to purchase a GAMS license and licenses to solvers, because an evaluation copy of GAMS may be downloaded from their web site. Otherwise, you would probably want to decide which solvers to purchase in conjunction with a GAMS license. It's worth noting that BARON requires a mixed-integer linear programming solver, and that licenses for the two best mixed-integer linear programming solvers CPLEX and GUROBI are free for academics, so you might be able to get away with just purchasing the GAMS interfaces rather than the interfaces and the solver licenses, which can save you quite a bit of money.

This point bears repeating: for any of the deterministic non-convex optimization solvers I've mentioned above, you need to be able to formulate the model as an explicit set of equations. Otherwise, the non-convex optimization algorithms won't work, because all of them rely on symbolic analysis to construct convex relaxations for branch-and-bound-like algorithms.

UPDATE: One thought that hadn't occurred to me at first was that you could also call the Toolkit for Advanced Optimization (TAO) and PETSc using tao4py and petsc4py, which would have the potential added benefit of easier parallelization, and leveraging familiarity with PETSc and the ACTS tools.

UPDATE #2: Based on the additional information you mentioned, sequential quadratic programming (SQP) methods are going to be your best bet. SQP methods are generally considered more robust than interior-point methods, but have the drawback of requiring dense linear solves. Since you care more about robustness than speed, SQP wins out. I can't find a good SQP solver written in Python (and apparently, neither could Sven Leyffer at Argonne in this technical report). I'm guessing that packages like SciPy and OpenOpt implement the basic skeleton of some SQP algorithms, but without the specialized heuristics that more advanced codes use to overcome convergence issues. You could try NLopt, written by Steven Johnson at MIT. I don't have high hopes for it because it doesn't have any reputation that I know of, but Steven Johnson is a brilliant guy who writes good software (after all, he did co-write FFTW). It does implement a version of SQP; if it's good software, let me know.
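For what it's worth, SciPy's `SLSQP` method is the kind of basic SQP skeleton described above. A minimal sketch on an assumed toy problem (constrained Rosenbrock, chosen here purely for illustration):

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem assumed for illustration: minimize the Rosenbrock function
# subject to the linear inequality x0 + x1 <= 1.
def rosen(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

# SciPy's inequality convention is g(x) >= 0, so encode
# x0 + x1 <= 1 as 1 - x0 - x1 >= 0.
cons = [{'type': 'ineq', 'fun': lambda x: 1.0 - x[0] - x[1]}]

res = minimize(rosen, x0=[0.0, 0.0], method='SLSQP', constraints=cons)
print(res.x, res.fun, res.success)
```

The unconstrained minimum (1, 1) violates the constraint, so the solver lands on the boundary x0 + x1 = 1, which is exactly the situation where an SQP method's constraint handling matters.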

I was hoping that TAO would have something in the way of a constrained optimization solver, but it doesn't. You could certainly use what they have to build one up; they have a lot of the components there. As you pointed out, though, it'd be much more work for you to do that, and if you're going to that sort of trouble, you might as well be a TAO developer.

With that additional information, you are more likely to get better results calling GAMS from Python (if that's an option at all), or trying to patch up the IPOPT Python interface. Since IPOPT uses an interior point method, it won't be as robust, but maybe Andreas' implementation of an interior point method is considerably better than Matlab's implementation of SQP, in which case, you may not be sacrificing robustness at all. You'd have to run some case studies to know for sure.

You're already aware of the trick to reformulate the rational inequality constraints as polynomial inequality constraints (it's in your book); the reason this would help BARON and some other nonconvex solvers is that it can use term analysis to generate additional valid inequalities that it can use as cuts to improve and speed up solver convergence.

Excluding the GAMS Python bindings and the Python interface to IPOPT, the answer is no, there aren't any high quality nonlinear programming solvers for Python yet. Maybe @Dominique will change that with NLPy.

UPDATE #3: More wild stabs at finding a Python-based solver yielded PyGMO, which is a set of Python bindings to PaGMO, a C++-based global multiobjective optimization solver. Although it was created for multiobjective optimization, it can also be used for single-objective nonlinear programming, and it has Python interfaces to IPOPT and SNOPT, among other solvers. It was developed within the European Space Agency, so hopefully there's a community behind it. It was also released relatively recently (November 24, 2011).

🌐
W3Schools
w3schools.com › python › scipy › scipy_optimizers.php
SciPy Optimizers
🌐
Real Python
realpython.com › python-scipy-cluster-optimize
Scientific Python: Using SciPy for Optimization – Real Python
July 21, 2023 - As expected, the minimum was found at x = -1/√2. Note the additional output from this method, which includes a message attribute in res. This field is often used for more detailed output from some of the minimization solvers. ... scipy.optimize also includes the more general minimize().
🌐
Google
developers.google.com › or-tools › get started with or-tools for python
Get Started with OR-Tools for Python | Google for Developers
OR-Tools for Python helps find the best solution to a problem by defining an objective and constraints. The library provides solvers for various optimization problems, including linear optimization, mixed-integer programming, constraint programming, ...
🌐
Medium
medium.com › @chongjingting › 4-ways-to-solve-linear-programming-in-python-b4af36b7894d
4 Ways to Solve Linear Programming in Python | by Chong Jing Ting | Medium
March 10, 2022 - The first option is SciPy’s optimize.linprog. It is quite easy to use, considering many Python users are familiar with the SciPy library. A plus point is that it interfaces with HiGHS, a high-performance linear programming solver. However, SciPy’s linprog only solves minimization problems.
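A quick sketch of the standard workaround for that minimization-only restriction — negate the objective and flip the sign of the optimum (the toy LP below is assumed for illustration):

```python
from scipy.optimize import linprog

# Toy LP assumed for illustration:
#   maximize   z = x + 2y
#   subject to 2x + y <= 20, -4x + 5y <= 10, x >= 0, y >= 0
# linprog only minimizes, so minimize -z and negate the result afterwards.
res = linprog(c=[-1, -2],
              A_ub=[[2, 1], [-4, 5]],
              b_ub=[20, 10],
              bounds=[(0, None), (0, None)],
              method='highs')
print(res.x, -res.fun)  # optimal point and the maximized objective
```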
🌐
Caam37830
caam37830.github.io › book › 03_optimization › scipy_opt.html
Optimization in SciPy — Scientific Computing with Python
If you don’t provide the Hessian, many solvers will numerically approximate it, which will typically not work as well as an explicit Hessian. Again, you can also provide a Hessian to NonlinearConstraint: define `def Hf(x): return np.array([[-np.cos(x[0]), 0], [0, -np.sin(x[1])]])` and call `sol3 = opt.minimize(f, x0, jac=Jf, hess=Hf)`.
🌐
Solvermax
solvermax.com › home › resources › links
Solver Max - Optimization modelling in Python
PuLP is free, open source software written in Python. It is used to describe optimisation problems as mathematical models. PuLP can generate MPS or LP files to solve linear and integer problems using any of the following solvers:
🌐
Kindsonthegenius
kindsonthegenius.com › data-science › solving-an-optimization-problem-with-python-step-by-step
Solving an Optimization Problem with Python – Step by Step – Data Science Tutorials
January 20, 2021 - Display the results with `if status == cp_model.OPTIMAL:` followed by `print` calls on `solver.ObjectiveValue()` and `solver.Value(x)`, `solver.Value(y)`, `solver.Value(z)`. The output is: Value of objective function: 35, x = 7, y = 3, z = 5. The complete program is given in the post.
🌐
Reddit
reddit.com › r/python › optimization libraries
r/Python on Reddit: Optimization Libraries
March 25, 2019 -

I had a class on optimization and used AMPL for the work. The underlying solvers were CPLEX, GUROBI, Minos, and others. These appear to be a mix of commercial and open-source solvers, but mostly I'm wondering whether they are outdated, or whether Python has a better alternative (Googling it looks to produce pandas/pysci and something called CVXOPT), and I'd just like some general input on this subject. I have yet to try out the various Python packages but would like to know people's thoughts.