Optimization that reads like Python.
📚 Documentation · 🚀 Quickstart · 💡 Examples
**With Optyx**

```python
from optyx import Variable, Problem

x = Variable("x", lb=0)
y = Variable("y", lb=0)

solution = (
    Problem()
    .minimize(x**2 + y**2)
    .subject_to(x + y >= 1)
    .solve()
)
# x=0.5, y=0.5
```

**With SciPy**

```python
from scipy.optimize import minimize
import numpy as np

def objective(v):
    return v[0]**2 + v[1]**2

def gradient(v):  # manual!
    return np.array([2*v[0], 2*v[1]])

result = minimize(
    objective, x0=[1, 1], jac=gradient,
    method='SLSQP',
    bounds=[(0, None), (0, None)],
    constraints={'type': 'ineq',
                 'fun': lambda v: v[0]+v[1]-1},
)
```
Your optimization code should read like your math. With Optyx, `x + y >= 1` is exactly that, not a lambda buried in a constraint dictionary.
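The SciPy version above is runnable as written, and executing it reproduces the optimum quoted in the Optyx comment. This is plain SciPy, shown only as a sanity check:

```python
import numpy as np
from scipy.optimize import minimize

def objective(v):
    return v[0]**2 + v[1]**2

def gradient(v):
    return np.array([2*v[0], 2*v[1]])

result = minimize(
    objective, x0=[1, 1], jac=gradient,
    method="SLSQP",
    bounds=[(0, None), (0, None)],
    constraints={"type": "ineq", "fun": lambda v: v[0] + v[1] - 1},
)
print(result.x, result.fun)  # x and y converge to 0.5; objective -> 0.5
```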
Python has excellent optimization libraries. SciPy provides algorithms. CVXPY handles convex problems. Pyomo scales to industrial applications.
Optyx takes a different path: radical simplicity.
- Write problems as you think them — `x**2 + y**2`, not `lambda v: v[0]**2 + v[1]**2`
- Never compute gradients by hand — symbolic autodiff handles derivatives
- Skip solver configuration — sensible defaults, automatic solver selection
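If symbolic differentiation is unfamiliar, the idea can be sketched with SymPy (used here purely as an illustration; Optyx ships its own autodiff and does not depend on SymPy):

```python
import sympy

x, y = sympy.symbols("x y")
f = x**2 + y**2

# Differentiate symbolically: the result is an exact expression,
# not a finite-difference approximation.
grad = [sympy.diff(f, v) for v in (x, y)]
print(grad)                  # [2*x, 2*y]
print(grad[0].subs(x, 3))    # 6
```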
Optyx is young and opinionated. It's not a replacement for specialized tools:
| Need | Use Instead |
|---|---|
| MILP at scale | Pyomo, OR-Tools, Gurobi |
| Convex guarantees | CVXPY |
| Maximum performance | Raw solver APIs |
But if you want readable optimization code that just works for most problems, Optyx might be for you.
```bash
pip install optyx
```

Requires Python 3.12+, NumPy ≥2.0, SciPy ≥1.6.
**Quickstart**

```python
from optyx import Variable, Problem

x = Variable("x", lb=0)
y = Variable("y", lb=0)

solution = (
    Problem()
    .minimize(x**2 + y**2)
    .subject_to(x + y >= 1)
    .solve()
)
# x=0.5, y=0.5, objective=0.5
```

**Portfolio optimization**

```python
from optyx import Variable, Problem

# Asset weights
tech = Variable("tech", lb=0, ub=1)
energy = Variable("energy", lb=0, ub=1)
finance = Variable("finance", lb=0, ub=1)

# Expected returns and risk (simplified)
returns = 0.12*tech + 0.08*energy + 0.10*finance
risk = tech**2 + energy**2 + finance**2  # variance proxy

solution = (
    Problem()
    .minimize(risk)
    .subject_to(returns >= 0.09)                  # minimum return
    .subject_to((tech + energy + finance).eq(1))  # fully invested
    .solve()
)
```

**Symbolic gradients**

```python
from optyx import Variable
from optyx.core.autodiff import gradient

x = Variable("x")
f = x**3 + 2*x**2 - 5*x + 3

df = gradient(f, x)             # Symbolic: 3x² + 4x - 5
print(df.evaluate({"x": 2.0}))  # 15.0
```

| Feature | Description |
|---|---|
| Natural syntax | `x + y >= 1` instead of constraint dictionaries |
| Automatic gradients | Symbolic differentiation—no manual derivatives |
| Smart solver selection | HiGHS for LP, SLSQP/BFGS for NLP |
| Fast re-solve | Cached compilation, up to 900x speedup |
| Debuggable | Inspect expression trees, understand your model |
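As a sanity check on the portfolio example above (plain SciPy again): under the fully-invested constraint, equal weights minimize the squared-weight risk proxy, and the return constraint is slack there because the equal-weight return is 0.10 ≥ 0.09:

```python
import numpy as np
from scipy.optimize import minimize

res = minimize(
    lambda w: float(np.sum(w**2)),  # risk proxy: tech² + energy² + finance²
    x0=np.array([0.5, 0.3, 0.2]),
    method="SLSQP",
    bounds=[(0, 1)] * 3,
    constraints=[
        # minimum expected return: 0.12*tech + 0.08*energy + 0.10*finance >= 0.09
        {"type": "ineq", "fun": lambda w: 0.12*w[0] + 0.08*w[1] + 0.10*w[2] - 0.09},
        # fully invested: weights sum to 1
        {"type": "eq", "fun": lambda w: w.sum() - 1},
    ],
)
print(res.x)  # roughly [1/3, 1/3, 1/3]
```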
See the documentation for the full API reference, tutorials, and real-world examples.
Optyx is actively evolving:
- Vector/Matrix variables — Handle thousands of decision variables cleanly
- JIT compilation — Faster execution for complex models
- More solvers — IPOPT integration for large-scale NLP
- Better debugging — Infeasibility diagnostics and model inspection
See the roadmap for details.
```bash
git clone https://github.com/daggbt/optyx.git
cd optyx
uv sync
uv run pytest
```

Contributions welcome! See our contributing guide.
MIT