Commit a1b5735 (1 parent: ba26787)

Add how-to add an algorithm guide and improve documentation of some internal classes (#570)

13 files changed: +991 -34 lines

.tools/envs/testenv-linux.yml (+1 -1)

```diff
@@ -7,7 +7,7 @@ dependencies:
 - petsc4py
 - jax
 - cyipopt>=1.4.0 # dev, tests
-- pygmo>=2.19.0 # dev, tests
+- pygmo>=2.19.0 # dev, tests, docs
 - nlopt # dev, tests, docs
 - pip # dev, tests, docs
 - pytest # dev, tests
```

.tools/envs/testenv-numpy.yml (+1 -1)

```diff
@@ -7,7 +7,7 @@ dependencies:
 - pandas>=2
 - numpy<2
 - cyipopt>=1.4.0 # dev, tests
-- pygmo>=2.19.0 # dev, tests
+- pygmo>=2.19.0 # dev, tests, docs
 - nlopt # dev, tests, docs
 - pip # dev, tests, docs
 - pytest # dev, tests
```

.tools/envs/testenv-others.yml (+1 -1)

```diff
@@ -5,7 +5,7 @@ channels:
 - nodefaults
 dependencies:
 - cyipopt>=1.4.0 # dev, tests
-- pygmo>=2.19.0 # dev, tests
+- pygmo>=2.19.0 # dev, tests, docs
 - nlopt # dev, tests, docs
 - pip # dev, tests, docs
 - pytest # dev, tests
```

.tools/envs/testenv-pandas.yml (+1 -1)

```diff
@@ -7,7 +7,7 @@ dependencies:
 - pandas<2
 - numpy<2
 - cyipopt>=1.4.0 # dev, tests
-- pygmo>=2.19.0 # dev, tests
+- pygmo>=2.19.0 # dev, tests, docs
 - nlopt # dev, tests, docs
 - pip # dev, tests, docs
 - pytest # dev, tests
```

docs/rtd_environment.yml (+2 -1)

```diff
@@ -4,7 +4,7 @@ channels:
 - conda-forge
 - nodefaults
 dependencies:
-- python=3.10
+- python=3.11
 - typing-extensions
 - pip
 - setuptools_scm
@@ -29,6 +29,7 @@ dependencies:
 - plotly
 - nlopt
 - annotated-types
+- pygmo>=2.19.0
 - pip:
   - ../
   - kaleido
```

docs/source/explanation/internal_optimizers.md (+88 -22)

````diff
@@ -9,48 +9,114 @@ internal optimizer interface.
 
 The advantages of using the algorithm with optimagic over using it directly are:
 
+- You can collect the optimizer history and create criterion_plots and params_plots.
+- You can use flexible formats for your start parameters (e.g. nested dicts or
+  namedtuples).
 - optimagic turns unconstrained optimizers into constrained ones.
 - You can use logging.
 - You get great error handling for exceptions in the criterion function or gradient.
-- You get a parallelized and customizable numerical gradient if the user did not provide
-  a closed form gradient.
-- You can compare your optimizer with all the other optimagic optimizers by changing
-  only one line of code.
+- You get a parallelized and customizable numerical gradient if you don't have a closed
+  form gradient.
+- You can compare your optimizer with all the other optimagic optimizers on our
+  benchmark sets.
 
 All of this functionality is achieved by transforming a more complicated user provided
 problem into a simpler problem and then calling "internal optimizers" to solve the
 transformed problem.
 
-## The internal optimizer interface
+(functions_and_classes_for_internal_optimizers)=
 
-(to be written)
+## Functions and classes for internal optimizers
+
+The functions and classes below are everything you need to know to add an optimizer to
+optimagic. To see them in action, look at
+[this guide](../how_to/how_to_add_optimizers.ipynb).
+
+```{eval-rst}
+.. currentmodule:: optimagic.mark
+```
+
+```{eval-rst}
+.. dropdown:: mark.minimizer
+
+    The `mark.minimizer` decorator is used to provide algorithm specific information to
+    optimagic. This information is used in the algorithm selection tool, for better
+    error handling and for processing of the user provided optimization problem.
+
+    .. autofunction:: minimizer
+```
+
+```{eval-rst}
+.. currentmodule:: optimagic.optimization.internal_optimization_problem
+```
+
+```{eval-rst}
+.. dropdown:: InternalOptimizationProblem
+
+    The `InternalOptimizationProblem` is optimagic's internal representation of objective
+    functions, derivatives, bounds, constraints, and more. This representation is already
+    pretty close to what most algorithms expect (e.g. parameters and bounds are flat
+    numpy arrays, no matter which format the user provided).
+
+    .. autoclass:: InternalOptimizationProblem()
+        :members:
+```
 
-## Output of internal optimizers
+```{eval-rst}
+.. currentmodule:: optimagic.optimization.algorithm
+```
+
+```{eval-rst}
+.. dropdown:: InternalOptimizeResult
+
+    This is what you need to create from the output of a wrapped algorithm.
+
+    .. autoclass:: InternalOptimizeResult
+        :members:
+```
+
+```{eval-rst}
+.. dropdown:: Algorithm
+
+    .. autoclass:: Algorithm
+        :members:
+        :exclude-members: with_option_if_applicable
+```
 
 (naming-conventions)=
 
 ## Naming conventions for algorithm specific arguments
 
-Many optimizers have similar but slightly different names for arguments that configure
-the convergence criteria, other stopping conditions, and so on. We try to harmonize
-those names and their default values where possible.
-
-Since some optimizers support many tuning parameters we group some of them by the first
-part of their name (e.g. all convergence criteria names start with `convergence`). See
-{ref}`list_of_algorithms` for the signatures of the provided internal optimizers.
+To make switching between different algorithms as simple as possible, we align the names
+of commonly used convergence and stopping criteria. We also align the default values for
+stopping and convergence criteria as much as possible.
 
-The preferred default values can be imported from `optimagic.optimization.algo_options`
-which are documented in {ref}`algo_options`. If you add a new optimizer to optimagic you
-should only deviate from them if you have good reasons.
+You can find the harmonized names and values [here](algo_options_docs).
 
-Note that a complete harmonization is not possible nor desirable, because often
-convergence criteria that clearly are the same are implemented slightly different for
-different optimizers. However, complete transparency is possible and we try to document
-the exact meaning of all options for all optimizers.
+To align the names of other tuning parameters as much as possible with what is already
+there, simply have a look at the optimizers we already wrapped. For example, if you are
+wrapping a bfgs or lbfgs algorithm from some library, look at all existing wrappers of
+bfgs algorithms and use the same names for the same options.
 
 ## Algorithms that parallelize
 
-(to be written)
+Algorithms that evaluate the objective function or derivatives in parallel should only
+do so via `InternalOptimizationProblem.batch_fun`,
+`InternalOptimizationProblem.batch_jac` or
+`InternalOptimizationProblem.batch_fun_and_jac`.
+
+If you parallelize in any other way, the automatic history collection will stop working.
+
+In that case, call `om.mark.minimizer` with `disable_history=True`. You can then either
+do your own history collection and add that history to `InternalOptimizeResult`, or the
+user has to rely on logging.
 
 ## Nonlinear constraints
````
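
Taken together, the classes documented in this diff suggest what a wrapped optimizer looks like. Below is a minimal, hypothetical sketch modeled on the guide linked in the new docs (`how_to_add_optimizers.ipynb`); the exact `mark.minimizer` arguments and the `_solve_internal_problem` hook name are assumptions that may differ across optimagic versions.

```python
# Hypothetical sketch of a wrapped optimizer, based on the classes named in
# this commit (mark.minimizer, Algorithm, InternalOptimizationProblem,
# InternalOptimizeResult). The decorator flags shown here are illustrative
# and incomplete (capability flags such as is_global or supports_bounds are
# omitted); check the linked how-to guide for the real interface.
from dataclasses import dataclass

import numpy as np
from numpy.typing import NDArray

import optimagic as om
from optimagic.optimization.algorithm import Algorithm, InternalOptimizeResult
from optimagic.optimization.internal_optimization_problem import (
    InternalOptimizationProblem,
)


@om.mark.minimizer(
    name="my_gradient_descent",  # used by the algorithm selection tool
    needs_jac=True,  # optimagic supplies a numerical jacobian if the user has none
    disable_history=False,  # keep automatic history collection enabled
)
@dataclass(frozen=True)
class MyGradientDescent(Algorithm):
    # Tuning parameters follow the harmonized naming conventions above.
    learning_rate: float = 0.01
    stopping_maxiter: int = 1_000

    def _solve_internal_problem(
        self, problem: InternalOptimizationProblem, x0: NDArray[np.float64]
    ) -> InternalOptimizeResult:
        # Parameters arrive as a flat numpy array, regardless of the format
        # the user chose for their start parameters.
        x = x0
        for _ in range(self.stopping_maxiter):
            x = x - self.learning_rate * problem.jac(x)
        return InternalOptimizeResult(x=x, fun=problem.fun(x))
```

With something like this in place, the algorithm could be passed to `om.minimize(fun, params, algorithm=MyGradientDescent)` and would get history collection, plotting, and constraint handling for free.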

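The parallelization rule above can also be made concrete. The sketch below shows how a population-based algorithm might route all evaluations through `batch_fun` so that automatic history collection keeps working; the `n_cores` keyword is an assumption about the `batch_fun` signature, so consult the `InternalOptimizationProblem` reference in the diff for the exact interface.

```python
import numpy as np


def evaluate_generation(problem, candidates, n_cores=4):
    """Evaluate one generation of a hypothetical population-based optimizer.

    Routing the evaluations through problem.batch_fun (instead of a private
    process pool) is what keeps optimagic's automatic history collection
    working. The n_cores keyword is an assumption about batch_fun's signature.
    """
    # candidates: list of flat numpy arrays, one parameter vector per entry
    fvals = problem.batch_fun(candidates, n_cores=n_cores)
    best = int(np.argmin(fvals))
    return candidates[best], fvals[best]
```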