Derivative-free optimization
Derivative-free optimization is a branch of mathematical optimization. The term may refer either to problems for which derivative information is unavailable, unreliable, or impractical to obtain (derivative-free optimization problems), or to methods that do not use derivatives (derivative-free optimization methods).[1]
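As a minimal illustrative sketch (not drawn from the cited source), the idea of optimizing with function values alone can be shown with a simple random search: candidate points are proposed around the current best point and accepted only if they improve the objective, so no gradient is ever computed. The step size, iteration count, and Rosenbrock test function below are arbitrary choices for demonstration.

```python
import numpy as np

def random_search(f, x0, step=0.5, iters=5000, seed=0):
    """Minimize a black-box function f by pure random search.

    Only function evaluations are used; no derivative information is required.
    """
    rng = np.random.default_rng(seed)
    x_best = np.asarray(x0, dtype=float)
    f_best = f(x_best)
    for _ in range(iters):
        candidate = x_best + step * rng.standard_normal(x_best.shape)
        f_cand = f(candidate)
        if f_cand < f_best:          # accept only improving points
            x_best, f_best = candidate, f_cand
    return x_best, f_best

# Example objective treated as a black box (hypothetical test case)
rosenbrock = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
x_opt, f_opt = random_search(rosenbrock, x0=[-1.0, 1.0])
print(x_opt, f_opt)
```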
Algorithms
- DONE
- Bayesian optimization
- CMA-ES
- Coordinate descent and adaptive coordinate descent
- Cuckoo search
- Genetic algorithms
- MCS algorithm
- Nelder-Mead method (see the sketch after this list)
- Particle swarm optimization
- Pattern search
- Powell's COBYLA, UOBYQA, NEWUOA, BOBYQA and LINCOA algorithms
- Random search
- Shuffled complex evolution algorithm
- Simulated annealing
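As a usage sketch, one of the listed methods, the Nelder-Mead simplex method, is available through SciPy's generic minimize routine, assuming SciPy is installed; the objective, starting point, and tolerances below are arbitrary choices for demonstration rather than recommendations.

```python
import numpy as np
from scipy.optimize import minimize

# Black-box objective: only function evaluations are available
def objective(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

result = minimize(
    objective,
    x0=np.array([-1.0, 1.0]),
    method="Nelder-Mead",        # simplex-based, derivative-free
    options={"xatol": 1e-8, "fatol": 1e-8, "maxiter": 5000},
)
print(result.x, result.fun)
```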
References
1. Conn, A. R.; Scheinberg, K.; Vicente, L. N. (2009). Introduction to Derivative-Free Optimization. MPS-SIAM Book Series on Optimization. Philadelphia: SIAM.