Derivative Free Optimization | Derivative Based Optimization |
---|---|
It does not use the objective function's derivative (gradient) information. | It uses the objective function's derivative (gradient) information to determine search directions. |
It makes use of evolutionary concepts. | It does not make use of evolutionary concepts. |
It is slower than derivative based optimization. | It is faster than derivative free optimization. |
It uses a random number generator to find the search directions. | It does not use a random number generator to find the search directions. |
Little analysis is possible due to the randomness involved. | Analysis is performed at every step. |
The objective function need not be differentiable. | The objective function must be differentiable. |
Natural wisdom based on evolution and thermodynamics is used. | No natural wisdom is used. |
Technique: Simulated Annealing. | Techniques: Descent Method & Newton's Method. |
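To make the contrast concrete, below is a minimal sketch (not from the source) of simulated annealing, the derivative-free technique named in the table, applied to a hypothetical 1-D objective. The step size, initial temperature, and cooling schedule are illustrative assumptions; note that the search direction comes from a random number generator and no derivative of the objective is evaluated.

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.95, iters=500):
    """Minimal simulated annealing sketch for minimizing a 1-D objective f."""
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(iters):
        # Random perturbation replaces derivative-based search directions.
        x_new = x + random.uniform(-step, step)
        f_new = f(x_new)
        # Always accept better points; accept worse points with a
        # temperature-dependent probability (thermodynamics analogy).
        if f_new < fx or random.random() < math.exp((fx - f_new) / t):
            x, fx = x_new, f_new
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling  # cooling schedule lowers the acceptance of bad moves
    return best_x, best_f

# Hypothetical usage: minimize f(x) = (x - 3)^2 starting from x = 0.
print(simulated_annealing(lambda x: (x - 3) ** 2, x0=0.0))
```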
Derivative free optimization
- Derivative free optimization relies on repeated evaluation of the objective function.
- The concepts are based on nature's wisdom, such as evolution and thermodynamics.
- These methods are analytically opaque: knowledge about them is based on empirical studies.
Stopping conditions in derivative free optimization:
Let k denote the iteration count and f_k denote the best objective function value obtained at iteration k. The stopping condition can be based on (a small sketch of these checks follows the list):
- Computation time: stop after a preset amount of computation time has elapsed.
- Optimization goal: stop when f_k falls below a preset goal value.
- Minimal improvement: stop when the improvement f_(k-1) - f_k between successive iterations is less than a preset value.
- Minimal relative improvement: stop when the relative improvement (f_(k-1) - f_k) / f_(k-1) is less than a preset value.
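The sketch below shows one way these four stopping conditions could be wired around the repeated objective-function evaluations. The helper `sample_candidate`, the parameter names, and the default thresholds are illustrative assumptions, not taken from the source.

```python
import random
import time

def minimize_derivative_free(f, sample_candidate, x0, max_seconds=5.0,
                             goal=1e-6, min_improvement=1e-10,
                             min_rel_improvement=1e-8, max_iters=100_000):
    """Repeated evaluation of f, stopped by the four conditions listed above."""
    start = time.time()
    x, fk = x0, f(x0)
    for k in range(1, max_iters + 1):
        x_new = sample_candidate(x)               # e.g. a random perturbation
        f_new = f(x_new)
        if f_new < fk:                            # keep only improving points
            improvement = fk - f_new
            f_prev, x, fk = fk, x_new, f_new
            if improvement < min_improvement:     # minimal improvement
                return x, fk, "minimal improvement"
            if improvement / max(abs(f_prev), 1e-12) < min_rel_improvement:
                return x, fk, "minimal relative improvement"
        if fk <= goal:                            # optimization goal reached
            return x, fk, "goal reached"
        if time.time() - start > max_seconds:     # computation time budget
            return x, fk, "time limit"
    return x, fk, "iteration limit"

# Hypothetical usage: minimize (x - 3)^2 with random perturbation candidates.
print(minimize_derivative_free(lambda x: (x - 3) ** 2,
                               lambda x: x + random.uniform(-0.5, 0.5),
                               x0=0.0))
```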
Derivative based optimization
- Derivative based optimization deals with gradient-based optimization techniques, which determine search directions from an objective function's derivative information.
It is used in optimizing non-linear neuro-fuzzy models. Common techniques include (a minimal steepest-descent sketch follows the list):
– Steepest descent
– Conjugate gradient
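As a simple illustration of how derivative information drives the search, here is a minimal 1-D steepest-descent sketch; the learning rate, tolerance, and example objective are assumptions chosen for illustration.

```python
def steepest_descent(grad, x0, lr=0.1, tol=1e-8, max_iters=1000):
    """Minimal steepest-descent sketch: the search direction is the
    negative gradient of the objective at the current point."""
    x = x0
    for _ in range(max_iters):
        g = grad(x)
        if abs(g) < tol:          # gradient near zero: (local) minimum reached
            break
        x = x - lr * g            # step against the gradient
    return x

# Hypothetical usage: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
print(steepest_descent(lambda x: 2 * (x - 3), x0=0.0))
```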