1. What
Broyden-Fletcher-Goldfarb-Shanno
- Quasi-Newton optimization algorithm
- Gradient-based: uses the first derivative of the objective function to improve the solution
- Algorithms in this family differ in how they update the parameters:
    - first derivative only (e.g. gradient descent)
    - Hessian matrix – BFGS
- Maintains an approximation of the Hessian instead of computing it exactly
2. How
To find a minimum, the gradient must be known or computed at each step. After taking a parameter step $\Delta \theta_k = \theta_{k+1} - \theta_k$ with gradient change $\Delta g_k = g_{k+1} - g_k$, the Hessian approximation $B_k$ is updated as:
$$
B_{k+1} = B_k + \frac{\Delta g_k (\Delta g_k)^T}{(\Delta g_k)^T \Delta \theta_k} - \frac{B_k \Delta \theta_k (\Delta \theta_k)^T B_k}{(\Delta \theta_k)^T B_k \Delta \theta_k}
$$
```python
import numpy as np
```
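The update above can be sketched in NumPy. This is a minimal illustrative implementation, assuming a backtracking (Armijo) line search and using the Rosenbrock function as a test objective; the function names (`bfgs`) and all hyperparameters are choices made here, not part of the original notes:

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-6, max_iter=200):
    """Minimize f with BFGS, maintaining a Hessian approximation B
    updated from the parameter step s = Δθ and gradient change y = Δg."""
    x = np.asarray(x0, dtype=float)
    B = np.eye(x.size)                 # initial Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -np.linalg.solve(B, g)     # search direction: solve B p = -g
        # simple backtracking line search (Armijo sufficient-decrease rule)
        t = 1.0
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5
        s = t * p                      # Δθ: step taken in parameter space
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                  # Δg: change in the gradient
        sy = s @ y
        if sy > 1e-10:                 # curvature condition; skip update otherwise
            Bs = B @ s
            B = B + np.outer(y, y) / sy - np.outer(Bs, Bs) / (s @ Bs)
        x, g = x_new, g_new
    return x

# usage: minimize the Rosenbrock function from (0, 0); minimum is at (1, 1)
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_star = bfgs(f, grad, [0.0, 0.0])
```

The update is skipped when the curvature condition $(\Delta \theta_k)^T \Delta g_k > 0$ fails, which keeps $B$ positive definite so the search direction remains a descent direction.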
3. Ref
ChatGPT