Introduction
The conjugate gradient method sits between the steepest descent method and Newton's method: it uses only first-derivative information, yet it overcomes the slow convergence of steepest descent while avoiding Newton's need to store and compute the Hessian matrix and its inverse. It is not only one of the most useful methods for solving large-scale linear systems, but also one of the most efficient algorithms for large-scale nonlinear optimization.
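As a rough illustration of the idea (not the code shipped in this package), the following Python sketch solves a symmetric positive definite linear system Ax = b with the linear conjugate gradient method; the function name `conjugate_gradient` and the parameters `tol` and `max_iter` are my own choices for the example.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Solve A x = b for a symmetric positive definite A by conjugate gradients."""
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float)
    if max_iter is None:
        max_iter = n  # in exact arithmetic CG converges in at most n steps
    r = b - A @ x     # residual; equals the negative gradient of 0.5*x^T A x - b^T x
    p = r.copy()      # first search direction is the steepest descent direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)  # exact line search step along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        # beta = rs_new/rs_old keeps successive directions A-conjugate,
        # so only first-derivative (residual) information is ever needed
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Usage example on a small SPD system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))  # approx [0.0909, 0.6364]
```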