PyTorch optimizer based on nonlinear conjugate gradient method
Updated Apr 25, 2025 · Python
A comparative optimization study of logistic regression for fake news detection, evaluating gradient descent, conjugate gradient, Newton, and L-BFGS methods on large-scale TF-IDF text features for convergence behavior, runtime, and memory efficiency.
A python package that extends PyTorch by providing Conjugate Gradient optimization algorithms.
keywords: nonlinear optimization, pattern search, augmented Lagrangian, Karush-Kuhn-Tucker, constrained optimization, conjugate gradient methods, quasi-Newton methods, line search descent methods, one-dimensional and multidimensional optimization
An application for visualizing and comparing multidimensional optimization methods that use the derivative of the objective function: the Gradient Descent, Newton, and Fletcher-Reeves methods.
🔍 Optimize fake news detection using binary classification, with a focus on convergence speed and efficiency across various numerical optimization methods.
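For readers unfamiliar with the method these repositories share, a minimal sketch of nonlinear conjugate gradient with the Fletcher-Reeves update is shown below. It is a generic NumPy illustration, not taken from any of the listed projects; the function names, the backtracking line search, and the stopping tolerance are all assumptions chosen for brevity.

```python
import numpy as np

def fletcher_reeves(f, grad, x0, tol=1e-8, max_iter=1000):
    """Minimize f via nonlinear conjugate gradient (Fletcher-Reeves).

    Illustrative sketch: uses a simple Armijo backtracking line search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial search direction is steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search satisfying the Armijo condition
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves coefficient: beta = ||g_new||^2 / ||g||^2
        beta = g_new.dot(g_new) / g.dot(g)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage example on a simple quadratic with minimum at (1, 2)
f = lambda x: (x[0] - 1) ** 2 + 10 * (x[1] - 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] - 2)])
x_star = fletcher_reeves(f, grad, np.zeros(2))
```

On quadratics like this, conjugate gradient converges far faster than plain gradient descent because successive search directions are conjugate with respect to the Hessian rather than merely downhill.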