PyHessian is a PyTorch library for second-order-based analysis and training of neural networks
ADAHESSIAN: An Adaptive Second Order Optimizer for Machine Learning
A C++ interface to formulate and solve linear, quadratic, and second-order cone problems
PyTorch implementation of preconditioned stochastic gradient descent
Distributed K-FAC Preconditioner for PyTorch
FEDL: a federated learning algorithm implemented in TensorFlow (Transactions on Networking 2021)
This repository implements FEDL using PyTorch
TensorFlow implementation of preconditioned stochastic gradient descent
Hessian-based stochastic optimization in TensorFlow and Keras
Compatible Intrinsic Triangulations (SIGGRAPH 2022)
PyTorch implementation of the Hessian-free optimizer
Federated Learning using PyTorch. Second-Order for Federated Learning. (IEEE Transactions on Parallel and Distributed Systems 2022)
This package is dedicated to high-order optimization methods; all of them can be used like standard PyTorch optimizers.
Minimalist deep learning library with first- and second-order optimization algorithms, made for educational purposes
LIBS2ML: A Library for Scalable Second Order Machine Learning Algorithms
Prototyping of matrix free Newton methods in Julia
Subsampled Riemannian trust-region (RTR) algorithms
An efficient and easy-to-use Theano implementation of the stochastic Gauss-Newton method for training deep neural networks.
Second-Order Convergence of Alternating Minimizations
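A recurring building block behind several of the libraries above (PyHessian, the Hessian-free optimizer, Hessian-based stochastic optimization) is estimating the top Hessian eigenvalue by power iteration over Hessian-vector products, rather than forming the Hessian explicitly. The sketch below is illustrative only and not taken from any of these repositories: it uses a tiny explicit 2x2 Hessian in plain Python, where the real libraries would supply `hvp` via autograd.

```python
import math

# Toy loss L(w) = 2*w0^2 + 0.5*w1^2 has the constant Hessian H = diag(4, 1).
# Libraries such as PyHessian never materialize H; they compute H @ v through
# automatic differentiation. An explicit matrix keeps this sketch dependency-free.
H = [[4.0, 0.0],
     [0.0, 1.0]]

def hvp(v):
    """Hessian-vector product H @ v (stand-in for an autograd-based HVP)."""
    return [sum(H[i][j] * v[j] for j in range(2)) for i in range(2)]

def top_eigenvalue(num_iters=100):
    """Power iteration: repeatedly apply H and normalize; the Rayleigh
    quotient of the limit vector is the largest eigenvalue of H."""
    v = [1.0, 1.0]
    for _ in range(num_iters):
        hv = hvp(v)
        norm = math.sqrt(sum(x * x for x in hv))
        v = [x / norm for x in hv]
    hv = hvp(v)
    # Rayleigh quotient v^T H v (v has unit norm after the loop)
    return sum(vi * hvi for vi, hvi in zip(v, hv))

print(top_eigenvalue())  # converges to 4.0, the largest Hessian eigenvalue
```

The same loop drives spectral analysis in second-order tools: a sharp top eigenvalue indicates poor conditioning, and preconditioners (K-FAC, preconditioned SGD) aim to flatten exactly this spectrum.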