Distributed Optimization Methods for Machine Learning

Team-60/Distributed-ML


Distributed-ML

This project implements an optimizer (Nesterov SGD) for training a CNN model on the CIFAR-10 dataset in two settings:

  • Shared memory Hogwild! | Directory: hogwild
  • Distributed Local-SGD | Directory: Local-SGD

The directory optimizer-benchmarks contains benchmarks for various first- and second-order gradient-descent methods:

  • SGD
  • Momentum SGD
  • Nesterov SGD
  • Adagrad
  • RMSProp
  • Adam
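Since Nesterov SGD is the optimizer the project centers on, a compact sketch of its update rule may help. This is a generic NumPy implementation on a hypothetical quadratic objective, not the repository's benchmark code; the function and variable names are illustrative. The key difference from plain momentum SGD is that the gradient is evaluated at the lookahead point `w + momentum * v` rather than at `w` itself.

```python
import numpy as np

def nesterov_sgd(grad_fn, w0, lr=0.05, momentum=0.9, steps=200):
    """Nesterov accelerated gradient on a deterministic objective."""
    w, v = w0.copy(), np.zeros_like(w0)
    for _ in range(steps):
        lookahead = w + momentum * v          # peek ahead along the velocity
        v = momentum * v - lr * grad_fn(lookahead)
        w = w + v
    return w

# Toy objective f(w) = ||w - target||^2, gradient 2(w - target).
target = np.array([1.0, -2.0])
w = nesterov_sgd(lambda w: 2.0 * (w - target), np.zeros(2))
```

Swapping `grad_fn(lookahead)` for `grad_fn(w)` recovers classical momentum SGD, which is what makes the two easy to compare in a benchmark.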