Optimizers & Schedulers

Optimizers

Asteroid relies on torch_optimizer and torch for optimizers. We provide a simple get method that retrieves an optimizer from its string name, which makes it easy to specify optimizers from the command line (see the sketch after the list below).

Here is a list of supported optimizers, retrievable from string:

  • AccSGD
  • AdaBound
  • AdaMod
  • DiffGrad
  • Lamb
  • NovoGrad
  • PID
  • QHAdam
  • QHM
  • RAdam
  • SGDW
  • Yogi
  • Ranger
  • RangerQH
  • RangerVA
  • Adam
  • RMSprop
  • SGD
  • Adadelta
  • Adagrad
  • Adamax
  • AdamW
  • ASGD
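
For example, building an optimizer from its string name can look like the following minimal sketch. It assumes make_optimizer from asteroid.engine.optimizers with signature make_optimizer(params, optimizer="adam", **kwargs), which matches the pattern used in Asteroid recipes; extra keyword arguments are forwarded to the underlying optimizer class.

    import torch

    # Assumed import path for Asteroid's optimizer getter.
    from asteroid.engine.optimizers import make_optimizer

    # A toy model whose parameters we want to optimize.
    model = torch.nn.Linear(16, 16)

    # Retrieve the optimizer from its string name; extra keyword
    # arguments (here, the learning rate) are forwarded to the
    # underlying torch / torch_optimizer class.
    optimizer = make_optimizer(model.parameters(), optimizer="radam", lr=1e-3)

Because the optimizer is selected by name, the choice can come straight from a YAML config or a command-line flag without any code changes.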

Schedulers

Asteroid provides step-wise learning-rate schedulers (updated after every batch rather than every epoch), which integrate with pytorch-lightning through the System class.
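
As a sketch of the wiring (the DPTNetScheduler arguments and the System signature below are assumptions based on Asteroid's recipes, not a guaranteed API), a step-wise scheduler is built around the optimizer and handed to System, which a pytorch_lightning.Trainer can then train:

    import torch
    from torch.utils.data import DataLoader, TensorDataset

    from asteroid.engine.system import System
    from asteroid.engine.schedulers import DPTNetScheduler  # assumed import path

    # Toy model, loss, and data, only to make the example self-contained.
    model = torch.nn.Linear(16, 16)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_func = torch.nn.MSELoss()
    dataset = TensorDataset(torch.randn(32, 16), torch.randn(32, 16))
    train_loader = DataLoader(dataset, batch_size=8)

    # Step-wise scheduler: the learning rate is updated after every batch.
    scheduler = DPTNetScheduler(optimizer, steps_per_epoch=len(train_loader), d_model=16)

    system = System(
        model=model,
        optimizer=optimizer,
        loss_func=loss_func,
        train_loader=train_loader,
        scheduler=scheduler,
    )
    # `system` can now be passed to pytorch_lightning.Trainer(...).fit(system).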
