neuralogic.optim.lr_scheduler

class ArithmeticLR(max_steps: int)[source]

Bases: LRDecay

Decays the learning rate on every epoch according to the following formula:

\[\mathbf{lr}_i = \mathbf{lr}_{i-1} - \dfrac{\mathbf{lr}_{0}}{max\_steps}\]
Parameters:

max_steps (int) – The number of epochs over which the learning rate decays to zero.
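
Since the subtracted term is constant, the learning rate reaches zero after exactly max_steps epochs. A minimal sketch of the formula itself (arithmetic_decay is a hypothetical helper re-implementing the equation for illustration, not the library's source):

    def arithmetic_decay(lr_initial: float, max_steps: int) -> list[float]:
        """Illustrative re-implementation of the ArithmeticLR formula."""
        lrs = [lr_initial]
        for _ in range(max_steps):
            # lr_i = lr_{i-1} - lr_0 / max_steps
            lrs.append(lrs[-1] - lr_initial / max_steps)
        return lrs

    print(arithmetic_decay(0.1, 4))
    # ≈ [0.1, 0.075, 0.05, 0.025, 0.0] (up to floating-point rounding)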

class GeometricLR(decay_rate: float, steps: int)[source]

Bases: LRDecay

Decays the learning rate every \(steps\) epochs according to the following formula:

\[\mathbf{lr}_i = \mathbf{lr}_{i-1} \cdot decay\_rate\]
Parameters:
  • decay_rate (float) – The multiplicative factor applied to the learning rate.

  • steps (int) – Apply the decay every steps epochs.
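
Because the factor is multiplicative, the learning rate decays exponentially and never reaches zero. Another illustrative sketch (geometric_decay is a hypothetical helper; firing the decay on every epoch divisible by steps is an assumed reading of the description above):

    def geometric_decay(lr_initial: float, decay_rate: float,
                        steps: int, epochs: int) -> list[float]:
        """Illustrative re-implementation of the GeometricLR formula."""
        lr, lrs = lr_initial, [lr_initial]
        for epoch in range(1, epochs + 1):
            if epoch % steps == 0:
                # lr_i = lr_{i-1} * decay_rate
                lr *= decay_rate
            lrs.append(lr)
        return lrs

    print(geometric_decay(0.1, 0.5, steps=2, epochs=6))
    # [0.1, 0.1, 0.05, 0.05, 0.025, 0.025, 0.0125]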

class LRDecay[source]

Bases: object

Base class for learning rate decay schedulers.

decay(epoch: int)[source]

Manually run the learning rate decay. This is useful when passing samples one by one into the training method instead of passing the whole batch of samples; in that case, the decay is not triggered automatically, because the current epoch is unknown.

Parameters:

epoch (int) – The number of the current epoch.

restart()[source]

Reset the learning rate, and the decay schedule itself, to their original state.
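
A sketch of the manual interface in a per-sample training loop. ToyArithmeticLR and ToyOptimizer below are hypothetical stand-ins that only mirror the decay/restart interface documented above; the real library objects and training calls will differ:

    class ToyArithmeticLR:
        """Toy stand-in mirroring the LRDecay interface (decay/restart)."""

        def __init__(self, optimizer, max_steps: int):
            self.optimizer = optimizer  # hypothetical object exposing an `lr` attribute
            self.initial_lr = optimizer.lr
            self.max_steps = max_steps

        def decay(self, epoch: int) -> None:
            # Apply one arithmetic decay step; `epoch` is unused in this toy,
            # but the documented method takes the current epoch number.
            self.optimizer.lr -= self.initial_lr / self.max_steps

        def restart(self) -> None:
            # Reset the learning rate (and the decay state) to the original value.
            self.optimizer.lr = self.initial_lr


    class ToyOptimizer:
        def __init__(self, lr: float):
            self.lr = lr


    optimizer = ToyOptimizer(lr=0.1)
    scheduler = ToyArithmeticLR(optimizer, max_steps=10)

    samples = range(100)  # placeholder for real training samples
    for epoch in range(10):
        for sample in samples:
            pass  # train on one sample at a time here
        # The per-sample loop never crosses an epoch boundary on its own,
        # so the decay has to be triggered by hand:
        scheduler.decay(epoch)

    print(optimizer.lr)  # ≈ 0.0 after max_steps epochs (up to rounding)
    scheduler.restart()
    print(optimizer.lr)  # back to 0.1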