towhee.trainer.scheduler.configure_constant_scheduler

towhee.trainer.scheduler.configure_constant_scheduler(optimizer: Optimizer, last_epoch: int = -1)[source]

Return a scheduler with a constant learning rate, using the learning rate set in the optimizer.

Parameters:
  • optimizer (Optimizer) – The optimizer for which to schedule the learning rate.

  • last_epoch (int) – The index of the last epoch when resuming training. Defaults to -1.

Return (LambdaLR):

A scheduler with a constant learning rate.
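
A constant schedule can be expressed as a LambdaLR whose multiplicative factor is always 1.0. The sketch below uses only PyTorch (not Towhee internals) to show an equivalent construction; `constant_scheduler_sketch` is a hypothetical helper for illustration, and the actual body of `configure_constant_scheduler` may differ.

```python
from torch import nn
from torch.optim import SGD
from torch.optim.lr_scheduler import LambdaLR

def constant_scheduler_sketch(optimizer, last_epoch=-1):
    # The multiplicative factor is always 1.0, so the optimizer keeps
    # the learning rate it was constructed with on every step.
    return LambdaLR(optimizer, lr_lambda=lambda _epoch: 1.0, last_epoch=last_epoch)

model = nn.Linear(4, 4)
opt = SGD(model.parameters(), lr=0.5)
sched = constant_scheduler_sketch(opt)

lrs = []
for _ in range(3):
    lrs.append(sched.get_last_lr()[0])
    opt.step()
    sched.step()
print(lrs)  # prints [0.5, 0.5, 0.5]
```

Because the lambda never deviates from 1.0, stepping the scheduler has no effect on the learning rate, which is exactly the behavior documented above.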

Example

>>> from towhee.trainer.scheduler import configure_constant_scheduler
>>> from towhee.trainer.optimization.adamw import AdamW
>>> from torch import nn
>>> def unwrap_scheduler(scheduler, num_steps=10):
...     lr_sch = []
...     for _ in range(num_steps):
...         lr_sch.append(scheduler.get_lr()[0])
...         scheduler.step()
...     return lr_sch
>>> mdl = nn.Linear(50, 50)
>>> optimizer = AdamW(mdl.parameters(), lr=10.0)
>>> num_steps = 2
>>> scheduler = configure_constant_scheduler(optimizer)
>>> lr_sch_1 = unwrap_scheduler(scheduler, num_steps)
>>> lr_sch_1
[10.0, 10.0]