towhee.trainer.scheduler.configure_linear_scheduler_with_warmup

towhee.trainer.scheduler.configure_linear_scheduler_with_warmup(optimizer, num_warmup_steps, num_training_steps, last_epoch=-1)[source]

Return a scheduler whose learning rate decreases linearly from the initial lr set in the optimizer down to 0, after a warmup period during which it increases linearly from 0 up to that initial lr.
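The per-step behavior can be illustrated with a plain multiplier function. The sketch below assumes the standard linear-warmup/linear-decay formulation implied by the description; the name lr_lambda and the hard-coded step counts are illustrative, not part of the towhee API.

>>> def lr_lambda(step, num_warmup_steps, num_training_steps):
...     # Warmup phase: the multiplier grows linearly from 0 to 1.
...     if step < num_warmup_steps:
...         return float(step) / float(max(1, num_warmup_steps))
...     # Decay phase: the multiplier shrinks linearly from 1 to 0 at num_training_steps.
...     return max(0.0, float(num_training_steps - step) / float(max(1, num_training_steps - num_warmup_steps)))
>>> [round(10.0 * lr_lambda(step, 4, 10), 2) for step in range(10)]
[0.0, 2.5, 5.0, 7.5, 10.0, 8.33, 6.67, 5.0, 3.33, 1.67]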

Parameters:
  • optimizer (Optimizer) – The optimizer to be scheduled.

  • num_warmup_steps (int) – The number of steps in the linear warmup phase.

  • num_training_steps (int) – The total number of training steps.

  • last_epoch (int) – The index of the last epoch when resuming training. Defaults to -1.

Return (LambdaLR):

A linear scheduler with warmup.

Example

>>> from towhee.trainer.scheduler import configure_linear_scheduler_with_warmup
>>> from towhee.trainer.optimization.adamw import AdamW
>>> from torch import nn
>>> def unwrap_scheduler(scheduler, num_steps=10):
...     lr_sch = []
...     for _ in range(num_steps):
...         # Record the current lr, rounded so the doctest output is stable.
...         lr_sch.append(round(scheduler.get_lr()[0], 4))
...         scheduler.step()
...     return lr_sch
>>> mdl = nn.Linear(50, 50)
>>> optimizer = AdamW(mdl.parameters(), lr=10.0)
>>> num_steps = 10
>>> num_warmup_steps = 4
>>> num_training_steps = 10
>>> scheduler = configure_linear_scheduler_with_warmup(optimizer, num_warmup_steps, num_training_steps)
>>> unwrap_scheduler(scheduler, num_steps)
[0.0, 2.5, 5.0, 7.5, 10.0, 8.3333, 6.6667, 5.0, 3.3333, 1.6667]
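The first num_warmup_steps values ramp linearly from 0 up to the base learning rate of 10.0, and the remaining values decay linearly back toward 0 by num_training_steps, matching the behavior described above.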