towhee.trainer.optimization.optimization.get_constant_schedule_with_warmup(optimizer: Optimizer, num_warmup_steps: int, last_epoch: int = -1)[source]

Create a schedule with a constant learning rate, preceded by a warmup period during which the learning rate increases linearly from 0 to the initial lr set in the optimizer.

Parameters

  • optimizer (Optimizer) – The optimizer for which to schedule the learning rate.

  • num_warmup_steps (int) – The number of steps for the warmup phase.

  • last_epoch (int, optional, defaults to -1) – The index of the last epoch when resuming training.


Returns

A torch.optim.lr_scheduler.LambdaLR with the appropriate schedule.
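The multiplier that this schedule applies to the initial lr can be sketched in pure Python (the helper name `constant_with_warmup_lambda` is illustrative, not part of towhee's API; the returned function is the kind of `lr_lambda` a LambdaLR would wrap):

```python
def constant_with_warmup_lambda(num_warmup_steps: int):
    """Build the lr multiplier: linear warmup from 0 to 1, then constant 1."""
    def lr_lambda(current_step: int) -> float:
        if current_step < num_warmup_steps:
            # During warmup, scale lr linearly with the step count.
            return current_step / max(1.0, num_warmup_steps)
        # After warmup, keep the initial lr unchanged.
        return 1.0
    return lr_lambda

lam = constant_with_warmup_lambda(10)
print(lam(0), lam(5), lam(10), lam(20))  # → 0.0 0.5 1.0 1.0
```

With `num_warmup_steps=10`, the optimizer's lr ramps from 0 at step 0 to its full initial value at step 10 and stays there for the rest of training.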