- towhee.trainer.optimization.optimization.get_linear_schedule_with_warmup(optimizer, num_warmup_steps, num_training_steps, last_epoch=-1)
Create a schedule with a learning rate that decreases linearly from the initial lr set in the optimizer to 0, after a warmup period during which it increases linearly from 0 to the initial lr set in the optimizer.
Parameters:
  - optimizer (Optimizer) – The optimizer for which to schedule the learning rate.
  - num_warmup_steps (int) – The number of steps for the warmup phase.
  - num_training_steps (int) – The total number of training steps.
  - last_epoch (int, optional, defaults to -1) – The index of the last epoch when resuming training.
Returns: torch.optim.lr_scheduler.LambdaLR with the appropriate schedule.
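To make the schedule concrete, here is a minimal sketch of the multiplicative factor such a scheduler applies to the initial learning rate at each step, assuming the standard linear warmup/decay formula (the function name `linear_warmup_factor` is illustrative, not part of the towhee API):

```python
def linear_warmup_factor(step: int, num_warmup_steps: int, num_training_steps: int) -> float:
    """Factor multiplied with the initial lr at a given step (sketch)."""
    if step < num_warmup_steps:
        # Warmup phase: factor rises linearly from 0 toward 1.
        return step / max(1, num_warmup_steps)
    # Decay phase: factor falls linearly from 1 to 0 at the final step.
    return max(
        0.0,
        (num_training_steps - step) / max(1, num_training_steps - num_warmup_steps),
    )

# With 10 warmup steps and 100 total steps:
# step 0   -> 0.0 (start of warmup)
# step 10  -> 1.0 (warmup complete, full initial lr)
# step 100 -> 0.0 (training finished)
```

In the returned LambdaLR, this factor is applied per optimizer step, so the effective learning rate is `initial_lr * factor(step)`.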