towhee.trainer.optimization.optimization

PyTorch optimization for BERT models.

Functions

get_constant_schedule

Create a schedule with a constant learning rate, using the learning rate set in optimizer.
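A constant schedule applies a multiplier of 1 to the optimizer's initial lr at every step. A minimal pure-Python sketch of that multiplier (the function name below is illustrative, not this module's API):

```python
def constant_lr_lambda(current_step: int) -> float:
    # A constant schedule never changes the learning rate: the multiplier
    # applied to the optimizer's initial lr is always 1, at every step.
    return 1.0

# The effective learning rate is the initial lr times the multiplier.
base_lr = 5e-5
effective_lr = base_lr * constant_lr_lambda(100)  # stays at base_lr
```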

get_constant_schedule_with_warmup

Create a schedule with a constant learning rate preceded by a warmup period during which the learning rate increases linearly between 0 and the initial lr set in the optimizer.
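The multiplier this schedule applies ramps linearly from 0 to 1 over the warmup steps, then stays at 1. A pure-Python sketch (the function name is illustrative):

```python
def constant_with_warmup_lr_lambda(current_step: int, num_warmup_steps: int) -> float:
    # During warmup the multiplier grows linearly from 0 toward 1.
    if current_step < num_warmup_steps:
        return current_step / max(1, num_warmup_steps)
    # After warmup the learning rate is held constant at the initial lr.
    return 1.0
```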

get_cosine_schedule_with_warmup

Create a schedule with a learning rate that decreases from the initial lr set in the optimizer to 0, following the values of the cosine function, after a warmup period during which it increases linearly from 0 to the initial lr set in the optimizer.
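The post-warmup decay follows half a cosine wave from 1 down to 0. A sketch of the multiplier under that assumption (function and parameter names are illustrative; `num_cycles=0.5` gives the single half-cosine decay described above):

```python
import math

def cosine_with_warmup_lr_lambda(current_step: int, num_warmup_steps: int,
                                 num_training_steps: int, num_cycles: float = 0.5) -> float:
    # Linear warmup from 0 to 1.
    if current_step < num_warmup_steps:
        return current_step / max(1, num_warmup_steps)
    # Cosine decay: progress runs from 0 (end of warmup) to 1 (end of training),
    # so the multiplier falls from 1 to 0 along a half cosine wave.
    progress = (current_step - num_warmup_steps) / max(1, num_training_steps - num_warmup_steps)
    return max(0.0, 0.5 * (1.0 + math.cos(math.pi * num_cycles * 2.0 * progress)))
```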

get_cosine_with_hard_restarts_schedule_with_warmup

Create a schedule with a learning rate that decreases from the initial lr set in the optimizer to 0, following the values of the cosine function with several hard restarts, after a warmup period during which it increases linearly from 0 to the initial lr set in the optimizer.
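"Hard restarts" means the cosine decay is repeated: at each restart the multiplier jumps back to 1 and decays again. A sketch under that assumption (names are illustrative; `num_cycles` is the number of restarts over the decay phase):

```python
import math

def cosine_hard_restarts_lr_lambda(current_step: int, num_warmup_steps: int,
                                   num_training_steps: int, num_cycles: int = 1) -> float:
    # Linear warmup from 0 to 1.
    if current_step < num_warmup_steps:
        return current_step / max(1, num_warmup_steps)
    progress = (current_step - num_warmup_steps) / max(1, num_training_steps - num_warmup_steps)
    if progress >= 1.0:
        return 0.0
    # The modulo folds progress back to 0 at every restart, so the
    # multiplier snaps back up to 1 and decays along a cosine again.
    return max(0.0, 0.5 * (1.0 + math.cos(math.pi * ((num_cycles * progress) % 1.0))))
```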

get_linear_schedule_with_warmup

Create a schedule with a learning rate that decreases linearly from the initial lr set in the optimizer to 0, after a warmup period during which it increases linearly from 0 to the initial lr set in the optimizer.
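Both phases of this schedule are straight lines: up from 0 to 1 during warmup, then down from 1 to 0 over the remaining steps. A pure-Python sketch of the multiplier (function name illustrative):

```python
def linear_with_warmup_lr_lambda(current_step: int, num_warmup_steps: int,
                                 num_training_steps: int) -> float:
    # Linear warmup from 0 to 1.
    if current_step < num_warmup_steps:
        return current_step / max(1, num_warmup_steps)
    # Linear decay from 1 (end of warmup) to 0 (end of training).
    return max(0.0, (num_training_steps - current_step)
               / max(1, num_training_steps - num_warmup_steps))
```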

get_polynomial_decay_schedule_with_warmup

Create a schedule with a learning rate that decreases as a polynomial decay from the initial lr set in the optimizer to the end lr defined by lr_end, after a warmup period during which it increases linearly from 0 to the initial lr set in the optimizer.
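Polynomial decay interpolates between the initial lr and `lr_end`, raising the remaining fraction of steps to a power (power 1 reduces to linear decay). A sketch of the multiplier under that assumption; the multiplier is expressed relative to the initial lr, and all names except `lr_end` are illustrative:

```python
def polynomial_decay_with_warmup_lr_lambda(current_step: int, num_warmup_steps: int,
                                           num_training_steps: int, lr_init: float,
                                           lr_end: float = 1e-7, power: float = 1.0) -> float:
    # Linear warmup from 0 to 1.
    if current_step < num_warmup_steps:
        return current_step / max(1, num_warmup_steps)
    # Past the end of training the learning rate is pinned at lr_end.
    if current_step > num_training_steps:
        return lr_end / lr_init
    # Polynomial interpolation from lr_init down to lr_end.
    decay_steps = num_training_steps - num_warmup_steps
    pct_remaining = 1 - (current_step - num_warmup_steps) / decay_steps
    decay = (lr_init - lr_end) * pct_remaining ** power + lr_end
    return decay / lr_init  # multiplier applied to the initial lr
```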

get_scheduler

Unified API to get any scheduler from its name.
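Such a unified entry point typically amounts to a name-to-factory lookup. A hypothetical sketch of the pattern (the dispatcher, its signature, and the schedule names below are illustrative, not this module's actual table):

```python
def get_lr_lambda(name: str, num_warmup_steps: int = 0):
    # Hypothetical dispatcher: map a schedule name to its lr-multiplier
    # function. The real get_scheduler returns a configured scheduler.
    if name == "constant":
        return lambda step: 1.0
    if name == "constant_with_warmup":
        return lambda step: min(1.0, step / max(1, num_warmup_steps))
    raise ValueError(f"unknown schedule name: {name!r}")
```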

get_warmup_steps

Get the number of steps used for a linear warmup.
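One common convention for such a helper (assumed here, not confirmed by this page) is to accept either an explicit warmup step count or a warmup ratio of the total training steps:

```python
import math

def compute_warmup_steps(num_training_steps: int, num_warmup_steps: int = 0,
                         warmup_ratio: float = 0.0) -> int:
    # Hypothetical helper: an explicit step count takes precedence;
    # otherwise derive the count as a fraction of total training steps.
    if num_warmup_steps > 0:
        return num_warmup_steps
    return math.ceil(num_training_steps * warmup_ratio)
```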