towhee.trainer.scheduler

Scheduler utilities for PyTorch optimization.

Functions

check_scheduler

Check if the scheduler type is supported.
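
The exact signature is not documented here; as a purely hypothetical sketch, such a check typically compares a scheduler-type string against a set of supported names and raises on an unknown value (the names below are assumptions for illustration, not towhee's actual list):

    # Hypothetical sketch; the supported names are assumptions, not towhee's list.
    SUPPORTED_SCHEDULERS = {
        "constant", "constant_with_warmup", "linear",
        "cosine", "cosine_with_restarts", "polynomial",
    }

    def check_scheduler_type(scheduler_type: str) -> None:
        if scheduler_type not in SUPPORTED_SCHEDULERS:
            raise ValueError(
                f"Unsupported scheduler type: {scheduler_type!r}; "
                f"expected one of {sorted(SUPPORTED_SCHEDULERS)}"
            )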

configure_constant_scheduler

Return a scheduler with a constant learning rate, using the learning rate set in the optimizer.
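
In plain PyTorch, this shape of schedule can be expressed with LambdaLR and a constant multiplier of 1.0; this is only an illustration of the behavior, not towhee's implementation:

    import torch
    from torch.optim.lr_scheduler import LambdaLR

    model = torch.nn.Linear(4, 2)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    # The multiplicative factor stays at 1.0, so every step keeps the initial lr.
    scheduler = LambdaLR(optimizer, lambda current_step: 1.0)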

configure_constant_scheduler_with_warmup

Return a scheduler with a constant learning rate, preceded by a warmup period during which the learning rate increases linearly from 0 to the initial lr set in the optimizer.
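
A minimal sketch of this behavior using LambdaLR; the parameter name num_warmup_steps is an assumption for illustration, not necessarily towhee's exact signature:

    import torch
    from torch.optim.lr_scheduler import LambdaLR

    model = torch.nn.Linear(4, 2)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    num_warmup_steps = 100  # assumed warmup length

    def lr_lambda(current_step: int) -> float:
        # Linear warmup from 0 to the initial lr, then constant.
        if current_step < num_warmup_steps:
            return current_step / max(1, num_warmup_steps)
        return 1.0

    scheduler = LambdaLR(optimizer, lr_lambda)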

configure_cosine_scheduler_with_warmup

Return a scheduler with a learning rate that decreases following the values of the cosine function from the initial lr set in the optimizer to 0, after a warmup period during which it increases linearly from 0 to the initial lr set in the optimizer.
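
The same warmup-then-cosine-decay shape sketched with LambdaLR; the step counts and parameter names are assumptions, not towhee's actual signature:

    import math
    import torch
    from torch.optim.lr_scheduler import LambdaLR

    model = torch.nn.Linear(4, 2)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    num_warmup_steps, num_training_steps = 100, 1000  # assumed step counts

    def lr_lambda(current_step: int) -> float:
        # Linear warmup from 0 to the initial lr.
        if current_step < num_warmup_steps:
            return current_step / max(1, num_warmup_steps)
        # Cosine decay from the initial lr down to 0 over the remaining steps.
        progress = (current_step - num_warmup_steps) / max(1, num_training_steps - num_warmup_steps)
        return max(0.0, 0.5 * (1.0 + math.cos(math.pi * progress)))

    scheduler = LambdaLR(optimizer, lr_lambda)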

configure_cosine_with_hard_restarts_scheduler_with_warmup

Return a scheduler with a learning rate that decreases following the values of the cosine function from the initial lr set in the optimizer to 0, with several hard restarts, after a warmup period during which it increases linearly from 0 to the initial lr set in the optimizer.
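
A sketch of the hard-restart variant with LambdaLR; num_cycles and the step counts are assumed values for illustration only:

    import math
    import torch
    from torch.optim.lr_scheduler import LambdaLR

    model = torch.nn.Linear(4, 2)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    num_warmup_steps, num_training_steps, num_cycles = 100, 1000, 3  # assumed values

    def lr_lambda(current_step: int) -> float:
        if current_step < num_warmup_steps:
            return current_step / max(1, num_warmup_steps)
        progress = (current_step - num_warmup_steps) / max(1, num_training_steps - num_warmup_steps)
        if progress >= 1.0:
            return 0.0
        # Each cycle decays from the peak lr to 0 along a cosine, then restarts at the peak.
        return max(0.0, 0.5 * (1.0 + math.cos(math.pi * ((num_cycles * progress) % 1.0))))

    scheduler = LambdaLR(optimizer, lr_lambda)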

configure_linear_scheduler_with_warmup

Return a scheduler with a learning rate that decreases linearly from the initial lr set in the optimizer to 0, after a warmup period during which it increases linearly from 0 to the initial lr set in the optimizer.
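
The warmup-then-linear-decay shape sketched with LambdaLR; the step counts are assumptions, not towhee's actual defaults or signature:

    import torch
    from torch.optim.lr_scheduler import LambdaLR

    model = torch.nn.Linear(4, 2)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    num_warmup_steps, num_training_steps = 100, 1000  # assumed step counts

    def lr_lambda(current_step: int) -> float:
        # Linear warmup from 0 to the initial lr, then linear decay to 0.
        if current_step < num_warmup_steps:
            return current_step / max(1, num_warmup_steps)
        return max(0.0, (num_training_steps - current_step) / max(1, num_training_steps - num_warmup_steps))

    scheduler = LambdaLR(optimizer, lr_lambda)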

configure_polynomial_decay_scheduler_with_warmup

Return a scheduler with a learning rate that decreases as a polynomial decay from the initial lr set in the optimizer to the end lr defined by lr_end, after a warmup period during which it increases linearly from 0 to the initial lr set in the optimizer.
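
A sketch of polynomial decay with warmup using LambdaLR; lr_end, power, and the step counts are assumed values for illustration, not towhee's documented defaults:

    import torch
    from torch.optim.lr_scheduler import LambdaLR

    model = torch.nn.Linear(4, 2)
    lr_init, lr_end, power = 0.1, 1e-7, 1.0  # assumed values; power=1.0 gives linear decay
    optimizer = torch.optim.SGD(model.parameters(), lr=lr_init)
    num_warmup_steps, num_training_steps = 100, 1000  # assumed step counts

    def lr_lambda(current_step: int) -> float:
        # Linear warmup from 0 to the initial lr.
        if current_step < num_warmup_steps:
            return current_step / max(1, num_warmup_steps)
        # After training ends, hold the learning rate at lr_end.
        if current_step > num_training_steps:
            return lr_end / lr_init
        # Polynomial decay from lr_init to lr_end; LambdaLR scales this factor by lr_init.
        pct_remaining = 1 - (current_step - num_warmup_steps) / (num_training_steps - num_warmup_steps)
        decayed = (lr_init - lr_end) * pct_remaining ** power + lr_end
        return decayed / lr_init

    scheduler = LambdaLR(optimizer, lr_lambda)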