# MultiLR

A method for assigning separate learning rate schedulers to different parameter groups in a model. Pull requests are welcome.

## Usage

Pass one lambda per parameter group; each lambda receives the optimizer and constructs that group's scheduler.

```python
scheduler = MultiLR(optimizer,
                    [lambda opt: torch.optim.lr_scheduler.StepLR(opt, step_size=10, gamma=0.5),
                     lambda opt: torch.optim.lr_scheduler.LinearLR(opt, start_factor=0.25, total_iters=10)])
```
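As a rough illustration of the pattern, here is a minimal, torch-free sketch of how such a wrapper can hand each scheduler a view of a single parameter group. All names and internals below are assumptions for illustration, not this repo's actual implementation:

```python
class _GroupView:
    """Hypothetical helper: exposes one parameter group as if it were a
    whole optimizer, so a per-group scheduler only sees its own group."""
    def __init__(self, optimizer, index):
        self._optimizer = optimizer
        self._index = index

    @property
    def param_groups(self):
        return [self._optimizer.param_groups[self._index]]


class MultiLRSketch:
    """Hypothetical sketch of a MultiLR-style wrapper: one scheduler
    factory per parameter group, all stepped together."""
    def __init__(self, optimizer, scheduler_fns):
        if len(scheduler_fns) != len(optimizer.param_groups):
            raise ValueError("need one scheduler factory per parameter group")
        # Each factory receives a view of one group and builds its scheduler.
        self.schedulers = [fn(_GroupView(optimizer, i))
                           for i, fn in enumerate(scheduler_fns)]

    def step(self):
        for scheduler in self.schedulers:
            scheduler.step()


# Stand-ins so the sketch runs without torch.
class FakeOptimizer:
    def __init__(self):
        self.param_groups = [{"lr": 1.0}, {"lr": 1.0}]


class HalvingScheduler:
    def __init__(self, opt):
        self.opt = opt

    def step(self):
        for group in self.opt.param_groups:
            group["lr"] *= 0.5


class ConstantScheduler:
    def __init__(self, opt):
        self.opt = opt

    def step(self):
        pass


optimizer = FakeOptimizer()
multi = MultiLRSketch(optimizer, [HalvingScheduler, ConstantScheduler])
multi.step()
multi.step()
# Group 0's lr has been halved twice; group 1's lr is untouched.
```

The `_GroupView` trick is what lets unmodified, off-the-shelf schedulers (like `StepLR` or `LinearLR` above) each drive a single group, since torch schedulers update every group of whatever optimizer-like object they are given.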