Configure Learning Rate Scheduler

We use PyTorch learning rate scheduler objects to adjust the learning rate during training, within the PyTorch and Lightning framework.

As continual learning involves multiple tasks, each task needs a learning rate scheduler for training. We can either use a uniform learning rate scheduler across all tasks or assign a distinct learning rate scheduler to each task.

If you don’t want to use a learning rate scheduler, you can set the field /lr_scheduler in the experiment index config to null, or simply remove this field.
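
For example, a minimal sketch of an experiment index config that disables the learning rate scheduler could look like the following (the file name is hypothetical, and other entries in the defaults list are omitted):

example_configs/experiment/example_no_lr_scheduler.yaml
defaults:
  ...
  - /lr_scheduler: null # no learning rate scheduler is used for any task
  ...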

Configure Uniform Learning Rate Scheduler For All Tasks

To configure a uniform learning rate scheduler for all tasks in your experiment, link the /lr_scheduler field in the experiment index config to a YAML file in the lr_scheduler/ subfolder of your configs. That YAML file should use the _target_ field to link to a PyTorch learning rate scheduler class and specify its arguments in the following fields. Here is an example:

./clarena/example_configs
β”œβ”€β”€ __init__.py
β”œβ”€β”€ entrance.yaml
β”œβ”€β”€ experiment
β”‚   β”œβ”€β”€ example.yaml
β”‚   └── ...
β”œβ”€β”€ lr_scheduler
β”‚   β”œβ”€β”€ reduce_lr_on_plateau_10_tasks.yaml
β”‚   └── reduce_lr_on_plateau.yaml
...
example_configs/experiment/example.yaml
defaults:
  ...
  - /lr_scheduler: reduce_lr_on_plateau.yaml
  ...
example_configs/lr_scheduler/reduce_lr_on_plateau.yaml
_target_: torch.optim.lr_scheduler.ReduceLROnPlateau
_partial_: true # partially instantiate learning rate scheduler without 'optimizer' argument. Make sure this is included in any case!
mode: min
factor: 0.1
patience: 10
Warning

Make sure to include the field _partial_: true to enable partial instantiation. A PyTorch learning rate scheduler needs the optimizer as an argument to be fully instantiated, but at configuration time that argument is not yet available, so the learning rate scheduler can only be partially instantiated.

Configure Distinct Learning Rate Scheduler For Each Task

To configure a distinct learning rate scheduler for each task in your experiment, the YAML file linked from the lr_scheduler/ subfolder should be a list of PyTorch learning rate scheduler configurations, one assigned to each task in order. The length of the list must equal the num_tasks field in the experiment index config. Here is an example for an experiment with 10 tasks; a sketch using different scheduler types per task follows it:

example_configs/lr_scheduler/reduce_lr_on_plateau_10_tasks.yaml
- _target_: torch.optim.lr_scheduler.ReduceLROnPlateau
  _partial_: true # partially instantiate learning rate scheduler without 'optimizer' argument. Make sure this is included in any case!
  mode: min
  factor: 0.1
  patience: 10
- _target_: torch.optim.lr_scheduler.ReduceLROnPlateau
  _partial_: true # partially instantiate learning rate scheduler without 'optimizer' argument. Make sure this is included in any case!
  mode: min
  factor: 0.1
  patience: 10
- _target_: torch.optim.lr_scheduler.ReduceLROnPlateau
  _partial_: true # partially instantiate learning rate scheduler without 'optimizer' argument. Make sure this is included in any case!
  mode: min
  factor: 0.1
  patience: 10
- _target_: torch.optim.lr_scheduler.ReduceLROnPlateau
  _partial_: true # partially instantiate learning rate scheduler without 'optimizer' argument. Make sure this is included in any case!
  mode: min
  factor: 0.1
  patience: 10
- _target_: torch.optim.lr_scheduler.ReduceLROnPlateau
  _partial_: true # partially instantiate learning rate scheduler without 'optimizer' argument. Make sure this is included in any case!
  mode: min
  factor: 0.1
  patience: 10
- _target_: torch.optim.lr_scheduler.ReduceLROnPlateau
  _partial_: true # partially instantiate learning rate scheduler without 'optimizer' argument. Make sure this is included in any case!
  mode: min
  factor: 0.1
  patience: 10
- _target_: torch.optim.lr_scheduler.ReduceLROnPlateau
  _partial_: true # partially instantiate learning rate scheduler without 'optimizer' argument. Make sure this is included in any case!
  mode: min
  factor: 0.1
  patience: 10
- _target_: torch.optim.lr_scheduler.ReduceLROnPlateau
  _partial_: true # partially instantiate learning rate scheduler without 'optimizer' argument. Make sure this is included in any case!
  mode: min
  factor: 0.1
  patience: 10
- _target_: torch.optim.lr_scheduler.ReduceLROnPlateau
  _partial_: true # partially instantiate learning rate scheduler without 'optimizer' argument. Make sure this is included in any case!
  mode: min
  factor: 0.1
  patience: 10
- _target_: torch.optim.lr_scheduler.ReduceLROnPlateau
  _partial_: true # partially instantiate learning rate scheduler without 'optimizer' argument. Make sure this is included in any case!
  mode: min
  factor: 0.1
  patience: 10
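
The entries in the list do not have to be identical; each task can use a different scheduler class and arguments, which is the point of distinct configuration. Below is a hypothetical sketch for an experiment with num_tasks set to 3 (the file name and argument values are assumptions, not part of the example configs):

example_configs/lr_scheduler/mixed_3_tasks.yaml
- _target_: torch.optim.lr_scheduler.StepLR
  _partial_: true # partially instantiate without the 'optimizer' argument, as in the examples above
  step_size: 30 # assumed: decay the learning rate every 30 epochs
  gamma: 0.1
- _target_: torch.optim.lr_scheduler.ExponentialLR
  _partial_: true
  gamma: 0.95 # assumed decay factor per epoch
- _target_: torch.optim.lr_scheduler.ReduceLROnPlateau
  _partial_: true
  mode: min
  factor: 0.1
  patience: 10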

Supported Learning Rate Schedulers

We fully support all built-in learning rate schedulers defined in PyTorch. Please refer to the PyTorch documentation for the full list, and to the documentation of each learning rate scheduler class for its required arguments.

PyTorch Documentation (Built-In Learning Rate Schedulers)
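
As a sketch, a uniform config for another built-in scheduler such as CosineAnnealingLR might look like the following (the file name and argument values are assumptions; check the scheduler's PyTorch documentation for its required arguments):

example_configs/lr_scheduler/cosine_annealing.yaml
_target_: torch.optim.lr_scheduler.CosineAnnealingLR
_partial_: true # partially instantiate without the 'optimizer' argument
T_max: 100 # assumed length (in epochs) of one cosine annealing cycle
eta_min: 0.0001 # assumed minimum learning rate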

Note

Optimisation-based approaches in continual learning design their mechanisms by manipulating the optimisation step, which may involve using different learning rate schedulers for different tasks. However, such changes to the optimisation can be integrated directly into the CL algorithm, so we do not design dedicated CL learning rate schedulers.
