Description
I encountered an issue with the learning rate scheduler configuration. The train_medmnist.sh script passes the warm_ratio parameter with a value of 0.6 to train.py. However, running the script raises an error saying that train.py does not accept a warm_ratio parameter, which suggests a problem with the learning rate scheduler configuration. To troubleshoot, I added code to train.py to accept warm_ratio without doing anything with it. That produced a second error, this time in the lr_scheduler, which comes from the fairseq package. I reviewed the Learning Rate Schedulers page in the fairseq documentation and noted that it mentions warmup settings. Should the warm_ratio parameter play a role in fairseq's warmup configuration?
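For what it's worth, here is a minimal sketch of the workaround described above: accepting `warm_ratio` in `train.py` so argparse no longer rejects it, and one plausible (unverified) way it could be translated into fairseq's integer `--warmup-updates` setting. The `--total-updates` flag and the `warmup_updates` helper are assumptions for illustration, not part of the repository's actual code:

```python
import argparse

def parse_args(argv=None):
    parser = argparse.ArgumentParser()
    # Accept the flag that train_medmnist.sh passes, so argparse does not
    # fail with "unrecognized arguments: --warm-ratio".
    parser.add_argument("--warm-ratio", type=float, default=0.0,
                        help="fraction of total updates used for LR warmup")
    # Hypothetical stand-in for the run's maximum number of updates.
    parser.add_argument("--total-updates", type=int, default=100000)
    return parser.parse_args(argv)

def warmup_updates(warm_ratio: float, total_updates: int) -> int:
    # One plausible mapping: convert the ratio into the integer
    # warmup-update count that fairseq schedulers expect.
    return int(warm_ratio * total_updates)

if __name__ == "__main__":
    args = parse_args()
    print(warmup_updates(args.warm_ratio, args.total_updates))
```

With `--warm-ratio 0.6` and `--total-updates 100000`, this would yield 60000 warmup updates, which could then be forwarded to the scheduler; whether that is the mapping the authors intended is exactly what this issue is asking.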