
Implement a learning rate schedule #18

Open
pokey opened this issue Oct 9, 2022 · 2 comments

pokey (Contributor) commented Oct 9, 2022

Currently, it appears the learning rate is constant. My loss has started to slow down noticeably; I think dropping the LR could help a lot, but there is no schedule, and due to #15 and #17 I couldn't simply restart training with a smaller LR.

See #29, which has a few lines implementing an LR schedule that could be extracted into a PR here.
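
For anyone picking this up, a minimal sketch of what an extracted schedule might look like in a LightningModule's `configure_optimizers`. Since the motivation is a plateauing loss, this uses `ReduceLROnPlateau`; the optimizer, factor, patience, and `train_loss` metric name are all illustrative guesses, not necessarily what #29 does:

```python
import torch
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.mse_loss(self.layer(x), y)
        # Log the metric that the scheduler's "monitor" key refers to.
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        # Halve the LR after `patience` epochs with no improvement in the
        # monitored loss. Hyperparameters here are placeholders.
        scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
            optimizer, factor=0.5, patience=3
        )
        return {
            "optimizer": optimizer,
            "lr_scheduler": {"scheduler": scheduler, "monitor": "train_loss"},
        }
```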

ym-han (Contributor) commented Oct 10, 2022

As a related quick note to self: In addition to implementing some sort of learning rate scheduler, we should also add a learning rate finder. Lightning seems to ship with one: https://pytorch-lightning.readthedocs.io/en/1.4.5/advanced/lr_finder.html
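
Roughly what that looks like per the linked 1.4.5 docs; `model` here is a stand-in for any LightningModule that exposes a `self.lr` (or `self.learning_rate`) attribute for the tuner to overwrite:

```python
import pytorch_lightning as pl

# Option 1: let the trainer pick an initial LR automatically.
trainer = pl.Trainer(auto_lr_find=True)
trainer.tune(model)  # stores the suggestion on model.lr / model.learning_rate

# Option 2: run the finder by hand and inspect the loss-vs-LR curve.
trainer = pl.Trainer()
lr_finder = trainer.tuner.lr_find(model)
print(lr_finder.suggestion())
fig = lr_finder.plot(suggest=True)
```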

pokey (Contributor, Author) commented Oct 10, 2022

Yes, was thinking an LR range test would be a good idea; forgot that was automated in Lightning. Good stuff.
