
@benbennett

Summary

  • Update BaseModel.update_learning_rate to accept the current epoch and adjust scheduler stepping accordingly
  • Move learning-rate updates in the training loops to the end of each epoch, so that optimizer steps precede scheduler steps
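The ordering change can be sketched as below. This is a minimal, framework-free sketch (the class names `SGDStub` and `StepDecayScheduler` and the decay parameters are hypothetical, not from this PR); it illustrates why the scheduler should step only at the end of the epoch, after the optimizer has stepped, so the learning rate used for an epoch's weight updates is not decayed prematurely:

```python
class StepDecayScheduler:
    """Multiply the lr by `gamma` every `step_size` epochs (StepLR-like; hypothetical)."""
    def __init__(self, optimizer, step_size=2, gamma=0.5):
        self.optimizer = optimizer
        self.step_size = step_size
        self.gamma = gamma

    def step(self, epoch):
        # Receives the current epoch, mirroring the updated
        # update_learning_rate signature described in this PR.
        if epoch > 0 and epoch % self.step_size == 0:
            self.optimizer.lr *= self.gamma


class SGDStub:
    """Stand-in optimizer: counts steps instead of updating weights."""
    def __init__(self, lr):
        self.lr = lr
        self.steps = 0

    def step(self):
        self.steps += 1  # a real optimizer would update weights at self.lr here


def update_learning_rate(scheduler, epoch):
    # Called at the END of each epoch, after all optimizer steps.
    scheduler.step(epoch)


optimizer = SGDStub(lr=0.1)
scheduler = StepDecayScheduler(optimizer, step_size=2, gamma=0.5)

for epoch in range(4):
    for _ in range(3):                      # batches within the epoch
        optimizer.step()                    # optimizer steps first...
    update_learning_rate(scheduler, epoch)  # ...then the scheduler steps

print(optimizer.steps, optimizer.lr)       # 12 steps; lr halved once (epoch 2)
```

Stepping the scheduler before the epoch's optimizer steps would instead apply the decayed rate to that epoch's updates, which is the ordering this PR corrects.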

Testing

  • python -m compileall models/base_model.py simple.py atme.py

https://chatgpt.com/codex/tasks/task_e_68d371ef55708331b938ee60cd448348
