
@benbennett (Owner)

Summary

  • extend the scheduler factory to cover warm restarts, exponential decay, and polynomial decay, exposing configurable thresholds for each policy (see the factory sketch after this list)
  • clamp learning rates to an optional minimum and improve logging when stepping multiple optimizers (see the stepping helper below)
  • surface new command-line flags so both training entry points can tune the richer learning-rate policies (see the argparse sketch below)
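
The diff itself isn't shown here, so the following is a minimal sketch of what the extended factory might look like, assuming a PyTorch codebase with its built-in schedulers; `build_scheduler`, the policy strings, and the default thresholds are illustrative, not the actual code.

```python
from torch.optim import Optimizer
from torch.optim.lr_scheduler import (
    CosineAnnealingWarmRestarts,
    ExponentialLR,
    PolynomialLR,
)

def build_scheduler(optimizer: Optimizer, policy: str, *,
                    t_0: int = 10, t_mult: int = 2, eta_min: float = 0.0,
                    gamma: float = 0.95, total_iters: int = 100,
                    power: float = 1.0):
    """Map a policy name to a configured scheduler (names are illustrative)."""
    if policy == "warm_restarts":
        # Restart the cosine cycle every t_0 epochs, growing each cycle by t_mult.
        return CosineAnnealingWarmRestarts(
            optimizer, T_0=t_0, T_mult=t_mult, eta_min=eta_min)
    if policy == "exponential":
        # Multiply the learning rate by gamma on every step.
        return ExponentialLR(optimizer, gamma=gamma)
    if policy == "polynomial":
        # Decay toward zero over total_iters steps with the given power.
        return PolynomialLR(optimizer, total_iters=total_iters, power=power)
    raise ValueError(f"unknown policy: {policy}")
```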
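The clamping and multi-optimizer logging could take a shape like the helper below, again assuming PyTorch optimizers; `step_schedulers` and the `min_lr` parameter are hypothetical names, and `min_lr=None` disables the floor.

```python
import logging

logger = logging.getLogger(__name__)

def step_schedulers(schedulers, optimizers, min_lr=None):
    """Step every scheduler, then clamp and log each optimizer's learning rates."""
    for scheduler in schedulers:
        scheduler.step()
    for i, optimizer in enumerate(optimizers):
        for group in optimizer.param_groups:
            if min_lr is not None and group["lr"] < min_lr:
                group["lr"] = min_lr  # clamp to the configured floor
        lrs = [group["lr"] for group in optimizer.param_groups]
        logger.info("optimizer %d learning rates: %s", i, lrs)
```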
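And one way the shared flags might be wired up so both training entry points can reuse them; every flag name and default here is a guess for illustration, not taken from the diff.

```python
import argparse

def add_lr_policy_args(parser: argparse.ArgumentParser) -> None:
    """Attach the learning-rate policy flags shared by both entry points."""
    parser.add_argument("--lr-policy", default="exponential",
                        choices=["warm_restarts", "exponential", "polynomial"])
    parser.add_argument("--lr-gamma", type=float, default=0.95,
                        help="decay factor for the exponential policy")
    parser.add_argument("--lr-power", type=float, default=1.0,
                        help="exponent for the polynomial policy")
    parser.add_argument("--warm-restart-t0", type=int, default=10,
                        help="epochs in the first warm-restart cycle")
    parser.add_argument("--min-lr", type=float, default=None,
                        help="optional floor applied after each scheduler step")
```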

Testing

  • `python -m compileall .`
  • `python -m compileall options`

https://chatgpt.com/codex/tasks/task_e_68d3724865f08331b23931005646551f

