AdamW and Super-convergence is now the fastest way to train neural nets