
Department of Computer Science and Technology

Date: Tuesday, 8 April, 2025 - 14:00 to 15:00
Speaker: Alex Hägele, EPFL
Venue: Computer Lab, FW26

Large Language Model (LLM) pretraining relies on complex strategies for large-scale optimization, with the learning rate schedule being particularly important yet often following conventional rules.
In this talk, I will discuss our recent NeurIPS Spotlight that investigates a simple but effective strategy: a constant learning rate followed by strategic cooldowns. Our analysis demonstrates that this approach not only performs reliably but also offers practical advantages: it does not require a predetermined training length and easily allows continual training. Importantly, these findings enable more efficient scaling law experiments, as they allow training runs to be reused and thereby substantially reduce compute and GPU hours. In follow-up work, we investigate theoretical explanations for the distinctive behavior of such learning rate schedules, leveraging last-iterate convergence bounds that closely match real experiments.
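To make the schedule concrete, the sketch below shows one way a constant learning rate with a final cooldown can be implemented. It is only an illustration of the general idea: the function name, the linear cooldown shape, and all hyperparameter values are assumptions, not the exact settings used in the work presented in the talk.

```python
# Illustrative sketch of a constant-then-cooldown learning rate schedule
# (sometimes called warmup-stable-decay). Names and hyperparameters are
# hypothetical, not taken from the paper.

def lr_at_step(step, total_steps, peak_lr=3e-4,
               warmup_steps=1000, cooldown_frac=0.2):
    """Return the learning rate for a given training step."""
    cooldown_steps = int(total_steps * cooldown_frac)
    cooldown_start = total_steps - cooldown_steps

    if step < warmup_steps:
        # Linear warmup from 0 to the peak learning rate.
        return peak_lr * (step + 1) / warmup_steps
    if step < cooldown_start:
        # Constant (stable) phase: the total length need not be fixed in
        # advance; training can continue here and a cooldown can be
        # branched off later, enabling reuse of the same run.
        return peak_lr
    # Linear cooldown to 0 over the final fraction of training.
    remaining = total_steps - step
    return peak_lr * remaining / cooldown_steps


if __name__ == "__main__":
    total = 10_000
    for s in (0, 500, 5_000, 8_500, 9_999):
        print(s, f"{lr_at_step(s, total):.2e}")
```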
I will conclude by introducing the Swiss AI initiative (https://www.swiss-ai.org/), which deploys the world's first national research infrastructure with 10,000 NVIDIA Grace Hopper GPUs. This initiative leverages our research innovations, such as the above, to develop state-of-the-art open and multilingual LLMs, with the goal of advancing fully transparent scientific research on foundation models.

Bio: Alex Hägele is a PhD Student at EPFL in the Machine Learning and Optimization group (MLO) supervised by Martin Jaggi. Currently, he is part of the inaugural Anthropic Fellowship for AI Safety research, based in London. Previously, he completed his BSc+MSc in Computer Science at ETH Zürich and was a visiting Student Researcher at Apple MLR in Paris. His research explores scaling behavior and training of language models, spanning optimization, data, and architectures.

Seminar series: Cambridge ML Systems Seminar Series
