Models of Tomorrow: Scaling Laws in Machine Learning
Episode 5 of the Voices of Tomorrow podcast, hosted by Aleix M Martinez · October 13, 2024 · 11m
Episode Description
In this episode of Voices of Tomorrow, we take a deep dive into one of the most critical concepts driving advances in artificial intelligence: scaling laws in machine learning. The exponential growth of AI capabilities, recently recognized by Nobel Prizes in both Physics and Chemistry, has been fueled by breakthroughs in scaling model size, data, and compute. This episode unpacks the mathematical foundations of scaling laws, explaining how they govern the performance improvements of today's largest models, particularly Large Language Models (LLMs).
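To make the "mathematical foundations" concrete: the scaling laws discussed in this literature typically take a power-law form, where loss falls off as a power of model size. The sketch below is illustrative only and is not taken from the episode; the constants are the rough published fits from Kaplan et al. (2020) and should be treated as assumptions for demonstration.

```python
# Illustrative sketch (not from the episode): a power-law scaling law
# relating language-model loss to parameter count N,
#     L(N) = (N_c / N) ** alpha_N
# The constants are rough fitted values reported by Kaplan et al. (2020);
# they are assumptions here, not definitive numbers.
N_C = 8.8e13      # fitted constant, in parameters
ALPHA_N = 0.076   # fitted exponent

def loss_from_params(n_params: float) -> float:
    """Predicted cross-entropy loss for a model with n_params parameters."""
    return (N_C / n_params) ** ALPHA_N

# Doubling model size multiplies loss by 2**(-alpha_N), roughly a 5%
# reduction per doubling under this fit.
ratio = loss_from_params(2e9) / loss_from_params(1e9)
```

The key qualitative point, which holds regardless of the exact constants, is that returns are smooth and predictable but diminishing: each doubling of scale buys a fixed multiplicative improvement in loss.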
We explore key insights from recent research on optimal resource allocation, highlighting how scaling dataset size at a slower rate than model parameters leads to more efficient training. We also address the complexities of multi-dimensional optimization, which moves beyond model size and data to consider factors such as inference efficiency and context length, and much more.
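The allocation question above can be sketched numerically. Under a Kaplan-style fit, as compute grows, the compute-optimal model size grows faster than the compute-optimal dataset size. The exponents below are rough illustrative values from that line of work, not figures from the episode.

```python
# Hedged sketch: splitting a growing compute budget C between model size N
# and dataset size D, where parameters scale faster than data:
#     N ∝ C**A_N,  D ∝ C**A_D
# Exponents are rough illustrative values (Kaplan-style), assumed here.
A_N, A_D = 0.73, 0.27

def optimal_allocation(compute: float, ref_compute: float = 1.0,
                       ref_n: float = 1.0, ref_d: float = 1.0):
    """Scale a reference (N, D) point along the compute-optimal frontier."""
    growth = compute / ref_compute
    return ref_n * growth ** A_N, ref_d * growth ** A_D

# With 100x more compute, parameters grow much more than data under
# this fit (roughly 29x for N versus roughly 3.5x for D).
n_mult, d_mult = optimal_allocation(100.0)
```

Note that later work (notably the Chinchilla results) revised these exponents toward more balanced growth of parameters and data, which is part of why optimal allocation remains an active research question.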
Whether you work on AI models yourself or are simply interested in the frontier of machine learning research, this episode provides a comprehensive look at the laws governing AI's growth and the future of scaling machine learning models.