Video URL: https://youtube.com/live/fUw0IsnSs4E
Mean-Field Theory Insights into Neural Feature Dynamics, Infinite-Scale Limits, and Scaling Laws
APA
Mean-Field Theory Insights into Neural Feature Dynamics, Infinite-Scale Limits, and Scaling Laws. (2025). SciVideos. https://youtube.com/live/fUw0IsnSs4E
MLA
Mean-Field Theory Insights into Neural Feature Dynamics, Infinite-Scale Limits, and Scaling Laws. SciVideos, 12 Aug. 2025, https://youtube.com/live/fUw0IsnSs4E
BibTeX
@misc{scivideos_ICTS:32497,
  url = {https://youtube.com/live/fUw0IsnSs4E},
  language = {en},
  title = {Mean-Field Theory Insights into Neural Feature Dynamics, Infinite-Scale Limits, and Scaling Laws},
  publisher = {SciVideos},
  year = {2025},
  month = {aug},
  note = {ICTS:32497, see \url{https://scivideos.org/index.php/icts-tifr/32497}}
}
Abstract
When a neural network becomes extremely wide or deep, its learning dynamics simplify and can be described by the same "mean-field" ideas that explain magnetism and fluids. I will walk through these ideas step by step, showing how they suggest practical recipes for initialization and optimization that scale smoothly from small models to cutting-edge transformers. I will also discuss neural scaling laws, the empirical power-law rules that relate model size, data, and compute, and illustrate them with solvable toy models.
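To make the first theme concrete, here is a minimal Python sketch, not taken from the talk: the two-layer ReLU toy model, the 1/n output scaling, and the width-scaled learning rate are my own illustrative assumptions. It measures the root-mean-square change in the hidden pre-activations after one SGD step, under a mean-field parameterization (output scaled by 1/n, learning rate grown with width n) versus an NTK-style parameterization (output scaled by 1/sqrt(n), fixed learning rate).

import numpy as np

rng = np.random.default_rng(0)

def feature_update(n, scaling, d=32, eta0=1.0):
    # RMS change in the hidden pre-activations W @ x after one SGD step
    # on a single example (x, y) with squared loss.
    x = rng.normal(size=d) / np.sqrt(d)    # input with O(1) norm
    y = 1.0
    W = rng.normal(size=(n, d))            # O(1) first-layer weights
    a = rng.normal(size=n)                 # O(1) second-layer weights
    if scaling == "mean-field":
        c, eta = 1.0 / n, eta0 * n         # 1/n output scale, lr grows with width
    else:                                  # "ntk"
        c, eta = 1.0 / np.sqrt(n), eta0    # 1/sqrt(n) output scale, fixed lr
    h = W @ x                              # hidden pre-activations
    f = c * (a @ np.maximum(h, 0.0))       # network output (ReLU features)
    err = f - y                            # residual of the squared loss
    grad_W = err * c * (a * (h > 0))[:, None] * x[None, :]
    return np.sqrt(np.mean((eta * grad_W @ x) ** 2))

for n in [128, 1024, 8192, 65536]:
    print(f"n={n:6d}  mean-field: {feature_update(n, 'mean-field'):.3f}"
          f"  ntk: {feature_update(n, 'ntk'):.4f}")

Across widths the mean-field column stays roughly constant while the NTK column shrinks like 1/sqrt(n); keeping feature updates of order one as width grows is the kind of width-robust initialization and optimization recipe the abstract alludes to.

For the second theme, a power law appears as a straight line on log-log axes, so the scaling-law exponent can be recovered by least squares on logged data. Another hypothetical sketch, with made-up constants (the prefactor 5.0, exponent 0.3, and noise level are illustrative, not values from the talk):

import numpy as np

rng = np.random.default_rng(1)
N = np.logspace(6, 10, 12)               # hypothetical model sizes (parameters)
L = 5.0 * N**-0.3 * np.exp(0.02 * rng.normal(size=N.size))  # noisy synthetic losses

# log L = log a - b * log N, so ordinary least squares on the logs
# recovers the exponent b and the prefactor a.
slope, intercept = np.polyfit(np.log(N), np.log(L), 1)
print(f"fitted exponent b = {-slope:.3f}, prefactor a = {np.exp(intercept):.2f}")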