Concentration on HDX: Derandomization Beyond Chernoff
APA
(2025). Concentration on HDX: Derandomization Beyond Chernoff. SciVideos. https://youtube.com/live/wLfeu7KqdBQ
MLA
Concentration on HDX: Derandomization Beyond Chernoff. SciVideos, 4 May 2025, https://youtube.com/live/wLfeu7KqdBQ
BibTeX
@misc{ scivideos_ICTS:31746, url = {https://youtube.com/live/wLfeu7KqdBQ}, language = {en}, title = {Concentration on HDX: Derandomization Beyond Chernoff}, year = {2025}, month = {may}, note = {ICTS:31746, see \url{https://scivideos.org/icts-tifr/31746}} }
Abstract
Chernoff's bound states that for any $A \subset [N]$, the probability that a random $k$-tuple $s \in {[N] \choose k}$ correctly 'samples' $A$ (i.e. that the density of $A$ in $s$ is close to its mean) decays exponentially in the dimension $k$. In 1987, Ajtai, Komlós, and Szemerédi proved the "Expander-Chernoff Theorem", a powerful derandomization of Chernoff's bound stating that one can replace ${[N] \choose k}$ with the significantly sparser family $X_G(k) \subsetneq {[N] \choose k}$ of length-$k$ paths on an expander graph $G$ while maintaining essentially the same concentration. Their result, which allows amplification without significant blow-up in size or randomness, has since become a mainstay in theoretical computer science, with breakthrough applications in derandomization, coding, pseudorandomness, cryptography, and complexity.
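As a quick illustrative sketch (not part of the talk), one can check this exponential decay numerically. The set $A$, the parameters, and the Hoeffding form of the bound below are my own choices for the demonstration:

```python
import random
from math import exp

random.seed(0)

N = 10_000
A = set(range(N // 2))          # a set of density mu = 1/2
mu = len(A) / N
eps = 0.1                        # allowed deviation from the mean density
trials = 2000

def deviation_prob(k):
    """Empirical probability that a uniform k-subset's A-density
    deviates from mu by more than eps."""
    bad = 0
    for _ in range(trials):
        s = random.sample(range(N), k)
        density = sum(v in A for v in s) / k
        if abs(density - mu) > eps:
            bad += 1
    return bad / trials

# The deviation probability drops exponentially in k, tracking the
# Hoeffding-style bound 2*exp(-2*eps^2*k).
for k in (25, 100, 400):
    print(f"k={k:4d}  empirical={deviation_prob(k):.4f}"
          f"  bound={2 * exp(-2 * eps**2 * k):.5f}")
```

Note the trade-off the abstract highlights: drawing $s$ uniformly from ${[N] \choose k}$ costs roughly $k \log N$ random bits, which is what the Expander-Chernoff Theorem reduces.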
One natural way to view AKS is to say that expander walks are pseudorandom with respect to functions of their vertices, i.e. against "degree 1" functions. In modern complexity, especially in the context of hardness amplification and PCPs, we often need concentration against higher-degree functions, e.g. functions of edges or triples. Unfortunately, due to their inherent low-dimensionality, walks on expanders are not pseudorandom even at degree 2, and the construction of such a derandomized object has remained largely open.
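The degree-1 pseudorandomness of expander walks can likewise be sketched empirically. The graph below (a union of random permutations, which yields an expander with high probability) and all parameters are illustrative assumptions, not the objects from the talk:

```python
import random

random.seed(1)

# Illustrative expander: the union of 16 uniformly random permutations on
# N vertices (each edge added in both directions) gives a 32-regular
# multigraph that is an expander with high probability.
N, half_d = 2000, 16
nbrs = [[] for _ in range(N)]
for _ in range(half_d):
    p = random.sample(range(N), N)   # a uniformly random permutation
    for v in range(N):
        nbrs[v].append(p[v])
        nbrs[p[v]].append(v)         # inverse edge

A = set(range(N // 2))               # target set of density mu = 1/2
mu, eps, trials = 0.5, 0.1, 2000

def walk_deviation_prob(k):
    """Fraction of length-k random walks whose A-density strays
    from mu by more than eps."""
    bad = 0
    for _ in range(trials):
        v = random.randrange(N)
        hits = 0
        for _ in range(k):
            v = random.choice(nbrs[v])
            hits += v in A
        if abs(hits / k - mu) > eps:
            bad += 1
    return bad / trials

# The deviation probability still decays in k, even though a length-k walk
# uses only about log(N) + k*log(32) random bits, versus k*log(N) for
# k i.i.d. samples.
for k in (100, 400):
    print(f"k={k:3d}  walk deviation prob = {walk_deviation_prob(k):.4f}")
```

This only witnesses concentration against degree-1 functions (densities of vertex sets); the failure at degree 2, and its repair via high dimensional expanders, is exactly the subject of the abstract.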
In 2017, Dinur and Kaufman offered a partial resolution to this question in high dimensional expanders, a derandomized family satisfying Chebyshev-type (inverse polynomial) concentration for higher-degree functions. Their work led to breakthrough applications in agreement testing and PCPs (Bafna, Minzer, Vyas, and Yun, STOC 2025), but left an exponential gap with the known bounds for the complete hypergraph ${[N] \choose k}$ needed for further applications. In this talk, we close this gap and prove that sufficiently strong HDX indeed have Chernoff-type tails. Time permitting, we will discuss the relation of these bounds to a powerful analytic inequality called reverse hypercontractivity and its applications to agreement tests with optimal soundness.
Based on joint work with Yotam Dikstein.