
Learning and Testing for Gradient Descent

APA

Abbe, E. (2020). Learning and Testing for Gradient Descent. The Simons Institute for the Theory of Computing. https://simons.berkeley.edu/talks/graph-testing-spectrum

MLA

Abbe, Emmanuel. Learning and Testing for Gradient Descent. The Simons Institute for the Theory of Computing, 15 Dec. 2020, https://simons.berkeley.edu/talks/graph-testing-spectrum

BibTeX

          @misc{scivideos_16863,
            doi = {},
            url = {https://simons.berkeley.edu/talks/graph-testing-spectrum},
            author = {Abbe, Emmanuel},
            keywords = {},
            language = {en},
            title = {Learning and Testing for Gradient Descent},
            publisher = {The Simons Institute for the Theory of Computing},
            year = {2020},
            month = {dec},
            note = {Talk 16863, see \url{https://scivideos.org/index.php/Simons-Institute/16863}}
          }
          
Emmanuel Abbe (EPFL)
Talk number: 16863
Source Repository: Simons Institute

Abstract

We present lower bounds on the generalization error of gradient descent with free initialization, reducing the problem to testing the algorithm’s output under different data models. We then discuss lower bounds for random initialization and present the problem of learning communities in the pruned-block-model, where it is conjectured that GD fails.
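
The reduction mentioned in the abstract can be illustrated with a minimal sketch (not the speaker's construction; the linear model, squared loss, and the two toy data models below are illustrative assumptions): run the same gradient-descent procedure from the same initialization on samples drawn from two different data models. A test statistic built from GD's output distinguishes the models exactly when GD learns them, so statistical indistinguishability of the outputs implies a generalization lower bound.

import numpy as np

rng = np.random.default_rng(0)
d, n, steps, lr = 20, 100, 200, 0.1
w_star = rng.standard_normal(d) / np.sqrt(d)  # hidden direction shared by both toy models

def sample(model, n):
    # Toy assumption: model 0 labels points with +w_star, model 1 with -w_star.
    X = rng.standard_normal((n, d))
    sign = 1.0 if model == 0 else -1.0
    y = np.sign(X @ (sign * w_star))
    return X, y

def gd_output(X, y, w0):
    # Full-batch gradient descent on the squared loss, starting from w0.
    w = w0.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

w0 = np.zeros(d)                       # fixed initialization shared across models (stand-in for the talk's "free" initialization)
w_out0 = gd_output(*sample(0, n), w0)  # GD trained on data from model 0
w_out1 = gd_output(*sample(1, n), w0)  # GD trained on data from model 1

# Test statistic built from GD's output: its correlation with w_star.
# If GD generalizes under both models, the sign of this statistic separates them;
# conversely, if no test on GD's output can separate the models, GD cannot learn both.
print("model 0 correlation:", w_out0 @ w_star)
print("model 1 correlation:", w_out1 @ w_star)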