Learning Ising Models from One, Ten or a Thousand Samples

APA

Daskalakis, C. (2020). Learning Ising Models from One, Ten or a Thousand Samples. The Simons Institute for the Theory of Computing. https://simons.berkeley.edu/talks/learning-ising-models-one-ten-or-thousand-samples

MLA

Daskalakis, Costis. "Learning Ising Models from One, Ten or a Thousand Samples." The Simons Institute for the Theory of Computing, 14 Dec. 2020, https://simons.berkeley.edu/talks/learning-ising-models-one-ten-or-thousand-samples

BibTeX

          @misc{scivideos_16942,
            author = {Daskalakis, Costis},
            title = {Learning Ising Models from One, Ten or a Thousand Samples},
            publisher = {The Simons Institute for the Theory of Computing},
            year = {2020},
            month = {dec},
            language = {en},
            url = {https://simons.berkeley.edu/talks/learning-ising-models-one-ten-or-thousand-samples},
            note = {Talk number 16942; see \url{https://scivideos.org/Simons-Institute/16942}}
          }
          
Costis Daskalakis (MIT)
Talk number: 16942
Source Repository: Simons Institute

Abstract

Samples from high-dimensional distributions can be scarce or expensive. Can we meaningfully learn such distributions from one or just a few samples? We provide guarantees for single-sample estimation of Ising models, quantifying the estimation error in terms of the metric entropy of the family of possible interaction matrices. As corollaries of our main result, we derive bounds for the cases where the model's interaction matrix is a (sparse) linear combination of known matrices, belongs to a finite set, or lies on a high-dimensional manifold. Our result handles multiple independent samples by viewing them as one sample from a larger model, and it can be used to derive estimation bounds that are qualitatively similar to the state of the art in the multiple-sample literature. We thus unify two separate strands of work in the literature: (1) a renaissance of recent work in Computer Science on estimating Ising models/MRFs from multiple independent samples under minimal assumptions about the model's interaction matrix; and (2) a growing literature in Probability Theory on estimating them from one sample in restrictive settings. On the technical front, we exploit novel concentration and anti-concentration inequalities for functions of the Ising model.
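
To make the "multiple samples as one sample from a larger model" reduction concrete, the following worked equations use the standard zero-field Ising parameterization (the normalization is an assumption; the talk may use a different convention). The point is simply that k independent samples from an Ising model with interaction matrix J are jointly one sample from a larger Ising model with a block-diagonal interaction matrix.

          % Standard Ising model on {-1,+1}^n with interaction matrix J (zero external field)
          p_J(x) \;=\; \frac{1}{Z(J)} \exp\!\Big( \tfrac{1}{2}\, x^\top J\, x \Big),
          \qquad x \in \{-1,+1\}^n .

          % k independent samples x^{(1)}, ..., x^{(k)} ~ p_J form a single sample from
          % the Ising model on {-1,+1}^{kn} whose interaction matrix is block diagonal:
          \widetilde{J} \;=\; \mathrm{diag}(J, \dots, J),
          \qquad
          p_{\widetilde{J}}\big(x^{(1)}, \dots, x^{(k)}\big)
          \;=\; \prod_{\ell=1}^{k} p_J\big(x^{(\ell)}\big).

A single-sample guarantee applied to p_{\widetilde{J}}, restricted to the family of block-diagonal matrices with identical blocks, then translates into a multiple-sample bound for J, which is the sense in which the single-sample result handles several independent samples.

The abstract also mentions the case where the interaction matrix is a (sparse) linear combination of known matrices. The sketch below fits such coefficients from a single sample by maximum pseudo-likelihood with gradient ascent; the choice of estimator, the optimization routine, and all names (pseudo_loglik_and_grad, fit_single_sample) are illustrative assumptions and not the talk's actual method or guarantees.

          # Sketch (assumptions): single-sample fitting of an Ising model whose interaction
          # matrix is J(beta) = sum_t beta[t] * A[t] for known matrices A[t], via maximum
          # pseudo-likelihood. Illustrative only; not the estimator analyzed in the talk.
          import numpy as np

          def pseudo_loglik_and_grad(beta, A, x):
              """Pseudo-log-likelihood of one sample x in {-1,+1}^n and its gradient in beta.

              A    : array (m, n, n), known symmetric matrices with zero diagonal
              beta : array (m,), coefficients defining J = sum_t beta[t] * A[t]
              x    : array (n,), a single configuration with entries in {-1,+1}
              """
              J = np.tensordot(beta, A, axes=1)      # (n, n) interaction matrix
              m_i = J @ x                            # local fields: sum_j J_ij x_j
              # log P(x_i | x_{-i}) = x_i * m_i - log(2 cosh(m_i)) for each node i
              ll = np.sum(x * m_i - np.log(2.0 * np.cosh(m_i)))
              # d ll / d m_i = x_i - tanh(m_i); chain rule through m = sum_t beta[t] A[t] x
              resid = x - np.tanh(m_i)
              grad = np.array([resid @ (A_t @ x) for A_t in A])
              return ll, grad

          def fit_single_sample(A, x, steps=500, lr=0.05):
              """Gradient ascent on the pseudo-log-likelihood using a single sample."""
              beta = np.zeros(A.shape[0])
              for _ in range(steps):
                  _, grad = pseudo_loglik_and_grad(beta, A, x)
                  beta += lr * grad
              return beta

For instance, with m known matrices stacked as an array A of shape (m, n, n) and one observed configuration x, fit_single_sample(A, x) returns coefficient estimates beta_hat defining J_hat = sum_t beta_hat[t] * A[t].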