PIRSA:23040113

Causal Discovery via Common Entropy

APA

Kocaoglu, M. (2023). Causal Discovery via Common Entropy. Perimeter Institute for Theoretical Physics. https://pirsa.org/23040113

MLA

Kocaoglu, Murat. "Causal Discovery via Common Entropy." Perimeter Institute for Theoretical Physics, 18 Apr. 2023, https://pirsa.org/23040113.

BibTeX

          @misc{scivideos_PIRSA:23040113,
            doi = {10.48660/23040113},
            url = {https://pirsa.org/23040113},
            author = {Kocaoglu, Murat},
            keywords = {Quantum Foundations},
            language = {en},
            title = {Causal Discovery via Common Entropy},
            publisher = {Perimeter Institute for Theoretical Physics},
            year = {2023},
            month = {apr},
            note = {PIRSA:23040113, see \url{https://scivideos.org/index.php/pirsa/23040113}}
          }
          

Murat Kocaoglu, Purdue University

Talk number PIRSA:23040113
Talk Type Conference
Subject Quantum Foundations

Abstract

Distinguishing causation from correlation using observational data requires assumptions. We consider the setting where the unobserved confounder between two observed variables is simple in an information-theoretic sense, as captured by its entropy. When the observed dependence is not due to causation, there exists a small-entropy variable that can make the observed variables conditionally independent. The smallest entropy of such a variable is known in information theory as the common entropy. We extend this notion to Rényi common entropy by minimizing the Rényi entropy of the latent variable. We establish identifiability results for Rényi-0 common entropy and for a special (binary) case of Rényi-1 common entropy. To compute common entropy efficiently, we propose an iterative algorithm that can be used to trace out the trade-off between the entropy of the latent variable and the conditional mutual information of the observed variables. We show that our algorithm can be used to distinguish causation from correlation in such simple two-variable systems. Additionally, we show that common entropy can improve constraint-based methods such as the PC algorithm in the small-sample regime, where such methods are known to struggle: we propose modifying them to use common entropy to assess whether a separating set they find is valid. Finally, we evaluate our algorithms on synthetic and real data to establish their performance.
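
For readers who want a concrete handle on the central quantity: the common entropy of (X, Y) is the smallest H(W) over latent variables W that render X and Y conditionally independent, G(X, Y) = min { H(W) : X ⊥ Y | W }. The Python sketch below brute-forces this for binary X, Y, and W by grid-searching over the conditionals p(W = 1 | x, y). It is a stand-in for the talk's iterative algorithm, not that algorithm itself; the function names, grid resolution, and independence tolerance are illustrative assumptions.

    import itertools
    import numpy as np

    def conditional_mi(pxyw):
        # I(X; Y | W) in bits, for a joint pmf of shape (|X|, |Y|, |W|).
        pw = pxyw.sum(axis=(0, 1))   # p(w)
        pxw = pxyw.sum(axis=1)       # p(x, w)
        pyw = pxyw.sum(axis=0)       # p(y, w)
        cmi = 0.0
        for x, y, w in itertools.product(*map(range, pxyw.shape)):
            p = pxyw[x, y, w]
            if p > 0:
                cmi += p * np.log2(p * pw[w] / (pxw[x, w] * pyw[y, w]))
        return cmi

    def entropy_bits(p):
        # Shannon entropy in bits of a pmf given as a 1-D array.
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    def binary_common_entropy(pxy, n_grid=21, tol=1e-2):
        # Smallest H(W) over binary latents W with I(X; Y | W) < tol,
        # found by exhaustive search over a grid of q[x, y] = p(W=1 | x, y).
        # Coarse and slow; the talk's iterative algorithm replaces this.
        best = float("inf")
        grid = np.linspace(0.0, 1.0, n_grid)
        for q in itertools.product(grid, repeat=4):
            q = np.asarray(q).reshape(2, 2)
            pxyw = np.stack([pxy * (1 - q), pxy * q], axis=2)  # p(x, y, w)
            if conditional_mi(pxyw) < tol:
                best = min(best, entropy_bits(pxyw.sum(axis=(0, 1))))
        return best

    # Hypothetical example: the observed dependence is induced entirely by
    # a low-entropy confounder W ~ Bernoulli(0.15); there is no causal edge.
    w = np.array([0.85, 0.15])                     # p(w)
    px_w = np.array([[0.70, 0.30], [0.05, 0.95]])  # p(x | w), rows indexed by w
    py_w = np.array([[0.65, 0.35], [0.05, 0.95]])  # p(y | w)
    pxy = np.einsum("w,wx,wy->xy", w, px_w, py_w)  # observed joint p(x, y)
    print(binary_common_entropy(pxy))  # at most H(0.15) ~ 0.61 bits, vs min(H(X), H(Y)) ~ 0.97

Under the talk's framing, a common entropy well below min(H(X), H(Y)) is evidence that a simple latent confounder can explain the dependence, whereas a value near that bound favors a direct causal edge; the same quantity can serve as a check on separating sets returned by constraint-based methods such as PC. The exact decision threshold is a design choice not fixed by this sketch.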