## Video URL

https://pirsa.org/24010081

# Closed-Form Interpretation of Neural Network Classifiers with Symbolic Regression Gradients

### APA

Wetzel, S. (2024). Closed-Form Interpretation of Neural Network Classifiers with Symbolic Regression Gradients. Perimeter Institute for Theoretical Physics. https://pirsa.org/24010081

### MLA

Wetzel, Sebastian. Closed-Form Interpretation of Neural Network Classifiers with Symbolic Regression Gradients. Perimeter Institute for Theoretical Physics, Jan. 19, 2024, https://pirsa.org/24010081

### BibTex

```bibtex
@misc{scivideos_PIRSA:24010081,
  doi       = {10.48660/24010081},
  url       = {https://pirsa.org/24010081},
  author    = {Wetzel, Sebastian},
  keywords  = {Other Physics},
  language  = {en},
  title     = {Closed-Form Interpretation of Neural Network Classifiers with Symbolic Regression Gradients},
  publisher = {Perimeter Institute for Theoretical Physics},
  year      = {2024},
  month     = {jan},
  note      = {PIRSA:24010081, see \url{https://scivideos.org/pirsa/24010081}}
}
```

**Speaker** Sebastian Wetzel (Mitacs)

**Source Repository** PIRSA

**Talk Type** Scientific Series

## Abstract

I introduce a unified framework for interpreting neural network classifiers, tailored toward automated scientific discovery. In contrast to neural-network-based regression, for classification it is in general impossible to find a one-to-one mapping from the neural network to a symbolic equation, even if the network bases its classification on a quantity that can be written as a closed-form equation. In this paper, I embed a trained neural network into an equivalence class of classifying functions that base their decisions on the same quantity. I interpret the neural network by finding an intersection between this equivalence class and the set of human-readable equations defined by the search space of symbolic regression. The approach is not limited to classifiers or full neural networks: it can be applied to arbitrary neurons in hidden layers or latent spaces, or used to simplify the interpretation of neural network regressors.

---