PIRSA:23110064

Transformers for scientific data - VIRTUAL - Helen Qu and Bhuvnesh Jain

APA

Jain, B. (2023). Transformers for scientific data - VIRTUAL - Helen Qu and Bhuvnesh Jain. Perimeter Institute for Theoretical Physics. https://pirsa.org/23110064

MLA

Jain, Bhuvnesh. Transformers for scientific data - VIRTUAL - Helen Qu and Bhuvnesh Jain. Perimeter Institute for Theoretical Physics, 14 Nov. 2023, https://pirsa.org/23110064.

BibTeX

@misc{scivideos_PIRSA:23110064,
  doi       = {10.48660/23110064},
  url       = {https://pirsa.org/23110064},
  author    = {Jain, Bhuvnesh},
  keywords  = {Cosmology},
  language  = {en},
  title     = {Transformers for scientific data - VIRTUAL - Helen Qu and Bhuvnesh Jain},
  publisher = {Perimeter Institute for Theoretical Physics},
  year      = {2023},
  month     = {nov},
  note      = {PIRSA:23110064, see \url{https://scivideos.org/pirsa/23110064}}
}
          

Bhuvnesh Jain, University of Pennsylvania

Talk number: PIRSA:23110064
Source Repository: PIRSA
Talk Type: Scientific Series
Subject: Cosmology

Abstract

The deep learning architecture behind ChatGPT and related generative AI products is known as the transformer. Initially applied to natural language processing, transformers and the self-attention mechanism they exploit have gained widespread interest across the natural sciences. We will present the mathematics underlying the attention mechanism and describe the basic transformer architecture. We will then describe applications to time-series and imaging data in astronomy and discuss possible foundation models.
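
As background for the attention mathematics mentioned in the abstract: the core operation of a transformer layer is scaled dot-product attention, in which each input vector is projected into query, key, and value vectors and the output is softmax(QK^T / sqrt(d_k)) V. The sketch below is a minimal NumPy illustration of that formula, not material from the talk; the function name, matrix shapes, and the toy "light curve" input are assumptions made for the example.

import numpy as np

def self_attention(X, W_q, W_k, W_v):
    """Scaled dot-product self-attention for a single head.

    X          : (seq_len, d_model) input embeddings
    W_q/W_k/W_v: (d_model, d_k) learned projection matrices
    Returns    : (seq_len, d_k) attended representations
    """
    Q = X @ W_q                                   # queries
    K = X @ W_k                                   # keys
    V = X @ W_v                                   # values
    scores = Q @ K.T / np.sqrt(Q.shape[-1])       # (seq_len, seq_len) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                            # attention-weighted sum of values

# Toy example: a 5-step "light curve" embedded in 8 dimensions (hypothetical data)
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))
W_q, W_k, W_v = [rng.normal(size=(8, 4)) for _ in range(3)]
out = self_attention(X, W_q, W_k, W_v)
print(out.shape)  # (5, 4)

In a full transformer, several such heads run in parallel and their outputs are concatenated and fed through feed-forward layers with residual connections; the sketch above covers only the single-head attention step.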

---

Zoom link https://pitp.zoom.us/j/91226066758?pwd=TWZ5RVliMjVKYXdLcHdya09lNWZhQT09