Learning Deep ReLU Networks is Fixed-Parameter Tractable
APA
Chen, S. (2020, December 16). Learning Deep ReLU Networks is Fixed-Parameter Tractable [Video]. The Simons Institute for the Theory of Computing. https://simons.berkeley.edu/talks/learning-deep-relu-networks-fixed-parameter-tractable
MLA
Chen, Sitan. "Learning Deep ReLU Networks is Fixed-Parameter Tractable." The Simons Institute for the Theory of Computing, 16 Dec. 2020, https://simons.berkeley.edu/talks/learning-deep-relu-networks-fixed-parameter-tractable
BibTeX
@misc{scivideos_16879,
  author    = {Chen, Sitan},
  title     = {Learning Deep ReLU Networks is Fixed-Parameter Tractable},
  publisher = {The Simons Institute for the Theory of Computing},
  year      = {2020},
  month     = {dec},
  language  = {en},
  url       = {https://simons.berkeley.edu/talks/learning-deep-relu-networks-fixed-parameter-tractable},
  note      = {Talk 16879, see \url{https://scivideos.org/Simons-Institute/16879}}
}
Sitan Chen (MIT)
Talk number: 16879
Source repository: Simons Institute