NeuralNetworksApproximationTheory

Open questions:

  • Can every Lipschitz function on a bounded domain be uniformly approximated by a shallow network? (The question is only nontrivial if the rate is required to be dimension-independent; see the numerical sketch after this list.)
  • Is a heavy-tailed measure necessary for exponential depth separations?
  • Is the desired approximation rate achievable in either of the following settings?
    1. Oscillations growing at some rate, with a fast-decaying (e.g. Gaussian) measure
    2. Oscillations growing at some rate, with a heavy-tailed measure
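
A toy numerical sketch of the setup in the first open question (not taken from any of the reads below): a shallow, one-hidden-layer ReLU network fit to a 1-Lipschitz target on [0, 1], with random hidden weights and least-squares outer weights. The target function, width, and grid sizes are arbitrary illustrative choices.

```python
# Minimal sketch: approximate a 1-Lipschitz function on [0, 1] in sup norm
# with a shallow ReLU network of width N (random features + least squares).
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # 1-Lipschitz target (illustrative choice)
    return np.abs(x - 0.3)

def relu(z):
    return np.maximum(z, 0.0)

N = 200                                   # hidden width
w = rng.uniform(-5.0, 5.0, size=N)        # random inner weights
b = rng.uniform(-5.0, 5.0, size=N)        # random inner biases

x_train = np.linspace(0.0, 1.0, 1000)
features = relu(np.outer(x_train, w) + b)                   # hidden activations, shape (1000, N)
a, *_ = np.linalg.lstsq(features, f(x_train), rcond=None)   # outer weights by least squares

x_test = np.linspace(0.0, 1.0, 10000)
approx = relu(np.outer(x_test, w) + b) @ a
print("sup-norm error on test grid:", np.max(np.abs(approx - f(x_test))))
```

The open question concerns how such errors scale with width and input dimension; this one-dimensional example only fixes the objects involved (shallow network, Lipschitz target, uniform error).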

Current reads:

Textbooks & Theses:

Approximation Theory:

Harmonic & Functional Analysis:

Shallow networks:

Classical results
Approximation of Lipschitz Functions
Infinite-width Shallow Networks/Neural Tangent Kernel (NTK)
Shallow Networks as Integral Transform/Ridge Functions and Radon Transform methods
Banach spaces of functions expressible by shallow networks
Misc

Depth separations:

Deep networks:

Misc
Banach spaces of functions expressible by deep networks

Expressivity and learning:

Miscellaneous Topics & Views:

PDEs
Quantized/bounded networks:
Random approximations:
Dynamical systems: