Approximation bounds for random neural networks and reservoir systems
Journal
The Annals of Applied Probability
Date Issued
2023-02
Abstract
This work studies approximation based on single-hidden-layer feedforward and recurrent neural networks with randomly generated internal weights. These methods, in which only the last layer of weights and a few hyperparameters are optimized, have been successfully applied in a wide range of static and dynamic learning problems. Despite the popularity of this approach in empirical tasks, important theoretical questions regarding the relation between the unknown function, the weight distribution, and the approximation rate have remained open. In this work it is proved that, as long as the unknown function, functional, or dynamical system is sufficiently regular, it is possible to draw the internal weights of the random (recurrent) neural network from a generic distribution (not depending on the unknown object) and quantify the error in terms of the number of neurons and the hyperparameters. In particular, this proves that echo state networks with randomly generated weights are capable of approximating a wide class of dynamical systems arbitrarily well and thus provides the first mathematical explanation for their empirically observed success at learning dynamical systems.
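Illustration (not from the paper): the following NumPy sketch shows the setting the abstract describes for the dynamical case, an echo state network whose reservoir and input weights are drawn at random from a generic distribution and never trained, with only the linear readout fitted by ridge regression. The target system, network size, spectral-radius scaling, and regularization constant are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative input/output system to learn (an arbitrary choice, not from the paper):
# the output depends nonlinearly on the recent input history.
T = 2000
u = rng.uniform(-0.5, 0.5, size=T)
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.4 * y[t - 1] + np.tanh(u[t] + 0.5 * u[t - 1] * u[t - 2])

# Reservoir with randomly generated, untrained internal weights,
# scaled so the spectral radius is below 1 (echo state property heuristic).
n_res = 300
W_res = rng.standard_normal((n_res, n_res))
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))
W_in = rng.standard_normal(n_res)

# Drive the reservoir with the input sequence and collect its states.
X = np.zeros((T, n_res))
state = np.zeros(n_res)
for t in range(T):
    state = np.tanh(W_res @ state + W_in * u[t])
    X[t] = state

# Only the last layer (the linear readout) is optimized, via ridge regression,
# after discarding an initial washout period.
washout, ridge = 100, 1e-6
Xw, yw = X[washout:], y[washout:]
W_out = np.linalg.solve(Xw.T @ Xw + ridge * np.eye(n_res), Xw.T @ yw)

print(f"training RMSE: {np.sqrt(np.mean((Xw @ W_out - yw) ** 2)):.4f}")
```

The static (feedforward) case in the abstract corresponds to the same recipe without the recurrent state: a random affine map plus nonlinearity as hidden layer, followed by a fitted linear readout.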
Language
English
Keywords
Approximation error
echo state networks
neural networks
random function approximation
reservoir computing
Refereed
Yes
Publisher
Institute of Mathematical Statistics
Volume
33
Number
1
Start page
28
End page
69
Official URL
Eprints ID
269606
File(s)
open access
Name
2002.05933.pdf
Size
497.58 KB
Format
Adobe PDF
Checksum (MD5)
70ec28be530c29bdf0ae094153e6a4f8