Comments by "SterileNeutrino" (@SterileNeutrino) on The Lunduke Journal video "A.I. Now Convincingly Replicates a Person's Handwriting... and it's Just as Creepy as it Sounds."
Function fitting is what multilayer perceptrons do. The scene in Terminator 2 with the phone call to the foster parents was on the money.
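The "function fitting" view can be made concrete in a few lines: a multilayer perceptron is nothing but a composition of affine maps and elementwise squashing nonlinearities, i.e. a parameterized function of its input. A minimal sketch (the weight values here are arbitrary for illustration, not trained):

```python
import numpy as np

# A multilayer perceptron is literally a composed function:
# affine map -> elementwise "squashing" nonlinearity -> affine map.
# The weights below are arbitrary illustration values, not trained.
W1 = np.array([[1.0, -1.0], [0.5, 2.0]])   # first-layer weights (2 inputs -> 2 hidden)
b1 = np.array([0.0, -1.0])                 # first-layer biases
W2 = np.array([[1.0], [-1.0]])             # output weights (2 hidden -> 1 output)
b2 = np.array([0.5])                       # output bias

def mlp(x):
    """f(x) = W2^T tanh(W1 x + b1) + b2 -- a plain nested function of x."""
    h = np.tanh(W1 @ x + b1)               # hidden layer: squash an affine map
    return W2.T @ h + b2                   # output layer: another affine map

y = mlp(np.array([0.3, -0.2]))
print(y)
```

Training only adjusts the numbers in `W1`, `b1`, `W2`, `b2` so that this composed function fits the data.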
... Found it: the (rather theoretical) antithesis to Minsky and Papert being downbeat in 1969 on the capabilities of single-layer neural networks (it could have been written in 1970, too):
Multilayer Feedforward Networks are Universal Approximators
Kurt Hornik, Technische Universität Wien; Maxwell Stinchcombe and Halbert White, University of California, San Diego
(Received 16 September 1988; revised and accepted 9 March 1989)
"This paper rigorously establishes that standard multilayer feedforward networks with as few as one hidden layer using arbitrary squashing functions are capable of approximating any Borel measurable function [~ continuous] from one finite dimensional space to another to any desired degree of accuracy, provided sufficiently many hidden units are available. In this sense, multilayer feedforward networks are a class of universal approximators."
"The apparent ability of sufficiently elaborate feed-forward networks to approximate quite well nearly any function encountered in applications leads one to wonder about the ultimate capabilities of such networks. Are the successes observed to date reflective of some deep and fundamental approximation capability, or are they merely flukes, resulting from selective reporting and a fortuitous choice of problems? Are multilayer feedforward networks in fact inherently limited to approximating only some fairly special class of functions, albeit a class somewhat larger than the lowly perceptron? The purpose of this paper is to address these issues. We show that multilayer feed-forward networks with as few as one hidden layer are indeed capable of universal approximation in a very precise and satisfactory sense."
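The theorem's claim is easy to watch in action. Below is a sketch (not the paper's construction): a single-hidden-layer network with tanh as the squashing function, where the hidden weights are drawn at random and only the output weights are fit by least squares. Adding hidden units visibly drives the approximation error toward zero on a sample target function:

```python
import numpy as np

# Sketch of the universal-approximation claim (not the paper's proof
# construction): y(x) = sum_j beta_j * tanh(w_j * x + b_j), a single
# hidden layer of "squashing" units. Hidden weights w, b are random;
# only the output weights beta are fit, via least squares.
rng = np.random.default_rng(0)

def fit_one_hidden_layer(x, y, n_hidden):
    """Fit output weights of a random-feature, single-hidden-layer net."""
    w = rng.uniform(-5, 5, n_hidden)              # hidden input weights
    b = rng.uniform(-5, 5, n_hidden)              # hidden biases
    H = np.tanh(np.outer(x, w) + b)               # hidden activations, shape (N, n_hidden)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # least-squares output weights
    return lambda t: np.tanh(np.outer(t, w) + b) @ beta

x = np.linspace(-np.pi, np.pi, 400)
target = np.sin(x)                                # the function to approximate

for n in (5, 50, 200):
    net = fit_one_hidden_layer(x, target, n)
    err = np.max(np.abs(net(x) - target))
    print(f"{n:4d} hidden units: max error {err:.4f}")
```

With "sufficiently many hidden units" the maximum error shrinks toward zero, which is the theorem's statement in miniature for this one target function.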