Quantifying Generalization in Linearly Weighted Neural Networks
Martin Anthony
Mathematics Department,
The London School of Economics
and Political Science (University of London),
Houghton Street, London WC2A 2AE, UK
Sean B. Holden
Cambridge University Engineering Department,
Trumpington Street, Cambridge CB2 1PZ, UK
Abstract
The Vapnik-Chervonenkis (VC) dimension has proven to be of great use in the theoretical study of generalization in artificial neural networks. We describe the "probably approximately correct" learning framework and illustrate the central role the VC dimension plays within it. We then investigate the VC dimension of certain types of linearly weighted neural networks. First, we obtain bounds on the VC dimension of radial basis function networks with basis functions of several types. Second, we calculate the VC dimension of polynomial discriminant functions defined over both real-valued and binary-valued inputs.
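For concreteness, the networks in question compute thresholded linear combinations of fixed basis functions; the following is a minimal sketch in assumed notation ($\phi_1,\dots,\phi_k$ for the fixed basis functions, $w_0,\dots,w_k$ for the adjustable weights), not a definition taken verbatim from the paper:
\[
f_{\mathbf{w}}(\mathbf{x}) \;=\; \operatorname{sgn}\!\Bigl(w_0 + \sum_{i=1}^{k} w_i\,\phi_i(\mathbf{x})\Bigr).
\]
Only the weights are adjustable; the basis functions $\phi_i$ are fixed in advance (for instance, radial basis functions, or monomials in the polynomial discriminant case). A classical consequence of this linear structure is that the VC dimension of such a class is at most $k+1$.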