Paper
2 September 1993 Statistical analysis of information content for training pattern recognition networks
Charles L. Wilson
Abstract
Statistical models of neural networks predict that the difference between training and testing error will be linear in network complexity and quadratic in the feature noise of the training set. Models of this kind have been applied to the Boltzmann pruning of a large MLP (3,786 weights) trained on 10,000 and tested on 10,000 Karhunen-Loeve (K-L) feature sets derived from images of handprinted characters, and to a fingerprint classification problem with 17,157 weights trained and tested on 2,000 K-L feature sets. By using the information content to optimize network size, the pruned networks achieve high recognition rates while being reduced in size by up to 90%. In the pruning process, the product of the network capacity and the recognition error can be used effectively to select an optimum pruned network. If, in addition to conventional Boltzmann weight reduction, a weight reduction method is used that accounts for the variance content of the K-L features by weighting the features with the K-L eigenvalues, networks with optimal size and information content can be constructed.
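The following is a minimal sketch, not the author's code, of two ideas stated in the abstract: selecting a pruned network by minimizing the product of network capacity (taken here as the surviving weight count, an assumption) and recognition error, and weighting K-L (principal component) features by their eigenvalues. The function names, the use of NumPy, and the toy numbers are all illustrative assumptions.

```python
import numpy as np

def select_pruned_network(candidates):
    """Pick the candidate minimizing capacity * recognition error.

    candidates: list of (n_weights, recognition_error) pairs, one per
    pruned network. Capacity is approximated by the weight count here,
    which is an assumption, not the paper's exact capacity measure.
    """
    scores = [n_weights * error for n_weights, error in candidates]
    return int(np.argmin(scores))

def weight_kl_features(features, eigenvalues):
    """Scale each K-L feature by its eigenvalue so low-variance
    components contribute less during weight reduction.

    features: array of shape (n_samples, n_components)
    eigenvalues: array of shape (n_components,)
    """
    return features * eigenvalues  # broadcasts across samples

if __name__ == "__main__":
    # Toy stand-ins for pruned-network statistics (weights, error rate).
    candidates = [(3786, 0.040), (1900, 0.042), (380, 0.055)]
    print("selected candidate index:", select_pruned_network(candidates))

    rng = np.random.default_rng(0)
    feats = rng.standard_normal((5, 4))
    eigvals = np.array([2.0, 1.0, 0.5, 0.1])
    print(weight_kl_features(feats, eigvals).shape)
```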
© 1993 Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Charles L. Wilson "Statistical analysis of information content for training pattern recognition networks", Proc. SPIE 1965, Applications of Artificial Neural Networks IV, (2 September 1993); https://doi.org/10.1117/12.152563
KEYWORDS
Optical character recognition
Principal component analysis
Image classification
Neural networks
Pattern recognition
Statistical analysis
Image processing
