Paper, 16 December 1992
Asymptotic improvement of supervised learning by utilizing additional unlabeled samples: normal mixture density case
Behzad M. Shahshahani, David A. Landgrebe
Abstract
The effect of additional unlabeled samples in improving the supervised learning process is studied in this paper. Three learning processes, supervised, unsupervised, and combined supervised-unsupervised, are compared by studying the asymptotic behavior of the estimates obtained under each process. Upper and lower bounds on the asymptotic covariance matrices are derived. It is shown that under a normal mixture density assumption for the probability density function of the feature space, the combined supervised-unsupervised learning is always superior to the supervised learning in achieving better estimates. Experimental results are provided to verify the theoretical concepts.
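The combined supervised-unsupervised estimation the abstract describes can be illustrated with a minimal sketch (not the authors' exact derivation): a small labeled sample fixes the component identities, while a large unlabeled sample drawn from the same normal mixture is folded in via EM, pooling hard labeled memberships with soft posterior weights. All names, the one-dimensional two-component setup, and the synthetic parameters below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 1-D two-component normal mixture (the paper treats the
# general multivariate case); means, stds, and weights are assumed here.
true_means = np.array([-2.0, 2.0])

# Small labeled set: 10 samples per class, classes known.
y_lab = np.repeat(np.arange(2), 10)
x_lab = rng.normal(true_means[y_lab], 1.0)

# Large unlabeled set from the same mixture, equal mixing weights.
z_unl = rng.choice(2, size=500)
x_unl = rng.normal(true_means[z_unl], 1.0)

def em_combined(x_lab, y_lab, x_unl, n_iter=100):
    """EM for a 2-component normal mixture using labeled + unlabeled data."""
    # Initialize from the labeled samples alone (the supervised estimate).
    mu = np.array([x_lab[y_lab == k].mean() for k in range(2)])
    sig = np.array([x_lab[y_lab == k].std() + 1e-3 for k in range(2)])
    w = np.array([np.mean(y_lab == k) for k in range(2)])
    for _ in range(n_iter):
        # E-step: posterior class probabilities for the unlabeled samples;
        # labeled samples keep their known (hard) memberships.
        dens = np.array([w[k] / (sig[k] * np.sqrt(2 * np.pi))
                         * np.exp(-0.5 * ((x_unl - mu[k]) / sig[k]) ** 2)
                         for k in range(2)])
        r = dens / dens.sum(axis=0)
        # M-step: pool labeled (weight 1) and unlabeled (soft) contributions.
        for k in range(2):
            lab_k = (y_lab == k).astype(float)
            n_k = lab_k.sum() + r[k].sum()
            mu[k] = (lab_k @ x_lab + r[k] @ x_unl) / n_k
            sig[k] = np.sqrt((lab_k @ (x_lab - mu[k]) ** 2
                              + r[k] @ (x_unl - mu[k]) ** 2) / n_k)
            w[k] = n_k / (len(x_lab) + len(x_unl))
    return mu, sig, w

mu, sig, w = em_combined(x_lab, y_lab, x_unl)
print("means:", mu, "stds:", sig, "weights:", w)
```

With only the 20 labeled points, the mean estimates have high variance; adding the 500 unlabeled points via EM reduces the asymptotic covariance of the estimates, which is the qualitative effect the paper quantifies with its bounds.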
© (1992) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Behzad M. Shahshahani and David A. Landgrebe "Asymptotic improvement of supervised learning by utilizing additional unlabeled samples: normal mixture density case", Proc. SPIE 1766, Neural and Stochastic Methods in Image and Signal Processing, (16 December 1992); https://doi.org/10.1117/12.130825
CITATIONS
Cited by 9 scholarly publications.
KEYWORDS
Machine learning, Matrices, Signal processing, Statistical analysis, Stochastic processes, Image processing, Lithium