This paper presents a new model for focusing attention in hierarchically structured neural networks. Emphasis is placed on determining the location of the focus of attention. The main idea is that attention is closely coupled with predictions about the environment: whenever there is a mismatch between prediction and reality, a shift of attention is performed. This mismatch can also be used to change (learn) the prediction and processing mechanism, so that the prediction will be better next time. In this sense, attention and learning are closely coupled. We present a first application of this mechanism to the classification of satellite image (Landsat TM) data. Using the attentional mechanism reduces processing time by 50% while maintaining classification accuracy.
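The abstract's mismatch-driven attention loop can be sketched in a few lines. This is a minimal, hypothetical illustration, not the authors' implementation: the names `predict`, `classify`, the per-region mean-absolute-difference mismatch measure, and the `threshold` parameter are all assumptions. The idea shown is only that regions whose content matches the prediction are skipped, and the full classifier runs only where the prediction fails, which is how processing time can be saved without losing accuracy on well-predicted regions.

```python
import numpy as np

def attend_and_classify(image_regions, predict, classify, threshold=0.1):
    """Hypothetical sketch of prediction-mismatch attention.

    image_regions : iterable of 2-D patches (e.g. Landsat TM tiles)
    predict       : region_id -> (predicted_label, predicted_patch)
    classify      : patch -> label (the expensive full classifier)
    threshold     : mismatch level that triggers a shift of attention
    """
    labels = {}
    for region_id, patch in enumerate(image_regions):
        predicted_label, predicted_patch = predict(region_id)
        # Mismatch between prediction and reality (assumed measure).
        mismatch = np.mean(np.abs(patch - predicted_patch))
        if mismatch > threshold:
            # Prediction failed: shift attention here, run the classifier.
            labels[region_id] = classify(patch)
        else:
            # Prediction confirmed: reuse the predicted label, skip work.
            labels[region_id] = predicted_label
    return labels
```

In this sketch the same mismatch signal that triggers the attention shift could also drive a learning update of `predict`, coupling attention and learning as the abstract describes.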
Horst Bischof, Karin Hraby, "Focusing attention in hierarchical neural networks," Proc. SPIE 2093, Substance Identification Analytics, (1 February 1994); https://doi.org/10.1117/12.172503