Paper, 6 April 1995
ReverbaProp: simultaneous learning of credit and weight
Robert Chris Lacher, Douglas Alan Klotter
Abstract
We introduce a supervised learning method for feed-forward networks that solves the credit assignment problem for error in concert with solving the error reduction problem normally associated with methods such as backpropagation. The method reverberates between forward and reverse activations of the network. Forward activation using an exemplar computes output for each node in the network using the connection weights as usual. Reverse activation using output error as input computes local error at each node using reverse weights, or responsibilities, on the reverse connections. Reverse-reverse activation (the same as forward activation with linear output functions) using reverse output error as input computes local reverse error at each node. Once local error and local reverse error have been assigned to each node, weights and responsibilities are modified using the standard delta rule with local error and local reverse error, respectively. The method relies on convergence toward an optimal set of responsibilities for reverse error distribution in concert with convergence toward an optimal set of weights, and thus avoids the calculation of nonlinear terms required by the usual error backpropagation method. As a result, the method is free of derivative evaluations, and by allowing credit assignment to optimize simultaneously with error reduction, it promotes clustering of responsibility among the nodes.
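To make the procedure above concrete, the following is a minimal sketch in Python/NumPy of one possible reading of a single reverberating training step for a one-hidden-layer network. It is an illustration under stated assumptions, not the paper's implementation: the layer sizes, learning rate, the choice to treat the signal arriving at the input end of the reverse pass (e_in) as the reverse output error, and the exact pairing of local reverse errors with responsibility updates are all assumptions made for this sketch.

```python
# Sketch of a "reverberating" forward/reverse training step for a
# single-hidden-layer network, following the procedure outlined in the
# abstract.  Details not given in the abstract (update pairings, signs,
# treatment of the reverse output error) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


# Forward weights W1, W2 and independently learned reverse weights
# ("responsibilities") R2, R1 on the reverse connections.
n_in, n_hid, n_out = 2, 4, 1
W1 = rng.normal(scale=0.5, size=(n_hid, n_in))
W2 = rng.normal(scale=0.5, size=(n_out, n_hid))
R2 = rng.normal(scale=0.5, size=(n_hid, n_out))  # reverse of W2
R1 = rng.normal(scale=0.5, size=(n_in, n_hid))   # reverse of W1
eta = 0.1  # learning rate shared by weights and responsibilities


def train_step(x, t):
    """One reverberating update for exemplar x with target t."""
    global W1, W2, R1, R2

    # 1. Forward activation: node outputs from the usual connection weights.
    h = sigmoid(W1 @ x)
    y = sigmoid(W2 @ h)

    # 2. Reverse activation: propagate the output error back through the
    #    responsibilities to assign a local error to every node.  Note that
    #    no activation-function derivatives are evaluated anywhere.
    e_out = t - y        # output error
    e_hid = R2 @ e_out   # local error at the hidden nodes
    e_in = R1 @ e_hid    # reverse output, arriving at the input end

    # 3. Reverse-reverse activation: a forward pass with linear node
    #    functions, driven by the reverse output error (taken here to be
    #    e_in, an assumption of this sketch), assigning a local reverse
    #    error to each node.
    r_hid = W1 @ e_in    # local reverse error at the hidden nodes

    # 4. Delta-rule updates: weights learn from local errors,
    #    responsibilities learn from local reverse errors.
    W2 += eta * np.outer(e_out, h)
    W1 += eta * np.outer(e_hid, x)
    R2 += eta * np.outer(r_hid, e_out)
    R1 += eta * np.outer(e_in, e_hid)

    return float(np.sum(e_out ** 2))


# Mechanics check on a toy exemplar set (not a convergence claim).
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
T = np.array([[0.0], [1.0], [1.0], [0.0]])
for epoch in range(5):
    sse = sum(train_step(x, t) for x, t in zip(X, T))
    print(f"epoch {epoch}: sum of squared output errors = {sse:.4f}")
```

The structural point the sketch illustrates is that the reverse pass uses learned responsibilities rather than transposed weights, and that no derivative of the node activation function appears in any of the updates.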
© 1995 Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Robert Chris Lacher and Douglas Alan Klotter "ReverbaProp: simultaneous learning of credit and weight", Proc. SPIE 2492, Applications and Science of Artificial Neural Networks, (6 April 1995); https://doi.org/10.1117/12.205159
KEYWORDS
Error analysis
Binary data
Machine learning
Network architectures
Computer science
Computing systems
Neural networks