Paper
Neural-based nonimaging vision system for robotic sensing
2 March 1994
Timothy C. Edwards, Joe R. Brown
Abstract
A multispectral, multiaperture, nonimaging sensor was simulated and constructed to show that the relative location of a robot arm and a specified target can be determined through neural-network processing when the arm and target produce different spectral signatures. Data acquired from both computer simulation and an actual hardware implementation were used to train an artificial neural network to yield the relative position, in two dimensions, of a robot arm and a target. The arm and target contained optical sources with different spectral characteristics, which allowed the sensor to discriminate between them. Simulation of the sensor gave an error distribution with a mean of zero and a standard deviation of 0.3 inches in each dimension across a work area of 6 by 10 inches. The actual sensor produced a standard deviation of approximately 0.8 inches using a limited number of training and test sets. No significant difference in system performance was found between the 9- and 18-aperture configurations, indicating that the minimum number of apertures required is nine or fewer.
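The approach described above, regressing positions directly from raw multiaperture readings with a trained network rather than forming an image, can be illustrated with a small simulation. The sketch below is not the authors' implementation: the aperture layout, two-band filter gains, inverse-square source model, network size, and use of scikit-learn's MLPRegressor are all illustrative assumptions, chosen only to show the pipeline of simulating readings for known arm and target positions and then training a network to recover those positions.

```python
# Minimal sketch (illustrative assumptions, not the authors' implementation):
# simulate a multiaperture, nonimaging sensor viewing two sources with
# different spectral weights, then train a small neural network to regress
# the 2-D positions of the arm and the target from the aperture readings.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

N_APERTURES = 9                 # assumed aperture count (the paper compares 9 and 18)
WORK_X, WORK_Y = 10.0, 6.0      # work area in inches, per the abstract

# Assumed aperture positions scattered over the work area
apertures = np.column_stack([
    rng.uniform(0, WORK_X, N_APERTURES),
    rng.uniform(0, WORK_Y, N_APERTURES),
])
# Assumed two-band spectral gains per aperture: column 0 = arm band, column 1 = target band
filters = rng.uniform(0.2, 1.0, size=(N_APERTURES, 2))

def sensor_reading(arm_xy, target_xy):
    """Illustrative nonimaging response: each aperture sums spectrally
    weighted inverse-square irradiance from the arm and target sources."""
    out = np.empty(N_APERTURES)
    for i, (ax, ay) in enumerate(apertures):
        d_arm = np.hypot(ax - arm_xy[0], ay - arm_xy[1]) + 0.5
        d_tgt = np.hypot(ax - target_xy[0], ay - target_xy[1]) + 0.5
        out[i] = filters[i, 0] / d_arm**2 + filters[i, 1] / d_tgt**2
    return out

# Simulated training data: random arm/target positions -> aperture readings
n_samples = 5000
positions = rng.uniform([0, 0, 0, 0],
                        [WORK_X, WORK_Y, WORK_X, WORK_Y],
                        size=(n_samples, 4))          # (arm_x, arm_y, target_x, target_y)
readings = np.array([sensor_reading(p[:2], p[2:]) for p in positions])

X_train, X_test, y_train, y_test = train_test_split(readings, positions, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
net.fit(X_train, y_train)

# Report the per-axis standard deviation of the position error on held-out data
errors = net.predict(X_test) - y_test
print("per-axis error std (inches):", errors.std(axis=0))
```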
© (1994) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Timothy C. Edwards and Joe R. Brown "Neural-based nonimaging vision system for robotic sensing", Proc. SPIE 2243, Applications of Artificial Neural Networks V, (2 March 1994); https://doi.org/10.1117/12.169994
CITATIONS
Cited by 2 scholarly publications.
KEYWORDS
Sensors, Ruthenium, Eye, Sensing systems, Robotics, Computer simulations, Optical filters
