Paper
Multimodally controlled intelligent telerobot for people with disabilities
17 December 1996
Zunaid Kazi, Shoupu Chen, Matthew Beitler, Daniel Chester, Richard Foulds
Abstract
This paper reports on the current status of the Multimodal User Supervised Interface and Intelligent Control (MUSIIC) project, which is developing an intelligent assistive telemanipulation system for people with motor disabilities. The MUSIIC strategy overcomes the limitations of previous approaches by integrating a multimodal robot user interface (RUI) with a semi-autonomous reactive planner, allowing users with severe motor disabilities to manipulate objects in an unstructured domain. The multimodal user interface combines speech and deictic (pointing) gesture control to guide a semi-autonomous planner that operates the assistive telerobot. MUSIIC uses a stereo vision system to determine the three-dimensional shape, pose, and color of objects and surfaces in the environment, together with an object-oriented knowledge base and planning system that superimposes information about common objects onto the three-dimensional world. This approach lets users identify objects and tasks through a multimodal interface that interprets their deictic gestures and restricted, natural-language-like speech input. The multimodal interface eliminates the need for general-purpose object recognition by binding the user's speech and gesture input to a locus in the domain of interest. The underlying knowledge-driven planner combines information from the user, the stereo vision mechanism, and the knowledge bases to adapt previously learned plans to new tasks and to manipulate objects newly introduced into the workspace. The result is a flexible, intelligent telemanipulation system that functions as an assistive robot for people with motor disabilities.
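The abstract describes binding the user's speech and pointing gesture to a locus in the workspace so that general-purpose object recognition is unnecessary. The sketch below illustrates one way such speech-gesture binding could work: the pointed-at 3D locus narrows the candidate set to nearby detected objects, and a spoken attribute (here, color) disambiguates further. All names (`SceneObject`, `bind_reference`) and the data layout are illustrative assumptions, not details from the paper.

```python
from dataclasses import dataclass
import math

@dataclass
class SceneObject:
    name: str    # label from a knowledge base (e.g. "cup")
    x: float     # centroid from stereo vision, metres
    y: float
    z: float
    color: str

def bind_reference(objects, gesture_point, spoken_color=None):
    """Resolve a deictic reference: choose the detected object
    nearest the pointed-at locus, optionally filtered by a spoken
    attribute. The pointing gesture restricts the search to a small
    neighbourhood, so no general object recognition is needed."""
    candidates = [o for o in objects
                  if spoken_color is None or o.color == spoken_color]
    if not candidates:
        return None
    return min(candidates,
               key=lambda o: math.dist((o.x, o.y, o.z), gesture_point))

# Example: "pick up that red cup" while pointing near (0.4, 0.2, 0.1)
scene = [SceneObject("cup", 0.41, 0.19, 0.10, "red"),
         SceneObject("book", 0.80, 0.30, 0.05, "blue")]
target = bind_reference(scene, (0.4, 0.2, 0.1), spoken_color="red")
```

In this toy scene the red cup is both the nearest object to the gesture locus and the only color match, so it is selected; a fuller system would also weigh gesture uncertainty and dialogue context.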
© (1996) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Zunaid Kazi, Shoupu Chen, Matthew Beitler, Daniel Chester, and Richard Foulds "Multimodally controlled intelligent telerobot for people with disabilities", Proc. SPIE 2901, Telemanipulator and Telepresence Technologies III, (17 December 1996); https://doi.org/10.1117/12.263001
CITATIONS
Cited by 1 scholarly publication.
KEYWORDS
Human-machine interfaces
Control systems
Object recognition
Computing systems
Intelligence systems
Robotics
Sensors
