
A neural model for visual-tactile-motor integration in robotic reaching and grasping tasks

Published online by Cambridge University Press: 23 January 2002

J. López-Coronado
Affiliation:
Dept. of Systems Engineering and Automation, Polytechnic University of Cartagena (Spain).
J.L. Pedreño-Molina
Affiliation:
Dept. of Information Technologies and Communications, Polytechnic University of Cartagena (Spain).
A. Guerrero-González
Affiliation:
Dept. of Systems Engineering and Automation, Polytechnic University of Cartagena (Spain).
P. Gorce
Affiliation:
U483 INSERM, IUT Cachan, Université Paris Sud, Cachan (France).

Abstract

This paper presents a neural model for solving the visual-tactile-motor coordination problem in robotic applications. The proposed neural controller is based on the VAM (Vector Associative Map) model. Inspired by the human biological system, the algorithm learns the mapping between spatial and motor coordinates, where the spatial inputs are composed of visual and force parameters. The LINCE stereohead performs the visual detection process, locating the positions of the object and of the manipulator. Artificial tactile skins placed over the two fingers of the gripper measure the force distribution when an object is touched. The neural controller has been implemented for robotic reaching and object-grasping operations. The reaching process is driven by feedback that minimizes the Difference Vector (DV) between the visual projections of the object and the manipulator. The stable grasping task processes the force distribution maps detected at the contact with the two surfaces of the gripper in order to guide the object into the robotic fingers. Experimental results demonstrate the robustness of the model and the accuracy of the final pick-and-place process.
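To make the feedback loop described above concrete, the following is a minimal sketch, in Python, of a DV-driven reaching step: the visual Difference Vector between the projections of the object and the gripper is mapped to a motor increment through a spatial-to-motor mapping. The linear map `W`, the gain, and the toy forward model are assumptions made for brevity; this is an illustration of the general VAM-style scheme, not the authors' implementation.

```python
import numpy as np

def difference_vector(target_xy, gripper_xy):
    """DV in visual (spatial) coordinates: object projection minus gripper projection."""
    return np.asarray(target_xy, dtype=float) - np.asarray(gripper_xy, dtype=float)

def reach_step(W, target_xy, gripper_xy, joints, gain=0.1):
    """One feedback step: map the visual DV to a motor increment via the learned mapping W."""
    dv = difference_vector(target_xy, gripper_xy)
    joints = joints + gain * (W @ dv)   # drive the joints so that the DV shrinks
    return joints, dv

# Hypothetical usage: iterate until the visual DV is (nearly) nulled.
if __name__ == "__main__":
    W = np.eye(2)                        # placeholder for the learned spatial-to-motor mapping
    joints = np.zeros(2)
    target = np.array([0.4, 0.2])        # visual projection of the object
    gripper = np.zeros(2)                # visual projection of the manipulator
    for _ in range(50):
        joints, dv = reach_step(W, target, gripper, joints)
        gripper = joints                 # toy forward model: gripper projection tracks the joints
        if np.linalg.norm(dv) < 1e-3:
            break
```

In the VAM framework the mapping `W` is itself learned (e.g. during a motor-babbling phase) rather than fixed as in this toy example; here it is hard-wired only to keep the sketch self-contained.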

Type
Research Article
Copyright
© 2002 Cambridge University Press
