
Cooperative integration of vision and touch

Peter K. Allen

Title:
Cooperative integration of vision and touch
Author(s):
Allen, Peter K.
Date:
Type:
Articles
Department:
Computer Science
Permanent URL:
Part Number:
1198
Book/Journal Title:
Sensor fusion II : human and machine strategies : 6-9 November 1989, Philadelphia, Pennsylvania
Publisher:
Society of Photo-optical Instrumentation Engineers
Abstract:
Vision and touch have proved to be powerful sensing modalities in humans. In order to build robots capable of complex behavior, analogues of human vision and taction need to be created. In addition, strategies for intelligent use of these sensors in tasks such as object recognition need to be developed. Two overriding principles that dictate a good strategy for cooperative use of these sensors are the following: 1) the sensors should complement each other in the kind and quality of data they report, and 2) each sensor system should be used in the most robust manner possible. We demonstrate this with a contour-following algorithm that recovers the shape of surfaces of revolution from sparse tactile sensor data. The absolute location in depth of an object can be found more accurately through touch than through vision, but the global properties of where to actively explore with the hand are better found through vision.
Subject(s):
Robotics, Artificial intelligence
Publisher DOI:
http://dx.doi.org/10.1117/12.969990
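
The abstract only summarizes the approach, so the following is a minimal illustrative sketch, not the paper's algorithm, of how sparse tactile contact points might be combined with a vision-derived axis estimate to recover the radius profile of a surface of revolution. The function `radius_profile`, its parameters, and the example data are all hypothetical.

```python
"""Illustrative sketch: recover the radius profile of a surface of
revolution from sparse tactile contacts, given an axis estimate from vision."""
import numpy as np


def radius_profile(contacts, axis_point, axis_dir):
    """Project sparse 3-D contact points onto a vision-supplied axis and
    return (height_along_axis, radius) samples sorted along the axis."""
    axis_dir = axis_dir / np.linalg.norm(axis_dir)
    rel = contacts - axis_point                  # vectors from a point on the axis to each contact
    heights = rel @ axis_dir                     # signed distance of each contact along the axis
    radial = rel - np.outer(heights, axis_dir)   # component of each vector perpendicular to the axis
    radii = np.linalg.norm(radial, axis=1)       # perpendicular distance = local surface radius
    order = np.argsort(heights)
    return heights[order], radii[order]


if __name__ == "__main__":
    # Hypothetical example: contacts sampled on a cylinder of radius 0.05 m
    # whose axis (as estimated by vision) is the z-axis through the origin.
    theta = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
    contacts = np.stack([0.05 * np.cos(theta),
                         0.05 * np.sin(theta),
                         np.linspace(0.0, 0.2, 8)], axis=1)
    h, r = radius_profile(contacts, np.zeros(3), np.array([0.0, 0.0, 1.0]))
    print(np.round(r, 3))   # all ~0.05: a constant radius profile, i.e. a cylinder
```

In this sketch vision supplies the global information (where the object is and the orientation of its axis), while touch supplies accurate local depth at the contact points, mirroring the division of labor described in the abstract.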
