
Hand-eye coordination for grasping moving objects

Peter K. Allen; Aleksandar Timcenko; Billibon Yoshimi; Paul Michelman

Title: Hand-eye coordination for grasping moving objects
Author(s): Allen, Peter K.; Timcenko, Aleksandar; Yoshimi, Billibon; Michelman, Paul
Date:
Type: Articles
Department: Computer Science
Permanent URL:
Part Number: 1383
Book/Journal Title: Sensor Fusion III: 3D Perception and Recognition
Book Author: Schenker, Paul S.
Publisher: Society of Photo-optical Instrumentation Engineers
Publisher Location: Bellingham, Wash.
Abstract:
Most robotic grasping tasks assume a stationary or fixed object. In this paper, we explore the requirements for grasping a moving object. This task requires proper coordination among at least three separate subsystems: dynamic vision sensing, real-time arm control, and grasp control. As with humans, our system first visually tracks the object's 3-D position. Because the object is in motion, this tracking must be done dynamically so that the motion of the robotic arm can be coordinated with the object as it moves. The dynamic vision system feeds a real-time arm control algorithm that plans a trajectory. The arm control algorithm is implemented in two steps: 1) filtering and prediction, and 2) kinematic transformation computation. Once the trajectory of the object is tracked, the hand must intercept the object to actually grasp it. We present three different strategies for intercepting the object and results from the tracking algorithm.
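This record carries only the abstract, so the following is a minimal illustrative sketch of the kind of filtering-and-prediction step it describes, not the authors' implementation. It assumes a simple alpha-beta (constant-velocity) filter over noisy 3-D position measurements; the class name AlphaBetaTracker, the gains alpha and beta, the sampling interval dt, and the prediction horizon are all hypothetical choices for illustration.

```python
import numpy as np

class AlphaBetaTracker:
    """Illustrative alpha-beta filter: smooths noisy 3-D position
    measurements and extrapolates the object's position a short time
    ahead, the kind of filtering-and-prediction the abstract mentions
    before the kinematic transformation step."""

    def __init__(self, alpha=0.85, beta=0.005, dt=1.0 / 60.0):
        self.alpha = alpha      # position correction gain (assumed value)
        self.beta = beta        # velocity correction gain (assumed value)
        self.dt = dt            # vision sampling interval in seconds (assumed)
        self.x = np.zeros(3)    # estimated position (x, y, z)
        self.v = np.zeros(3)    # estimated velocity

    def update(self, z):
        """Fold in one noisy 3-D position measurement z."""
        # Predict forward one sample with a constant-velocity model.
        x_pred = self.x + self.v * self.dt
        # Correct position and velocity with the measurement residual.
        r = np.asarray(z, dtype=float) - x_pred
        self.x = x_pred + self.alpha * r
        self.v = self.v + (self.beta / self.dt) * r
        return self.x

    def predict(self, horizon):
        """Extrapolate the estimate `horizon` seconds ahead; an arm
        controller could aim for this point when intercepting."""
        return self.x + self.v * horizon


# Hypothetical usage: feed measurements from a vision system and hand
# the short-horizon prediction to the arm's trajectory planner.
tracker = AlphaBetaTracker()
for z in [[0.50, 0.10, 0.02], [0.52, 0.10, 0.02], [0.54, 0.11, 0.02]]:
    tracker.update(z)
goal = tracker.predict(0.2)   # aim 200 ms ahead of the current estimate
print(goal)
```

The paper itself evaluates several interception strategies and reports tracking results; the constant-velocity model above is only one plausible stand-in for the prediction stage.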
Subject(s): Robotics
Publisher DOI: http://dx.doi.org/10.1117/12.25255
