
Integrating real-time vision and manipulation

Yoshimi, Billibon; Allen, Peter K.

We describe a system that integrates real-time computer vision with a sensorless gripper to provide closed-loop feedback control for grasping and manipulation tasks. Many hand-eye coordination skills can be thought of as sensory control loops, where specialized reasoning is embodied as a feedback or control path in the loop's construction. Our framework captures the essence of these hand-eye coordination skills in simple visual control primitives, which are a key component of the software integration. The primitives use a simple visual tracking and correspondence scheme to provide real-time feedback control in the presence of imprecise camera calibrations. Experimental results are shown for the positioning task of locating, picking up, and inserting a bolt into a nut under visual control. Results are also presented for the visual control of a bolt-tightening task.
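The visual control primitives described in the abstract amount to image-space feedback loops: each iteration measures the error between a tracked feature and a goal position, then commands a motion that reduces it. A minimal sketch of such a loop follows; all function and variable names here are hypothetical illustrations, not the paper's actual interface.

```python
def visual_servo_step(gripper_pos, target_pos, gain=0.5):
    """One iteration of an image-based feedback loop.

    Moves the gripper a fraction (`gain`) of the image-space error
    toward the tracked target. Because only the error's direction and
    rough magnitude matter, the loop converges even when the camera
    calibration is imprecise.
    """
    ex = target_pos[0] - gripper_pos[0]
    ey = target_pos[1] - gripper_pos[1]
    return (gripper_pos[0] + gain * ex, gripper_pos[1] + gain * ey)


# Closed-loop positioning: iterate until the image-space error is small.
# In a real system, gripper_pos and target_pos would come from the
# tracking/correspondence scheme on each new camera frame.
pos = (0.0, 0.0)
target = (10.0, 4.0)
for _ in range(20):
    pos = visual_servo_step(pos, target)

err = abs(target[0] - pos[0]) + abs(target[1] - pos[1])
```

Each step shrinks the remaining error by the gain factor, so after 20 iterations the gripper has converged to the target in image coordinates, without ever needing an accurate camera-to-robot transform.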

Also Published In

Title
Proceedings of the Thirtieth Hawaii International Conference on System Sciences
Publisher
IEEE
DOI
https://doi.org/10.1109/HICSS.1997.663173

More About This Work

Academic Units
Computer Science
Published Here
November 7, 2012