Academic Commons


Integrating real-time vision and manipulation

Yoshimi, Billibon; Allen, Peter K.

This paper describes a system that integrates real-time computer vision with a sensorless gripper to provide closed-loop feedback control for grasping and manipulation tasks. Many hand-eye coordination skills can be thought of as sensory control loops, where specialized reasoning is embodied as a feedback or control path in the loop's construction. Our framework captures the essence of these hand-eye coordination skills in simple visual control primitives, which are a key component of the software integration. The primitives use a simple visual tracking and correspondence scheme to provide real-time feedback control in the presence of imprecise camera calibrations. Experimental results are shown for the positioning task of locating, picking up, and inserting a bolt into a nut under visual control. Results are also presented for the visual control of a bolt-tightening task.
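The closed-loop feedback described in the abstract can be illustrated with a minimal image-based visual servoing sketch. This is a hypothetical illustration, not the authors' implementation: a proportional controller drives the tracked image-plane position of the gripper toward a target feature, so convergence depends only on the sign and rough scale of the error, which is why such loops tolerate imprecise camera calibration.

```python
# Hypothetical sketch of an image-based visual feedback loop (not the
# paper's actual code): a proportional controller reduces the pixel-space
# error between a tracked gripper feature and a target feature.

def visual_servo_step(gripper_px, target_px, gain=0.3):
    """One control iteration: return an incremental image-plane move.

    gripper_px, target_px: (x, y) pixel coordinates from the tracker.
    gain: proportional gain; a small gain tolerates calibration error
    at the cost of slower convergence.
    """
    ex = target_px[0] - gripper_px[0]
    ey = target_px[1] - gripper_px[1]
    return (gain * ex, gain * ey)


def servo_until_aligned(gripper_px, target_px, tol=1.0, max_iters=100):
    """Iterate the loop until the image-plane error is within tol pixels."""
    x, y = gripper_px
    for _ in range(max_iters):
        ex = target_px[0] - x
        ey = target_px[1] - y
        if (ex * ex + ey * ey) ** 0.5 <= tol:
            break
        dx, dy = visual_servo_step((x, y), target_px)
        x, y = x + dx, y + dy
    return (x, y)
```

Because each step only shrinks the remaining pixel error by a fixed fraction, the loop converges geometrically even when the mapping from pixels to robot motion is known only approximately.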



Also Published In

Proceedings of the Thirtieth Hawaii International Conference on System Sciences

More About This Work

Academic Units: Computer Science
Published Here: November 7, 2012