Doctoral Thesis

Improving Robotic Manipulation via Reachability, Tactile, and Spatial Awareness

Akinola, Iretiayo Adegbola

Robotic grasping and manipulation remain an active area of research despite significant progress over the past decades. Many existing solutions still struggle to robustly handle difficult situations that a robot might encounter even in non-contrived settings. For example, grasping systems struggle when the object is not centrally located in the robot's workspace. Grasping in dynamic environments presents a further set of challenges: a stable and feasible grasp can become infeasible as the object moves, and the problem becomes more pronounced when there are obstacles in the scene.

This research is inspired by the observation that object-manipulation tasks like grasping, pick-and-place, or insertion require different forms of awareness. These include reachability awareness -- being aware of regions that can be reached without self-collision or collision with surrounding objects; tactile awareness -- the ability to feel and grasp objects just tightly enough to prevent slippage without crushing them; and 3D awareness -- the ability to perceive size and depth in ways that make object manipulation possible. Humans use these capabilities to achieve the high level of coordination needed for object manipulation. In this work, we develop techniques that equip robots with similar sensitivities, working towards a reliable and capable home-assistant robot.

In this thesis, we demonstrate the importance of reasoning about the robot's workspace to enable grasping systems to handle more difficult settings, such as picking up moving objects while avoiding surrounding obstacles. Our method encodes the notion of reachability and uses it to generate grasps that are not only stable but also achievable by the robot. This reachability-aware formulation effectively expands the usable workspace of the robot, enabling it to pick up objects from difficult-to-reach locations. While recent vision-based grasping systems work reliably well, achieving pickup success rates higher than 90% in cluttered scenes, failure cases due to calibration error, slippage, and occlusion remain challenging. To address this, we develop a closed-loop, tactile-based improvement that uses additional tactile sensing to deal with self-occlusion (a limitation of vision-based systems) and to adaptively tighten the robot's grip on the object -- making the grasping system tactile-aware and more reliable. This can be used as an add-on to existing grasping systems.
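The abstract does not spell out how reachability is combined with grasp stability. As an illustrative sketch only (the voxel-grid reachability map, scoring weights, and function names below are assumptions, not the thesis's actual formulation), ranking candidate grasps by a blend of stability and reachability might look like:

```python
import numpy as np

def reachability_score(grasp_pose, reach_map, voxel_size=0.05):
    """Look up how reachable a grasp position is in a precomputed
    reachability map: a dict from voxel index to IK-success rate."""
    idx = tuple((np.asarray(grasp_pose[:3]) / voxel_size).astype(int))
    return reach_map.get(idx, 0.0)  # voxels never reached score 0

def rank_grasps(grasps, stability, reach_map, alpha=0.5):
    """Rank candidate grasps by a weighted blend of grasp stability
    and reachability, demoting stable-but-unreachable grasps."""
    scores = [
        alpha * s + (1 - alpha) * reachability_score(g, reach_map)
        for g, s in zip(grasps, stability)
    ]
    order = np.argsort(scores)[::-1]
    return [grasps[i] for i in order], [scores[i] for i in order]
```

With this kind of blended score, a grasp that a grasp planner rates as very stable but that lies outside the arm's comfortable workspace falls below a slightly less stable grasp in a well-reached region.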
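The adaptive grip-tightening loop can be pictured as a simple feedback controller. This is a minimal sketch under assumed interfaces (the `read_slip` sensor callback, force step size, and force cap are all hypothetical placeholders, not the thesis's actual controller):

```python
def regrasp_controller(read_slip, current_force, force_step=0.5,
                       max_force=10.0):
    """Incrementally tighten the grip while the tactile sensor reports
    slip, stopping at a force cap to avoid crushing the object.

    read_slip: callable returning True while slip is detected.
    Returns the final commanded grip force.
    """
    force = current_force
    while read_slip() and force + force_step <= max_force:
        force += force_step  # command a slightly firmer grip
    return force
```

The key property is that the grip is only as tight as the tactile signal demands: the loop stops as soon as slip ceases, or at the cap, rather than always squeezing at maximum force.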

This adaptive tactile-based approach demonstrates the effectiveness of closed-loop feedback in the final phase of the grasping process. To achieve closed-loop control throughout the entire manipulation process, we study the value of multi-view camera systems for improving learning-based manipulation.
Using a multi-view Q-learning formulation, we develop a learned closed-loop manipulation algorithm for precise manipulation tasks that integrates inputs from multiple static RGB cameras to overcome self-occlusion and improve 3D understanding.
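The core of the multi-view idea is fusing features from several static cameras before estimating action values. The toy sketch below (linear encoders, concatenation fusion, and all weight shapes are assumptions for illustration; the thesis's actual Q-network is a learned model) shows the data flow:

```python
import numpy as np

def encode_view(image, weights):
    """Toy per-view encoder: flatten the image and apply a linear
    map with a nonlinearity (stand-in for a learned CNN encoder)."""
    return np.tanh(image.ravel() @ weights)

def multiview_q(images, view_weights, head_weights):
    """Fuse features from several static RGB cameras by concatenation,
    then map the fused feature to one Q-value per discrete action.

    Because each view sees the scene from a different angle, occluded
    regions in one view can be covered by another, and the fused
    feature carries more 3D information than any single view.
    """
    feats = [encode_view(img, w) for img, w in zip(images, view_weights)]
    fused = np.concatenate(feats)
    return fused @ head_weights  # one Q-value per action
```

In a Q-learning setup, the action with the highest fused Q-value would be executed at each control step, re-observing through all cameras between steps to keep the loop closed.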
To conclude, we discuss opportunities and directions for future work.



More About This Work

Academic Units
Computer Science
Thesis Advisors
Allen, Peter K.
Ph.D., Columbia University
Published Here
May 3, 2021