PhD Defense: A FRAMEWORK FOR DEXTEROUS MANIPULATION THROUGH TACTILE PERCEPTION

Talk
Kanishka Ganguly
Time: 
11.18.2022 16:00 to 18:00
Location: 
IRB 4105

A long-anticipated, yet hitherto unfulfilled goal in robotics research has been to have robotic agents seamlessly integrate with humans in their natural environments and perform useful tasks alongside them. While tremendous progress has been made in enabling robots to perceive visually and to understand and reason about a scene, the act of manipulating that environment remains a challenging, unsolved problem. For robotic agents to perform useful tasks in environments not specifically designed for their operation, dexterous manipulation capabilities guided by some form of tactile perception are crucial. While visual perception provides a large-scale understanding of the environment, tactile perception allows a fine-grained understanding of objects and textures. Truly useful robotic agents therefore require a tightly coupled system comprising both visual and tactile perception.

Tactile sensing hardware can be placed on a spectrum, with form factor at one end and sensing accuracy and robustness at the other; most off-the-shelf sensors available today trade one of these qualities for the other. The tactile sensor used in this research, the BioTac SP, was selected for its anthropomorphic qualities, such as its shape and sensing mechanism, at the cost of lower-quality sensory output. The sensor provides a continuous sensing surface and returns 24 points of tactile data at each timestamp, along with pressure values.

We first present a novel method for contact and motion estimation through visual perception, in which we perform non-rigid registration of a human performing actions and compute dense motion trajectories. These are used to compute topological scene changes and are refined to obtain object and contact segmentations. We then ground the contact points and motion trajectories in an intermediate action graph, which can then be executed by a robotic agent.

Second, we introduce the concept of computational tactile flow, inspired by fMRI studies on humans showing that the same parts of the brain that respond to optical motion stimuli also respond to tactile stimuli. We mathematically model the BioTac SP sensor and interpolate surfaces in two and three dimensions, on which we compute tactile flow fields. We demonstrate these flow fields on various surfaces and suggest several useful applications of tactile flow.

We next apply tactile feedback in a novel controller that is able to grasp objects without any prior knowledge of their shape, material, or weight. We use tactile flow to compute slippage during a grasp and adjust the finger forces to maintain a stable grasp during motion. We demonstrate success on transparent and soft, deformable objects, alongside other regularly shaped samples.

Lastly, we take a different approach to processing tactile data, computing tactile events inspired by the neuromorphic computing literature. We compute spatio-temporal gradients on the raw tactile data to generate event surfaces, which are more robust and reduce sensor noise. This intermediate surface is then used to track contact regions over the BioTac SP sensor skin, allowing us to detect slippage, track spatial edge contours, and estimate the magnitude of applied forces.
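To make the idea of computational tactile flow concrete, here is a minimal sketch, assuming a hypothetical 2-D taxel layout (`TAXEL_XY`) and a simple gradient-based normal-flow formulation: the 24 taxel readings are interpolated onto a regular grid, and flow is estimated between consecutive frames. The layout, grid resolution, and flow formulation are illustrative assumptions, not the dissertation's actual sensor model, which also works in three dimensions.

```python
# Illustrative sketch only: 2-D tactile flow from BioTac SP taxel readings.
# TAXEL_XY is a placeholder layout; the real sensor geometry differs.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)
TAXEL_XY = rng.uniform(0.1, 0.9, size=(24, 2))  # hypothetical taxel positions

def taxels_to_grid(values, res=32):
    """Interpolate 24 scattered taxel values onto a regular res x res grid."""
    gx, gy = np.mgrid[0:1:res * 1j, 0:1:res * 1j]
    return griddata(TAXEL_XY, values, (gx, gy), method="cubic", fill_value=0.0)

def tactile_flow(prev_vals, curr_vals, res=32):
    """Gradient-based normal flow between two interpolated tactile frames."""
    I0 = taxels_to_grid(prev_vals, res)
    I1 = taxels_to_grid(curr_vals, res)
    Iy, Ix = np.gradient(I0)          # spatial gradients (rows, then columns)
    It = I1 - I0                      # temporal gradient
    mag2 = Ix ** 2 + Iy ** 2 + 1e-9   # avoid division by zero
    u = -It * Ix / mag2               # flow component along the gradient
    v = -It * Iy / mag2
    return u, v
```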
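In the same spirit, the slip-reactive grasp controller can be caricatured as a single control loop: a rising mean tactile-flow magnitude is read as incipient slip, and the commanded fingertip force is nudged upward until the flow subsides. The `hand` interface (`read_taxels`, `set_force`), thresholds, and gains below are hypothetical placeholders, not the actual controller presented in the talk.

```python
# Illustrative sketch only: slip-reactive grip force adjustment, reusing the
# tactile_flow() helper above. The `hand` object and all constants are assumed.
import numpy as np

SLIP_THRESH = 0.05   # mean flow magnitude treated as slip (placeholder units)
FORCE_STEP = 0.2     # Newtons added per control tick when slip is detected
FORCE_MAX = 10.0     # safety cap on the commanded fingertip force

def control_step(hand, prev_vals, force):
    """One control tick: measure flow, increase force if the grasp slips."""
    curr_vals = hand.read_taxels()            # 24 raw taxel values
    u, v = tactile_flow(prev_vals, curr_vals)
    slip = float(np.mean(np.hypot(u, v)))     # mean flow magnitude
    if slip > SLIP_THRESH:
        force = min(force + FORCE_STEP, FORCE_MAX)
        hand.set_force(force)                 # tighten the grasp slightly
    return curr_vals, force
```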
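Finally, the event-based processing might be illustrated with a per-taxel contrast threshold, similar in spirit to an event camera, followed by an exponentially decayed time surface. The threshold and decay constant are assumed values, and the dissertation's event surfaces additionally use spatial gradients, which this sketch omits.

```python
# Illustrative sketch only: neuromorphic-style tactile events and a decayed
# "time surface" over the 24 taxels. All constants are assumed values.
import numpy as np

def tactile_events(prev_vals, curr_vals, t, thresh=5.0):
    """Emit (taxel_index, timestamp, polarity) wherever a taxel's temporal
    change exceeds a contrast threshold."""
    delta = np.asarray(curr_vals, float) - np.asarray(prev_vals, float)
    return [(i, t, 1 if d > 0 else -1)
            for i, d in enumerate(delta) if abs(d) > thresh]

def time_surface(last_event_t, t, tau=0.05):
    """Per-taxel surface decaying with time since each taxel's last event:
    recent contact changes stand out while steady-state noise fades away."""
    return np.exp(-(t - np.asarray(last_event_t, float)) / tau)
```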

Examining Committee

Chair:

Dr. John Aloimonos

Dean's Representative:

Dr. Miao Yu

Members:

Dr. Cornelia Fermuller

Dr. Dinesh Manocha

Dr. Nitin J. Sanket (Worcester Polytechnic Institute)