Link to Talk Video: https://mediaspace.illinois.edu/media/t/1_1c9xeiit
Abstract: In this talk, I will introduce the development of GelSight, a high-resolution robotic tactile sensor, and show how it can help robots understand and interact with the physical world. GelSight is a vision-based tactile sensor that measures the geometry of the contact surface with a spatial resolution of around 25 micrometers, and it also measures the shear forces and torques at the contact surface. With such high-resolution information, a robot can easily detect the precise shape and texture of object surfaces and therefore recognize them. Beyond recognition, the sensor helps robots extract richer information from contact, such as the physical properties of objects, and assists in manipulation tasks. The talk will cover our work on using GelSight to detect slip during grasping and to perceive object properties such as hardness and the viscosity of liquids. I will also present our work on simulating the tactile sensor and using the simulated sensor input to boost a robot's capability to perform perception and grasping tasks in the real world.
Bio: Wenzhen Yuan is an assistant professor in the Robotics Institute at Carnegie Mellon University and the director of the CMU RoboTouch Lab. She is a pioneer in high-resolution tactile sensing for robots, and she also works on multi-modal robot perception, soft robots, robot manipulation, and haptics. Yuan received her Master of Science and PhD degrees from MIT and her Bachelor of Engineering degree from Tsinghua University. She also worked as a postdoctoral researcher at Stanford University.