Human-centered robotics aims to develop highly capable autonomous robotic collaborators that help humans in real environments, from homes to factories to the outdoors. In recent years, machine learning has advanced robotic perception and control. Yet state-of-the-art robots still struggle to perform complex tasks alongside humans without human supervision. Such robots also need a means of safely executing requested tasks while inferring complete task plans from ambiguous commands. My research focuses on learning manipulation capabilities that enable robots to effectively assist people in diverse environments by leveraging natural language understanding, multimodal perception, and representation learning. In this talk, I will introduce three core capabilities, together with their task representations, that advance collaborative manipulation in human environments: skill learning for complex tasks, safety monitoring for assistive manipulation, and planning for human-robot team operations. I will show how these capabilities provide highly scalable and reliable assistance in novel environments.
Daehyung Park is a Postdoctoral Associate at the Computer Science and Artificial Intelligence Laboratory at the Massachusetts Institute of Technology. His research interests span mobile manipulation, healthcare engineering, and natural language understanding to advance collaborative robot technologies. He received a Ph.D. in Robotics from the Georgia Institute of Technology. Prior to beginning his Ph.D., he served as a Robotics Researcher at Samsung Electronics Inc. from 2008 to 2012. He received an M.S. from the University of Southern California and a B.S. from Osaka University. He is a recipient of the Korea-Japan Joint Government Scholarship. He has conducted more than 14 projects sponsored by NSF, ARL, Lockheed Martin, Toyota, and other government and industrial agencies. See more at: https://daehyungpark.com
Faculty Host: Kris Hauser