PhD Defense by Daehyung Park

Title: A Multimodal Execution Monitoring System for Assistive Robots

Daehyung Park
Robotics Ph.D. Student
School of Interactive Computing
College of Computing
Georgia Institute of Technology

Date: January 8th, 2018 (Monday)
Time: 1:00pm to 3:00pm (EST)
Location: 3115 (McIntire Conference Room), BME Whitaker Building


Dr. Charles C. Kemp (Advisor), Biomedical Engineering, Georgia Institute of Technology & Emory University
Dr. Byron Boots, School of Interactive Computing, Georgia Institute of Technology
Dr. Sonia Chernova, School of Interactive Computing, Georgia Institute of Technology
Dr. James M. Rehg, School of Interactive Computing, Georgia Institute of Technology
Dr. Randy Trumbower, Harvard Medical School, Harvard University


Assistive robots have the potential to serve as caregivers, providing assistance with activities of daily living to people with disabilities. Monitoring when something has gone wrong could help assistive robots operate more safely and effectively around people. However, the complexity of interacting with people and objects in human environments makes such monitoring challenging. By monitoring multimodal sensory signals, an execution monitor could perform a variety of roles, such as detecting success, determining when to switch behaviors, and otherwise exhibiting more common sense.

The purpose of this dissertation is to introduce a multimodal execution monitor that improves the safety and success of assistive manipulation services. To accomplish this goal, we make three main contributions. First, we introduce a data-driven anomaly detector, part of the monitor, that reports anomalous task executions online from multimodal sensory signals. Second, we introduce a data-driven anomaly classifier that recognizes the type and cause of common anomalies by fusing multimodal features and passing them through an artificial neural network. Lastly, as the main testbed for the monitoring system, we introduce a robot-assisted feeding system for people with disabilities, using a general-purpose mobile manipulator (a PR2 robot).
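As a rough illustration of what an online detector over fused multimodal signals could look like, here is a minimal Python sketch that thresholds the log-likelihood of a fused feature vector under a model fit to successful executions. The class name, the single-Gaussian model, the threshold, and all feature shapes are illustrative assumptions; this is not the dissertation's actual method.

```python
# Minimal sketch of likelihood-threshold anomaly detection over fused
# multimodal features. All names and the Gaussian model are assumptions
# for illustration, not the dissertation's detector.
import numpy as np

class MultimodalAnomalyDetector:
    """Flags an execution as anomalous when the fused multimodal
    observation is unlikely under a model fit to successful trials."""

    def __init__(self, threshold: float = -20.0):
        self.threshold = threshold  # log-likelihood cutoff (tuned on held-out data)

    def fit(self, X: np.ndarray) -> None:
        """Fit a single Gaussian to fused feature vectors X with shape
        (n_samples, n_features), collected from successful executions."""
        self.mean = X.mean(axis=0)
        cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # regularized
        self.cov_inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        self.log_norm = -0.5 * (X.shape[1] * np.log(2 * np.pi) + logdet)

    def log_likelihood(self, x: np.ndarray) -> float:
        d = x - self.mean
        return self.log_norm - 0.5 * d @ self.cov_inv @ d

    def is_anomalous(self, haptic, audio, kinematic) -> bool:
        """Fuse per-modality features by concatenation, then threshold."""
        x = np.concatenate([haptic, audio, kinematic])
        return self.log_likelihood(x) < self.threshold

# Usage: fit on features from successful trials, then monitor online.
rng = np.random.default_rng(0)
successes = rng.normal(size=(200, 9))  # e.g., 3 features per modality
detector = MultimodalAnomalyDetector(threshold=-20.0)
detector.fit(successes)
print(detector.is_anomalous(rng.normal(size=3),
                            rng.normal(size=3),
                            rng.normal(size=3) + 5.0))  # large deviation: likely True
```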

We evaluate the monitoring system with haptic, visual, auditory, and kinematic sensing during household tasks and human-robot interactive tasks, including feeding assistance. We show that multimodality improves the performance of monitoring methods by enabling them to detect and classify a broader range of anomalies. Overall, our research demonstrates that the multimodal execution monitoring system helps the assistive manipulation system provide safe and successful assistance to people with disabilities.


Here is a conference call link for Skype:

However, the number of call attendees will be limited to 25.

Event Details

  • Wednesday, January 17, 2018
  • 1:00 pm - 3:00 pm
  • Location: 3115 (McIntire Conference Room), BME Whitaker Building