Authors: Samuel Charles
Mentors: Marc Killpack
Institution: Brigham Young University
Robots have incredible potential to help humans in extreme or dangerous situations because of their durability, strength, endurance, and replaceability. However, humans and robots move very differently, which makes it difficult to work intuitively with a robot partner on a task such as lifting a heavy object. We recently conducted studies in which human subjects moved a 60-lb table to several different positions in a room while we recorded force and torque data, along with many other aspects of the movement. In these studies of human-human co-manipulation, we noticed a trend during particularly difficult maneuvers: when lifting the table to high positions or acute angles, subjects switched their handholds on the table's handles. This grip change likely made the table easier to hold, but it may also have communicated placement, stability, understanding, and strength to the other partner, leading to a smoother and more intuitive movement and experience overall. If so, these data could help a co-manipulation robot both effectively interpret the subtle commands in human movement and intuitively communicate needed movement to its human partner. This is particularly useful in emergencies such as natural disaster sites and war zones, where immediate help is needed and there is no time to troubleshoot an unclear or unintuitive robot.