Utah's Foremost Platform for Undergraduate Research Presentation
2025 Abstracts

Integrating Haptic Feedback for Precision and Intuitive Robotic Control in Virtual Reality

Author(s): Daniel Jin
Mentor(s): Bing Jiang
Institution: UTech

Remote robotic control is used in a wide range of situations, such as unmanned vehicle operation and robot-assisted surgery. Precise control is crucial for these operations, demanding accurate movements from human users, which usually requires extensive training. Recent advances in Virtual Reality (VR) technology open up another promising method for remote-operation training. Traditionally, training systems rely heavily on visual information to control robots. However, visual systems can struggle to track transparent, reflective, or partially hidden objects and are sensitive to poor lighting, glare, and shadows. They also have difficulty detecting subtle physical changes such as vibrations, forces, or temperature, which are crucial for precise manipulation and physical contact. This can lead to errors in object detection and handling.

We are developing a lightweight closed-loop system that integrates haptic feedback with visual feedback for VR-based robotic control training. By adding sensory feedback channels (e.g., proximity, contact, pressure, texture, and friction), users can interact intuitively with virtual objects, addressing the challenges posed by purely visual feedback. We hypothesize that introducing real-time haptic feedback will reduce human error and improve control accuracy, leading to greater reliability and performance in complex environments. The system consists of Inertial Measurement Units (IMUs), pressure sensors, a vibration motor, a microcontroller (MCU), and a VR headset (Meta Quest 2). A virtual robotic arm with grippers was developed in Unity. The MCU collects the sensor data, and the user's movements are displayed in real time. In VR, the system detects proximity, contact, and pressure, delivering haptic feedback to the user's fingertip through vibrations that vary with the detected interaction. We designed a sorting task to assess the effectiveness of sensory feedback in robotic control.
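The mapping from detected interactions to vibration intensity could be sketched along the following lines. This is an illustrative assumption, not the authors' implementation: the thresholds, ranges, and the two-stage proximity/pressure policy are hypothetical choices for a fingertip vibration motor driven at a duty cycle in [0, 1].

```python
def vibration_intensity(proximity_cm: float, contact: bool, pressure_n: float,
                        max_proximity_cm: float = 10.0,
                        max_pressure_n: float = 5.0) -> float:
    """Map a sensed interaction state to a motor duty cycle in [0, 1].

    Hypothetical policy: a weak cue ramps up as the gripper nears an
    object; once contact is made, applied pressure dominates the signal.
    """
    if contact:
        # Contact baseline of 0.5, scaled up with pressure (clamped to range).
        return min(1.0, 0.5 + 0.5 * min(pressure_n, max_pressure_n) / max_pressure_n)
    if proximity_cm < max_proximity_cm:
        # Faint pulse that grows linearly as the gripper approaches.
        return 0.4 * (1.0 - proximity_cm / max_proximity_cm)
    return 0.0
```

A design choice worth noting in any such mapping: keeping the pre-contact cue well below the contact baseline lets the operator distinguish "near" from "touching" by intensity alone.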
Four color-coded bins and matching objects are placed around the robot, and the operator must sort the objects into the correct bins. The task will be performed with and without haptic feedback for comparison. Task completion time, error rate, gripper path, and gripper contact points will be recorded to evaluate operator performance. A subjective assessment (e.g., the NASA Task Load Index) will be used to measure operator comfort and fatigue with and without haptic feedback. By integrating Virtual Reality and haptic feedback in robotic control systems, we aim to enhance operators' environmental awareness, making remote robotic control more intuitive and accurate, with the potential to be effectively applied in real-world scenarios.
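For the subjective assessment mentioned above, the standard weighted NASA-TLX score could be computed as below. The subscale names and the 15 pairwise-comparison weighting scheme follow the published NASA-TLX procedure; the example values are made up.

```python
SUBSCALES = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def nasa_tlx_score(ratings: dict, weights: dict) -> float:
    """Weighted NASA Task Load Index on a 0-100 scale.

    ratings: each of the six subscales rated 0-100 by the operator.
    weights: tally of how often each subscale was chosen in the
             15 pairwise comparisons (tallies must sum to 15).
    """
    if sum(weights.values()) != 15:
        raise ValueError("pairwise tallies must total 15")
    # Weighted mean: sum of rating * weight, divided by the 15 comparisons.
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0
```

Comparing mean TLX scores between the haptic and non-haptic conditions (alongside completion time and error rate) gives a direct measure of whether the added feedback reduces perceived workload.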