
Human Interfacing with Artificial Intelligence

Author(s): Sebastien Fregeau, Ella Hansen, Thomas Munro, Lucca Coelho, Rodrigo Armaza, Efe Sezer, Efe Kaya
Mentor(s): Masoud Malekzadeh
Institution: SUU

Artificial intelligence (AI) is transforming many aspects of modern life, enabling advances in areas such as autonomous vehicles, natural language processing with tools like ChatGPT, and predictive maintenance for industrial systems. However, the potential of AI extends far beyond these applications, especially in creating seamless human-machine interfaces. Our research focuses on bridging the gap between human brain activity and AI-driven systems using a Brain-Computer Interface (BCI) equipped with a 16-channel electroencephalogram (EEG) device. This technology captures electrical signals directly from the brain, allowing us to interpret neural activity related to specific intentions or actions.

In this study, we aim to leverage deep learning models, specifically Recurrent Neural Networks (RNNs) with Long Short-Term Memory (LSTM) layers, to interpret and analyze these brain signals. LSTM layers are chosen for their ability to handle sequential data and capture long-term dependencies in time-series information, making them particularly well suited to the dynamic and complex nature of brain signals. We are conducting experiments to train our RNN model on brain data associated with directional thinking (e.g., imagining moving left or right), and the model's performance will be evaluated by its accuracy in predicting the intended direction. Additionally, we are incorporating electromyography (EMG) signals from localized muscle activations to enhance the robustness of the system. This dual-signal approach aims to improve prediction accuracy by integrating both neural and muscular cues.

The ultimate goal of our project is to create a mind-controlled robotic system. By establishing a reliable link between live brain recordings and robotic actions, we hope to demonstrate a practical application of AI in human-robot interaction. Such a system could be transformative in fields like assistive technology, where it could enable hands-free control of devices for individuals with mobility impairments. Our research represents an exciting step toward realizing the vision of direct, intuitive control over machines using only human thought.
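As a rough illustration of the modeling approach described above, the sketch below shows how a small LSTM classifier might consume windowed 16-channel EEG epochs together with EMG channels and produce a left/right prediction. It is written in PyTorch purely for illustration: the window length (250 samples), EMG channel count (4), hidden size, and layer count are assumptions made for the example, not the configuration used in this study.

```python
# Illustrative sketch (not the authors' implementation): an LSTM classifier
# for windowed 16-channel EEG epochs, with EMG channels concatenated at each
# time step as one simple way to fuse the two signal streams.
import torch
import torch.nn as nn


class EEGDirectionClassifier(nn.Module):
    def __init__(self, n_eeg_channels=16, n_emg_channels=4,
                 hidden_size=64, num_layers=2, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(
            input_size=n_eeg_channels + n_emg_channels,
            hidden_size=hidden_size,
            num_layers=num_layers,
            batch_first=True,  # inputs are (batch, time, features)
        )
        self.classifier = nn.Linear(hidden_size, n_classes)

    def forward(self, eeg, emg):
        # eeg: (batch, time, 16), emg: (batch, time, n_emg_channels)
        x = torch.cat([eeg, emg], dim=-1)   # per-time-step feature fusion
        _, (h_n, _) = self.lstm(x)          # h_n: (num_layers, batch, hidden)
        return self.classifier(h_n[-1])     # logits over {left, right}


if __name__ == "__main__":
    eeg = torch.randn(8, 250, 16)  # 8 epochs, 250 time steps, 16 EEG channels
    emg = torch.randn(8, 250, 4)   # matching EMG window (channel count assumed)
    model = EEGDirectionClassifier()
    print(model(eeg, emg).shape)   # torch.Size([8, 2])
```

Concatenating EEG and EMG samples at each time step is only one plausible fusion strategy; separate encoders with late fusion would be an equally reasonable realization of the dual-signal idea described in the abstract.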