
Experimental paradigm. Credit: Nature Communications (2025). DOI: 10.1038/s41467-025-61064-x
Robotic systems could significantly enhance the daily lives of the more than one billion people around the world who live with some form of impairment. Brain-computer interfaces (BCIs) present a compelling option by enabling direct communication between the brain and external devices, bypassing traditional muscle-based control.
Invasive BCIs have demonstrated the ability to control robotic systems with high accuracy, but their reliance on risky surgical implantation and ongoing maintenance restricts their use to small groups of individuals with severe medical conditions.
Professor Bin He of Carnegie Mellon University has been investigating non-invasive BCI solutions for over 20 years, particularly those based on electroencephalography (EEG), which require no surgery and are adaptable to a variety of settings.
His group has achieved a series of groundbreaking milestones using EEG-based BCIs, including the first noninvasive brain-controlled flight of a drone, the first noninvasive control of a robotic arm, and the first noninvasive control of a robotic hand for continuous movement.
As detailed in a new study published in Nature Communications, his group brings non-invasive EEG-based BCIs a step closer to everyday use by demonstrating real-time brain decoding of individual finger movement intentions and control of a dexterous robotic hand at the individual-finger level.
“For both individuals with disabilities and healthy individuals, improving hand function is a top priority, as even small gains can significantly improve ability and quality of life,” explained He, a professor of biomedical engineering at Carnegie Mellon University.
“However, real-time decoding of dexterous individual finger movements using non-invasive brain signals has remained an elusive target, primarily due to the limited spatial resolution of EEG.”
In a first for EEG-based BCIs, his group developed a real-time non-invasive robotic control system in which motor imagery of individual finger movements drives the corresponding robotic finger movements. Just by thinking about moving their fingers, human subjects successfully performed a two-finger control task.
This was achieved with the help of a novel deep learning decoding strategy and a network fine-tuning mechanism that enable continuous decoding from non-invasive EEG signals.
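The study's actual decoder architecture and fine-tuning procedure are described in the Nature Communications paper; as a rough illustration only, the PyTorch sketch below shows how a compact convolutional network could classify finger motor imagery from windowed multichannel EEG and then be adapted to a new session by brief supervised fine-tuning. The channel count, window length, class labels, and the FingerDecoder and fine_tune names are all assumptions for this sketch, not the authors' implementation.

# Illustrative sketch only; not the decoder from the paper.
import torch
import torch.nn as nn

N_CHANNELS = 64   # assumed EEG channel count
WINDOW = 250      # assumed samples per decoding window (e.g., 1 s at 250 Hz)
N_CLASSES = 3     # assumed classes, e.g., thumb imagery, index imagery, rest

class FingerDecoder(nn.Module):
    """Compact temporal/spatial CNN for EEG classification (in the spirit of EEGNet)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=(1, 25), padding=(0, 12)),  # temporal filters
            nn.Conv2d(8, 16, kernel_size=(N_CHANNELS, 1)),          # spatial filters across channels
            nn.BatchNorm2d(16),
            nn.ELU(),
            nn.AvgPool2d((1, 8)),                                   # downsample in time
            nn.Flatten(),
        )
        self.classifier = nn.Linear(16 * (WINDOW // 8), N_CLASSES)

    def forward(self, x):  # x: (batch, 1, channels, time)
        return self.classifier(self.features(x))

def fine_tune(model, eeg, labels, epochs=10, lr=1e-4):
    """Adapt a pretrained decoder to a new session with a few supervised updates."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(eeg), labels)
        loss.backward()
        opt.step()
    return model

if __name__ == "__main__":
    model = FingerDecoder()
    # Synthetic stand-in data; a real system would stream windowed EEG here.
    eeg = torch.randn(32, 1, N_CHANNELS, WINDOW)
    labels = torch.randint(0, N_CLASSES, (32,))
    model = fine_tune(model, eeg, labels)
    model.eval()
    with torch.no_grad():
        pred = model(eeg[:1]).argmax(dim=1)  # decoded finger command
    print("decoded class:", pred.item())

In a real closed-loop system, the decoded class would be streamed to the robotic hand controller many times per second, with fine-tuning used to keep the decoder calibrated as the user's EEG statistics drift between sessions.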
The goal going forward is to build on this work to achieve more sophisticated finger-level tasks, such as typing.
“The insights gained from this study have great potential to increase the clinical relevance of non-invasive BCIs and enable applications in broader populations,” he added.
“Our research highlights the transformative potential of EEG-based BCIs, extending their application beyond basic communication to complex motor control.”
More information: Yidan Ding et al, EEG-based brain-computer interface enables real-time robotic hand control at individual finger level, Nature Communications (2025). DOI: 10.1038/s41467-025-61064-x
Provided by Carnegie Mellon University
Citation: Brain-computer interface robotic hand control reaches new finger-level milestone (June 30, 2025), retrieved July 2, 2025 from https://news/2025-06—–interface–Robatic Finger-Millestone
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
