However, reading brain signals has always been a problem
Despite achieving promising results, researchers have faced major challenges while working with BCIs.
To convert neural signals into meaningful action, they first have to detect those signals, which typically requires either invasive surgical implants or EEG headcaps.
Now, here's the thing: surgical implants are rarely used because they pose a risk to life, while headcaps don't deliver highly accurate data.
Headcaps affect ultimate actions
The headcap-based technique is non-invasive, but it captures brain signals with so much noise that the decoded information comes out unclear and the resulting action imprecise.
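To see why noise matters, here is a minimal, entirely hypothetical sketch (not from the article): a simulated 10 Hz "motor rhythm" recorded twice, once close to the source as an implant would, and once attenuated and buried in noise as a scalp electrode would record it. The correlation with the true signal drops sharply in the scalp case.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 10 Hz neural rhythm, 2 seconds at 250 Hz (illustrative numbers).
fs = 250
t = np.arange(0, 2, 1 / fs)
source = np.sin(2 * np.pi * 10 * t)

# An implanted electrode sits near the source: strong signal, little noise.
implant = source + 0.1 * rng.standard_normal(t.size)

# A scalp electrode sees the same signal attenuated by skull/tissue,
# swamped by background activity and sensor noise.
scalp = 0.2 * source + 1.0 * rng.standard_normal(t.size)

# How faithfully does each recording track the true signal?
corr_implant = np.corrcoef(source, implant)[0, 1]
corr_scalp = np.corrcoef(source, scalp)[0, 1]
```

With these made-up noise levels, the implant recording correlates almost perfectly with the source while the scalp recording correlates only weakly, which is the resolution gap the article describes.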
Mind-controlled robotic arm comes as a solution
To tackle the issue of data resolution, researchers from Carnegie Mellon University developed a BCI-linked robotic arm that achieves the middle ground.
They employed novel machine learning and sensing techniques to capture neural signals from outside the skull and control the robotic device with the same level of smoothness and precision that one expects from a brain implant.
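The article doesn't detail the decoding method, but a classic baseline for this kind of continuous control is a linear decoder that maps EEG features to cursor or effector velocity. The sketch below is an assumption-laden toy (synthetic data, ridge regression), not the Carnegie Mellon team's actual technique:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 8 EEG feature channels over 500 time steps.
n_channels, n_steps = 8, 500
eeg = rng.standard_normal((n_steps, n_channels))

# Pretend an unknown linear mixing produces the intended cursor velocity,
# observed with noise (purely synthetic stand-in for real training data).
w_true = rng.standard_normal(n_channels)
cursor_vel = eeg @ w_true + 0.5 * rng.standard_normal(n_steps)

# Ridge regression: closed-form least squares with an L2 penalty,
# a standard linear decoder from EEG features to velocity commands.
lam = 1.0
A = eeg.T @ eeg + lam * np.eye(n_channels)
w_hat = np.linalg.solve(A, eeg.T @ cursor_vel)

# Decoded velocity tracks the intended velocity on this toy data.
pred = eeg @ w_hat
corr = np.corrcoef(pred, cursor_vel)[0, 1]
```

In practice the team's system reportedly relies on more sophisticated learning and sensing than a linear fit, precisely because raw scalp signals are so noisy; this example only shows the basic decode-features-to-velocity idea.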
Robotic arm followed a cursor by reading brain signals
In a recent test, the researchers used their non-invasive BCI on a few participants and asked them to follow a cursor on a computer screen.
As the cursor moved, the participants tracked it; the BCI read their brain signals and translated them into commands for the robot.
The machine followed the cursor's movement smoothly, pointing at it continuously in real time.
Future of BCI-assisted technologies
Now, the researchers plan to test this machine in clinical settings.
They hope that their system could aid paralyzed or physically disabled people by giving them an advanced way to interact with their environment.
"This work represents an important step in noninvasive brain-computer interfaces, a technology which someday may become a pervasive assistive technology aiding everyone, like smartphones," lead researcher Bin He concluded.