A team of researchers at the University of Minnesota has developed a new non-invasive brain-computer interface that allows people to control a robotic arm using only their minds.

This new technique, described in a paper published in the journal Scientific Reports, could potentially help millions of people who are paralyzed or suffer from neurodegenerative diseases.

"This is the first time in the world that people can operate a robotic arm to reach and grasp objects in a complex 3D environment using only their thoughts without a brain implant," said Bin He, a biomedical engineering professor at the University of Minnesota and lead researcher of the study, in a press release. "Just by imagining moving their arms, they were able to move the robotic arm."

The non-invasive technique, an electroencephalography (EEG)-based brain-computer interface, relies on a specialized, high-tech EEG cap. The cap is fitted with 64 electrodes that record the weak electrical activity of the subjects' brains, and advanced signal processing and machine learning convert their thoughts into actions.
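The article does not describe the team's actual algorithms, but the general idea behind such systems can be sketched. The toy example below, a hypothetical illustration only, estimates mu-band (8-12 Hz) power from simulated 64-channel EEG and turns a left/right hemisphere power asymmetry into a one-dimensional cursor command; the sampling rate, channel groupings, and lateralization heuristic are all assumptions, not details from the study.

```python
# Hypothetical sketch, NOT the study's actual pipeline: imagining a hand
# movement suppresses mu-band (8-12 Hz) power over the opposite
# hemisphere, and that asymmetry can drive a cursor.
import numpy as np

FS = 250     # sampling rate in Hz (assumed)
N_CH = 64    # electrodes on the cap, per the article

def band_power(x, fs, lo, hi):
    """Mean power of a 1-D signal within [lo, hi] Hz via the FFT."""
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def mu_asymmetry(eeg, left_ch, right_ch, fs=FS):
    """Normalized mu-power asymmetry between hemispheres, in [-1, 1].

    Lower mu power over the left hemisphere suggests imagined
    right-hand movement, yielding a positive (rightward) value.
    """
    left = np.mean([band_power(eeg[c], fs, 8, 12) for c in left_ch])
    right = np.mean([band_power(eeg[c], fs, 8, 12) for c in right_ch])
    return (right - left) / (right + left)

# Synthetic demo: 1 s of noise, with a strong 10 Hz mu rhythm kept on
# the right-hemisphere channels only, mimicking left-hemisphere
# suppression during imagined right-hand movement.
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
eeg = rng.normal(0.0, 1.0, (N_CH, FS))
for c in (0, 1, 2, 3):                       # "right hemisphere" channels
    eeg[c] += 5.0 * np.sin(2 * np.pi * 10 * t)

velocity = mu_asymmetry(eeg, left_ch=[4, 5, 6, 7], right_ch=[0, 1, 2, 3])
command = "right" if velocity > 0 else "left"
print(command, round(velocity, 2))
```

Real systems refine this basic signal with spatial filtering and trained decoders, but the asymmetry heuristic captures why a 64-electrode cap can yield a usable control signal at all.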

To test their brain-computer interface, the researchers recruited eight healthy human subjects to complete experimental sessions while wearing the cap.

At first, the subjects learned to control a virtual cursor on a computer screen using their minds. They then learned to control a robotic arm to reach and grasp objects in fixed locations. Eventually, the participants were able to freely grasp objects at random positions and move certain objects from a table to a three-layer shelf using only their minds.

The participants' average success rate for picking up objects with their minds was above 80 percent, while the average success rate for moving an object from the table onto the shelf was 70 percent.

The researchers noted that the brain-computer interface works because neurons in the motor cortex produce tiny electric currents every time people think about movement. Different kinds of movement activate different assortments of neurons. By sorting out these different patterns with advanced signal processing, the researchers were able to convert those thoughts into movements that could be carried out by the robotic arm.
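The idea that "different assortments of neurons" produce distinguishable signals can be illustrated with a toy decoder. The sketch below is hypothetical and greatly simplified compared with the study's methods: it assumes each imagined movement leaves a distinct spatial pattern of activity across the 64 channels, fits a nearest-centroid classifier on simulated labelled trials, and maps new patterns to made-up robot-arm commands.

```python
# Hypothetical illustration, NOT the study's actual decoder: distinct
# imagined movements yield distinct activity patterns across the
# electrodes, which a simple classifier can map to arm commands.
import numpy as np

rng = np.random.default_rng(1)
N_CH = 64
COMMANDS = ["rest", "reach", "grasp"]   # made-up command set

# Each class gets its own mean activation pattern across channels,
# standing in for the different neuron assortments each movement engages.
patterns = {c: rng.normal(0.0, 1.0, N_CH) for c in COMMANDS}

def simulate_trial(command, noise=0.5):
    """One noisy 64-channel feature vector for an imagined movement."""
    return patterns[command] + rng.normal(0.0, noise, N_CH)

# "Training": average 20 labelled trials per class into a centroid.
centroids = {c: np.mean([simulate_trial(c) for _ in range(20)], axis=0)
             for c in COMMANDS}

def decode(features):
    """Return the command whose centroid is closest to the features."""
    dists = {c: np.linalg.norm(features - mu) for c, mu in centroids.items()}
    return min(dists, key=dists.get)

# Decode 30 fresh trials and measure accuracy.
trials = [(c, simulate_trial(c)) for c in COMMANDS for _ in range(10)]
accuracy = float(np.mean([decode(x) == c for c, x in trials]))
print(f"decoding accuracy: {accuracy:.0%}")
```

On clean synthetic patterns the classes separate easily; real EEG is far noisier, which is why the study's roughly 80 percent grasp success rate over eight subjects is a notable result.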