
Mind control skills are learnt like motor skills, study finds

We learn to use brain-computer interfaces in the same way we learn motor skills like swinging a golf club. Image from shutterstock.com

The patterns of brain activity people use to learn to move objects with their mind are similar to neurological activity that occurs when learning to ride a bike or swing a golf club, researchers have found.

Scientists have previously been successful in developing brain-controlled tools – such as robotic arms and motorised wheelchairs – but until now it has been unclear how users learnt to use them.

The research is published today in the journal PNAS.

The research team, from the University of Washington in Seattle, set out to test the mechanism by which seven people with epilepsy learnt to control a brain–computer interface (BCI).

Participants had sensors implanted in their brains, which translated electrical impulses into actions. They were then asked to use their minds to move a ball across a screen and hit a target.
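As a rough illustration of what such an interface does (a hypothetical Python sketch, not the decoder actually used in the study), the system might estimate the power of the recorded signal in a particular frequency band and map changes in that power, relative to a calibrated resting level, onto cursor movement:

```python
# Hypothetical sketch only: map band power of a recorded brain signal to
# cursor velocity. The frequency band, gain and values are illustrative.
import numpy as np

def band_power(signal, fs, low=70.0, high=100.0):
    """Power of the signal in a chosen frequency band."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= low) & (freqs <= high)
    return np.sum(np.abs(spectrum[mask]) ** 2)

def decode_velocity(signal_window, fs, baseline, gain=1.0):
    """Turn modulation of band power relative to rest into cursor speed."""
    return gain * (band_power(signal_window, fs) - baseline)

# Example: 250 ms of one electrode's signal sampled at 1 kHz.
fs = 1000
window = np.random.randn(250)   # stand-in for recorded brain activity
baseline = 100.0                # resting-level band power from calibration
velocity = decode_velocity(window, fs, baseline)
```

The point of the sketch is only that the mapping from brain signal to action is fixed and simple; what the study examined is how the brain itself changes as people practise driving it.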

The participants started out by consciously thinking about moving the ball and practised this activity. As they became more adept at the task, activity in some parts of the brain decreased, reflecting a mental shift from cognitive to automatic task processing.

This activity mimics the way the brain processes and learns motor movements.

The researchers hope the study will help users perform increasingly complex tasks with their mind, such as moving realistic artificial limbs, or carry out several different tasks simultaneously.

Dr Dean Freestone, postdoctoral research fellow at the University of Melbourne’s Centre for Neural Engineering, said the study focused on a relatively unexplored area of research.

“Previous work has focused on searching for very specific patterns in brain activity that indicate the user’s intent,” said Dr Freestone, who was not involved in the study.

“Understanding the brain’s learning mechanisms has the potential to lead to new types of brain machine interfaces, which can act to control prosthetic limbs and act as new communication channels from the brain,” he said.

But Dr Freestone said the results weren’t particularly surprising, since other studies have shown that the “brain has the ability to rapidly learn [to] change its output with goal-directed tasks”.

Dr Rami Khushaba, visiting fellow at the University of Technology’s School of Electrical, Mechanical, and Mechatronics Systems, said the study was a step forward, but that much work was needed to iron out the technical problems of brain–computer interfaces before they could be offered to patients.

“Currently, wearable devices suffer from many limitations, including the fact that they must be re-calibrated for each new user and even within the same user, as the signal’s characteristics tend to change after a certain amount of time,” he said.

“We still don’t have any adaptable interfaces that you can just plug in and play, or wear and use for whatever task you want to do.

"With these kind of limitations, it is very obvious that much more work is required to achieve our ultimate goals of practicality and applicability.”

Dr Khushaba said real-life uses, such as being able to move a wheelchair just by thinking about it, or similar brain-powered devices to assist people with other disabilities, were still a few years off.
