This robot arm can be controlled by your brain power

A robotic arm, a machine learning algorithm and a brain-computer interface have been combined to create a way to help quadriplegic patients (those paralysed in all four limbs) interact with their world. While this isn’t the first time a brain interface has been used to control a robot, the work takes the technology a step further by estimating and interpreting brain signals without any deliberate action from the patient.

The research was carried out at the Ecole Polytechnique Fédérale de Lausanne (EPFL). Professor Aude Billard, head of EPFL’s Learning Algorithms and Systems Laboratory, and José del R. Millán, former head of EPFL’s Brain-Machine Interface Laboratory, worked together to create a computer program capable of controlling a robot using electrical signals from a patient’s brain.

The team used a machine learning algorithm to interpret signals from the patient’s brain and translate them into movements of the robot arm’s joints.

The patient’s brain activity was monitored by an EEG cap, which records the electrical activity at the scalp. These brain waves were then sent to a computer to be interpreted by the machine learning algorithm. The algorithm picks out the brain signals produced when the patient perceives an error, automatically inferring when the brain dislikes a particular action.
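The article doesn’t detail EPFL’s decoding pipeline, but error-related brain responses of this kind are commonly detected by classifying short EEG epochs time-locked to each robot action. Below is a minimal, hypothetical sketch of that idea in Python, using synthetic stand-in data and a plain linear classifier; every name and number is an assumption for illustration, not the team’s actual method.

```python
# Hypothetical sketch: detecting error-related responses from EEG epochs.
# This is NOT EPFL's pipeline; it assumes a simple linear classifier over
# epochs of EEG, a common baseline approach for this kind of decoding.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in data: 200 epochs, 32 channels, 128 samples each.
# Real data would come from the EEG cap, time-locked to each robot action.
n_epochs, n_channels, n_samples = 200, 32, 128
X = rng.normal(size=(n_epochs, n_channels, n_samples))
y = rng.integers(0, 2, size=n_epochs)   # 1 = brain registered an "error"
X[y == 1, :, 40:60] += 0.5              # injected error-like deflection

# Flatten each epoch into a feature vector and fit the classifier.
X_flat = X.reshape(n_epochs, -1)
X_train, X_test, y_train, y_test = train_test_split(X_flat, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```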

In the team’s experiments, the robot arm moved towards a glass, and the patient’s brain would register whether the arm came too close or stayed too far away. The process is repeated until the robot learns the optimal route for that individual’s preference: not so close as to be a risk, but not so far that motion is wasted.
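Purely as a sketch of that feedback loop: the arm repeats its approach, and each decoded “error” nudges the clearance until the decoder stays quiet. The function names, step size and toy decoder below are all illustrative assumptions, not the published method.

```python
# Hypothetical sketch of the trial-and-error loop described above. Each
# decoded "error" adjusts the arm-to-glass clearance until no further
# errors are registered. Names and the update rule are illustrative.
def adapt_clearance(initial_cm, decode_error, step_cm=1.0, max_trials=20):
    """Shrink or grow the clearance based on decoded brain feedback.

    decode_error(clearance) returns "too_close", "too_far", or None,
    standing in for the EEG classifier's verdict on each trial.
    """
    clearance = initial_cm
    for _ in range(max_trials):
        verdict = decode_error(clearance)
        if verdict == "too_close":
            clearance += step_cm      # back off from the glass
        elif verdict == "too_far":
            clearance -= step_cm      # move in; motion is being wasted
        else:
            break                     # no error signal: preference satisfied
    return clearance

# Toy stand-in for the decoder: this user is comfortable at 5-7 cm.
verdict = lambda c: "too_close" if c < 5 else "too_far" if c > 7 else None
print(adapt_clearance(initial_cm=15.0, decode_error=verdict))  # -> 7.0
```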

“The brain signals we record are never the same twice. There is variability over time, and that is natural. Why? Because if I move my hand, the brain doesn’t just focus on that; the brain is processing a lot of other things,” Millán said. “So the fact that there is this variability means that our decoder will never be 100% accurate.”

However, thanks to the machine learning algorithm used in this research, the robot can build a better model of that variability and predict brain signals in certain situations. For example, how much clearance to leave when reaching past a drink or, in more practical circumstances, how much distance a quadriplegic patient in a wheelchair prefers to keep from other people on the street.

Implementing the algorithm in a wheelchair is one example of where the technology could go in the future. It would give wheelchair users better control over their movement, speed and general safety. The algorithm could interpret brain signals to learn a user’s preferred speed, how close they are comfortable getting to obstacles and people, and even how much risk they are willing to take in certain circumstances, for example if they are late or in a busy place.
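As a loose illustration of that idea, and not anything described in the article, decoded error events could be folded into a handful of wheelchair control parameters. The parameter names and numbers below are invented for the sketch.

```python
# Hypothetical sketch of mapping decoded preferences onto wheelchair
# control parameters. Names, values and update logic are illustrative,
# not part of the EPFL system described in the article.
from dataclasses import dataclass

@dataclass
class WheelchairParams:
    max_speed_mps: float = 1.2    # cruising speed ceiling
    clearance_m: float = 0.8      # minimum distance kept from people/obstacles

def apply_feedback(params: WheelchairParams, event: str) -> WheelchairParams:
    """Adjust control parameters when the decoder flags a disliked action."""
    if event == "too_fast":
        params.max_speed_mps = max(0.3, round(params.max_speed_mps - 0.1, 2))
    elif event == "too_close":
        params.clearance_m = round(params.clearance_m + 0.1, 2)
    return params

params = WheelchairParams()
for event in ["too_fast", "too_close", "too_fast"]:
    params = apply_feedback(params, event)
print(params)  # WheelchairParams(max_speed_mps=1.0, clearance_m=0.9)
```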

“It’s interesting to use this algorithm rather than relying on speech, for example, because there are things you can’t necessarily articulate easily,” Billard said. “A layperson may not be able to articulate that they don’t like the acceleration of a wheelchair, for example. What exactly don’t you like? How does that translate into a control parameter?”

This is where the technology stands out from other assistive devices available. By letting the algorithm interpret signals directly from the brain, it can capture preferences that an individual could not explain on their own. However, this requires the error signals to be consistent over time, and the algorithm’s detections to be statistically significant.

Without this consistency, the algorithm could fail in real-world situations. If, for example, someone driving a wheelchair through a crowd passed two people having an argument, their brain could generate an error signal that has nothing to do with the driving itself.
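In one plausible reading, that consistency requirement amounts to a significance test: only act on decoded errors once they beat chance across repeated trials. A small hypothetical sketch, with an arbitrary chance level and threshold:

```python
# Hypothetical check for the consistency requirement above: before the
# system commits to a preference, decoded error detections across repeated
# trials should beat chance at a chosen significance level. The chance
# level and alpha here are illustrative assumptions.
from scipy.stats import binomtest

def detection_is_reliable(n_error_hits, n_trials, chance=0.5, alpha=0.05):
    """True if the error-detection rate is significantly above chance."""
    result = binomtest(n_error_hits, n_trials, p=chance, alternative="greater")
    return result.pvalue < alpha

print(detection_is_reliable(16, 20))  # 16/20 hits: True at alpha=0.05
print(detection_is_reliable(12, 20))  # 12/20 hits: likely noise -> False
```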
