
Robots that read our minds will actually exist sooner than you think

Published Jun 20th, 2018 7:45AM EDT
Image: Joseph DelPreto, MIT CSAIL


Robots are here to stay, and they’re only getting smarter. But it’ll be a while before they respond to our every verbal command, so until then we’ll have to use our minds to communicate with them.

That’s right: in the future you’ll be able to control robots with your mind, as MIT has figured out a way to combine brainwaves and hand gestures that lets humans interact with machines effortlessly.

The idea is to let machines interpret a person’s brain signals and hand movements and turn them into quick robotic actions. That way, humans wouldn’t need the coding skills currently required to preprogram robots to perform specific tasks in response to human interaction.

The technology is still in its infancy and requires the human to wear a couple of cumbersome devices: an electrode cap that measures the electrical activity of the brain (EEG) and sensors that measure the muscular activity of the hand (EMG). That’s how the robot “reads” our minds and muscle movements.
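For a sense of what the robot is actually working with, here is a minimal sketch (an illustration of common signal-processing practice, not MIT’s actual pipeline) of how raw EEG and EMG streams are typically conditioned before a classifier sees them, assuming Python with numpy/scipy and synthetic signals standing in for real electrodes:

import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000  # assumed sampling rate in Hz

def bandpass(signal, low, high, fs=FS, order=4):
    # Zero-phase Butterworth band-pass filter.
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

def emg_envelope(emg, fs=FS):
    # Rectify the filtered EMG, then low-pass it into a smooth
    # muscle-activity envelope a gesture classifier could use.
    rectified = np.abs(bandpass(emg, 20, 450, fs))
    b, a = butter(2, 5 / (fs / 2), btype="low")
    return filtfilt(b, a, rectified)

# One second of synthetic data standing in for real electrode readings.
t = np.arange(FS) / FS
eeg = np.random.randn(FS)                                    # noisy scalp signal
emg = np.random.randn(FS) * np.sin(2 * np.pi * 2 * t) ** 2   # bursty muscle signal

eeg_clean = bandpass(eeg, 1, 40)   # keep the low-frequency bands EEG analysis cares about
activity = emg_envelope(emg)
print(f"peak EMG activity: {activity.max():.2f}")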

Even so, MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has had considerable success with the approach.

“This work combining [electroencephalograph (EEG) and electromyography (EMG)] feedback enables natural human-robot interactions for a broader set of applications than we’ve been able to do before using only EEG feedback,” CSAIL director Daniela Rus said. “By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity.”

The team, led by Ph.D. candidate Joseph DelPreto, used a humanoid robot called Baxter, from Rethink Robotics, during testing, and the project was funded in part by the Boeing Company.

With both signals combined, the robot went from choosing the correct target 70% of the time to 97%, MIT says (see the video below). Robots can interpret either EEG or EMG on its own to trigger actions, but it’s the combination of the two that makes the system work so well.

“By looking at both muscle and brain signals, we can start to pick up on a person’s natural gestures along with their snap decisions about whether something is going wrong,” DelPreto said. “This helps make communicating with a robot more like communicating with another person.”
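To make that division of labor concrete, here is an illustrative control loop (a sketch of the general idea, not the team’s code): the EEG side acts as a veto that fires when the person notices a mistake, and the EMG side supplies the gesture that tells the robot where to aim instead. Every detector and command below is a hypothetical placeholder.

from dataclasses import dataclass
import random

@dataclass
class Readings:
    errp_detected: bool  # EEG flag: the human thinks the robot just erred
    gesture: str         # EMG-decoded gesture: "left", "right", or "none"

def next_readings() -> Readings:
    # Stand-in for real-time EEG/EMG decoding.
    return Readings(errp_detected=random.random() < 0.2,
                    gesture=random.choice(["left", "right", "none"]))

def control_step(current_target: str, r: Readings) -> str:
    # The brain signal alone only says "something is wrong"; the
    # gesture adds the spatial nuance of where to go instead.
    if r.errp_detected and r.gesture != "none":
        return r.gesture
    return current_target

target = "left"
for _ in range(5):
    target = control_step(target, next_readings())
    print("robot aiming:", target)

In the real system the detectors would be trained classifiers running on live electrode data rather than random stand-ins, but the veto-plus-gesture structure is what the combined EEG and EMG feedback enables.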

According to the team, the system could eventually serve many kinds of users, including workers, people with disabilities or limited mobility, and the elderly. And who knows, this type of robot control could one day be used to conquer space.

A paper detailing MIT CSAIL’s invention will be presented at the Robotics: Science and Systems (RSS) conference in Pittsburgh next week.
