Humans+Robots

Put down the joystick. Just use your brainwaves.

EEG technology has many practical uses. It also lets you race drones with your mind.

By Hannah Thomasy

The gym grows quiet as the audience in the bleachers watches a pair of drones, about to take flight.

“Three!”

“Two!”

“One!”

“Go!”

A high-pitched buzzing fills the room as the drones gently lift into the air and fly forward. They’re piloted by two university students, their brows furrowed with concentration — and their hands at their sides. They aren’t using controllers. In fact, they’re not moving at all. Instead, electrodes placed on their foreheads and scalps are picking up on the electrical signals produced by their brains, effectively letting them fly the drones using only the power of their minds.

The audience, seated behind a web of protective netting in case of a rogue drone, cheers as the first drone crosses the finish line.

This is the annual Brain-Drone Race, designed to test the pilots’ concentration abilities and demonstrate the power of brain-computer interface technology. (“Attending feels like a mix between a research conference talk and a basketball game,” says Chris Crawford, co-founder of the Brain-Drone Racing League, which held its first event at the University of Florida in 2016.)

In this instance, at the University of South Florida in February 2019, the interface is based on a technique called electroencephalography, or EEG. Different brain states — such as sleepiness or attentiveness — produce different patterns of electrical activity, also known as brain waves. Electrodes on the pilots’ scalps relay these electrical signals to computers, which analyze the activity patterns. In the Brain-Drone Race, the computers signal the drones to move forward (via Bluetooth) when they detect high levels of a type of brain activity called a beta wave, often considered a marker of concentration and focus.
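In outline, that control loop is simple enough to sketch. The following is a minimal illustration of the idea — thresholding beta-band power in an EEG window to decide whether the drone should advance — not the league's actual code; the sample rate, window length, and threshold are all assumptions:

```python
import numpy as np

FS = 256                   # assumed sample rate in Hz
BETA_BAND = (13.0, 30.0)   # beta waves span roughly 13-30 Hz

def beta_power(eeg_window, fs=FS):
    """Fraction of spectral power falling in the beta band."""
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg_window)) ** 2
    in_beta = (freqs >= BETA_BAND[0]) & (freqs <= BETA_BAND[1])
    return power[in_beta].sum() / power.sum()

def drone_command(eeg_window, threshold=0.4):
    """Hypothetical control rule: fly forward when beta power is high."""
    return "forward" if beta_power(eeg_window) > threshold else "hover"
```

A real system would also filter out noise and artifacts (eye blinks, muscle activity) before computing band power, but the core decision is this kind of threshold.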

So, when the race pilots enter a state of intense focus, their drones fly forward. The EEG headsets are consumer-grade, with only a few electrodes, so pilots can command only simple maneuvers. (A research-grade device could have hundreds of electrodes.) On the flip side, these consumer-grade headsets are easy to take on and off, and they don’t require much training to use. “We lean toward consumer-grade devices, since this is an entertainment application, and our goal is to provide participants their first experiences with this technology,” says Crawford, who is also director of the Human-Technology Interaction Lab at the University of Alabama. “This also means that we break a lot of devices.”

Though the Brain-Drone Race has the air of science fiction, the technology behind EEG recording has been around for a very long time — nearly a century, in fact. Brain-wave readings have long been used by doctors to diagnose certain medical conditions such as epilepsy or sleep disorders. Researchers have also been developing brain-controlled wheelchairs and robotic arms and studying uses of EEG for military applications.

But in the past decade, decreasing costs of EEG recording devices and improvements in our ability to decipher their signals have made this technology much more accessible to the general public. Now, brain-controlled technologies are popping up for recreation — from toy helicopters you can buy on Amazon to mind-controlled flamethrowers made by creative DIYers. And technologists say those simple uses of brainwave technology are just the beginning. In the future, EEG could change how we play video games, experience cinema, and function in our daily lives.


Before recent developments in EEG technology, creating a detailed link between brain activity and robotic movement — such as guiding an ultra-precise robotic arm — required implanting electrodes directly into the brain. It was a risky procedure that was generally used only as a last resort.

But in 2019, Bin He, a professor of biomedical engineering at Carnegie Mellon University, published a paper describing the first non-invasively controlled robotic arm capable of continuously tracking a moving cursor on a computer screen — using only electrodes on a headpiece worn on the user’s head.

Rather than assessing overall brain state, as in the Brain-Drone Race, the many EEG electrodes used to control He’s robotic arm pick up signals in the motor cortex, the area of the brain that controls movement. For example, if the participant imagines moving their right hand, the computer can detect this and move the robotic arm to the right.
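One common way such motor-imagery decoders work — a simplified sketch, not He’s published pipeline — relies on the fact that imagining a hand movement suppresses the mu rhythm (roughly 8–12 Hz) over the motor cortex on the opposite side of the brain. Comparing mu-band power at the left and right motor-cortex electrodes (conventionally labeled C3 and C4) then yields a crude left/right control signal; the gain and band limits below are assumptions:

```python
import numpy as np

FS = 256              # assumed sample rate in Hz
MU_BAND = (8.0, 12.0) # mu rhythm over the motor cortex

def band_power(signal, band, fs=FS):
    """Total spectral power of one EEG window within a frequency band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return power[mask].sum()

def horizontal_velocity(c3_window, c4_window, gain=1e-3):
    """Toy lateralization decoder: imagining a right-hand movement
    suppresses mu power at C3 (left motor cortex), so the C4 - C3
    power difference turns positive, steering the arm to the right."""
    return gain * (band_power(c4_window, MU_BAND) - band_power(c3_window, MU_BAND))
```

Real decoders are far more sophisticated — trained on many sessions of user data and combining dozens of electrodes — but this asymmetry is the physiological signal they exploit.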

In the near future, some artists believe that same kind of EEG technology will allow them to create the next generation of interactive films. Traditional interactive entertainment, like the 2018 TV movie Black Mirror: Bandersnatch, asks people to choose between different options by using a remote control or other handheld device. That interferes with the audience’s immersion in the film, says filmmaker Richard Ramchurn. “It takes you right out of the story.”

But Ramchurn’s futuristic short film The Moment, released in 2018, allows viewers to control the movie passively. They do so by wearing an EEG-recording headset, which resembles a pair of headphones with an extra band across the forehead. By analyzing the EEG, a computer can detect when the viewer is paying attention or not and alter the film accordingly, changing which scenes it shows.

The Moment follows three main characters as they navigate a dark future world in which certain people, called Outliers, are hunted and killed by militias. The film has three main narrative threads, but how much of each narrative is included depends on the viewer. Perhaps a viewer is bored of following Astrea, the Outlier on the run, but fascinated by the story of Andre, the former militia member. The film can alter itself accordingly. “The narrative which the viewer pays the most attention to will become more prominent in the next scene,” says Ramchurn.

The computer also analyzes a person’s attention during specific elements of the film, Ramchurn says, “and so it makes a decision on what elements are going to be in the next scene based on that. So the film really can play in lots of different combinations.”
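The scene-selection logic Ramchurn describes can be sketched as weighted sampling over narrative threads: the thread with the highest measured attention is the most likely source of the next scene. Everything below — the scene names, the label for the unnamed third thread, and the weighting scheme — is a hypothetical illustration, not the film’s actual engine:

```python
import random

# Hypothetical scene pool: each scene belongs to one narrative thread.
# Astrea and Andre are named in the film; "thread3" is a placeholder.
SCENES = {
    "Astrea": ["astrea_1", "astrea_2"],
    "Andre": ["andre_1", "andre_2"],
    "thread3": ["thread3_1"],
}

def next_scene(attention_by_thread, rng=random):
    """Pick the next scene, favoring the thread the viewer attended to most.

    attention_by_thread: mean EEG attention score per thread,
    measured during the previous scene.
    """
    threads = list(attention_by_thread)
    weights = [attention_by_thread[t] for t in threads]
    chosen = rng.choices(threads, weights=weights, k=1)[0]
    return rng.choice(SCENES[chosen])
```

Sampling by weight, rather than always picking the top thread, would keep the film from abandoning the other storylines entirely — one plausible reading of “more prominent in the next scene.”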

Researchers also have high hopes for the innovations that non-invasive brain-computer interfaces could bring to gaming, although this exploration is still in the early stages of development. Mike Ambinder, an experimental psychologist at the video game company Valve, says that measuring players’ brain waves as they play — quantifying their emotion and cognition — could one day help creators design games that adapt, in real time, to what players are feeling or thinking.

“If a player is feeling happy or sad, challenged or bored, frustrated or engaged, learning a concept, forgetting a concept, in an exploratory mood, dealing with a toxic player, entering into a flow state, struggling with accessibility, and so on,” Ambinder says, “the game will be able to respond to that information and shape gameplay accordingly.”

Ambinder says there’s a lot of work to be done before brain-controlled gaming can become a reality. “The trick is being able to transport these findings outside of the lab and into the real world — playing on your couch, sitting in front of your desktop,” he says.

Ambinder says that both hardware and software improvements are needed to allow EEG-based technologies to reach their full potential. Better hardware could improve signal quality. Larger data sets and improved algorithms could classify mental states in a wide variety of people with unique neuroanatomical and psychological quirks.  

But one day, these technologies could be as much a part of our daily lives as assistants like Siri and Alexa are today. Bin He, the biomedical engineer, is working on assistive devices for people with disabilities. He says he hopes someday to develop a non-invasive brain-controlled device for use as an everyday personal assistant for the public.

Theoretically, your mind could then control anything that a computer controls. A mind-powered phone could send text messages. A mind-powered drone could bring you a snack from the kitchen. And a computer hooked up to your brain signals could play the right music or movie to suit your mood. You could be in complete control over your environment without having to lift a finger, living in a science-fiction paradise.

Hannah Thomasy is a writer based out of Toronto and Seattle. She has written for Undark Magazine, OneZero, Hakai Magazine, and Atlas Obscura.
