Humans+Robots

One brain, two bionic arms

This man is learning to move two prosthetic arms directly, by thought alone.

By Eric Niiler

Inside a small hospital-room-turned-laboratory in Baltimore, a bald man sits in a wheelchair. Electrical wires lead from his head into a rack of computers, which are plugged into black prosthetic arms that seem to be moving on their own. The man’s actual arms lie folded on his lap. Two researchers call out commands to him — “left, up, again, right, down” — and record data as it streams from video monitors.

The man, Buz Chmielewski, grimaces as he tries to make the prosthetic arms reach for a set of red and blue cloth balls taped to a white board. “Pinch, pinch, extend,” he mutters. “Come on! There we go.” He grunts, bobs his head, and one of the bionic carbon-fiber arms slowly rises and extends. He is controlling the arm using signals from his brain.

“They don’t feel like they are connected to me,” says Chmielewski as he operates the arms using only brain signals. “It’s like I’m thinking, but I’m not thinking.”

This experiment could help quadriplegic patients perform tasks that they can’t today, and it could help with the development of future robot workers. It’s a collaboration between doctors at Johns Hopkins Hospital and engineers at Johns Hopkins Applied Physics Laboratory, a nonprofit research center. The experiment’s funding comes from the National Science Foundation, which is interested in what the team is learning about the brain, and the Pentagon, which wants to know whether soldiers could someday control drones or other vehicles using only their thought patterns.

Chmielewski, a volunteer, won’t directly benefit from the research or get a new set of arms, but he hopes the data being collected will help others. When Chmielewski was 16, a surfing accident on the Maryland coast left him paralyzed from the shoulders down, with partial use of his shoulders and wrists. Doctors recruited Chmielewski from university physical therapy programs in Maryland, where he has mentored younger spinal cord patients. In January 2019, 33 years after his injury, doctors at Johns Hopkins Hospital surgically implanted six devices, called multi-electrode arrays, into the surface of his brain. Each array is two and a half inches square and looks like a computer chip with little spikes underneath. The spikes are small, rigid electrodes that connect to a bundle of nerve cells. Three arrays connect to regions of Chmielewski’s brain that control his left and right arms. The others are connected to brain areas that relay sensory feedback from the prosthetic fingers.

The 10-hour surgery left the 49-year-old sales rep and teacher with three flat metal knobs on his head, the arrays’ connecting ports. Chmielewski occasionally likes to decorate the ports with tiny red horns, for a comically devilish look. Although he usually wears a baseball cap to business meetings, he occasionally gives his clients a friendly jolt by lifting his hat.

“I have no problem talking about what’s on my head,” says Chmielewski, who has been living independently for the past 30 years. “It’s a great conversation piece. If I’m talking to someone with a spinal cord injury, I tell them, here is some technology on the horizon that may be beneficial.”

Chmielewski, who lives north of Baltimore, visits Johns Hopkins three days a week for three- to four-hour sessions. Over time, he has built up the skills to move the prosthetic hands if he concentrates. “I have to relax my whole body,” Chmielewski says.

As he thinks about moving his arms, nerve signals travel from his brain to the computer and then to the prosthetic arm. When the prosthetic fingers touch something, tiny sensors send back information to his brain. To Chmielewski, it feels as if his actual hands are touching something, even though the signals are being rerouted through a computer and an artificial limb.

“Depending on where they touch, I get a different sensation,” he says. “It can range from pressure, like someone is grabbing your hand, to the feeling if you were to rub your finger over a fine-grit sandpaper.”

The research team reads Chmielewski’s brain signals through the multi-electrode arrays. Those signals train a machine-learning algorithm that decodes his intent from the neural patterns and translates it into commands that direct the arm’s motions. As Chmielewski does more training on the brain-computer interface, the algorithm “learns” what he’s thinking and moves the arm more reliably, much as practice makes a tennis player better.

It’s the first time that scientists have been able to connect someone’s brain to two bionic arms while also getting feedback from his fingers.


If successful, this experiment could lead to a major advance for stroke patients or others who have lost their limbs or who suffer from debilitating illnesses. Most prosthetics operate by picking up nerve signals from the arm or leg. But a thought-controlled prosthetic could also work a computer keyboard or mouse, for example.

Pablo Celnik is the experiment’s principal investigator and a professor of physical medicine and rehabilitation at Hopkins. He thinks patients like Chmielewski will one day be able to open a beer, type, or shoot a basketball using two bionic arms. In time, he says, the thought-controlled arms could be attached to an exoskeleton around the patient that would help perform certain tasks.

In time, Celnik says, the goal is to make the connection between the brain and the bionic arms operate more like the way our brain operates. Patients won’t have to make a conscious decision to move an arm; it’ll just happen. An improved brain-to-computer interface also means that the two arms will eventually operate together as a unit rather than individual devices. Imagine one arm holding a nail while the other hammers it into a wall.

Chmielewski grimaces as he tries to make the prosthetic arms reach for a set of red and blue cloth balls. Photo by Johns Hopkins APL

“One arm becomes the assistant; the other is the one to deliver fine movement,” Celnik says.

The Johns Hopkins project is one of several underway in the U.S. and Europe that aim to restore movement to paralyzed patients through direct brain control. The National Institutes of Health awarded $8 million last year to researchers at Carnegie Mellon, the University of Chicago, and the University of Pittsburgh Medical Center to expand an existing trial that allows disabled patients to regain arm and hand function with direct brain signals to prosthetics.

Since Chmielewski’s work with Johns Hopkins involves two prosthetic arms and finger sensations, many more patients could potentially benefit from his experiment. But the challenges are greater, too. Controlling two arms is more complex than controlling one. The computers have to process a lot more data.

Double-bionic arm systems could also be attached to robots someday, to take on dirty, dangerous, or dull missions that humans don’t like to do. Robots could explore and repair sewers or underground tunnels, or fly to Mars on a rover to dig up rock samples. “It’s a little bit of science fiction,” Celnik says, “but in reality.”

The mind-controlled bilateral arm experiment is a continuation of work started in 2006 by the Pentagon’s Defense Advanced Research Projects Agency, or DARPA. The military’s idea was to rapidly improve upper-extremity prosthetic technologies and provide new ways for users to operate them. One of the two arms that Celnik’s team and Chmielewski are testing went home in 2018 with a Florida man, who used it to work, play, and even learn a few tunes on the piano. He’s since returned it to its home at Johns Hopkins’ Applied Physics Laboratory — known as the APL — which conducts engineering work for the military.

In addition to grabbing cloth balls, Chmielewski has been helping the APL work on a project to identify enemy fighters using brain-controlled drones. It’s part of the program the Pentagon calls MAVRCS, for the Multiple Autonomous Vehicle Reasoning and Control System.

Some days, engineers from the APL visit the Hopkins lab and connect Chmielewski to a video game that features images of various rough-looking dudes in an open boat. Using his brain-control interface, Chmielewski evaluates how many men, how much cargo, and how many weapons are on board. He ranks each image with a “danger score.” A computer records Chmielewski’s decision-making brain signals.

Military researchers are using the data from Chmielewski’s gameplay to develop machine-learning algorithms for unmanned aerial vehicles, or UAVs. “So he’s basically training the UAV,” says Matt Rich, MAVRCS lead at the APL, based in nearby Laurel, Maryland. “He’s using his brain control interface to enrich the observations on which the machine learning algorithms act.”

Rich says the military will use the program to allow human operators to control drones using brain signals rather than a joystick. The technology could also be used to train drones to evaluate surveillance targets. The APL is getting closer to setting up a system that would allow Chmielewski to control a flying drone with his brain, a prospect that will probably mean moving to a bigger lab somewhere else. Rich says the testing drones would likely be small quadcopter devices available from Best Buy or Amazon. “These drones can be simulated or real,” Rich says. “We have experiments planned for both.”

The brain-controlled drone experiments are still in their infancy. Still, some human rights critics worry that one day, a weaponized drone equipped with fully autonomous artificial intelligence could make its own decisions to kill, rather than being guided by a human operator. In 2016, the U.S. Navy launched a 135-ton ship, the Sea Hunter, that can patrol the oceans without a crew — a project that a DARPA official described as “a new vision of naval surface warfare.” Built to detect and track submarines, the Sea Hunter could one day attack them directly, perhaps with help from the kind of surveillance drones under development at APL. In April, the Pentagon said it plans to spend $3.7 billion on unmanned weapons systems in fiscal year 2020, plus almost $1 billion on AI systems to counter threats from similar programs in China and Russia. But so far, U.S. military officials haven’t given machines full control, and they say there are no firm plans to do so, according to a recent report in The Atlantic.

Chmielewski, who planned to join the Marines before his accident, says he’s happy to play Top Gun, in the service of both military and civilian scientists. But the real thrill of the experiment, he says, is experiencing the new technology as it develops. He compares the work he’s doing in the lab to his love of video games.

“I’m the typical guy who likes to open up a new toy and doesn’t want to read the instructions,” Chmielewski says. “That’s kind of what I did when I got in here and saw all the arms. I was like, ‘Oh, let’s play.’ ”


Eric Niiler is a writer based in the Washington, D.C., area.


Top photo of Buz Chmielewski using the prosthetic arms by Johns Hopkins APL
