The dancer, spotlit on a dark stage, bends at the waist and lifts her head to face her partner. Her back to the audience, she pivots to her right, extending her arm at a roughly 45-degree downward angle in unison with the other dancer, who appears on a large video screen beside her. Slowly, smoothly, the two rotate forward, each swinging an arm, like mirrored images of a Flexo lamp bending to focus on a vast black desk.
Suddenly the two bodies break synchronization. Each finds its own pulse within the midtempo new age music, improvising movements while responding subtly to the other. It’s a physical call and response. They are truly dancing.
This is remarkable because there is only one human in the performance. She is Catie Cuan, a dancer and choreographer who has performed with the Metropolitan Opera Ballet in New York and the Lyric Opera of Chicago. Cuan, 31, is also a PhD candidate in the mechanical engineering department at Stanford University, where she specializes in robotics. Her dance experience explains why she is on this stage; her studies account for her partner, whom she calls Wen, but who is also known as ABB IRB 6700 — a 15-foot, 500-pound industrial robot arm, patched in from a studio via webcast.
The performance is titled “Output,” and it’s intended to illuminate how humans and robots are represented through technology and to explore new ways in which the two can interact. ThoughtWorks Arts, an incubator of collaboration between artists and technologists, developed the software that made “Output” possible.
The idea behind Cuan’s work is not just to showcase a 21st-century ballet with a real dancing machine. It’s also to show the audience a gentler, more relatable side of the seemingly cold and soulless contraptions — the robotic surgeons, drivers, educators, and even police — that they’ll increasingly interact with in the future.
“People gather their impressions of robots through the media,” says Cuan. “In the movies and on TV, robots are hyperbolic, usually male, and almost always adversarial. If you want to broaden the palette, and people’s perceptions, they need to see narratives that are different.”
For Cuan, the dance came first — almost, she jokes, before she was even born. Her father was a Cuban-born cinematographer living in Berkeley, California, and Cuan cites an old saying: “If you don’t know how to dance, you might not be Cuban.” Still, she took tap, jazz, and other dance classes starting at age 3. At 13, she entered the modern dance program at Berkeley High School, where she discovered choreography. After directing and choreographing her first production — a sweeping, breathy number about the passage of the seasons set to a moody Sia song — Cuan was hooked.
But computers were part of her childhood, too. Born in 1990, essentially next door to Silicon Valley, Cuan was 9 when her family got its first home PC. She remembers loving the sound of the machine booting up, the fans whirring to life inside. She spent hours hogging the phone lines on AOL dial-up, chatting and checking email. After high school, she interned at YouTube. While definitely a digital native, Cuan narrowly missed the social media onslaught that has left many of today’s teens ambivalently documenting their entire lives. She believes that freed her to view technology with a naïve optimism.
The unlikely intersection of these two interests came in 2014. Cuan’s father suffered a stroke that led to weeks in the hospital and time at home doing occupational therapy. Throughout his treatment, Cuan saw him surrounded by large, esoteric machines that made him feel alienated, scared, and disempowered. “I thought to myself, ‘I’m an expert in making meaning out of movement,’” Cuan says. “‘Why can’t we apply that knowledge to the machines in these unexpected places like hospitals?’”
So, she did. In a dual role as artist-in-residence and research technician at the University of Illinois, she collaborated with the school’s Robotics, Automation, and Dance Lab on “Time to Compile,” a performance piece featuring humans and robots that contrasted the relatively short time it takes a human to learn a new dance by visual demonstration with the long process of programming a machine to do the same moves. She also invited the audience to engage with the machines and then talk about whether the experience changed their perception of robots. A later project involved adapting jazz and ballet moves into a robot’s rigid joint movements to make them appear more fluid.
“I could have made music videos with robots forever, and that could have been a career,” Cuan says. “But that’s not rich enough for me. It doesn’t touch upon the human-robot interaction. The arts could have a huge, scalable impact, if we could just unlock the secret that allows us to make new interfaces.”
Cuan’s work has attracted admiration from others in her emerging field, such as Sydney Skybetter, a dancer and Brown University professor who teaches choreographics — his term for the emerging study of dance, computation, spectatorship, and surveillance. “As a choreographer, she’s an expert in how Western dance tradition treats gesture and focus,” says Skybetter. “As a technologist, she’s thinking about bodies and technology differently. Your average tech bro isn’t going to be as well versed in bodily agency and choice-making.”
Some more traditional dance artists have been resistant to Cuan’s work with robots. She says other choreographers have told her to her face that “dance is something humans do.” In their view, Cuan’s work may have taught machines how to “create” new movements using datasets and algorithms, but a robot cannot be inspired. It has no emotions to express or exchange with a human partner.
From the technologist’s perspective, though, that’s missing the point. We may not see robot chorus lines or autonomous mechanical ballet companies onstage in the near future, yet robots of different physical types are already everywhere outside the theater, working among us. Is there any doubt that we’d be better served (quite literally) by having consumer-facing machines that are more approachable?
“In a health care setting, people are already more comfortable disclosing uncomfortable information, like substance abuse or sexual behavior, to a robot than a human,” says Timothy Bickmore, professor and associate dean for research at the Khoury College of Computer Sciences at Northeastern University. “There are certain verbal and nonverbal behaviors — displays of empathy, immediacy behavior, and more gestural frequency — that doctors use to make patients trust them. Why not make the robots more emotive to build more trust?”
Trust is the operative word. Cuan believes that for any trusting relationship to take root between human and machine, the humans need to at least feel like they are still in charge. “How can robots learn from humans in a way that enables the humans to remain in control?” says Cuan. “That seems like a research question, but it could be an artistic one.”
Back onstage, Cuan and Wen, the robot arm, embody this complex relationship. While the motions are fluid, there are momentary pauses of seeming uncertainty, as if each is waiting to see what the other will do. But obviously, Cuan trusts the machine’s programming not to deviate in a way that would ruin the live performance. For Wen’s part, its movements are smooth and attentive to Cuan’s. It’s easy to imagine Cuan entrusting a similar robot to pour her a cup of coffee or even to perform a life-saving surgery. And she hopes that, as more engineers design robots for the workplace, they will take some cues from the cooperative rhythms of dance.
“In the future, we’re going to see millions of people use robots, and not in the passive way they have been,” Cuan asserts. “And I don’t know how you can do this without choreography.”