Humans+Robots

When AI joins the band

The dance-pop trio YACHT used machine learning to build an album. Here’s what happened.

By Tony Rehagen

In April, singer and composer Claire L. Evans found herself in unfamiliar territory. Her band, the Los Angeles-based post-pop trio YACHT, was playing onstage at a documentary film festival in Copenhagen. It was the group’s first live performance since February 2020. Despite the musicians’ rigorous preparation, the songs on the setlist, from their 2019 album Chain Tripping, weren’t coming back to them as easily as they should have.

“We had no muscle memory for this music,” says Evans, the lead vocalist. “The melodies are harder to retain, because they didn’t come from our bodies in the first place.”

The songs were YACHT originals, with words and melodies derived from the creative minds of Evans and electronic musicians Jona Bechtolt and Rob Kieswetter, the band’s regular songwriters. But all 10 tracks on Chain Tripping — songs with names like “Loud Light,” “Sad Money,” and “Stick it to the Station” — were completely written by machine learning.

How the band managed to use artificial intelligence as a collaborator, a sort of phantom band member, has already become part of music-industry lore. Starting in 2016, the members of YACHT took the music from their first six albums, 82 songs in all, and ran it through a series of algorithms that essentially chewed up their body of work, analyzed it, and spat out brand-new songs. Chain Tripping — the album’s name was also machine-generated — garnered national headlines for its use of the burgeoning technology, a Grammy nomination for best immersive audio album, and mostly positive reviews; one music magazine called it “an oddity worth exploration.”

Now, three years after the album’s release, the band isn’t converting fully to AI songwriting. But Evans says the experience changed not only their creative process, but also their understanding of how music is made.

“We went into this naively and blindly,” says Evans. “Part of the intent of this project was that, maybe by running our back catalog through these mystical algorithmic processes, we will come to some better understanding of who we are, tap into the DNA of a YACHT song, and understand the formula of what makes us who we are. I think what doing this made us realize is that that formula doesn’t exist.”


With its AI experiment, YACHT tapped into a relationship between computers and music that goes back to the 1950s and 1960s, when engineers like Harry Olson, Herbert Belar, and later, Robert Moog and Bernie Krause began experimenting with the use of electronic synthesizers as accompanying instruments. In the 1990s, scientist and music professor David Cope developed algorithms in software called EMI — Experiments in Musical Intelligence — that could produce original works in the styles of famous composers, like Bach. Cope followed up that work a decade later with an interactive program that could build compositions from a database and shape it according to listener feedback.

That was the precursor to the technology YACHT used to create Chain Tripping, says David Bernstein, a professor of music at Mills College at Northeastern University who specializes in American experimentalism and avant-garde music. He predicts that YACHT’s high-profile collaboration with AI could encourage more experiments with music and technology.

“When I think about AI, I’m thinking about experimental music and the emphasis on spontaneity; in other words, to create music that’s coming out in real time without notation and without preparation,” Bernstein says. “Now, we’re arriving at the gate to go to the next step — to produce machines that create something out of nothing.”

I’ve left my life behind
Yeah, I’m back on the floor
I’m a mile to the ground
Walking in the sky

AI-generated lyrics from YACHT’s song “Blue on Blue”

YACHT, a punk-edged dance-pop band that formed in 2002, has always strived to be on the cutting edge. The band’s name is an acronym for Young Americans Challenging High Technology. Computers and processors have always been integral to its sound. Pushing boundaries and co-opting corporate tools to create art are central to its ethos.

In 2014, the band released “Where Does This Disco,” a five-song EP that came with a bonus compact disc encoded with the entire YACHT catalog to date. But the bonus disc was completely clear, with no reflective foil coating, and therefore completely unreadable on any CD player — a critique of the obsolescence of physical media. The following year, YACHT promoted their LP I Thought the Future Would Be Cooler with a website that faxed the album artwork to fans and a video that only played when ride-share prices were surging in Los Angeles.

In 2016, band members pushed their commentary too far, releasing a video and claiming it was an unauthorized sex tape of Evans and Bechtolt (who were romantically involved). Later, they admitted that the whole thing was a hoax, intended as a comment on celebrity culture. When they were pilloried in the press and social media for making light of revenge porn, the group apologized and decided to take some time off.

“It was the most devastating failure of our lives,” says Evans. “It was really cynical, and it made us as a band completely reevaluate what we were doing.”

The incident and the resulting fallout paradoxically left the band with a freedom of identity to explore something completely new — and yet a longing for some form of creative constraint to keep them from spiraling out of control. Delving into the realm of machine learning seemed to satisfy both needs. “We knew very little about the tech, but we had this collective creeping sense that this technology was coming, and it was going to be really important,” says Evans. “For us, making things is a way of understanding.”


As steeped as Bechtolt, Evans, and Kieswetter were in technology, none of them were coders or programmers. So they reached out to everyone they could think of at the intersection of AI and creativity, drawing partly on knowledge Evans had gained as a science and technology journalist. Bechtolt and Evans consulted with Cope and other engineers, musicians, and pioneers in the field, then cobbled together pieces of existing technology into a unique songwriting process. They painstakingly broke down all of their songs and entered them, note by note, into MIDI files, then picked out melodies, riffs, and drum patterns to loop and feed into their machine-learning model.

Every five minutes or so, the model spat out roughly eight bars of MIDI data, until the band had hundreds of melodies to work with. Bechtolt then paired each loop with another, using random and semi-random methods. The result was a few thousand fragments that the human artists could work with and assign to different roles (bassline, guitar, keyboard, vocals, drum patterns). They then built each song piece by piece, like a collage. Once the songs were arranged, the band performed and recorded all the parts live.
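The collage workflow described above (accumulating short MIDI fragments, pairing loops at random, and assigning each pair to an instrument role) can be sketched in a few lines of Python. Everything here is illustrative: the fragment representation, the pairing function, and the role list are invented for the example and are not the band’s actual tools.

```python
import random

random.seed(7)

# Stand-in for the model's output: pretend each fragment is a short run of
# MIDI note numbers (a simplified eight-bar loop). The band accumulated
# hundreds of these before assembling anything.
fragments = [[random.randint(36, 84) for _ in range(16)] for _ in range(200)]

ROLES = ["bassline", "guitar", "keyboard", "vocals", "drum pattern"]

def pair_loops(frags):
    """Pair fragments at random, echoing the band's random and
    semi-random pairing step."""
    shuffled = frags[:]
    random.shuffle(shuffled)
    return [(shuffled[i], shuffled[i + 1])
            for i in range(0, len(shuffled) - 1, 2)]

def assign_roles(pairs):
    """Assign each paired loop to an instrument role for the collage."""
    return [{"role": ROLES[i % len(ROLES)], "loops": pair}
            for i, pair in enumerate(pairs)]

pairs = pair_loops(fragments)       # 200 fragments -> 100 pairs
song_parts = assign_roles(pairs)    # each pair tagged with a role
```

The point of the sketch is the shape of the process, not the music: the machine supplies raw material in bulk, and the human step is selection and arrangement.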

As they did so, they held themselves to a set of rigid but simple rules. The band could not add or improvise anything. No additional notes, harmonies, melodies, drum patterns, or lyrics. They could subtract here and there, though they rarely did. Everything had to come from the machine.

“It felt more like craftsmanship,” says Bechtolt. “So many times, I could hear a harmony in my head, but because of these rules we’ve self-imposed, I was not allowed to add a harmony. It was a strange push and pull of having all these wild new things we could use to push ourselves in other directions, but for this project we want something that is somewhat recognizable.”

For the lyrics, the band worked with AI poet and author Ross Goodwin, known for creating movie scripts through AI and programming a car to “write” a novel. Goodwin helped them train an algorithm on a database of more than 2 million words taken from their back catalog and the works of some of their favorite bands. They printed out the results on continuous-form paper and then went through with a highlighter and sticky notes to find phrases that resonated and syllable patterns that matched melodies from the music model. They never added or subtracted a word, nor allowed themselves to break up sentences. The results include this passage from “Blue on Blue,” the fifth song on Chain Tripping:

When I ran the rainbow and the night would shine in the back of
my mind I never saw a valley walking along the street and
following me and that’s okay
I’ve left my life behind
Yeah, I’m back on the floor
I’m a mile to the ground
Walking in the sky
Blue-eyed boy
For Evans, the lyricist, it was a revelatory experience. “I’m really married to meaning and narrative structure and thinking about what I’m trying to say, and that, I think, has held me back as a songwriter,” she says. “With this process, I began with the words and had to find the meaning those words expressed. Sometimes meaning emerged through singing. Sometimes it was fluid, changing from one performance to the next. I like that the lyrics on this album are open to interpretation, by both myself and the audience that hears them.”


After three years of experimentation and work, Chain Tripping was released to a fair amount of praise and national recognition. It drew attention not only to YACHT, but also to the presence and possibilities of machine learning in popular music. And perhaps most impressively, it managed to come off as sincere and original.

“They have produced something that doesn’t seem like a gimmick,” says Bernstein. “Even the most radical improvisation isn’t exactly spontaneous; it emerges from some place in your brain. To be able to create this with a machine is an extraordinary project.”

Ever the technological pluralists, the band filmed a documentary about the making of Chain Tripping, called The Computer Accent, which they will release in the U.S. this fall. The pandemic cut short any plans for a full tour behind Chain Tripping; the band will play the songs at a few shows to support the film. But other than that, YACHT has moved on to something completely different. “Our new music isn’t very AI-y at all,” says Evans. “After the exacting constraints we imposed on ourselves to make Chain Tripping, we wanted a release — to make songs that feel good to play and that emerge from a more spontaneous and unstructured composition process.”

While Evans says the new material “is less cerebral and more intuitive,” the band has kept up with advancing technology. The songwriters even occasionally return to their Chain Tripping algorithm for help breaking through creative blocks on a lyric or coming up with a variation on a particular theme. After the arduous process of putting together Chain Tripping, AI is, if not an ongoing member of the band, at least a trusted creative consultant.

“When we hit a wall creatively, we have that resource to help generate something,” says Evans. “We want to think of the machines as humanized, but it’s nice to have an entity that can generate ideas and doesn’t get their feelings hurt. We don’t want it to do our job for us. We just want to find new ways to approach our work, so that we can continue to evolve.”

Tony Rehagen is a writer based in St. Louis. His work has appeared in Popular Mechanics, Bloomberg Businessweek, The Washington Post, The Boston Globe, and Pacific Standard.

Graphic by Chloe Prock
