Last year, the dance-pop band YACHT released Chain Tripping, a Grammy-nominated immersive album made by band members Claire Evans, Jona Bechtolt, and Rob Kieswetter using Runway. Based in LA, the band used the tool to design the album’s songs and visual assets, including music videos, band photos, an album cover, and live installations. They even named the album using GPT-2.
YACHT’s interest in using ML as part of their songwriting process began five years ago. With her background as a science editor and writer, Claire was curious about the implications AI might have on the arts and creativity. She mentions reading the writings of David Cope and seeing the work of artists like Mario Klingemann, Memo Akten, and Robbie Barrat.
“We don’t really know what’s the incorrect way of doing things … You start with brute force, try stuff, and end up making things maybe a trained programmer wouldn’t think to do.”
When it came to Chain Tripping, the band saw an album as an opportunity to better understand machine learning. “We’re always interested in starting a conversation with a new technology that’s not for us,” Claire says. “Runway was the first user-facing tool we saw.” Through music, visual art, photography, video, typography, design, and layout, the band created what they call a “document” on the current status of art and ML.
With Runway, the band found an accessible tool with a marketplace of models they could experiment with. Since they’re not computer programmers, Claire and Jona had previously outsourced this kind of work to more experienced collaborators. Relying on other people was frustrating, Claire says. “Runway is the only tool that we were able to actually mess around with ourselves,” she adds. “That’s how you learn how systems work.”
Looking back at their process, Jona sees the creative value in misusing the models. “We don’t really know what’s the incorrect way of doing things,” Claire reflects. “You start with brute force, try stuff, and end up making things maybe a trained programmer wouldn’t think to do.”
In the Scatterhead music video, the duo pulled archival dance footage and fed it into pose-mapping models to create glitchy dance forms, then layered on forms from DensePose, a model that let them map their own 3D forms from TV video images. “What’s interesting and important is that these tools are not perfect,” Jona says, describing how their limbs would disappear and reappear on a different part of their body. “But where they fail is often where the most interesting aesthetic stuff happens.”
“We’re all making the meaning in time, in my performance, and listening to the song. What someone thinks the song is about is as valid as what I think the song is about.”
Listening back to the album now, the band describes the material as living in the space between traditional narrative songwriting and AI’s anti-narrative generative processes. “The big question was how can we use ML to create something that isn’t just technically music,” Claire says, “but actually fits in a body of work we can be proud of.”
For most artists, a work’s meaning comes from its creator. ML disrupts this relationship. For YACHT, the creative process meant sitting in front of AI-generated text and language, aka “nonsensical gobbledygook.” The challenge was then to find meaning in this raw language, which arrived through a process created by humans, from input selected by humans. “That process of interpretation and arrangement is a huge part of where meaning actually works,” Claire tells us. In other words, it’s the difference between how you say something and what you say.
Before Covid-19 canceled live shows, the band performed Chain Tripping to their fans with Runway-generated live projections. “Because [this album] came from the space between us and the machine, there’s this lovely collective experience that we get to have with our audience,” Claire says. “We’re all making the meaning in time, in my performance, and listening to the song. What someone thinks the song is about is as valid as what I think the song is about.”
Jona mentions the aftershow conversations, where the band would talk to their fans about the ML processes used in the performance. “We told them to just go home and download Runway and play with that,” Jona describes. “These tools create completely unpredictable results. You have to make the best of those, but it’s really fun.” Looking back, he admits, discussing ML tools at 1 AM in a rock club made for a pretty weird conversation.
“Runway just puts it in your pocket. It makes it so that you can make massive video installations using machine learning in your own backyard. That’s wild.”
Alongside their visual album, the band also found time to design a video installation for Dolby Gallery in San Francisco. Running 96 feet long, the installation became an opportunity to understand what ML systems actually see, and what is lost in translation. By referencing Eadweard Muybridge and the early history of photography, the artists created a visual statement on how the world will, eventually, be reshaped by ML.
The challenge in this project was to stitch together one continuous image across 96 feet of space. They captured their original footage using an iPhone in their backyard. “We just showed it to the computer and the important data was pulled out,” Claire explains. “Runway allowed us to create something that was a very high production value with very little resources.”
As a band, Claire and Jona’s philosophy is to do as much as possible, with as little as possible. They describe Runway as both liberating and democratizing. “These tools usually have gatekeepers or a level of computational knowledge to access them,” Claire says, mentioning how hard it is for independent artists to engage with machine learning. “Runway just puts it in your pocket. It makes it so that you can make massive video installations using machine learning in your own backyard. That’s wild.”
Get their visual album on Blu-ray now!