Sasha Kasiuha is a New York-based multidisciplinary artist focused on merging creativity and technology. Most recently he served as the Content Director for Madonna’s Celebration Tour, a world tour that runs through April 2024 and spans over 80 shows in 13 countries. In this conversation, Sasha discusses how he and his team brought new forms of visual expression to the Celebration Tour with stage visuals generated entirely with Runway.
Tell us about the creative vision for Madonna’s Celebration Tour. Did you know from the beginning that you wanted to incorporate AI?
I’ve been working with Madonna for a couple of years now. She’s always been open to embracing new technologies that can unlock new visual elements - that’s never been something she’s shied away from. I’ve personally been using different AI tools throughout my process for a long time, so together we knew we wanted to find a way to use AI if it made sense, and if it helped us achieve our specific vision for the tour.
What was the process like generating the stage visuals for Madonna’s iconic song, La Isla Bonita?
La Isla Bonita is a song that transports the audience to an ethereal, tropical place, and we wanted the visuals to match that heavenly vibe. We originally tried traditional CGI, but it looked flat and cheesy - a bit too ordinary. It just didn’t fit. That’s when we turned to AI.
To actually create the visuals, we tried a few things in Runway. First, we used different text-to-video prompts like “surreal sunset, clouds” to quickly generate a variety of options and help us land on a stylistic direction. Once we had a sense of the creative direction, we used Gen-1 to transfer styles onto existing videos. What was interesting was that the input videos had nothing to do with clouds - they were videos of things like water, tunnels, animals flying, or nebulas - things that had really great movement and contrast. We turned those videos into clouds using a reference image. The end results weren’t quite real, but they weren’t quite CGI either - they were surreal, with a quality that’s totally different and hard to describe. That’s what we were going for.
Since the tour began, you’ve reimagined the visuals for Take a Bow - the current visuals were created entirely with Gen-2. What led to that pivot, and how did the process compare to La Isla Bonita?
The reception to the generated visuals for La Isla Bonita was overwhelmingly positive, which gave us the confidence to keep experimenting with new techniques. The creative vision for Take a Bow involved more of a narrative with a cohesive theme - we wanted to bring the audience to a world beyond the screen. This led us down a slightly different production path than La Isla Bonita.
We started by creating base images with text-to-image to achieve a specific, cohesive style representing both the tangible world and an otherworldly sacred realm, glimpsed through the doors on screen - a world inspired by the surreal realms depicted by Dalí and Magritte. We wanted to take the audience on an exploration of infinity, so that’s what we were prompting for. We then took the generated images into image-to-video within Gen-2 to fully bring that limitless world to life.
The camera and motion controls in Gen-2 were indispensable for creating these visuals in a way that was consistent with our specific vision. These tools weren’t available when we first kicked off the creative planning for the tour a year ago, and they made all the difference in helping us achieve the final visuals.
What timeline were you working with? How big was your team?
I had a small team of 5-6 people, and we had a timeline of a few months, which for a tour of this size was quite quick. One of the things we liked about Runway was how easy it is to use. No coding or technical experience was needed, which was extremely important for us on a tight deadline. We needed something fast and easy to use that still gave us the control we needed. We also appreciate how often Runway releases new features and updates - Motion Brush immediately comes to mind as a release that’s been so helpful in the creative process.
What are you excited about with this technology going forward?
This technology is progressing so fast, and it’s making video creation easier. I couldn’t have imagined doing any of this a year ago, and now it’s something that can be incorporated seamlessly into filmmaking, music videos, commercials, and, in this case, tour visuals.
Ultimately, for me, AI offers the opportunity to visualize content in a much shorter time frame. You can write out some prompts and ideas, see what direction and aesthetic you like, and let your process evolve from there. It maximizes your time and efficiency in the early parts of the creative process - and in this case, it’s also how the final video was created. I’m excited to see how this technology continues to evolve.