
What was the original vision for The Animal Breakdown?
The Animal Breakdown is a light-hearted history and science show about animals, packed with fun scientific facts and trivia on everything from dinosaurs to chickens. The concept has a very specific look: classic, old-school textbook illustrations, but with bright, vibrant color.
Finding stock imagery that matched that style would have been nearly impossible. Traditionally, we would have needed artists to illustrate the animals and then animate them, which would have been a lengthy process. Realistically, we probably would have looked at the scope and said, "Maybe this show isn't worth it."
Each episode took us about two weeks to complete all the graphics. If we had to illustrate everything and then animate it, we would have needed more people, and it still would have taken at least double, if not triple, the time. Even then, we likely wouldn't have achieved the same level of quality we're getting from Runway.
How did you discover Runway could solve this problem?
We'd been tracking Runway closely for about a year before adopting it. During that time, we were thoughtful about how AI could complement our creative process and support our teams rather than disrupt them. Once we were confident it would be additive, we leaned in. Two years later, it's part of our workflow across many of our streaming shows.
From my perspective, the time savings aren't just on the motion side. The still imagery is a huge part of it too. We're constantly running into limitations with stock. When you're designing opens or green screen backgrounds, you're digging through options, and if the color or style isn't right, you end up compositing and piecing everything together. The hours Runway has saved us there alone have been enormous.
Walk us through the actual workflow for The Animal Breakdown. How did you use Runway?
At the start of each episode, producers would give us a graphics shot list. Each graphic was the scientific explanation, with notes on how it should look, how it should animate, and how it should be timed to the voiceover.
Traditionally, that would have meant digging through stock and making do with whatever we could find. With Runway, the process was different.
First, we built a consistent style in Runway: a vibrant, old-school textbook aesthetic. Once that was locked in, we prompted the animals we needed. We usually started by generating each animal on a green screen so we wouldn't end up with abstract backgrounds that would be hard to key later.
We'd generate a few options for each animal and send them to the producer for feedback. Once the static image was approved, we'd use Runway to animate it. A lot of the time, it nailed the motion on the first pass, and we got significantly faster at prompting as we learned the platform.
From there, we comped everything together: backgrounds, animals and any text. Most of the animation was created in Runway, then refined and finished in After Effects.
How big was the team working on this show?
It was a very small graphics team, just three people. We produce large-scale, premium content with a small crew, so we've learned to be resourceful. Typically, we only have one graphics artist fully dedicated to a show, and that's where Runway has been helpful.
What challenges did you face?
The biggest learning curve was prompting. Sometimes the animation would introduce elements that weren't in the static image, so we had to learn how to prevent that and get more specific with our prompts.
We wrapped the show graphics at the end of September, and the technology has moved incredibly fast since then. When we spoke with the Runway team recently and they walked us through their new feature, Workflows, our first reaction was, "Oh, this would solve a lot of the areas we struggled with last season." The platform has advanced so much, even since we finished.
How has Runway changed your capabilities beyond The Animal Breakdown?
We've been using it across many areas of the company. Our social team uses it often for thumbnails. When you're scrolling, you need compelling imagery that stops people in their tracks, and with user-generated content that can be tough. Footage can be grainy, or you can't find a clean frame because there's so much motion, especially with FailArmy content. Runway has been a game-changer for thumbnail creation.
We also use it to remove burned-in text and graphics from acquired video, which is a major challenge when we're repurposing footage for streaming. Historically, blurring was the only workaround, but it's not a great viewing experience. With Runway, we've been able to remove those baked-in elements completely and deliver clean, distraction-free footage that looks significantly more premium.
Our marketing design team has been testing Runway's Workflows feature. We took a single image of our magazine and generated 20 product placement variations, each with room for copy. The speed is incredible, and the results hold up. The aspect ratios, angles and lighting are all on point. It's ideal for ads that require lots of variations across formats.
On the print side, some of our designers are using it for magazine cover concepting. What used to require manual compositing can now be explored in minutes, so they can test more directions, faster.
You mentioned Season 2 is coming. How will that be different?
We're ramping back up in April. Runway's new Workflows feature should help us stay much closer to the initial shot lists because we can be more specific upfront. That means we can be more targeted this time around and save time overall. Runway has evolved a lot since Season 1.
This is also one of the most elaborate shows we've done, so we're already seeing what's possible for other shows across our streaming channels, too.
What advice would you give to other brands thinking about using AI tools?
Think about AI tools as a workflow accelerator, not just a content generator. We're a company built on UGC and trusted brands, so authenticity is our top priority. At the same time, as designers and animators, we've always had to be resourceful and find smart ways to create graphics. Anything that gets us to the final product faster is a win, as long as it doesn't compromise our brand standards.
What we're creating with Runway has been beneficial in so many ways, but it still feels like we have ownership of the work. It's a piece of the puzzle, not the whole puzzle, and it doesn't look like the generic AI slop you see everywhere.
Design knowledge, taste and creativity still matter. Runway isn't a replacement for humans; it's there to support the process. Once you move past the hesitation and focus on how it can speed you up, you become more productive and can put more energy into the parts that truly need a human touch.