
Jon Erwin is a Writer, Director and Producer, and serves as CEO of Kingdom Story Company, an independent film and television studio. Most recently, Jon created “House of David,” a critically acclaimed series produced by The Wonder Project in collaboration with Amazon MGM Studios, with international distribution from Lionsgate.
Tim Moore is the founder and CEO of Vū Technologies, a VFX and creative technology studio that worked with Jon on House of David. In this conversation, Jon and Tim discuss how their teams used Runway to bring several ambitious, complex scenes from Episode Six to life in minutes.
Tell us about House of David and the story you’re telling through this show
JE: House of David is a historical epic about the legendary, true story of David, who slew the giant Goliath and became one of the most famous kings in history. It’s a classic coming-of-age hero's journey, in the same genre as many stories that I love, like Lord of the Rings. In fact, I think David is the origin of some of those stories – he's one of the originals of the hero's journey as a genre.
What led you to use AI in House of David, and what made it successful?
JE: Overall, I really believe in living in a constant state of curiosity. I think it’s one of the most underrated virtues and so I love to constantly learn. I also love new technology, which has been a huge part of my career.
What led me to AI was my production designer, who was using some of these tools and blowing me away with his results. So about a year ago I started playing around with the tools myself and just felt directly tethered to my imagination. It felt magical, like that moment a filmmaker gets a camera for the first time.
I started using them as high-end previs, so that we could begin to demonstrate some of the images we wanted to create, and that was so powerful that I felt we could then up-res some of the work to finalize in the show – and I felt like we could be one of the first in the industry to do it. That’s what we did with Season One of House of David, and we just kept building from there.
These tools have become a massive part of our workflow. They allow us to work faster, they allow us to do work that we would not have otherwise been able to do and we're able to do it in a cost-effective, creatively exceptional process. It's an incredible set of tools that allow you to manifest your dreams – in real time.
What surprised you the most about using AI?
JE: I was blown away that these AI tools handle physics simulations – water, rain, atmosphere, smoke, wind – far better than any other VFX tool I’ve ever used, and by what we were able to achieve as a result.
These are the most intuitive and most creative set of tools I’ve ever experienced. They’re iterating profoundly and quickly, at a rate and scale that I’ve never seen. That’s how we’re able to achieve a level of photorealism, and a level of consistency, that I think is truly unique.
Take us through the scenes you used Runway for - what were some of the biggest benefits?
JE: The entire origin sequence for Goliath is driven by generative AI tools as the horsepower of the scene. What we found is that these tools work well when combined with traditional tools. The sequence in Episode Six is truly a combination of all the difficult things in VFX – you have photoreal digital characters, and simulated elements like rain, smoke and wind. The angel wings have feathers on them. Just the asset build alone would have taken forever. Character consistency and environmental consistency were really the trick, and the hard part about the sequence was making all of these shots match, and making the sequence feel human and organic.
Runway’s tools allowed us to create these photoreal visuals and tell the story, but do it within a budget and time frame that we could afford. With AI, you can dream in real time and collaborate on the material much more quickly. That enabled us to get through this scene in a couple of weeks, where it would have taken four or five months in a traditional process. In minutes, you can have an army of 10,000 that’s still in the world of your show.
Tim, what was it like to work with Jon’s team, combining AI with your traditional VFX workflows?
TM: When you look at technology over the last couple of years, the entertainment world has been at the forefront of a lot of new tech, but AI is a quantum leap beyond that. House of David is such an amazing project to work on because they’re really innovating a new workflow for the whole entertainment industry.
Working on this project with Jon and his team and their vision really pushed us to think about how to make something faster, more instant. That’s really what AI can do. It’s thanks to companies like Runway that this generative AI workflow is possible. We’re really pushing the limits of what can be done with generative AI and virtual production. And while we’re doing things we used to dream of, we’re just at the beginning of what’s possible with this new technology.
Season Two will be released on October 5.