How Havas Group’s AI production workflows are transforming global brand campaigns

KEY STATS:

$3B+

Havas Group annual revenue

100+

countries served by Havas Group

80%

of incoming client requests ask about using AI in production

25%+

of creative work for one Fortune Global 500 consumer brand is AI-generated

David Tamayo is Creative AI Director at Prose on Pixels, a global content division of Havas Group that blends advanced technologies with deep vertical expertise, helping Havas’ vast network of agencies serve clients including Adidas, Burger King, Domino’s and Santander. Havas Group—which operates in more than 100 countries and drives more than $3B in annual revenue—has been an early global adopter of generative AI for advertising, using Runway extensively for campaigns with major brands like Michelin and Woolite. In this conversation, David discusses his use of Runway as an essential production tool, and why AI is reshaping how agencies think about what's possible within any budget.

Tell us about Prose on Pixels and your role within the Havas ecosystem.

Prose on Pixels is Havas Worldwide's in-house production company. I'm the Creative AI Director at Prose on Pixels New York. My role bridges creativity and production – I'm not just ideating concepts, I'm actually producing content using AI to expand our creative capabilities.

I spent two years in France developing the AI studio for BETC, one of the most creative agencies in the world [and another Havas subsidiary]. We realized early on that AI needed to be integrated both on the creative side and the production side, because we could see production was going to evolve dramatically. So we implemented the studio at both BETC and Prose on Pixels to make sure we were positioned for that transformation.


What types of projects do you work on?

We don't limit ourselves to any specific type of client or campaign. The goal is to identify where AI adds value across the board – whether that's social media content, print, video or automation. Sometimes AI is just 5% of the production, sometimes it's 90%. It really depends on what serves the creative vision and the client's needs.

Right now, I'd say 80% of incoming client requests specifically ask about AI. They're seeing competitors use it, they're seeing the tsunami of AI news and they want to understand what's possible. One Fortune Global 500 consumer brand we work with has a mandate from top management that at least 25% of their creative work be generated using AI.

How did you first start working with Runway?

I began exploring AI tools back in 2023, starting with open-source image generation models. I was using some of those early tools to really fine-tune and guide image generation. Eventually I started applying that same methodology to video, and that's when I turned to Runway – especially for enhancing basic 3D video inputs to create something more polished and dynamic.

Which Runway features have been most valuable in your workflow?

For a long time, video-to-video has been my go-to tool – I got really comfortable with the slider that lets you adjust how strongly the input video influences the final output. Now with Aleph, it's even better – I can do everything I was doing before with video-to-video, but with improved results.

The workflow is always the same for me: I create a simple 3D input for camera control – without spending time on texturing or rendering – then use Runway to handle the texturing and lighting and achieve whatever look I want. It's incredibly powerful because you can change everything downstream. I did a video where a car drives along a road – I created the 3D render once, then used Runway to generate the same sequence in different seasons. I made one version for summer and one for winter, and could do a perfect split screen because the camera movement was identical.

What's been your biggest challenge working with AI video tools?

Control. And I'm not a control freak – it's just that from a cinematographic perspective, the camera angle, movement and composition are the most crucial elements. Even something as simple as applying the rule of thirds can completely change the shot or story you're telling.

My biggest challenge is getting that exact level of precision. 3D inputs help, but I don't want to spend forever on texturing and rendering. AI is a huge help, but there's still some inconsistency, especially in maintaining style and character across shots. Sometimes the 3D elements don't read perfectly, or the output varies depending on the shot. There are probably factors I'm still figuring out, so it's definitely something I'm continuing to refine as these tools improve.


Tell us about the Michelin campaign – that seemed like a perfect use case for Runway.

Michelin wanted to tell the story of a tire traveling through different environments to showcase their different tire types – one for the rainy season, one for all-season, one for summer, etc. The creative team at BETC wanted this beautiful, poetic video of a tire rolling through different landscapes. Initially they wanted to do it in stop-motion, but the budget was low because the ad was intended for social media.

This was back in 2023, when I was just getting started. At first we used a variety of early AI tools, and we had to keep switching between them to get the effects and control we needed.

When Runway released their video-to-video feature [in February 2023], we switched to that for the final three videos of the campaign because it saved us so much time and cost, and the results were significantly better. We got fantastic feedback from Michelin's team – they saw a big boost in engagement and views. Michelin is known for their innovative approach, so they let us work the way we wanted. It was a huge R&D opportunity for us as well – we learned so much, and kept using Runway from there.

What about the more recent campaign, with Woolite?

The Woolite campaign really shows how AI can expand creative possibilities. They initially wanted just a simple beauty shot – one goat with soft fur in the wind, minimal camera movement. Standard stuff. But when we told them what we could do with AI, it completely changed the script.

Instead of one static shot with one goat, we created a full minute-long piece with the goats in multiple different situations. AI didn't just save budget – it unlocked creative ambition that wouldn't have been possible with their original timeline and resources.


You also did something really interesting at Cannes Lions. Tell us about that activation.

We created an installation where people could take a picture of themselves and then be featured in a fake advertisement. We developed 20 different fake commercials, each with fixed parts that were fully generated in Runway, and dynamic parts where we could insert the person's portrait plus a prompt so they'd appear naturally in the video.

People would take their picture, wait a few minutes, and then see themselves featured in the commercial. It was a huge success. We used Runway's reference image feature plus prompts – it automatically placed the person as they should appear in the video context.

It worked incredibly well – nothing strange ever appeared, because the prompt and reference image kept the AI focused. We got tons of requests to send people their videos afterward, and we’re now planning to run the same activation at CES in Las Vegas.

What advice would you give to agencies or brands considering adopting AI?

There's a lot of debate about AI – people often think of it purely as a generative tool. But it’s also a powerful production tool, no different than 3D software or traditional shoots. AI has so much potential beyond just generating content. It can transform both production and post-production, letting you do things without a huge production house or team.

Sometimes I see young creators doing amazing things with just AI, a little 3D, motion design and video shot on their phones – sometimes even outshining big-budget productions. So if you're an agency or brand, my advice is: bring in someone who can show you the possibilities. Do a workshop, get educated and realize it's way more than just generating a quick video. It's a whole new way of thinking about production.

What excites you most about AI for creative work?

At the end of the day, AI is just a tool, nothing more, nothing less. It’s opening a new path for production – one that keeps getting better every six months. Sooner or later, the line between traditional video and AI-generated content will disappear.

Every creative field starts with references. Whether you're a photographer, designer or director, you look at inspiration, open books, scroll through Pinterest and build on what you see. AI is just another form of that – another way to bring your unique vision and aesthetics to life.

What excites me is seeing more people using AI as a creative force to unlock possibilities that might be impossible or really hard to achieve otherwise. It opens up new ways of thinking and creating, and that's why I think it's such an amazing tool.