Introduction
You generate the perfect AI video with a character who looks exactly right—the expression, the composition, the mood all work.
To continue telling your story, you want to use the same character in the next scene. But even with the same written description, the new generation looks completely different. You try again and again, but after fifteen attempts you've burned through credits and your character still looks like a different person.
That's where reference images come in. Reference images tell the AI tool to keep a certain concept—clothing, location, or in this case, character—identical in every generation. With a single reference image, you can maintain character consistency across any number of generations. Place them in a rainy street, then a sun-drenched cafe, then on a rooftop at sunset—different settings and actions, but the same face every time.
In this guide, we'll cover how AI character references work, what makes an effective reference image, and how to write prompts that maintain consistency across scenes.
What are character references in AI?
Character references tell AI image and video tools to maintain a specific person's appearance across every generation. Instead of describing your character from scratch each time and hoping the AI interprets it consistently, you upload one reference image. The AI analyzes and stores that visual information, then recreates that exact person in whatever scene you describe.
How character references work
When you upload a reference image, the AI tool encodes that visual information as a blueprint for future generations.
What you can preserve with references:
- Character appearance and identity
- Locations and environments
- Lighting and color grading
- Artistic styles and treatments
- Compositions and poses
- Specific objects or props
You typically reference one element at a time—a character, a location, or a style. The reference becomes your visual anchor. Write prompts describing what you want to change (new setting, different action, altered lighting) while the AI keeps your referenced element consistent.
Why character references matter
Without a reference image, you're describing your character fresh in every prompt. "Woman in her 30s with dark hair" gets interpreted differently each time—different face, different features, different person.
With character references:
- Place your character in any setting without reshooting or redrawing
- Build actual narratives with recurring characters instead of disconnected clips
- Keep brand mascots and virtual influencers consistent across all content
- Create professional content faster without coordinating shoots
This transforms AI video generation from a tool for random clips into something that enables real storytelling. Your protagonist appears throughout your short film looking like the same person. Your brand spokesperson stays identical across every marketing video. Your content series features the same recognizable character episode after episode.
How to use a character reference image
Using character references comes down to three straightforward steps.
Step 1: Upload your reference image
Start by uploading a photo of your character. This can be an AI-generated image, a photo of a real person, or even a frame from a previous generation you liked. The key is clarity—one character, visible face, good lighting. This image becomes your anchor for every generation moving forward.
Step 2: Write your scene prompt
Describe the scene you want to create. Include the setting, what your character is doing, lighting conditions, and camera angle. The AI uses your reference to maintain the character's appearance while following your scene description.
Example prompts:
- "Walking through a rain-soaked street at night, neon signs reflecting in puddles, cinematic lighting"
- "Sitting in a sun-drenched cafe reading a book, warm morning light streaming through windows"
- "Standing on a rooftop at sunset, city skyline in background, golden hour glow"
The prompt controls everything except the character's face and appearance—that stays locked to your reference.
Step 3: Generate and iterate
Generate your scene. Your character appears looking consistent with your reference image. If the first result isn't quite right, adjust your prompt and try again. The character stays the same while you experiment with different settings, angles, and scenarios.
Save outputs that work particularly well—you can use successful generations as new reference images for future scenes.
What stays consistent: Your character's face, distinctive features, and overall appearance
What you control: Setting, pose, lighting, action, camera movement, mood
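If you're scripting generations instead of working in the editor, the three steps above map onto a single request: one reference image plus one scene prompt. The payload fields, function name, and @tag syntax below are illustrative assumptions, not a documented API; consult your tool's developer docs for the real schema.

```python
def build_scene_request(reference_url: str, scene_prompt: str,
                        tag: str = "protagonist") -> dict:
    """Pair one character reference with one scene description.

    The reference locks the character's appearance; the prompt
    controls setting, action, lighting, and camera. Field names
    here are hypothetical placeholders.
    """
    return {
        "reference_images": [{"uri": reference_url, "tag": tag}],
        # The prompt describes everything EXCEPT the character's face,
        # which stays anchored to the reference image.
        "prompt": f"@{tag} {scene_prompt}",
    }

# Step 1 supplies the reference; step 2 supplies the scene prompt.
request = build_scene_request(
    "https://example.com/protagonist.png",
    "walking through a rain-soaked street at night, neon reflections",
)
```

The point of the sketch is the separation of concerns: the reference entry never changes between scenes, while the prompt string is rewritten for each one.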
What makes a good character reference image
The quality of your reference image directly affects how consistent your character looks across generations. A clear, well-lit reference gives you reliable results. A blurry, complicated reference leads to inconsistent outputs and wasted generations.
Image quality requirements
Technical requirements:
- High resolution—at least 1024px on the shortest side
- Sharp focus on the face with clear details
- Even lighting that shows features without harsh shadows
- Single character only, no other people in frame
- Front-facing or 3/4 angle that captures features effectively
- Simple or neutral background that doesn't compete for attention
What to avoid:
- Blurry or low-resolution images that lack detail
- Extreme angles like looking straight up or down
- Obstructions covering the face—hands, hair, objects
- Multiple people in the same frame
- Heavy filters or effects that obscure natural features
- Dark, underexposed photos where features aren't visible
The easiest way to get a perfect reference image is to generate one. Use text-to-image or image-to-image generation to create a clean character portrait specifically designed as a reference. This gives you perfect clarity and resolution from the start, with complete control over lighting, angle, and composition. Generate a simple portrait with neutral lighting and a clean background, then use that as your anchor for every future scene.
Using multiple reference images
Some tools let you upload 2-3 reference images of the same character, giving the AI a fuller understanding of what your character looks like from different angles. Upload a front view, profile, and 3/4 angle to help the AI understand the complete structure of the face. This is particularly useful for scenes requiring varied camera angles and reduces character drift when generating from unusual perspectives.
That said, a single high-quality reference image works well for most projects. Start with one good image before adding complexity.
Best practices for consistent AI characters
Getting consistent results with character references isn't about luck—it's about following a systematic approach that builds on what works.
Select strong character reference images
Generate or gather 3-5 potential reference images of your character. Test each one with identical prompts to see which delivers the most consistent results across multiple generations. The reference that works best might surprise you—sometimes a slightly different angle or lighting makes all the difference. Once you find your winner, save it with a clear, descriptive name in your reference library so you can recall it instantly.
Start with simple character prompts
Start simple. Test your reference with basic scenes before attempting complex compositions. A character standing in neutral lighting tells you more about consistency than an elaborate action sequence with dramatic lighting shifts.
Change one variable at a time. If you want to test different settings, keep the lighting and camera angle consistent. If you're experimenting with camera angles, maintain the same setting and lighting. Changing everything at once makes it impossible to know what's working and what's causing problems.
Keep detailed notes on which prompts produce the best results with your specific character. Some references respond better to certain types of descriptions or lighting conditions. Build a prompt library of what works—successful formulas you can reuse and adapt instead of starting from scratch every time.
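The one-variable-at-a-time discipline can be made mechanical. This sketch builds a small test matrix where each prompt changes exactly one element from a neutral baseline; the baseline values and variations are illustrative, not prescribed settings.

```python
# Baseline scene: everything neutral, so any drift is easy to spot.
BASELINE = {"setting": "plain studio", "action": "standing",
            "lighting": "neutral soft light"}

# Candidate variations to test, one variable at a time.
VARIATIONS = {
    "setting": ["rain-soaked street", "sun-drenched cafe", "rooftop at sunset"],
    "lighting": ["golden hour glow", "cool overcast light"],
}

def one_variable_tests() -> list[str]:
    """Build test prompts that change exactly one variable from the
    baseline, so any character drift traces back to a single cause."""
    prompts = []
    for key, options in VARIATIONS.items():
        for value in options:
            scene = dict(BASELINE, **{key: value})
            prompts.append(
                f"{scene['action']} in {scene['setting']}, {scene['lighting']}"
            )
    return prompts
```

Running each generated prompt against the same reference tells you which single change, if any, causes inconsistency.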
Save references for reuse
Save your character references with @ tags for quick recall. Instead of re-uploading and explaining your character each session, type @protagonist or @brand_character and your reference loads instantly.
When you get a particularly good result, save that output. You can use successful generations as new reference images for future scenes, building on consistency rather than starting over. Create reference sheets showing your character from multiple angles—front view, profile, 3/4 angle—to give yourself options for different scene types.
Export and organize your best results by project or character. A well-organized reference library saves hours of searching through old generations.
Build on successful generations
Start with a basic prompt for your first generation. If the character looks good but the scene needs work, adjust your prompt and keep the same reference. If the character drifts, try a different reference image or simplify your prompt significantly. Once you nail a great result, save that output and potentially use it as an additional reference for similar scenes.
This approach prevents you from chasing problems in circles. You're always working from something that's already succeeding.
Practice combining multiple characters in one scene
Generate and test each character reference separately before combining them. This isolates problems to specific characters rather than creating a confusing mess. Name each reference clearly—@character_1, @character_2—and reference them specifically in prompts: "@character_1 talking to @character_2 in a coffee shop."
Expect multiple characters to be more challenging than single characters. Start mastering single-character consistency before attempting complex multi-character scenes.
Create a character reference guide
Build a document for each major character containing your best reference images, prompts that consistently work well, notes on what to avoid with this specific character, and example outputs showing successful results. This becomes your quick reference whenever you work with that character, eliminating guesswork and repeated trial-and-error.
Troubleshooting character consistency
Most AI character reference problems come from a handful of preventable mistakes. Here's what trips people up and how to fix it.
Using poor quality reference images
Blurry photos create blurry, inconsistent characters. Low resolution compounds across generations—each output slightly degraded from the last. The AI can only work with the information you give it, and a muddy reference produces muddy results.
Fix: Generate a clean AI reference image first. Create a high-resolution character portrait specifically designed as a reference rather than trying to salvage a poor-quality photo.
Overcomplicating prompts
Cramming too many details into one prompt confuses the AI. When you write paragraph-long descriptions with a dozen specific elements, character features get lost in the complexity. The AI struggles to balance your reference with your overwhelming prompt.
Fix: Keep prompts focused on 3-4 key elements. Character from reference + setting + action + lighting. That's enough. Add more only if your results are consistently good.
Contradicting your reference
Prompting for "blonde hair" when your reference shows dark hair creates an impossible task for the AI. Same with requesting a different age, ethnicity, or major facial features than what appears in your reference. The AI tries to satisfy both your reference and your prompt, resulting in inconsistent, confused outputs.
Fix: Let the reference handle character appearance completely. Your prompt handles everything else—setting, action, lighting, mood, camera work.
Not testing references first
Jumping straight into complex scenes without testing your reference is a recipe for wasted generations. You discover inconsistency only after burning through credits on elaborate prompts, with no idea whether the problem is your reference, your prompts, or both.
Fix: Generate 3-5 simple test scenes first. Character standing in neutral lighting, different basic settings. This tells you if your reference works before you invest in complex generations.
Giving up too quickly
Your first generation is rarely perfect. Consistency improves with iteration as you learn what works with your specific reference. Abandoning a reference after one attempt means you never discover what it can do.
Fix: Plan for 3-5 attempts per new scene type. Adjust prompts, regenerate, refine. Give your reference a fair chance.
Not organizing your references
Losing track of which reference produced which results means you can't replicate successes. You generate something great, then can't remember which reference image or prompt created it.
Fix: Name and tag everything systematically. Save references with descriptive names, note which prompts worked, organize outputs by character and project.
Advanced character reference techniques
Once you've mastered basic character consistency, these advanced techniques give you more control over complex projects and multi-scene narratives.
Use outputs as new references
Generate a scene where your character looks particularly good, then export that frame and use it as the reference for your next scene. This builds on successful consistency rather than returning to your original reference every time. Each good result becomes a stepping stone for the next generation, compounding what's working instead of starting fresh.
Control camera angles and perspectives
Prompt specifically for camera angles to explore your character from different viewpoints while maintaining consistency. "Low angle looking up," "overhead shot," "profile view," "over-the-shoulder"—clear angle descriptions keep your character recognizable even as the perspective shifts dramatically.
Test each major angle separately before combining them in complex scenes. Generate a front view, then a profile, then an overhead shot. Once you know your reference handles all these angles consistently, you can use them together with confidence.
Build action and motion consistency
Start with static poses before adding movement. A character standing still is easier to keep consistent than one in motion. Once static poses work reliably, add simple movements like walking or turning. Save complex actions like dancing or fighting for after you've established baseline consistency.
Describe motion clearly and specifically. "Slowly turning head to the left" works better than just "turning." The more precise your motion description, the better the AI maintains your character through the movement.
Combine different reference types
Layer multiple reference types for sophisticated control. Character reference plus style reference puts your character in a specific artistic treatment—anime style, oil painting, noir photography. Character reference plus location reference places your character in an exact setting you've defined. Strategic layering unlocks creative possibilities that single references can't achieve.
Generate character variations
Use your face reference to create different versions of the same character. Generate them in different outfits, seasonal clothing, or situational costumes while maintaining perfect face consistency. Your character stays recognizable even as everything else about their appearance changes.
Create scene-to-scene continuity
Use the last frame of one generation as the starting point for the next. This creates seamless visual transitions between scenes, particularly effective for longer narratives where you want smooth flow from moment to moment. The visual consistency between the end of one scene and the beginning of the next makes your story feel cohesive.
Maintain lighting and mood across scenes
Establish your lighting setup in the first scene, then reference that lighting in subsequent prompts. "Same lighting as previous scene" or "matching the warm interior lighting from image 1" helps maintain visual continuity. Consistent lighting ties scenes together even when settings change.
Manage your reference library
Create folders organized by project or character. Tag references with relevant keywords so you can find them quickly. Include prompt notes directly in file names—"protagonist_coffeeshop_workedwell.png" tells you more than "image_047.png." Export contact sheets showing your character from multiple angles as quick visual references for future work.
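The descriptive-name convention above can be automated with a tiny helper. The function name and sanitization rules here are just one possible scheme, not a required format.

```python
import re

def reference_filename(character: str, scene: str, note: str,
                       ext: str = "png") -> str:
    """Build a descriptive name like 'protagonist_coffeeshop_workedwell.png'.

    Lowercases each part and strips anything that isn't a letter or
    digit, so the names stay consistent and filesystem-safe.
    """
    parts = [re.sub(r"[^a-z0-9]+", "", p.lower())
             for p in (character, scene, note)]
    return "_".join(p for p in parts if p) + "." + ext

print(reference_filename("Protagonist", "coffee shop", "worked well"))
# protagonist_coffeeshop_workedwell.png
```

Applying one helper like this across a project guarantees every saved output follows the same searchable pattern.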
A well-organized reference library turns into a creative asset that accelerates every project instead of slowing you down with searches and guesswork.
Top use cases for character reference AI
Character references unlock creative possibilities across industries and project types. Here's where consistent AI characters make the biggest impact.
Narrative short films
Create a protagonist who appears throughout your entire film across multiple scenes, locations, and emotional moments. Your character walks through a forest in the morning, sits in their apartment at night, confronts someone on a city street—all while looking like the same person. Character references let you maintain continuity without reshooting, giving you the freedom to build actual stories instead of disconnected clips.
Social media content series
Build episodic content around a recurring character that your audience recognizes and connects with. A consistent face across your Instagram stories, TikTok series, or YouTube shorts creates familiarity and investment. Viewers return because they know the character, and character references ensure that recognition stays strong across every episode you create.
Marketing and advertising
Deploy virtual brand ambassadors who represent your company with perfect consistency across all content. Use the same spokesperson for product demonstrations, explainer videos, and promotional content without scheduling shoots or managing talent. Your brand character looks identical whether they're introducing a new feature or walking customers through a tutorial, building recognition and trust.
Animation and storyboarding
Visualize character concepts before committing to full production. Test how your character looks in different scenarios, compositions, and emotional states. Create pitch decks and proof-of-concept work that shows investors or stakeholders exactly what your finished project will look like, all without building out complete animation pipelines or hiring full production teams.
Music videos and artistic projects
Explore fantasy scenarios and experimental narratives with consistent characters that would be impossible or prohibitively expensive to film in real life. Place your character in surreal environments, impossible situations, or abstract visual concepts. Character references let you push creative boundaries while maintaining a recognizable visual thread through your artistic vision.
Game development and concept art
Develop character visualizations for pitch materials and investor presentations. Plan cutscenes showing your game characters in specific narrative moments. Create marketing materials featuring your game's protagonists before final character models are complete. AI character references speed up pre-production and help communicate your vision to stakeholders, publishers, and your team.
Education and training content
Create a consistent instructor or guide character across your entire tutorial series. Whether you're teaching software skills, explaining complex concepts, or building online courses, a recognizable character makes learning materials more engaging and helps students feel connected to the content. The same friendly guide appears in every lesson, creating continuity that supports the learning experience.
Virtual influencers and digital creators
Build an entire persona with consistent appearance across all content types and platforms. Your virtual influencer maintains the same look whether they're posting lifestyle content, collaborating with brands, or engaging with followers. AI character references make it possible to create and scale a digital personality without the limitations of traditional content creation.
Fashion and e-commerce
Use the same model across different outfits, products, and seasonal collections for your e-commerce site or lookbooks. Show how various pieces look on a consistent body type and style, helping customers visualize products more effectively. Generate dozens of product shots with the same model without organizing photoshoots or coordinating schedules.
Try character references with Runway today
Character references turn AI video generation from random outputs into actual storytelling. The ability to maintain a consistent character across scenes means you can finally build narratives instead of creating disconnected clips.
Start simple: one character, one clean reference image, basic prompts. Open Runway, upload your character image, and save it with a clear name using @ tags. Test your reference across 3-4 different scenarios—different locations, simple actions, varying times of day. This shows you what your reference handles well and where it struggles.
Save what works. Each successful generation teaches you which prompts produce consistent results with your specific reference. Build on those successes rather than fighting inconsistencies. You'll regenerate scenes and troubleshoot drift—that's normal. The technology isn't perfect, but the creative possibilities of recurring, consistent characters make the learning curve worth it.
Try achievable starter projects: place your character in three different locations to see consistency in action, generate simple actions like walking or sitting before complex movements, or test the same character across different lighting conditions. These straightforward projects build your understanding without overwhelming complexity.