AI Commercial Production: Process, Tools, and What Brands Need to Know
- Jonathan Boden
- Feb 16
- 5 min read
The AI commercial production process is something most brands are curious about but few actually understand. There's a lot of noise out there — AI hype cycles, viral demos that look nothing like real production work, and agency decks that promise the moon while glossing over how things actually get built. After years of producing content at the network and studio level, and now running AI commercial production at Bizarre Bunny, I want to cut through the noise and explain what the process actually looks like from the inside.
Because the process matters. Any tool can generate a video clip. Producing a commercial — something that actually communicates a brand message and drives action — requires a structured approach that most people waving AI tools around don't have.
Phase One: Creative Development (This Doesn't Change)
Every project starts the same way it did when I was producing at Paramount: with a brief. What's the brand? Who's the audience? What's the message? What's the desired action? What's the platform? What's the tone?
We develop the concept, write the script, and create a detailed storyboard. This phase is purely creative. AI doesn't enter the conversation yet. This is intentional. I've seen too many producers start with the tools and work backward to a concept. That approach produces content that showcases technology instead of serving the brand. Nobody cares about your tools. They care about whether the ad makes them feel something.
The storyboard is particularly important in AI production because it becomes the blueprint for generation. Each frame needs to specify not just composition and action, but mood, lighting direction, color temperature, and camera movement. The more precise the storyboard, the more efficient the generation phase.
Phase Two: Tool Selection for the AI Commercial Production Process
This is where AI production diverges from traditional production. Once the storyboard is locked, we go shot by shot and determine which AI tool will handle each one. This isn't arbitrary — each platform has specific strengths and weaknesses that matter for specific shot types.
Sora excels at scenes with coherent motion and complex environmental interaction. When we need a product moving through a space — a car on a mountain road, a bottle being poured in a kitchen — Sora tends to deliver the most natural-feeling results. Its understanding of physics and spatial relationships is strong.
Veo produces exceptional photorealistic environments. Landscape shots, architectural spaces, atmospheric scenes — Veo handles these with a level of detail and realism that's hard to match. When the script calls for sweeping establishing shots or detailed backgrounds, Veo is usually our first choice.
Runway brings unique strengths in style transformation and creative effects. When a spot calls for visual flourishes — transitions, stylized sequences, artistic interpretations — Runway offers control that the other platforms don't. It's also our go-to for spots that blend live footage with AI-generated elements.
Kling has made impressive strides in character consistency and human movement. Scenes that require people — walking, talking, interacting with products — benefit from Kling's focus on maintaining character coherence across frames. It's not perfect, but for commercial applications it's often good enough.
The critical insight: no single tool handles everything well. The producers who get the best results are the ones who understand the strengths of each platform and match them to specific shot requirements. It's a craft skill, not a technical one.
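In practice, that shot-by-shot matching can start as a simple lookup table. The sketch below, in Python, is an editorial illustration only: the shot-type categories and default choice are assumptions drawn from the strengths described above, not a specification from any of these vendors.

```python
# Illustrative mapping of shot types to platforms, following the
# strengths described above. The categories are hypothetical.
SHOT_TOOL_MAP = {
    "product_motion": "Sora",   # coherent motion, physics, spatial interaction
    "environment": "Veo",       # photorealistic landscapes, establishing shots
    "stylized": "Runway",       # transitions, effects, live/AI blends
    "character": "Kling",       # human movement, character consistency
}

def assign_tools(storyboard):
    """Tag each storyboard shot with a first-choice platform.

    Unrecognized shot types fall back to "Runway" here purely as an
    example default.
    """
    return [
        {**shot, "tool": SHOT_TOOL_MAP.get(shot["type"], "Runway")}
        for shot in storyboard
    ]
```

A real pipeline would layer budget, turnaround, and licensing constraints on top of a table like this, but the core decision is still a craft judgment encoded as data.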
Phase Three: Generation and Iteration
Generation is where the actual content gets created, but the word 'generation' undersells the amount of human judgment involved. You don't type a prompt and accept what comes back. You type a prompt, evaluate the result against your storyboard, adjust, regenerate, evaluate again, and repeat until the output matches your creative vision.
A single shot might go through fifteen to thirty iterations before it's approved. Prompt engineering matters — the language you use, the specificity, the reference points all affect output quality. But more important than prompt skill is visual judgment. You need to recognize when a generated shot is close enough to refine versus when it needs a completely different approach.
This phase also involves compositing — combining outputs from different tools, blending AI-generated elements with real footage or photography, and ensuring visual consistency across shots that were produced by different platforms. Consistency is the hardest technical challenge in AI commercial production.
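The generate-evaluate-refine loop described above can be sketched in Python. Everything here is a stand-in: `generate`, `score_against_storyboard`, and `refine_prompt` are hypothetical stubs (real generation would call a platform API, and evaluation is a human visual judgment, not a keyword score), but the control flow — generate, compare to the storyboard, adjust, repeat, keep the best take — is the shape of the phase.

```python
def generate(prompt):
    # Stub: a real call would hit a generation platform; here the
    # "clip" is just a record of the prompt that produced it.
    return {"prompt": prompt}

def score_against_storyboard(clip, frame):
    # Stub: reward prompts that cover more of the frame's keywords.
    # In production this judgment is made by a human, not a metric.
    words = set(clip["prompt"].lower().split())
    hits = sum(1 for kw in frame["keywords"] if kw in words)
    return hits / len(frame["keywords"])

def refine_prompt(prompt, clip, frame):
    # Stub: fold one missing storyboard keyword back into the prompt.
    words = set(prompt.lower().split())
    for kw in frame["keywords"]:
        if kw not in words:
            return prompt + " " + kw
    return prompt

def iterate_shot(prompt, frame, threshold=0.9, max_rounds=30):
    """Generate-evaluate-refine loop for a single shot.

    Returns (best_score, best_clip, round_reached). Stops early once a
    take is close enough to finish by hand.
    """
    best = None
    for round_num in range(1, max_rounds + 1):
        clip = generate(prompt)
        score = score_against_storyboard(clip, frame)
        if best is None or score > best[0]:
            best = (score, clip, round_num)
        if score >= threshold:
            break
        prompt = refine_prompt(prompt, clip, frame)
    return best
```

The `max_rounds=30` cap mirrors the fifteen-to-thirty iterations mentioned above; the point of the sketch is that iteration is bounded and directed by the storyboard, not open-ended prompting.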
Phase Four: Post-Production and Delivery
Post-production follows a familiar path: editing, sound design, music selection or composition, voice-over recording, color grading, and final output. This phase is where AI-produced content gets polished into a finished commercial.
One area where AI production actually increases post-production complexity is in quality control. Because different shots come from different generation tools, there can be subtle inconsistencies in color space, motion cadence, or detail rendering. An experienced editor addresses these during the grade and final conform. It's the kind of detail that separates professional AI production from amateur hour.
What Brands Need to Know About the AI Commercial Production Process
First: AI production is not a shortcut to skipping creative development. If anything, it demands more creative clarity upfront because the generation tools need precise direction to produce usable output.
Second: not all AI production is equal. A freelancer with a Runway subscription and a producer with 15 years of studio experience using the full toolkit are not offering the same service. The tools are available to everyone. The craft knowledge isn't.
Third: revisions are faster and cheaper. In traditional production, changing a location means reshooting. In AI production, changing a location means regenerating. This flexibility is a genuine advantage, especially for brands that operate in fast-moving markets or have multi-layered approval processes.
Fourth: plan for the human elements. Voice-over, on-screen text, music — these still require the same attention and investment they always have. Don't blow your entire budget on generation and neglect the audio design that carries half the emotional weight of any commercial.
Vision First, Tools Second
The philosophy at Bizarre Bunny is straightforward: vision first, tools second. We figure out what needs to exist on screen to serve the brand and move the audience. Then we figure out which tools make that vision real. This sounds obvious, but it's the opposite of how most people approach AI production. Most people open a tool, see what it can do, and try to build a commercial around its capabilities. That produces content that impresses AI enthusiasts and leaves brand managers wondering where their ROI went.
If you're evaluating AI commercial production partners, ask about their creative process before you ask about their tech stack. The answer will tell you everything you need to know about the quality of work they'll deliver.
Bizarre Bunny combines deep production experience with AI fluency. We'd welcome the chance to walk you through our process and show you what it can produce for your brand.