I forgot to mention a super fun moment from Cannes last week. Zombie brand Toys "R" Us and OpenAI flexed a bit as they showed the "first-ever brand film" (aka commercial) using Sora, OpenAI's new (yet unreleased) text-to-video tool. The video showcased the talents of Native Foreign, a creative agency with alpha access to Sora. You can watch the video here.
No matter how jaded or cynical you are, no matter how much you love or hate this particular piece of work, no matter how snarky you want to be about the hackneyed writing or obvious shot choices or cliché storyline or stylistic ripoff of "The Polar Express," etc., you need to take a step back and think about how this was created.
Someone typed a version of this screen direction into Sora…
INT. DAY. DOLLY SHOT OF A KIND, HAPPY, BESPECTACLED MAN IN HIS MID-THIRTIES WALKING THROUGH HIS BICYCLE SHOP DRESSED IN A BLUE WORK SHIRT. THE SUN, STREAMING THROUGH A SIDE WINDOW, HIGHLIGHTS HIM FROM BEHIND, GIVING THE HANGING BICYCLES A MAGICAL GLOW.
… and the scene was created. This isn't just awesome… it is absolutely magical!
Yes, it's early days. Yes, this could be much better. Yes, it has all kinds of technical issues. But… take a step back and consider what this means for the future of video production.
This demonstration is so "expected" in 2024 that it doesn't actually qualify as a PR stunt; it's simply a preview of the future of production. Get ready. In practice, everything has already changed.
As always, your thoughts and comments are both welcome and encouraged. Just reply to this email. -s