If you're on the film side of TikTok like I am,
you may have seen videos going around
that show what famous movies
would look like if they were framed for our phones.
They're kind of sick!
And they all seem to have been released
right after the debut of Photoshop's
new ‘generative fill’ tool.
As the name implies,
the tool allows you to generate imagery
with a simple text prompt.
And these videos, they're using this tool
as a really technologically advanced take
on a filmmaking technique, one that's been used for over 100 years...
albeit with a totally different skill set.
So will A.I. be the next evolution of the matte painting?
When it comes to traditional matte paintings,
you might think of something like this from The Wizard of Oz.
Dorothy and friends are real,
but the Emerald City isn't really there.
It's just an illustration.
Using this technique,
filmmakers were able to make an impossible world
look real, at a much cheaper price tag
than building a full set.
This one was made with crayons,
but usually matte paintings are painted on glass,
like in Mary Poppins...
...or Star Wars,
which is maybe the best example, because, you know,
everybody likes Star W-
I was never a Star Wars fan, which is hard for me to admit.
It was Alien, Blade Runner,
when The Matrix came out was, like... my final, like, what is that?
I want to do that for a living.
I'm Maxx Berman. I spent over a decade working
as a matte painter in film and games,
and I'm the co-founder of Kitbash3D.
Maxx is here to help me explain how matte paintings
went from oil paint on glass... to this.
In terms of the transition from traditional to digital,
there's a couple of chapters to this,
and the first one is Photoshop.
We can paint in layers
and layers changes everything for us.
We can go backwards and forwards.
We can be working on the background
and the foreground separately, and having every brush
imaginable, and every color imaginable.
Now, this is the first digital matte painting.
And it still feels like a painting, because it kind of was.
The tools were different, but
we brought a painting technique into the digital medium.
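(A quick aside for the curious: the layer idea Maxx describes is, at its core, back-to-front compositing, where every element keeps its own transparency and can be reworked without touching the others. Here's a minimal, purely illustrative sketch in Python with Pillow; the file names are placeholders, not anything from an actual production, and this is not how Photoshop itself is implemented.)

```python
# Illustrative only: each element lives on its own transparent layer,
# so any one of them can be repainted or swapped independently.
# File names are hypothetical; all layers are assumed to be the same size.
from PIL import Image

# Load the layers back to front, as RGBA so they carry an alpha channel.
layers = [
    Image.open("sky.png").convert("RGBA"),
    Image.open("city_painting.png").convert("RGBA"),   # the painted "matte" element
    Image.open("live_action_plate.png").convert("RGBA"),
]

# Composite back to front; transparent pixels in an upper layer
# let the layers beneath show through.
result = layers[0]
for layer in layers[1:]:
    result = Image.alpha_composite(result, layer)

result.convert("RGB").save("composite.png")
```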
Then we kind of moved into
what's called photo bashing.
I did a film called Her, where I was responsible
for doing all the matte paintings of future L.A.
I had backpacked through Southeast Asia
for about six weeks with a camera, so I would be pulling
from all of those photographs from my travels to say,
what if we used, kind of, Bangkok as a starting point?
And then we modified the architecture
to look more like L.A.?
So did traditional matte painters see photo
bashing as a sort of cheat?
Like, whoa, you don't have to paint anything anymore?
That's not fair.
Some of the other
more senior matte painters would be like,
“What the hell are you doing over there?”
In every era of technology change,
it's looked at as cheating until it becomes commonplace.
We'll hear more about the
next big step forward in matte painting
after a message from this video’s sponsor.
This episode is presented by Microsoft Copilot
for Microsoft 365: your A.I. assistant at work.
Copilot can help you solve your most complex problems at work,
going far beyond simple questions and answers.
From getting up to speed on a missed Teams meeting
in seconds, to helping you start a first draft faster
in Word, Copilot for Microsoft 365
gives everyone an A.I. assistant at work
in their most essential apps, to unleash creativity,
unlock productivity, and uplevel skills.
And it's all built on Microsoft's
comprehensive approach to security,
privacy, compliance, and responsible A.I.
Microsoft does not influence
the editorial process of our videos,
but they do help make videos like this possible.
To learn more, you can go to Microsoft.com/copilotforwork
Now back to our video.
The next big step forward for matte painting
was to push those 2D layers into 3D space,
which Maxx did on Iron Man 3,
using the compositing program Nuke.
For those shots,
I painted a bunch of clouds and they were all on layers
and then I painted the landscape and the sky behind it.
All of that was set up as a projection
so that as we moved the camera with Iron Man back,
you could see the clouds moving, and him
getting further and further away from the shoreline.
Yeah, I actually found this example really interesting
because Iron Man is already being made in 3D,
so why not just make the whole shot 3D?
It is way cheaper and easier
and more achievable to paint a photorealistic cloud
than it is to do a 3D simulation of a realistic cloud,
at least at that time.
So for a lot of those more expansive establishing
shots, bringing in a matte painter
could circumvent a lot of that.
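(Another aside: the projection setup Maxx describes is often called 2.5D: flat painted layers parked at different depths in front of a virtual camera. The toy sketch below uses made-up depths and a simple pinhole projection, not anything from Nuke or the actual Iron Man 3 setup, but it shows why the trick works: as the camera slides, nearer layers shift more on screen than distant ones, and that parallax sells the depth.)

```python
# Toy sketch of the "2.5D" projection idea: flat painted layers are placed
# at different depths in front of a virtual camera, and a pinhole projection
# makes nearer layers shift more than distant ones as the camera moves.
# All names, depths, and units here are made up for illustration.
from dataclasses import dataclass

@dataclass
class Layer:
    name: str
    depth: float    # distance from the camera along the view axis
    world_x: float  # horizontal position of a reference point on the layer

FOCAL = 50.0  # focal length, in the same arbitrary units

def screen_x(layer: Layer, cam_x: float, cam_z: float) -> float:
    """Project a point on a flat layer through a simple pinhole camera."""
    return FOCAL * (layer.world_x - cam_x) / (layer.depth - cam_z)

layers = [
    Layer("near clouds", depth=100.0, world_x=10.0),
    Layer("far clouds", depth=400.0, world_x=10.0),
    Layer("shoreline", depth=2000.0, world_x=10.0),
]

# Slide the camera sideways: the near layer's projected point moves much
# more than the distant shoreline's, which is the parallax that creates
# the illusion of depth from flat paintings.
for cam_x in (0.0, 5.0):
    for layer in layers:
        print(f"cam_x={cam_x:>4}  {layer.name:<12} screen_x={screen_x(layer, cam_x, 0.0):7.3f}")
```

Running it prints how far each layer's reference point moves for the same camera move, with the near clouds shifting far more than the distant shoreline.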
But 3D tools caught up.
It is just as fast, if not faster, today
to actually create
that environment in 3D as it would be in 2D.
Maxx's company actually helps
speed the process up in some ways.
Their pre-built assets are meant
to be dragged and dropped, helping
artists quickly build worlds
like you can see in this teaser for their Manhattan kit.
If you're using our Manhattan kit,
you're using the same exact thing that
Doctor Strange is using on their films.
Technology's getting a lot better these days.
Rendering and computing are cheaper than ever.
And because of that, you know, the need for a 2D artist
is becoming less and less.
But what about a 2D artist
that can make a matte in seconds?
Now, I am not skilled enough to have ever pulled
anything like this off on my own.
But even artists who do have those skills
might one day have to start adding A.I.
to their toolbox.
I think it's just early days with this tool;
we have some more work to do.
As someone who's learned how to communicate via
visuals, through composition, lighting, and color,
all of these are my ways of expression,
and these are not the things that A.I.
is picking up on and working with today.
Tools like Nvidia's A.I.
Canvas could begin to bridge that gap by taking
simple brushstrokes and transforming them
into photorealistic images.
But obviously, these tools aren't the same as a paintbrush.
Digital or physical.
Because with a brush, what you see is exactly what you get.
With these tools,
there's a gap between
what you put in and what you get out.
Getting exactly what you want
ends up being an iterative process,
often involving dozens of prompts, plus
some paintbrush tinkering in Photoshop or another app.
So right now,
these tools are thriving with individual creators.
But in Hollywood...
I have not seen anyone or any production
utilize it in production
and I think there's a couple of reasons for it.
One, it's not there yet.
It's close and it's getting better real quick,
but it's not there today.
The other part of it is the legalities
of, you know, who owns the work that's being generated,
who owns the data that it was trained on.
Ultimately, it's going to change everything.
I think we can't deny that.
And ultimately,
human plus computer beats
human and beats computer.
It's in the marriage of these two
that technology has always shined.