How A.I. Is Changing Hollywood
Released on 05/17/2022
[Narrator] Behind some of the coolest premium effects
in Hollywood content is the invisible aid of AI.
Artificial intelligence.
It is just blowing the doors wide open
on opportunities for new ways to tell stories.
This is a good technology to hang our hat on
because it is getting so much better
every single year.
[Narrator] Machine learning is being baked into workflows
helping create previously unimaginable moments
from big blockbusters to non-fiction TV.
I think where AI really is impactful
is getting it to do things that human beings can't do.
[Narrator] Including raising the dead?
As if you know, you had Andy Warhol
standing in the studio right in front of you,
and you looked at him and said,
I want you to say it like this.
[AI Voice] I wasn't very close to anyone
although I guess I wanted to be.
[Narrator] Let's examine a few specific use cases
of how AI is changing Hollywood's creative workflow.
[gentle music]
The entertainment industry was spawned by new technology.
So it makes sense that from talkies to television
to digital video, Hollywood has a history
of leveraging new tech,
especially in the world of visual effects.
When I saw Jurassic Park
that was the moment that I realized
that computer graphics would change the face
of storytelling forever.
In the last 25 years that I've been working in film
we've been conquering various challenges
doing digital water for the first time in Titanic,
doing digital faces for the first time
in a movie like Benjamin Button.
[Narrator] And now the state of the art
is machine learning AI applications,
like the kind Matt's company Mars develops in-house.
You can throw it, you know, an infinite amount of data
and it will find the patterns in that data naturally.
[Narrator] Thanks to thirsty streaming services,
Hollywood is scrambling to feed demand
for premium content rich in visual effects.
Budgets and timelines are not growing in a way
that corresponds to those rising quality expectations.
It's outpacing the number of artists
that are available to do the work.
[Narrator] And that's where AI comes in.
Tackling time consuming, uncreative tasks
like de-noising, rotoscoping,
and motion capture tracking removal.
This was our first time ever trying AI in a production.
We had a lot of footage just by virtue
of being on the project and doing 400 shots for Marvel.
When we received the footage, which we call the plates,
in order to manipulate Paul Bettany's face
there needed to be tracking markers
during principal photography.
We looked at it.
We said, Okay, well, removing tracking markers
is going to take roughly one day per shot
in order to replace or partially replace Vision's head.
And a shot is typically defined
as about five seconds of footage.
The tracking marker removal itself was about a tenth of that.
So on a 10 day shot,
one day was simply removing tracking markers.
We developed a neural net where we are able to identify
the dots on the face
where the artificial intelligence averaged out
the skin texture around the dot, removed the dot,
and then infilled with the average
of the texture surrounding it.
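The removal approach described here, identify the dot, average the surrounding skin texture, and infill, can be sketched in a few lines. This is a toy NumPy stand-in, not the studio's actual neural net; the function name and parameters are illustrative only.

```python
import numpy as np

def remove_marker(frame, cx, cy, radius=1, ring=2):
    """Replace a tracking dot centered at (cx, cy) with the average
    of the texture in a thin ring just outside the dot -- a toy
    stand-in for the learned infill described in the transcript."""
    out = frame.copy()
    h, w = frame.shape[:2]
    ys, xs = np.ogrid[:h, :w]
    dist = np.sqrt((ys - cy) ** 2 + (xs - cx) ** 2)
    dot = dist <= radius                      # pixels covered by the marker
    surround = (dist > radius) & (dist <= radius + ring)  # sampling ring
    out[dot] = frame[surround].mean(axis=0)   # infill with the ring average
    return out
```

A real pipeline would detect the dots automatically and use a network that respects lighting and skin detail, but the fill-from-surroundings idea is the same.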
Now Marvel loved it because it sped up production.
They saved money.
It's exactly what we wanted these solutions to do.
Where the solution was faltering
was whenever there was motion blur.
When Paul Bettany moves his head very quickly
to the right or to the left,
there's moments where those dots will reappear
partially because in the dataset itself
we didn't have enough motion blur data.
Another example would be whenever the character
turned his head where his eyes were out of the screen
you would see those dots reappear as well.
And the AI recognition is using the eyes
as a kind of crucial landmark to identify the face.
And so if I turn my head this way and you can't see my eyes
well, the AI can't identify that as a face.
Again, you can fix those things with more data,
the more data you feed these things,
typically the better, right?
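"More data" here means adding examples of exactly the failure cases, such as motion blur. The video doesn't describe the studio's actual augmentation, but one common, assumed approach is to synthesize blurred training frames from clean ones:

```python
import numpy as np

def motion_blur_augment(image, length=5):
    """Synthesize horizontal motion blur by averaging `length`
    shifted copies of the frame -- a cheap way to add the
    motion-blur examples a clean training set lacks."""
    acc = np.zeros_like(image, dtype=float)
    for k in range(length):
        acc += np.roll(image, k, axis=1)  # shift right by k pixels and accumulate
    return acc / length
```

Each augmented frame keeps the same total brightness but smears sharp features, so the network learns what a marker looks like mid-motion.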
[gentle music]
[Narrator] There wasn't a lot of clean data
available on our next AI use case.
The star of the film had been dead for 25 years.
Yet the director wanted more than 30 pages of dialogue
read by the iconic artist Andy Warhol himself.
So what do you do?
You could hire like a voice actor
to do like a great impersonation but we found with his voice
you kind of wanted to retain that humanness
that Andy had himself.
You can get fairly close with the voice actor
but you really can't get it.
And that's where AI technology really helps.
Generative audio is the ability for an artificial agent
to be able to reproduce a particular voice
but also reproduce the style, the delivery,
the tone of a real human being and do it in real time.
[AI Voice] Welcome to Resemble, a generative audio engine.
When the team initially reached out to us
they proposed what they were going to do.
We asked them like, okay, well
what kind of data are we working with?
And they sent us these audio files
like recordings made over a telephone,
all from the mid-to-late seventies.
The thing about machine learning
is that bad data hurts a lot more than good data.
So I remember looking at the data we had available
and thinking this is gonna be really, really difficult
to get right with three minutes of data.
We're being asked to produce six episodes worth of content
with three minutes of his voice.
So with three minutes,
he hasn't said every word that's out there.
So we're able to extrapolate to other phonetics
and to other words, and our algorithm
is able to figure out how Andy would say those words.
That's where neural networks are really powerful.
They basically take that speech data
and they break it down and they understand hundreds
and thousands of different features from it.
Once we have that voice that sounds like Andy
from those three minutes of data
then it's all about delivery.
It's all about performance.
[AI Voice] I went down to the office
because they're making a robot of me.
And Andy's voice, it's highly irregular.
And that's where the idea of style transfer really came in.
So style transfer is this ability
for our algorithm to take someone else's speech as input.
[Voice Actor] I wasn't very close to anyone
although I guess I wanted to be.
So we have a voice actor say that line.
And then our algorithms are able to extract certain features
out of that delivery
and apply it to Andy's synthetic or target voice.
The first one was automatically generated.
No touch-ups.
[AI Voice] I wasn't very close to anyone.
Although I guess I wanted to be.
The second one was touched up by adding a pause.
[AI Voice] I wasn't very close to anyone,
although I guess I wanted to be.
And then the third one was basically
adding the final touch where it's like, okay, you know what?
I really want to place an emphasis
on this particular syllable.
So yeah, let's get a voice actor to do that part
to actually place that emphasis
on the right words and the right syllable.
And then the third output has those features extracted
from that voiceover actor applied to Andy's voice.
[AI Voice] I wasn't very close to anyone
although I guess I wanted to be.
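The style-transfer idea described above, extract delivery features from the voice actor's reading and impose them on the target voice, can be sketched for one such feature, the loudness contour. This is a toy illustration, not Resemble's API; real systems also transfer pitch, timing, pauses, and emphasis.

```python
import numpy as np

def energy_envelope(signal, frame=256):
    """Per-frame RMS energy: one of the delivery features style
    transfer can carry over from a voice actor's reading."""
    n = len(signal) // frame
    frames = signal[:n * frame].reshape(n, frame)
    return np.sqrt((frames ** 2).mean(axis=1))

def apply_envelope(target, envelope, frame=256):
    """Rescale each frame of the target voice so its loudness
    contour follows the reference delivery."""
    out = target[:len(envelope) * frame].reshape(-1, frame).copy()
    cur = np.sqrt((out ** 2).mean(axis=1, keepdims=True))
    out *= envelope[:, None] / np.maximum(cur, 1e-8)  # match reference loudness
    return out.ravel()
```

In a neural system the "features" are learned rather than hand-picked, but the pattern is the same: analyze the actor's delivery, then re-synthesize the target voice under that delivery.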
You have definitely heard AI voices
being used in the past for touch ups
for a line here or there.
This is probably the first major project that's using it
so extensively.
Most VFX is still a very manual process.
Characters can be extremely challenging,
creatures, things like fur hair.
Those things can be extremely challenging
and time consuming.
[Narrator] One notable example of where the technology
is headed is the scenes involving advanced 3D VFX
in Avengers: Endgame.
Josh Brolin plays Thanos.
We capture tons and tons of data in this laboratory setting
with Josh.
And then we use that data to train neural networks
inside of a computer to learn how Josh's face moves.
They'll say lines, they'll look left, they'll look right.
They'll go through silly expressions.
And we capture an immense amount of detail
in that laboratory setting.
Then they can go to a movie set
and act like they normally would act.
They don't have to wear any special equipment.
Sometimes they wear a head camera
but it's really lightweight stuff, very unobtrusive
and allows the actors to act like they're in a normal movie.
Then later when the animators go to animate
the digital character, they kind of tell the computer
what expression the actor wants to be in.
And the computer takes what it knows
based on this really dense set of data
and uses it to plus up,
to enhance what the visual effects animator has done
and make it look completely real.
[gentle music]
So there will come a time in the future,
maybe it's 10 years, maybe it's 15 years,
but you will see networks that are going to be able to do
really creative stuff.
Again, that's not to suggest
that you remove talented artists from the equation,
but I mean, that's the bet
that we're taking as a business.
Is AI gonna take over my job?
What I see happening right now
is actually quite the opposite:
it is creating new opportunities
for us to spend the time on doing things
that are creatively meaningful.
Rather than spending lots of time doing menial tasks,
we're actually able to focus on the creative things
and we have more time for iteration.
We can experiment more creatively
to find the best looking result.
I think that the more that AI can do the menial stuff
for us, the more we're gonna find ourselves
being creatively fulfilled.
Again, the argument for us is
like really creating content that isn't humanly possible.
So, you know, we're not interested in
like creating an ad spot that your real voice actor would do
because in all honesty,
that real voice actor would do way better
than the AI technology would do.
It would be way faster
if you're just delivering a particular sentence
or a particular line.
The technology to do deep fakes is so prevalent.
You can get apps on your phones now
that pretty much can do a rudimentary deep fake.
It's gonna be interesting in the future.
Are we gonna have to put limits on this technology?
How do we really verify what's authentic
and what isn't?
There are sort of social repercussions for it as well
that I think that we don't quite understand yet.
I absolutely believe that this technology
could be misused.
Our number one priority is to make everyone feel comfortable
with what we're doing.
I think it comes down to educating
the general population eventually
and making them understand that they should think through
whatever they are looking at
wherever they're reading and now whatever they're hearing.
We feel we're directionally correct in our bet
that this is a good technology to hang our hat on
because it is getting so much better every single year.
And we don't wanna miss what we see
as like a once in a lifetime opportunity here.