Star Wars: The Force Awakens has been praised for avoiding computer generated effects, but why does CGI deserve such a bad name? Image courtesy of Lucasfilm.

Friday essay: Star Wars, Mad Max and the ‘real’ vs digital effects furphy

There has been a lot of talk recently about the superior results achieved with “real” effects and stunts in Mad Max: Fury Road (2015) and Star Wars: The Force Awakens (2015).

Both films have cleverly traded on audience nostalgia. Indeed, uber-geek Kevin Smith (Clerks, 1994) said of The Force Awakens in an interview, “The moment they put Han, Luke, Leia and Chewie in it, we knew that he was crafting the fountain of youth, and how much would you pay to drink from the fountain of youth?”

Of course it wasn’t just the casting of aging actors that got fans excited. Throughout production and in the lead-up to release, director JJ Abrams and Disney were careful to promote the return to “real sets and practical effects”.

The creation of an Imperial Walker model from ‘The Empire Strikes Back’ (1980).

In other words, they were doing everything possible to distance themselves from George Lucas’s much derided prequel trilogy, which wholeheartedly embraced and helped advance the emerging digital technology of the late 1990s and early 2000s.

Meanwhile, Aussie director George Miller relaunched his much-loved dystopian sci-fi franchise with Fury Road, a film universally lauded for its prolific use of in-camera stunts performed without computer generated imagery (CGI). The stuntwork in Fury Road is undeniably impressive, but so too is the vast amount of digital effects work done by Australian VFX provider Iloura. (Iloura received a Visual Effects Society award for their work on a sequence depicting a toxic storm.)

What both Fury Road and The Force Awakens have in common are thousands of digital effects seamlessly integrated into the film - amongst an array of real sets and practical stunts and effects. Yet while exciting, this is not particularly new or revolutionary.

Films are baked from a mixture of ingredients. In very rare cases, a filmmaker like Alfonso Cuarón will choose to test the limits of technology with an almost entirely computer animated film such as Gravity (2013). But for every Gravity there is a Moon (2009), Duncan Jones’ sci-fi film that made extensive use of miniatures, practical sets and props.

Once upon a time, blue screen was king, but with the advent of digital technology green screen now dominates. With green screen, the green areas of an image can be digitally selected and deleted, allowing whatever remains to be composited onto a new background. Digital cameras usually have twice as many green sensors as red or blue ones, allowing for greater edge detail when separating elements from a green background.
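For readers curious how that selection step works, here is a minimal sketch in Python with NumPy. The green-dominance test and its threshold are illustrative assumptions only; production keyers are far more sophisticated, handling soft edges, motion blur and colour spill.

```python
import numpy as np

def chroma_key(foreground, background, threshold=1.3):
    """Composite a green-screen foreground over a new background.

    A pixel is treated as part of the green screen when its green
    channel dominates both red and blue by the given factor. The
    threshold value here is an illustrative assumption, not an
    industry standard.
    """
    fg = foreground.astype(float)
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # Boolean mask of pixels where green clearly dominates red and blue.
    is_green = (g > r * threshold) & (g > b * threshold)
    out = foreground.copy()
    # Replace only the masked (green-screen) pixels with the new background.
    out[is_green] = background[is_green]
    return out
```

This hard on/off mask is why cheap keying produces jagged halos around hair and fast-moving objects: real compositing software computes a soft matte with partial transparency at the edges rather than a binary cut.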

Most films made today fall somewhere between the two extremes, with real props and effects augmented by digital effects.

However, there is a growing and vocal minority who blame poor CGI for “ruining movies”. Their premise is that movies were once real, but now computers make everything look fake and unconvincing.

Common complaints are that CGI images result in a plastic look that is too clean and perfect. This can be true, but only to the extent that everything made for a film will look new and clean unless someone makes the effort to “dirty it up”.

There are many examples of realistically aged CGI effects that viewers rarely notice, especially on a show such as The Walking Dead, which must create a believably decrepit vision of a post-zombie-apocalypse world.

A brief history of visual effects

1993 saw a turning point in the world of visual effects filmmaking. For almost 70 years beforehand, effects had been created in more or less the same way.

The silent film The Lost World (1925) - a technological marvel with effects created by Willis O'Brien - set the standard for decades to come. O'Brien revolutionised filmed special effects by integrating three-dimensional animated puppets into scenes with footage of actors. While clearly fake by today’s standards, at the time the realistic interaction of light and shadow on the 3D puppets was utterly convincing to audiences who had never seen anything like it. Indeed, the technique was only challenged eight years later, when O’Brien refined his own methods to make King Kong (1933).

The techniques pioneered and refined in these early blockbusters would remain broadly unchanged throughout the 20th Century - until Steven Spielberg filmed Michael Crichton’s 1990 novel Jurassic Park. Spielberg and his creative team had expected that the visual effects would be assembled from the usual matte paintings, miniatures, stop motion animated creatures, and life sized animatronic robot dinosaurs, which would interact with actors.

However, a small group of technicians and artists at Industrial Light and Magic, George Lucas’s ground-breaking effects company, thought they just might be able to do something with computers. By the early 1990s computers were being used to simulate the interactions of light with various objects and surfaces. The new technology resulted in more convincing images due to highly detailed and extremely accurate digital lighting.

After viewing a short test clip of a dinosaur skeleton running - animated entirely by computer, without physical models - Spielberg was convinced to forego stop motion puppets in favour of CGI. At the time this was a risky move that could easily have backfired.

As it happens, Jurassic Park (1993) did deliver. Amongst the animatronic robots that made up the bulk of the movie’s dinosaurs were 63 CGI shots - mostly of dinosaurs viewed from a distance or travelling together in vast herds. These 63 shots changed the world of filmmaking forever. By 2014, Guardians of the Galaxy had CGI enhancements on 2,750 shots - equivalent to 90% of the film.

Of course CGI is a highly misleading term. Computers “generate” images the same way that paintbrushes do, as tools being manipulated by highly skilled artists. When less skilled artists attempt to use the same tools the results are invariably inferior.

Le Voyage dans La Lune (1902) was the first special effects blockbuster.

CGI has been lauded as a cheap and efficient alternative to traditional effects. This is only partially true. It is certainly much faster and cheaper to preview images on computer than to go through multiple stages of photographing elements, sending the film to a lab for processing, rephotographing the processed film through an optical printer to combine the elements, and sending the new film back to the processing lab - and then, if something didn’t quite work, starting all over again.

But how cheap is CGI actually? In 1993 Jurassic Park cost US$63 million. Just four years later James Cameron’s CGI-heavy Titanic (1997) cost a record-breaking US$200 million. These days US$200 million-plus budgets for visual effects laden blockbusters are common. The rumoured budget for Batman v Superman: Dawn of Justice (2016) is US$410 million, a total that is admittedly arrived at by combining the US$250 million production cost with another US$160 million in marketing and distribution.

If you wonder what causes modern movies to be so immensely expensive just try sitting through the entire credits of any of the latest Marvel or DC comic book adaptations.

It takes literally hundreds of people to make Superman fly, or to have The Avengers save/destroy New York. All of those people need to be paid and they also need to be provided with expensive computers and even more expensive software. CGI may be ubiquitous but it has most certainly not resulted in cheaper movies.

Road to ruin – or path to success?

What about the argument that CGI is “ruining the movies”?

Well firstly, most viewers have no clue just how much CGI is used in films and TV programs. If you are in any doubt as to the prevalence of effects being used in “average” movies take a look at this visual effects breakdown from Gone Girl (2014) or this one from TV program Ugly Betty (2006-2010):

I have yet to hear of anyone who saw either Gone Girl or Ugly Betty complaining that CGI ruined their viewing experience.

Sometimes CGI will be used to create fantastical characters or otherworldly locations. But more often it is used for digitally extending sets, removing unwanted elements in a scene or even eliminating unsightly blemishes from a performer’s skin.

There was a time when something as simple as a boom microphone dropping into frame meant that a shot was unusable. Now a digital cleanup artist can simply erase the offending microphone from the image. Of course, as directors became aware of these possibilities they demanded more and more from CGI.

Smart directors consult with visual effects supervisors to plan their effects shots carefully before shooting. Other directors fly by the seat of their pants in the mistaken belief that computers can fix anything.

The result of this failure to plan is usually an obvious CGI shot such as this nude scene in Game of Thrones that even the most forgiving of viewers will feel is somehow wrong.

Still, only obvious CGI is noticed and remarked upon, such as the infamous digital baby from The Twilight Saga: Breaking Dawn - Part 2 (2012), while the overwhelming majority of digital effects go completely unnoticed. When a VFX artist has done their job well, their work is invisible.

Secondly, those supposedly “real” movies of yesteryear never actually existed. Movies have never been real. What has changed over the years are the methods by which reality is faked.

Wind and rain machines, studio sets replicating exterior locations, shooting day for night, or dry for wet - all of these techniques have been around for decades. Early cinema treated film as a stage play, with large painted backdrops and two-dimensional mechanical illusions, as seen in Georges Méliès’ Le Voyage dans la Lune (1902) - the first effects-driven sci-fi blockbuster.

Believable or not?

Visual effects do not have to be completely realistic to be effective; they simply have to be believable in the context in which they are presented.

Just as 3D puppets and miniatures took over from 2D paintings due to their superior interaction with light, digital effects have mostly taken over from practical models for the same reason.

Computer simulated effects can be scaled up or down while still obeying real-world physics. Scale has always been a problem with filmed effects involving water, fire and smoke - just look at any film made before the advent of CGI that used miniatures to depict a dam bursting, a building on fire or a ship sinking.

CGI, like any other tool in a filmmaker’s repertoire, can be and often is used badly. But that is hardly the fault of the technology.

Movies inspired by comic books often include characters performing feats so far removed from reality that we should not be surprised when the results appear “cartoony”.

This is less a problem with how the scene is realised technically than with what is being realised. No matter how strong an individual character may be, if that character grabs a passenger jet by the nose and attempts to gently place it on the ground, the weight of the plane would surely tear the metal skin away from his or her grip, and disaster would ensue.

Jet planes are not engineered to be manhandled in such a way. So no matter how intricately detailed the scene is rendered, it will always feel somehow wrong.

For a great visual summary of how CGI can be used to enhance all kinds of film, take a look at this video from RocketJump Film School:

When skilled artists apply digital effects in a way that both supports the narrative and honours the laws of physics, the results are much more likely to be accepted as “real”, and more often than not will go completely unnoticed.
