14 Special Effects, the Digital Age, & the Future of the Movies

Special effects

What we call “special effects” have been a part of filmmaking from the very beginning. If a special effect is a kind of magic trick, designed to make us see what is not actually happening, then film is itself a “special effect”: that original projection of a sequence of still images of a horse running such that it creates the illusion of motion is the magic trick at the origin of everything we have been studying this semester:


Of course, narrative film early on largely shifted away from highlighting the magic of the new medium toward an illusion of realism—toward, that is, rendering the magician and the magic invisible so that the illusion of watching “actual” events unfold on screen would be as complete as possible.

Early on in our time together, we highlighted this as a fork-in-the-road, with the realism of the Lumière brothers on one side and the magic show of Georges Méliès on the other. While the Lumière films sought to create on screen a representation of life as it was (realism), Méliès explored the potential of film to create utterly impossible realities (magic). Both of the approaches resulted in scores of films before the 20th century even began.

For a range of reasons, the realism of the Lumière brothers would predominate for much of the 20th century. After all, film was a photographic medium, lending itself to representations of the real world, and the kinds of special effects necessary to make Méliès’s brand of film magic happen were intensely time-consuming. Nonetheless, from early on, special effects were enlisted in the service of recreating a believable reality. Here, for example, is what is often considered the first use of special effects in film (from 1895), representing the beheading of Mary, Queen of Scots:

Here the effect—a simple cut with the substitution of a dummy for the human actor, carefully spliced to hide the cut as much as this very early technology would allow—is not deployed to bring to life alternative realities, but in the service of recreating a historical event from 1587 that no living person had ever seen. This first special effect was called at the time a “stop trick” but soon became known as a substitution splice.

At the start of the 20th century, experimentation with special effects increased apace. The novelty of the original “invention” of projected film was wearing off, and film producers were looking for new visual wonders to bring customers into the new nickelodeons and other early movie theaters. One of the most innovative pioneers of the period was the director Edwin S. Porter. Porter is more often celebrated for his role in directing some of the first longer-form narrative films in the U.S., but he was also a pioneer in special effects, solving problems that emerged as his films grew longer and more complex. In The Great Train Robbery (1903), for example, he deployed double-exposure to composite footage of a passing train with the set in which the robbers assault the switch operator. The result is that the train seems to be driving by the window (if you watch the window closely you can see it does not line up perfectly, but it is nonetheless impressive given how limited the technology was at the time).

A couple of years later, Porter would engage in his most daring special effects film to date, Dream of a Rarebit Fiend (1906), a very early adaptation of the comic strip of the same name by the cartoonist Winsor McCay. In this film, Porter and his team brought out every trick in the playbook, including, the promotional material promised, “photographic ‘stunts’ have … seen or attempted before, and but few experts in photography will be able to understand how they are done.” In truth, most of what was on display here, as our protagonist drunkenly makes his way home and then out into his nocturnal dreamscape, is double-exposure photography. But it is executed precisely and towards fantastical effects (designed to subjectively recreate the drunk man’s delirium). More importantly, we can see in the film company’s promotion of the special effects that these were eagerly sought out by fans of the new medium and were selling points as much as story … and much more than movie stars, who did not yet exist.


The star system in Hollywood that we discussed in the last chapter is a later invention. In fact, the early industry went out of its way to prevent stars and celebrity culture from emerging, fearing it would drive up costs (as it had on stage) and make the new industry less profitable. Eventually, in the teens, the industry realized that the benefits of celebrity as a form of branding and differentiation could outweigh the costs. But it is fair to say that for the early years of the industry, the spectacle—especially the special effect—was the true star.

The following year, Porter directed a pioneering stop-motion animation, The ‘Teddy’ Bears (1907), which highlighted a new, time-consuming but relatively straightforward illusion that could be created through film.

In the 1910s, matte photography became a site of innovation and focus. As early as 1907, the pioneering cinematographer Norman Dawn had begun to experiment with a new form of matte photography. Previously, matte shots involved a challenging process: part of the frame was masked off while one element was exposed on the film, and then the exposed portion was masked in turn while another element was exposed on the remainder. We see this on display in the 1903 scene from The Great Train Robbery in which the window is matted off while the scene of the assault is filmed, and then the rest of the frame is matted off for the filming of the train. We see something very similar in another 1903 film by Porter, Life of an American Fireman, in which the fireman daydreams of his family before being called back to the front lines of duty:


Norman Dawn introduced what became known as the glass shot, which provided a new way of combining multiple images without the challenges of matting and double-exposure. Instead of building (or traveling to) a complete location, filmmakers placed a large pane of glass between the camera and the scene, with the missing portions of the setting painted directly onto the glass, and shot the live action through it, so that painting and performance combined in a single exposure. The resulting composite was much tighter than in previous matte shots, as you can see here in this scene (with visual explanation of the effect) from Charlie Chaplin’s Modern Times (dir. Chaplin, 1936).


Glass shots allowed for a radically new approach to what were previously called matte shots, forgoing the double-exposure, with all its inevitable jitters and shakes, in favor of a one-shot approach to compositing. This would be foundational to the remarkable innovations of Disney animation in the 1920s and 30s, and it would remain the basis of much matte work going forward until the dawn of the digital age.

I pause over this early history of special effects from a century ago to make a couple of points. First, special effects have been fundamental and instrumental to film history from the beginning to the present. Some contemporary critics like to suggest that the increasing prominence of effects in film is a new contamination of a previously “pure cinema.” This is a nostalgic fantasy and bears no relationship to actual film history. Special effects are perhaps less “special” than essential to film. They are only more visible now than ever before because the digital age has opened up a whole new era of innovation and enthusiasm for visual spectacle. The history of film is every bit as much the history of special effects innovation as it is the story of narrative innovation or any of the many elements we have discussed over the previous chapters. Claims that special-effects-driven films are somehow antithetical to the “true” definition of cinema have more to do with cultural snobbery, or nostalgia for an imagined past, than with any realities in the actual story of the medium.

I am not going to try to offer a comprehensive history of special effects here: there is too much to cover and indeed special effects could be the subject of a course unto itself. But the experimentation of a century ago helps us make sense of the remarkable period of experimentation and innovation in which we currently reside. After all, for all the continuities between traditional analog film and digital production and post-production, the 21st-century digital age is undeniably a new era of film production, one still in its early decades.

Before we move into the digital age, however, I want to highlight a few other remarkable innovations and discoveries of the pre-digital era. In the interest of time, I will pick some favorites:

  • King Kong (1933): Merian C. Cooper and model animator Willis O’Brien used stop-motion, miniatures, rear-projection, and optical compositing, combining actors, puppets, and miniatures in a remarkable hybrid of stop-motion and live-action photography. This film would inspire the great master of stop-motion effects, Ray Harryhausen, to launch his own lifetime of devotion to bringing the impossible to the screen in films from Mighty Joe Young (1949) to Clash of the Titans (1981).

  • 3-D: experiments in 3D film exhibition began as early as the 1930s, although it would not be until the rise of television in the 1950s that it would take off, especially in science fiction and horror films. 3D in the 1950s was a desperate attempt to get TV-owners back to the movie theaters, promising something they could not get at home. By 1954, 3D film began to fade in movie theaters across America, as exhibitors rebelled against the costly and cumbersome system. Of course, in the early 21st century, 3D would come back into prominence, now in the service of selling digital projection to a wary and cash-strapped movie theater industry. Once the conversion to full digital projection was largely complete, however, 3D would disappear from prominence once again.
  • Slit-scan effect: first experimented with in the title sequence to Alfred Hitchcock’s Vertigo, this technique was most famously deployed by Douglas Trumbull for 2001: A Space Odyssey (dir. Stanley Kubrick, 1968). In the digital age, creating such effects would be relatively easy, but in the analog era, this was like nothing seen before on screen.

  • In the early 1970s, a cycle of disaster films took the art of using models against matte backgrounds to dramatic heights that would not be topped until well into the digital era. The Poseidon Adventure (1972), Earthquake (1974), and The Hindenburg (1975) all won special effects awards at the Oscars.

We should also add to this category the dramatically expanding expertise in prosthetic makeup in the 80s, which contributed heavily to the resurgence in horror and science fiction films. Rick Baker’s work on An American Werewolf in London won the Oscar for best makeup in 1981, the first year the prize was regularly given by the Academy Awards. The following year, Rob Bottin’s masterful makeup and prosthetic work on The Thing caused a sensation.

 

All of these effects can be broken down into rough categories. In-camera effects are visual effects accomplished solely during principal photography, involving no additional post-production, such as double-exposure and stop-motion. Mechanical effects are special effects created physically for motion pictures and television programs, using scale models or other objects built to represent objects of a different scale on screen.

An example of mechanical effects can be seen in the clip from Rashomon in Chapter 8, in the torrential rainstorm that has driven our protagonists into the shelter of the looming gate. It was not actually raining as they were filming. Instead the rain was generated by fire hoses drawing water from an off-screen tank. Kurosawa was initially dissatisfied with the way the rain was registering during the test shots; the crew solved the problem by adding black ink to the water tanks so that the rain’s vertical movement would be more legible on screen.

All of the special effects we have discussed so far are what are called practical effects. That is, these are effects created during preproduction and production, requiring no digital post-production development or editing. We often have a mistaken sense that in the digital era practical effects have gone by the wayside, but this is far from the case.

Practical effects have continued to exist side-by-side with digital (postproduction) effects, and will continue to do so. As always in filmmaking, there are several primary considerations: cost, impact, and efficiency (i.e., how long it will take to get it done). Mad Max: Fury Road (dir. George Miller, 2015), despite being made in an age when more and more special effects can be rendered convincingly in post-production, is very deliberate in how it combines practical effects with post-production CGI. Here, for example, is a small clip in which the process of filming the tanker explosion towards the end of the film is explained:

As we can see in the raw footage from the explosion, in which a real truck is blown up in the desert, this was all recorded in production. Of course, because this is a very dangerous explosion, the pursuit vehicles we see in the final version of the film were not part of the explosion but were composited in during postproduction. That is, two shots were filmed in production: one of the tanker exploding and another of the vehicles pursuing. The two were then digitally composited into one unified shot that presents them as happening simultaneously.

This is what digital editing made newly possible beginning in the 1980s and 90s. Now, instead of double-exposure or rear-projection—techniques which, even at their best, revealed to the attentive viewer the seams between the various layers—digital compositing in post-production allowed two (or more) separate shots to be blended into one seamless whole.
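For the curious, the arithmetic behind this kind of compositing is surprisingly simple. Here is a minimal sketch in Python (using the NumPy library) of the standard “over” operation, blending a foreground element onto a background plate through a matte; the tiny two-pixel “frame” and all of its values are invented for illustration, not taken from any actual production:

```python
import numpy as np

def composite_over(foreground, background, alpha):
    """Blend a foreground element over a background plate.

    foreground, background: (H, W, 3) float arrays, values in [0, 1].
    alpha: (H, W, 1) matte; 1.0 means the foreground is fully
    opaque at that pixel, 0.0 means the background shows through.
    """
    return alpha * foreground + (1.0 - alpha) * background

# Toy 1x2 "frame": the left pixel comes from the foreground shot,
# the right pixel from the background shot.
fg = np.ones((1, 2, 3))             # all-white foreground element
bg = np.zeros((1, 2, 3))            # all-black background plate
matte = np.array([[[1.0], [0.0]]])  # left pixel opaque, right transparent

frame = composite_over(fg, bg, matte)
print(frame[0, 0])  # [1. 1. 1.] -- foreground shows
print(frame[0, 1])  # [0. 0. 0.] -- background shows
```

Every pixel of the final shot is a weighted mix of the two source shots, which is why, unlike double-exposure, a well-drawn matte leaves no visible seam.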

Even before considering digital special effects of the more spectacular kind, the range of what digital editing alone can accomplish is breathtaking. Take, for example, the ability to erase in digital editing. This has been especially significant in terms of the technique known as wire removal. The ability to set up more elaborate stunts and fights knowing that the safety wires could be edited out—either automatically when filmed against a green screen, or manually through digital painting in post-production—dramatically raised the bar in terms of the kinds of physical pyrotechnics filmmakers could safely put on screen.

In Back to the Future Part II, Marty’s hoverboard ride is realized through wire removal in postproduction.

 

Perhaps the aspect of digital special effects with which we are most familiar today is what we today call “green screen”—originally referred to as bluescreen—filming, in which action is filmed in front of a green (or earlier blue) screen, so that the uniform background can be extracted and replaced with a matte and/or digitally rendered characters or other special effects. In 2023 in our age of Zoom this effect is so common as to seem mundane, but it was (and remains) revolutionary in terms of what it has made possible on screen. We would not have superhero films and the vast majority of blockbuster movies without it, for one; and without the dominance of those films Hollywood would be a very different thing today.
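At its simplest, the extraction works by identifying which pixels belong to the uniformly colored screen and swapping in the replacement background at those locations. Here is a deliberately crude sketch in Python (using NumPy); real keyers work in other color spaces and produce soft, partial mattes, and the threshold and toy pixel values here are invented for illustration:

```python
import numpy as np

def chroma_key(frame, background, green_thresh=0.5):
    """Replace green-dominant pixels in `frame` with `background` pixels.

    A crude hard keyer: a pixel counts as 'green screen' if its green
    channel exceeds the larger of red and blue by more than green_thresh.
    frame, background: (H, W, 3) float arrays, values in [0, 1].
    """
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    is_screen = (g - np.maximum(r, b)) > green_thresh
    out = frame.copy()
    out[is_screen] = background[is_screen]  # swap in the new background
    return out

# Toy 1x2 frame: one pure-green "screen" pixel and one red "actor" pixel
frame = np.array([[[0.0, 1.0, 0.0], [1.0, 0.0, 0.0]]])
sky = np.array([[[0.2, 0.4, 0.9], [0.2, 0.4, 0.9]]])  # replacement plate

keyed = chroma_key(frame, sky)
print(keyed[0, 0])  # green pixel replaced by sky: [0.2 0.4 0.9]
print(keyed[0, 1])  # actor pixel untouched:       [1. 0. 0.]
```

This also makes clear why Superman’s blue costume was a problem: any foreground pixel too close to the screen color gets keyed out along with the background.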

In fact, the bluescreen, or chroma-key, predates digital editing by many years, used as far back as the 1940s to accomplish what were known as traveling mattes—mattes that move across the screen, for example, to make room on the filmstock for the superimposition of a flying object, such as a spaceship—or, in 1978, Superman:

Superman’s costume had to be lightened to make sure he could be separated from the color of the bluescreen background.

image

The original use of chroma-keying and the traveling matte allowed for more possibilities in terms of imagining new kinds of movements not possible in camera. But all we have to do is compare Superman flying (above) in 1978 with any post-2000 Spider-man swinging through the city to see the difference the digital era makes:

THE DIGITAL AGE

Between 1978 and the rise of the current wave of superhero movies, of course, lies the introduction first of digital editing and then of rapidly developing technologies and possibilities in digital special effects. The various technologies and new forms of artistry to emerge out of the digital revolution have taken on a range of abbreviations that can get pretty confusing, so here’s a quick glossary:

  • CGI: computer generated imagery, or any 3-D characters or objects you see on screen which were generated using computer animation software and composited in during post-production
  • VFX: visual effects, or the process by which imagery is created or manipulated outside the context of a live-action shot in filmmaking; the integration of live-action footage and CGI elements to create a believable composite image.
  • SFX: sound effects recorded or digitally produced in post-production and matched during the mix to the visual image. SFX is also used to refer to on-set special effects of the kinds discussed above. Sometimes now referred to simply as “FX.”

All of this gets murky when we consider the fact that in recent years movies such as Disney’s remake of The Lion King (dir. Jon Favreau, 2019) are marketed as “live-action.” Clearly, this is not a “live-action” film. In fact, what marks The Lion King out as unusual is that its visual track is almost entirely produced by CGI. (Of course, for anyone who plays video games, an extended narrative world made up entirely of computer graphics is itself not new.)

There has been much debate in the film world about Disney’s attempts to call their latest round of remakes “live action,” in part because it is (rightly) understood by some in animation as an end-run on the part of the studio around its own unionized animators by working with a newer generation of digital shops to whom this work can be outsourced. In truth, however, the distinction between animation and “live action” is becoming increasingly porous as we get deeper into the 21st century, and will likely continue to do so. After all, while there is only one shot in the new Lion King that has no CGI at all, there are photographic references, live-action footage of landscapes, and other photographic elements buried deep beneath all the CGI throughout. Where, today, does the line lie between photographic film and computer generated imagery (or what we used to call “animation”)?

Let’s back up and look briefly at the rise of CGI, which took off slowly at first, primarily in the realm of animation. In the early 1980s, experimentation with CGI began around the margins of film, much of it emerging out of the special effects company George Lucas had set up in support of his Star Wars films. A division was established to focus entirely on solving the problems of digital animation, originally called the Graphics Group. A messy divorce forced Lucas to sell off this division in 1986; it was purchased by a group with Apple co-founder Steve Jobs as majority shareholder, and the company was reorganized as Pixar.

Meanwhile, at Disney animation, a young animator named John Lasseter was experimenting with computer animation. In a company dedicated since its founding in the 1920s to traditional hand-drawn animation, this was a risky move, but Lasseter received permission to make a test reel as proof-of-concept for a possible adaptation of Maurice Sendak’s Where the Wild Things Are. This 1983 short test reel is one of the first successful experiments with combining vector animation with hand-drawn digital painting:

By all measures this was a success, and one that pointed towards revolutionary changes to come. But at a company that had been doing things one way for a long time (and which had an entrenched animation studio committed to that way) this was not likely to prove a popular revolution. Lasseter was let go following the screening, after which he would find his way to Pixar. Pixar would eventually be purchased by Disney in 2006, some 20 years after its founding as an independent company, but not before it had revolutionized animation and effectively brought about an end to the cel animation technique that had dominated the industry for generations.

While Disney animation might have been threatened by Lasseter’s experiments with digital vector animation, in some parts of Disney, computer generated animation was already an area of interest. In 1982, Disney had produced Tron (dir. Steven Lisberger), one of the first feature films to make extensive use of computer generated animation. Although the effects look decidedly pedestrian (corny?) to our eyes in 2021, at the time they were an exciting promise of future wonders:

Beginning in 1989, Disney, which had been in the doldrums since Walt Disney’s death in 1966, went through a period of remarkable creative and commercial success that became known as the “Disney Renaissance” (effectively from The Little Mermaid [1989] through the end of the 90s). The success of these films put increased pressure on the animation division to produce bigger and more elaborate spectacles, and also to generate the animated features more quickly than the traditional schedule had ever allowed. So it was that Disney animation began experimenting, cautiously at first, with incorporating some elements of computer generated animation. For example, the famous ballroom scene at the end of Beauty and the Beast (dir. Gary Trousdale & Kirk Wise, 1991) involved elaborate use of computer generated architecture. Mulan (dir. Tony Bancroft and Barry Cook, 1998) made use of computer generated animation techniques to create the hordes of enemy horsemen chasing Mulan and her comrades during the climactic battle sequences:


Animating each of the horses and riders by hand would have been so time- and labor-consuming as to drive up the expense of the production and put profitability at risk (always a big no-no in Hollywood). Skimping would have made the battle scenes look paltry at best (also a no-no). Digital animation sold itself, therefore, by providing that combination of benefits—spectacle and efficiency—Hollywood can never resist. If even the most traditional of the Hollywood studios at this point in film history (in fact, the only studio that still operated like a studio from the classical Hollywood era) was ready to embrace CGI, it was clearly the future heading into the 21st century.

While CGI was taking off, filmmaking itself still remained largely analog at this time. Live-action footage shot on the digital cameras available in the 1990s did not register well on the big screen.[1] So it was in the digital intermediate editing process that live-action footage shot on analog film and computer generated imagery were brought together via VFX processes. Here is a discussion of how some of the pioneering effects in The Matrix (dir. Lana & Lilly Wachowski, 1999) were composited:

To make the difference between analog and digital clearer, I like to think of the difference between making a flyer for a band or a zine in the 1980s and making a digital flyer or webpage today. Back in my younger days, making a zine involved cutting and pasting pieces of material from other sources onto a new page, maybe adding new elements, and then copying it all on a xerox machine. Every copy of a copy degraded the quality of the original image, and no matter how tightly we tried to cut and how cleanly we tried to paste, the seams were always visible. Indeed, that DIY effect was part of the charm of the zines of those days. Today, of course, I can edit out anything I want from any image, change hair or skin color, or put lizard wings on a celebrity. And I can make infinite copies of copies, and no degradation from the “original” will take place.

Once we had cameras such as the RED and the ARRI capable of shooting cinema-quality big-screen footage, new possibilities opened; and of course the technologies and processing power for computer generated imagery had been racing along at a breakneck pace during this time. Instead of having to translate analog film into digital formats for editing, now everything was digital from the start. As you know, everything digital is effectively a series of 1s and 0s: binary elements that can be manipulated, composited, and transformed at as detailed a level as one chooses. Which means even the “photographic” elements are no longer fixed, nor is digital post-production solely reserved for science fiction or superhero films.

Let’s take a look back at Moonlight, a film as far from science fiction as can be imagined. As we discussed, that film was shot with a very high resolution digital camera, in part to capture all the detail and nuance of the individual faces that were the subject of the film. But this also allowed for very sophisticated color grading in digital post-production. Thus Laxton, the cinematographer, worked in concert with the digital color-correction specialist Alex Bickel to come up with an exposure level for the production that would leave Bickel with plenty of information (visual detail) and room to work in digital correction. Here we see a couple of examples of the raw photography from the film in relation to the final image we see on screen:


In addition, director Barry Jenkins wanted each of the three parts of the film to have a slightly different look in terms of film stock. In the digital age, of course, there are no actual film stocks—no material piece of film whose physical and chemical composition necessarily shaded the final image in certain ways. Instead, such subtleties are recreated via algorithms in digital post-production. For Moonlight, Bickel created three different look-up tables, or LUTs, designed to emulate different film stocks from the analog era. The three chapters were run through these LUTs—mathematical formulas that modify images—in order to create the look of a traditional Fuji film stock for chapter one, an old Agfa film stock for chapter two, and a Kodak stock for chapter three. These changes are likely so subtle we don’t even fully register them, and yet we feel the “look” of the film transform with each chapter, along with our protagonist.
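A look-up table is exactly what it sounds like: a table of output values indexed by input values. Here is a minimal sketch in Python (using NumPy) of a 1D LUT applied per pixel; the toy “film stock” curve is invented for illustration, and real film-emulation LUTs like Bickel’s are typically 3D, remapping red, green, and blue values jointly:

```python
import numpy as np

def apply_1d_lut(image, lut):
    """Remap pixel values through a 1D look-up table.

    image: uint8 array (values 0-255); lut: array of 256 output
    values, one for each possible input level. NumPy fancy indexing
    performs the per-pixel table lookup in one step.
    """
    return lut[image]

# A toy "film stock" curve: compress highlights and lift shadows
# slightly, like a low-contrast stock (illustrative only).
levels = np.arange(256)
lut = np.clip(levels.astype(float) * 0.9 + 25, 0, 255).astype(np.uint8)

image = np.array([[0, 128, 255]], dtype=np.uint8)  # black, gray, white
graded = apply_1d_lut(image, lut)
print(graded)  # [[ 25 140 254]]
```

Because the lookup is just arithmetic on numbers, the same footage can be run through three different tables to produce three subtly different “stocks,” exactly as Moonlight does across its three chapters.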

I pause over color grading—a relatively new but exciting area of digital post-production—to point out that special effects are not only about explosions or making heroes fly. We are still in the early years of experimentation and innovation with digital special effects, especially in a world of filmmaking in which almost all production, distribution, and exhibition is now wholly digital. Like all the topics in this text, there is far more to talk about here than can be captured in a single chapter, and indeed this topic could be a course in itself. More than any chapter in this book, this is one that will need to be updated most frequently, as we have only begun to see all that can be put on screen through the combination of practical and digital special effects and postproduction techniques that are still in their infancy.

As a parting treat before we turn to some concluding thoughts projecting into the future, here are a couple of examples of all the remarkable things happening in this realm:

The Future of the Movies

Around 1999, with the coming of the new millennium and just a few years after celebrations of the first century of film, many film critics and scholars spilled massive quantities of ink predicting the future of film. 1999 had been an especially good year, with strong box office and many movies that remain highly regarded over 20 years later, including The Matrix, American Beauty, Fight Club, and The Sixth Sense. But there was also a feeling that the Hollywood that made small movies like Being John Malkovich and The Virgin Suicides was nearing its end.

That same year, George Lucas pushed the coming digital age forward dramatically by exhibiting Star Wars: The Phantom Menace in four theaters using digital projection. Today that sounds… utterly unimpressive. And well it should: after all, today virtually every commercial theater in the United States has traded in its traditional film projectors for digital projectors. However it was Lucas who lit the fuse to make this happen, first by proving the concept of digital exhibition in 1999, and then, in 2002, by releasing the first Star Wars film shot 100% digitally.

When he began work on Attack of the Clones, Lucas believed that he would be able to open his 100% digital film in the 2,000 theaters expected to be equipped for the new age of digital exhibition. As it turned out, that 1999 prediction proved optimistic. In fact, there were only around 19 theaters in the whole country ready for digital projection in 2002 (including two in Ohio). So while Lucas had proved that big-budget digital filmmaking could work (previously only smaller independent films had been produced 100% digitally), and while digital editing had long been adopted by this point, digital exhibition remained out of reach.

The reasons were fairly simple. Despite the critically celebrated films released at the end of the 90s, overall theater box office remained weak, and projections for the new millennium did not look better. The rise of the DVD at the end of the 90s promised a new source of revenue for the studios, but further losses for theater owners. The studios had good reason to want to make this shift to digital production, distribution, and exhibition: it would dramatically reduce the costs of making and shipping hundreds, even thousands, of prints; and it would prevent the wear-and-tear that traditional film made visible after numerous playings (digital films do not deteriorate with repeated use or copying). Theater owners, however, were wary of digital distribution and exhibition, fearing it would lead to widespread piracy akin to that which had decimated the traditional record industry, resulting in even lower box office for their businesses.

Back in the days of the old studio system, this would have been an easy issue to resolve. After all, in 1927 the studios owned the theaters. Thus it was that when the conversion to sound happened, the studios could front the cost of installing sound systems in the theaters across America, bringing about the complete conversion in a matter of a few short years. But now the movie theaters—big chains and small independent theaters alike—were independent of Hollywood studios, and any collusion between studios and theaters to convert the theaters would be a violation of the anti-trust order that had broken up the studios’ vertical monopoly. What to do?

What to do turned out to be 3D. As mentioned in passing above, it was the revival and digital updating of 3D film that encouraged theater owners to bite the bullet and invest in digital projection. The carrot the studios held out was a simple one: for all 3D movies, theater owners could keep the 3D upcharge on tickets from day one. For most new releases, theater owners received no cut of ticket sales in the opening weekend, depending entirely on concessions for revenue. So this promise of opening-weekend revenue for new 3D releases of films like Avatar (2009) and Gravity (2013) offered the hope of a life raft at a time when more and more people were watching movies on DVD and new streaming platforms. By 2013, 93% of theaters had converted to digital (today that number is close to 100%). And at roughly the same time, 3D movie production began to wind down. It had served its purpose: film exhibition and distribution were now digital, like production and post-production.

I pause over this story from the recent past of movies in the early 21st century to underscore a larger point. Everything about film has changed dramatically from where things were 100 years earlier. These changes have been brought about by a range of technological, social, industrial, economic, and creative changes. And these changes will continue into the future, as the larger media ecology continues to change and as new technologies unlock new audience behaviors and desires.

And yet, drop a time-traveler from 1921 into a theater (or even behind a laptop) in 2021, and they would still be able to recognize that what they were watching was in fact a movie. Yes, they would be dazzled (or horrified) by the sound, the color, the special effects. But the object itself—despite being no longer made on film, projected on film, and more and more no longer even being watched in a theater—would nonetheless still be recognizable. Movie-making and movie-watching have changed in almost every respect, and yet movies persist.

Here are just a few of the ways in which movies have changed in 100 years:

1921 | 2021
film is silent | sound film
the vast majority of film is black & white | the vast majority of film is in color
movies recorded on celluloid film | movies recorded on digital storage devices
movies edited with scissors and tape | movies edited on computers
movies watched in theaters | movies watched in theaters, but more often on TVs, computer monitors, or mobile screens
movies largely disappear after theatrical run | movies available via streaming services into the future
actors, directors, and other personnel owned by studios | all creative and technical personnel are free agents, and each film is effectively a new “studio”
Hollywood movies primarily made for domestic audiences | Hollywood movies primarily made for global audiences

 

And yet, as we discussed at the beginning of this book, other things remain relatively untouched by time. Narrative continues to be the dominant mode of commercial film. The fundamental structures of the screenplay—the “rules” first concocted by the earliest studios in the years following WWI—remain for the most part operational a century later. The star system in 2021 looks largely as it did in Gloria Swanson’s day, even if stars today are managed by agents and not by studios. And as with audiences a century earlier, decisions on what movies to see tend to be motivated by genre, star, word-of-mouth—the desire to escape or the desire to feel deeply. These are the reasons our time-traveler, once over the shock of what is new, would ultimately have no trouble recognizing what they saw as genetically continuous with the movies they had watched in 1921.


  1. Cinema-quality digital cameras did not become widely available until after 2003, whereas today many of our phones can shoot cinema-quality video, a sign of how fast things have evolved in this area.

License


Close-Ups: An Introduction to Film Copyright © 2023 by Jared Gardner & The Ohio State University is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.
