A Quick Guide to Film Directing




  Copyright © 2014 by Ray Morton

  All rights reserved. No part of this book may be reproduced in any form, without written permission, except by a newspaper or magazine reviewer who wishes to quote brief passages in connection with a review.

  Published in 2014 by Limelight Editions

  An Imprint of Hal Leonard Corporation

  7777 West Bluemound Road

  Milwaukee, WI 53213

  Trade Book Division Editorial Offices

  33 Plymouth St., Montclair, NJ 07042

  Printed in the United States of America

  Book design by Mark Lerner

  Library of Congress Cataloging-in-Publication Data

  Morton, Ray, 1961-

  A quick guide to film directing / Ray Morton.

  pages cm

  ISBN 978-0-87910-806-9 (pbk.)

  1. Motion pictures--Production and direction. I. Title.

  PN1995.9.P7M585 2014

  791.4302'33--dc23

  2014008765

  www.limelighteditions.com

  For Erin, Jack, and Sean Morton

  and

  Caitlin Hoey

  and

  Aiden James Masterbone

  Contents

  Introduction

  1. A Brief History of Film Directing

  2. How to Become a Film Director

  3. A Few Things a Film Director Should Know

  4. A Few Skills a Film Director Should Have

  5. Getting the Job

  6. The Film Director in Preproduction

  7. The Film Director in Principal Photography

  8. The Film Director in Postproduction

  9. The Director and the Film’s Release

  10. Directing in Other Modes

  Acknowledgments

  Introduction

  The director is the pivotal figure in the creation of a motion picture.

  It takes an army of talented people to make a movie, but it takes a director to lead that army. The director devises the overall creative concept for the production, hires the cast and key members of the creative team, sets the tone and calls the shots on the set and in the editing room, and has final say in all creative matters affecting the film.

  The contributions of everyone working on a movie—the producer, the screenwriter, the cinematographer, the editor, the production and costume designers, the composer, the technical crew, and the actors—are all filtered through the director’s concept, judgment, and taste to create the final cinematic work.

  Directing a film requires a unique combination of artistic vision, technical expertise, and managerial skill. This book will provide you with a comprehensive look at the essential talents and tasks required to successfully helm a motion picture.

  1

  A Brief History of Film Directing

  The job of film directing was born with the cinema itself.

  The first movies were short documentaries—brief clips of real-life situations such as a train pulling into a station, workers leaving a factory, a man sneezing, and so on. These scenes were filmed by the various men around the globe who invented the movie camera—men such as Louis Le Prince and William Friese-Greene in England, Louis and Auguste Lumière in France, and William Kennedy Laurie Dickson in the United States. These inventors figured out how to transform still cameras capable of recording only one static image at a time into machines that could record (on a strip of flexible celluloid) a series of images in rapid succession that, when projected back at the same rate at which they were shot, could create the illusion of a picture that moved. Initially, the Lumières, Dickson, and their fellow innovators created their moving images by simply setting up their cameras and recording whatever happened in front of them. Before long, however, the men began choosing their subjects more deliberately. As they made their decisions about what subjects to photograph, where to place the camera, and when to begin and end the recording, these technicians inadvertently became film directors.

  The cameras created by these inventors were soon acquired by others—businesspeople, showmen, and artists—who began to make movies for public consumption, and thus, the film industry was born. Audiences soon grew tired of documentary scenes, and so moviemakers began using their cameras to tell fictional stories—comedies, dramas, romances, and action spectaculars—in the form of five-, ten-, and twenty-minute shorts. The director was the key figure in this process.

  Early movie directors were total filmmakers—they would usually dream up and write the scenarios, organize and run the production, help build the sets and find the locations, cast the actors and tell them what to do, photograph the scenes, create the special and visual effects, and edit the results. In the process, directors such as Edwin S. Porter and D. W. Griffith began to pioneer the various techniques—close-ups, intercutting, and so on—that would become the foundation of the “language” of film.

As movies grew longer—eventually into ninety-plus-minute features—and more complex, and the production process became more involved, individual specialists (screenwriters, cinematographers, art directors, editors, and so on) began to assume responsibility for the various tasks required to make a movie, leaving directors to function more as creative overseers than as hands-on functionaries. Directors remained, however, the primary artistic drivers of the filmmaking process.

  For American directors, this began to change with the rise of the studio system in the 1920s. During the approximately thirty-year-long studio era, company-designated producers working for a strong production chief became the prime movers of individual film projects—the producers found the properties, hired the writers, and developed the stories and scripts. They also cast the films, selected the key members of the creative team, and supervised the production process. Directors, most of whom were under long-term contract to the studio, became hired hands—important ones, to be sure, but still subservient to producers. Directors in this era usually did not participate in the creative development of a project, but instead were simply assigned to a particular film a few days before shooting began and then reassigned to another project as soon as the picture wrapped. The producer would oversee the editing and completion of the final product.

  Studio-era directors had little or no say in what pictures they were assigned to, and it was not unusual for several directors to be put to work on a single movie; if a director fell ill, or a producer was unhappy with his work, or the director was unavailable for reshoots because he was working on another project, then another helmer (a nickname for director coined by the show-business trade paper Variety) would be assigned to take over. While some higher-profile directors such as Frank Capra, John Ford, and Howard Hawks had more control over their work than their brethren, even they had to operate under relatively tight constraints.

As the studio system came to an end in the 1950s and early 1960s, most directors who were formerly under contract went freelance, moving from one studio to another as they took on different projects. This free-agent status gave helmers whose pictures succeeded at the box office more clout, allowing them to choose the projects they wanted to do and to negotiate greater creative input and freedom from the producers and companies who were eager to hire them.

During this period, American directors also began to gain more critical respect. This was due in large part to the influence of la politique des auteurs, a.k.a. “the auteur theory,” a critical perspective devised by a group of French movie critics (led by future directors François Truffaut and Jean-Luc Godard) writing for the respected film journal Cahiers du Cinéma and later popularized in the United States by critic Andrew Sarris. The theory saw directors as the primary creative auteurs (authors) of the films they made, and as a consequence of it, movie directors in the U.S. began to be regarded by critics, viewers, studio executives, and themselves not just as competent technicians (the prevailing view for most of the studio era), but also as legitimate visionary artists.

  This view was bolstered by the increased distribution and popularity in America of foreign films—movies from England, France, Sweden, and other countries where directors had retained their creative clout over the years and were definitely the artistic authors of their movies—as well as by the beginning of an independent film movement in the U.S. that saw young directors who very much considered themselves auteurs making personal films that reflected their own unique creative ideas and artistic points of view.

All this approbation came to a head in the late 1960s and the first half of the 1970s, when the incredible success of innovative films such as Bonnie and Clyde, The Graduate, Easy Rider, The Last Picture Show, The Godfather, The Exorcist, American Graffiti, and Jaws turned their directors (Arthur Penn, Mike Nichols, Dennis Hopper, Peter Bogdanovich, Francis Ford Coppola, William Friedkin, George Lucas, and Steven Spielberg) into superstars.

  The 1970s became the Decade of the Director: during this period, movie helmers became celebrities as well known, lauded, and sought after as the star actors who appeared in their movies. Directors were taken seriously as creative artists and were regarded as significant cultural figures and commentators by the intelligentsia, by academia, and by the general public. Most important, directors were given almost unlimited creative freedom and control over their projects by producers and studios—conditions that led to what many consider a golden age in American and world filmmaking.

Unfortunately, all that freedom also led to financial and artistic self-indulgence on the part of many directors, which resulted in a string of big-budget flops in the latter half of the 1970s. When the creative and box-office failure of Michael Cimino’s wildly over-budget western epic Heaven’s Gate (1980) brought about the collapse of United Artists, the company that financed the film, other studios began to take back the control they had ceded to directors over the previous fifteen or so years.

Throughout the 1980s and 1990s, studios tightened their grip on project selection and development, as well as on budgets and finances. The director was still seen as the key creative figure in the making of a film, and successful helmers still had a great deal of latitude in making their pictures, but nowhere near the unfettered free rein they’d enjoyed in the 1970s. Filmmakers found more freedom in the independent movement, which thrived throughout the 1980s and 1990s and continued to afford enterprising auteurs the opportunity to create unique personal films, as long as they could do so on reasonably modest budgets.

The modern era is presenting film directors with a great many challenges. The big studios are making fewer movies on a narrower range of subjects and exercising greater and greater control over the productions. At the same time, the independent theatrical market has all but evaporated. All these developments leave helmers with fewer opportunities and outlets to ply their trade in traditional fashion. On the bright side, new technologies such as digital motion picture cameras and editing programs for home computers are making it easier for directors to make movies outside of the conventional arenas, and new release platforms (including Internet-based distribution through venues such as YouTube, iTunes, and the various streaming services, as well as cable-television-based options such as video-on-demand) provide numerous new ways to deliver films to receptive viewers.

  So, while forms and formats may be changing, the need and desire to tell stories on film persist, which means that people are going to go on directing movies for a long time to come.

  2

  How to Become a Film Director

  There are many paths one can take to become a film director:

  Make an interesting short

Film students in undergraduate and graduate programs and independents working on their own who make striking short films frequently attract the attention of agents, managers, producers, and studio executives, who may offer them opportunities.

  Direct in other media

  Directors who do notable work in other arenas—television, theater, commercials, or music videos—are often recruited to make feature films.

  Come up through the ranks

  People who do well in other areas of filmmaking—producers, screenwriters, actors, editors, cinematographers, and even stunt coordinators—can sometimes leverage their success into opportunities to direct.

  Just do it

  Some people become directors by making their own movies from scratch. These folks write their own scripts, raise the money to make the films, and then go out and shoot them.

  3

  A Few Things a Film Director Should Know

  A film director’s primary function is to tell a story on the big screen. To do this successfully, he must have a working knowledge of the following:

  1. Dramatic storytelling

  Movie storytelling is dramatic storytelling—the presentation of a narrative in which a protagonist in pursuit of a significant goal becomes involved in a conflict that leads to climax, resolution, and ultimately transformation—so a director must have a basic understanding of dramatic structure. Dramatic structure is the template based on the core principles of dramatic writing first set down by Aristotle in ancient Greece and refined by dramatists across the millennia; it contains all the key elements of dramatic storytelling: exposition, rising action, suspense, surprise, reversal, climax, falling action, and denouement. According to this template, a dramatic narrative is divided into three sections called acts, which unfold as follows:

  ACT I

  •The world in which the story takes place is introduced.

  •The Protagonist is introduced and his circumstances are laid out.

  •Key supporting characters, the relationships between the characters, and additional important story elements are also established.

  •A crucial event occurs that sets the story in motion. This event is called the inciting incident.

•At the end of Act I, something happens that changes the Protagonist’s situation in some very drastic way. This event is called the catalyst. It is also known as the first plot point, the first turning point, the first plot twist, the Act I plot twist, or the complication.

  •As a result of this catalyst, the Protagonist develops a significant goal he becomes determined to achieve. That goal can be big (to save the world) or small (to save a local landmark); it can be internal (to overcome a trauma) or external (to find a buried treasure); it can be personal (to find love) or public (to stop global warming).

  ACT II

  •The Protagonist—usually working against some sort of tension-generating, “ticking clock” deadline—develops a plan for accomplishing his goal and then sets out to follow it.

  •The Protagonist’s quest to achieve his objective brings him into contact with the Antagonist, who is or becomes determined to stop the Protagonist from accomplishing his goal.

  •During his quest, the Protagonist encounters a series of obstacles—primarily generated by the Antagonist—that stand between him and his objective.

  •The Protagonist usually begins Act II at a disadvantage (thanks to the catalyst), but uses his inner and outer resources—which can include special skills and abilities and help from unusual allies—to overcome these obstacles (which become bigger, more complex, and more difficult to deal with as the narrative progresses) and begins to march toward victory.

  •Near the end of Act II, the Protagonist closes in on that victory. He reaches a point where it appears that he is about to achieve his goal. Success seems to be within his grasp.

•At this point, something happens that once again drastically changes the Protagonist’s circumstances. This event—usually instigated by the Antagonist—robs the Protagonist of his impending triumph and leaves him in a defeated (and often precarious) position, facing an obstacle so formidable that it appears he will never be able to overcome it and, as a result, will never accomplish his goal. This event is known as the catastrophe. It is also known as the second plot point, the second turning point, the second plot twist, the Act II plot twist, the second complication, or the crash-and-burn.

  ACT III

  •As Act III begins, all hope appears to be lost. The Protagonist has been defeated and it seems as though he will never be able to achieve his objective.

  •At this point—when the Protagonist is at his absolute lowest—something happens that allows him or motivates him to rally.

  •The Protagonist now does one of two things: he either comes up with a new plan to achieve his original objective, or he abandons that objective and comes up with an entirely new goal (and a plan to achieve it).

  •The Protagonist sets out to put his new plan into action.

  •This leads to the story’s climax: a final confrontation with the Antagonist in which the Protagonist is finally able to overcome the seemingly insurmountable obstacle—usually by defeating the Antagonist—and accomplish his goal (or not, if that’s what the story calls for).

  •Act III concludes with the resolution, which shows how all the story’s problems are resolved and how things work out for all the characters as a result of the climax. The resolution often indicates how things are expected to go for the characters after the story ends.

  •A key function of the resolution is to show how the Protagonist has changed. At its core, drama is about transformation, and the Protagonist of a dramatic tale always undergoes a profound change as a result of his experiences in the story. This change is usually for the better—the Protagonist solves a personal problem, repairs a broken relationship, learns an important life lesson, achieves fame and fortune, etc.—although sometimes it can be for the worse—e.g., a good cop becomes corrupt; an idealistic woman becomes cynical; a sane man descends into madness. This transformation is often called the Protagonist’s “arc.”