- •CONTENTS
- •INTRODUCTION
- •1 Getting Started
- •Better, Cheaper, Easier
- •Who This Book Is For
- •What Kind of Digital Film Should You Make?
- •2 Writing and Scheduling
- •Screenwriting
- •Finding a Story
- •Structure
- •Writing Visually
- •Formatting Your Script
- •Writing for Television
- •Writing for “Unscripted”
- •Writing for Corporate Projects
- •Scheduling
- •Breaking Down a Script
- •Choosing a Shooting Order
- •How Much Can You Shoot in a Day?
- •Production Boards
- •Scheduling for Unscripted Projects
- •3 Digital Video Primer
- •What Is HD?
- •Components of Digital Video
- •Tracks
- •Frames
- •Scan Lines
- •Pixels
- •Audio Tracks
- •Audio Sampling
- •Working with Analog or SD Video
- •Digital Image Quality
- •Color Sampling
- •Bit Depth
- •Compression Ratios
- •Data Rate
- •Understanding Digital Media Files
- •Digital Video Container Files
- •Codecs
- •Audio Container Files and Codecs
- •Transcoding
- •Acquisition Formats
- •Unscientific Answers to Highly Technical Questions
- •4 Choosing a Camera
- •Evaluating a Camera
- •Image Quality
- •Sensors
- •Compression
- •Sharpening
- •White Balance
- •Image Tweaking
- •Lenses
- •Lens Quality
- •Lens Features
- •Interchangeable Lenses
- •Never Mind the Reasons, How Does It Look?
- •Camera Features
- •Camera Body Types
- •Manual Controls
- •Focus
- •Shutter Speed
- •Aperture Control
- •Image Stabilization
- •Viewfinder
- •Interface
- •Audio
- •Media Type
- •Wireless
- •Batteries and AC Adaptors
- •DSLRs
- •Use Your Director of Photography
- •Accessorizing
- •Tripods
- •Field Monitors
- •Remote Controls
- •Microphones
- •Filters
- •All That Other Stuff
- •What You Should Choose
- •5 Planning Your Shoot
- •Storyboarding
- •Shots and Coverage
- •Camera Angles
- •Computer-Generated Storyboards
- •Less Is More
- •Camera Diagrams and Shot Lists
- •Location Scouting
- •Production Design
- •Art Directing Basics
- •Building a Set
- •Set Dressing and Props
- •DIY Art Direction
- •Visual Planning for Documentaries
- •Effects Planning
- •Creating Rough Effects Shots
- •6 Lighting
- •Film-Style Lighting
- •The Art of Lighting
- •Three-Point Lighting
- •Types of Light
- •Color Temperature
- •Types of Lights
- •Wattage
- •Controlling the Quality of Light
- •Lighting Gels
- •Diffusion
- •Lighting Your Actors
- •Interior Lighting
- •Power Supply
- •Mixing Daylight and Interior Light
- •Using Household Lights
- •Exterior Lighting
- •Enhancing Existing Daylight
- •Video Lighting
- •Low-Light Shooting
- •Special Lighting Situations
- •Lighting for Video-to-Film Transfers
- •Lighting for Blue and Green Screen
- •7 Using the Camera
- •Setting Focus
- •Using the Zoom Lens
- •Controlling the Zoom
- •Exposure
- •Aperture
- •Shutter Speed
- •Gain
- •Which One to Adjust?
- •Exposure and Depth of Field
- •White Balancing
- •Composition
- •Headroom
- •Lead Your Subject
- •Following Versus Anticipating
- •Don’t Be Afraid to Get Too Close
- •Listen
- •Eyelines
- •Clearing Frame
- •Beware of the Stage Line
- •TV Framing
- •Breaking the Rules
- •Camera Movement
- •Panning and Tilting
- •Zooms and Dolly Shots
- •Tracking Shots
- •Handholding
- •Deciding When to Move
- •Shooting Checklist
- •8 Production Sound
- •What You Want to Record
- •Microphones
- •What a Mic Hears
- •How a Mic Hears
- •Types of Mics
- •Mixing
- •Connecting It All Up
- •Wireless Mics
- •Setting Up
- •Placing Your Mics
- •Getting the Right Sound for the Picture
- •Testing Sound
- •Reference Tone
- •Managing Your Set
- •Recording Your Sound
- •Room Tone
- •Run-and-Gun Audio
- •Gear Checklist
- •9 Shooting and Directing
- •The Shooting Script
- •Updating the Shooting Script
- •Directing
- •Rehearsals
- •Managing the Set
- •Putting Plans into Action
- •Double-Check Your Camera Settings
- •The Protocol of Shooting
- •Respect for Acting
- •Organization on the Set
- •Script Supervising for Scripted Projects
- •Documentary Field Notes
- •What’s Different with a DSLR?
- •DSLR Camera Settings for HD Video
- •Working with Interchangeable Lenses
- •What Lenses Do I Need?
- •How to Get a Shallow Depth of Field
- •Measuring and Pulling Focus
- •Measuring Focus
- •Pulling Focus
- •Advanced Camera Rigging and Supports
- •Viewing Video on the Set
- •Double-System Audio Recording
- •How to Record Double-System Audio
- •Multi-Cam Shooting
- •Multi-Cam Basics
- •Challenges of Multi-Cam Shoots
- •Going Tapeless
- •On-set Media Workstations
- •Media Cards and Workflow
- •Organizing Media on the Set
- •Audio Media Workflow
- •Shooting Blue-Screen Effects
- •11 Editing Gear
- •Setting Up a Workstation
- •Storage
- •Monitors
- •Videotape Interface
- •Custom Keyboards and Controllers
- •Backing Up
- •Networked Systems
- •Storage Area Networks (SANs) and Network-Attached Storage (NAS)
- •Cloud Storage
- •Render Farms
- •Audio Equipment
- •Digital Video Cables and Connectors
- •FireWire
- •HDMI
- •Fibre Channel
- •Thunderbolt
- •Audio Interfaces
- •Know What You Need
- •12 Editing Software
- •The Interface
- •Editing Tools
- •Drag-and-Drop Editing
- •Three-Point Editing
- •JKL Editing
- •Insert and Overwrite Editing
- •Trimming
- •Ripple and Roll, Slip and Slide
- •Multi-Camera Editing
- •Advanced Features
- •Organizational Tools
- •Importing Media
- •Effects and Titles
- •Types of Effects
- •Titles
- •Audio Tools
- •Equalization
- •Audio Effects and Filters
- •Audio Plug-In Formats
- •Mixing
- •OMF Export
- •Finishing Tools
- •Our Software Recommendations
- •Know What You Need
- •13 Preparing to Edit
- •Organizing Your Media
- •Create a Naming System
- •Setting Up Your Project
- •Importing and Transcoding
- •Capturing Tape-based Media
- •Logging
- •Capturing
- •Importing Audio
- •Importing Still Images
- •Moving Media
- •Sorting Media After Ingest
- •How to Sort by Content
- •Synchronizing Double-System Sound and Picture
- •Preparing Multi-Camera Media
- •Troubleshooting
- •14 Editing
- •Editing Basics
- •Applied Three-Act Structure
- •Building a Rough Cut
- •Watch Everything
- •Radio Cuts
- •Master Shot—Style Coverage
- •Editing Techniques
- •Cutaways and Reaction Shots
- •Matching Action
- •Matching Screen Position
- •Overlapping Edits
- •Matching Emotion and Tone
- •Pauses and Pull-Ups
- •Hard Sound Effects and Music
- •Transitions Between Scenes
- •Hard Cuts
- •Dissolves, Fades, and Wipes
- •Establishing Shots
- •Clearing Frame and Natural “Wipes”
- •Solving Technical Problems
- •Missing Elements
- •Temporary Elements
- •Multi-Cam Editing
- •Fine Cutting
- •Editing for Style
- •Duration
- •The Big Picture
- •15 Sound Editing
- •Sounding Off
- •Setting Up
- •Temp Mixes
- •Audio Levels Metering
- •Clipping and Distortion
- •Using Your Editing App for Sound
- •Dedicated Sound Editing Apps
- •Moving Your Audio
- •Editing Sound
- •Unintelligible Dialogue
- •Changes in Tone
- •Is There Extraneous Noise in the Shot?
- •Are There Bad Video Edits That Can Be Reinforced with Audio?
- •Is There Bad Audio?
- •Are There Vocal Problems You Need to Correct?
- •Dialogue Editing
- •Non-Dialogue Voice Recordings
- •EQ Is Your Friend
- •Sound Effects
- •Sound Effect Sources
- •Music
- •Editing Music
- •License to Play
- •Finding a Composer
- •Do It Yourself
- •16 Color Correction
- •Color Correction
- •Advanced Color Controls
- •Seeing Color
- •A Less Scientific Approach
- •Too Much of a Good Thing
- •Brightening Dark Video
- •Compensating for Overexposure
- •Correcting Bad White Balance
- •Using Tracks and Layers to Adjust Color
- •Black-and-White Effects
- •Correcting Color for Film
- •Making Your Video Look Like Film
- •One More Thing
- •17 Titles and Effects
- •Titles
- •Choosing Your Typeface and Size
- •Ordering Your Titles
- •Coloring Your Titles
- •Placing Your Titles
- •Safe Titles
- •Motion Effects
- •Keyframes and Interpolating
- •Integrating Still Images and Video
- •Special Effects Workflow
- •Compositing 101
- •Keys
- •Keying Tips
- •Mattes
- •Mixing SD and HD Footage
- •Using Effects to Fix Problems
- •Eliminating Camera Shake
- •Getting Rid of Things
- •Moving On
- •18 Finishing
- •What Do You Need?
- •Start Early
- •What Is Mastering?
- •What to Do Now
- •Preparing for Film Festivals
- •DIY File-Based Masters
- •Preparing Your Sequence
- •Color Grading
- •Create a Mix
- •Make a Textless Master
- •Export Your Masters
- •Watch Your Export
- •Web Video and Video-on-Demand
- •Streaming or Download?
- •Compressing for the Web
- •Choosing a Data Rate
- •Choosing a Keyframe Interval
- •DVD and Blu-Ray Discs
- •DVD and Blu-Ray Compression
- •DVD and Blu-Ray Disc Authoring
- •High-End Finishing
- •Reel Changes
- •Preparing for a Professional Audio Mix
- •Preparing for Professional Color Grading
- •Putting Audio and Video Back Together
- •Digital Videotape Masters
- •35mm Film Prints
- •The Film Printing Process
- •Printing from a Negative
- •Direct-to-Print
- •Optical Soundtracks
- •Digital Cinema Masters
- •Archiving Your Project
- •GLOSSARY
- •INDEX
Timecode (continued)
If you’re shooting with a camera that doesn’t have timecode, such as a DSLR, then each shot will start at zero. But luckily, your camera will still record metadata in each video file that includes timestamp information that your computer can use in postproduction.
Time-of-day timecode and correct time stamps in the metadata can be very useful in postproduction, so make sure your camera’s internal clock is set to the correct time and date.
For a more detailed discussion of timecode and also key code for film, check out the timecode document in the Chapter 3 folder at www.thedigitalfilmmakinghandbook.com.
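If you're curious about exactly what timestamp metadata your camera wrote into a clip, a free tool such as ffprobe (part of the ffmpeg package) can show you. The following Python sketch is one way to pull it out; it assumes ffprobe is installed, "clip.mov" is a hypothetical filename, and your camera may use slightly different tag names.

```python
# A minimal sketch: reading timestamp metadata from a camera file with ffprobe.
# Assumes ffmpeg/ffprobe is installed; "clip.mov" is a hypothetical filename.
import json
import subprocess

def read_clip_metadata(path):
    # Ask ffprobe to report container-level metadata as JSON.
    result = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_format", "-show_streams", path],
        capture_output=True, text=True, check=True)
    info = json.loads(result.stdout)
    tags = info.get("format", {}).get("tags", {})
    # Cameras typically store the shot's wall-clock time here; some also
    # write a "timecode" tag, depending on the format.
    return tags.get("creation_time"), tags.get("timecode")

creation_time, timecode = read_clip_metadata("clip.mov")
print("Recorded at:", creation_time, "| Embedded timecode:", timecode)
```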
Digital Image Quality
Let’s face it, the primary reason most of us care about all this techno stuff is that we care about image quality. All of us want our projects to look as good as possible. And even though all types of HD video have the potential for very high image quality, in practice it doesn’t always work out that way.
That’s because as you work to create your film, the image you first record will go through a series of processes, all of which affect image quality (see Figure 3.7).
Figure 3.7
The image quality food chain: subject (including lighting, location, and production design), camera (lens, sensor, digitization, acquisition format), editing (intermediate format[s] and delivery format), and display.
That brings us to a very simple question: why is there a difference between shooting 1080/24p with an HDV camera and shooting 1080/24p with a Sony XDCAM? After all, they share the same image size and frame rate. Lens quality and image sensors in the camera are part of the reason, but the more significant reason is compression. If you have an HDTV, you’ve probably already noticed how much difference in quality there can be from one show to another. Sports broadcasts typically look significantly better than the HD commercials that air alongside them, and often much better than dramatic shows, local newscasts, or documentaries. This is due in large part to the different formats used, each of which uses a different type of compression.
Full-resolution digital video files are very large. A single frame of 1080p HD video contains over 2 million pixels. Multiply that by 24 frames in a second, and suddenly you have a lot of data to store and manipulate. Computers today are very powerful, but dealing with images of this size is still a challenge. That’s where compression steps in. The software that handles
this task is called a codec, short for COmpressor/DECompressor. Codecs discard or diminish (mostly) unnecessary visual information to reduce the total amount of video data, which helps ensure that your video files can be captured and viewed in real time.
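To get a feel for the numbers involved, here’s a rough back-of-the-envelope sketch in Python (assuming uncompressed 8-bit RGB at 1080p and 24 fps; real recording formats vary) showing why uncompressed video is so unwieldy:

```python
# A back-of-the-envelope sketch of why compression is needed: the raw data
# rate of uncompressed 1080p24 video, assuming 8 bits per channel of RGB.
width, height = 1920, 1080           # one 1080p frame
bytes_per_pixel = 3                  # 8-bit R, G, and B
fps = 24

pixels_per_frame = width * height                     # 2,073,600 pixels
bytes_per_frame = pixels_per_frame * bytes_per_pixel  # about 6.2 MB per frame
bytes_per_second = bytes_per_frame * fps              # about 149 MB per second
gb_per_minute = bytes_per_second * 60 / 1e9           # roughly 9 GB per minute

print(f"{pixels_per_frame:,} pixels per frame")
print(f"{bytes_per_second / 1e6:.0f} MB/s uncompressed")
print(f"{gb_per_minute:.1f} GB per minute uncompressed")
```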
As you create your project, you’ll encounter various types of compression along the way. Compression happens in the camera as video is captured, or acquired. It can also happen when you import, or transcode, media into your editing application. And it happens again when you output or master your final project to various digital formats such as videos for the Web, Blu-ray Discs, and uncompressed HD for digital projection, to name only a few of the many options.
Earlier, we talked about the various components of digital video (tracks, frames, scan lines, pixels, and audio samples), but digital video is more than just the sum of these parts. That’s because every digital video compression scheme, or algorithm, determines how color is handled (color sampling and bit-depth), exactly how much compression is used (compression ratios), and how quickly the information captured is moved around (data rate).
Color Sampling
In grade school, you might have learned about the primary colors that can be mixed together to create all other colors. What they probably never explained in school was that those are the primary subtractive colors, or the primary colors of ink. When you talk about the color of light, things work a bit differently.
Red, green, and blue are the three primary colors of light; you can create any other color of light by mixing those primaries together. However, whereas mixing primary colors of ink together results in a darker color, mixing light creates a lighter color. Mix enough ink, and you eventually get black, but if you mix enough light, you eventually get white. (See Color Plates 1 and 2.)
Video cameras and televisions create color by mixing the three primary colors of light. In addition to red (R), green (G), and blue (B), the video signal has another element, which is lightness, or luminance (Y). Your camera sees each of these four elements as a separate, continuous analog wave, and a digital camera must first convert those waves into numbers through a process called sampling. Each wave is broken into a series of discrete samples that can be stored digitally as 0s and 1s. The denser the samples, the better the perceived quality of the image will be (see Figure 3.8).
Figure 3.8
As the number of samples increases from left to right, the image becomes clearer.
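If you want to see the same idea in numbers rather than pictures, the following rough Python sketch samples one cycle of a simple wave at increasing densities and measures how far the sampled version strays from the original; as in Figure 3.8, more samples mean a closer match. (The wave and the sample counts here are arbitrary, chosen only for illustration.)

```python
# A minimal sketch of sampling: approximating a continuous wave with more and
# more samples. Denser sampling brings the stored version closer to the original.
import math

def max_sampling_error(num_samples, reference_points=10000):
    # Sample one cycle of a sine wave at evenly spaced points, then measure how
    # far a "hold the nearest sample" reconstruction strays from the true wave.
    samples = [math.sin(2 * math.pi * i / num_samples) for i in range(num_samples)]
    worst = 0.0
    for j in range(reference_points):
        t = j / reference_points
        nearest = samples[round(t * num_samples) % num_samples]
        worst = max(worst, abs(math.sin(2 * math.pi * t) - nearest))
    return worst

for n in (8, 32, 128, 512):
    print(f"{n:4d} samples -> worst-case error {max_sampling_error(n):.4f}")
```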
When a digital camera samples an image, the degree to which it samples each primary color is called the color sampling ratio. A fully uncompressed video signal (also known as RGB color) has a color sampling ratio of 4:4:4. The first number stands for the luminance, or luma, signal (abbreviated Y'), and the next two numbers stand for the color difference components (Cb and Cr), which together add up to the full chroma signal. 4:4:4 means that for every four luma samples, four samples are also taken of each of the two chroma signals; in other words, no color information is discarded.
To make the resulting data smaller, and therefore more manageable, usually about half of the color information is discarded. Uncompressed digital cinema formats can have a color sampling ratio of 4:4:4, but many very high-quality digital video formats actually throw out half of the color information. These formats have a color sampling ratio of 4:2:2, which means that for every four luma samples, there are two samples of each chroma component. The human eye is more sensitive to differences in light and dark (luminance) than to differences in color (chrominance). In theory, the discarded color information is detail that the human eye cannot perceive, so it’s worth throwing it out for the sake of saving storage space. (See Color Plate 3.)
The color sampling ratio of most professional HD formats is 4:2:2. Some HD formats, like HDV, use 4:2:0 color sampling, which cuts the color information of 4:2:2 in half again, leaving only a quarter of the full chroma detail, a reduction that is generally considered visible to the viewer. For more about light and color, see Chapter 6, “Lighting,” and Chapter 16, “Color Correction.”
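To put those ratios in concrete terms, here’s a rough Python sketch, assuming a 1920 × 1080 frame and 8 bits (1 byte) per sample, that tallies how much data each sampling scheme stores per frame:

```python
# A rough sketch of what chroma subsampling saves, assuming a 1920x1080 frame
# and 1 byte per sample. In the J:a:b notation, "a" and "b" are the chroma
# samples taken in the first and second rows of a 4-pixel-wide, 2-row block.
def frame_bytes(a, b, width=1920, height=1080):
    luma = width * height               # one luma sample per pixel
    chroma_fraction = (a + b) / 8.0     # 8 pixels in each 4x2 block
    chroma = 2 * luma * chroma_fraction # two chroma channels (Cb and Cr)
    return luma + chroma

for name, (a, b) in {"4:4:4": (4, 4), "4:2:2": (2, 2), "4:2:0": (2, 0)}.items():
    print(f"{name}: {frame_bytes(a, b) / 1e6:.2f} MB per frame")
```

Run it and you’ll see that 4:2:2 stores about two-thirds as much data per frame as 4:4:4, and 4:2:0 about half.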
Bit Depth
Digital video usually has a bit depth of 8, 10, or 12 bits. Digital devices speak in ones and zeros (two “digits,” hence the term digital). A single one or zero is called a bit, and bits can be grouped together to represent a single number. When it comes time to assign a numeric value to the color of a particular pixel, the number of bits used to make up that number becomes something of an issue. With more bits, you can record a greater range of numbers, which means you can represent more colors. The bit depth of a particular format refers to how many bits are used to represent the color of each pixel and is also sometimes referred to as quantization.
Basically, it’s just the same as boxes of crayons. If you only have eight crayons, the pictures you draw don’t have nearly as much color detail and variation as if you have a box of 64 crayons. Similarly, if you only have 8 bits available for representing the color of each pixel, you don’t have nearly as much color detail and variation to work with as if you have 10 bits per pixel.
Higher-end cameras capture three streams of video, one for each primary color: red, green, and blue (see Color Plate 4). Each channel usually gets 8 bits per pixel, for a total of 24 bits, or 10 bits per pixel, for a total of 30 bits, which is partly why cameras with three chips are touted above those with only one. (In Chapter 4, “Choosing a Camera,” there is in-depth information about cameras and image quality.) For blue-screen work, other compositing tasks, or projects where you really want to be able to manipulate the color of your final image, a format that uses a higher bit depth will allow higher-quality, cleaner adjustments.
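For the arithmetic behind those totals, here’s a quick Python sketch (assuming the same bit depth for each of the three color channels) that counts the shades per channel and the total colors each bit depth can represent:

```python
# A quick sketch of how bit depth translates into available colors,
# assuming the same number of bits per channel for R, G, and B.
for bits_per_channel in (8, 10, 12):
    levels = 2 ** bits_per_channel    # shades available per channel
    total_colors = levels ** 3        # every R x G x B combination
    print(f"{bits_per_channel}-bit per channel: {levels:,} levels, "
          f"{total_colors:,} possible colors "
          f"({bits_per_channel * 3} bits per pixel)")
```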
