Deep Thoughts on Multiple-Camera Projects

I recently shot my 20th multi-camera project, a concert by jazz singer René Marie. After you do anything 20 times, you have a good idea what you’re doing and why. As it turned out, this experience helped crystallize my thoughts to a level that I hope will benefit others who shoot and edit with multiple cameras.

Before the event, I met with my crew to discuss the rules of the shoot, the most important of which (with any event shot with multiple cameras) is never to stop the camera until the end of a set or other natural break. This limits the number of times you have to synch the streams during editing.
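To make that synching step concrete, here’s a minimal sketch, in Python, of lining up two cameras by cross-correlating their audio tracks. This is my own illustration rather than part of the workflow described here; it assumes each camera’s audio has been exported as a mono WAV at the same sample rate, and the file names are hypothetical.

```python
# Minimal audio-sync sketch (an illustration, not the workflow from the article).
# Assumes mono WAV files at the same sample rate; file names are hypothetical.
import numpy as np
from scipy.io import wavfile
from scipy.signal import correlate

def sync_offset_seconds(ref_wav, other_wav, window_s=60):
    """Estimate how many seconds 'other' must slip to line up with 'ref'."""
    rate_a, a = wavfile.read(ref_wav)
    rate_b, b = wavfile.read(other_wav)
    assert rate_a == rate_b, "resample first so both tracks share a sample rate"
    # Correlating the first minute is plenty to find a sync point, and much faster.
    a = a[: int(window_s * rate_a)].astype(np.float32)
    b = b[: int(window_s * rate_b)].astype(np.float32)
    corr = correlate(a, b, mode="full", method="fft")
    lag = int(np.argmax(corr)) - (len(b) - 1)   # offset in samples
    return lag / rate_a

if __name__ == "__main__":
    offset = sync_offset_seconds("safe_cam.wav", "handheld_cam.wav")
    print(f"Slip the handheld clip {offset:+.3f} s to match the safe camera")
```

Most modern NLEs can do this kind of audio-based sync for you, but the point stands: the fewer start/stop points you record, the fewer offsets you have to find and verify.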

Rule number two is that even though I had one “safe” camera covering the entire stage, the individual videographers should shoot so that their video is always usable. No jump zooms or pans to different locations; all movements should be measured and smooth. This helps if/when my safe camera breaks, and also helps ensure that no bad video sneaks into the final cut.

We covered sight lines and camera angles; Scott, shooting from the left, couldn’t see the bass player or drummer, so Gary, on the right, would prioritize those. I was the boring “safe” guy in the back, next to Chuck, freelancing handheld with the XL H1 and its wonderful 20X lens. We also discussed my preference to minimize close-ups early in most songs and prioritize them towards the end.

Finally, I advised them to keep the camera moving; stay on a shot for 5-10 seconds or so, then slowly move in or out, left or right. Even with the flexibility and versatility that a multiple-camera shoot gives you in the final cut, a static camera can quickly get boring.

I wasn’t familiar with René before the concert, but she blew me away with her soulful original music and evocative covers, like Bob Seger’s “Turn the Page,” all performed with the charisma of a Broadway actress. Lighting in Galax’s Rex Theater was poor that day, which hurt us on the visual side, but Cliff, the local sound guy, absolutely nailed the audio, which was crisp, clear, and almost studio-quality.

When I heard the audio, I was psyched. I can fix, or at least minimize, the effects of bad lighting, but it’s really hard to improve bad audio in post. Besides, in my experience, performers care much more about audio quality than video quality.

When I started editing René’s show, however, I felt a new responsibility and reverence, a sense that I had the raw materials to produce an absolutely stunning audio/visual experience, like someone had dropped footage from a ’50s Sinatra concert in my hands and said, “Here, see what you can do with this.”

When I first started editing multiple-camera shoots, I switched camera angles to eliminate bad camera work and otherwise whenever it “felt” right. While this approach had evolved over time, the quality of René’s performance made my early edits seem crude, and I quickly noticed that indiscriminate switching got in the way of the performance rather than enhancing my presentation of it.

Over the weeks it took to complete the edit, I formulated the following rules, some old, some new, which I share as a 0.9 work-in-progress release; comments are definitely appreciated. The first rule, of course, was still to switch cameras to eliminate bad camera work, though my polished crew delivered very little of that.

Second was never to interrupt the performance. This meant waiting to switch cameras until René had finished a phrase or a gesture, rather than in mid-speech or motion. In retrospect, this was probably the rule I had violated most in the past.

Third was to avoid switching between similar views, say from a close-up of René to another close-up, which felt gratuitous. Rather, I tried to switch only when I had a different view, say of the bass player or drummer, or a view of René from a completely different angle. In other words, don’t switch just because you can; switch because you have a new angle to show the viewer.

Fourth was to develop a custom style for camera views during each song and stick to it; for example, beginning and ending each song with a full view of the stage. As described earlier, I avoided close-ups early in most songs, and tended to favor them towards the end, or during mid-stream emotional highlights.

Fifth was to match editing tempo to music tempo, switching more quickly with fast songs, and less frequently with slower songs. At times I tried to use switching to enhance the viewing experience, for example switching very quickly on-tempo during fast songs, or maintaining the same camera angle for 30-40 seconds during slow, mournful songs.
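As a back-of-the-envelope illustration of the tempo-matching idea (my own arithmetic, not a formula from the edit), you can pick how many beats to hold each angle and let the song’s BPM set the actual hold time:

```python
# Rough pacing arithmetic (an illustration of the tempo-matching idea only).
def hold_time_seconds(bpm, beats_per_shot=8):
    """How long to stay on one camera angle if you switch every N beats."""
    return beats_per_shot * (60.0 / bpm)

if __name__ == "__main__":
    for bpm in (70, 120, 180):   # slow ballad, mid-tempo, up-tempo
        print(f"{bpm:3d} BPM: hold each angle about {hold_time_seconds(bpm):.1f} s")
```

For slow, mournful songs, raising beats_per_shot to 32-40 at around 60-70 BPM works out to roughly 27-40 seconds per angle, in the neighborhood of the 30-40 second holds mentioned above.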

Sixth was matching transitions to tempo. On very fast songs, I used straight cuts between changes in camera angles. On all other songs, I used dissolves ranging from 7 frames to 45 frames, customized by song and location within the song. If you’re switching camera angles in Adobe Premiere, a killer keystroke combination is Page Down, to jump to the next camera angle, and then Ctrl+D, to insert the default transition. Check your NLE for similar time-saving keystrokes.
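If you ever build the same dissolve outside the NLE, the frame-based thinking translates directly. Here’s a small sketch, assuming an FFmpeg build recent enough to include the xfade filter, that converts a frame count to seconds and assembles the command; the file names and the 10-second offset are hypothetical, and it handles video only.

```python
# Dissolve-length sketch (an illustration; not the Premiere workflow above).
# Assumes FFmpeg 4.3+ with the xfade filter; file names are hypothetical, and
# both clips must share resolution and pixel format. Video only (no audio crossfade).
def dissolve_seconds(frames, fps=29.97):
    return frames / fps

def xfade_command(clip_a, clip_b, out, dissolve_frames=15, fps=29.97, offset_s=10.0):
    dur = dissolve_seconds(dissolve_frames, fps)
    filt = f"xfade=transition=fade:duration={dur:.3f}:offset={offset_s:.3f}"
    return ["ffmpeg", "-i", clip_a, "-i", clip_b, "-filter_complex", filt, "-an", out]

if __name__ == "__main__":
    print(" ".join(xfade_command("angle1.mp4", "angle2.mp4", "dissolve.mp4")))
```

At 29.97 fps, the 7-to-45-frame range above works out to roughly 0.23 to 1.5 seconds.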

Most importantly, I realized that changing camera angles is one of the most significant artistic decisions an editor makes during a live performance, whether concert, interview, or wedding. Accordingly, I built in a review cycle specifically for camera angle switches, testing each switch multiple times and making many minor adjustments.
