Chris Portal attended the Boston Final Cut Pro Users Supermeet and reports:

Walter Murch, a longtime Final Cut Pro user and editor of Apocalypse Now, The Godfather Part III, The English Patient, Cold Mountain, and Tetro, among many other films, headlined the Boston Supermeet on Thursday, October 27, 2011. It marked his first public appearance since the launch of Final Cut Pro X.

Hemingway & Gellhorn, his latest project, is for HBO and is being edited on Final Cut Pro 7. It is a celebration of the tactility of film, yet one that would not have been possible without digitization. It uses archive material existing on a wide variety of film stocks, all with different grain sizes, into which actors are dropped digitally while the grain of the original element is preserved. The film takes you on a roller-coaster ride, diving in and out of this world: into the grain and sprockets, then back out into the digital world.

His Final Cut Pro project consisted of 22 video tracks and 50 audio tracks, combining sound elements ranging from 8 tracks of dialogue to 24 tracks of mono and stereo sound effects, with and without low-frequency effects (LFE)!

Another piece of the workflow was the integration of FileMaker Pro, which he uses to gain a different insight into his film. Using a dependency diagram of sorts, he associates every shot with a specific scene, the music and effects that should belong to it, and so on. It's not a timeline in any way, but rather a view of all the relationships between the media assets.
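To make the idea concrete, here is a minimal sketch of that kind of relational view: shot records linked to a scene and to the cues that belong with it, queried by relationship rather than by timeline position. All record names and fields here are invented for illustration; they are not Murch's actual FileMaker schema.

```python
# Hypothetical shot records: each links a shot to its scene and to the
# music and effects cues associated with it (illustrative data only).
shots = [
    {"shot": "12A", "scene": 12, "music": "cue_07", "effects": ["crowd", "artillery"]},
    {"shot": "12B", "scene": 12, "music": "cue_07", "effects": ["glass_break"]},
    {"shot": "31C", "scene": 31, "music": None, "effects": ["wind"]},
]

def assets_for_scene(scene):
    """Collect every music cue and effect related to one scene."""
    music = {s["music"] for s in shots if s["scene"] == scene and s["music"]}
    effects = {e for s in shots if s["scene"] == scene for e in s["effects"]}
    return music, effects

music, effects = assets_for_scene(12)
print(sorted(music))    # cues shared across the scene's shots
print(sorted(effects))  # every effect the scene depends on
```

The point of such a view is exactly what Murch described: you can ask "what does scene 12 depend on?" without ever touching the edit itself.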

As for other equipment on the project, Walter used two Arri Alexas outputting to Codex recorders. The Codex material was downloaded as ProRes LT (1280) for editing, as DPX "negative" files (for the final color timing), and as H.264 shared over the internet via PIX (to send assemblies to HBO). There were five editing stations using an Xsan with 28 TB of Xserve RAID storage, served by an Xserve.

There were 1,862 shots in the finished film, 482 of which were visually manipulated:

  • 227 visual effects shots
  • 255 repositioned or blown up

There used to be a rule of thumb not to blow up an image beyond 120%, to avoid introducing noise and grain; with the Alexa footage, he was able to blow shots up to 240% without it being noticeable.

He used FCP7, and acknowledged it may be the last time he uses Final Cut Pro. He believes many professionals are at a juncture where they need to come to terms with what the software can do in the time their film is being developed.

Walter was in Cupertino when Final Cut Pro X was first dangled in front of a few editors. It was a beta version, and Apple highlighted things like 64-bit support. After that initial exposure to FCPX, he dove into making a film, and it wasn't until FCPX was released in June that he revisited it. He took a quick look and concluded he couldn't use it, wondering where the “Pro” had gone. It lacked the XML support he depended on, the ability to share projects on a RAID with other people, and so on. He was confused and wondered what was happening.

He wrote Apple a letter asking what was behind it all, especially since they had end-of-lifed the current version, and included a list of things he needed. Echoing a line from children's report cards, Walter explained to Apple that without XML, FCPX “did not play well with others”. The lack of tracks was another deal-breaker for him: while he doesn't really need to work with 50 tracks, he does need the ability to selectively raise or lower levels with great precision.

Walter sees a shift at Apple over the last 10 years. They have benefited from the professional market, and we have all made a lot of noise about Apple, but starting with the iPod, iPhone, and iPad, Apple has broadened into a mass-market creature, wanting to democratize these capabilities even further.

While Walter is encouraged by last month's FCPX update, he hasn't used it on any real work yet, so he is cautiously optimistic (and, he says, still traumatized). “Do they love us? No… I know they like us… but they keep saying they love us??”

Things wrapped up with a Q&A, mostly made up of questions attendees had submitted that evening before his talk. A few interesting ones:

Q: When is it time to walk away from the work?

A: “When you see dailies, that is the only time you are seeing the images for the first time. There will be no other time for a first. It is the closest you can get to experiencing what the audience will experience. It’s a precious moment. I will sit and watch the dailies in the dark, holding a computer where I’ll type anything the image makes me feel or think, in order to preserve that first moment. Doing so will help clear the fog down the road when you feel you’re getting lost.”

Q: How do you know if a scene works or doesn’t?

A: “A scene may work on its own, but not in the context of the movie. It can be very dangerous to preemptively strike a scene from a film before you’ve seen the entire film. You can say you don’t agree with where the scene is going, but you don’t know if in the larger picture it may still have a shot.”

Q: Is there one piece of advice you can impart to sound designers?

A: “Always go farther than you think you can go. Try to bend the literalness. Literalness doesn’t light the fire in the audience’s mind. Levitate the film. Ignite the imagination.”

Q: Thoughts on 3D?

A: “In 2D, your eyes focus on the plane of the screen while they converge toward the plane of the screen, but when you have something coming out of the screen in 3D, you not only need to focus on the screen, but you also need to converge on the detail protruding out of the screen. The mind can do it, but we’re not programmed for it. It requires processing many frames before your mind figures it out, and by then you’ve missed information. It’s analogous to the moment when the fan on your computer starts up.”

Q: If you didn’t use FCP, where would you go?

A: “I’ve used Avid in the past, so I know it well. There are some very good things that Avid has, but I’m also curious about Premiere since I’m interested in technology.”
