Archive for the ‘DI Workflow’ Category

Walter Murch on the demise of FCP

Chris Portal attended the Boston Supermeet of Final Cut Pro users and reports:

Walter Murch, a longtime Final Cut Pro user and editor of Apocalypse Now, The Godfather Part III, The English Patient, Cold Mountain and Tetro, among many other films, headlined the Boston Supermeet on Thursday, October 27, 2011. It marked his first public appearance since the launch of Final Cut Pro X.

Hemingway & Gellhorn is his latest project for HBO, and it is edited on Final Cut Pro 7. The film is a celebration of the tactility of film, yet one that wouldn’t have been possible without digitization. It uses archive material shot on a wide variety of film stocks, all with different grain sizes, into which actors are dropped digitally while preserving the grain of the original element. The film takes you on a roller-coaster ride in and out of this world, diving into the grain and sprockets and back out into the digital world.

His Final Cut Pro project consisted of 22 video tracks and 50 audio tracks, combining sound elements ranging from 8 tracks of dialogue, to 24 tracks of mono and stereo sound effects with and without low frequency enhancements (LFE)!!

Another piece of the workflow was the integration of FileMaker Pro, which he uses to gain a different insight into his film. Using a dependency diagram of sorts, he associates every shot with a specific scene, the music and effects that belong to it, and so on. It’s not a timeline in any way, but rather a view of all the relationships between the film’s media assets.

As far as other equipment Walter used on the project, he shot with two Arri Alexas, recording to Codex media. The Codex material was downloaded as ProRes LT (1280) files for editing, DPX files as the digital “negative” (for the final color timing), and H.264 files sent over the internet via PIX (to share assemblies with HBO). There were five editing stations, sharing an Xsan with 28 TB of Xserve RAID storage.

There were 1862 shots in the finished film:

  • 482 visually manipulated
  • 227 visual effects
  • 255 repositioned or blown up

While there used to be a rule of not blowing up an image beyond 120% to avoid introducing noticeable noise and grain, with the Alexa footage he was able to blow shots up to 240% without the enlargement being noticeable.
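The blow-up limit is simple geometry: enlarging a frame by N percent means only 1/N of its width and height remains on screen, so pixel density falls with the square of the scale factor. A minimal sketch of that arithmetic (the 2880x1620 source resolution is an assumption for illustration, not a figure from the talk):

```python
# Illustrative sketch (not from the article): how a blow-up factor
# reduces the effective pixel count of a frame. The resolution used
# below is an assumed example, not a production spec.

def effective_resolution(width, height, blowup_percent):
    """Return the portion of the source frame that fills the screen
    after a blow-up, i.e. the pixels actually contributing detail."""
    scale = blowup_percent / 100.0
    return round(width / scale), round(height / scale)

# A hypothetical 2880x1620 source frame:
print(effective_resolution(2880, 1620, 120))  # the old 120% rule of thumb
print(effective_resolution(2880, 1620, 240))  # the 240% Alexa blow-ups
```

At 240%, only a 1200x675 window of that hypothetical frame is doing the work, which is why the cleanliness of the Alexa image matters so much.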

He used FCP7 on the film, and acknowledged it may be the last time he uses Final Cut Pro. He considers many professionals to be at a juncture where they need to come to terms with what the software can do within the time a film is being developed.

Walter was in Cupertino when Final Cut Pro X was first dangled in front of a few editors. It was a beta version, and Apple highlighted things like 64-bit support. After that initial exposure to FCPX, he dove into making a film, and it wasn’t until FCPX was released in June that he revisited it. He quickly looked at it and said he couldn’t use it, wondering where the “Pro” had gone. It didn’t have the XML support he depended on, the ability to share projects on a RAID with other people, and so on. He was confused and wondered what was happening.

He wrote Apple a letter asking what was behind everything that was happening, especially since they had end-of-lifed the current version, along with a list of things he needed. Borrowing a phrase from children’s report cards, Walter explained to Apple that without XML, FCPX “did not play well with others.” The lack of tracks was another killer for him. While he doesn’t really need to work with 50 tracks, he does need the ability to selectively raise or lower levels very precisely.

Walter sees a shift having taken place at Apple over the last 10 years. The company has benefited from the professional market, and we have all made a lot of noise about Apple, but starting with the iPod, iPhone, and iPad, Apple has broadened into a mass-market creature, wanting to democratize capabilities even further.

While Walter is encouraged by last month’s FCPX update, he hasn’t used it on any real work yet, so he is cautiously optimistic (and, he says, still traumatized). “Do they love us? No… I know they like us… but they keep saying they love us?”

Things wrapped up with a Q&A, mostly comprised of questions attendees had submitted that evening prior to his talk. A few interesting ones were:

Q: When is it time to walk away from the work?

A: “When you see dailies, that is the only time you are seeing the images for the first time. There will be no other time for a first. It is the closest you can get to experiencing what the audience will experience. It’s a precious moment. I will sit and watch the dailies in the dark, holding a computer where I’ll type anything the image makes me feel or think, in order to preserve that first moment. Doing so will help clear the fog down the road when you’re feeling you’re getting lost.”

Q: How do you know if a scene works or doesn’t?

A: “A scene may work on its own, but not in the context of the movie. It can be very dangerous to preemptively strike a scene from a film before you’ve seen the entire film. You can say you don’t agree with where the scene is going, but you don’t know if in the larger picture it may still have a shot.”

Q: Is there one piece of advice you can impart to sound designers?

A: “Always go farther than you think you can go. Try to bend the literalness. Literalness doesn’t light the fire in the audience’s mind. Levitate the film. Ignite the imagination.”

Q: Thoughts on 3D?

A: “In 2D, your eyes focus on the plane of the screen while they converge toward the plane of the screen, but when you have something coming out of the screen in 3D, you not only need to focus on the screen, you also need to converge on the detail protruding from the screen. The mind can do it, but we’re not programmed for it. It requires processing many frames before your mind figures it out, and by then you’ve missed information. It’s analogous to the moment when the fan on your computer starts up.”

Q: If you didn’t use FCP, where would you go?

A: “I’ve used Avid in the past, so I know it well. There are some very good things that Avid has, but I’m also curious about Premiere since I’m interested in technology.”

The End of the Line for Film Cameras

While the debate has raged over whether or not film is dead, ARRI, Panavision and Aaton have quietly ceased production of film cameras within the last year to focus exclusively on the design and manufacture of digital cameras. That’s right: someone, somewhere in the world is now holding the last film camera ever to roll off the line.

“The demand for film cameras on a global basis has all but disappeared,” says ARRI VP of Cameras, Bill Russell, who notes that the company has only built film cameras on demand since 2009. “There are still some markets–not in the U.S.–where film cameras are still sold, but those numbers are far fewer than they used to be. If you talk to the people in camera rentals, the amount of film camera utilization in the overall schedule is probably between 30 to 40 percent.”


Mary Pickford on the beach, circa 1916, with a film movie camera

At New York City rental house AbelCine, Director of Business Development/Strategic Relationships Moe Shore says the company rents mostly digital cameras at this point. “Film isn’t dead, but it’s becoming less of a choice,” he says. “It’s a number of factors all moving in one direction, an inexorable march of digital progress that may be driven more by cell phones and consumer cameras than the motion picture industry.”

Aaton founder Jean-Pierre Beauviala notes why. “Almost nobody is buying new film cameras. Why buy a new one when there are so many used cameras around the world?” he says. “We wouldn’t survive in the film industry if we were not designing a digital camera.”

Beauviala believes that stereoscopic 3D has “accelerated the demise of film.” He says, “It’s a nightmare to synchronize two film cameras.” Three years ago, Aaton introduced a new 35mm film camera, Penelope, but sold only 50 to 60 of them. As a result, Beauviala turned to creating a digital Penelope, which will be on the market by NAB 2012. “It’s a 4K camera and very, very quiet,” he tells us. “We tried to give a digital camera the same ease of handling as the film camera.”

Panavision is also hard at work on a new digital camera, says Phil Radin, Executive VP, Worldwide Marketing, who notes that Panavision built its last 35mm Millennium XL camera in the winter of 2009, although the company continues an “active program of upgrading and retrofitting of our 35mm camera fleet on an ongoing basis.”

“I would have to say that the pulse [of film] was weakened and it’s an appropriate time,” Radin remarks. “We are not making film cameras.” He notes that the creative industry is reveling in the choices available. “I believe people in the industry love the idea of having all these various formats available to them,” he says. “We have shows shooting with RED Epics, ARRI Alexas, Panavision Genesis and even the older Sony F-900 cameras. We also have shows shooting 35mm and a combination of 35mm and 65mm. It’s a potpourri of imaging tools now available that have never existed before, and an exciting time for cinematographers who like the idea of having a lot of tools at their disposal to create different tools and looks.”

Do camera manufacturers believe film will disappear? “Eventually it will,” says ARRI’s Russell. “In two or three years, it could be 85 percent digital and 15 percent film. But the date of the complete disappearance of film? No one knows.”

From Radin’s point of view, the question of when film will die “can only be answered by Kodak and Fuji. Film will be around as long as Kodak and Fuji believe they can make money at it,” he says.

FILM PRINTS GO UP IN SMOKE
Neither Kodak nor Fuji has made noises about the end of film stock manufacture, but there are plenty of signs that making film stock has become ever less profitable. The need for film release prints has plummeted in the last year and, in an unprecedented move, Deluxe Entertainment Services Group and Technicolor–both of which have been in the film business for nearly 100 years–essentially divvied up the dwindling business of film printing and distribution.

Couched in legalese of mutual “subcontracting” deals, the bottom line is that Deluxe will now handle all of Technicolor’s 35mm bulk release print distribution business in North America. Technicolor, meanwhile, will handle Deluxe’s 35mm print distribution business in the U.S. and Deluxe’s 35mm/16mm color negative processing business in London, as well as film printing in Thailand. In the wake of these agreements, Technicolor shut its North Hollywood and Montreal film labs and moved its 65mm/70mm print business to its Glendale, California, facility; and Deluxe ended its 35mm/16mm negative processing service at two facilities in the U.K.

“It’s a stunning development,” says International Cinematographer Guild President Steven Poster, ASC. “We’ve been waiting for it as far back as 2001. I think we’ve reached a kind of tipping point on the acquisition side and, now, there’s a tipping point on the exhibition side.”

“From the lab side, obviously film as a distribution medium is changing from the physical print world to file-based delivery and Digital Cinema,” says Deluxe Digital Media Executive VP/General Manager Gray Ainsworth. “The big factories are absolutely in decline. Part of the planning for this has been significant investments and acquisitions to bolster the non-photochemical lab part of our business. We’re developing ourselves to be content stewards, from the beginning with on-set solutions all the way downstream to distribution and archiving.” Deluxe did exactly that with the 2010 purchase of the Ascent Media post production conglomerate.

Technicolor has also been busy expanding into other areas of the motion picture/TV business, with the purchase of Hollywood post house LaserPacific and a franchise licensing agreement with PostWorks New York. Technicolor also acquired Cinedigm Digital Cinema Corp., expanding its North America footprint in Digital Cinema connectivity to 90 percent. “We have been planning our transition from film to digital, which is why you see our increased investments and clear growth in visual effects and animation, and 2D-to-3D conversion,” says Technicolor’s Ouri. “We know one day film won’t be around. We continue to invest meaningfully in digital and R&D.”

DIGITAL: AN “OVERNIGHT SUCCESS”
Although recent events–the end of film camera manufacturing and the swan dive of the film distribution business–make it appear that digital is an overnight success, nothing could be further from the truth. Digital first arrived with the advent of computer-based editing systems more than 20 years ago, and industry people immediately began talking about the death of film. “The first time I heard film was dead was in 1972 at a TV station with videotape,” says Poster, ASC. “He said, give it a year or two.”

Videotape did overtake film in the TV station, but, in the early 1990s, with the first stirrings of High Definition video, the “film is dead” mantra arose again. Laurence Thorpe, who was involved in the early days of HD cameras at Sony, recalls the drumbeat. “In the 1990s, there were a lot of folks saying that digital has come a long way and seems to be unstoppable,” he says.

The portion of the film ecosystem that has managed the most complete transition to digital is post-production.

According to Technicolor Chief Marketing Officer Ouri, over 90 percent of films are finished with digital intermediates.

But the path to digital domination has also taken place in a world of Hollywood politics and economics. A near-strike by Screen Actors Guild actors, the Japanese tsunami and dramatic changes in the business of theater exhibition have all contributed to the ebbing fortunes of film. Under pressure, any weakness or break in the disciplines that form the art and science of film–from film schools to film laboratories–could spell the final demise of a medium that has endured and thrived for over 100 years.

Two icons of film: above, Technicolor’s new Hollywood H.Q.; below, Kodak’s Rochester H.Q., built in 1914

THREE STRIKES AND YOU’RE OUT?
Until 2008, the bulk of TV productions and all feature films took place under SAG jurisdiction, which covers actors in filmed productions. In the months leading up to the Screen Actors Guild’s 2008 contract negotiations with the Alliance of Motion Picture and Television Producers, SAG leadership balked at several elements, including the new media provisions of the proposed contract. Negotiations stalemated. Not so with AFTRA, the union that covers actors in videotaped (including HD) productions, which inked its own separate agreement with AMPTP.

“When producers realized they could go with AFTRA contracts, but they now had to record digitally, they switched almost overnight,” recalls Poster. Whereas in previous seasons 90 percent of TV pilots were shot on film under SAG jurisdiction, in one fell swoop the 2009 pilot season went to digital video, which captured 90 percent of the pilots. In a single season, the use of film in primetime TV all but vanished, never to return.

The Japanese tsunami on March 11, 2011, further pushed TV production into the digital realm. Up until then, TV productions were largely mastered to Sony’s high-resolution HD SR tape, but the sole plant that made the tape, located in the northern city of Sendai, was heavily damaged and ceased operation for several months. With only two weeks’ worth of tape still available, TV producers scrambled to come up with a workaround, leading at least some of them to switch to a tapeless delivery, another step into the future of an all-digital ecosystem.

The third, and perhaps most devastating blow to film, comes from the increased penetration of Digital Cinema. According to Patrick Corcoran, National Association of Theatre Owners (NATO) Director of Media & Research/California Operations Chief, at the end of July 2011, “We passed the 50 percent mark in terms of digital screens in the U.S. We’ve been adding screens at a fast clip this year, 700 to 750 a month,” he says.

He notes that the turning point was the creation of the virtual print fee, which allows NATO members to recoup the investment they have to make to upgrade to digital cinema. (Studios, meanwhile, save $1 billion a year for the costs of making and shipping release prints.)

To take advantage of the virtual print fee, theater owners will have to transition screens to digital by the beginning of 2013. “Sometime, in 2013, all the screens will be digital,” says Corcoran. “As the number of digital screens increase, it won’t make economic sense for the studios to make and ship film prints. It’ll be absolutely necessary to switch to Digital Cinema to survive.”

REINVENTING THE FILM LAB

Can the continued production of film stock survive the twin disappearance of film acquisition and distribution? Veteran industry executive Rob Hummel, currently president of Group 47, recalls when, as head of production operations, he was negotiating the Kodak deal for DreamWorks Studios. “At the time, the Kodak representative told me that motion pictures was 6 percent of their worldwide capacity and 7 percent of their revenues,” he recalls. “The rest was snapshots. In 2008 motion pictures was 92 percent of their business and the actual volume hasn’t grown. The other business has just disappeared.”

Eastman Kodak’s Chris Johnson, Director of New Business Development, Entertainment Imaging, counters, “I don’t see a time when Kodak stops making film stock,” noting the year-on-year growth in 65mm film and the popularity of Super 8mm. “We still make billions of linear feet of film,” he says. “Over the horizon as far as we can see, we’ll be making billions of feet of film.”

Yet, as Johnson’s title indicates, Kodak is hedging its bets by looking for new areas of growth. One focus is on digital asset management via leveraging its Pro-Tek Vaults for digital, says Johnson, and another is investigating “asset protection film,” a less expensive film medium that provides 50 to 100 years of longevity at a lower price point than B&W separation film.

Kodak has also developed a laser-based 3D digital cinema projector. “Our system will give much brighter 3D images because we’re using lasers for the light source,” says Johnson. “And the costs of long-term ownership is much less expensive because the lasers last longer than the light sources for other projectors.”

STORING FOR THE FUTURE

As more than 1 million feet of un-transferred nitrate film worldwide demonstrates, archiving doesn’t get top billing in Hollywood. Although the value of archived material is unarguable, archivists, positioned at the end of the life cycle of a production, have unfortunately had a relatively weak voice in the discussion over transitioning from film to digital.

Since the “film is dead” debate began, archivists fought to keep elements on film, the only medium that has proven to last well over 100 years. “Most responsible archivists in the industry still believe today that, if you can at all do it, you should still stick it on celluloid and put it in a cold, dry place, because the last 100 years has been the story of nitrate and celluloid,” says Deluxe’s Ainsworth.

He jokes that if the world’s best physicists brought a gizmo to an archivist that they said would hold film for 100 years, the archivist would say, “Fine, come back in 99 years.” “With the plethora of digital files, formats and technologies–some of which still exist and some of which don’t–we’re running into problems with digital files made only five years ago,” he adds.

At Sony Pictures Entertainment, Grover Crisp, Executive VP of Asset Management, Film Restoration and Digital Mastering, notes that “Although it’s a new environment and everyone is feeling their way through, what’s important is to not throw out the traditional sensibilities of what preservation is and means.
“We still make B&W separations on our productions, now directly from the data,” he says. “That’s been going on for decades and has not stopped. Eventually it will be all digital, somewhere down the road, but following a strict conservation approach certainly makes sense.”

Crisp pushes for a dual, hybrid approach. “You need to make sure you’re preserving your data as data and your film as film,” he says. “And since there’s a crossover, you need to do both.” LTO tape, currently the digital storage medium of choice, is backwards compatible only two generations, which means that careful migration is a fact of life–for now at least–in a digital age. “The danger of losing media is especially high for documentaries and indie productions,” says Crisp.
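That two-generation read window is the whole migration problem in miniature: once the newest drives are more than two generations ahead of a tape, the tape can no longer be read and must already have been copied forward. A toy sketch of the constraint as Crisp describes it (the function names and the simple generation arithmetic are mine, not an archival standard):

```python
# Hedged sketch of the migration constraint described above: an LTO
# drive can read tapes only up to two generations older than itself,
# per the rule stated in the article. Function names are hypothetical.

def can_read(drive_gen, tape_gen):
    """True if a drive of generation `drive_gen` can read a tape
    written by a drive of generation `tape_gen`."""
    return drive_gen - 2 <= tape_gen <= drive_gen

def needs_migration(tape_gen, current_drive_gen):
    """A tape must be copied forward before current drives lose
    the ability to read it."""
    return not can_read(current_drive_gen, tape_gen)

# An LTO-3 tape is readable on LTO-5 drives but not on LTO-6:
print(can_read(5, 3))   # True
print(can_read(6, 3))   # False
```

In practice this means every archived project must be re-copied roughly every other generation, which is the "careful migration" Crisp refers to.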

Hummel and his partners at Group 47, meanwhile, believe they have the solution. His company bought the patents for a digital archival medium developed by Kodak: Digital Optical Tape System (DOTS). “It’s a metal alloy system that requires no more storage than a book on a shelf,” says Hummel, who reports that Carnegie Mellon University did accelerated life testing to 97 years.

THE DEATH OF FILM REDUX
Though reports of its imminent death have been exaggerated, more industry observers than before accept the end of film. “In 100 years, yes,” says AbelCine’s Shore. “In ten years, I think we’ll still have film cameras. So somewhere between 10 and 100 years.”

Film camera manufacturers have walked a tightrope, ceasing unprofitable manufacture of film cameras at the same time that they continue to serve the film market by making cameras on demand and upgrading existing ones. But they–as well as film labs and film stock manufacturers–clearly see the future as digital and are acting accordingly.

Will film die? Seen one way, it never will: our cinematic history exists on celluloid, and as long as there are viable film cameras and film, someone will be shooting it. Seen another way, film is already dead… what we see today is the afterlife of a medium that has become increasingly marginalized in the production and distribution of films and TV. Just as the last film camera was sold without headlines or fireworks, the end of film as a significant production and distribution medium will, one day soon, arrive without fanfare.

source: CreativeCow.net

Visual Effects Bill of Rights draws a line in the sand

The Visual Effects Society, the industry’s organization of visual effects artists and technicians, today released a Bill of Rights designed to call attention to problems affecting its membership and Hollywood. The document follows an open letter to the entertainment industry by the VES, which cited a downward spiral of working conditions and benefits as well as earnings for effects pros around the globe.  “In the VES open letter, we said it was time to step up as the voice of the visual effects industry by talking to all parties regarding their concerns,” said exec director Eric Roth. “At this time we have engaged in a vigorous dialog with key stakeholders at all levels and believe our Bill of Rights lays out the vital concerns of each segment of the industry. Our next step is to focus on bringing all parties together to seek solutions.”

source: deadlinehollywood.com

While training and education are crucial to supporting the VFX and Animation industries here at home, what this bill of rights actually reveals is that much of the labor continues to be outsourced to India and China, where working conditions are not regulated and wages are minimal.  Every U.S. industry faces this harsh reality. Despite the fact that we remain the leader in creation of filmed entertainment, producers are content to have the work done in sweatshops around the world, rather than maintain a talent base here at home.

Scott Arundale

Final Cut is Dead! Long live Final Cut!

Apple’s Final Cut Pro is the leading video-editing program. It’s a $1,000 professional app. It was used to make “The Social Network,” “True Grit,” “Eat Pray Love” and thousands of student movies, independent films and TV shows. According to the research firm SCRI, it has 54 percent of the video-editing market, far more than its rivals from Adobe and Avid.

On Tuesday, Apple pulled a typical Apple move: it killed off the two-year-old Final Cut 7 at the peak of its popularity.

In its place, Apple now offers something called Final Cut Pro X (pronounced “10”). But don’t be misled by the name. It’s a new program, written from scratch. Apple says a fresh start was required to accommodate huge changes in the technological landscape.

Apple veterans may, at this point, be feeling some creepy déjà vu. You’ve seen this movie before. Didn’t Apple kill off iMovie, too, in 2008, and replace it with an all-new, less capable version that lacked dozens of important features? It took three years of upgrades before the new iMovie finally surpassed its predecessor in features and coherence.

Some professional editors are already insisting that Apple has made exactly the same mistake with Final Cut X; they pointed out various flaws with the program after an earlier version of this column was posted online on Wednesday. They say the new program is missing high-end features like the ability to edit multiple camera angles, to export to tape, to burn anything more than rudimentary DVDs and to work with EDL, XML and OMF files (used to exchange projects with other programs). You can use a second computer monitor, but you need new TV-output drivers to attach an external video monitor. You can’t change the settings of your exported QuickTime movies without the $50 Compressor program.

Apple admits that version X is a “foundational piece.” It says that it will restore some of these features over time, and that other companies are rapidly filling in the other holes.

For nonprofessionals, meanwhile, Final Cut is already tempting — especially because the price is $300, not $1,000. It’s the first Apple program that’s available only by download from the online Mac App Store, not on DVD. All of the programs formerly called Final Cut Studio have been rolled into Final Cut except Motion and Compressor, which are sold separately. (Final Cut Express and DVD Studio Pro are gone.)

The new Final Cut has been radically redesigned. In fact, it looks and works a lot like iMovie, all dark gray, with “skimming” available: you run your cursor over a clip without pressing the mouse button to play it.

Once you’re past the shock of the new layout, the first thing you’ll notice is that Apple has left most of the old Final Cut’s greatest annoyances on the cutting-room floor.

First — and this is huge — there’s no more waiting to “render.” You no longer sit there, dead in the water, while the software computes the changes, locking up the program in the meantime, every time you add an effect or insert a piece of video that’s in a different format. Final Cut X renders in the background, so you can keep right on editing. You cannot, however, organize your files or delete clips during rendering.

Second, in the old Final Cut, it was all too easy to drag the audio and video of a clip out of sync accidentally; little “-1” or “+10” indicators, showing how many frames off you were, were a chronic headache. But in the new Final Cut, “sync is holy,” as Apple puts it. Primary audio and video are always synced, and you can even lock other clips together so that they all move as one.

In fact, an ingenious feature called Compound Clips lets you collapse a stack of audio and video clips into a single, merged filmstrip on the timeline. You can adjust it, move it and apply effects as if it were a single unit, and then un-merge it anytime you like. Compound Clips make it simple to manage a complicated composition without going quietly insane.
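The idea behind Compound Clips is the familiar grouping pattern: a container that behaves like a single clip but preserves its members intact. A toy model of that behavior (my own sketch, not Apple’s implementation; the class and method names are hypothetical):

```python
# Toy model of the Compound Clip idea: a group of clips handled as
# one unit that can be taken apart again without losing the originals.
# This is an illustrative sketch, not Final Cut Pro's actual code.

class Clip:
    def __init__(self, name, duration):
        self.name = name
        self.duration = duration

class CompoundClip:
    """Wraps several clips; edits apply to the group as a whole."""
    def __init__(self, clips):
        self.clips = list(clips)          # the originals are preserved

    @property
    def duration(self):                   # behaves like a single clip
        return max(c.duration for c in self.clips)

    def unmerge(self):
        return self.clips                 # "un-merge it anytime"

stack = [Clip("video", 120), Clip("dialogue", 120), Clip("music", 90)]
group = CompoundClip(stack)
print(group.duration)        # 120 -- the group reads as one filmstrip
print(len(group.unmerge()))  # 3   -- the member clips are still intact
```

Applying an effect to `group` rather than to each member is what keeps a complicated composition manageable.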

In the old Final Cut, if you dragged Clip A so that it overlapped part of Clip B, even briefly, you wound up chopping away the covered-up piece of Clip B. But now, the timeline sprouts enough new parallel “tracks” to keep both of the overlapping clips. Nothing gets chopped unless you do it yourself.

Source: David Pogue / NYTimes.com

Apple’s new non-linear editing app plots a roadmap to the future of video editing

by Gary Adcock, Macworld.com

With the release of its hotly anticipated Final Cut Pro X (FCP X), Apple breaks new ground—not just with its flagship video editor’s interface and underlying infrastructure—but with the whole mindset of what it means to be a working professional video editor.

Apple has revamped Final Cut Pro’s hands-on user experience in three major areas: editing, media organization, and post-production workflow. New tools such as the Magnetic Timeline, Clip Connections, Compound Clips, and Auditions provide a smooth, intuitive editing experience.

With the rise of data-centric workflows and tapeless video recording, organizational tools such as Content Auto-Analysis, Range-based keywords, and Smart Collections work in the background to automate formerly tedious and time-consuming manual processes.

Post-production workflows now offer customizable effects, integrated audio editing, color grading, and a host of streamlined delivery options.

With this new application, video pros can no longer follow traditional ways of working.

Clips in FCP X’s new Event Library are sorted by both user-created Custom Keywords (blue icons) and Smart Collections. The latter are created automatically by Content Auto-Analysis during import (purple icons).

Final Cut Pro X, despite its familiar name, is not an upgrade of Final Cut Pro 7. It is a brand new product. FCP X is also no longer part of a suite of applications such as Final Cut Studio, but rather one of a trio of component parts that include Final Cut Pro X ($300), Motion 5 ($50), and Compressor 4 ($50). All are available separately for download from the Mac App Store. There will be no boxed copies.

Rolling

Final Cut Pro X starts off by analyzing your media as it imports footage, while at the same time archiving critical secondary information on color balance, motion, rolling-shutter artifacts, tracking, and stabilization on a clip-by-clip basis.

While handling the bulk of analytical information at ingest, FCP X is tagging the files with metadata in a manner that speeds secondary file processing, delivery, and rendering capabilities and vastly accelerates workflow. The heavy lifting of this content is invisibly handled in the background—between the application and the Mac operating system—as a byproduct of the conversion to a fully 64-bit application workflow.

The most profound interface changes to FCP X—beyond the new darker look—are the Event Browser and the Event Library. Importing your content into the app creates a new Event, a virtual folder that holds all of the information about your media: what it is, where it’s stored, and whether it’s from a specific date, place, or client. You can even rate, organize, and show or hide clips from view while accessing tools like Keyword and Smart Collections. Events are created by the application as part of the ingest, in addition to your organizational effort.

When you’re done creating your video, you can use the direct upload options within FCP X to share it on Facebook, Vimeo, YouTube, and CNN iReport. All Apple devices are available as options, as well as Podcast Producer, output for standard definition DVDs, and even Blu-ray devices, directly within FCP X. Plus, the application still offers fully integrated processing with Compressor. Standalone export output options offer all flavors of Apple’s ProRes, H.264, DVCProHD, Apple HDV, and even Sony’s XDCamEX format at 35Mbps and the 50Mbps version of the XDCamHD 422 codec.

Here are the currently available output options for file delivery when exporting your project directly from FCP X.

Magnetic Timeline

Acting as a trackless canvas for your video edit, the Magnetic Timeline allows you the freedom to arrange and re-arrange your media wherever you want. Existing clips on the timeline slide in and out of the way without danger of collision or overwriting a previous edit. They snap into place “magnetically,” dynamically aligning with existing media in the timeline. Despite being trackless, you can easily create multi-level compositions and properly maintain continuity as you move media around in your project. This feature interactively shows you exactly what’s happening in the timeline as you work, so you can easily execute your vision.

Clip Connections and Compound Clips

Designed to maintain the continuity of media in the timeline, Clip Connections are relational links between primary media in the timeline and secondary elements. Content such as titles, B-roll, sound effects, and even music, can be moved and repositioned seamlessly as a single clip, maintaining audio and video sync, and giving you a clear, visually defined connection to your assets.

Alongside Clip Connections and its facility at combining primary and secondary elements into a cohesive unit for editing and filtering, Compound Clips further advances the concept by allowing a complex multi-element group of media to be handled as a single clip. It’s easy. Just select the relevant media and choose Create Compound Clip from the File menu (or hit the Option-G key command).

Compound Clips let you move, duplicate, and handle a group of clips as a single segment. You can even share such clips across multiple projects or use a clip to apply filters and effects across all combined elements. The Compound Clips feature helps video editors remove clutter and simplify the timeline’s organization, while maintaining media continuity.

This is the Compound Clip when open.

The Compound Clip command offers a vastly simpler timeline that minimizes the additional track and clip information until needed. Think of it as a nested sequence on steroids. This is the Compound Clip when closed.
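As a thought experiment (FCP X’s internals are not public, so every class and name below is hypothetical), a Compound Clip behaves like the classic composite pattern: a group of clips that the rest of the timeline can treat exactly like one clip.

```python
# Conceptual sketch only, not the FCP X API: a Compound Clip modeled as
# a composite, so a multi-element group answers the same questions
# (here, just "how long are you?") as an individual clip.

class Clip:
    def __init__(self, name, duration):
        self.name = name
        self.duration = duration  # seconds

class CompoundClip:
    """A group of clips that the timeline can treat as one clip."""
    def __init__(self, name, children):
        self.name = name
        self.children = children

    @property
    def duration(self):
        # When "closed," the group reports a single duration, hiding
        # the track and clip detail until it is opened. We assume all
        # children start together, so the longest layer wins.
        return max(c.duration for c in self.children) if self.children else 0

# Two video layers and a title, collapsed into one timeline segment:
group = CompoundClip("opening", [Clip("b-roll", 12.0),
                                 Clip("interview", 10.5),
                                 Clip("title", 4.0)])
print(group.duration)  # 12.0
```

When the group is closed, the rest of the timeline sees only one duration; “opening” it is simply walking the children list — which is why nesting can be a single keystroke.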

Auditions

Think about being able to create multiple versions of an opening or closing sequence for different clients or presentations. That’s the power of Auditions. With this feature, you can view various alternative scenes in your video without leaving the timeline. Auditions provides a fast and easy way to preview a number of variations—with any media collection—in real time. To create Auditions, just drag the shots to the same place in the timeline and choose the Add to Audition command. This allows the Magnetic Timeline to handle the sequence continuity and sync.

Auditions lets you dynamically preview multiple clips within the timeline without disturbing any other media.

Content Auto-Analysis and Keywords

Underneath the surface, FCP X mines metadata from your content from the second you ingest footage from the camera. That data stream includes information such as camera type, frame rate, white balance, and a host of other pre-defined parameters.

The Content Auto-Analysis feature can, during import, distinguish individual people, shot types (close, medium, and wide), and the rolling shutter artifacts common to many CMOS cameras. It can also rectify stabilization issues in hand-held shots, adjust overall color balance, and analyze audio to remove excessive noise and hum or flag silent audio tracks.

Much of FCP X’s automatic keyword creation is derived from this media detection functionality. The program uses the information gathered to create keywords and automatically assembles assets into Smart Collections within the Event Library. Thus, while importing your content, FCP X is sorting, categorizing, and auto-keywording in the background. In addition to the keywords that the program assigns, you can add or edit your own keywords to identify specific shots in any manner you choose. Since a single clip can carry multiple keywords, and every clip returned by a search links back to the same original piece of media, you can accomplish faster, cleaner edits.

Applying customizable effects offers you a real-time preview of the effect on your video in the canvas. Note that the thumbnail view in the Effects Window also shows the clip. The new Skimming feature brings added power to the content preview in FCP X.

Another result of this metadata analysis is the ability to create a keyword selection across multiple pieces of media. Range-based Keywords allow you to flag a keyword across all or just small parts of multiple clips. Keywording offers a larger, far more flexible canvas. No longer are you restricted to specific bins or folders.
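One way to picture range-based keywording (purely an illustrative sketch, not the FCP X API — the class and file names are made up) is as an index from each keyword to (clip, start, end) spans, so a search returns just the flagged portions of each clip while always pointing back at the original media:

```python
# Illustrative model of range-based keywords: a keyword maps to spans
# within clips, rather than to whole bins or folders.
from collections import defaultdict

class KeywordIndex:
    def __init__(self):
        self._spans = defaultdict(list)

    def tag(self, keyword, clip, start, end):
        # Flag all or part of a clip; the same span may carry many keywords.
        self._spans[keyword].append((clip, start, end))

    def find(self, keyword):
        # Every hit references the original media, never a copy.
        return self._spans[keyword]

idx = KeywordIndex()
idx.tag("close-up", "interview.mov", 0.0, 8.0)
idx.tag("close-up", "b-roll.mov", 12.5, 20.0)
idx.tag("sunset",   "b-roll.mov", 12.5, 20.0)

print(idx.find("close-up"))
# [('interview.mov', 0.0, 8.0), ('b-roll.mov', 12.5, 20.0)]
```

The key design point is that the index stores references, not copies: a clip can live in any number of keyword collections at once without duplicating media on disk.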

Content library access

Borrowing a page from its iLife line of consumer apps, FCP X lets you browse all of your attached media and content libraries within the program. View your iTunes, iPhoto, and Aperture content, as well as Motion libraries, directly, along with 1,300 royalty-free sound effects offered as a free download via Apple’s Software Update after you purchase FCP X.

Customizable effects

FCP X provides a wide array of content, including animations, titles, transitions, and effects sequences, all accessible and editable within the application.

Much of this content was created specifically for Apple by Hollywood effects pros and graphic designers. Customizable from the start, these effects allow you to preview a clip by selecting a shot and then using the skimming function to get a true instantaneous, real-time preview—both as a thumbnail and in the viewer—of how your shot will look with that effect applied.

You also have access to all of the transform functions (crop, scale, rotate, and distort) as well as keyframing of those effects without having to jump between different parts of the interface. Effects built in Motion 5 can expose adjustable controls through that program’s new Parameter Rigging feature, letting you modify them from within FCP X.

Effects in viewer.

Audio editing

Apple has chosen to fully integrate audio editing into FCP X. Starting with the ingest, the program analyzes content for hum, noise, and dynamic audio changes. It even automates audio sync from an external recorder and the camera, matching audio and video via the waveforms, to connect content and sync it properly. This was formerly a manual process.
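The idea behind matching audio via waveforms is cross-correlation: slide one recording against the other and keep the offset where they agree best. A brute-force sketch of that idea (not Apple’s implementation — the sample data and function name are invented for illustration):

```python
# Minimal cross-correlation sync sketch: find the lag (in samples) at
# which an external recording best lines up with the camera's audio.

def best_lag(camera, external, max_lag):
    """Return the offset that best aligns `external` to `camera`."""
    def score(lag):
        # Dot product of the overlapping samples at this lag; louder
        # matching peaks produce a larger score.
        pairs = [(camera[i], external[i - lag])
                 for i in range(len(camera))
                 if 0 <= i - lag < len(external)]
        return sum(a * b for a, b in pairs)
    return max(range(-max_lag, max_lag + 1), key=score)

# Toy example: the external recorder started 3 samples late.
cam = [0, 0, 0, 1, 2, 3, 2, 1, 0, 0]
ext = [1, 2, 3, 2, 1, 0, 0]
print(best_lag(cam, ext, 5))  # 3
```

A production tool would work on envelopes or FFT-based correlation for speed, but the principle of picking the maximum-similarity offset is the same.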

With a large library of sound effects and high-quality audio effects plug-ins available in FCP X, you now have greater control of your audio enhancements than ever before. You can access control for sub-frame audio edits as well as many of the available 64-bit versions of third-party Audio Units plug-ins.

Color grading

Whether you need a single-click correction or want to create a stylized look, all color work now happens within FCP X. From the first analysis, color balance and correction are mapped for use, allowing you to quickly match multiple shots in the same group or refine the look of any clip in the Event Browser or the timeline.

The Color Board gives you a dynamic way to make custom modifications to overall color, saturation, and exposure, while allowing keying and masking to be done simply and directly within the app. The Match Color feature offers a fast and easy way to match the overall color, contrast, and look between two different shots to maintain a project’s visual continuity.

The Color Board represents the essence of Apple’s FCP X interface changes. It allows you simple control over exposure, color, and saturation.

Bottom line

Apple’s new Final Cut Pro X has been re-designed from the ground up with a radically different approach—one that acknowledges and uses device and camera data in a manner that has never before been attempted in the video editing environment.

With this release, Apple shows us the future in which data streams from all the devices we work with communicate seamlessly, sharing media behind the scenes. Think of the advantages and possibilities when all the effort you put into setting up a shot or project continue downstream from your camera into post-production, or follow your content when it’s delivered on the web. That’s the promise of Final Cut Pro X. Will that promise be fulfilled?

[Gary Adcock is a Chicago-based consultant who specializes in building workflows for film and television productions. He is the founder and past president of the Chicago Final Cut Users Group, Tech Chairman of the NAB Director of Photography Conference, and a member of the I/A Local 600/Camera Guild Training Committee, teaching tapeless production techniques and workflows to professional camera operators. His writings and musings can be found at his blog on Creative Cow.]

3ality Digital forms strategic partnership with RED Digital Cinema Education

3ality Digital and RED Digital Cinema Partnership Kicks Off With S3D Production Classes and Presentation at REDucation

Burbank, CA. – June 1, 2011 – 3ality Digital, world leader in advanced technologies to empower creative digital stereoscopic 3D (S3D) acquisition, and RED Digital Cinema announced today a Stereoscopic 3D partnership, which launched during the recently completed REDucation sessions on May 24-28 at RED Studios Hollywood. 3ality Digital will be the primary 3D partner for RED Digital Cinema, and together the companies will train professional and aspiring filmmakers on how to create clear and pristine 3D images using the same equipment as elite Hollywood directors like Peter Jackson and Bryan Singer.

“The biggest tent pole movies shooting on the planet right now, like The Hobbit, are all shooting S3D on EPIC and 3ality Digital,” said Ted Schilowitz, Leader of the Rebellion at RED Digital Cinema. “The teams at RED and 3ality Digital have been working together for years behind the scenes. Now is the right time to take that relationship to the next level and integrate education components for the community.”

As the primary stereoscopic 3D partner for RED, 3ality Digital lent its technology, currently being used in feature films such as The Amazing Spiderman and Jack the Giant Killer, to REDucation’s 3-day introductory session May 24-26, as well as during the advanced classes May 27-28. The REDucation Open House included a screening of S3D content produced with 3ality Digital technology. Attendees also experienced special presentations from RED including the latest “Tattoo” EPIC Reel shown in 4k.

“S3D is here to stay and choosing partners at the forefront of the technology that really grasp what true, high-resolution cinema and S3D are all about is essential for business and for the community,” said Steve Schklair, CEO of 3ality Digital. “Educating filmmakers and getting RED and 3ality Digital technology in their hands at events like REDucation is a crucial step towards accelerating and facilitating S3D content production and ultimately consumer adoption.”

The ongoing partnership will also include collaboration at the Camp RED youth summer program August 1-19, where the new partners will provide young filmmakers with training in S3D production. Students ages 9-15 will shoot their own S3D films at RED Studios Hollywood during the week-long day camp sessions, including an exclusive, behind-the-scenes trip to 3ality Digital Studios.


About RED Digital Cinema
RED Digital Cinema is the brainchild of Jim Jannard, founder of Oakley, the world-famous manufacturer of sunglasses, sports apparel, and personal electronics. Mr. Jannard is a self-professed lover of all things photographic, having amassed an extensive photographic collection and having been a shooter for most of his life. His search for the perfect video/film camera was never satisfied and proved to be the inspiration behind creating the ultimate full-motion camera. His desire was to create a camera that matched the quality of, and processed images similar to, the very finest digital still cameras, but at motion picture frame rates.

About 3ality Digital
3ality Digital is a pioneer and leading authority in stereoscopic 3D (S3D). 3ality Digital provides the film and television industry with camera platforms, stereo image processing systems and S3D image scaling technologies that are considered the “gold standard” for the production of compelling and immersive S3D entertainment. Whether for a feature film or live sporting event, its innovative technology empowers customers to stay in control of creativity when working with S3D.

Founded in 2000 by CEO Steve Schklair, 3ality Digital has a reputation as an innovator in S3D, with its technology powering multiple live-action firsts. These include: U2 3D, the first movie shot completely in live-action S3D; the first live S3D broadcast of an NFL game (Raiders vs. Chargers, December 4, 2008, broadcast to a select audience); the first live S3D sports broadcasts available to consumers, including the 2009 BCS Championship Game and BSkyB’s landmark Manchester United vs. Arsenal soccer broadcast (January 31, 2010); the first network hockey telecast ever produced in S3D (New York Rangers vs. Islanders, March 24, 2010 on MSG); the first S3D commercial broadcast during a Super Bowl (Sobe “Lizard Lake”); the first full episode of a scripted television series shot in live-action S3D (Chuck vs. The Third Dimension, aired on NBC on February 2, 2009); and the first RED EPIC S3D movie, The Amazing Spider-Man.

For more information, please visit www.3alitydigital.com

Final Cut Pro Editor explains why he is going back to Avid

Matt Toder has been editing video professionally for eight years, and currently works at Gawker.TV. These are his thoughts on Apple’s latest Final Cut Pro release.

I landed my first job in post-production in 2003 at a small house which used Avid exclusively. It had plenty of problems; we struggled with the Dragon error for a few months, converted to Xpress Pro when it came out, and then wrestled with that. There just weren’t any other options. And then Apple’s Final Cut Pro was released, although it too had some problems. But when Avid stopped listening to their customers and became more and more inflexible, Final Cut Pro became an increasingly attractive option. By 2009, significant portions of the editing community were using it.

And now we’ve been given a glimpse of FCPX, a massive, from-the-ground-up revision of Final Cut Pro which proves one thing definitively: that Apple understood many of the problems that were inherent to Final Cut Pro. But, instead of fixing them, they just decided to change everything.

At the preview event, Peter Steinauer, FCP Architect, assured the audience that FCPX was just as much for professional editors as FCP7 was. It really doesn’t seem that way, though. After getting through some of the technical aspects of what makes FCPX better than its predecessor in terms of processing power and such—which does seem awesome—Steinauer moved on immediately to color sync. He boasted that FCPX would make sure that pixels looked exactly the same throughout the editing process, noting “you can trust that the pixels coming off a profiled device track all the way through your workflow to display on the screen and ultimately out to output.” This all seems well and good, except it’s completely unimportant for professional editors who aren’t finishing in Final Cut. Some of us color correct in a da Vinci with a professional colorist and then conform in a Flame. Steinauer’s point proves the underlying key of FCPX: that it really isn’t for professional editors.

If it were a device for professional editors, FCPX wouldn’t require a complete rethinking of non-linear editing. It would have instead addressed some of the problems that Final Cut Pro presents for professionals, problems that have existed since day one and that have solutions in the Avid. Like the ability to save your export settings. Or a project format that lets editors share bins instead of forcing them to create multiple projects. Or a reliable shared-media solution, like Unity or LanShare, so we don’t have to work off of local drives all the time. Or a reliable find bin command that doesn’t constantly tell you your clips aren’t in the browser when you know for a fact that they are. Or a title tool that not only allows you to kern your text but lets you see what you’re doing in the sequence without having to click back and forth constantly. Or, as the most recent updates to Media Composer have, a way to read RED files directly and then export DPX files. Because, again, not everyone is finishing in Final Cut.

If this were truly a device for professional editors, those improvements would have been in FCPX, and Steinauer would have made a point of mentioning them considering the room he was playing to. But he didn’t. Nor did he mention EDLs, OMFs, XMLs, or any changes to the Media Manager that might make generating a cut list for telecine a little easier, or how the new Compound Clip feature will behave when EDLs are generated from a sequence full of them.

The idea of Compound Clips speaks to another issue with FCPX. One of the hardest adjustments an Avid editor had to make when switching to Final Cut Pro was no longer being able to load a sequence into the source monitor and cut it into the sequence while maintaining master clip information; FCP turned it into a new clip, which really was just a workaround for not being able to generate video mixdowns. This meant that you couldn’t build a select string and then edit from it while still being able to match back to your master clip. One would have hoped that FCPX would do something like this: have a more nuanced understanding of the timeline, the way that Avid does, and improve upon a situation where every little move throws everything out of whack unless you’ve gone through and manually locked tracks.

Apple seems to know that keeping things in sync in Final Cut Pro was extremely problematic and has attempted to solve this with Clip Connections and the Magnetic Timeline. Clip Connections can lock a piece of video and its corresponding dialogue to, say, a specific sound effect so that they all travel together all the time. The Magnetic Timeline feature ensures that when this group is moved, you don’t get a clip collision or have to eliminate something from the next piece of media in the timeline. Instead, the next piece of media slides down one track in the timeline. Of course, the demo contained one track of video and two tracks of audio, so it’s easy to see that everything works out. I wonder what will occur when you’ve got two pieces of video composited together with a title on top and your audio has dialogue, music, and a couple of sound effects. Will it move everything in the higher audio tracks down as well, thereby destroying the scheme of your timeline?

The biggest, most apparent change is the absence of the source monitor: it’s the iMovie-ing of non-linear editing. Of all the people watching the preview, applauding wildly and yelling out “I want it!” and “thank you,” I can’t believe that not one person screamed, “where’s the freaking source monitor?” This represents a gigantic change in the way non-linear editing occurs, a nearly unfathomable one. Since non-linear editing was invented, the mainstays have been the source monitor, the record monitor, the browser, and the timeline. To take one of these away means that non-linear editing has to be rethought entirely. I’m not quite sure how you can set an exact in point without it, especially when you’re forced into using the iMovie yellow selection brackets.

All this being said, there certainly are some incredible things about FCPX, most obviously that it will render in the background and that no one will have to stare at the “writing video” dialogue box anymore. That really does sound great. And that it will analyze clips upon import so it will stabilize more quickly (although it already does the analyzing in the background). The FCPX function of analyzing clips for shot length and content (wide two shot, close single, etc.) also seems great, though it would have been nice for Steinauer to mention whether this increases import time. And since it’s doing all this during-import work, can it also provide a transcript of some sort? That would have been truly useful, because it takes a lot of work to find an interview subject saying the exact right phrase, much more work than scanning through dailies for the close-up series.

Another thing that I would have loved Steinauer to discuss is whether or not an editor can customize how clips are analyzed upon import and how find bin will work now. Specifically, where you will get thrown when you try to find a clip in the browser. Do you get thrown to the folder with other wide shots, with other two shots, with other sunset shots, or do you get to the original master clip housed somewhere else? These are the questions that need to be answered, the ones that professionals are asking. Because these are the features that change individual workflow and force editors to alter the habits that they’ve developed over time.

(The audio also gets analyzed during import, to remove hum and balance levels. Do these adjustments hold when you export an OMF, and do they carry over to Pro Tools? Who knows? Steinauer didn’t mention anything about the way FCPX talks to other applications.)

If this is the future of Final Cut Pro, and indeed non-linear editing, then that’s fine and I can’t change it. Just don’t tell me that it’s for pros while telling me I have to change the way I’ve been thinking about everything. And don’t make me change for the wrong reasons, for improvements that speak most to people who aren’t professionals. I love that editing is something that a lot of people can do now, that there’s a greater level of understanding about what it really takes to make a compelling piece out of a collection of images and sounds and your imagination. Editing, for me, is still where the magic is. It’s one thing to make changes for the sake of the people you claim are your clients and quite another to make changes for the sake of people who aren’t. That’s what these changes are: changes for the sake of making editing more accessible, not more functional.

FCPX shouldn’t be about helping people who don’t know what they’re doing; it should be about helping people who do know what they’re doing work better and faster, and, most often, that means giving them the flexibility to work however they please, using the techniques they’ve developed over years of working in tough conditions. When you don’t have a Senior Creative Director sitting behind you, you don’t really have to worry about finding clips fast enough or making precise edits immediately. But when you are in that situation, you won’t have time to re-think the thing you’ve been doing for years and years.

When FCPX is released in June, the countdown will be on for FCP7. Whether it takes a year or possibly less, support will dry up and eventually it won’t be a viable editing platform anymore. I’m not gonna wait that long. Instead, I’ll reacquaint myself with my old friend Avid, catch up on what I’ve missed and fall back into the warm embrace of my fully customized appearance and keyboard settings. It’ll take a minute to get completely familiar with it, to remember everything, and even to be reminded of all the things that drove me crazy. But at least I’ll still have a source monitor.

source: uk.gizmodo.com

Has Apple dumbed down FCP X or is this a step up?

PART ONE
Apple just introduced a new version of Final Cut at the Final Cut Pro Supermeet during NAB 2011 in Las Vegas, Nevada. Touting it as being “as revolutionary as the first version” from 1999, Apple introduced the new Final Cut Pro X, saying that every major broadcaster and filmmaker nowadays relies on FCP for their video editing needs.

Based on live updates coming from attendees at NAB 2011, Final Cut Pro X has been built from scratch, and it’s entirely 64-bit. It’s based on technologies like Cocoa, Core Animation, OpenCL, and Grand Central Dispatch, and it focuses on image quality. It features a resolution-independent timeline up to 4K for scalable rendering — in fact, it appears the old render dialog is gone entirely, as the app uses the available CPU to keep files always rendered. FCP X allows you to edit while you’re importing thanks to its new engine, and it’s also got automatic media and people detection on import, as well as image stabilization.

Apple is promoting the new FCP X as a complete and total rebuild. Smart Collections look very similar to iMovie’s, and overall there is a feeling Apple has borrowed some UI elements from the iLife applications to make the general design more accessible, even for professionals. For instance, Apple has brought “single keystroke nesting” to Final Cut Pro — a new function that allows you to group chunks of media into a single clip in the timeline. The “inline precision editor” allows you to make edits by revealing media with an iOS-like menu.

Source: http://www.engadget.com

It’s possible that the GUI is more user friendly and the functionality has improved, but based on the comments and features presented today, the jury is still out as to whether FCP will be going head to head with the competition. Ease of use may or may not improve functionality; it may instead simply level the playing field for all the non-editors out there. I am all in favor of making editing easier, but the roll-out today suggests a beta experience that does little to assist the professional editor in cutting a long-form project. It feels like a step backward on the time/space continuum, and I always get a little queasy when the word iMovie is mentioned in the same breath as Final Cut.

-Scott Arundale
PART TWO

UPDATE 4.13.11

Upon watching the demonstration in full, I got the impression there were some shills in the audience shouting their appreciation for the new features.

Now that calmer minds prevail, let’s look at the upside. Instant nesting with a single keystroke. Easy keyword features. Better sync and collision options. Automatic color grading, stabilization, and background rendering. Excellent use of the 64-bit engine. But all of this suggests Apple is more interested in the young editor cutting short-form trailers than the long-form editor trying to cut a feature, never mind the hapless assistant who must keep it all together. Much of the work that FCP is trying to automate is normally the work of the assistant, but Apple presupposes that the editor is working solo. Pity that editor without a second pair of hands to help. I’m all in favor of having the machine do the work. I’m ready, as an Apple Certified Trainer, to go back to school and re-learn how to cut faster and easier. What I loved about FCP is that copy and paste made it easy to move stuff around. Apple has made it even “easier,” but again this requires a new mindset, i.e., it takes more thought and fewer keystrokes to achieve the same thing. If this is the future, then I am in. But it begs the question: who is the target audience for this product? Surely not the Hollywood narrative professional. Instead, the trailer/bumper/extreme-sports crowd may find these new features useful. For improved storytelling techniques, the jury is still out.

-SA

James Cameron & Vince Pace Unveil New 3D Venture At NAB

Source: Deadline Hollywood.

The Cameron-Pace Group, announced today at the start of the National Association of Broadcasters confab in Las Vegas, “seeks to accelerate worldwide growth of 3D across all entertainment platforms including features, episodic and live television, sports, advertising and consumer products.” The company, run by co-chairman James Cameron and longtime collaborator Vince Pace, will offer next-generation camera systems, services and creative tools to the entire entertainment industry, not just film. “Our goal is to banish all the perceived and actual barriers to entry that are currently holding back producers, studios and networks from embracing their 3D future,” Cameron said. “We are dedicated to building a global brand that is synonymous with high-quality 3D and spans multiple channels, from features to episodic television, and changes the boundaries of what is understood to be 3D material.”


Cameron and Pace developed, under Pace’s company PACE, the Fusion 3D system, which was used for the 3D in such films as Avatar, Tron: Legacy and U2 3D. PACE has begun the formal rebranding process, and its operation under the Cameron-Pace Group banner is effective immediately. CPG will be headquartered in Burbank, Calif., the current home to PACE.

CPG already is working on film projects that include Pirates of the Caribbean: On Stranger Tides, Transformers: Dark of the Moon, The Three Musketeers, The Invention of Hugo Cabret, Life of Pi and 47 Ronin.

Chapman University wants to overtake USC and NYU

Film Studies: Chapman University wants to overtake USC and NYU.

The article in this Sunday’s L.A. Times sums it up nicely.

It has been an extraordinary ride during the last four years. I began teaching at Dodge College as an Adjunct Professor and was thrilled when Bob Bassett offered me a contract a year later. The new Marion Knotts studio complex came with the usual wrinkles: too much technology and over-engineering, but with time the school began to hit its stride. In terms of post-production we offer 100 Avids along with DS Nitris, Smoke, and Flame. We still count ourselves as a “Film School,” and it remains part of our name. There is no other learning institution in the country that features both Autodesk Lustre and Spirit 4K. A big shout out to Dan Leonard, Associate Dean and Chief Technology Officer, the mad scientist who put this rig together, and to Deszo Magyar, Associate Dean and Chief Academic Officer, who consistently reminds me that character development is key to a successful story. I’m very much involved in alumni relations, as I believe the most important aspect of our program will be when graduates return to Orange and share their experiences. It is happening now. We are building our own Dodge College/Chapman mafia, and we have a great reputation in the industry for interns who are bright and committed and show up on time, ready to work!

- Scott Arundale


About 3D & Digital Cinema

If you are a tech head, cinema-phile, movie geek, or digital imaging consultant, then we'd like to hear from you. Join us in our quest to explore all things digital and beyond. Of particular interest is how a product or new technology can be deployed and how it impacts storytelling. It may be something that affects how we download and enjoy filmed entertainment. It may pertain to how primary and secondary color grading will enhance a certain tale. The most important thing is that you are in the driver's seat as far as what you watch and how you choose to consume it.