Archive for April, 2011

Final Cut Pro Editor explains why he is going back to Avid

Matt Toder has been editing video professionally for eight years, and currently works at Gawker.TV. These are his thoughts on Apple’s latest Final Cut Pro release.

I landed my first job in post-production in 2003 at a small house which used Avid exclusively. It had plenty of problems; we struggled with the Dragon error for a few months, converted to Xpress Pro when it came out, and then wrestled with that. There just weren’t any other options. And then Apple’s Final Cut Pro was released, although it too had some problems. But when Avid stopped listening to their customers and became more and more inflexible, Final Cut Pro became an increasingly attractive option. By 2009, significant portions of the editing community were using it.

And now we’ve been given a glimpse of FCPX, a massive, from-the-ground-up revision of Final Cut Pro which proves one thing definitively: that Apple understood many of the problems that were inherent to Final Cut Pro. But, instead of fixing them, they just decided to change everything.

At the preview event, Peter Steinauer, FCP Architect, assured the audience that FCPX was just as much for professional editors as FCP7 was. It really doesn’t seem that way, though. After getting through some of the technical aspects of what makes FCPX better than its predecessor in terms of processing power and such—which does seem awesome—Steinauer moved on immediately to color sync. He boasted that FCPX would make sure that pixels looked exactly the same throughout the editing process, noting “you can trust that the pixels coming off a pro file device track all the way through your workflow to display on the screen and ultimately out to output.” This all seems well and good, except it’s completely unimportant for professional editors who aren’t finishing in Final Cut. Some of us color correct in a da Vinci with a professional colorist and then conform in a Flame. Steinauer’s point proves the underlying key of FCPX: that it really isn’t for professional editors.

If it were a device for professional editors, FCPX wouldn’t require a complete rethinking of non-linear editing. It would have instead addressed some of the problems that Final Cut Pro presents for professionals, problems that have existed since day one and that have solutions in the Avid. Like the ability to save your export settings. Or the ability to have an unpackable project that allows editors to share bins instead of forcing them to create multiple projects to share. Or a reliable shared media solution, like Unity or LanShare, so we don’t have to work off of local drives all the time. Or a reliable find bin command that doesn’t constantly tell you your clips aren’t in the browser when you know for a fact that they are. Or a title tool that not only allows you to kern your text but allows you to see what you’re doing in the sequence without having to click back and forth constantly. Or, as the most recent updates to Media Composer have, a way to read RED files directly and then export DPX files. Because, again, not everyone is finishing in Final Cut.

If this were truly a device for professional editors, those improvements would have been in FCPX, and Steinauer would have made a point of mentioning them considering the room he was playing to. But he didn’t. He also didn’t mention EDLs, OMFs, XMLs or any changes to the Media Manager that might make generating a cut list for telecine a little easier. Nor did he mention how the new Compound Clip feature will behave when EDLs are generated from a sequence full of them.

The idea of Compound Clips speaks to another issue with FCPX. One of the hardest adjustments an Avid editor had to make when switching to Final Cut Pro was no longer being able to load a sequence into the source monitor and cut it into the sequence while maintaining master clip information; FCP turned it into a new clip, which really was just a workaround for not being able to generate video mixdowns. This meant that you couldn’t build a select string and then edit from it while still being able to match to your master clip. One would have hoped that FCPX would be able to do something like this, have a more nuanced understanding of the timeline, the way that Avid does, and improve upon a situation where every little move throws everything out of whack unless you’ve gone through and manually locked tracks.

Apple seems to know that keeping things in sync in Final Cut Pro was extremely problematic and has attempted to solve this with Clip Connections and the Magnetic Timeline. Clip Connections can lock a piece of video and its corresponding dialogue to, say, a specific sound effect so that they all travel together all the time. The Magnetic Timeline feature ensures that when this group is moved, you don’t get a clip collision or have to eliminate something from the next piece of media in the timeline. Instead, the next piece of media slides down one track in the timeline. Of course, the demo contains one track of video and two tracks of audio, so it’s easy to see that everything works out. I wonder what will occur when you’ve got two pieces of video composited together with a title on top and your audio has dialogue, music, and a couple of sound effects. Will it move everything in the higher audio tracks down as well, thereby destroying the scheme of your timeline?

The biggest, most apparent change is the absence of the source monitor: it’s the iMovie-ing of non-linear editing. Of all the people watching the preview, applauding wildly and yelling out “I want it!” and “thank you,” I can’t believe that not one person screamed, “where’s the freaking source monitor?” This represents a gigantic change in the way non-linear editing occurs, a nearly unfathomable one. Since non-linear editing was invented, the mainstays have been the source monitor, the record monitor, the browser and the timeline. To take one of these away means that non-linear editing has to be rethought entirely. I’m not quite sure how you can set an exact in point without it, especially when you’re forced into using the iMovie yellow selection brackets.

All this being said, there certainly are some incredible things about FCPX, most obviously that it will render in the background and that no one will have to stare at the “writing video” dialogue box anymore. That really does sound great. And that it will analyze clips upon import so it will stabilize more quickly (although it already does the analyzing in the background). The FCPX function of analyzing clips for shot length and content (wide two shot, close single, etc.) also seems great, though it would have been nice for Steinauer to mention whether this increases import time or not. And since it’s doing all this during-import work, can it also provide a transcript of some sort? That would have been truly useful, because it takes a lot of work to find an interview subject saying the exact right phrase, much more work than scanning through dailies for the close-up series.

Another thing that I would have loved Steinauer to discuss is whether or not an editor can customize how clips are analyzed upon import and how find bin will work now. Specifically, where you will get thrown when you try to find a clip in the browser. Do you get thrown to the folder with other wide shots, with other two shots, with other sunset shots, or do you get taken to the original master clip housed somewhere else? These are the questions that need to be answered, the ones that professionals are asking. Because these are the features that change individual workflow and force editors to alter the habits that they’ve developed over time.

(The audio also gets analyzed during import, to remove hum and balance levels. Do these adjustments hold when you export an OMF and do they carry over to ProTools? Who knows, Steinauer didn’t mention anything about the way FCPX talks to other applications.)

If this is the future of Final Cut Pro, and indeed non-linear editing, then that’s fine and I can’t change it. Just don’t tell me that it’s for pros when it requires you to change the way you’ve been thinking about everything. And don’t make me change for the wrong reasons, for changes made because the improvements speak most to people who aren’t professionals. I love that editing is something that a lot of people can do now, that there’s a greater level of understanding about what it really takes to make a compelling piece out of a collection of images and sounds and your imagination. Editing, for me, is still where the magic is. It’s one thing to make changes for the sake of the people you claim are your clients and quite another to make changes for the sake of people who aren’t. That’s what these changes are: changes for the sake of making editing more accessible, not more functional.

FCPX shouldn’t be about helping people who don’t know what they’re doing, it should be about helping people who do know what they’re doing work better and faster and, most often, that means giving them the flexibility to work however they please, using the techniques they’ve developed over years of working in tough conditions. Because when you don’t have a Senior Creative Director sitting behind you, you don’t really have to worry about finding clips fast enough or making precise edits immediately. But when you are in that situation, you won’t have time to re-think the thing you’ve been doing for years and years.

When FCPX is released in June, the countdown will be on for FCP7. Whether it takes a year or possibly less, support will dry up and eventually it won’t be a viable editing platform anymore. I’m not gonna wait that long. Instead, I’ll reacquaint myself with my old friend Avid, catch up on what I’ve missed and fall back into the warm embrace of my fully customized appearance and keyboard settings. It’ll take a minute to get completely familiar with it, to remember everything, and even to be reminded of all the things that drove me crazy. But at least I’ll still have a source monitor.

source: uk.gizmodo.com

Despite all the hype, 3D at home stumbles

TV and film industries treated 3-D like any other premium tech, pumping it full of marketing dollars. Everyone lost money. Now they await a new generation of film directors to save them using the one thing money can’t rush: talent.

You could forgive them for thinking that selling 3-D movies and TV would be easy. Manufacturers and retailers banked on 3-D’s famous novelty; allegedly “good” films like Avatar; and gleaming new HD infrastructure to carry it all into homes. Instead, most of them lost money in Q1, prompting The Financial Times to declare that 3-D content was doomed to niches like gaming and sports.

Samsung has responded tepidly to the 3-D slump by bundling their TVs with a second pair of 3-D glasses for free. If the problem were solvable by marketing, retail price, or technology, the industry might have corrected its path already. But what’s missing isn’t so easy to conjure: good film-making.

Asked about mediocre 3-D TV sales, Panasonic’s CTO Eisuke Tsuyuzaki echoes a common sentiment in the industry: the barrier is content. “What makes good 3-D TV is new 3-D services,” he says, “and we need to work with the content industry to do this.”

He’s talking about breadth. Panasonic has partnered with DirecTV to produce and manage content for a new 24-hour 3-D channel that will feature all genres of stuff, from sports to documentaries. Partners like DirecTV need a lot of “support,” says Tsuyuzaki, because producing video in 3-D is difficult. “You need a second crew, a second director, and new hardware,” he says.

But that’s not the real holdup, according to sources in Hollywood. Whatever 3-D bottlenecks once existed in the film and TV industry, they’re all but gone now, says Ted Schilowitz of RED Digital Cinema Company, whose menacing-looking Epic 3-D camera rigs are being used to shoot new blockbusters by directors like Peter Jackson, Ridley Scott, and Bryan Singer. “We’ve basically solved all the issues, and the cost wouldn’t even discourage a film with a tiny budget.” Schilowitz says one such film, an indie horror flick named Hellbenders, is being shot in 3-D in suburban New York this spring.

Intel, which makes most of the processors inside today’s 3-D TVs, says that another obstacle is distribution. “The only thing that’s standardized about 3-D is Blu-ray,” says Lance Koenders, the director of marketing for Intel’s digital home group. “What definitely isn’t standardized is how broadcast content and Web content is displayed in 3-D.”

So while TV-makers are bickering over technology standards, Koenders says, many are also hedging their bets, loading TVs with cheap 3-D systems that offload most of the cost onto battery-powered glasses with active shutters in the lenses. (This little strategy is also the reason today’s home 3-D glasses cost $180 instead of $10, like the 3-D glasses you get in the theater.) “These are the well-calculated risks of breaking a chicken-and-egg problem,” he says.

Consumers haven’t been impressed with OEMs’ half-assed 3-D systems, which has put the burden on Hollywood to make 3-D appealing. But Schilowitz says that the move to 3-D isn’t like the move to HD. High-definition TV was about improving infrastructure and picture quality. 3-D, by contrast, is an artistic tool. “When you get right down to brass tacks,” says Schilowitz, “it’s an education issue.” He says most directors of photography in Hollywood haven’t internalized 3-D in their creative process, and it will take time before movie-goers begin discovering films that have innovated with it. “We’re starting to see some guys who are really talented with 3-D,” he says, naming directors of photography like Dariusz Wolski, responsible for Pirates of the Caribbean 4, and John Schwartzman, who is rumored to be using RED 3-D cameras on the next Spider-Man film in 2012. “John [Schwartzman] has taken to it like a duck to water,” says Schilowitz.

Unfortunately for companies like Panasonic and Best Buy, Americans only invest in a new TV an average of once every 8.6 years. By comparison, making a new TV show or movie takes a few months or a year. So retailers and manufacturers will continue to get hung out to dry while Hollywood finds its way.

Manufacturers like LG and Vizio are hoping to speed things up by producing “passive” 3-D TVs that forgo geeky, expensive battery-powered glasses in favor of more traditional-looking 3-D eyeglasses. As more 3-D blockbusters hit Blu-ray and more TVs come bundled with passive 3-D, consumers might get around to trying it. But the real panacea (and it’s not a quick one) may be the proliferation of consumer 3-D cameras. “There is a huge appetite for people to make their own content in 3-D,” says Tsuyuzaki, whose company has produced one of the first consumer-grade high-def 3-D video cameras. At $800, Panasonic’s 3-D video camera could be cheap enough to get people experimenting.

“Most people that buy those consumer cameras will have no idea how to use 3-D,” says Schilowitz, “but then again, some people will. And one of them will become the next Steven Soderbergh.” Until 3-D’s savior is united with his film-making destiny, a billion-dollar chicken and egg problem rolls on.

source: fastcompany.com

Has Apple dumbed down FCP X or is this a step up?

PART ONE
Apple just introduced a new version of Final Cut at the Final Cut Pro SuperMeet during NAB 2011 in Las Vegas, Nevada. Touted as being as “revolutionary as the first version” from 1999, the new Final Cut Pro X was introduced with Apple saying that every major broadcaster and filmmaker nowadays relies on FCP for their video editing needs.

Based on live updates coming from attendees at NAB 2011, Final Cut Pro X has been built from scratch, and it’s entirely 64-bit. It’s based on technologies like Cocoa, Core Animation, OpenCL and Grand Central Dispatch, and it focuses on image quality. It features a resolution-independent timeline up to 4K for scalable rendering — in fact, it appears the old render dialog is gone entirely as the app uses the available CPU to keep files always rendered. FCP X allows you to edit while you’re importing thanks to its new engine, and it’s also got automatic media and people detection on import, as well as image stabilization.

Apple is promoting the new FCP X as a complete and total rebuild. Smart collections look very similar to iMovie, and overall there is a feeling Apple has borrowed some UI elements from the iLife application to make the general design more accessible, even for professionals. For instance, Apple has brought “single keystroke nesting” to Final Cut Pro — a new functionality that allows you to group chunks of media into a single clip in the timeline.  The “inline precision editor” allows you to make edits by revealing media with an iOS-like menu.

Source: http://www.engadget.com

It’s possible that the GUI is more user-friendly and the functionality has improved, but based on the comments and features presented today, the jury is still out as to whether FCP will be going head to head with the competition. Ease of use may not improve functionality so much as level the playing field for all the non-editors out there. I am all in favor of making editing easier, but the roll-out today suggests a beta experience that does little to assist the professional editor in cutting a long-form project. Feels like a step backward on the time/space continuum, and I always get a little queasy when the word iMovie is mentioned in the same breath as Final Cut.

-Scott Arundale
PART TWO

UPDATE 4.13.11

Upon watching the demonstration in full, I got the impression there were some shills in the audience shouting their appreciation for the new features.

Now that calmer minds prevail, let’s look at the upside. Instant nesting with a single keystroke. Easy keyword features. Better sync and collision options. Automatic color grading, stabilization and background rendering. Excellent use of the 64-bit engine. But all of this suggests Apple is more interested in the young editor cutting short-form trailers than the long-form editor trying to cut a feature, never mind the hapless assistant who must keep it together. Much of the work that FCP is trying to achieve is normally the work of the assistant, but Apple presupposes that the editor is working solo. Pity the editor who doesn’t have a second pair of hands to help. I’m all in favor of having the machine do the work. I’m ready as an Apple Certified Trainer to go back to school and re-learn how to cut faster and easier. What I loved about FCP is that copy and paste makes it easy to move stuff around. Apple has made it even “easier,” but again requires a new mindset, i.e., it takes more thought and fewer keystrokes to achieve the same thing. If this is the future, then I am in. But it raises the question: who is the target audience for this product? Surely not the Hollywood narrative professional. Instead the trailer/bumper/extreme sports crowd may find these new features useful. For improved storytelling techniques, the jury is still out.

-SA

Sony and Panasonic go head to head with 3D cameras and displays

Stereo 3D is becoming almost mundane in its ubiquity, with virtually every company of note in the video space touting products capable of acquiring, recording, managing, manipulating, delivering or viewing 3D in some fashion. Sony and Panasonic showed off their new 3D camcorders at the National Association of Broadcasters show here on Sunday.

Cost remains the biggest impediment to production, and Sony and Panasonic, both of whom have vested interests in 3DTV channels (3Net and DirecTV’s n3D) and a strategy to sell more 3D displays to consumers, are preparing to ship new inexpensive, uncomplicated camcorders aimed at putting 3D production in the hands of any professional.

Indeed, by the year end both companies will have professional shoulder-mounted and semi-pro handheld integrated 3D camcorders on the market.

Panasonic’s handheld version (the AG-3DA1) is already out and will be joined in the fall with a second integrated 3D camcorder, this time with a larger imager recording to Panasonic’s memory card format P2. This unit, the AG-3DP1, is intended for use in live productions, sports, independent films and documentaries.

Panasonic claims this shoulder-mounted camera can record 80 minutes of stereo in 10-bit AVC-Intra to twin 64GB P2 cards. It contains two 1/3-inch, 2.2-megapixel 3MOS sensors. By contrast, its predecessor contained 2.7-megapixel chips and records to SD cards.

Panasonic’s shoulder mount will vie for market attention with Sony’s version, which is due out at around the same time. First shown in prototype last September, the PMW-TD300 3D camcorder features a twin optical lens equipped with three ½-type CMOS sensors.

Also shipping this summer from Sony is a compact 3D NXCAM camcorder intended for videographers, events and corporate videos. The HXR-NX3D1 incorporates two ¼-type CMOS sensors, twin 10x zoom lenses and an internal flash memory of 96GB to enable around 7.5 hours of 3D recording.

Panasonic said its 3DA1 was finding favour as a training tool at film schools and sports facilities, including at Florida State for college football.

An eye-catching use of the camcorder will be aboard the final mission of NASA’s shuttle Atlantis this June, during which astronauts will use it to document the International Space Station and experiments in orbit.

At CES earlier this year Sony, Panasonic, and JVC all announced consumer-friendly still imaging and digital video stereo cameras as they seek to create a groundswell of interest and even user generated content in the 3D format. The cameras announced at NAB are a step up in terms of professional ergonomics and imaging quality. Nonetheless there are many critics of such single-bodied twin lens cameras who argue that the fixed interaxial distance between the lenses hampers 3D capture of events, particularly when capturing close ups.

Source: streamingmedia.com

James Cameron & Vince Pace Unveil New 3D Venture At NAB

Source: Deadline Hollywood.

The Cameron-Pace Group, announced today at the start of the National Association of Broadcasters confab in Las Vegas, “seeks to accelerate worldwide growth of 3D across all entertainment platforms including features, episodic and live television, sports, advertising and consumer products.” The company, run by co-chairman James Cameron and longtime collaborator Vince Pace, will offer next-generation camera systems, services and creative tools to the entire entertainment industry, not just film. “Our goal is to banish all the perceived and actual barriers to entry that are currently holding back producers, studios and networks from embracing their 3D future,” Cameron said. “We are dedicated to building a global brand that is synonymous with high-quality 3D and spans multiple channels, from features to episodic television, and changes the boundaries of what is understood to be 3D material.”


Cameron and Pace developed, under Pace’s company PACE, the Fusion 3D system, which was used for the 3D in such films as Avatar, Tron: Legacy and U2 3D. PACE has begun the formal rebranding process, and its operation under the Cameron-Pace Group banner is effective immediately. CPG will be headquartered in Burbank, Calif., the current home to PACE.

CPG already is working on film projects that include Pirates of the Caribbean: On Stranger Tides, Transformers: Dark of the Moon, The Three Musketeers, The Invention of Hugo Cabret, Life of Pi and 47 Ronin.

Young People worldwide are addicted to media

COLLEGE PARK, Md., April 10 (UPI) — It doesn’t matter if a college student lives in the United States, Chile, China, Slovakia, Mexico or Lebanon — many are addicted to media, researchers say.

Susan D. Moeller of the University of Maryland, director of the International Center for Media & the Public Agenda, says that whether in developing or developed countries, the findings are strikingly similar in how teens and young adults use media and how “addicted” they are to their cellphone, laptop or mp3 player.

The researchers and colleagues at the Salzburg Academy on Media & Global Change asked about 1,000 students in 10 countries on five continents to give up all media for 24 hours and record their experiences.

The study found the students reacted almost identically to being unplugged from media and used virtually the same words to describe their reactions, including: fretful, confused, anxious, irritable, insecure, nervous, restless, crazy, addicted, panicked, jealous, angry, lonely, dependent, depressed, jittery and paranoid.

“Perhaps naively, we assumed that we would find substantial differences among the students who took part in this study,” Moeller says in a statement.

“After all, our partner universities come from very different regions and from countries with great disparities in economic development, culture and political governance.”

In short, the students were blind-sided by how much media have come to dominate their lives and their identity, Moeller says.

The study is at: http://theworldunplugged.wordpress.com/

Chapman University wants to overtake USC and NYU


The article in this Sunday’s L.A. Times sums it up nicely.

It has been an extraordinary ride during the last four years. I began teaching at Dodge College as an Adjunct Professor and was thrilled when Bob Bassett offered me a contract a year later. The new Marion Knott Studios complex came with the usual wrinkles: too much technology and over-engineering, but with time the school began to hit its stride. In terms of post-production, we offer 100 Avids along with DS Nitris, Smoke and Flame. We still count ourselves as a “Film School,” and it remains part of our name. There is no other learning institution in the country that features both Autodesk Lustre and Spirit 4K. I give a big shout-out to Dan Leonard, Associate Dean and Chief Technology Officer, the mad scientist who put this rig together, along with Dezso Magyar, Associate Dean and Chief Academic Officer, who consistently reminds me that character development is key to a successful story. I’m very much involved in alumni relations, as I believe the most important aspect of our program will be when graduates return to Orange and share their experiences. It is happening now. We are building our own Dodge College/Chapman mafia, and we have a great reputation in the industry for interns who are bright and committed and show up on time, ready to work!

- Scott Arundale

Final Cut Pro is long overdue for a real upgrade

Many complained that FCP version 7 was not really worthy of a new number but belonged in the version 6 family. In my opinion, Apple has always been trigger-happy with upgrades to all their software, but nevertheless much of their brain trust has been noticeably absent when it comes to improving their editing platform. It has been assumed that the tech wizards were otherwise engaged in the cash cows of iPhone and iPad. This new version is long overdue.

Apple’s Final Cut Pro made its debut at NAB in 1998 before being released as a product the following year. The software has a history of April releases, though its last major version came in July 2009. The software itself hasn’t been a standalone product for quite a while, though, instead being wrapped up as part of Apple’s Final Cut Studio suite, which bundles together Final Cut Pro with Motion, DVD Studio Pro, and Soundtrack Pro, as well as the Color and Compressor applications.

Reports began circulating in late February that Apple was nearing completion on a complete overhaul of the software that would bring Final Cut Pro into the 64-bit era and, more importantly, see a release this spring. That report, from TechCrunch and citing anonymous sources, said that the redesign was both under the hood and in the form of a new user interface.

A new report from ProVideoCoalition says Apple plans on “taking over” the 10th Annual SuperMeet event taking place on April 12 to announce a new version of the software.

It may be time to break out the champagne.


About 3D & Digital Cinema

If you are a tech head, cinephile, movie geek or digital imaging consultant, then we'd like to hear from you. Join us in our quest to explore all things digital and beyond. Of particular interest is how a product or new technology can be deployed and how it impacts storytelling. It may be something that affects how we download and enjoy filmed entertainment. It may pertain to how primary and secondary color grading will enhance a certain tale. The most important thing is that you are in the driver's seat as far as what you watch and how you choose to consume it.