Archive for the ‘Apple’ Category

Final Cut Pro X 10.0.6 Update

From Philip Hodgetts’ blog:

Final Cut Pro X 10.0.6 is probably the most feature-rich release since the original one. As well as the features Apple discussed at NAB 2011:

  • Multichannel Audio Editing Tools
  • Dual Viewers
  • MXF Plug-in Support, and
  • RED camera support

there’s more. Much more. Including a feature I wish they hadn’t put in and one I’m extremely pleased they did. I’m ecstatic that selective pasting of attributes is now a Final Cut Pro X feature, but I’m really annoyed that persistent In/Out points made it to this release. More on these later.

There’s a rebuilt and more flexible Share function; a simplified and improved Unified Import with optional list view, horizontal scopes mode (and a scope for each viewer), Chapter Markers, faster freeze frames, support for new titling features inherited from Motion, more control over connection points, 5K image support, vastly improved Compound Clip structure (both functionally and, for us, in the XML), customized metadata export in the XML (for asset management tools mostly), and two features that didn’t make it to the “what’s new” list: Range Export from Projects and a bonus for 7toX customers.

All up I count more than 14 new features, whereas Final Cut Pro X 10.0.3 had four (although arguably Multicam and Video out were heavy duty features).

Because of the developer connection, I’ve been working with this release for a few months. We have new versions of 7toX and Xto7 waiting for review in the App Store that support the new version 1.2 XML.

MXF and RED Camera Support

In keeping with their pattern, Apple have supported third-party MXF solutions rather than (presumably) paying license fees for each sale when only a small percentage of users will use the MXF input (and yes, output) capability. The named solutions are MXF4mac and Calibrated{Q} MXF Import, but apparently there are others. Working with MXF files should not feel any different from working with QuickTime files.

Along with RED native support, Apple quietly upped the maximum resolution from 4K (since release) to 5K.

I don’t work with MXF or RED so I’ve had no ability (nor time) to test these functions. I’ll leave that to those with more knowledge.

Dual Viewers

More accurately, the current Timeline Viewer and an optional Event Viewer (Window > Show Event Viewer). You get one view from the Timeline and one view from the Event. I can see how this could be useful at times, although truthfully I never missed it.

Final Cut Pro X's dual viewers
One Viewer from the Event, one Viewer for the Project. (Click to enlarge)

Multichannel Audio Tools

There’s a lot of room for improvement in the audio handling in Final Cut Pro X, so the new multichannel audio tools are a welcome step in the right direction. Initially there’s no visible change, until you choose Clip > Expand Audio Components. With the audio components open, you can individually apply levels, disabled states, pans and filters to individual components, trim them, or delete a section in the middle – all without affecting the clips around them.

Final Cut Pro X's multichannel audio
No multichannel audio in Solar Odyssey, so I borrowed a test file from Greg that we used to add support to 7toX and Xto7.

For 7toX, if there are separate audio levels, pans or keyframes on a sequence clip’s audio tracks these will be translated onto the separate audio components in Final Cut Pro X. Similarly for Xto7 the levels/pans/keyframes on the clip’s audio components are translated onto the audio clip’s tracks.

More flexible Scopes

A new layout – Vertical – stacks the Scopes on top of each other. Better still, they remember their settings from the last time they were used! Also good is that you can open a Scope for each of the viewers.

Final Cut Pro X's dual scopes
Dual Viewers, Dual Scopes and a stacked Vertical layout. If the brightness control was there before, I missed it.

You should note that both those images are 720p at 97% (from the Parrot AR.Drone 2.0, FWIW). I love the Retina display!

Improved Sharing

There’s now a Share button directly in the interface. More importantly, you do not have to open a Clip from the Event as a Timeline to export.

Final Cut Pro X's share destinations
Share directly from Event or Project.

But what’s that at the end? Why yes, I can create a Share to my own specifications, including anything you can do in Compressor (by creating a Compressor setting and adding that to a New Destination). Note that HTTP live streaming is an option.

Final Cut Pro X's share destinations
Now you can create a custom Share output for exactly your needs.

Final Cut Pro X 10.0.6 will also remember your YouTube password, even for multiple accounts. If you have a set package of deliverables (multiple variations for example) you can create a Bundle that manages the whole set of outputs by applying the Bundle to a Project or Clip in Share. Create a new bundle and add in the outputs you want.

Range-based export from Projects

Another feature not seen on the “What’s new” list is the ability to set a Range in a Project and export only that Range via Share. A much-requested feature that’s now available.

Unified Import

I never quite loved that I would import media from the cameras (or their SD cards) via the Import dialog, while importing Zoom audio files was a whole other dialog. Not any more with the new unified Import dialog. There’s even an optional List View, which is my preferred option. (The woodpecker was very cooperative and let me sneak in very close with the NEX 7.)

Final Cut Pro X's unified Import dialog
The Unified Import dialog, with an optional list view like the Event list view, complete with skimmer, as well as the familiar filmstrip view.

Waveforms and hiding already-imported clips are also options. The window now (optionally) closes automatically when Import begins.

Other import options.

Chapter Markers

For use when outputting to DVD, Blu-ray, iTunes, QuickTime Player, and Apple devices.

Final Cut Pro X's chapter markers
There are now three types of Marker: Marker, To Do and Chapter.

The Marker position notes the actual chapter point, while the orange ball sets the poster frame, which can sit on either side of the chapter mark. A nice refinement for Share.

In 7toX translation, sequence chapter markers become chapter markers on a clip in the primary storyline at the same point in the timeline.

Fast Freeze Frame

Simply select Edit > Add Freeze Frame or press Option-F to add a Freeze Frame to the Project at the Playhead (or if the Clip is in an Event, the freeze frame will be applied in the active Project at the Playhead as a connected clip). Duration, not surprisingly, is the default Still duration set in Final Cut Pro X’s Editing preferences.

New Compound Clip Behavior

Did you ever wonder why Compound Clips were one-way to a Project and didn’t dynamically update, but Multicam was “live” between Events and Projects? So, apparently, did Apple. (We certainly did when dealing with it in XML.) Compound Clips are now live.

  • If you create a Compound Clip in a Project, it is added to the default Event and remains linked and live.
  • If you create a Compound Clip in an Event, it can be added to many Projects and remain linked and live.

By linked and live I mean that, like Multicam Clips, changes made to a Compound Clip in an Event will be reflected in all uses of that Compound Clip across multiple Projects.

Changes made to a Compound Clip in a Project, are also made in the Compound Clip in the Event and all other Projects.

To use the old behavior and make a Compound Clip independent, duplicate it in the Event.

The old behavior is still supported so legacy Projects and Events will be fine.

Final Cut Pro 7 sequences translated using 7toX become these new “live” Compound Clips. If you don’t want this behavior you can select the Compound Clip in the Project timeline and choose Clip > Break Apart Clip Items to “unnest” the compound clip.

Selective Pasting of Attributes

It had to be coming, and I’m glad it’s here. This has probably been the feature from Final Cut Pro 7 I’ve missed most.

Final Cut Pro X's Paste Attributes
Looks familiar! I like the visual clue of which clip the content is coming from, and which it is targeted at.

One of the things I love about Final Cut Pro X is that there are “sensible defaults”, not least of which is the Maintain Timing choice. In the 11–12 years I spent with FCP 1–7, on only three occasions did I want the opposite setting; every other time I had to change to Maintain Timing, which is now thankfully the default.

Persistent In and Out Points

You got them. And it’s a good implementation, allowing multiple ranges to be created in a clip. I am not a fan, and wish it were an option. Over the last two months I’ve added keywords to “ranges” I didn’t intend to have because the In and Out were held over from the last playback or edit I made. Not what I want. So I have to select the whole clip again, and reapply the Keyword. It gets old after the twentieth time.

It gets in my way more than it helps, which is rather as I expected. Selection is by mouse click (mostly – there is limited keyboard support) so this gets every bit as confusing as I anticipated.

Your last range selection is maintained. To add additional range selections (persistent) hold down the Command key and drag out a selection. (There are keyboard equivalents for setting a new range during playback.) You can select multiple ranges and add them to a Project together. (I’m not sure about the use case, but it’s available.)

Customizable Metadata Export to XML (and new XML format)

Along with a whole new version 1.2 of the XML format (which lets us support more features) comes the ability to export metadata into the XML. These are the metadata collections found at the bottom of the Inspector.

Final Cut Pro X's XML metadata view
Select the metadata set you want included in the XML during export.

Remember that you can create as many custom metadata sets as you want and choose between them for export. This will be a great feature as soon as Asset Management tools support it. No doubt Square Box will be announcing an update for CatDV the moment this release of Final Cut Pro X is public.

The new XML format also allows 7toX to transfer a Final Cut Pro 7 clip’s Reel, Scene, and Shot/Take metadata into their Final Cut Pro X equivalents.
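To make this concrete, here’s a minimal sketch of reading that kind of metadata out of an exported XML file with a standard XML parser. The element layout and key names below are illustrative assumptions modeled on FCPXML, not copied from Apple’s documentation:

```python
# Sketch: pulling Reel/Scene/Shot metadata out of an FCPXML-style document.
# The element layout and key names here are assumptions for illustration,
# not guaranteed to match Apple's published FCPXML 1.2 schema exactly.
import xml.etree.ElementTree as ET

sample = """<?xml version="1.0" encoding="UTF-8"?>
<fcpxml version="1.2">
  <resources>
    <asset id="r1" name="Interview A" src="file:///Media/interview_a.mov">
      <metadata>
        <md key="com.apple.proapps.studio.reel" value="R001"/>
        <md key="com.apple.proapps.studio.scene" value="12"/>
        <md key="com.apple.proapps.studio.shot" value="3"/>
      </metadata>
    </asset>
  </resources>
</fcpxml>"""

def asset_metadata(xml_text):
    """Return {asset name: {short key: value}} for every asset's <md> entries."""
    root = ET.fromstring(xml_text)
    out = {}
    for asset in root.iter("asset"):
        fields = {}
        for md in asset.iter("md"):
            short = md.get("key", "").rsplit(".", 1)[-1]  # e.g. "reel"
            fields[short] = md.get("value")
        out[asset.get("name")] = fields
    return out

print(asset_metadata(sample))
# {'Interview A': {'reel': 'R001', 'scene': '12', 'shot': '3'}}
```

Parsing it this way is roughly all an asset-management tool would need to do to pick up whichever metadata set you chose at export time.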

Flexible Connection Points

We’ve always been able to hold down the Command and Option keys to move a connection point. What is new is the ability to move a clip on the Primary Storyline while leaving connected clips in place. This is really a great new feature and one I’ve used a lot. Hold down the Back-tick/Tilde key (` at the top left of your keyboard) and slip, slide, trim or move the Primary Storyline clip leaving the connected clips in place.

Titling is significantly improved, including support for the new title markers feature in Motion

As I’m not in the Motion beta I’m not at all certain what this means. I’m sure Mark Spencer will have an explanation over at RippleTraining.com soon.

Drop Shadow effect

Well, a new effect that adds a Final Cut Pro 7 style drop shadow to a Clip. If you’ve got a clip in Final Cut Pro 7 with a Motion tab Drop Shadow applied, 7toX will add the new Drop Shadow effect to it during translation.

Bonus unannounced feature – XML can create “offline” clips

This is great news for developers because previously all media referenced by an XML file had to be available (online) when the XML was imported into Final Cut Pro X. With XML version 1.2 that’s not necessary, so we’ve taken advantage of this in 7toX. The user can relink the offline clips to media files by the usual File > Relink Event Files… command after translation and import.
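For tool developers, one practical consequence is that a generated XML file can be sanity-checked for media that will come in offline. A small sketch of the idea, again using an assumed asset/src layout rather than the exact FCPXML schema:

```python
# Sketch: find which assets in an FCPXML-style file reference media that is
# not on disk, and would therefore import as "offline" clips under XML v1.2.
# The asset/src attribute layout is an assumption for illustration.
import os
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

def offline_assets(xml_text):
    """Return names of assets whose file:// src does not exist locally."""
    root = ET.fromstring(xml_text)
    missing = []
    for asset in root.iter("asset"):
        src = asset.get("src", "")
        path = urlparse(src).path  # file:///Media/a.mov -> /Media/a.mov
        if path and not os.path.exists(path):
            missing.append(asset.get("name"))
    return missing

sample = """<fcpxml version="1.2">
  <resources>
    <asset id="r1" name="Clip A" src="file:///definitely/missing/a.mov"/>
  </resources>
</fcpxml>"""

print(offline_assets(sample))  # ['Clip A'], unless that path happens to exist
```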

What else do I want?

I’d like a Role-based audio mixer.

I’d like Event Sharing to multiple users at the same time, with dynamic update of keywords and other metadata between editors. (I do not think I want to share a Project in that way – rather, sequential management of a Project, like Adobe Anywhere.)

The Rise of Dual Screen Apps, courtesy of Apple TV

Article by Jeremy Allaire, mashable.com

Dual-screen apps are a new phenomenon, enabled by the advent of wireless technologies that allow for effortless pairing of a PC, tablet or smartphone with a TV. They are changing how people interact with and “consume” content within apps. For developers this creates many new opportunities to provide better experiences for their users, but it requires thinking about dual-screen setups from the start, as well as new tools.

The opportunity for dual-screen apps is huge. And it’s more than just watching a video or playing a game: Dual-screen apps have the potential to transform the office meeting room, the classroom, the retail store, the hospital, and really any other context where people are interacting around content and information and where that information would benefit from rendering and display on a large screen such as a TV monitor.

To better understand this concept, it’s necessary to step back and reconsider the nature of how we write software and the user experience model for software.

The Evolution From Single Screen

Today, the predominant user-experience model for software and applications online is a single screen. We browse web applications on a desktop PC, mobile browser or tablet browser and interact with and consume content and applications on that screen. It is very much a single, individual user task. Likewise, we install apps onto these devices and consume and interact with information, perform tasks, make purchases, etc. through these apps. Again, this is a solitary, individual task.

As a result, when software creators plan their applications, they are typically designed and developed with this single user, single-screen concept in mind.

Dual-screen apps change all of that by shifting the software and user experience model from one user to potentially many, and from one screen (PC/phone/tablet) to two screens (phone/tablet and TV monitor). From a software development and user-experience perspective, the large monitor (which is the true second screen — versus the standard concept that considers the tablet as the second screen) becomes an open computing surface where one can render any form of application functionality, information, data and content.

Importantly, designers and developers need to shed the concept that “TVs” are for rendering video, and instead think about TVs as large monitors on which they can render applications, content and interactivity that’s supported by a touch-based tablet application.

The Social Computing Surface

While we have the greatest affinity for large monitors as fixtures of the living room, flat-screen monitors are increasingly becoming a ubiquitous part of our social fabric. In fact, large monitors often sit at the center of any social setting. In the home, these large monitors provide a social surface for those sharing the living room space. Increasingly, monitors are a common part of nearly every business meeting room space — not for watching video, but for projecting shared content and business data and presentations that support business and organization collaboration.

Likewise, monitors are in medical and hospital settings providing visual information to patients. They are increasingly in nearly every classroom, whether through a projector or an actual TV monitor, and they support the presentation of information that is needed by a classroom of students. Large monitors are increasingly ubiquitous in retail settings as well.

The key concept here is that this pervasive adoption of TV monitors is the tip of the spear in creating a social computing surface in the real world. Forget about social networks that connect people across their individual, atomized computing devices — the real social world is groups of people in a shared space (living room, office, classroom, store, etc.) interacting around information and data on a shared screen.

Until very recently, the way in which these TV monitors could be leveraged was limited to connecting a PC through an external display connector to a projector or directly to a TV. The recent breakthrough that Apple has fostered and advanced more than any other tech company is AirPlay and associated dual-screen features in iOS and Apple TV.

Specifically, Apple has provided the backbone for dual screen apps, enabling:

  • Any iOS device (and OS X Mountain Lion-enabled PCs) to broadcast its screen onto a TV. Think of this as essentially a wireless HDMI output to a TV. If you haven’t played with AirPlay mirroring features in iOS and Apple TV, give it a spin. It’s a really exciting development.
  • A set of APIs and an event model for enabling applications to become “dual-screen aware” (e.g. to know when a device has a TV screen it can connect to, and to handle rendering information, data and content onto both the touch screen and the TV screen).

With the Apple TV already outselling the Xbox in the most recent quarter, we can see a world that goes from approximately 5 million dual-screen-capable Apple TVs to potentially 15-20 million in the next couple of years, and eventually to 30-50 million as new and improved versions of the Apple TV companion device come to market.

As a result, it’s an incredible time to experiment with this fundamental shift in computing, software and user experience, to embrace a world where the Tablet is the most important personal productivity device, and the TV is a rich and powerful surface for rendering content and applications.

How Dual-Screen Apps Will Work

As we rethink the TV as a computing surface for apps, it’s really helpful to have some concrete ideas of what we’re talking about. Below is a series of hypothetical examples of what is possible today — and, of course, what will become even bigger as these new dual-screen runtimes proliferate.

Buying a House: Imagine you’re looking into buying a house. You open your tablet app from a reputable home-listing service and perform a search using criteria that you care about and begin adding potential fits to a list of houses you’d like to explore. When you select a specific house, the app detects you’re connected to an Apple TV and launches a second screen on the TV that provides rich and large visual displays about the house — HD-quality photos and contextual information about the house. Here, the power of dual screen is the fact that you and your spouse can sit in the living room and explore a house together without crouching over a computer or tablet on someone’s lap, and the house can be presented with HD-quality media and contextual information.

Buying a Car: Imagine launching the BMW app on your tablet and deciding to both learn about car models and configure a car — like buying a house, often a “social” decision between partners. On the TV, the app renders a high-quality rendition of the car. As you explore the car’s features from your tablet, associated media (photos, video and contextual metadata) render onto the large TV in front of you. As you configure your car using your tablet, it updates a visual build of the car on the large screen, providing an inline HD video for specific features.

Kids Edutainment: Looking to introduce your three-year-old to key cognitive development concepts? Launch a learning app where the child interacts with the tablet application and sees visual information, animation and other content on the TV screen. Their touches on the tablet instantly produce rich and relevant content on the TV screen. Learning to count? Feed cookies over AirPlay to Cookie Monster on the TV, who eats and counts with you. Learning about concepts like near and far? Tap the tablet to make a character move closer to and away from you. Build a character on the tablet and watch the character emerge on the TV screen.

Sales Reporting: As a sales manager, you walk into your team conference room with a TV monitor mounted on the wall. You open the Salesforce.com app on your tablet and begin filtering and bringing up specific reports, and with the touch of a button you push unique visual reports onto the shared surface of the conference room TV. Here, the sales manager wants control of the searches and filters they have access to, and only wants to render the charts and reports that the whole team needs to see.

Board Games: Imagine playing Monopoly with your family in the living room — one or two or maybe even three touch devices present (phones, iPod touches, iPads). Each player has their inventory of properties and money visible on their device. The app passes control to each user as they play. On the TV screen is the Monopoly “board” with a dynamic visual that updates as users play — the movement of players, the building up of properties, etc.

The Classroom: A teacher walks into a classroom with an Apple TV connected to an HDMI-capable projector that projects onto a wall or screen. From their tablet, they pull up an application that is designed to help teach chemistry and the periodic table — they can control which element to display on the screen, and the TV provides rich information, video explanations, etc. The app is designed to provide ‘public quiz’ functionality where the TV display shows a question, presumably related to material just reviewed or from homework; students raise their hands to answer, and then the answer and explanation are displayed.

Doctor’s Office: You are meeting with your doctor to go over test results from an MRI scan. The doctor uses his or her tablet to bring up your results, picks visuals to throw onto the TV monitor in the room, then uses his or her finger to highlight key areas and talk to you about what they’re seeing.

Retail Electronics Store: You’re at a Best Buy and interested in buying a new high-quality digital camera. A sales specialist approaches you with tablet in hand and asks you a few questions about what you’re interested in while tapping your choices into their tablet app. From there, it brings up a set of camera options on a nearby TV display — based on further probing, they drill into a specific camera choice, which brings up a rich visual with a video overview of the specific camera you’re interested in.

Consuming News: A major revolution has just broken out in a nation across the planet. Time has captured incredible audio, photos and video of the events. You and your friends sit down in front of the TV to learn more. You open the Time Magazine tablet app and bring up a special digital edition about the revolution. From the tablet, you flip through and render onto the TV rich HD-quality photographs, listen to firsthand audio accounts (accompanied by photos) and watch footage from the events. The app renders a huge visual timeline of the events that led up to the revolution. It’s an immersive media experience that can be easily shared by friends and family in the living room.

Consuming Video: Last but not least, of course, dual-screen apps will be essential to any app that is about consuming video — whether a news or magazine app, a vertical website (think Cars.com, BabyCenter.com, AllRecipies.com, etc.), or of course a catch-up TV app from a TV network or show that you care about. You open the app on your tablet to explore what to watch, and when you’re ready to watch, the show instantly pops onto your TV in gorgeous HD quality, and the tablet app becomes your remote control, presenting relevant contextual information about the video, episode or what have you.

The Coming Dual-Screen Revolution

This is such a groundbreaking approach to apps and software that we expect lots of others to try to emulate what Apple is doing. Already, Microsoft is promoting the ability to use its Surface tablet in conjunction with apps built for the Xbox. Samsung has introduced features in its tablets and TVs to enable easy media sharing from your tablet or phone onto a Samsung Smart TV, and surely Google will follow suit with features similar to AirPlay in the Android OS. Apple is still early in deploying this technology — it’s sometimes flaky and a little bit hidden from end-user view — but I expect major changes in the coming months and years.

Virtually every application that exists on the web, phones and tablets likely has a dual-screen use case. Simply put, web and app designers and developers need to imagine a world where the tablet and TV are a single runtime for their applications, with each screen providing distinct value for the user controlling the app and the user consuming rich media and information on a large display. Sometimes this is just one person (like picking and watching a show, playing a game or learning something), but crucially and very often I believe that these apps will be designed with multiple users — and a social context — in mind.

Jeremy Allaire is CEO and founder of Brightcove, a global provider of cloud-content services that offers a family of products and developer tools used to publish and distribute professional digital media.

The iPad 3 morphs into a professional film camera

The iPad has just been turned into a fully functioning professional-standard digital film camera by a New York-based startup called The Padcaster LLC.

The company has created what it calls The Padcaster, which takes the humble iPad 3’s video capabilities and catapults them into professional level with the addition of an aluminium frame with threaded holes around the edges to attach external mics, lights and other accessories.

The frame can be connected to a professional tripod, monopod or shoulder mount and, crucially, the Padcaster has an optional ‘Lenscaster’ that attaches to the Padcaster and makes it possible to attach standard camera lenses to the iPad.

There’s even a 35mm lens adapter to enable cinema lenses to be strapped on to the iPad to capture shallow depth of field and provide DSLR-like focusing.

The Padcaster is aimed at video journalists, videographers and DSLR shooters and, says the maker, provides the opportunity to make “film-quality footage as an all-in-one production studio on the go”, capturing images with the device then using video and audio editing and grading apps freely available on the iPad to cut the footage.

The Padcaster and Padcaster/Lenscaster combo are currently available at a ‘special launch price’ of $149 and $189 respectively.

PADCASTER PRODUCT TOUR from Manhattan Edit Workshop on Vimeo.

Here’s a short film shot entirely on the iPad 3 using the Padcaster.

“Sprung Spring” – shot on the iPad3 (new iPad) with the Padcaster from Manhattan Edit Workshop on Vimeo.

http://www.televisual.com/news-detail/The-iPad-3-morphs-into-a-professional-film-camera_nid-1938.html

Avid goes mobile

Avid, maker of high-end digital video and audio production tools, is bringing its “pro-sumer” video editing software to the iPad.

The app is available starting Thursday as part of the Avid Studio suite. The app will run on iPad only, though Avid says it’s exploring other mobile operating systems.

Avid Studio for iPad costs $4.99 to start; after 30 days, the price will jump to $7.99.

That’s still much less than what other current desktop editing applications cost, including Avid’s own Avid Studio ($129.99), Adobe Premiere Elements ($99.99), Apple’s Final Cut Pro X ($299.99), and Sony’s Vegas Movie Studio HD Platinum ($59.95).

The iPad app marks the Burlington, Mass.-based company’s first video editing application for tablets. Video editing software generally requires a substantial desktop system or a bulky laptop; using video editing apps on relatively small smartphone screens can be cumbersome. Avid is hoping its app hits somewhere in the middle.

“We’ve seen a shift in how creation is happening, and it’s really happening on almost any device,” said Tanguy Leborgne, vice president of consumer and mobile technology strategy at Avid. “We think the tablet is more than just a consumer device; more and more people are creating on it.”

While Avid says the app captures most of the editing capabilities available on its desktop system, there are some obvious areas in which an iPad editing app would be lacking.

For starters, pro-level editors accustomed to using a large screen for edits will likely feel a tablet doesn’t provide enough screen real estate for real edits.

Also, with Avid Studio on a PC, video editors can export a Flash video file, and burn video files to a CD or DVD. On the iPad, neither of those functions is an option.

Users also likely won’t want to export lots of large, high-definition video files to the iPad and take up storage space on the tablet.

Fortunately, full projects and video files can be transferred to and from the Avid Studio app via iCloud and iTunes. Finished movie files can also be shared directly from the Avid app to Facebook and YouTube.

The idea is that the iPad app and the desktop software are complementary, Leborgne said, so that users who want to create and edit projects on the go can do so, but ultimately preserve them by taking them to the PC.

The Avid iPad app does have some nice features, including an interface that includes a storyboard area and an editing timeline. And while some video editors rely heavily on customized keyboards or a mouse, others might appreciate the ability to pinch and squeeze videos and images to scale them on the touchscreen of the iPad, or the ability to move text and titles around with their fingers.

Avid’s new product comes just a couple days after Apple released an update for its Final Cut Pro X (FCPX) video editing software, which addressed video editors’ complaints about the software’s lack of professional-level bells and whistles. Now FCPX includes multicam editing, advanced chroma-key features and the ability to open up old FCP projects in the new software.

While Adobe Premiere is considered the first popular digital video editing application, it was Apple’s Final Cut Pro, which launched in 1999, that eventually chipped away at the market of video editors using Avid’s high-end system.

Apple’s FCPX also comes at a significantly reduced price from previous iterations of Final Cut Pro, which used to cost around $1,000. Both Avid and Adobe responded to Apple’s new software by offering discounts to users who switched over to their software.

“Both Apple’s product and the pricing strategy were the same thing we’re trying to address here,” Leborgne said. “But for professionals, it relayed to them that Apple was not really focused on the higher end of the market.”

As evidence that some professionals were disappointed with the new FCPX, Leborgne pointed to Hollywood production company Bunim/Murray — the reality TV pioneers dropped Final Cut Pro in favor of Avid.

http://allthingsd.com/20120201/avid-brings-its-pro-sumer-video-editing-app-to-ipad/

Final Cut Pro X yet to be embraced by Reality Producers

Chris Foresman reports from arstechnica.com:

Following the controversial launch of Apple’s completely revamped video editing software, Final Cut Pro X, Avid has announced (hat tip to MacRumors) that award-winning TV production company Bunim/Murray is dropping Final Cut Pro in favor of a complete Avid makeover. Going forward, the company will use Avid Media Composer and Avid Symphony for editing along with an Avid ISIS 5000 networked storage system to replace its current Final Cut Pro workflow.

“Due to the large volume of media generated by our reality shows, we needed to re-evaluate our editing and storage solutions. At the same time, we were looking for a partner who would understand our long-term needs,” Bunim/Murray’s SVP of post production Mark Raudonis said in a statement.

The message between the lines is that Apple’s latest offerings simply won’t (ahem) cut it anymore. Earlier this year, Apple completely re-architected Final Cut Pro X from the ground up with a new, modern media handling framework as well as 64-bit support. In doing so, however, it dropped many features that editing pros had come to rely on in their workflows. Apple also dropped its Final Cut Server product after phasing out its Xserve and Xserve RAID storage products over the past few years.

The decision has left many working pros wondering if Apple cares much about the professional market anymore. And, while the company has promised to improve Final Cut Pro X over time, those promises apparently weren’t enough for Bunim/Murray and others who have since migrated to competing solutions from Avid and Adobe. Let’s just say we don’t expect this to be the last we hear about major production companies making the switch.

UPDATE: 12/31/12

Apple has updated the latest version of its video editing software, Final Cut Pro X, addressing some of the complaints video editors had voiced when it was released last June. The updated FCPX now includes multicam editing and advanced chroma-key controls, and supports a new tool for opening up old Final Cut Pro 7 projects.

http://allthingsd.com/20120131/apples-updates-final-cut-pro-x-addressing-video-editors-complaints/?refzone=topics_apple

iPads in schools: ‘The last generation with backpacks’?

In survey, 16% of school tech directors expect to have 1 tablet per student within 5 years


Whether counting heads at the Apple Store or buttonholing cell phone users at the Mall of America, Piper Jaffray’s Gene Munster is the master of the small survey that may or may not be significant.

His latest: A survey of 25 educational technology directors at a conference on integrating technology in the classroom. “While our sample is small,” he writes in a note to clients issued Monday, “so is the population of IT decision makers in the education field in the US.”

And what did he discover? Among his findings:

  • 100% were testing or deploying iPads in their schools. 0% were testing or deploying Android tablets
  • Their schools currently have an average of one computer for every 10 students
  • Nearly half (12) expect to eventually deploy one computer per child; two of their schools already do
  • More than a third (9) expect to deploy one tablet per child; one of them already does

Given the huge problems facing America’s schools, it’s a slender thread on which to base a vision of broad educational reform. (Munster quotes outgoing Apple retail chief Ron Johnson, who has suggested that the current crop of students might be “the last generation with backpacks.”)

But Munster is probably correct that the overwhelming preference for iPads over tablets running Google’s (GOOG) Android reflects the power of Apple’s (AAPL) first mover advantage. He writes:

“We also see a trend in education (which is mirrored in the enterprise) that familiarity with Apple devices among students (or employees) is causing a demand pull within institutions to also provide Apple devices.”

SOURCE:  http://tech.fortune.cnn.com

Walter Murch on the demise of FCP

Chris Portal attended the Boston Supermeet of the Final Cut Pro Users and reports:

Walter Murch, a long-time Final Cut Pro user and editor of Apocalypse Now, The Godfather Part III, The English Patient, Cold Mountain, and Tetro, among many other films, headlined the Boston Supermeet on Thursday, October 27, 2011. It marked his first public appearance since the launch of Final Cut Pro X.

Hemingway & Gellhorn is his latest project for HBO, and is edited on Final Cut Pro 7. The film is a celebration of the tactility of film, yet one that wouldn’t have been possible without digitization. It uses archive material existing on a wide variety of film media, all with different grain sizes, into which actors are dropped digitally while trying to preserve the grain of the original element. The film takes you on a roller-coaster ride, diving in and out of this world, going into the grain and sprockets, and out into the digital world.

His Final Cut Pro project consisted of 22 video tracks and 50 audio tracks, combining sound elements ranging from 8 tracks of dialogue to 24 tracks of mono and stereo sound effects, with and without low-frequency enhancement (LFE)!

Another piece of the workflow was the integration of FileMaker Pro, which he uses to gain a different insight into his film. Using a dependency diagram of sorts, he associates every shot with a specific scene, along with the music and effects that should belong to it. It’s not a timeline in any way, but more a view of all the relationships between your media assets.
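
As a rough illustration, that kind of FileMaker setup amounts to a small relational table keyed by shot. A toy sketch in Python (all shot, scene, and cue names here are invented, not from the actual film):

```python
# Each record ties a shot to its scene and to the music and effects
# that belong with it -- a relationship view, not a timeline.
shots = [
    {"shot": "12A-3", "scene": "hotel", "music": "cue_07",
     "effects": ["grain match", "sky replacement"]},
    {"shot": "12B-1", "scene": "hotel", "music": "cue_07",
     "effects": []},
    {"shot": "31C-2", "scene": "harbor", "music": None,
     "effects": ["stabilize"]},
]

def by_scene(scene):
    """All shots belonging to a scene, with their music and effects."""
    return [s for s in shots if s["scene"] == scene]

for s in by_scene("hotel"):
    print(s["shot"], s["music"], s["effects"])
```

The value of the dependency view is exactly this kind of query: every asset related to a scene in one place, independent of where the clips sit on the timeline.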

As for other equipment used on the project, Walter shot with 2 Arri Alexas outputting to Codex recorders. The Codex material was downloaded as ProRes LT (1280) for editing, DPX “negative” files to do the final color timing, and H.264 for internet delivery via PIX (to share assemblies with HBO). There were 5 editing stations, sharing a 28 TB Xsan volume on Xserve RAID storage served by an Xserve.

There were 1862 shots in the finished film:

  • 482 visually manipulated
  • 227 visual effects
  • 255 repositioned or blown up

While there used to be a rule of not blowing up an image beyond 120% to avoid introducing noise and grain, with the Alexa footage he was able to take the film and blow it up 240% without the enlargement being noticeable.
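
The arithmetic behind that rule of thumb is simple: the blow-up percentage determines how small a region of the source frame gets stretched to fill the delivery frame. A minimal sketch (the 1920×1080 frame size is illustrative, not Murch’s actual setting):

```python
def source_region(frame_w, frame_h, blowup_pct):
    """Size of the source crop that fills the full frame after a
    given blow-up percentage -- the fewer source pixels remaining,
    the more visible the noise and grain once enlarged."""
    scale = blowup_pct / 100.0
    return round(frame_w / scale), round(frame_h / scale)

print(source_region(1920, 1080, 120))  # (1600, 900)
print(source_region(1920, 1080, 240))  # (800, 450)
```

At 240%, only an 800×450 patch of a 1080p frame is being stretched to full size, which is why clean, low-noise Alexa footage is what makes such an extreme blow-up survivable.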

He used FCP 7, and acknowledged this may be the last time he uses Final Cut Pro. He considers many professionals to be at a juncture where they need to come to terms with what the software can do within the time a film is in development.

Walter was in Cupertino when Final Cut Pro X was first dangled in front of a few editors. It was a beta version, and Apple highlighted things like 64-bit support. After that initial exposure to FCPX, he dove into making a film, and it wasn’t until FCPX shipped in June that he revisited it. He took a quick look and concluded he couldn’t use it, wondering where the “Pro” had gone. It didn’t have the XML support he depended on, nor the ability to share projects on a RAID with other editors. He was confused and wondered what was happening.

He wrote Apple a letter asking what was behind everything that was happening, especially since they had end-of-lifed the current version, and included a list of things he needed. Borrowing a phrase from children’s report cards, Walter explained to Apple that without XML, FCPX “did not play well with others”. The lack of tracks was another killer for him. While he doesn’t really need to work with 50 tracks, he does need the ability to raise or lower levels very selectively.

Walter sees a shift having taken place at Apple over the last 10 years. The company benefited from the professional market, and professionals made a lot of noise on Apple’s behalf, but starting with the iPod, iPhone, and iPad, Apple has broadened into a mass-market creature, wanting to democratize capabilities even further.

While Walter is encouraged by last month’s FCPX update, he hasn’t used it on any real work yet, so he is cautiously optimistic (and, he says, still traumatized). “Do they love us? No…I know they like us….but they keep saying they love us??”

Things wrapped up with a Q&A, mostly comprised of questions attendees had submitted that evening prior to his talk. A few interesting ones were:

Q: When is it time to walk away from the work?

A: “When you see dailies, that is the only time you are seeing the images for the first time. There will be no other time for a first. It is the closest you can get to experiencing what the audience will experience. It’s a precious moment. I will sit and watch the dailies in the dark, holding a computer where I’ll type anything the image makes me feel or think, in order to preserve that first moment. Doing so will help clear the fog down the road when you’re feeling you’re getting lost.”

Q: How do you know if a scene works or doesn’t?

A: “A scene may work on its own, but not in the context of the movie. It can be very dangerous to preemptively strike a scene from a film before you’ve seen the entire film. You can say you don’t agree with where the scene is going, but you don’t know if in the larger picture it may still have a shot.”

Q: Is there one piece of advice you can impart to sound designers?

A: “Always go farther than you think you can go. Try to bend the literalness. Literalness doesn’t light the fire in the audience’s mind. Levitate the film. Ignite the imagination.”

Q: Thoughts on 3D?

A: “In 2D, your eyes focus on the plane of the screen while they converge toward the plane of the screen, but when you have something coming out of the screen in 3D, you not only need to focus on the screen, but you also need to converge on the detail protruding out of the screen. The mind can do it, but we’re not programmed for it. It requires processing many frames before your mind figures it out, and by then you’ve missed information. It’s analogous to the moment when the fan on your computer starts up.”

Q: If you didn’t use FCP, where would you go?

A: “I’ve used Avid in the past, so I know it well. There are some very good things that Avid has, but I’m also curious about Premiere since I’m interested in technology.”

Apple iCloud vs. UltraViolet

Apple is working on a cloud service for movies that is set to launch late this year or in 2012, the LA Times reported.

Apple has been meeting with studio executives and is looking to finalize agreements that would allow consumers to buy films via Apple’s iTunes and access them on any Apple device via its iCloud online locker, it said.

The news comes after Tuesday’s launch of the UltraViolet cloud service, for which Wall Street has a mixed outlook, with the home entertainment release of Warner Bros.’ Horrible Bosses.

Since Apple is not currently part of the consortium of studios, electronics makers and online distributors that launched UltraViolet, the company is considering allowing consumers who buy UltraViolet-enabled films to watch them more easily on Apple devices via apps, the Times said. BTIG analyst Richard Greenfield has said Apple’s lack of participation is a key hurdle for UltraViolet’s success.

Importantly though, movies bought on iTunes would continue to only work on Apple devices as Apple wants to keep encouraging consumers to buy its devices, the Times highlighted.

source: hollywoodreporter.com

Final Cut is Dead! Long live Final Cut!

Apple’s Final Cut Pro is the leading video-editing program. It’s a $1,000 professional app. It was used to make “The Social Network,” “True Grit,” “Eat Pray Love” and thousands of student movies, independent films and TV shows. According to the research firm SCRI, it has 54 percent of the video-editing market, far more than its rivals from Adobe and Avid.

On Tuesday, Apple pulled a typical Apple move: it killed off the two-year-old Final Cut 7 at the peak of its popularity.

In its place, Apple now offers something called Final Cut Pro X (pronounced “10”). But don’t be misled by the name. It’s a new program, written from scratch. Apple says a fresh start was required to accommodate huge changes in the technological landscape.

Apple veterans may, at this point, be feeling some creepy déjà vu. You’ve seen this movie before. Didn’t Apple kill off iMovie, too, in 2008, and replace it with an all-new, less capable version that lacked dozens of important features? It took three years of upgrades before the new iMovie finally surpassed its predecessor in features and coherence.

Some professional editors are already insisting that Apple has made exactly the same mistake with Final Cut X; they pointed out various flaws with the program after an earlier version of this column was posted online on Wednesday. They say the new program is missing high-end features like the ability to edit multiple camera angles, to export to tape, to burn anything more than rudimentary DVDs and to work with EDL, XML and OMF files (used to exchange projects with other programs). You can use a second computer monitor, but you need new TV-output drivers to attach an external video monitor. You can’t change the settings of your exported QuickTime movies without the $50 Compressor program.

Apple admits that version X is a “foundational piece.” It says that it will restore some of these features over time, and that other companies are rapidly filling in the other holes.

For nonprofessionals, meanwhile, Final Cut is already tempting — especially because the price is $300, not $1,000. It’s the first Apple program that’s available only by download from the online Mac App Store, not on DVD. All of the programs formerly called Final Cut Studio have been rolled into Final Cut except Motion and Compressor, which are sold separately. (Final Cut Express and DVD Studio Pro are gone.)

The new Final Cut has been radically redesigned. In fact, it looks and works a lot like iMovie, all dark gray, with “skimming” available; you run your cursor over a clip without pressing the mouse button to play it.

Once you’re past the shock of the new layout, the first thing you’ll notice is that Apple has left most of the old Final Cut’s greatest annoyances on the cutting-room floor.

First — and this is huge — there’s no more waiting to “render.” You no longer sit there, dead in the water, while the software computes the changes, locking up the program in the meantime, every time you add an effect or insert a piece of video that’s in a different format. Final Cut X renders in the background, so you can keep right on editing. You cannot, however, organize your files or delete clips during rendering.

Second, in the old Final Cut, it was all too easy to drag the audio and video of a clip out of sync accidentally; little “-1” or “+10” indicators, showing how many frames off you were, were a chronic headache. But in the new Final Cut, “sync is holy,” as Apple puts it. Primary audio and video are always synced, and you can even lock other clips together so that they all move as one.

In fact, an ingenious feature called Compound Clips lets you collapse a stack of audio and video clips into a single, merged filmstrip on the timeline. You can adjust it, move it and apply effects as if it were a single unit, and then un-merge it anytime you like. Compound Clips make it simple to manage a complicated composition without going quietly insane.

In the old Final Cut, if you dragged Clip A so that it overlapped part of Clip B, even briefly, you wound up chopping away the covered-up piece of Clip B. But now, the timeline sprouts enough new parallel “tracks” to keep both of the overlapping clips. Nothing gets chopped unless you do it yourself.

Source: David Pogue / NYTimes.com

Apple’s new non-linear editing app plots a roadmap to the future of video editing

by Gary Adcock, Macworld.com

With the release of its hotly anticipated Final Cut Pro X (FCP X), Apple breaks new ground—not just with its flagship video editor’s interface and underlying infrastructure—but with the whole mindset of what it means to be a working professional video editor.

Apple has revamped Final Cut Pro’s hands-on user experience in three major areas: Editing, media organization, and post-production workflow. New tools such as the Magnetic Timeline, Clip Connections, Compound Clips, and Auditions provide a smooth, intuitive editing experience.

With the rise of data-centric workflows and tapeless video recording, organizational tools such as Content Auto-Analysis, Range-based keywords, and Smart Collections work in the background to automate formerly tedious and time-consuming manual processes.

Post production workflows now offer customizable effects, integrated audio editing, color grading, and a host of streamlined delivery options.

With this new application, video pros can no longer follow traditional ways of working.

Clips in FCP X’s new Event Library are sorted by both user-created Custom Keywords (blue icons) and Smart Collections. The latter are created automatically by Content Auto-Analysis during import (purple icons).

Final Cut Pro X, despite its familiar name, is not an upgrade of Final Cut Pro 7. It is a brand new product. FCP X is also no longer part of a suite of applications such as Final Cut Studio, but rather one of a trio of component parts that include Final Cut Pro X ($300), Motion 5 ($50), and Compressor 4 ($50). All are available separately for download from the Mac App Store. There will be no boxed copies.

Rolling

Final Cut Pro X starts off by immediately analyzing your media as it begins to import footage, while at the same time archiving critical secondary information on color balance, motion, rolling shutter artifacts, tracking, and stabilization data on a clip-by-clip basis.

While handling the bulk of analytical information at ingest, FCP X is tagging the files with metadata in a manner that speeds secondary file processing, delivery, and rendering capabilities and vastly accelerates workflow. The heavy lifting of this content is invisibly handled in the background—between the application and the Mac operating system—as a byproduct of the conversion to a fully 64-bit application workflow.

The most profound interface changes to FCP X—beyond the new darker look—are the Event Browser and the Event Library. Importing your content into the app creates a new Event, a virtual folder that holds all of the information about your media: what it is, where it’s stored, and whether it’s from a specific date, place, or client. You can even rate, organize, and show or hide clips from view while accessing tools like Keyword and Smart Collections. Events are created by the application as part of the ingest, in addition to your organizational effort.

When you’re done creating your video, you can use the direct upload options within FCP X to share it on Facebook, Vimeo, YouTube, and CNN iReport. All Apple devices are available as options, as well as Podcast Producer, output for standard definition DVDs, and even Blu-ray devices, directly within FCP X. Plus, the application still offers fully integrated processing with Compressor. Standalone export output options offer all flavors of Apple’s ProRes, H.264, DVCProHD, Apple HDV, and even Sony’s XDCamEX format at 35Mbps and the 50Mbps version of the XDCamHD 422 codec.

Here are the currently available output options for file delivery when exporting your project directly from FCP X.

Magnetic Timeline

Acting as a trackless canvas for your video edit, the Magnetic Timeline allows you the freedom to arrange and re-arrange your media wherever you want. Existing clips on the timeline slide in and out of the way without danger of collision or overwriting a previous edit. They snap into place “magnetically,” dynamically aligning with existing media in the timeline. Despite being trackless, you can easily create multi-level compositions and properly maintain continuity as you move media around in your project. This feature interactively shows you exactly what’s happening in the timeline as you work, so you can easily execute your vision.

Clip Connections and Compound Clips

Designed to maintain the continuity of media in the timeline, Clip Connections are relational links between primary media in the timeline and secondary elements. Content such as titles, B-roll, sound effects, and even music, can be moved and repositioned seamlessly as a single clip, maintaining audio and video sync, and giving you a clear, visually defined connection to your assets.

Alongside Clip Connections and its facility at combining primary and secondary elements into a cohesive unit for editing and filtering, Compound Clips further advances the concept by allowing a complex multi-element group of media to be handled as a single clip. It’s easy. Just select the relevant media and choose Create Compound Clip from the File menu (or hit the Option-G key command).

Compound Clips let you move, duplicate, and handle clips as an individual segment. You can even share such clips across multiple projects or use a clip to apply filters and effects across all combined elements. The Compound Clips feature helps video editors remove clutter and simplify the timeline’s organization, while maintaining media continuity.

This is the Compound Clip when open.

The Compound Clip command offers a vastly simpler timeline that minimizes the additional track and clip information until needed. Think of it as a nested sequence on steroids. This is the Compound Clip when closed.
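
Conceptually, a Compound Clip behaves like the classic composite pattern: the group presents the same interface as a single clip, and an operation applied to the group fans out to everything inside. A toy sketch (class and method names invented for illustration, not FCP X’s actual internals):

```python
class Clip:
    """A single piece of media on the timeline."""
    def __init__(self, name):
        self.name = name
        self.effects = []

    def apply(self, effect):
        self.effects.append(effect)


class CompoundClip(Clip):
    """A group of clips that is handled exactly like one clip."""
    def __init__(self, name, children):
        super().__init__(name)
        self.children = list(children)

    def apply(self, effect):
        # One call on the group reaches every combined element.
        for child in self.children:
            child.apply(effect)


title, broll = Clip("title"), Clip("b-roll")
opener = CompoundClip("opener", [title, broll])
opener.apply("sepia")   # filter the group as a single unit
print(title.effects, broll.effects)  # ['sepia'] ['sepia']
```

Because the group answers to the same interface as a clip, moving, duplicating, or filtering it needs no special casing, which is why the “nested sequence on steroids” description fits.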

Auditions

Think about being able to create multiple versions of an opening or closing sequence for different clients or presentations. That’s the power of Auditions. With this feature, you can view various alternative scenes in your video without leaving the timeline. Auditions provides a fast and easy way to preview a number of variations—with any media collection—in real time. To create Auditions, just drag the shots to the same place in the timeline and choose the Add to Audition command. This allows the Magnetic Timeline to handle the sequence continuity and sync.

Auditions lets you dynamically preview multiple clips within the timeline without disturbing any other media.

Content Auto-Analysis and Keywords

Underneath the surface, FCP X mines metadata from your content from the second you ingest footage from the camera. That data stream includes information such as camera type, frame rate, white balance, and a host of other pre-defined parameters.

The Content Auto-Analysis feature can, during import, distinguish individual people, shot types (close, medium, and wide) and rolling shutter artifacts common to many CMOS cameras. It can also rectify stabilization issues with hand-held shots, adjust overall color balance, and analyze and remove excessive audio noise and hum or silent audio tracks from footage.

Much of FCP X’s automatic keyword creation is derived from this media detection functionality. The program uses the information gathered to create keywords and automatically assembles assets into Smart Collections within the Event Library. Thus, while importing your content, FCP X is sorting, categorizing, and auto keywording in the background. In addition to the keywords that the program assigns, you can add or edit your own keywords to identify specific shots in any manner you choose. Since you can have multiple keywords for the same clip, and all of those clips would appear in each search and link to a single original piece of media, you can accomplish faster, cleaner edits.

Applying customizable effects offers you a real-time preview of the effect on your video in the canvas. Note that the thumbnail view in the Effects Window also shows the clip. The new Skimming feature brings added power to the content preview in FCP X.

Another result of this metadata analysis is the ability to create a keyword selection across multiple pieces of media. Range-based Keywords allow you to flag a keyword across all or just small parts of multiple clips. Keywording offers a larger, far more flexible canvas. No longer are you restricted to specific bins or folders.
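
A range-based keyword can be pictured as an index entry mapping a keyword to spans of clips rather than to whole bins. A minimal sketch (clip names and times invented for illustration):

```python
from collections import defaultdict

index = defaultdict(list)  # keyword -> [(clip, start_s, end_s), ...]

def tag(keyword, clip, start_s, end_s):
    """Flag a keyword across all or just part of a clip."""
    index[keyword].append((clip, start_s, end_s))

tag("interview", "A001_C003", 0.0, 42.5)
tag("b-roll",    "A001_C003", 42.5, 60.0)  # same clip, second keyword
tag("interview", "A002_C001", 10.0, 95.0)

# One search returns every tagged range, each pointing back at a
# single original piece of media.
print(index["interview"])
# [('A001_C003', 0.0, 42.5), ('A002_C001', 10.0, 95.0)]
```

Since a span rather than a bin is the unit of organization, one clip can live under many keywords at once without being duplicated on disk.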

Content library access

Borrowing a page from its iLife line of consumer apps, FCP X lets you browse all of your attached media and content libraries within the program. View your iTunes, iPhoto, and Aperture content, as well as Motion libraries, directly, along with 1300 royalty-free sound effects offered as a free download via Apple’s Software Update after you purchase FCP X.

Customizable effects

FCP X provides a wide array of content, including animations, titles, transitions, and effects sequences, all accessible and editable within the application.

Much of this content was created specifically for Apple by Hollywood effects pros and graphic designers. Customizable from the start, these effects allow you to preview a clip by selecting a shot and then using the skimming function to get a true instantaneous, real-time preview—both as a thumbnail and in the viewer—of how your shot will look with that effect applied.

You also have access to all of the transform functions (crop, scale, rotate, and distort) as well as keyframing of those effects without having to jump between different parts of the interface. Effects imported from Motion 5 can also expose adjustable controls via that program’s new Parameter Rigging feature.

Effects in viewer.

Audio editing

Apple has chosen to fully integrate audio editing into FCP X. Starting with the ingest, the program analyzes content for hum, noise, and dynamic audio changes. It even automates audio sync from an external recorder and the camera, matching audio and video via the waveforms, to connect content and sync it properly. This was formerly a manual process.
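
The standard technique behind this kind of automatic sync (Apple hasn’t published FCP X’s implementation) is waveform cross-correlation: slide the recorder’s audio against the camera’s scratch track and keep the offset where the two line up best. A minimal sketch with NumPy and synthetic audio:

```python
import numpy as np

def sync_offset(reference, other):
    """Sample position in `reference` where `other` begins,
    found via full cross-correlation of the two waveforms."""
    corr = np.correlate(reference, other, mode="full")
    return int(np.argmax(corr)) - (len(other) - 1)

# Synthetic "scene audio": the external recorder was started
# 200 samples after the camera, so its track is a shifted copy.
rng = np.random.default_rng(0)
scene = rng.standard_normal(1000)
camera = scene[:800]      # camera scratch audio
recorder = scene[200:]    # external recorder, started later

print(sync_offset(camera, recorder))  # 200
```

In practice the correlation runs on envelope or feature data rather than raw samples, but the idea is the same: the correlation peak marks the alignment, replacing the old manual clap-slate matching.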

With a large library of sound effects and high-quality audio effects plug-ins available in FCP X, you now have greater control of your audio enhancements than ever before. You can access control for sub-frame audio edits as well as many of the available 64-bit versions of third-party Audio Units plug-ins.

Color grading

Whether you need a single-click correction or want to create a stylized look, all color work now happens within FCP X. From the first analysis, color balance and correction are mapped for use, allowing you to quickly match multiple shots in the same group or refine the look of any clip in the Event Browser or the timeline.

The Color Board gives you a dynamic way to make custom modifications to overall color, saturation, and exposure, while allowing keying and masking to be done simply and directly within the app. The Match Color feature offers a fast and easy way to match the overall color, contrast, and look between two different shots to maintain a project’s visual continuity.

The Color Board represents the essence of Apple’s FCP X interface changes. It allows you simple control over exposure, color, and saturation.

Bottom line

Apple’s new Final Cut Pro X has been re-designed from the ground up with a radically different approach—one that acknowledges and uses device and camera data in a manner that has never before been attempted in the video editing environment.

With this release, Apple shows us the future in which data streams from all the devices we work with communicate seamlessly, sharing media behind the scenes. Think of the advantages and possibilities when all the effort you put into setting up a shot or project continue downstream from your camera into post-production, or follow your content when it’s delivered on the web. That’s the promise of Final Cut Pro X. Will that promise be fulfilled?

[Gary Adcock is a Chicago-based consultant who specializes in building workflows for film and television productions. He is the founder and past president of the Chicago Final Cut Users Group, Tech Chairman of the NAB Director of Photography Conference, and a member of the IA Local 600/Camera Guild Training Committee, teaching tapeless production techniques and workflows to professional camera operators. His writings and musings can be found at his blog on Creative Cow.]


About 3D & Digital Cinema

If you are a tech head, cinephile, movie geek or digital imaging consultant, then we'd like to hear from you. Join us in our quest to explore all things digital and beyond. Of particular interest is how a product or new technology can be deployed and how it impacts storytelling. It may be something that affects how we download and enjoy filmed entertainment. It may pertain to how primary and secondary color grading will enhance a certain tale. The most important thing is that you are in the driver's seat as far as what you watch and how you choose to consume it.