Archive for October, 2012

Final Cut Pro X 10.0.6 Update

From Philip Hodgetts’ blog:

Final Cut Pro X 10.0.6 is probably the most feature-rich release since the original one. As well as the features Apple discussed at NAB 2011:

  • Multichannel Audio Editing Tools
  • Dual Viewers
  • MXF Plug-in Support, and
  • RED camera support

there’s more. Much more. Including a feature I wish they hadn’t put in and one I’m extremely pleased they did. I’m ecstatic that selective pasting of attributes is now a Final Cut Pro X feature, but I’m really annoyed that persistent In/Out points made it to this release. More on these later.

There’s a rebuilt and more flexible Share function; a simplified and improved Unified Import with optional list view, horizontal scopes mode (and a scope for each viewer), Chapter Markers, faster freeze frames, support for new titling features inherited from Motion, more control over connection points, 5K image support, vastly improved Compound Clip structure (both functionally and, for us, in the XML), customized metadata export in the XML (for asset management tools mostly), and two features that didn’t make it to the “what’s new” list: Range Export from Projects and a bonus for 7toX customers.

All up I count more than 14 new features, whereas Final Cut Pro X 10.0.3 had four (although arguably Multicam and Video out were heavy duty features).

Because of the developer connection, I’ve been working with this release for a few months. We have new versions of 7toX and Xto7 waiting for review in the App Store that support the new version 1.2 XML.

MXF and RED Camera Support

In keeping with their pattern, Apple have supported a third party MXF solution rather than (presumably) paying license fees for each sale when only a small percentage of users will use the MXF input (and yes, output) capability. The named solutions are MXF4mac and Calibrated{Q} MXF Import, but apparently there are others. Working with MXF files should not feel different than working with QuickTime files.

Along with RED native support, Apple quietly upped the maximum resolution from 4K (since release) to 5K.

I don’t work with MXF or RED so I’ve had no ability (nor time) to test these functions. I’ll leave that to those with more knowledge.

Dual Viewers

More accurately, the current Timeline Viewer and an optional Event Viewer (Window > Show Event Viewer). You get one view from the Timeline and one view from the Event. I can see how this could be useful at times, although truthfully I never missed it.

Final Cut Pro X's dual viewers


One Viewer from the Event, one Viewer for the Project. (Click to enlarge)

Multichannel Audio Tools

There’s a lot of room for improvement in the audio handling in Final Cut Pro X, so the new multichannel audio tools are a welcome step in the right direction. Initially there’s no visible change, until you choose Clip > Expand Audio Components. With the audio components open, you can individually apply levels, disabled states, pans and filters to individual components, trim them, delete a section in the middle – all without affecting the clips around them.

Final Cut Pro X's multichannel audio
No multichannel audio in Solar Odyssey, so I borrowed a test file from Greg that I used to add support to 7toX and Xto7.

For 7toX, if there are separate audio levels, pans or keyframes on a sequence clip’s audio tracks these will be translated onto the separate audio components in Final Cut Pro X. Similarly for Xto7 the levels/pans/keyframes on the clip’s audio components are translated onto the audio clip’s tracks.

More flexible Scopes

A new layout – Vertical – stacks the Scopes on top of each other. Better still, they remember the settings from the last time they were used! Also good is that you can open a Scope for each of the viewers.

Final Cut Pro X's dual scopes
Dual Viewers, Dual Scopes and a stacked Vertical layout. If the brightness control was there before, I missed it.

You should note that both those images are 720P at 97% (from the Parrot A.R. Drone 2.0 FWIW). I love the Retina display!

Improved Sharing

There’s now a Share button directly in the interface. More importantly, you do not have to open a Clip from the Event as a Timeline to export.

Final Cut Pro X's share destinations
Share directly from Event or Project.

But what’s that at the end? Why yes, I can create a Share to my own specifications, including anything you can do in Compressor (by creating a Compressor setting and adding that to a New Destination). Note that HTTP live streaming is an option.

Final Cut Pro X's share destinations
Now you can create a custom Share output for exactly your needs.

Final Cut Pro X 10.0.6 will also remember your YouTube password, even for multiple accounts. If you have a set package of deliverables (multiple variations for example) you can create a Bundle that manages the whole set of outputs by applying the Bundle to a Project or Clip in Share. Create a new bundle and add in the outputs you want.

Range-based export from Projects

Another feature not seen on the “What’s new” list is the ability to set a Range in a Project and export only that Range via Share. A much-requested feature that’s now available.

Unified Import

I never quite loved that I would import media from the cameras (or their SD cards) via the Import dialog, while importing Zoom audio files was a whole other dialog. Not any more with the new unified Import dialog. There’s even an optional List View, which is my preferred option. (The woodpecker was very cooperative and let me sneak in very close with the NEX 7.)

Final Cut Pro X's unified Import dialog
The Unified Import dialog, with an optional list view like the Event list view, plus skimmer and filmstrip view.

Waveforms and Hiding already imported clips are also options. The window now (optionally) automatically closes when Import begins.

Other import options.

Chapter Markers

For use when outputting to DVD, Blu-ray, iTunes, QuickTime Player, and Apple devices.

Final Cut Pro X's chapter markers
There are now three types of Marker: Marker, To Do and Chapter.

The Marker position notes the actual marker, while the orange ball sets the poster frame, which can sit on either side of the chapter mark. A nice refinement for Share.

In 7toX translation, sequence chapter markers become chapter markers on a clip in the primary storyline at the same point in the timeline.
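Since those chapter markers travel through the XML, here’s a minimal sketch of pulling them out with Python’s standard library. The fragment is hypothetical: the `chapter-marker` element and its `start`/`value` attributes are my assumptions for illustration, not Apple’s documented schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment -- the <chapter-marker> element name and its
# start/value attributes are assumptions for illustration, not taken
# from Apple's published XML schema.
fragment = """
<spine>
  <clip name="Interview" offset="0s" duration="120s">
    <chapter-marker start="10s" value="Opening titles"/>
    <chapter-marker start="75s" value="Main interview"/>
  </clip>
</spine>
"""

root = ET.fromstring(fragment)
# Collect (position, name) pairs for every chapter marker in the spine.
chapters = [(m.get("start"), m.get("value"))
            for m in root.iter("chapter-marker")]
print(chapters)  # [('10s', 'Opening titles'), ('75s', 'Main interview')]
```

The same walk would work on a full export, since `iter()` finds the markers at any nesting depth.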

Fast Freeze Frame

Simply select Edit > Add Freeze Frame or press Option-F to add a Freeze Frame to the Project at the Playhead (or if the Clip is in an Event, the freeze frame will be applied in the active Project at the Playhead as a connected clip). Duration, not surprisingly, is the default Still duration set in Final Cut Pro X’s Editing preferences.

New Compound Clip Behavior

Did you ever wonder why Compound Clips were one-way to a Project and didn’t dynamically update, but Multicam was “live” between Events and Projects? So, apparently, did Apple. (We certainly did when dealing with it in XML.) Compound Clips are now live.

  • If you create a Compound Clip in a Project, it is added to the default Event and remains linked and live.
  • If you create a Compound Clip in an Event, it can be added to many Projects and remain linked and live.

By linked and live I mean that, like Multiclips, changes made to a Compound Clip in an Event will be reflected in all uses of that Compound Clip across multiple Projects.

Changes made to a Compound Clip in a Project are also made in the Compound Clip in the Event and all other Projects.

To use the old behavior and make a Compound Clip independent, duplicate it in the Event.

The old behavior is still supported so legacy Projects and Events will be fine.

Final Cut Pro 7 sequences translated using 7toX become these new “live” Compound Clips. If you don’t want this behavior you can select the Compound Clip in the Project timeline and choose Clip > Break Apart Clip Items to “unnest” the compound clip.

Selective Pasting of Attributes

It had to be coming, and I’m glad it’s here. This has probably been the feature from Final Cut Pro 7 I’ve missed most.

Final Cut Pro X's Paste Attributes
Looks familiar! I like the visual clue of which clip the content is coming from, and which it is targeted at.

One of the things I love about Final Cut Pro X is that there are “sensible defaults”, not least of which is the Maintain Timing choice. In the 11-12 years I spent with FCP 1-7, on only three occasions did I want the (opposite) default. Every other time I had to change to Maintain Timing, which is now thankfully the default.

Persistent In and Out Points

You got them. And it’s a good implementation, allowing multiple ranges to be created in a clip. I am not a fan, and wish it were an option. Over the last two months I’ve added keywords to “ranges” I didn’t intend to have because the In and Out were held from the last playback or edit I made. Not what I want. So I have to select the whole clip again and reapply the Keyword. It gets old after the twentieth time.

It gets in my way more than it helps, which is rather as I expected. Selection is by mouse click (mostly – there is limited keyboard support) so this gets every bit as confusing as I anticipated.

Your last range selection is maintained. To add additional range selections (persistent) hold down the Command key and drag out a selection. (There are keyboard equivalents for setting a new range during playback.) You can select multiple ranges and add them to a Project together. (I’m not sure about the use case, but it’s available.)

Customizable Metadata Export to XML (and new XML format)

Along with a whole new version 1.2 of the XML format (which lets us support more features) is the ability to export metadata into the XML. These are the metadata collections found at the bottom of the Inspector.

Final Cut Pro X's XML metadata view
Select the metadata set you want included in the XML during export.

Remember that you can create as many custom metadata sets as you want and choose between them for export. This will be a great feature as soon as Asset Management tools support it. No doubt Square Box will be announcing an update for CatDV the moment this release of Final Cut Pro X is public.

The new XML format also allows 7toX to transfer a Final Cut Pro 7 clip’s Reel, Scene, and Shot/Take metadata into their Final Cut Pro X equivalents.
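To make that concrete, here’s a minimal sketch of reading such key/value metadata with Python’s standard library. The fragment is hypothetical: the `md` key/value layout is my assumption about the version 1.2 shape, and the reverse-DNS keys are placeholders, not Apple’s real metadata key names.

```python
import xml.etree.ElementTree as ET

# Hypothetical asset fragment -- the <md> key/value layout is an
# assumption about the version 1.2 XML, and the com.example.* keys
# are placeholders, not Apple's actual metadata key names.
fragment = """
<asset id="r2" name="Scene 12 Take 3">
  <metadata>
    <md key="com.example.reel" value="004"/>
    <md key="com.example.scene" value="12"/>
    <md key="com.example.take" value="3"/>
  </metadata>
</asset>
"""

asset = ET.fromstring(fragment)
# Build a simple dict of the asset's metadata key/value pairs.
meta = {md.get("key"): md.get("value") for md in asset.iter("md")}
print(meta["com.example.reel"])  # '004'
```

An asset management tool would walk every `asset` in the export and index these pairs, which is presumably what makes the customizable metadata sets useful downstream.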

Flexible Connection Points

We’ve always been able to hold down the Command and Option keys to move a connection point. What is new is the ability to move a clip on the Primary Storyline while leaving connected clips in place. This is really a great new feature and one I’ve used a lot. Hold down the Back-tick/Tilde key (` at the top left of your keyboard) and slip, slide, trim or move the Primary Storyline clip leaving the connected clips in place.

Titling is significantly improved, including support for the new title markers feature in Motion

As I’m not in the Motion beta I’m not at all certain what this means. I’m sure Mark Spencer will have an explanation over at RippleTraining.com soon.

Drop Shadow effect

Well, a new effect that adds a Final Cut Pro 7 style drop shadow to a Clip. If you’ve got a clip in Final Cut Pro 7 with a Motion tab Drop Shadow applied, 7toX will add the new Drop Shadow effect to it during translation.

Bonus unannounced feature – XML can create “offline” clips

This is great news for developers because previously all media referenced by an XML file had to be available (online) when the XML was imported into Final Cut Pro X. With XML version 1.2 that’s not necessary, so we’ve taken advantage of this in 7toX. The user can relink the offline clips to media files by the usual File > Relink Event Files… command after translation and import.

What else do I want?

I’d like a Role-based audio mixer.

I’d like Event Sharing to multiple users at the same time, with dynamic update of keywords and other metadata between editors. (I do not think I want to share a Project in that way – rather, sequential management of a Project, like Adobe Anywhere.)

The Rise of Dual Screen Apps, courtesy of Apple TV

Article by Jeremy Allaire, mashable.com

Dual-screen apps are a new phenomenon, enabled by the advent of wireless technologies that allow for effortless pairing of a PC, tablet or smartphone with a TV. They are changing how people are interacting and “consuming” content within apps. For developers this creates many new opportunities to provide better experiences for their users, but it requires thinking about dual-screen setups from the start as well as new tools.

The opportunity for dual-screen apps is huge. And it’s more than just watching a video or playing a game: Dual-screen apps have the potential to transform the office meeting room, the classroom, the retail store, the hospital, and really any other context where people are interacting around content and information and where that information would benefit from rendering and display on a large screen such as a TV monitor.

To better understand this concept, it’s necessary to step back and reconsider the nature of how we write software and the user experience model for software.

The Evolution From Single Screen

Today, the predominant user-experience model for software and applications online is a single screen. We browse web applications on a desktop PC, mobile browser or tablet browser and interact with and consume content and applications on that screen. It is very much a single, individual user task. Likewise, we install apps onto these devices and consume and interact with information, perform tasks, make purchases, etc. through these apps. Again, this is a solitary single individual task.

As a result, when software creators plan their applications, they are typically designed and developed with this single user, single-screen concept in mind.

Dual-screen apps change all of that by shifting the software and user experience model from one user to potentially many, and from one screen (PC/phone/tablet) to two screens (phone/tablet and TV monitor). From a software development and user-experience perspective, the large monitor (which is the true second screen — versus the standard concept that considers the tablet as the second screen) becomes an open computing surface where one can render any form of application functionality, information, data and content.


Importantly, designers and developers need to shed the concept that “TVs” are for rendering video, and instead think about TVs as large monitors on which they can render applications, content and interactivity that’s supported by a touch-based tablet application.

The Social Computing Surface

While we have the greatest affinity for large monitors as fixtures of the living room, increasingly flat-screen monitors are becoming a ubiquitous part of our social fabric. In fact, large monitors often sit at the center of any social setting. In the home, these large monitors provide a social surface for those sharing the living room space. Increasingly, monitors are a common part of nearly every business meeting room space — not for watching video, but for projecting shared content and business data and presentations that support business and organization collaboration.

Likewise, monitors are in medical and hospital settings providing visual information to patients. They are increasingly in nearly every classroom, whether through a projector or an actual TV monitor and support the presentation of information that is needed for a collection of students. Large monitors are increasingly ubiquitous in retail settings as well.

The key concept here is that this pervasive adoption of TV monitors is the tip of the spear in creating a social computing surface in the real world. Forget about social networks that connect people across their individual, atomized computing devices — the real social world is groups of people in a shared space (living room, office, classroom, store, etc.) interacting around information and data on a shared screen.

Until very recently, the way in which these TV monitors could be leveraged was limited to connecting a PC through an external display connector to a projector or directly to a TV. The recent breakthrough that Apple has fostered and advanced more than any other tech company is AirPlay and associated dual-screen features in iOS and Apple TV.

Specifically, Apple has provided the backbone for dual screen apps, enabling:

  • Any iOS device (and OS X Mountain Lion-enabled PCs) to broadcast its screen onto a TV. Think of this as essentially a wireless HDMI output to a TV. If you haven’t played with AirPlay mirroring features in iOS and Apple TV, give it a spin. It’s a really exciting development.
  • A set of APIs and an event model for enabling applications to become “dual-screen aware” (e.g. to know when a device has a TV screen it can connect to, and to handle rendering information, data and content onto both the touch screen and the TV screen).

With Apple TV unit sales already outselling the Xbox in the most recent quarter, we can see a world that goes from approximately 5 million dual-screen-capable Apple TVs to potentially 15-20 million in the next couple of years, and eventually to 30-50 million as new and improved versions of the Apple TV companion device come to market.

As a result, it’s an incredible time to experiment with this fundamental shift in computing, software and user experience, to embrace a world where the Tablet is the most important personal productivity device, and the TV is a rich and powerful surface for rendering content and applications.

How Dual-Screen Apps Will Work

As we rethink the TV as a computing surface for apps, it’s really helpful to have some concrete ideas of what we’re talking about. Below are a series of hypothetical examples of what is possible today — and of course what will be even bigger as these new dual-screen runtimes proliferate.

Buying a House: Imagine you’re looking into buying a house. You open your tablet app from a reputable home-listing service and perform a search using criteria that you care about and begin adding potential fits to a list of houses you’d like to explore. When you select a specific house, the app detects you’re connected to an Apple TV and launches a second screen on the TV that provides rich and large visual displays about the house — HD-quality photos and contextual information about the house. Here, the power of dual screen is the fact that you and your spouse can sit in the living room and explore a house together without crouching over a computer or tablet on someone’s lap, and the house can be presented with HD-quality media and contextual information.

Buying a Car: Imagine launching the BMW app on your tablet and deciding to both learn about car models and configure a car — like buying a house, often a “social” decision between partners. On the TV, the app renders a high-quality rendition of the car. As you explore the car’s features from your tablet, associated media (photos, video and contextual metadata) render onto the large TV in front of you. As you configure your car using your tablet, it updates a visual build of the car on the large screen, providing an inline HD video for specific features.

Kids Edutainment: Looking to introduce your three-year-old to key cognitive development concepts? Launch a learning app where the child interacts with the tablet application and sees visual information, animation and other content on the TV screen. Their touches on the tablet instantly produce rich and relevant content on the TV screen. Learning to count? Feed cookies over AirPlay to Cookie Monster on the TV, who eats and counts with you. Learning about concepts like near and far? Tap the tablet to make a character move closer to and away from you. Build a character on the tablet and watch the character emerge on the TV screen.


Sales Reporting: As a sales manager, you walk into your team conference room with a TV monitor mounted on the wall. You kick open the Salesforce.com app on your tablet and begin filtering and bringing up specific reports, and with the touch of a button you push unique visual reports onto the shared surface of the conference room TV. Here, the sales manager wants control of the searches and filters they have access to and only wants to render the charts and reports that are needed for the whole team to see.

Board Games: Imagine playing Monopoly with your family in the living room — one or two or maybe even three touch devices present (phones, iPod touches, iPads). Each player has their inventory of properties and money visible on their device. The app passes control to each user as they play. On the TV screen is the Monopoly “board” with a dynamic visual that updates as users play — the movement of players, the building up of properties, etc.

The Classroom: A teacher walks into a classroom with an Apple TV connected to an HDMI-capable projector that projects onto a wall or screen. From their tablet, they pull up an application that is designed to help teach chemistry and the periodic table — they can control which element to display up on the screen, and the TV provides rich information, video explanations, etc. The app is designed to provide ‘public quiz’ functionality where the TV display shows a question, presumably related to material just reviewed or from homework; students raise their hands to answer, and then the answer and explanation are displayed.

Doctor’s Office: You are meeting with your doctor to go over test results from an MRI scan. The doctor uses his or her tablet to bring up your results, picks visuals to throw onto the TV monitor in the room, then uses his or her finger to highlight key areas and talk to you about what they’re seeing.

Retail Electronics Store: You’re at a Best Buy and interested in buying a new high-quality digital camera. A sales specialist approaches you with tablet in hand and asks you a few questions about what you’re interested in while tapping those choices into their tablet app. From there, it brings up on a nearby TV display a set of camera options — based on further probing, they drill into a specific camera choice, which brings up a rich visual with a video overview of the specific camera that you’re interested in.

Consuming News: A major revolution has just broken out in a nation across the planet. Time has captured incredible audio, photos and video of the events. You and your friends sit down in front of the TV to learn more. You open the Time Magazine tablet app and bring up a special digital edition about the revolution. From the tablet, you flip through and render onto the TV rich HD-quality photographs, listen to first hand audio accounts (accompanied by photos) and watch footage from the events. The app renders a huge visual timeline of the events that led up to the revolution. It’s an immersive media experience that can be easily shared by friends and family in the living room.

Consuming Video: Last but not least, of course, dual-screen apps will be essential to any app that is about consuming video — whether a news or magazine app, a vertical website (think Cars.com, BabyCenter.com, AllRecipes.com, etc.), or of course a catch-up TV app from a TV network or show that you care about. You open the app on your tablet to explore what to watch, and when you’re ready to watch, the show instantly pops onto your TV in gorgeous HD quality, and the tablet app becomes your remote control and presents relevant contextual information about the video, episode or what have you.

The Coming Dual-Screen Revolution

This is such a groundbreaking approach to apps and software we expect lots of others to try and emulate what Apple is doing. Already, Microsoft is promoting the ability to use its Surface Tablet in conjunction with apps built for the Xbox. Samsung has introduced features in its tablets and TVs to enable easy media sharing from your tablet or phone onto a Samsung Smart TV, and surely Google will follow suit with similar features to AirPlay in the Android OS. Apple is still early in deploying this technology — it’s sometimes flaky and a little bit hidden from end-user view — but I expect major changes in the coming months and years.

Virtually every application that exists on the web and phones and tablets likely has a dual-screen use case. Simply put, web and app designers and developers need to imagine a world where the tablet and TV are a single run-time for their applications, with each screen providing distinct value for the user controlling the app and the user consuming rich media and information on a large display. Sometimes this is just one person (like picking and watching a show or playing a game or learning something), but crucially and very often I believe that these apps will be designed with multiple users — and a social context — in mind.

Jeremy Allaire is CEO and founder of Brightcove, a global provider of cloud-content services that offers a family of products and developer tools used to publish and distribute professional digital media.


About 3D & Digital Cinema

If you are a tech head, cinema-phile, movie geek or digital imaging consultant, then we'd like to hear from you. Join us in our quest to explore all things digital and beyond. Of particular interest is how a product or new technology can be deployed and impacts storytelling. It may be something that affects how we download and enjoy filmed entertainment. It may pertain to how primary and secondary color grading will enhance a certain tale. The most important thing is that you are in the driver's seat as far as what you watch and how you choose to consume it.