Archive for the ‘Digital Cinema’ Category

Final Cut Pro X 10.0.6 Update

From Philip Hodgetts’ blog:

Final Cut Pro X 10.0.6 is probably the most feature-rich release since the original one. As well as the features Apple discussed at NAB 2012:

  • Multichannel Audio Editing Tools
  • Dual Viewers
  • MXF Plug-in Support, and
  • RED camera support

there’s more. Much more. Including a feature I wish they hadn’t put in and one I’m extremely pleased they did. I’m ecstatic that selective pasting of attributes is now a Final Cut Pro X feature, but I’m really annoyed that persistent In/Out points made it to this release. More on these later.

There’s a rebuilt and more flexible Share function; a simplified and improved Unified Import with optional list view; a vertical scopes mode (and a scope for each viewer); Chapter Markers; faster freeze frames; support for new titling features inherited from Motion; more control over connection points; 5K image support; a vastly improved Compound Clip structure (both functionally and, for us, in the XML); customized metadata export in the XML (mostly for asset management tools); and two features that didn’t make it onto the “what’s new” list: Range Export from Projects and a bonus for 7toX customers.

All up I count more than 14 new features, whereas Final Cut Pro X 10.0.3 had four (although arguably Multicam and Video out were heavy duty features).

Because of the developer connection, I’ve been working with this release for a few months. We have new versions of 7toX and Xto7 waiting for review in the App Store that support the new version 1.2 XML.

MXF and RED Camera Support

In keeping with their pattern, Apple have supported third-party MXF solutions rather than (presumably) paying license fees for each sale when only a small percentage of users will use the MXF input (and yes, output) capability. The named solutions are MXF4mac and Calibrated{Q} MXF Import, but apparently there are others. Working with MXF files should not feel any different from working with QuickTime files.

Along with RED native support, Apple quietly upped the maximum resolution from 4K (since release) to 5K.

I don’t work with MXF or RED so I’ve had no ability (nor time) to test these functions. I’ll leave that to those with more knowledge.

Dual Viewers

More accurately, the current Timeline Viewer and an optional Event Viewer (Window > Show Event Viewer). You get one view from the Timeline and one view from the Event. I can see how this could be useful at times, although truthfully I never missed it.

Final Cut Pro X's dual viewers
One Viewer from the Event, one Viewer for the Project. (Click to enlarge)

Multichannel Audio Tools

There’s a lot of room for improvement in the audio handling in Final Cut Pro X, so the new multichannel audio tools are a welcome step in the right direction. Initially there’s no visible change, until you choose Clip > Expand Audio Components. With the audio components open, you can individually apply levels, disabled states, pans and filters to individual components, trim them, or delete a section in the middle – all without affecting the clips around them.

Final Cut Pro X's multichannel audio
No multichannel audio in Solar Odyssey, so I borrowed a test file from Greg that we used to add support to 7toX and Xto7.

For 7toX, if there are separate audio levels, pans or keyframes on a sequence clip’s audio tracks these will be translated onto the separate audio components in Final Cut Pro X. Similarly for Xto7 the levels/pans/keyframes on the clip’s audio components are translated onto the audio clip’s tracks.

More flexible Scopes

A new layout – Vertical – stacks the Scopes on top of each other. Better still, they remember their settings from the last time they were used! Also good is that you can open a Scope for each of the viewers.

Final Cut Pro X's dual scopes
Dual Viewers, Dual Scopes and a stacked Vertical layout. If the brightness control was there before, I missed it.

You should note that both those images are 720p at 97% (from the Parrot A.R. Drone 2.0, FWIW). I love the Retina display!

Improved Sharing

There’s now a Share button directly in the interface. More importantly, you no longer have to open a Clip from the Event as a Timeline to export.

Final Cut Pro X's share destinations
Share directly from Event or Project.

But what’s that at the end? Why yes, I can create a Share to my own specifications, including anything you can do in Compressor (by creating a Compressor setting and adding that to a New Destination). Note that HTTP live streaming is an option.

Final Cut Pro X's share destinations
Now you can create a custom Share output for exactly your needs.

Final Cut Pro X 10.0.6 will also remember your YouTube password, even for multiple accounts. If you have a set package of deliverables (multiple variations for example) you can create a Bundle that manages the whole set of outputs by applying the Bundle to a Project or Clip in Share. Create a new bundle and add in the outputs you want.

Range-based export from Projects

Another feature not seen on the “What’s new” list is the ability to set a Range in a Project and export only that Range via Share. A much-requested feature that’s now available.

Unified Import

I never quite loved that I would import media from the cameras (or their SD cards) via the Import dialog, while importing Zoom audio files was a whole other dialog. Not any more with the new unified Import dialog. There’s even an optional List View, which is my preferred option. (The woodpecker was very cooperative and let me sneak in very close with the NEX 7.)

Final Cut Pro X's unified Import dialog
The Unified Import dialog, with an optional list view like the Event list view, complete with skimmer and filmstrip.

Waveforms and Hiding already imported clips are also options. The window now (optionally) automatically closes when Import begins.

Other import options.

Chapter Markers

For use when outputting to DVD, Blu-ray, iTunes, QuickTime Player, and Apple devices.

Final Cut Pro X's chapter markers
There are now three types of Marker: Marker, To Do and Chapter.

The Marker position notes the actual chapter point; the orange ball, which can sit on either side of the chapter mark, sets the poster frame. A nice refinement for Share.

In 7toX translation, sequence chapter markers become chapter markers on a clip in the primary storyline at the same point in the timeline.

Fast Freeze Frame

Simply select Edit > Add Freeze Frame or press Option-F to add a Freeze Frame to the Project at the Playhead (or if the Clip is in an Event, the freeze frame will be applied in the active Project at the Playhead as a connected clip). Duration, not surprisingly, is the default Still duration set in Final Cut Pro X’s Editing preferences.

New Compound Clip Behavior

Did you ever wonder why Compound Clips were one-way to a Project and didn’t dynamically update, but Multicam was “live” between Events and Projects? So, apparently, did Apple. (We certainly did when dealing with it in XML.) Compound Clips are now live.

  • If you create a Compound Clip in a Project, it is added to the default Event and remains linked and live.
  • If you create a Compound Clip in an Event, it can be added to many Projects and remain linked and live.

By linked and live I mean that, like Multicam clips, changes made to a Compound Clip in an Event will be reflected in all uses of that Compound Clip across multiple Projects.

Changes made to a Compound Clip in a Project, are also made in the Compound Clip in the Event and all other Projects.

To use the old behavior and make a Compound Clip independent, duplicate it in the Event.

The old behavior is still supported so legacy Projects and Events will be fine.

Final Cut Pro 7 sequences translated using 7toX become these new “live” Compound Clips. If you don’t want this behavior you can select the Compound Clip in the Project timeline and choose Clip > Break Apart Clip Items to “unnest” the compound clip.

Selective Pasting of Attributes

It had to be coming, and I’m glad it’s here. This has probably been the feature from Final Cut Pro 7 I’ve missed most.

Final Cut Pro X's Paste Attributes
Looks familiar! I like the visual clue of which clip the content is coming from, and which it is targeted at.

One of the things I love about Final Cut Pro X is that there are “sensible defaults” – not least of which is the Maintain Timing choice. In the 11–12 years I spent with FCP 1–7, there were only three occasions when I wanted the (opposite) default. Every other time I had to change to Maintain Timing, which is now thankfully the default.

Persistent In and Out Points

You got them. And it’s a good implementation, allowing multiple ranges to be created in a clip. But I am not a fan, and wish it were an option. Over the last two months I’ve added keywords to “ranges” I didn’t intend to have, because the In and Out were held over from the last playback or edit I made. Not what I want. So I have to select the whole clip again and reapply the Keyword. It gets old after the twentieth time.

It gets in my way more than it helps, which is rather as I expected. Selection is by mouse click (mostly – there is limited keyboard support) so this gets every bit as confusing as I anticipated.

Your last range selection is maintained. To add additional range selections (persistent) hold down the Command key and drag out a selection. (There are keyboard equivalents for setting a new range during playback.) You can select multiple ranges and add them to a Project together. (I’m not sure about the use case, but it’s available.)

Customizable Metadata Export to XML (and new XML format)

Along with a whole new version 1.2 of the XML (which lets us support more features) comes the ability to export metadata into the XML. These are the metadata collections found at the bottom of the Inspector.

Final Cut Pro X's XML metadata view
Select the metadata set you want included in the XML during export.

Remember that you can create as many custom metadata sets as you want and choose between them for export. This will be a great feature as soon as Asset Management tools support it. No doubt Square Box will be announcing an update for CatDV the moment this release of Final Cut Pro X is public.

The new XML format also allows 7toX to transfer a Final Cut Pro 7 clip’s Reel, Scene, and Shot/Take metadata into their Final Cut Pro X equivalents.
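To give a rough sense of what customized metadata export means for a developer consuming the XML, here’s a minimal sketch using Python’s standard library. The sample document and the exact `com.apple.proapps.*` key names are assumptions for illustration, not taken from the post or from Apple’s documentation:

```python
# Hypothetical sketch: pulling clip metadata out of an FCPXML 1.2 export
# with Python's standard-library ElementTree. The <md> key names below
# are assumptions for illustration.
import xml.etree.ElementTree as ET

SAMPLE = """<fcpxml version="1.2">
  <asset id="r1" name="Scene 12 Take 3">
    <metadata>
      <md key="com.apple.proapps.studio.reel" value="004"/>
      <md key="com.apple.proapps.studio.scene" value="12"/>
      <md key="com.apple.proapps.studio.shot" value="3"/>
    </metadata>
  </asset>
</fcpxml>"""

def read_metadata(xml_text):
    """Return {asset name: {short key: value}} for every asset's <md> entries."""
    root = ET.fromstring(xml_text)
    result = {}
    for asset in root.iter("asset"):
        entries = {}
        for md in asset.iter("md"):
            short_key = md.get("key").rsplit(".", 1)[-1]  # e.g. "reel"
            entries[short_key] = md.get("value")
        result[asset.get("name")] = entries
    return result

print(read_metadata(SAMPLE))
# {'Scene 12 Take 3': {'reel': '004', 'scene': '12', 'shot': '3'}}
```

An asset management tool would walk the exported XML in much the same way, mapping whichever metadata set the editor chose at export time into its own database fields.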

Flexible Connection Points

We’ve always been able to hold down the Command and Option keys to move a connection point. What is new is the ability to move a clip on the Primary Storyline while leaving connected clips in place. This is really a great new feature and one I’ve used a lot. Hold down the Back-tick/Tilde key (` at the top left of your keyboard) and slip, slide, trim or move the Primary Storyline clip leaving the connected clips in place.

Titling is significantly improved, including support for the new title markers feature in Motion

As I’m not in the Motion beta I’m not at all certain what this means. I’m sure Mark Spencer will have an explanation over at RippleTraining.com soon.

Drop Shadow effect

Well, a new effect that adds a Final Cut Pro 7 style drop shadow to a Clip. If you’ve got a clip in Final Cut Pro 7 with a Motion tab Drop Shadow applied, 7toX will add the new Drop Shadow effect to it during translation.

Bonus unannounced feature – XML can create “offline” clips

This is great news for developers because previously all media referenced by an XML file had to be available (online) when the XML was imported into Final Cut Pro X. With XML version 1.2 that’s not necessary, so we’ve taken advantage of this in 7toX. The user can relink the offline clips to media files by the usual File > Relink Event Files… command after translation and import.

What else do I want?

I’d like a Role-based audio mixer.

I’d like Event Sharing to multiple users at the same time, with dynamic update of keywords and other metadata between editors. (I do not think I want to share a Project in that way – more sequentially managed, like Adobe Anywhere.)

The Rise of Dual Screen Apps, courtesy of Apple TV

Article by Jeremy Allaire, mashable.com

Dual-screen apps are a new phenomenon, enabled by the advent of wireless technologies that allow for effortless pairing of a PC, tablet or smartphone with a TV. They are changing how people interact with and “consume” content within apps. For developers this creates many new opportunities to provide better experiences for their users, but it requires thinking about dual-screen setups from the start, as well as new tools.

The opportunity for dual-screen apps is huge. And it’s more than just watching a video or playing a game: Dual-screen apps have the potential to transform the office meeting room, the classroom, the retail store, the hospital, and really any other context where people are interacting around content and information and where that information would benefit from rendering and display on a large screen such as a TV monitor.

To better understand this concept, it’s necessary to step back and reconsider the nature of how we write software and the user experience model for software.

The Evolution From Single Screen

Today, the predominant user-experience model for software and applications online is a single screen. We browse web applications on a desktop PC, mobile browser or tablet browser, and interact with and consume content and applications on that screen. It is very much a single, individual user task. Likewise, we install apps onto these devices and consume and interact with information, perform tasks, make purchases, etc. through these apps. Again, this is a solitary, individual task.

As a result, when software creators plan their applications, they are typically designed and developed with this single user, single-screen concept in mind.

Dual-screen apps change all of that by shifting the software and user experience model from one user to potentially many, and from one screen (PC/phone/tablet) to two screens (phone/tablet and TV monitor). From a software development and user-experience perspective, the large monitor (which is the true second screen — versus the standard concept that considers the tablet as the second screen) becomes an open computing surface where one can render any form of application functionality, information, data and content.


Importantly, designers and developers need to shed the concept that “TVs” are for rendering video, and instead think about TVs as large monitors on which they can render applications, content and interactivity that’s supported by a touch-based tablet application.

The Social Computing Surface

While we have the greatest affinity for large monitors as fixtures of the living room, flat-screen monitors are increasingly becoming a ubiquitous part of our social fabric. In fact, large monitors often sit at the center of any social setting. In the home, these large monitors provide a social surface for those sharing the living room space. Increasingly, monitors are a common part of nearly every business meeting room – not for watching video, but for projecting shared content, business data and presentations that support business and organizational collaboration.

Likewise, monitors are in medical and hospital settings providing visual information to patients. They are increasingly in nearly every classroom, whether through a projector or an actual TV monitor and support the presentation of information that is needed for a collection of students. Large monitors are increasingly ubiquitous in retail settings as well.

The key concept here is that this pervasive adoption of TV monitors is the tip of the spear in creating a social computing surface in the real world. Forget about social networks that connect people across their individual, atomized computing devices — the real social world is groups of people in a shared space (living room, office, classroom, store, etc.) interacting around information and data on a shared screen.

Until very recently, the way in which these TV monitors could be leveraged was limited to connecting a PC through an external display connector to a projector or directly to a TV. The recent breakthrough that Apple has fostered and advanced more than any other tech company is AirPlay and associated dual-screen features in iOS and Apple TV.

Specifically, Apple has provided the backbone for dual screen apps, enabling:

  • Any iOS device (and OS X Mountain Lion-enabled PCs) to broadcast its screen onto a TV. Think of this as essentially a wireless HDMI output to a TV. If you haven’t played with AirPlay mirroring features in iOS and Apple TV, give it a spin. It’s a really exciting development.
  • A set of APIs and an event model for enabling applications to become “dual-screen aware” (e.g. to know when a device has a TV screen it can connect to, and to handle rendering information, data and content onto both the touch screen and the TV screen).

With Apple TV already outselling the Xbox in the most recent quarter, we can see a world that goes from approximately 5 million dual-screen-capable Apple TVs to potentially 15-20 million in the next couple of years, and eventually to 30-50 million as new and improved versions of the Apple TV companion device come to market.

As a result, it’s an incredible time to experiment with this fundamental shift in computing, software and user experience, to embrace a world where the Tablet is the most important personal productivity device, and the TV is a rich and powerful surface for rendering content and applications.

How Dual-Screen Apps Will Work

As we rethink the TV as a computing surface for apps, it’s really helpful to have some ideas of what we’re talking about. Below is a series of hypothetical examples of what is possible today and, of course, what will be even bigger as these new dual-screen runtimes proliferate.

Buying a House: Imagine you’re looking into buying a house. You open your tablet app from a reputable home-listing service and perform a search using criteria that you care about and begin adding potential fits to a list of houses you’d like to explore. When you select a specific house, the app detects you’re connected to an Apple TV and launches a second screen on the TV that provides rich and large visual displays about the house — HD-quality photos and contextual information about the house. Here, the power of dual screen is the fact that you and your spouse can sit in the living room and explore a house together without crouching over a computer or tablet on someone’s lap, and the house can be presented with HD-quality media and contextual information.

Buying a Car: Imagine launching the BMW app on your tablet and deciding to both learn about car models and configure a car — like buying a house, often a “social” decision between partners. On the TV, the app renders a high-quality rendition of the car. As you explore the car’s features from your tablet, associated media (photos, video and contextual metadata) render onto the large TV in front of you. As you configure your car using your tablet, it updates a visual build of the car on the large screen, providing an inline HD video for specific features.

Kids Edutainment: Looking to introduce your three-year-old to key cognitive development concepts? Launch a learning app where the child interacts with the tablet application and sees visual information, animation and other content on the TV screen. Their touches on the tablet instantly produce rich and relevant content on the TV screen. Learning to count? Feed cookies over AirPlay to Cookie Monster on the TV, who eats and counts with you. Learning about concepts like near and far? Tap the tablet to make a character move closer to or farther away from you. Build a character on the tablet and watch the character emerge on the TV screen.


Sales Reporting: As a sales manager, you walk into your team conference room with a TV monitor mounted on the wall. You open your Salesforce.com app on your tablet and begin filtering and bringing up specific reports, and with the touch of a button you push unique visual reports onto the shared surface of the conference room TV. Here, the sales manager wants control of the searches and filters they have access to, and only wants to render the charts and reports that the whole team needs to see.

Board Games: Imagine playing Monopoly with your family in the living room — one or two or maybe even three touch devices present (phones, iPod touches, iPads). Each player has their inventory of properties and money visible on their device. The app passes control to each user as they play. On the TV screen is the Monopoly “board” with a dynamic visual that updates as users play — the movement of players, the building up of properties, etc.

The Classroom: A teacher walks into a classroom with an Apple TV connected to an HDMI-capable projector that projects onto a wall or screen. From their tablet, they pull up an application that is designed to help teach chemistry and the periodic table – they can control which element to display up on the screen, and the TV provides rich information, video explanations, etc. The app is designed to provide ‘public quiz’ functionality where the TV display shows a question, presumably related to material just reviewed or from homework; students raise their hand to answer, and then the answer and explanation are displayed.

Doctor’s Office: You are meeting with your doctor to go over test results from an MRI scan. The doctor uses his or her tablet to bring up your results, picks visuals to throw onto the TV monitor in the room, then uses his or her finger to highlight key areas and talk to you about what they’re seeing.

Retail Electronics Store: You’re at a Best Buy and interested in buying a new high-quality digital camera. A sales specialist approaches you with tablet in hand and asks you a few questions about what you’re interested in while tapping those choices into their tablet app. From there, the app brings up a set of camera options on a nearby TV display – based on further probing, they drill into a specific camera choice, which brings up a rich visual with a video overview of the specific camera that you’re interested in.

Consuming News: A major revolution has just broken out in a nation across the planet. Time has captured incredible audio, photos and video of the events. You and your friends sit down in front of the TV to learn more. You open the Time Magazine tablet app and bring up a special digital edition about the revolution. From the tablet, you flip through and render onto the TV rich HD-quality photographs, listen to first hand audio accounts (accompanied by photos) and watch footage from the events. The app renders a huge visual timeline of the events that led up to the revolution. It’s an immersive media experience that can be easily shared by friends and family in the living room.

Consuming Video: Last but not least, of course, dual-screen apps will be essential to any app that is about consuming video – whether a news or magazine app, a vertical website (think Cars.com, BabyCenter.com, AllRecipes.com, etc.), or of course a catch-up TV app from a TV network or show that you care about. You open the app on your tablet to explore what to watch, and when you’re ready to watch, the show instantly pops onto your TV in gorgeous HD quality, and the tablet app becomes your remote control and presents relevant contextual information about the video, episode or what have you.

The Coming Dual-Screen Revolution

This is such a groundbreaking approach to apps and software we expect lots of others to try and emulate what Apple is doing. Already, Microsoft is promoting the ability to use its Surface Tablet in conjunction with apps built for the Xbox. Samsung has introduced features in its tablets and TVs to enable easy media sharing from your tablet or phone onto a Samsung Smart TV, and surely Google will follow suit with similar features to AirPlay in the Android OS. Apple is still early in deploying this technology — it’s sometimes flaky and a little bit hidden from end-user view — but I expect major changes in the coming months and years.

Virtually every application that exists on the web, phones and tablets likely has a dual-screen use case. Simply put, web and app designers and developers need to imagine a world where the tablet and TV are a single runtime for their applications, with each screen providing distinct value: one for the user controlling the app and one for the user consuming rich media and information on a large display. Sometimes this is just one person (picking and watching a show, playing a game, or learning something), but crucially, and very often, I believe these apps will be designed with multiple users – and a social context – in mind.

Jeremy Allaire is CEO and founder of Brightcove, a global provider of cloud-content services that offers a family of products and developer tools used to publish and distribute professional digital media.

Fuji ceases film sales as digital continues to take over movie industry

Fujifilm has decided that the time has come for it to move on from film for motion pictures after 78 years. Starting in March of 2013, the last Japanese producer of negatives (for shooting) and positives (for projection) will cease sales of the majority of its film products for movies. Specifically, that means both color and black and white positives and negatives, intermediate film, sound recording film, and processing chemicals (the latter only in Japan, for now) will no longer be available from the company — essentially a complete pull out from the market. Rather surprisingly, Fujifilm thought it necessary to reaffirm that it will continue to sell film for still photography.

The move comes in the midst of a continuing sea change as filmmakers transition to digital cameras and an ever-growing number of movie theaters install digital projectors in place of film. As digital movie cameras like those from frontrunners Arri and RED have increased in quality, the cost, size, and convenience savings offered by digital have swayed filmmakers. There are still holdouts, and purists will continue to stick with film, but for Fujifilm it was clearly a business decision. It says it couldn’t keep production costs low enough in the face of severely dwindling demand to maintain the business.

Fortunately, Fujifilm will continue to offer its archival film – rated for 500 years – which is still one of the best ways to preserve movies for future generations. Additionally, the company will remain in the movie industry, offering its array of lenses for filming and projection as well as its on-set color management system. For those who want to continue to shoot on film, there are still options out there, but the loss of a major player is another clear sign that the industry is moving on.

source:  http://www.theverge.com

The iPad 3 morphs into a professional film camera
The iPad has just been turned into a fully functioning professional-standard digital film camera by a New York-based startup called The Padcaster LLC.

The company has created what it calls The Padcaster, which takes the humble iPad 3’s video capabilities and catapults them into professional level with the addition of an aluminium frame with threaded holes around the edges to attach external mics, lights and other accessories.

The frame can be connected to a professional tripod, monopod or shoulder mount and, crucially, the Padcaster has an optional ‘Lenscaster’ that attaches to the Padcaster and makes it possible to attach standard camera lenses to the iPad.

There’s even a 35mm lens adapter to enable cinema lenses to be strapped on to the iPad to capture shallow depth of field and provide DSLR-like focusing.

The Padcaster is aimed at video journalists, videographers and DSLR shooters and, says the maker, provides the opportunity to make “film-quality footage as an all-in-one production studio on the go”, capturing images with the device then using video and audio editing and grading apps freely available on the iPad to cut the footage.

The Padcaster and Padcaster/Lenscaster combo are currently available at a ‘special launch price’ of $149 and $189 respectively.

PADCASTER PRODUCT TOUR from Manhattan Edit Workshop on Vimeo.

Here’s a short film shot entirely on the iPad 3 using the Padcaster.

“Sprung Spring” – shot on the iPad3 (new iPad) with the Padcaster from Manhattan Edit Workshop on Vimeo.

http://www.televisual.com/news-detail/The-iPad-3-morphs-into-a-professional-film-camera_nid-1938.html

Is the Blu-Ray disc an endangered species?

The major studios are making a concerted effort to sell classic movies on Blu-ray, but streaming is rapidly taking over from optical discs.

Friday’s USA Today Money section lays out a compelling argument that physical media, at least in the form of Blu-ray discs, may be reaching the end of its golden era. Mike Snider writes that Blu-ray is “caught in shift to streaming” and that studios will make a major effort this holiday season to release many classic movies on Blu-ray. This, says Snider, means the Blu-ray “is reaching a critical juncture in its growth process”.

Why should the Streaming Media audience care about the sales of Blu-ray? Because direct competition for content and dwindling physical disc sales are both important harbingers for the streaming industry.

Direct competition. When streaming services like Amazon Prime and Netflix were first envisaged, the concept was day-and-date release of blockbuster movies for streaming at the same time that DVDs and Blu-ray Discs went on sale. What’s happened, in reality, is that the majority of content on streaming services has been classic movies, a few key television shows, and movies the studios consider too niche for strong Blu-ray sales.

That may be changing. According to Snider, the studios are making a concerted effort to bring a number of classic movies to Blu-ray in time for this year’s holiday season. While the list includes the Indiana Jones franchise and the recently re-released Titanic, it also appears studios may move to release lesser-known titles.

How low will Blu-ray prices need to drop to make disc purchases worthwhile for the average consumer? And how much will studios need to charge to push their marketing efforts forward?

The day-and-date model may not be dead: while just-released blockbusters like the first Hunger Games movie – which Epix holds exclusive rights to for ninety days – may not make it to Netflix for several months, movies that haven’t fared well in theaters are starting to trickle into the streaming lineup earlier.

Dwindling sales. So just how bad do projections of Blu-ray disc sales look over the next five years?

Preliminary 2011 sales numbers for overall DVD and Blu-ray Disc sales are estimated at $8.9 billion, with a projected drop to $5.5 billion by 2016, for a loss of $3.4 billion in annual revenues.

Disc rentals were at $4.8 billion in 2011 and are also projected to fall by 2016, to $2.9 billion annually.

In all, this means that disc sales and rentals will yield $8.4 billion in annual revenues by 2016.

The surge in streaming delivery of premium content, on the other hand, is expected to grow by $3.9 billion between 2011 and 2016, yielding $6.7 billion in annual revenues.

Sometime in 2017, then, revenues from streaming media delivery of premium content will exceed physical disc sales and rentals. Also, in less than a year, sometime in 2013, the number of streaming minutes for premium content will exceed the number of minutes viewed on DVD and Blu-ray Discs.
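That 2017 crossover can be sanity-checked with a quick calculation from the figures above. The straight-line interpolation between the 2011 and 2016 numbers is my own assumption, and streaming’s $2.8 billion 2011 base is implied by the $3.9 billion growth to $6.7 billion:

```python
# Sanity check of the crossover year, assuming straight-line change
# between the article's 2011 and 2016 revenue figures (in $ billions).
disc_2011 = 8.9 + 4.8            # disc sales + rentals in 2011 = 13.7
disc_2016 = 5.5 + 2.9            # projected sales + rentals in 2016 = 8.4
stream_2016 = 6.7
stream_2011 = stream_2016 - 3.9  # implied 2011 streaming base = 2.8

disc_slope = (disc_2016 - disc_2011) / 5      # about -1.06 per year
stream_slope = (stream_2016 - stream_2011) / 5  # about +0.78 per year

# Year offset t (after 2011) where the two lines meet:
t = (disc_2011 - stream_2011) / (stream_slope - disc_slope)
print(2011 + t)  # about 2016.9, i.e. "sometime in 2017"
```

On these linear assumptions the lines cross late in 2016, which matches the article’s “sometime in 2017” framing.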

There’s a chance that the cross-over point will happen much faster than that. If Blu-ray disc sales begin to drop off between the 2012 and 2013 holiday seasons, studios will be placed in a financial predicament, where day-and-date releases to DVD and Blu-ray will yield much lower sales than equivalent streaming revenues. The cost of physical disc production and distribution, coupled with more limited retail shelf space, may force studios to abandon optical discs more quickly than anticipated.

In other parts of the world, we’ll likely see two divergent models:

For the European market, where small production runs dubbed into the regional language don’t allow the economies of scale that English-language releases provide, we’ll see streaming delivery of dubbed premium content coupled with disc-based sales of subtitled English versions.

For Africa and the parts of Asia with lower infrastructure penetration, we’ll continue to see premium content delivered by optical disc for many years to come. It makes sense, given the fact that English-language movies are more widely watched in these small markets where studios have found it impractical to create dubbed or subtitled versions of all but the biggest blockbusters.

The only caveat for emerging markets is that they have no legacy infrastructure to work around. Just like we’ve seen in the surge in mobile phone sales in Africa and India, the ability to leapfrog from limited landline availability to widespread mobile handset adoption could serve as a model for rapid adoption of streaming media, effectively putting the final nail in the coffin of the optical disc.

source: StreamingMedia.com

New Technique Could Lead to Glasses-Free 3D in Theaters

Wired reported on research published in the journal Optics Express detailing a promising new system for glasses-free 3D viewing — long considered the holy grail of 3D technology, especially for home viewing. The system involves a polarization-preserving screen and a physical “parallax barrier polarizer” that allows four different screen views to be broken up vertically and delivered to only one eye at a time. It’s not immediately clear how well the system would work as viewers move around a room or cock their heads from side to side, but the scientists involved say they believe it will be useful for next-generation 3D theaters. Wired talked to a University of Arizona physicist who said the technology “is still in its infancy,” so you’ll probably have to break out those 3D glasses for Avatar 2 after all.

from Wired.com

Apple poised to enrage cable companies with new ad-blocking tech

In a move that is sure to strike fear into broadcasters and advertisers everywhere, Apple (AAPL) is apparently working on technology that would automatically shut off broadcast advertisements in favor of preloaded content. AppleInsider reports that a new Apple patent covers a system of “seamless switching between radio and local media” that will let mobile devices “automatically switch between broadcast content and stored media to offer the user a type of customized content consumption experience.”

So, for example, the new technology is capable of looking at a broadcaster’s typical scheduling and determining “when an upcoming broadcast segment or media item is not of interest to the user” before switching to other content. As AppleInsider notes, the technology is being developed “to include any audio or video that can be broadcast by a content source and received by an electronic device for playback,” meaning it could encompass both radio and television.

Needless to say, this type of innovation would qualify as “disruptive” for the broadcasting industry, and not in a good way. If recent events surrounding Dish and its “auto-hop” feature are any indication, Apple may be able to look forward to some fresh lawsuits if this technology ever finds its way to production devices.

Source: BGR.com

Kodak set to quit camera film and photo paper business

Professional photographers still value the unique feel that film gives to their pictures

Debt-struck photography pioneer Kodak says it may sell off its still-camera film and photo paper divisions.

The firm has already stopped making digital cameras as part of efforts to reduce its losses after filing for bankruptcy protection in January. It has also been trying to raise funds by selling off more than 1,100 digital imaging patents. It had originally planned to announce a buyer last week, but said “discussions continue” and a deal might not happen.

Apple and Google had been reported to have made rival bids for the patents, but the Wall Street Journal reports they have now joined forces, adding Samsung, LG, HTC and others to their consortium. The WSJ’s sources suggested the offer price for the portfolio would be about $500m (£315m) – well below the $2.6bn estimate that Kodak had suggested it could be worth.

The company recently reported a $665m net loss for the first six months of the year, putting further pressure on its finances.

Film’s feel

In its latest announcement the US company said it had hired investment bank Lazard to help it sell its Personalised Imaging and Document Imaging businesses. This would mean an end to its making films for still cameras, photo papers, souvenir photo products at theme parks, scanners and picture print-out kiosks at stores. It would leave the business focused on printers, cinema film stock and chemicals. The British Journal of Photography said the news would concern the industry.

“A lot of professionals still shoot with film and like the quality it gives them,” Olivier Laurent, news editor at the journal, told the BBC. “The resolution is still a thousand times higher than most digital cameras can offer so long as a good scanner is used.

“A film photograph has a different mood thanks to its grain – it’s about the love of the image and digital still has a hard time trying to reproduce that feeling.”

Source: BBC.com

Quantel’s Part 2 Examination of the Rise of Digital Film

In the second part of our look at the uptake of digital technology in the movie business we explore filmmaking innovators who are taking digital filming and post production to new levels of excellence.

Filmmakers and post-production specialists are breaking new ground in visual storytelling, facilitated by the increase in technological capabilities during acquisition, post and exhibition. The advances in Stereo3D are inextricably linked to developments in digital capture: James Cameron pioneered shooting on digital cameras custom-built for Ghosts of the Abyss – the first feature-length 3D IMAX production, released in 2003. Then Avatar set the bar for 3D features; however, Vince Pace, co-chairman of CAMERON | PACE Group, recently told The Hollywood Reporter that they “were experimenting with Avatar” and that they “could have gone further, but we wanted to make sure we found ourselves somewhere in the middle of concentrating on a good film and focusing on 3D elements. We didn’t want to compromise the actual film by taking away from the story for the sake of 3D.”

Pace went on to say that Martin Scorsese’s Hugo does far more than Avatar to showcase 3D filmmaking’s full capacity, because Scorsese did not force elements of 3D but instead molded it to get maximum effect from the artists and the creative team’s vision. The technological advances in S3D made since Avatar also enabled the Hugo team to utilise bespoke systems and equipment to watch high-quality stereo dailies, perfect the acquisition of stereo on set, and finish Hugo to such a standard that the stereo grading contributed to Scorsese earning a ‘Best Director’ nod at this year’s Golden Globes.

We asked Jonathan Tustain, Editor of leading website 3D Focus, to give us his take on future advances in the digital pipeline. “No doubt higher frame rate projection will become commonplace over the next few years thanks to influential directors like Peter Jackson and James Cameron filming big budget movies like The Hobbit and Avatar 2 at 48 frames per second. Cinema chains will be able to upgrade their digital projectors with a simple software update and, like 3D, the improved picture quality, particularly during fast motion sequences, will be marketed to draw audiences in.”

Digital drives quality

Stereoscopic 3D is not the only aspect of digital that is evolving on our cinema screens. As previously mentioned in part 1, 4K digital cinema projection systems are gathering momentum in the US and Europe, with audiences increasingly experiencing high-resolution movies in theaters, screened as the filmmaker intended.

The Girl with the Dragon Tattoo is one of the largest 4K movies to date, comprising almost a quarter of a million frames at 45 megabytes each. It was shot at 4.5K–5K resolution on RED Epic MX and Epic cameras and was finished at Hollywood post facility Light Iron on Quantel Pablo. The sheer volume of data captured required two color correction systems working non-stop to deliver a 4K DI nine reels long – the equivalent of around six two-hour 2K movies. The end result, when seen on a 4K projector, is arguably another cinematic milestone, demonstrating amplified creative and visual control for the filmmaker and a higher standard of picture quality for the movie-going public.
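Those figures imply a striking amount of data. A rough back-of-the-envelope check, using the approximate frame count and per-frame size quoted above (24 fps is an assumed standard cinema frame rate, not stated in the piece):

```python
frames = 250_000    # "almost a quarter of a million frames"
mb_per_frame = 45   # 45 megabytes per 4K frame
fps = 24            # assumed standard cinema frame rate

total_tb = frames * mb_per_frame / 1_000_000  # MB -> TB (decimal)
runtime_hours = frames / fps / 3600
print(f"{total_tb:.2f} TB over {runtime_hours:.1f} hours")  # 11.25 TB over 2.9 hours
```

Roughly eleven terabytes for a single feature helps explain why two color correction systems had to run non-stop to grade it.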

We spoke to Michael Cioni, CEO of Light Iron. He explains why he feels that his data-centric post house will continue to work on more productions like The Girl with the Dragon Tattoo. “My experience with most archetypes on the set has been the common desire to increase creative control. Creative control of the crew, control over the studio, image capture, performances, editing, art direction and even the overall pace of shooting are all things filmmakers tend to want to have creative control over. Yet film as a medium physically limits a degree of control in every category simply because you cannot inspect film until after it is developed. Delays of 18–48 hours are normal procedure for film review, which presents a massive boundary around the evaluation component of creative control. In the upcoming documentary ‘Side by Side,’ director Christopher Nolan says ‘There isn’t yet a superior or even equal imaging technology to film.’ In my opinion, subjective interpretation of film’s benefits is not the issue that warrants discussion. From my perspective, it is becoming ever clearer that the simplicity of how work flows ultimately commands what format is adopted. For the average filmmaker this renders aesthetic opinions on the subject a lesser priority (I see this happening at all budget levels). In the hands of the masters, the apex of today’s digital cameras has breached every measurable category that film used to champion as technically superior. In 36 months’ time, the gap will be so significant that this debate simply will not exist.”

The Case for Film

There are traditionalists and big names in cinema who are still firmly camped in film’s corner. Steven Spielberg shot the acclaimed War Horse on 35mm and was recently quoted as saying, “I’m still planning to shoot everything on film. I guess when the last lab goes out of business, we’ll all be forced to shoot digitally, and that could be in eight to ten years. It’s possible in ten years’ time there will be no labs processing celluloid.” Of course, Spielberg has already dabbled with digital technologies and 3D with Tintin, albeit in a motion-capture sense; however, he stressed, “It’s 100% digital animation but as far as a live-action film, I’m still planning to shoot everything on film… I love film.”

Christopher Nolan’s forthcoming conclusion to the Batman trilogy The Dark Knight Rises was filmed in 35mm and 70mm IMAX. Nolan resisted pressure from studio bosses at Warner Bros. to shoot and/or convert the final Batman in 3D. Quentin Tarantino is passionate about working with celluloid and from his remarks in the video below, like Spielberg, is not switching to shooting in digital anytime soon.

Filmmakers still have a choice when it comes to capturing and translating their vision for the big screen. But partly due to the rapid rise in digital projection, any motion picture captured on celluloid will almost inevitably end up being digitized anyway. We return to Light Iron’s CEO Michael Cioni, who sums it all up most eloquently:

“I don’t dislike film, but I do dislike unnecessary complexity. I think if film could be around forever, it would be used on some level forever. Because the art of storytelling has infinite possibilities, the ways in which to tell stories should not be quantified into a single format or flavor. By this, I am in favor of film being used whenever deemed appropriate by those who are most comfortable using it. That can be anything from those who are seeking a desired texture or who have a creative preference, such as Mr. Nolan. But there is a bigger question here which presents the real root of the problem: it’s not whether people have a desire to shoot film; the question is whether or not film can afford to be manufactured at all. I predict that man’s desire to shoot film will far outlast the manufacturers’ ability to produce it. Thus, the ultimate decision will be made for us.”

http://blog.quantel.eu/2012/02/the-rise-of-digital-in-motion-pictures-beyond-the-tipping-point-for-film-part-2-of-2/

CEA announces new standards for 3D at home

In an effort to bolster the 3D Home Entertainment industry, the Consumer Electronics Association (CEA) announced new standards for closed captioning, active 3D glasses specs as well as display brightness information for manufacturers and consumers.

“CEA’s standards committees are always looking for new ways to help grow the consumer electronics industry through technological cooperation,” said Brian Markwalter, senior vice president of research and standards at CEA.

The CEA-2038 standard will allow “active” 3D glasses to switch from 3D to 2D when content such as advertising is placed within a 3D program. Known as “Command-Driven Analog IT-Synchronized Active Eyewear”, the new spec also opens up the possibility of two separate viewers playing a 3D game seeing different images or game paths.

Another area that has been a tremendous challenge for providers and viewers is closed captioning. The CEA-708.1 standard provides broadcasters with information on where data should be encoded for closed captioning. The Society of Motion Picture and Television Engineers (SMPTE) is expected to release its own set of standards later this year.

The CEA Industry Forum is scheduled for October 14–17, 2012 in San Francisco, CA, with the annual International CES to be held January 8–11, 2013 in Las Vegas, NV.

CEA press release

Return top

About 3D & Digital Cinema

If you are a tech head, cinephile, movie geek or digital imaging consultant, then we'd like to hear from you. Join us in our quest to explore all things digital and beyond. Of particular interest is how a product or new technology can be deployed and how it impacts storytelling. It may be something that affects how we download and enjoy filmed entertainment. It may pertain to how primary and secondary color grading will enhance a certain tale. The most important thing is that you are in the driver's seat as far as what you watch and how you choose to consume it.