Archive for the ‘Apple’ Category

Final Cut Pro Editor explains why he is going back to Avid

Matt Toder has been editing video professionally for eight years, and currently works at Gawker.TV. These are his thoughts on Apple’s latest Final Cut Pro release.

I landed my first job in post-production in 2003 at a small house which used Avid exclusively. It had plenty of problems; we struggled with the Dragon error for a few months, converted to Xpress Pro when it came out, and then wrestled with that. There just weren’t any other options. And then Apple’s Final Cut Pro was released, although it too had some problems. But when Avid stopped listening to their customers and became more and more inflexible, Final Cut Pro became an increasingly attractive option. By 2009, significant portions of the editing community were using it.

And now we’ve been given a glimpse of FCPX, a massive, from-the-ground-up revision of Final Cut Pro which proves one thing definitively: that Apple understood many of the problems that were inherent to Final Cut Pro. But, instead of fixing them, they just decided to change everything.

At the preview event, Peter Steinauer, FCP Architect, assured the audience that FCPX was just as much for professional editors as FCP7 was. It really doesn’t seem that way, though. After getting through some of the technical aspects of what makes FCPX better than its predecessor in terms of processing power and such—which does seem awesome—Steinauer moved on immediately to color sync. He boasted that FCPX would make sure that pixels looked exactly the same throughout the editing process, noting “you can trust that the pixels coming off a pro file device track all the way through your workflow to display on the screen and ultimately out to output.” This all seems well and good, except it’s completely unimportant for professional editors who aren’t finishing in Final Cut. Some of us color correct in a DaVinci with a professional colorist and then conform in a Flame. Steinauer’s point proves the underlying key of FCPX: that it really isn’t for professional editors.

If it were a device for professional editors, FCPX wouldn’t require a complete rethinking of non-linear editing. It would have instead addressed some of the problems that Final Cut Pro presents for professionals, problems that have existed since day one and that have solutions in the Avid. Like the ability to save your export settings. Or the ability to have an unpackable project that allows editors to share bins instead of forcing them into creating multiple projects to share. Or a reliable shared media solution, like Unity or LanShare, so we don’t have to work off of local drives all the time. Or a reliable find bin command that doesn’t constantly tell you your clips aren’t in the browser when you know for a fact that they are. Or a title tool that not only allows you to kern your text but allows you to see what you’re doing in the sequence without having to click back and forth constantly. Or, as the most recent updates to Media Composer have, a way to read RED files directly and then export DPX files. Because, again, not everyone is finishing in Final Cut.

If this were truly a device for professional editors, those improvements would have been in FCPX, and Steinauer would have made a point of mentioning them considering the room he was playing to. But he didn’t. He also didn’t mention EDLs, OMFs, XMLs or any changes to the Media Manager that might make generating a cut list for telecine a little easier. Nor did he mention how the new Compound Clip feature would behave when EDLs are being generated from a sequence full of them.
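The EDLs in question are typically CMX3600-style cut lists: one event per line, with a source reel, track, transition type, and source/record timecode pairs. As a rough sketch of that layout (the reel name, frame rate, and column spacing here are illustrative assumptions, not FCP’s or Avid’s actual export logic):

```python
# Toy CMX3600-style EDL event formatter: a sketch of the file layout,
# not a spec-complete implementation.

def tc(frames, fps=30):
    """Convert a frame count to HH:MM:SS:FF timecode (non-drop)."""
    ff = frames % fps
    ss = (frames // fps) % 60
    mm = (frames // (fps * 60)) % 60
    hh = frames // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

def edl_event(num, reel, src_in, src_out, rec_in, fps=30):
    """Format one video cut event as a CMX3600-style line:
    event#, reel, track (V), transition (C = cut), then
    source in/out and record in/out timecodes."""
    dur = src_out - src_in
    return (f"{num:03d}  {reel:<8} V     C        "
            f"{tc(src_in, fps)} {tc(src_out, fps)} "
            f"{tc(rec_in, fps)} {tc(rec_in + dur, fps)}")

print("TITLE: DEMO SEQUENCE")
print("FCM: NON-DROP FRAME")
print(edl_event(1, "TAPE001", src_in=0, src_out=150, rec_in=108000))
```

A nested Compound Clip has no obvious representation in this flat event list, which is exactly why the question matters for anyone conforming elsewhere.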

The idea of Compound Clips speaks to another issue with FCPX. One of the hardest adjustments an Avid editor had to make when switching to Final Cut Pro was no longer being able to load a sequence into the source monitor and cut it into the sequence while maintaining master clip information; FCP turned it into a new clip, which really was just a workaround for not being able to generate video mixdowns. This meant that you couldn’t build a select string and then edit from it while still being able to match to your master clip. One would have hoped that FCPX would be able to do something like this, have a more nuanced understanding of the timeline, the way that Avid does, and improve upon a situation where every little move throws everything out of whack unless you’ve gone through and manually locked tracks.

Apple seems to know that keeping things in sync in Final Cut Pro was extremely problematic and has attempted to solve this with Clip Connections and the Magnetic Timeline. Clip Connections can lock a piece of video and its corresponding dialogue to, say, a specific sound effect so that they all travel together all the time. The Magnetic Timeline feature ensures that when this group is moved, you don’t get a clip collision or have to eliminate something from the next piece of media in the timeline. Instead, the next piece of media slides down one track in the timeline. Of course, the demo contains one track of video and two tracks of audio so it’s easy to see that everything works out. I wonder what will occur when you’ve got two pieces of video composited together with a title on top and your audio has dialogue, music, and a couple of sound effects. Will it move everything in the higher audio tracks down as well, thereby destroying the scheme of your timeline?
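The bumping behavior described above is easy to model, and the toy model also makes the worry concrete: any clip that would overlap gets pushed to the next lane, so in a dense timeline a single move can cascade upward. This is a hypothetical illustration of the idea, not Apple’s actual algorithm; the `Clip` fields and lane semantics are assumptions.

```python
# Toy model of "magnetic" lane assignment: a clip that would overlap an
# existing clip on its lane is bumped to the next lane instead of colliding.

class Clip:
    def __init__(self, name, start, dur):
        self.name, self.start, self.dur = name, start, dur
        self.lane = 0  # 0 = primary storyline in this toy model

    @property
    def end(self):
        return self.start + self.dur

def overlaps(a, b):
    return a.lane == b.lane and a.start < b.end and b.start < a.end

def place(clip, timeline):
    """Bump the clip upward until it no longer overlaps anything."""
    while any(overlaps(clip, other) for other in timeline):
        clip.lane += 1
    timeline.append(clip)

timeline = []
place(Clip("interview", 0, 100), timeline)
place(Clip("b-roll", 50, 60), timeline)   # overlaps interview, bumped to lane 1
place(Clip("title", 60, 20), timeline)    # overlaps both, bumped to lane 2
print([(c.name, c.lane) for c in timeline])
# → [('interview', 0), ('b-roll', 1), ('title', 2)]
```

With one video track the outcome is tidy; with composites, titles, dialogue, music and effects stacked up, every bump rearranges the lanes above it, which is exactly the “scheme of your timeline” concern.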

The biggest, most apparent change is the absence of the source monitor: it’s the iMovie-ing of non-linear editing. Of all the people watching the preview, applauding wildly and yelling out “I want it!” and “thank you,” I can’t believe that one person didn’t scream, “where’s the freaking source monitor?” This represents a gigantic change in the way non-linear editing occurs, a nearly unfathomable one. Since non-linear editing was invented, the mainstays have been the source monitor, the record monitor, the browser and the timeline. To take one of these away means that non-linear editing has to be rethought entirely. I’m not quite sure how you can set an exact in point without it, especially when you’re forced into using the iMovie yellow selection brackets.

All this being said, there certainly are some incredible things about FCPX, most obviously that it will render in the background and that no one will have to stare at the “writing video” dialogue box anymore. That really does sound great. And that it will analyze clips upon import so it will stabilize more quickly (although it already does the analyzing in the background). The FCPX function of analyzing clips for shot length and content (wide two shot, close single, etc.) also seems great, though it would have been nice for Steinauer to mention whether this increases import time or not. And since it’s doing all this during-import work, can it also provide a transcript of some sort? That would have been truly useful, because it takes a lot of work to find an interview subject saying the exact right phrase, much more work than scanning through dailies for the close-up series.

Another thing that I would have loved Steinauer to discuss is whether or not an editor can customize how clips are analyzed upon import and how find bin will work now. Specifically, where you will get thrown when you try to find a clip in the browser. Do you get thrown to the folder with other wide shots, with other two shots, with other sunset shots, or do you get thrown to the original master clip housed somewhere else? These are the questions that need to be answered, the ones that professionals are asking. Because these are the features that change individual workflow and force editors to alter the habits that they’ve developed over time.

(The audio also gets analyzed during import, to remove hum and balance levels. Do these adjustments hold when you export an OMF and do they carry over to ProTools? Who knows, Steinauer didn’t mention anything about the way FCPX talks to other applications.)

If this is the future of Final Cut Pro, and indeed non-linear editing, then that’s fine and I can’t change it. Just don’t tell me that it’s for pros when it requires changing the way you’ve been thinking about everything. And don’t make me change for the wrong reasons, reasons that exist because the improvements speak most to people who aren’t professionals. I love that editing is something that a lot of people can do now, that there’s a greater level of understanding about what it really takes to make a compelling piece out of a collection of images and sounds and your imagination. Editing, for me, is still where the magic is. It’s one thing to make changes for the sake of the people you claim are your clients and quite another to make changes for the sake of people who aren’t. That’s what these changes are: changes for the sake of making editing more accessible, not more functional.

FCPX shouldn’t be about helping people who don’t know what they’re doing, it should be about helping people who do know what they’re doing work better and faster and, most often, that means giving them the flexibility to work however they please, using the techniques they’ve developed over years of working in tough conditions. Because when you don’t have a Senior Creative Director sitting behind you, you don’t really have to worry about finding clips fast enough or making precise edits immediately. But when you are in that situation, you won’t have time to re-think the thing you’ve been doing for years and years.

When FCPX is released in June, the countdown will be on for FCP7. Whether it takes a year or possibly less, support will dry up and eventually it won’t be a viable editing platform anymore. I’m not gonna wait that long. Instead, I’ll reacquaint myself with my old friend Avid, catch up on what I’ve missed and fall back into the warm embrace of my fully customized appearance and keyboard settings. It’ll take a minute to get completely familiar with it, to remember everything, and even to be reminded of all the things that drove me crazy. But at least I’ll still have a source monitor.


Has Apple dumbed down FCP X or is this a step up?

Apple just introduced a new version of Final Cut at the Final Cut Pro Supermeet during NAB 2011 in Las Vegas, Nevada. Touted as “revolutionary as the first version” from 1999, Apple introduced the new Final Cut Pro X saying that every major broadcaster and filmmaker nowadays relies on FCP for their video editing needs.

Based on live updates coming from attendees at NAB 2011, Final Cut Pro X has been built from scratch, and it’s entirely 64-bit. It’s based on technologies like Cocoa, Core Animation, OpenCL and Grand Central Dispatch, and it focuses on image quality. It features a resolution independent timeline up to 4K for scalable rendering — in fact, it appears the old render dialog is gone entirely as the app uses the available CPU to keep files always rendered. FCP X allows you to edit while you’re importing thanks to its new engine, and it’s also got automatic media and people detection on import, as well as image stabilization.

Apple is promoting the new FCP X as a complete and total rebuild. Smart collections look very similar to iMovie, and overall there is a feeling Apple has borrowed some UI elements from the iLife application to make the general design more accessible, even for professionals. For instance, Apple has brought “single keystroke nesting” to Final Cut Pro — a new functionality that allows you to group chunks of media into a single clip in the timeline.  The “inline precision editor” allows you to make edits by revealing media with an iOS-like menu.


It’s possible that the GUI is more user friendly and the functionality has improved, but based on the comments and features presented today the jury is still out as to whether or not FCP will be going head to head with the competition.  Ease of use may not improve functionality so much as level the playing field for all the non-editors out there.  I am all in favor of making editing easier, but the roll-out today suggests a beta experience that does little to assist the professional editor in cutting a long form project.  It feels like a step backward on the time/space continuum, and I always get a little queasy when the word iMovie is mentioned in the same breath as Final Cut.

-Scott Arundale

UPDATE 4.13.11

Upon watching the demonstration in full, I got the impression there were some shills in the audience shouting their appreciation for the new features.

Now that calmer minds prevail let’s look at the upside.  Instant nesting with a single keystroke.  Easy keyword features.  Better sync and collision options.  Automatic color grading, stabilization and background rendering.  Excellent use of the 64-bit engine.  But all of this suggests Apple is more interested in the young editor cutting short form trailers than the longform editor trying to cut a feature, never mind the hapless assistant who must keep it all together.  Much of the work that FCP is trying to automate is normally the work of the assistant, but Apple presupposes that the editor is working solo.  Pity that editor if they don’t have a second pair of hands to help them.  I’m all in favor of having the machine do the work.  I’m ready as an Apple Certified Trainer to go back to school and re-learn how to cut faster and easier.  What I loved about FCP is that copy and paste makes it easy to move stuff around.  Apple has made it even “easier,” but again this requires a new mindset, i.e. it takes more thought and fewer keystrokes to achieve the same thing.  If this is the future then I am in.  But it raises the question: who is the target audience for this product?  Surely not the Hollywood narrative professional.  Instead the trailer/bumper/extreme sports crowd may find these new features useful.  For improved storytelling techniques, the jury is still out.


Final Cut Pro is long overdue for a real upgrade

Many complained that FCP vers. 7 was not really worthy of a new number but belonged in the vers. 6 family.  In my opinion Apple has always been trigger-happy with upgrades to all their software, but nevertheless much of their brain trust has been noticeably absent when it comes to improving their editing platform.  It has been assumed that the tech wizards were otherwise engaged in the cash cows of iPhone and iPad.  This new version is long overdue.

Apple’s Final Cut Pro made its debut at NAB in 1998 before being released as a product the following year. The software has a history of April releases, though its last major version came in July 2009. The software itself hasn’t been a standalone product for quite a while now, instead being wrapped up as part of Apple’s Final Cut Studio suite, which bundles Final Cut Pro together with Motion, DVD Studio Pro, and Soundtrack Pro, as well as the Color and Compressor applications.

Reports began circulating in late February that Apple was nearing completion on a complete overhaul of the software that would bring Final Cut Pro into the 64-bit era and, more importantly, see a release this spring. That report from TechCrunch, which cited anonymous sources, said that the redesign was both under the hood and in the form of a new user interface.

A new report from ProVideoCoalition says Apple plans on “taking over” the 10th Annual SuperMeet event taking place on April 12 to announce a new version of the software.

It may be time to break out the champagne.

Apple presses towards 3D handhelds

Apple has been awarded a patent for a 3D stereoscopic display system, fuelling rumours that it is considering adding 3D screen/projection technology to its products, including the iPhone, iPad or Mac computers. Alternatively, the company could be about to enter the 3DTV business – an intriguing prospect. The patent was first applied for back in 2006 but has just been granted, and it is a step in the right direction for Apple to bring about its own form of autostereoscopic (glasses-less) 3D display technology.

Source: HDGURU 3D

Cineform teams up with Aja to offer stereo workflow for the Kona 3

CineForm®, Inc., creators of high-fidelity compression-based workflow solutions for the post production marketplace, announced today that it has teamed up with AJA to offer full stereo 3D workflow support for the newly launched KONA 3G card, the multi-format SD/HD/Dual Link/3G/2K video I/O hardware for Mac.

As part of this cooperative effort, AJA released updated version 8.1 KONA software which adds 3D video controls to the KONA 3G’s Control Panel software interface, enabling direct ingest into, and playout of, CineForm 3D files, further simplifying production workflows for customers working with 3D content. During ingest, KONA 3G enables simultaneous real-time capture of separate left eye and right eye sources through HD-SDI – including sources previously recorded in stereo mode on HDCAM SR – directly into CineForm 3D files. The individual eyes are multiplexed together into a single CineForm 3D file that is available for immediate editing with CineForm’s Neo3D software when used in combination with Apple Final Cut Pro, Adobe Premiere Pro and other compatible software applications. The new KONA software also adds support for recording and playout of CineForm 4:2:2 2D media.

The AJA KONA 3G card featuring support for CineForm Neo3D is available immediately.

“One of our primary goals with the KONA 3G was to deliver a solution that could handle just about anything our customers are dealing with today. Stereo 3D, and our ability to support it, is rising to the top of the list for many customers,” said Nick Rashby, President, AJA. “Through our collaboration with CineForm, our customers who wish to work with stereo 3D in post can eliminate the time consuming step of transcoding material after ingest and instead be ready to edit immediately when using CineForm Neo3D.”

CineForm Neo3D is CineForm’s award-winning 3D post production workflow solution that enables users to edit 3D projects in real time with full frame rate playback to an external 3D monitor. With CineForm First Light 3D as the enabling 3D workflow and production engine, Neo3D users are provided comprehensive control of the 3D image processing workflow.

The new AJA KONA 3G provides professional editors with the utmost in workflow flexibility, supporting a broad range of video formats including: 10-bit uncompressed video 3G/HD/SD SDI I/O, new HDMI 1.4a output for stereoscopic monitoring to consumer 3D displays, 8-channel AES digital audio I/O (16-channel AES with optional K3G-Box) and 16-channel SDI embedded audio I/O, real-time hardware-based up/down/cross conversion to support a range of SD and HD formats, dual-link HD, even 2K formats, a hardware-based downstream keyer and more.


If you are old enough to remember the Viewmaster

Hasbro Inc. is betting that iPod and iPhone users want 3-D viewing on the go.

The nation’s second-largest toy maker is set to unveil to investors on Tuesday a handheld device called My3D that attaches to the two Apple Inc. devices. It promises three-dimensional content that offers a 360-degree experience in gaming, virtual travel experiences and entertainment content. It’s aimed at both children and adults.

The device, which resembles a pair of binoculars with a slot in which users insert their iPod or iPhone, will be priced at $30. It will be available starting next spring at stores where Apple’s iPhones and iPod Touches are available.

Shoppers can then visit Apple’s App Store, which will allow them to browse for additional My3D content. Content varies in price; some apps will be free.

Hasbro said it was guided by Apple during development and believes there’s nothing available that matches the quality and 3-D experience on the iPhone or iPod Touch.

If it catches on, it has big potential. More than 125 million iPod Touches and iPhones have shipped, according to Shaw Wu, senior research analyst at Kaufman Bros. L.P. He predicts that will hit 200 million by the end of 2011.

“The issue with this is whether they are going to get enough content for it,” Wu said.

Hasbro is confident it will and says it has teamed up with Dreamworks Animation, whose movie “Megamind” hit theaters last weekend, to develop material.

Separately, Hasbro’s My3D will use content from a 3-D television network from Discovery, Sony and Imax scheduled to make its debut next year. Viewers will be able to see trailers and exclusive behind-the-scenes snippets from films for up to 20 minutes. Hasbro says the device will be a key way to market its own brands in a 3-D experience, though details haven’t been set.

Meanwhile, Hasbro worked with LA Inc., the Los Angeles Convention and Visitors Bureau, to create virtual travel experiences that include visits to the Wax Museum and the Santa Monica Pier.

Through other apps, users can feel like they’re immersed in deep water, exploring coral reefs or playing a shark attacking a tuna, while all along learning facts about sea life. There are also shooter games in a virtual galaxy.

“The idea of being able to be somewhere in Los Angeles, in this 360-degree environment, to be in the shark tank, to be able to swim with the fish and chase after the fish. These are really breakthrough immersive experiences,” said Brian Goldner, president and CEO of Hasbro.


Apple making noise like it may re-enter TV sweepstakes (or why I hate cable and satellite)

7.6.10  Apple employees are sworn to secrecy but somebody is talking over there.

“The people familiar with the company’s plans also said that Apple executives are well aware that the battle for the living room is going to be arduous, and that the company must get it right the next time.”

My wife gave me an Apple TV last Christmas but I asked her to return it to the store.  Looking at the home entertainment console groaning under the weight of the cable box, the Wii and the DVD player, I wasn’t ready to add another level of confusion, not to mention a fourth remote.  But quietly I hoped that someday there might be a way to cut the cable cord and lose some of the expensive clutter.

UPDATE  8.22.10
A la carte television sounds like a wonderful thing, particularly since I don’t want to pay for all those sports and home shopping channels, and besides, the “over-the-air” nets should be free, right?  So why has it taken so long to break the cable and satellite monopolies?  Perhaps we need the prophet Jobs and his army at Apple to topple the old temple of Sodom and Gomorrah.  Kevin Rose swears it will happen but don’t hold your breath.
Hell, I will even buy an iPad if it will free me of the local cable bill.  Just get me out of this indentured servitude. Please.
Update: 8.25.10
According to The Wall Street Journal:
The company is working on a new device that would allow users to stream video, such as rentals, to their TV sets, according to a person with knowledge of Apple’s plans. Unlike Apple’s existing Apple TV hardware, which stores downloads users can access on their televisions, the new device would act as a conduit for streaming media more directly, and could be announced as early as September, the person said.

Apple declined to comment.

Lower prices for TV shows, along with the new TV-streaming device, could help Apple in the pitched battle to pipe content into American living rooms. Traditional cable- and satellite-TV providers are already facing competition from companies including Netflix Inc. and Hulu LLC. Google Inc. soon plans to roll out its own Web-TV service, too.

Media companies, however, have been wary of pumping too much content online, worried that they could encourage viewers to cancel their monthly TV subscriptions. The tens of billions of dollars media companies make each year from monthly bills are a key source of profits.


Ladies and Gentlemen – Introducing Avid MC vers. 5

For those who would like to demo the latest version of Avid Media Composer vers. 5, there will be a reception at Dodge College this Thursday, July 15 with a hands-on opportunity to work with the system, hosted by Keycode Media.

It starts at 6pm for food and drink

Demo starts at 6:30pm in the Folino and moves into separate rooms.

Register today to save your spot.

Applications Specialist, Casey Richards, presented the new features that Avid is touting to a crowd of Hollywood editors and post supervisors today. The interface has changed somewhat and the reaction was generally positive as witnessed by the rounds of applause following the introductions.

Drag & Drop Editing Tools

First up was the Smart Tool Set, a small unobtrusive window that reminds one of the original toolbox found in Premiere and subsequently FCP.  With single mouse clicks, clips can be dragged and dropped in the timeline, trims effected and so on.

Audio is New & Improved

Audio waveforms are drawn immediately and there is no re-draw during changes as the information is already cached.  Real-Time AudioSuite (RTAS) offers sound processing plugins that do not require rendering and can be auditioned and tweaked on the fly.  In fact the GUI is now looking and behaving more like a Pro Tools session, with Solo and Mute buttons on each track.  You also have 5 containers in which to drop each audio plug-in and arrange them in terms of hierarchy.  Stereo pairs can be embedded in a single audio channel.  Clip Gain and Auto Gain can be accessed right at the track level.

Closed Caption Support

Will Brown of CPC demonstrated their MacCaption software, which integrates easily with Avid.  He showed how an .scc file is formatted and exported using AAF and then simply dropped on the timeline as a data stream.  It can even be edited alongside the picture.  Unfortunately the text windows cannot be displayed anywhere in the Avid, and he cautioned that trimming and editing the data stream can result in truncated captions.  Avid now has the ability to ingest/capture/separate these embedded streams using the Media Creation tool and supports AFD, V-Chip and XDS.
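The .scc format itself is simple: after a `Scenarist_SCC V1.0` header, each caption line is a timecode, a tab, and a run of hex byte pairs carrying CEA-608 data. A minimal sketch of splitting such a line (this parses the file layout only; parity stripping and control-code decoding are omitted, and the sample pairs are illustrative):

```python
# Minimal .scc caption-line parser: timecode + CEA-608 byte pairs.
# A sketch of the Scenarist file layout, not a full caption decoder.

def parse_scc_line(line):
    """Return (timecode, [byte pairs as ints]) for one SCC caption line."""
    timecode, data = line.strip().split("\t")
    pairs = [int(word, 16) for word in data.split()]
    return timecode, pairs

line = "00:00:01:00\t9420 9420 94ae 94ae"
tc, pairs = parse_scc_line(line)
print(tc)            # 00:00:01:00
print(len(pairs))    # 4
```

Because the captions live in these timed byte pairs rather than in editable text, it is easy to see why trimming the data stream mid-caption can leave a truncated result.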

Edit Natively using AMA

Using Avid Media Access, Media Composer 5 can handle RED (.R3D), QuickTime (.MOV and H.264, including footage from Canon’s now-famous DSLR cameras such as the 7D and 5D), XDCAM, AVCHD, and the latest Canon XF codec, which was announced at NAB this year.

The RED workflow provides full, half or quarter debayering and allows for the inclusion of custom lookup tables (LUTs). MC5 can import RMD, RLX or RSX files, thus eliminating the need for REDCINE. It should be noted that the native footage cannot be shared in ISIS or Unity environments without first consolidating (rewrapping) to MXF.

Mix & Match Frame Rates and Resolutions

Like AMA this is not a new feature but one that Avid continues to promote.  After years of intolerance on the part of the project settings, the fine folks of Tewksbury (soon to be Burlington, Mass.) have come up with a “Swiss Army knife” solution, meaning broadcast-quality real-time conversion, with the ability to adjust frame interpolation and pull-down.  Impressive, although the green dot is always a little intimidating.

Support for External HD Display

Avid has partnered with Matrox to offer the MXO2 Mini as an inexpensive solution to the HD client monitoring problem.  Avid goes to great pains to stress that this is an “output only” device even though other applications can utilize its input functions.  Instead they point to the firmware update for the Nitris DX box, which now enables the second video spigot and allows for full-quality 4:4:4 HD-RGB color space processing in Media Composer using dual HD-SDI connections.

The Bottom Line:

This upgrade is a must-have.  They claim it is software based and does not require any hardware acceleration, although the Nitris DX figures prominently in most profe$$ional solutions.  Avid has managed to fire a shot across the bow of what FCP has had to offer and place the ball firmly in Cupertino’s court as to who can provide the most robust professional solution.  I, for one, am glad to see that Avid’s marketing dollars have FINALLY been spent where they were needed most, which is software development.  For too many years they have ignored Hollywood, believing that the global market would carry them to further success.  Game on!

YouTube aims to be 800 lb. gorilla of broadcasting

Not content to dominate online video viewing, yesterday YouTube unveiled new initiatives for viewing on both TVs and mobile devices. Taken together they demonstrate how aggressive YouTube plans to be in the 3-screen viewership era.

First up, YouTube introduced the beta version of “Leanback,” the new 10-foot experience that it introduced at the recent I/O conference. With Leanback, you only need to use the 4 arrow keys and Enter key on your keyboard to navigate the YouTube experience. Video plays in full-screen mode and automatically in HD when available.

There are different options for what content Leanback delivers: if you have set up subscriptions, it will give you a feed of those videos; in addition, if you’ve connected your YouTube account to your Facebook account you’ll also get a feed of videos your friends are watching/sharing; alternatively, if you’ve done neither YouTube will simply give you the most popular comedy, entertainment, news, etc. You can also easily search and browse.

It is easy to envision Leanback being a couch potato’s nirvana: just fire up Leanback and sit back and have all the day’s most interesting videos begin unspooling. As Will Richmond wrote in early June, he continues to believe Leanback is a whole new “channel” experience and will give Google TV, when it launches, a huge tailwind.

“YouTube was a superb acquisition for Google. The deal looks more strategic by the day, especially in light of Google’s intensifying battle with Apple.”

Meanwhile, YouTube also rolled out a newly upgraded experience for mobile phones yesterday. The new site promises to be faster and also more mobile-friendly, with larger buttons and additional features carried over from the web site. The effort builds on the mobile site YouTube has offered for several years. YouTube also revealed that playbacks grew by 160% in 2009 and are now up to an astounding 100 million per day. More interestingly, YouTube appears to be trying to migrate users away from the YouTube mobile app which Apple has offered on the iPhone for a while. YouTube is promising that web site updates will be followed quickly on the new mobile site, “unlike native apps which are not updated as frequently.”

With the new Leanback and mobile initiatives YouTube is continuing to evolve from its roots as just a user generated content web site. YouTube appears to be firing on all cylinders: hitting a record 14.6 billion views in May, landing big-time advertisers like VISA/Toy Story 3 and Xbox’s Kinect, and adding popular premium content like World Wrestling Entertainment and a new project produced by Ridley Scott. YouTube’s biggest day is still ahead of it, when it will no doubt take center stage in the high-profile launch of Google TV later this year.


We knew it would come to this…video editing on your iPhone!

A First Look at video editing for the iPhone

I can barely manage an editing session on my laptop.  Imagine my chagrin when the client hands me the phone and asks me to trim a few frames and add a new background and titles to the timeline.


About 3D & Digital Cinema

If you are a tech head, cinema-phile, movie geek or digital imaging consultant, then we'd like to hear from you. Join us in our quest to explore all things digital and beyond. Of particular interest is how a product or new technology can be deployed and impacts storytelling. It may be something that effects how we download and enjoy filmed entertainment. It may pertain to how primary and secondary color grading will enhance a certain tale. The most important thing is that you are in the driver's seat as far as what you watch and how you choose to consume it.