Archive for the ‘DI Workflow’ Category

Final Cut Pro is long overdue for a real upgrade

Many complained that FCP version 7 was not really worthy of a new number but belonged in the version 6 family. In my opinion, Apple has always been trigger-happy with upgrades to all its software, yet much of its brain trust has been noticeably absent when it comes to improving the editing platform. It has been assumed that the tech wizards were otherwise engaged with the cash cows of the iPhone and iPad. This new version is long overdue.

Apple’s Final Cut Pro made its debut at NAB in 1998 before being released as a product the following year. The software has a history of April releases, though its last major version came in July 2009. Final Cut Pro itself hasn’t been a standalone product for quite a while, instead being bundled into Apple’s Final Cut Studio suite alongside Motion, DVD Studio Pro and Soundtrack Pro, as well as the Color and Compressor applications.

Reports began circulating in late February that Apple was nearing completion on a complete overhaul of the software that would bring Final Cut Pro into the 64-bit era and, more importantly, a release this spring. That report from TechCrunch, which cited anonymous sources, said that the redesign was both under the hood and in a new user interface.

A new report from ProVideoCoalition says Apple plans on “taking over” the 10th Annual SuperMeet event taking place on April 12 to announce a new version of the software.

It may be time to break out the champagne.

Roger Deakins may be calling it quits: “Whether I’ll Shoot on Film again, I Don’t Know”

Nine-time Academy Award-nominated cinematographer Roger Deakins (The Coen Brothers films, The Shawshank Redemption, A Beautiful Mind, The Reader, Kundun) has seen the future, and it isn’t 35mm. Deakins has worked on film for 35 years. He is the type of veteran whom you would expect to be a film purist. Last year, for the first time in his long career, Deakins decided to shoot a feature-length movie (Andrew Niccol’s science fiction thriller Now) using digital cameras, and he’s not sure he’ll be going back to celluloid.

The technology and how it’s changing and the possibilities that are coming. This film Now, I’m shooting on a digital camera (Arri Alexa). First film I’ve shot digitally, because, frankly, it’s the first camera I’ve worked with that I’ve felt gives me something I can’t get on film. Whether I’ll shoot on film again, I don’t know. [Shooting digitally] gives me a lot more options. It’s got more latitude, it’s got better color rendition. It’s faster. I can immediately see what I’m recording. I can time that image on set with a color-calibrated monitor. That coloring goes through the whole system, so it’s tied to the metadata of the image. It goes through the whole post-production chain, so it’s not a case of being in a lab and having to time a shot on a shot-by-shot basis, because this has already got a control on it that’s set the timing for the shot, you know?

Am I nostalgic for film? … I mean, it’s had a good run, hasn’t it? You know, I’m not nostalgic for a technology. I’m nostalgic for the kind of films that used to be made that aren’t being made now.

The grain is unique, but on this film Now that I’m doing, I’m probably going to add grain for certain sequences where I feel that they would benefit having grain, just the look and the texture of it. Yeah, there are certain things about film emulsion that I love, and for certain projects, absolutely. I would certainly consider shooting film again, but you can add grain to a digital image. And, frankly, it’s not the technology that makes the great movies. I mean, if you went back to see Citizen Kane and you looked at it on a big screen and you looked at the quality of the image, I mean, frankly, some of it is not very…well, good’s not the right word, because technically it’s not as sharp. Some of it is very grainy. The lens quality is not as good as modern lenses. But…[Laughs] it’s still a better film than ninety-nine percent of what are made today. So, you know, it’s not just about technique and equipment.

Interview with David Chen
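The on-set “timing” Deakins describes typically travels through post as an ASC CDL: a few slope, offset and power numbers stored with the shot’s metadata. A minimal sketch of the idea (the grade values below are invented for illustration, not from any production):

```python
def apply_cdl(value, slope=1.0, offset=0.0, power=1.0):
    """Apply an ASC CDL primary correction to one normalized code value.
    The CDL transfer function is out = (in * slope + offset) ** power,
    applied per channel; negative values are clamped before the power step."""
    v = value * slope + offset
    v = max(v, 0.0)
    return v ** power

# A "warming" grade decided on set: lift red slightly, pull blue down.
# These slope/offset/power triples are made-up example values.
cdl = {"R": (1.05, 0.01, 1.0), "G": (1.00, 0.00, 1.0), "B": (0.95, -0.01, 1.0)}

def grade_pixel(rgb):
    """Grade one (R, G, B) pixel with the per-channel CDL above."""
    return tuple(apply_cdl(c, *cdl[ch]) for c, ch in zip(rgb, "RGB"))

mid_grey = grade_pixel((0.18, 0.18, 0.18))  # red channel now sits above blue
```

Because the correction is just these few numbers rather than baked-in pixels, it can ride along with the image files through every stage of post, which is the point Deakins is making.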

Sony aims for new camera to compete with the Red One

Extending itself further into the independent filmmaking arena, Sony has unveiled its first professional handheld digital production camera with a Super 35mm imager. Dubbed PMW-F3, the camcorder will ship in February 2011, at a list price of $16,000 for the body only, or $23,000 for a kit that includes three Sony-branded T2.0 PL-mount prime lenses at 35mm, 50mm and 85mm.

Sony is positioning the camera as a bridge between high-end ENG acquisition and feature filmmaking. It’s based on the company’s XDCAM EX platform and has been designed to support high-end workflows with a Super 35mm-sized CMOS sensor and optional dual-link HD-SDI output.

The camera made its U.S. debut on November 17 at a gathering of students and pros at USC’s School of Cinematic Arts. Next month it will be shown at NYU. Why the academic flavor? “There are future Oscar winners in this room,” said Alec Shapiro, senior vice president of Sony’s pro solutions group.

Former Panavision exec Andy Romanoff believes that, in addition to indies, studio filmmakers might end up using the F3 as a B camera. However, “the typical studio film will stick with higher-end cameras for most of their photography,” he said.

Naturally, Sony hopes those first-unit cameras will be its F35s or SRW-9000s.


The new PMW-F3 is Sony’s third Super 35mm CineAlta digital camcorder and is based on the XDCAM EX platform. The specs of the F35 and SRW-9000PL still exceed those of the F3, we’re told, but this camera is no slouch, and footage shot with all three cameras should intercut. The F3, above all, is still a handheld camcorder. It doesn’t sit on your shoulder. It weighs about 5 pounds, which is lighter than many of the lenses you’ll be using. There’s a tilting viewfinder at the rear of the top handle, similar to the HVR-Z7U finder at about 1.2 million pixels. An LCD monitor pivots out from the camera’s left side.

The new F3 camcorder is based on Sony’s XDCAM EX technology, with two SxS ExpressCard slots at the back. The Super 35mm CMOS imager promises high sensitivity and low noise levels. The ballpark sensitivity rating is approximately ISO 800, and unconfirmed reports hint at an exposure range greater than 13 stops. The adventure continues.

There are dual-link HD-SDI outputs at the rear of the F3 for external recording (4:2:2 1080 50/59.94p as standard, and RGB 1080 23.98/25/29.97PsF as an option). You’ll be able to select S-Log and HyperGamma to seriously increase the dynamic range. S-Log is Sony’s logarithmic gamma encoding, its take on the “digital negative.” The image, uncorrected, looks pale and washed out (like a negative), but when a Look-Up Table (LUT) is applied, it shows the full dynamic range of the image, giving you greater flexibility for color and contrast correction in post.
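As a rough illustration of why log footage looks flat until a viewing LUT is applied, here is a toy sketch. The log curve and LUT below are invented for illustration and are not Sony’s actual S-Log math:

```python
import math

def toy_log_encode(linear):
    """Toy logarithmic encoding: compresses scene-linear values (0..16,
    i.e. several stops above mid-grey) into a flat-looking 0..1 signal.
    NOT Sony's S-Log specification -- just the general shape of the idea."""
    return math.log2(1.0 + linear) / math.log2(17.0)

def build_viewing_lut(steps=33):
    """A 1D viewing LUT: a table of display values, one per equally spaced
    input code value. This one inverts the toy log curve and applies a
    display gamma of 1/2.2."""
    lut = []
    for i in range(steps):
        code = i / (steps - 1)
        linear = 2.0 ** (code * math.log2(17.0)) - 1.0   # invert the log
        display = min(linear / 16.0, 1.0) ** (1 / 2.2)   # normalize + gamma
        lut.append(display)
    return lut

def apply_lut(code, lut):
    """Look up a code value in the 1D LUT with linear interpolation."""
    pos = code * (len(lut) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1 - frac) + lut[hi] * frac

lut = build_viewing_lut()
flat = toy_log_encode(2.88)      # a bright mid-tone sits high on the flat curve
graded = apply_lut(flat, lut)    # the LUT maps it back to a sensible display value
```

The uncorrected log signal packs many stops into a narrow-looking range; the LUT is what restores normal contrast for viewing, without ever throwing away the extra latitude in the recorded file.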

The F3 records natively onto SxS cards at 35 Mbps, 4:2:0 8-bit, in the XDCAM EX format. The cards use the standard FAT file system; a 32 GB card will record about 100 minutes at highest quality. Many users will be happy with this. But, like Oliver Twist, many will want more. And they can have more, with the ability to use the onboard SxS cards as immediately editable proxies while simultaneously recording to a higher standard. That might include 4:4:4 10-bit S-Log over dual-link HD-SDI to an SRW-1/SRPC-1 SR tape recorder at visually lossless 440 and 880 Mbps, or (next year) 1 TB solid-state memory cards at 220 and 440 Mbps.
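That quoted capacity is easy to sanity-check. A back-of-the-envelope record-time calculator (the 20% overhead factor for audio, metadata and filesystem bookkeeping is a guess on our part, not a Sony figure) lands close to the published ~100 minutes:

```python
def record_minutes(card_gb, video_mbps, overhead=1.0):
    """Rough record-time estimate for a memory card.
    card_gb:    marketed capacity in gigabytes (10^9 bytes)
    video_mbps: video bitrate in megabits per second
    overhead:   multiplier covering audio, metadata and filesystem
                overhead (an assumed fudge factor)."""
    usable_bits = card_gb * 1e9 * 8
    seconds = usable_bits / (video_mbps * 1e6 * overhead)
    return seconds / 60

naive = record_minutes(32, 35)            # video stream alone: ~122 minutes
realistic = record_minutes(32, 35, 1.2)   # with ~20% overhead: ~102 minutes
```

The gap between the naive number and the spec sheet is exactly where audio tracks, wrapper overhead and FAT bookkeeping go.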

Recording formats include 1920 x 1080, 1440 x 1080, and 1280 x 720 at 23.98/25/29.97p, 50/59.94i and, in DVCAM mode, 25/29.97PsF and 50/59.94i. Under- and overcranking is called S & Q for “slow” and “quick” recording, from 1 to 30 fps at 1920 x 1080 (17 to 30 fps in dual-link mode) and 1 to 60 fps at 1280 x 720 (17 to 60 fps in dual-link mode).

Who’s going to shoot with Sony’s F3, and how? If you’re a student or independent, you’ll probably take the simplest package possible: a zoom or primes, record to onboard SxS cards, and go straight to edit. Of course, you’ll be sure to diligently back up those SxS cards using Sony’s PXU-MS240 Mobile Storage Device, which not only copies the cards but also carefully verifies the data to be sure it’s all there (parity). Next, you’ll bring the SxS footage into your Avid or Final Cut Pro system. Sony’s Cinemon plug-in makes the MPEG-4 files transparent to QuickTime, so you’ll be able to edit natively in FCP, with drag and drop, and all files instantly viewable on a Mac. Avid’s AMA (Avid Media Access) plug-in mounts the XDCAM EX files directly into Avid Media Composer.
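The PXU-MS240’s verification is built into the device, but the same idea, copy then confirm by checksum before a card is ever reused, can be sketched in a few lines (the directory names and workflow here are illustrative, not Sony’s):

```python
import hashlib
import shutil
from pathlib import Path

def sha256_of(path, chunk=1 << 20):
    """Checksum a file in 1 MB chunks (camera files are large)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def verified_copy(src_dir, dst_dir):
    """Copy an entire card structure and confirm, checksum by checksum,
    that every copy matches its source before the card is trusted for
    reformatting. Raises IOError on any mismatch."""
    src_dir, dst_dir = Path(src_dir), Path(dst_dir)
    for src in src_dir.rglob("*"):
        if not src.is_file():
            continue
        dst = dst_dir / src.relative_to(src_dir)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)
        if sha256_of(src) != sha256_of(dst):
            raise IOError(f"verification failed for {src}")
    return True
```

A copy that hasn’t been verified is not a backup; the checksum pass is what earns the card its reformat.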

If you’re shooting documentaries, commercials or TV, you might follow a similar path. Of course, you will not reformat your SxS Cards until the job is safely completed and many archives and copies have been cloned. Cards are relatively cheap. The dreaded word “Oops” is very expensive when a once-in-a-lifetime scene is re-formatted.

High-end productions, recording to SR tape or memory, should soon have native support for the SR codec in Avid and Final Cut Pro. The HD-SDI outputs of the Sony F3 will be eyed with great interest by the high-end aftermarket recording gurus at Codex, Cinedeck and elsewhere.


CineForm teams up with AJA to offer stereo workflow for the KONA 3G

CineForm®, Inc., creators of high-fidelity compression-based workflow solutions for the post production marketplace, announced today that it has teamed up with AJA to offer full stereo 3D workflow support for the newly launched KONA 3G card, the multi-format SD/HD/Dual Link/3G/2K video I/O hardware for Mac.

As part of this cooperative effort, AJA released updated KONA software (version 8.1), which adds 3D video controls to the KONA 3G’s Control Panel software, enabling direct ingest into, and playout of, CineForm 3D files and further simplifying production workflows for customers working with 3D content. During ingest, the KONA 3G enables simultaneous real-time capture of separate left-eye and right-eye sources through HD-SDI, including sources previously recorded in stereo mode on HDCAM SR, directly into CineForm 3D files. The two eyes are multiplexed together into a single CineForm 3D file that is available for immediate editing with CineForm’s Neo3D software when used in combination with Apple Final Cut Pro, Adobe Premiere Pro and other compatible applications. The new KONA software also adds support for recording and playout of CineForm 4:2:2 2D media.

The AJA KONA 3G card featuring support for CineForm Neo3D is available immediately.

“One of our primary goals with the KONA 3G was to deliver a solution that could handle just about anything our customers are dealing with today. Stereo 3D, and our ability to support it, is rising to the top of the list for many customers,” said Nick Rashby, President, AJA. “Through our collaboration with CineForm, our customers who wish to work with stereo 3D in post can eliminate the time consuming step of transcoding material after ingest and instead be ready to edit immediately when using CineForm Neo3D.”

CineForm Neo3D is CineForm’s award-winning 3D post production workflow solution that enables users to edit 3D projects in real time with full frame rate playback to an external 3D monitor. With CineForm First Light 3D as the enabling 3D workflow and production engine, Neo3D users are provided comprehensive control of the 3D image processing workflow.

The new AJA KONA 3G provides professional editors with the utmost in workflow flexibility, supporting a broad range of video formats including: 10-bit uncompressed video 3G/HD/SD SDI I/O, new HDMI 1.4a output for stereoscopic monitoring to consumer 3D displays, 8-channel AES digital audio I/O (16-channel AES with optional K3G-Box) and 16-channel SDI embedded audio I/O, real-time hardware-based up/down/cross conversion to support a range of SD and HD formats, dual-link HD, even 2K formats, a hardware-based downstream keyer and more.


Fraunhofer Institute offers STAN, an essential tool for stereographers

The Stereoscopic Analyzer (STAN) combines realtime image analysis with visualization tools to assist cameramen and post-production staff in shooting correct stereo content.

Shooting and processing high-quality 3D content is a huge challenge for production teams. A wide range of parameters like color matching, stereo geometry and the orientation of the two cameras may vary from scene to scene depending on content, near and far objects, the convergence plane and the depth of focus.

Developed by the Fraunhofer Heinrich Hertz Institute, the Stereoscopic Analyzer (STAN) assistance system supports camera operators and stereographers in computing the stereo parameters and camera settings critical for stereo quality. STAN feeds the parameters directly to both cameras so that incorrect settings can be identified and adjusted, either manually or, if the set-up is a motorized stereo rig, automatically.

STAN captures and analyses stereo images in real-time. Metadata can be generated and saved for a streamlined post production process. Corresponding feature points in the scene are matched automatically to determine the given disparity range, and to compute stereo calibration data. Using actuators, the stereo baseline and other mechanical parameters of the stereo rig can be adjusted automatically so that the specified disparity range is not exceeded. Residual distortions in color and stereo geometry can be corrected using real-time color matching.
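The disparity-range analysis STAN automates can be sketched from first principles: given matched left/right feature points, horizontal disparity is simply the difference in x. The 2%-of-screen-width budget below is a common stereography rule of thumb, not a STAN specification:

```python
def disparity_range(matches):
    """matches: list of ((xl, yl), (xr, yr)) corresponding feature points
    in the left and right images, in pixels. Horizontal disparity is
    xr - xl: negative values appear in front of the screen plane,
    positive values behind it."""
    disparities = [xr - xl for (xl, _), (xr, _) in matches]
    return min(disparities), max(disparities)

def within_budget(matches, screen_width_px, budget_percent=2.0):
    """Check whether the shot stays inside a disparity budget expressed
    as a percentage of image width (an assumed rule-of-thumb value)."""
    near, far = disparity_range(matches)
    limit = screen_width_px * budget_percent / 100.0
    return abs(near) <= limit and abs(far) <= limit

matches = [((960, 540), (948, 541)),    # foreground point: 12 px negative
           ((400, 300), (405, 299)),    # background point: 5 px positive
           ((1500, 800), (1500, 801))]  # point on the convergence plane
near, far = disparity_range(matches)    # (-12, 5)
ok = within_budget(matches, 1920)       # 2% of 1920 px = 38.4 px budget
```

In a real rig the matches come from automatic feature detection, and when the range drifts toward the budget, STAN can drive the motorized interaxial and convergence to pull it back, which is exactly the closed loop described above.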

STAN uses a touch screen and viewing tools like crop/opacity overlay, side-by-side, checkerboard or anaglyphic stereo to analyze stereo quality, while tools like RGB parade, signal waveforms and color histograms assist in color control. Pixel-by-pixel disparity maps are used to visualize the depth structure of the scene. Basic stereo parameters like the convergence plane can be adjusted manually and the results viewed simultaneously; the related shift/crop/scale processing is done on the fly.

STAN was developed by the Fraunhofer Heinrich Hertz Institute, Berlin in association with KUK Film Produktion, Munich as part of the German interdisciplinary project PRIME funded by the Federal Ministry of Economics and Technology (BMWi).

Location Filmmaking 2011 Finalists Announced

Dodge College of Film and Media Arts announced today the finalists for the new Location Filmmaking program. During the month of January, two films will be shot: a live-action 3D film led by Bill Dill, A.S.C., and a film combining live action and visual effects led by Scott Arundale. The completed films will be presented in the Folino Theater on Friday, April 29, at 7 p.m.

The two teams selected for either 3D or VFX film projects will be announced November 20.


3D Location Finalists

Cottontail by James Humphreys

Director:  Rob Himebaugh

Producer: Natalie Testa

Cinematographer: Scotty Field

Editor: Arica Westadt

Sound Designer: Sean Yap

Production Designer: Ryan Phillips


Gift of the Maggie by Ben Kepner

Director:  Chris Bryant

Producers: Jane Winternitz & Samantha Price

Cinematographer: Greg Cotton

Stereographer: Tashi Trieu

Editors: Chase Ogden & Matt Kendrick

Production Designer: Jeanette Sanker


The Harvest by Turner Jacobs

Director:  Alexander Gaeta

Producer: Missy Laney

Cinematographer: Trevor Wineman

Stereographer: Andrew Finch

Editor: Ryan Kaplan

Sound Designer: Cody Peterson

Production Designer: Christy Gray


A Smart Fly by Brandon Wade

Director:  Brandon Wade

Producer: Zach Mason

Cinematographer: Jason Bonninger

Editor: Sean Yap

Sound Designer: Andres de la Torre

Production Designer: Scheherazade Dadci


VFX Location Finalists

A Good Man by Gary Alvarez

Director:  Gary Alvarez

Producer: Ayelet Bick

Cinematographer: David Rivera

VFX Supervisor: Alessandro Struppa

Editor: Jonathan Melin

Sound Designer: Affan Tanner

Production Designer: Micah Embry


A Nervous Wreck by Jonathan Thompson and Norm Leonard

Director:  Jonathan Thompson

Producer:  Renee Mignosa

Cinematographer: John MacDonald

Editor: Andrew Carney

Sound Designer: Jeff Brown

Production Designer: Lauren DeWitt


Prey by David Thompson

Director: Jack Brungardt

Producer: Ian Dalesky

Cinematographer: Michael Althaus

VFX Supervisor: Bryan Chojnowski

Editor: Alex Griffin

Sound Designer: Derek Beamer

Production Designer: Kaitlin Kubiak


Time Capsule by Ira Parker

Director:  Shane McCarthy

Producer: Samer Imam

Cinematographer: Jared Wheeler

VFX Supervisor: Nader Owies

Editor: Affan Tanner

Sound Designer: Chris Mastellone

Denver is calling VFX veteran Doug Trumbull

The Hollywood veteran who oversaw special effects for science fiction classics such as “Blade Runner” and “2001: A Space Odyssey” wants to build a next-generation movie studio in Denver, a project that could change the way films are made and put Colorado on the map for big-budget productions.

At Douglas Trumbull’s proposed digital virtual studio, 3D and effects-driven films could be shot entirely on stage in front of a “green screen,” using patented technology such as a “zero-gravity” camera.

Virtual worlds of infinite forests and alien planets would be incorporated into the production in real-time via computer graphics.

Trumbull, a recipient of a lifetime achievement Oscar for his technical wizardry, calls filming on location with physical sets a “dying art.” He said the virtual process — with the ability to test and perfect shots using inexpensive stand-in actors — could cut production costs by more than 50 percent.

“I’m proposing a whole series of iterative live-action performance rehearsals of your entire screenplay, which could be shot in a couple of days because you have no sets, no props and almost no crew,” Trumbull said during a recent presentation at the Colorado Film School in Denver.

Trumbull, 68, visited Denver to solicit investments and scope metro-area locations for a multimillion-dollar project that was dreamed up a decade ago but is still in the early stages of development. Though private investors and venture capital officials attended the presentation, none have publicly expressed interest.

The studio could be a boon for a state that has long struggled to attract major motion picture productions, a shortfall officials attribute to the lack of financial incentives.

“It could be a real game changer for Colorado,” said Kevin Shand, director of the Colorado Office of Film, Television & Media. “Right now, we’re just not competitive because of the incentives out there. We have everything else production companies need. We have the crew, we have the talent, we have the infrastructure, but we don’t have the money component.”

Trumbull presented the virtual studio idea 10 years ago to major film companies such as Warner Bros. and Columbia.

“Nobody called me back,” he said. “It was seen pretty unanimously as a twisted paradigm shifter that threatened their entire business model.”

He has since tweaked the pitch, proposing to couple the studio with a film production business unit to be backed by a hedge fund or venture capital to the tune of $100 million or more.

“Pixar makes their own animated films from ideas generated within their company,” Trumbull said. “I think we can adapt the Pixar business model very effectively and apply it to live-action production.”

He said there is no shortage of science fiction and fantasy material to fill the pipeline of content.

“There are a lot of pent-up movies out there in Hollywood that got budgeted by the major studios and rejected because they were $150 million,” Trumbull said. “They would’ve been happy to spend $65 million.”

Trumbull, who lives on a 55-acre farm in Massachusetts, said he’s interested in building the studio in Denver because of the quality of life and high-tech workforce, pointing to the presence of companies such as Ball Aerospace and RealD. The latter develops 3D technology for theaters and has a research hub in Boulder.

“This is the first time I’ve made this pitch to anybody since I made this pitch in Hollywood 10 years ago,” said Trumbull, creator of the “Back to the Future” simulator ride, which had a long run at three Universal Studios theme parks.

For the virtual studio, Trumbull envisions a circular stage housed in a two-story, 15,000-square-foot building, with a state-of-the-art camera as the centerpiece.

“That camera is weightless and almost mass-less and can be grabbed and moved anywhere around the stage,” he said of the camera, which he has used to film short features.

The studio would feature an automated lighting grid that could be preset and programmed in advance. To limit financial risk, Trumbull wants the studio constructed in a way where it could be turned into an office building overnight. “If we fail, this is not a dog,” he said. “This is not a white-elephant building.”

Ed Kramer, a professor of visual effects and computer graphics at Regis University’s film school, said the concept won’t replace the traditional method of filmmaking.

“But if successful, it’s going to vastly reduce the cost and much of the need for location work,” said Kramer, who worked on special effects for movies such as “The Mummy” and “Twister.”

Shand of the Colorado Office of Film said the virtual studio could help Colorado land major movie productions.

“This facility, because of the way it’s going to be structured, overcomes the financial incentives that other states offer,” Shand said. “It can be as beneficial or more beneficial to film in Colorado than it would be in some other states.”


Universal Studios/EFILM Open Virtual DI Suite

Universal City, CA–In a joint venture, EFILM, a subsidiary of Deluxe Entertainment Services Group, and Universal Studios have opened a Digital Intermediate suite on the Universal Studios lot, in proximity to the studio’s sound mixing stages, sound editorial rooms, picture editing suites and other sound services. Since both final sound mixing and the DI come at the end of the post production chain for feature films, having the two services physically close allows the director to walk from room to room rather than get in a car and battle traffic between Universal City and Hollywood or Santa Monica.

What makes this new suite stand out is that it is a virtual DI room. The room is connected to EFILM’s Hollywood facility via a secure, private fiber link that transmits uncompressed 2K 4:4:4 images. That means the Universal Studios DI suite has no machine room and no scanner; only a minimum of hardware and software actually resides there.

Deluxe Entertainment Services Group COO Warren Stein notes that the company has similar configurations of adjacent sound stages and DI rooms in Toronto and Australia. Deluxe also has a similar virtual DI suite on the Fox lot, for internal use. The Universal Studios DI suite is the first such virtual suite to be available to incoming projects.

According to Universal senior vp/sound services Christopher Jenkins, his filmmaker clients have been asking for this kind of set-up. “All the directors want a DI suite [on the lot],” he says. “As soon as they’re into final mixing, it’s a loss of their time and attention to have sound services on the lot but have to leave to do the DI. We’ve got the sound facility here, and now we have a DI suite for all comers.”

The DI suite features both film and digital projection, with a 2K digital projector capable of screening 3D for both XpanD and RealD systems. EFILM executive vp/GM Kevin Dillon, who also manages the EFILM virtual DI room, notes that EFILM uses a proprietary version of Autodesk Lustre for color correction. “We have worked closely with Autodesk to build out from the Lustre,” he says. “We have our own image science team and we’ve built our own LUTs for the variety of film stocks and film labs, as well as the new digital cameras such as the Canon DSLRs and ARRI Alexa.” EFILM also works with VFX houses on plate timing. “We provide them with viewing LUTs, so they don’t go off in a different direction,” he says, adding that the company works on testing with VFX supervisors at no charge.

The new virtual suite has no resident DI artists; colorists from EFILM’s Hollywood facility will work on the lot as requested by specific directors. The first films to go through the new DI pipeline on the Universal lot are Fast & Furious 5, The Little Fockers and The Thing. Filmmakers who work in the room will have their films scanned at EFILM’s Hollywood facility, but will see exactly the same images in the DI room on the Universal Studios lot.


3D Entertainment and Technology Festival is free to the public

The 3D Experience, New York’s first annual 3D Entertainment and Technology Festival, today announced the presenter lineup for the Executive Forum. To kick off the three-day event, key industry leaders and professionals will converge at the AMC Empire 25 Theaters in Times Square on Sept. 24 for a day packed with informative keynotes, presentations and panels encompassing the full spectrum of the 3D industry.

“The Executive Forum brings together industry pioneers and newcomers to take the pulse of the rising 3D industry and learn to navigate the ever-changing entertainment and technology landscape,” said Nino Balistreri, managing director for The 3D Experience. “3D has altered the way consumers experience digital content and will continue to push the limits of creativity. The 3D Experience will be an incubator for enduring partnerships and new revenue opportunities.”

The inaugural Executive Forum features a dynamic lineup including an all-industry address by Phil McKinney, vice president and chief technology officer, HP, followed by presentations from Ken Venturi, chief creative officer & EVP, National CineMedia, Robert H. McCooey, Jr., senior vice president of new listings and capital markets, NASDAQ OMX, Richard Gelfond, CEO, IMAX, Jim Chabin, president, International 3D Society, and David Beal, president, National Geographic Entertainment. These key industry veterans will cover the emergence of 3D in recent years, its financial impact, how to take advantage of its robust growth and thrive in this exciting, uncharted territory.

3D technology experts and industry creatives from Samsung, LG, Mitsubishi, Panasonic, RealD, 3ality, 3D Eye Solutions, Legend 3D, Motorola, Technicolor and more will delve into a broad range of topics including home entertainment, broadcast, video games, sports, post-production conversion of 2D to 3D, 3D’s future in advertising and filmmaking. Of note, Adweek’s award-winning advertising critic, Barbara Lippert, will moderate “Getting Ahead of 3D for Advertising Professionals,” a panel geared toward marketing and advertising executives who are experimenting with 3D technology. In addition, IMS Research’s Anna Hunt, principal analyst, will present on “The Elusive Consumer and Expectations for 3D in the Home.”

The 3D Experience Executive Forum attendees will be offered unparalleled networking opportunities through the NASDAQ Opening Night VIP Reception presented by LG, VIP film screenings, a dinner reception and the highly anticipated 3D TV Test Drive. For the latest information on the speaker lineup and panels, please visit the Speakers page.

The Executive Forum is targeted at industry professionals, but The 3D Experience will engage entertainment enthusiasts and general consumers alike by simultaneously presenting the 3D Consumer Showroom at the Discovery Times Square Exposition from Friday, Sept. 24 to Sunday, Sept. 26. Hosted by Best Buy, the Consumer Showroom will be free and will allow visitors to interact with a myriad of 3D products, from 3D TVs and gaming systems to home theater accessories and more. Showroom hours are Friday, September 24, noon to 8 p.m.; Saturday, September 25, 10 a.m. to 8 p.m.; and Sunday, September 26, 10 a.m. to 6 p.m. To enrich the festival weekend, AMC Theatres Empire 25 will feature screenings of classic and recent 3D blockbusters; see the festival’s screening schedule for up-to-date listings.

About The 3D Experience
The 3D Experience is committed to creating large-scale interactive programs that bring together leading minds, leading products and leading experiences. Event Partners for The 3D Experience include NASDAQ OMX, Best Buy, National CineMedia, IMAX, AMC and Discovery TSX. Sponsors include LG Electronics USA, Mitsubishi Digital Electronics America, Inc., 3ality Digital, Panasonic, NVIDIA, AT&T, Northern Lights Entertainment, 3D Eye Solutions, BodySound Technologies, Texas Instruments, RealD, 3DMedia, Jump 3D, Hello Charlie and Passmore Lab. Supporters include National Geographic Entertainment, Paramount Pictures, Red Bull Records, IMS Research and the International 3D Society. The 3D Experience is produced by e5 Global Media, a diversified company with leading assets in the media and entertainment arenas. For more information and to register for The 3D Experience, visit the event website, and connect with The 3D Experience on Facebook and Twitter.

iDailies is not a dream but a reality

Editors and camera operators have come closer together. As Mike Cioni of Light Iron Digital points out, moving post production onto the film set is a trend that has gained momentum. P3 reports on the latest trend.


About 3D & Digital Cinema

If you are a tech head, cinephile, movie geek or digital imaging consultant, then we'd like to hear from you. Join us in our quest to explore all things digital and beyond. Of particular interest is how a product or new technology can be deployed and how it impacts storytelling. It may be something that affects how we download and enjoy filmed entertainment. It may pertain to how primary and secondary color grading will enhance a certain tale. The most important thing is that you are in the driver's seat as far as what you watch and how you choose to consume it.