Archive for the ‘3D’ Category

Hands on: Sky 3D review

A whole new ball game

To capture its 3D broadcast pictures, Sky uses two HD cameras to record left- and right-eye images of a chosen scene. The need for dedicated 3D camera rigs means that viewers watching a live event – the Ryder Cup golf tournament, for instance – don’t see the same images as the regular 2D transmission.

This also means separate commentary teams and studio presenters. The images are anamorphically compressed and positioned side by side before being encoded as a normal HD stream. Anyone watching in 2D who tunes in to channel 217 will see a split screen showing two nearly identical images. It’s then time to tell your TV to engage its side-by-side 3D mode, at which point the screen displays a single fuzzy image.
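
To make that packing concrete, here’s a minimal sketch of the side-by-side idea: each eye’s full-width view is squeezed to half width so the pair fits inside one ordinary HD frame, and the TV’s side-by-side mode reverses the squeeze for display. The 1080p frame size and the use of OpenCV are illustrative assumptions, not details published by Sky.

```python
# Illustrative sketch of side-by-side 3D frame packing (not Sky's actual pipeline).
import numpy as np
import cv2

def pack_side_by_side(left, right):
    """Anamorphically squeeze two eye views to half width and join them."""
    h, w = left.shape[:2]
    half = w // 2
    left_sq = cv2.resize(left, (half, h))    # horizontal squeeze, height unchanged
    right_sq = cv2.resize(right, (half, h))
    return np.hstack([left_sq, right_sq])    # one normal-sized HD frame

def unpack_side_by_side(frame):
    """What a TV's side-by-side mode does: split the frame and stretch each half."""
    h, w = frame.shape[:2]
    half = w // 2
    left = cv2.resize(frame[:, :half], (w, h))
    right = cv2.resize(frame[:, half:], (w, h))
    return left, right

# Example with two synthetic 1080p eye views
left_eye = np.zeros((1080, 1920, 3), dtype=np.uint8)
right_eye = np.zeros((1080, 1920, 3), dtype=np.uint8)
packed = pack_side_by_side(left_eye, right_eye)   # still 1920x1080, so it encodes as plain HD
```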

For perfect clarity you pop on your 3D specs and assume your viewing position. Sky’s 3D channel may now be fully-fledged, but as a glance at the programming guide shows, there aren’t that many original 3D broadcasts in a given week.

This, though, is deliberate: Sky admits that 3D viewing is meant for specially planned events, and the idea of watching uninterrupted 3D shows and adverts (not that there are any) is simply unimaginable.

The very nature of 3D viewing places you in a cinema-like situation and it’s largely down to the darkened, shuttering specs. Hence: no glancing at each other as you discuss Tiger Woods’ dire tee shot; no getting up to make a brew while keeping an eye on proceedings; and no reading magazines during the ad breaks.

3D programming on Sky

So despite several hours of preview footage and various repeats, the amount of original 3D programming available feels about right.

The first week was dominated by four days of golf, with the rest of the schedule given over to a couple of CGI movies (Monsters vs Aliens and Ice Age: Dawn of the Dinosaurs), some sporting archive footage (World Matchplay Darts, US Open tennis, Super League rugby and the 2010 Champions League final) and some bespoke 3D documentaries about dancing and wildlife.

The first time you watch any genre in 3D is undeniably exciting, although the process of switching from 2D viewing on a Panasonic 3D plasma was convoluted, involving several menu selections plus a switch from Normal mode to Dynamic to compensate for the reduction in brightness caused by the tinted 3D glasses.

Reversing this process also makes it a chore to switch back to 2D and check what’s on another channel. Of all the sports currently on show, golf is perhaps the biggest challenge for 3D producers. While football and tennis lend themselves to some naturally good angles that give a welcome sense of depth, golf offers a lot of images that seem flat because there isn’t enough foreground interest.

3D TV in action

The best shots are those of the players teeing off, or caddies milling around the green, with a packed gantry behind them and the glorious Welsh hills in the background. Even then, the 3D effect is stronger when the sun is shining than when it is gloomy and wet.

And, despite the irritating commentators’ propaganda about how fabulous 3D is, sometimes the darkness and lack of definition make it impossible to see the hole.

But other material fares better. With the macro-3D documentary The Bugs!, the curiosity of seeing things stereoscopically had me marvelling at certain scenes, while Dance, Dance, Dance has some great wide shots of different dance styles. Both seem to work better than the animated movies, which at times play havoc with your eyes by using outward-projecting objects whose disappearance at the edge of the frame contravenes spatial logic – although Sky should be applauded for getting a good roster of new 3D movies on its channel.

There’s no doubt that there’s still some way to go before you can sit down in front of Sky 3D and feel completely happy with the experience, but even at this early stage it shows promise.

Source: 3dradar.techradar.com

Epic, Epic, Epic: Peter Jackson buys the new Red camera in bulk for The Hobbit

The train appears to be leaving the station as another “A-list” director, Bryan Singer, endorses the new Red camera system known as Epic:

From: bleedingcool.com

In my youth, Kubrick’s Barry Lyndon was an almost mythical movie, and a big part of the myth revolved around the “special lenses” that Kubrick used to shoot the film. Made by Zeiss from NASA-developed still-camera lenses, they allowed Kubrick and cinematographer John Alcott to shoot a number of scenes in the film that were lit entirely by candlelight.

From what I keep reading about its capabilities, I think Kubrick would have loved the upcoming Red EPIC camera, and here’s one hint as to why.

Bryan Singer has personally stopped by the Red User forums to leave a Christmas Eve message, revealing just a little of what he’s planning for his next picture:

I’m very much looking forward to using the EPIC Red for my next movie Jack the Giant Killer which will be shot in, what else, 3D. The camera’s incredibly compact size and extraordinary resolution are ideal for the 3D format.

But more importantly Jack the Giant Killer is my first movie set in a time before electricity. The EPIC’s extraordinary exposure latitude will allow me to more effectively explore the use of natural light.

“More importantly”? Yeah, I’m sure some people are going to read that as anti-3D sentiment. Either way, I’m reckoning that this is going to be a wonderfully shot movie and to know that Singer is feeling ambitious about the cinematography is nicely encouraging.

Update: 12.17.10

From Jim Jannard and Dariusz Wolski, A.S.C.

Ridley Scott’s upcoming Science Fiction film, which begins principal photography this spring, will be shot on EPIC.

“In my opinion, the new Red Epic camera is about to revolutionize all spectrums of the film industry.

I am going to use Epics in my new project directed by Ridley Scott. I am amazed with the quality of the image and the fact that you can shoot 5k at 120fps without compromising resolution, and most of all the size of the camera.

Combined with the Element Technica Atom 3d rig, we will be able to shoot a 3d movie with the flexibility of a conventional cinema camera.

I don’t see anything that comes close to it at the moment. I can’t even imagine the potential Epic will have on the big blockbuster industry as well as independent cinema.”

11.28.10 from Jim Jannard, owner and developer of the Red Camera systems:

Peter Jackson’s two-film adaptation of The Hobbit will be shot in 3D using RED DIGITAL CINEMA’s soon-to-be-released EPIC digital cameras.

The Hobbit will be amongst the first productions in the world to use the EPIC, and at least thirty cameras will be required by the 3D production. The EPIC’s small size and relatively low weight make it perfect for 3D, where two cameras have to be mounted on each rig.

The successor to RED’s industry-changing RED ONE, the EPIC has 5K resolution, can shoot up to 120 frames per second and has a new HDRx™ mode for the highest dynamic range of any digital cinema camera ever made. Taking everything it had learned from building its first camera, RED designed the EPIC from scratch and has produced a smaller, lighter camera that is an order of magnitude more powerful.

Jackson has a long history with RED, dating back to when he directed the short film ‘Crossing the Line’ as a very early test of prototype RED ONE cameras. “I have always liked the look of Red footage,” he says. “I’m not a scientist or mathematician, but the image Red produces has a much more filmic feel than most of the other digital formats. I find the picture quality appealing and attractive, and with the Epic, Jim and his team have gone even further. It is a fantastic tool: the Epic not only has cutting-edge technology, incredible resolution and visual quality, but it is also a very practical tool for film makers. Many competing digital systems require the cameras to be tethered to large, cumbersome VTR machines. The Epic gives us back the ability to be totally cable free, even when working in stereo.”

Jim Jannard, the owner and founder of RED, flew to New Zealand earlier this year with members of his team so that Jackson could test the EPIC and assess its suitability. “Everybody at RED is incredibly proud that Peter has chosen the Epic,” says Jannard. “The Hobbit is a major production and could have chosen any camera system that they wanted. The fact that they went with us is extremely gratifying.”

The Hobbit will start shooting in New Zealand early next year.

Jim

Will the Royal Wedding be broadcast in 3D?

Prince William and Kate Middleton intend to make their wedding a people’s event and on April 29 the happy couple may seem close enough for you to reach out and touch them. Broadcasters are considering plans to screen the royal wedding in 3D. If those plans come to fruition it would mean a worldwide audience of millions would watch the anticipated marriage ceremony through 3D glasses.

It is understood that Sky, the BBC and Virgin are in joint discussions about the possibility of screening the event live from Westminster Abbey in 3D.

Sky TV have pioneered the new 3D technology on the small screen, largely for sporting events, but it is more likely that a terrestrial broadcaster such as the BBC will get full access to footage of the event, in order to cater for the largest possible TV audience.

While the technology requires a special television set for home viewing, it is possible that the event could be screened in pubs and cinemas for mass public consumption in 3D.

Jana Bennett, director of BBC Vision, said early meetings had taken place with other broadcasters and she was aware of interest in using 3D technology. She said: ‘We are already planning with the other broadcasters so I know about the 3D thing as well. That is obviously of some interest but our responsibility is to bring things everybody can see on air and 3D has a very limited footprint.’

She added the royal couple were in ‘their own time, their own space and we shouldn’t make assumptions yet about what our coverage should amount to.’

There has been speculation on several technology websites that Sky is considering 3D coverage of the event, but a spokeswoman for the broadcaster said it was ‘speculation at this stage’.

Prince William and Kate Middleton, both 28, announced their engagement last week, nine years after they met as students at St Andrews University.
Source: dailymail.co.uk

CineForm teams up with AJA to offer stereo workflow for the KONA 3G

CineForm®, Inc., creators of high-fidelity compression-based workflow solutions for the post production marketplace, announced today that it has teamed up with AJA to offer full stereo 3D workflow support for the newly launched KONA 3G card, the multi-format SD/HD/Dual Link/3G/2K video I/O hardware for Mac.

As part of this cooperative effort, AJA released updated version 8.1 KONA software, which adds 3D video controls to the KONA 3G’s Control Panel software interface, enabling direct ingest into, and playout of, CineForm 3D files and further simplifying production workflows for customers working with 3D content. During ingest, KONA 3G enables simultaneous real-time capture of separate left-eye and right-eye sources through HD-SDI – including sources previously recorded in stereo mode on HDCAM SR – directly into CineForm 3D files. The two eye streams are multiplexed together into a single CineForm 3D file that is available for immediate editing with CineForm’s Neo3D software when used in combination with Apple Final Cut Pro, Adobe Premiere Pro and other compatible software applications. The new KONA software also adds support for recording and playout of CineForm 4:2:2 2D media.

The AJA KONA 3G card featuring support for CineForm Neo3D is available immediately.

“One of our primary goals with the KONA 3G was to deliver a solution that could handle just about anything our customers are dealing with today. Stereo 3D, and our ability to support it, is rising to the top of the list for many customers,” said Nick Rashby, President, AJA. “Through our collaboration with CineForm, our customers who wish to work with stereo 3D in post can eliminate the time consuming step of transcoding material after ingest and instead be ready to edit immediately when using CineForm Neo3D.”

CineForm Neo3D is CineForm’s award-winning 3D post production workflow solution that enables users to edit 3D projects in real time with full frame rate playback to an external 3D monitor. With CineForm First Light 3D as the enabling 3D workflow and production engine, Neo3D users are provided comprehensive control of the 3D image processing workflow.

The new AJA KONA 3G provides professional editors with the utmost in workflow flexibility, supporting a broad range of video formats including: 10-bit uncompressed video 3G/HD/SD SDI I/O, new HDMI 1.4a output for stereoscopic monitoring to consumer 3D displays, 8-channel AES digital audio I/O (16-channel AES with optional K3G-Box) and 16-channel SDI embedded audio I/O, real-time hardware-based up/down/cross conversion to support a range of SD and HD formats, dual-link HD, even 2K formats, a hardware-based downstream keyer and more.

source: www.aja.com

If you are old enough to remember the Viewmaster

Hasbro Inc. is betting that iPod and iPhone users want 3-D viewing on the go.

The nation’s second-largest toy maker is set to unveil to investors on Tuesday a handheld device called My3D that attaches to the two Apple Inc. devices. It promises three-dimensional content offering a 360-degree experience in gaming, virtual travel and entertainment, and is aimed at both children and adults.

The device, which resembles a pair of binoculars with a slot in which users insert their iPod or iPhone, will be priced at $30. It will be available starting next spring at stores where Apple’s iPhones and iPod Touches are available.

Shoppers can then visit Apple’s App Store to browse for additional My3D content. Content varies in price; some apps will be free.

Hasbro said it was guided by Apple during development and believes there’s nothing available that matches the quality and 3-D experience on the iPhone or iPod Touch.

If it catches on, it has big potential. More than 125 million iPod Touches and iPhones have shipped, according to Shaw Wu, senior research analyst at Kaufman Bros. L.P. He predicts that will hit 200 million by end of 2011.

“The issue with this is whether they are going to get enough content for it,” Wu said.

Hasbro is confident it will and says it has teamed up with DreamWorks Animation, whose movie “Megamind” hit theaters last weekend, to develop material.

Separately, Hasbro’s My3D will use content from a 3-D television network from Discovery, Sony and Imax scheduled to make its debut next year. Viewers will be able to see trailers and exclusive behind-the-scenes snippets from films for up to 20 minutes. Hasbro says the device will be a key way to market its own brands in a 3-D experience, though details haven’t been set.

Meanwhile, Hasbro worked with LA Inc., the Los Angeles Convention and Visitors Bureau, to create virtual travel experiences that include visits to the Wax Museum and the Santa Monica Pier.

Through other apps, users can feel like they’re immersed in deep water, exploring coral reefs or playing a shark attacking a tuna, while all along learning facts about sea life. There are also shooter games in a virtual galaxy.

“The idea of being able to be somewhere in Los Angeles, in this 360-degree environment, to be in the shark tank, to be able to swim with the fish and chase after the fish. These are really breakthrough immersive experiences,” said Brian Goldner, president and CEO of Hasbro.

Source: yahoo.com

Werner Herzog’s ‘Cave of Forgotten Dreams’ in 3D

The legendary German auteur Werner Herzog presented his newest film, “Cave of Forgotten Dreams,” in 3D, to kick off the brand new DOC NYC festival. New York’s avant-garde silver fox David Byrne and his pal, Annie Clark (a.k.a. St. Vincent), donned bulky 3D specs along with the assembled crowd of NYU film students and cinephiles in NYU’s Skirball Center to take Herzog’s three-dimensional tour of France’s Chauvet caves.

Discovered in 1994, the caves contain perfectly preserved paintings done during the ice age, over 32,000 years ago – the earliest known images made by mankind. Herzog is one of only a handful of people who have been granted access.

“Cave of Forgotten Dreams” includes the cheeky commentary you expect from Herzog as well as the breathtaking beauty. When at one point in the film a scientist demonstrates Cro-Magnon spear-throwing technology, Herzog remarks, “I think you would not kill a horse throwing that way.” Early in the film, he sets the caves’ striking images of warring bison, mating lions and galloping horses to the sounds of a human heartbeat.

In Herzog’s version of a twist ending, the director imagines that crocodiles have given birth to albino offspring due to the nearby nuclear power plant, then ponders future crocodiles’ perceptions of Chauvet’s cave paintings. Fans who saw Herzog’s “Bad Lieutenant: Port of Call New Orleans” will note a trend in the director’s new reptile obsession.

In the Q&A that followed the screening, Herzog played the jovial provocateur, commenting that he is wary of labeling himself an artist, as he sees the art world as corrupt and misguided with its brokers and general interest in turning a profit. The statement drew applause.

Defending his use of 3D technology, which some film enthusiasts regard as a gimmick, Herzog declared, “A film like this absolutely must be in 3D.” He noted that while he’s very skeptical of the technology and its trendiness and overuse, it would have been impossible to capture the beauty of the stalagmites, stalactites, calcified bones and paintings of the Chauvet caves in any other format. “You need fireworks like ‘Avatar’,” he conceded. But the studios’ real interest in 3D movies? Herzog says it’s all about the profits. “3D films are impossible to pirate,” he said.

Next up for the prolific director? Herzog will narrate a shortened version of Russian filmmaker Dmitry Vasyukov’s four-hour-long black and white documentary about hunters in Siberia.

source:wsj.com

Fraunhofer Institute offers essential tool STAN for stereographers

The Stereoscopic Analyzer (STAN) combines realtime image analysis with visualization tools to assist cameramen and post-production staff in shooting correct stereo content.

Shooting and processing high-quality 3D content is a huge challenge for production teams. A wide range of parameters like color matching, stereo geometry and the orientation of the two cameras may vary from scene to scene depending on content, near and far objects, the convergence plane and the depth of focus.

Developed by the Fraunhofer Heinrich Hertz Institute, the Stereoscopic Analyzer (STAN) assistance system for perfect 3D stereo supports camera operators and stereographers in computing the stereo parameters and camera settings critical for stereo quality. STAN ensures that the parameters are fed directly to both cameras so that incorrect settings can be identified and adjusted either manually or, if the set-up is a motorized stereo rig, automatically.

STAN captures and analyses stereo images in real-time. Metadata can be generated and saved for a streamlined post production process. Corresponding feature points in the scene are matched automatically to determine the given disparity range, and to compute stereo calibration data. Using actuators, the stereo baseline and other mechanical parameters of the stereo rig can be adjusted automatically so that the specified disparity range is not exceeded. Residual distortions in color and stereo geometry can be corrected using real-time color matching.
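
As a rough illustration of the feature-matching step described above, the short sketch below matches points between the two eye views and measures the horizontal disparity range. It uses OpenCV purely as an example; this is not Fraunhofer HHI’s implementation, and the file names are placeholders.

```python
# Illustrative disparity-range estimate from matched feature points (not STAN itself).
import cv2
import numpy as np

def estimate_disparity_range(left_img, right_img, max_matches=200):
    orb = cv2.ORB_create()
    kp_l, des_l = orb.detectAndCompute(left_img, None)
    kp_r, des_r = orb.detectAndCompute(right_img, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_l, des_r), key=lambda m: m.distance)[:max_matches]

    # Horizontal offset of each matched point pair, in pixels
    disparities = np.array([kp_l[m.queryIdx].pt[0] - kp_r[m.trainIdx].pt[0]
                            for m in matches])
    return disparities.min(), disparities.max()

# Usage: flag shots whose disparity exceeds the stereographer's chosen budget
left = cv2.imread("left_eye.png", cv2.IMREAD_GRAYSCALE)    # placeholder file names
right = cv2.imread("right_eye.png", cv2.IMREAD_GRAYSCALE)
d_min, d_max = estimate_disparity_range(left, right)
print(f"disparity range: {d_min:.1f} to {d_max:.1f} px")
```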

STAN uses a touch screen and viewing tools like crop/opacity overlay, side-by-side, checkerboard or anaglyphic stereo to analyze stereo quality, while tools like RGB parade, signal waveforms or color histograms assist in color control. Pixel-by-pixel disparity maps are used to visualize the depth structure of the scene. Basic stereo parameters like the convergence plane can be adjusted manually and the results viewed simultaneously. Related shift-crop-scale processing is done on the fly.

STAN was developed by the Fraunhofer Heinrich Hertz Institute, Berlin in association with KUK Film Produktion, Munich as part of the German interdisciplinary project PRIME funded by the Federal Ministry of Economics and Technology (BMWi).

Holographic Video moves closer to reality

Scientists say they have taken a big step toward displaying live video in three dimensions — a technology far beyond 3-D movies and more like the “Star Wars” scene in which a ghostly Princess Leia image pleads, “Help me, Obi-Wan Kenobi.”

In that classic movie, the audience sees her back before a new camera perspective shows her face. Such a wraparound view of a moving image was just movie-trick fantasy in the 1977 film, but now?

“It is actually very, very close to reality. We have demonstrated the concept that it works. It’s no longer something that is science fiction,” said Nasser Peyghambarian of the University of Arizona.

Actually, the results he and colleagues report in Thursday’s issue of the journal Nature look more like a slide show than a video. In experiments, the technology displayed a new image only every two seconds. That’s only about one-sixtieth as fast as the system would need to produce true video.

The image also gave only a 45-degree range of viewing angles because the original was shot with 16 cameras in an arc.

But Peyghambarian figures that with more development — and more cameras — his team can produce a true 3-D video screen that might reach living rooms in perhaps a decade. And you wouldn’t need those funny glasses to enjoy it.

Apart from the possibilities for entertainment, it might allow doctors in multiple places around the world to collaborate on live surgery, he said. If the screen were placed flat on a table, they could get a 360-degree view by walking around, just as if the patient were lying there.

While the 3-D image would not actually be projected into the air, that’s how it would appear to a person looking into the screen.

Other possibilities, Peyghambarian said, include eye-catching ads at shopping malls and a technique to enable designers of cars or airplanes to make changes more quickly. Live 3-D video could also help the military train troops, he said.

We see objects by perceiving the light that bounces off them. Peyghambarian’s technology uses holograms, two-dimensional images that reconstruct the light that would have bounced off a physical object, making it look 3-D.

In contrast, technology used for 3-D movies like “Avatar” or the election-night “hologram” of a CNN reporter in 2008 produces images that don’t show different views from different angles, as a genuine hologram or a real object does, Peyghambarian said.

Many people have seen holograms of still images. The Arizona group is one of maybe half a dozen around the world that are trying to move that technology into 3-D video, said V. Michael Bove Jr. of the Massachusetts Institute of Technology Media Lab.

Bove said several groups, including his own, have in fact produced such videos, achieving the magic rate of 30 frames a second. But those displays are only about the size of a postcard or smaller, he said, and one big challenge is how to make the display bigger.

The Arizona group uses a plastic plate that stores and displays an image until another image is written electronically on it. That approach might someday allow for much bigger images, said Bove, who is collaborating with the Arizona researchers but did not participate in the new study.

Peyghambarian said he now gets an image every two seconds on a 4-by-4-inch device. His team also has a 1-foot-square plate, but that takes longer to replace images.

He would like to scale up to plates about 6 or 8 feet square to show people at full size, so they could appear at meetings without having to actually show up.

His work was sponsored by the National Science Foundation and the military.

Bove compared the state of holographic video research to that of developing television about 80 years ago. Different groups are taking different approaches, and it is not clear which technology will prove best, he said.

In any case, he said, the Arizona system “produces bright, sharp holographic images…. This thing is beautiful.”

Source: Detroit Free Press

Location Filmmaking 2011 Finalists Announced

Dodge College of Film and Media Arts announced today the finalists for the new Location Filmmaking program. During the month of January, two films will be shot: a live-action 3D film led by Bill Dill, A.S.C., and a film combining live action and visual effects led by Scott Arundale. The completed films will be presented in the Folino Theater on Friday, April 29th at 7pm.

The two teams selected for either 3D or VFX film projects will be announced November 20.

***************************************************************************************

3D Location Finalists

Cottontail by James Humphreys

Director:  Rob Himebaugh

Producer: Natalie Testa

Cinematographer: Scotty Field

Editor: Arica Westadt

Sound Designer: Sean Yap

Production Designer: Ryan Phillips

**********************************

Gift of the Maggie by Ben Kepner

Director:  Chris Bryant

Producers: Jane Winternitz & Samantha Price

Cinematographer: Greg Cotton

Stereographer: Tashi Trieu

Editors: Chase Ogden & Matt Kendrick

Production Designer: Jeanette Sanker

**********************************

The Harvest by Turner Jacobs

Director:  Alexander Gaeta

Producer: Missy Laney

Cinematographer: Trevor Wineman

Stereographer: Andrew Finch

Editor: Ryan Kaplan

Sound Designer: Cody Peterson

Production Designer: Christy Gray

**********************************

A Smart Fly by Brandon Wade

Director:  Brandon Wade

Producer: Zach Mason

Cinematographer: Jason Bonninger

Editor: Sean Yap

Sound Designer: Andres de la Torre

Production Designer: Scheherazade Dadci

**********************************

VFX Location Finalists

A Good Man by Gary Alvarez

Director:  Gary Alvarez

Producer: Ayelet Bick

Cinematographer: David Rivera

VFX Supervisor: Alessandro Struppa

Editor: Jonathan Melin

Sound Designer: Affan Tanner

Production Designer: Micah Embry

**********************************

A Nervous Wreck by Jonathan Thompson and Norm Leonard

Director:  Jonathan Thompson

Producer:  Renee Mignosa

Cinematographer: John MacDonald

Editor: Andrew Carney

Sound Designer: Jeff Brown

Production Designer: Lauren DeWitt

**********************************

Prey by David Thompson

Director: Jack Brungardt

Producer: Ian Dalesky

Cinematographer: Michael Althaus

VFX Supervisor: Bryan Chojnowski

Editor: Alex Griffin

Sound Designer: Derek Beamer

Production Designer: Kaitlin Kubiak

**********************************

Time Capsule by Ira Parker

Director:  Shane McCarthy

Producer: Samer Imam

Cinematographer: Jared Wheeler

VFX Supervisor: Nader Owies

Editor: Affan Tanner

Sound Designer: Chris Mastellone

Ang Lee to film ‘Life of Pi’ in 3D

Oscar-winning Taiwanese-American director Ang Lee announced he will start shooting his first 3D film, ‘Life of Pi’, in Taiwan in January. The movie, set to be released in December 2012, is based on the Booker Prize-winning novel by Yann Martel about an Indian boy adrift on a lifeboat in the Pacific with a zebra, a hyena, an orangutan and a tiger.

“This movie involves water, kids and animals, all the things you better not touch in a film,” Lee joked at a press conference in Taipei, confirming that about two-thirds of the Fox 2000-produced film will be shot in Taiwan.

“It’s very challenging to shoot a 3D film because it is very new and nobody really understands it… Everybody is exploring it and it is filled with the unknown,” he said.

“3D is a new film language. It has new appeals and represents new breakthroughs,” Lee said. “I am very excited to shoot the film and I hope the audience will enjoy watching a good movie.”

The film features a newcomer, 17-year-old Suraj Sharma, in the lead role of Pi Patel. Sharma was chosen from 3,000 candidates during casting sessions across India.

“He is the only kid with the appealing qualities that I’d had in mind. He is moving and very natural and makes the story seem real, so I thought it should be him,” Lee said.

The filmmaker, who is based in New York, was hailed as the “glory of Taiwan” after becoming the first Asian to win a best director Oscar, for his gay cowboy drama “Brokeback Mountain,” in 2006.

http://www.timeslive.co.za


About 3D & Digital Cinema

If you are a tech head, cinephile, movie geek or digital imaging consultant, then we'd like to hear from you. Join us in our quest to explore all things digital and beyond. Of particular interest is how a product or new technology can be deployed and how it impacts storytelling. It may be something that affects how we download and enjoy filmed entertainment. It may pertain to how primary and secondary color grading will enhance a certain tale. The most important thing is that you are in the driver's seat as far as what you watch and how you choose to consume it.