Archive for November, 2010

Hands on: Sky 3D review

A whole new ball game

To capture its 3D broadcast pictures Sky uses two HD cameras to take left and right-aligned images of a chosen scene. The need for dedicated 3D camera rigs means that viewers watching a live event – such as the Ryder Cup golf tournament, for instance – don’t see the same images as the regular 2D transmission.

This also means separate commentary teams and studio presenters. The images are anamorphically compressed and positioned side by side before being encoded as a normal HD stream. Anyone watching in 2D who tunes in to channel number 217 will see the split screen showing two nearly-identical images. It’s then time to tell your TV it needs to engage its side-by-side 3D mode and the screen will display a single fuzzy image.
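The side-by-side packing described above can be sketched in a few lines. This is an illustrative toy (nearest-neighbour decimation on lists of pixels, with hypothetical function names), not Sky's actual encoder, which operates on real anamorphic video:

```python
# Toy model of side-by-side 3D frame packing: each eye's image is
# squeezed to half width, and the two halves share one normal HD frame.
def pack_side_by_side(left, right):
    """left/right: images as lists of pixel rows; returns one packed frame."""
    def squeeze(row):
        # crude anamorphic squeeze: keep every second pixel
        return row[::2]
    return [squeeze(l) + squeeze(r) for l, r in zip(left, right)]

def unpack_side_by_side(frame):
    """A 3D TV splits the packed frame and stretches each half back out."""
    half = len(frame[0]) // 2
    stretch = lambda row: [p for p in row for _ in (0, 1)]  # double each pixel
    left = [stretch(row[:half]) for row in frame]
    right = [stretch(row[half:]) for row in frame]
    return left, right
```

A 2D viewer tuned to the channel sees the packed frame as-is: two squeezed, nearly identical images sitting side by side, which is exactly the split screen described above.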

For perfect clarity you pop on your 3D specs and assume your viewing position. Sky’s 3D channel may now be fully-fledged, but as a glance at the programming guide shows, there aren’t that many original 3D broadcasts in a given week.

This, though, is deliberate: Sky admits that 3D viewing is meant for specially planned events, and the idea of watching uninterrupted 3D shows and adverts (not that there are any) is simply unimaginable.

The very nature of 3D viewing places you in a cinema-like situation and it’s largely down to the darkened, shuttering specs. Hence: no glancing at each other as you discuss Tiger Woods’ dire tee shot; no getting up to make a brew while keeping an eye on proceedings; and no reading magazines during the ad breaks.

3D programming on Sky

So despite several hours of preview footage and various repeats, the amount of original 3D programming available feels about right.

The first week was dominated by four days of golf, with the rest of the schedule given over to a couple of CGI movies (Monsters vs Aliens and Ice Age: Dawn of the Dinosaurs), some sporting archive footage (World Matchplay Darts, US Open tennis, Super League rugby and the 2010 Champions League final) and some bespoke 3D documentaries about dancing and wildlife.

The first time you watch any genre in 3D is undeniably exciting, although the process of switching from 2D viewing on a Panasonic 3D plasma was convoluted and involved several menu selections plus the need to switch from Normal mode to Dynamic to compensate for the reduction in brightness caused by the tinted 3D glasses.

The reversal of this process also makes it a chore to switch back to 2D and check what’s on another channel. Of all the sports currently on show, golf is perhaps the biggest challenge for 3D producers. While football and tennis lend themselves to some naturally good angles that give a welcome sense of depth, golf offers a lot of images that seem flat because there isn’t enough foreground interest.

3D TV in action

The best shots are those of the players teeing off, or caddies milling around the green, with a packed gantry behind them and the glorious Welsh hills in the background. Even then, the 3D effect is stronger when the sun is shining than when it is gloomy and wet.

And, despite the irritating commentators’ propaganda about how fabulous 3D is, sometimes the darkness and lack of definition make it impossible to see the hole.

But other material fares better. With the macro-3D documentary The Bugs!, the curiosity of seeing things stereoscopically had me marvelling at certain scenes, while the documentary Dance, Dance, Dance has some great wide shots of different dance styles. Both seem to work better than the animated movies, which at times play havoc with your eyes by using outward-projecting objects whose disappearance at the edge of frame contravenes spatial logic – although Sky should be applauded for getting a good roster of new 3D movies on its channel.

There’s no doubt that there’s still some way to go before you can sit down in front of Sky 3D and feel completely happy with the experience, but even at this early stage it shows promise.

Source: 3dradar.techradar.com

Microsoft makes a renewed effort in over-the-top streaming video

(Reuters) – Microsoft Corp has held talks with media companies to license TV networks for a new online pay-television subscription service through devices such as its Xbox video game console, two people familiar with the plans told Reuters.

The software giant’s possible push into the television business comes as Google Inc, Apple Inc and Netflix have jostled for a seat at the table of television’s future — a main topic of discussion at the Reuters Global Media Summit to be held this week.

The maker of the Windows operating system has proposed a range of possibilities in these early talks, including creating a “virtual cable operator” delivered over the Internet, for which users pay a monthly fee.

Other options include using the Xbox to authenticate existing cable subscribers so they can watch shows with enhanced interactivity, similar to what pay TV operators have sought to offer over the Web, said these people.

Microsoft is also exploring the possibility of creating content silos and selling individual channels, such as HBO or Showtime, directly. It already has Walt Disney Co’s ESPN on the Xbox Live online service, for example.

These people said a service may not arrive for another 12 months, but early discussions have been productive.

Microsoft said it does not comment on rumor or speculation. The people involved in the talks asked not to be identified as the discussions were confidential.

News of Microsoft’s plans comes as the pay-television industry has sought to allay investor concerns that consumers are fleeing expensive subscription packages for cheaper online services operated by companies such as Netflix Inc and Hulu, which both charge $7.99 per month for streamed shows and movies. The phenomenon is called “cord-cutting.”

The worry is that so-called over-the-top services could undermine the lucrative cable TV industry, whose dual-revenue stream model — cable networks such as ESPN are paid carriage fees by pay TV operators and also earn revenue from advertisers — has made pay-TV one of the most resilient sectors during the economic recession.

But programmers would welcome new types of competition to the cable and satellite companies, senior media executives said.

“We think the more competition the better, we will price and package it in such a way that we still make the dual revenue stream,” said one of the people who spoke to Reuters. “We could probably charge more for interactive advertising.”

Microsoft has long held ambitions to be a major player in the TV business and has previously invested in interactive television initiatives including Web TV and MSN TV set-top box software.

Its latest plans include offering interactivity to engage viewers through social media, interactive advertising and motion control technology, say people who have seen early demonstrations.

Microsoft has bet on new “gesture” technology that lets Xbox users who buy a camera accessory called the Kinect control on-screen functions, using voice commands to launch channels and arm gestures to fast-forward or rewind videos on ESPN.

The Redmond, Washington, company is said to be mulling feedback it has received from programmers, including concerns about the expense of such a plan, but it is not likely to roll out a service in the next 12 months, said one person.

The market to determine the future of television distribution and technology has accelerated over the past year.

Google has already launched Google TV, an enhanced Web-TV service with partners including Sony Corp televisions and Logitech set-top boxes. While Google has also announced Time Warner Inc’s Turner Networks as a programming partner, it is not yet planning to offer a full suite of cable networks in the near future.

Apple has also held talks with programmers, but faced resistance industry-wide over its plans to offer a lower-cost subscription TV plan, people familiar with the talks have said. Apple has begun to offer 99-cent TV show rentals for a limited number of shows through News Corp’s Fox and Disney.

Epic, Epic, Epic : Peter Jackson buys the new Red camera in bulk for The Hobbit

The train appears to be leaving the station as another “A-list” director, Bryan Singer, endorses the new Red camera system known as Epic:

From: bleedingcool.com

In my youth, Kubrick’s Barry Lyndon was almost a mythical movie, and a big part of the myth revolved around the “special lenses” that Kubrick used to shoot the film. Made by Zeiss from NASA-developed still-camera lenses, they allowed Kubrick and cinematographer John Alcott to shoot a number of scenes in the film that were lit entirely by candlelight.

From what I keep reading about its capabilities, I think Kubrick would have loved the upcoming Red EPIC camera, and here’s one hint as to why.

Bryan Singer has personally stopped by the Red User forums to leave a Christmas Eve message, revealing just a little of what he’s planning for his next picture:

I’m very much looking forward to using the EPIC Red for my next movie Jack the Giant Killer which will be shot in, what else, 3D. The camera’s incredibly compact size and extraordinary resolution are ideal for the 3D format.

But more importantly Jack the Giant Killer is my first movie set in a time before electricity. The EPIC’s extraordinary exposure latitude will allow me to more effectively explore the use of natural light.

“More importantly”? Yeah, I’m sure some people are going to read that as anti-3D sentiment. Either way, I’m reckoning that this is going to be a wonderfully shot movie and to know that Singer is feeling ambitious about the cinematography is nicely encouraging.

Update: 12.17.10

From Jim Jannard and Darius Wolski, A.S.C.

Ridley Scott’s upcoming Science Fiction film, which begins principal photography this spring, will be shot on EPIC.

“In my opinion, the new Red Epic camera is about to revolutionize all spectrums of the film industry.

I am going to use Epics in my new project directed by Ridley Scott. I am amazed with the quality of the image and the fact that you can shoot 5k at 120fps without compromising resolution, and most of all the size of the camera.

Combined with the Element Technica Atom 3d rig, we will be able to shoot a 3d movie with the flexibility of a conventional cinema camera.

I don’t see anything that comes close to it at the moment. I can’t even imagine the potential Epic will have on the big blockbuster industry as well as independent cinema.”

11.28.10 from Jim Jannard, owner and developer of the Red Camera systems:

Peter Jackson’s two film adaptation of The Hobbit will be shot in 3D using RED DIGITAL CINEMA’S soon to be released EPIC Digital Cameras.

The Hobbit will be amongst the first productions in the world to use the EPIC, and at least thirty cameras will be required by the 3D production. The EPIC’s small size and relatively low weight make it perfect for 3D, where two cameras have to be mounted on each rig.

The successor to RED’s industry-changing RED ONE, the EPIC has 5K resolution, can shoot up to 120 frames per second and has a new HDRx™ mode for the highest dynamic range of any digital cinema camera ever made. Taking everything it had learned from building its first camera, RED designed the EPIC from scratch and has produced a smaller, lighter camera that is an order of magnitude more powerful.

Jackson has a long history with RED, dating back to when he directed the short film ‘Crossing the Line’ as a very early test of prototype RED ONE cameras. “I have always liked the look of Red footage,” he says. “I’m not a scientist or mathematician, but the image Red produces has a much more filmic feel than most of the other digital formats. I find the picture quality appealing and attractive, and with the Epic, Jim and his team have gone even further. It is a fantastic tool: the Epic not only has cutting-edge technology, incredible resolution and visual quality, but it is also a very practical tool for filmmakers. Many competing digital systems require the cameras to be tethered to large, cumbersome VTR machines. The Epic gives us back the ability to be totally cable-free, even when working in stereo.”

Jim Jannard, the owner and founder of RED, flew to New Zealand earlier this year with members of his team so that Jackson could test the EPIC and assess its suitability. “Everybody at RED is incredibly proud that Peter has chosen the Epic,” says Jannard. “The Hobbit is a major production, and its makers could have chosen any camera system that they wanted. The fact that they went with us is extremely gratifying.”

The Hobbit will start shooting in New Zealand early next year.

Jim

Will the Royal Wedding be broadcast in 3D?

Prince William and Kate Middleton intend to make their wedding a people’s event and on April 29 the happy couple may seem close enough for you to reach out and touch them. Broadcasters are considering plans to screen the royal wedding in 3D. If those plans come to fruition it would mean a worldwide audience of millions would watch the anticipated marriage ceremony through 3D glasses.

It is understood that Sky, the BBC and Virgin are in joint discussions about the possibility of screening the event live from Westminster Abbey in 3D.

Sky TV have pioneered the new 3D technology on the small screen, largely for sporting events, but it is more likely that a terrestrial broadcaster such as the BBC will get full access to footage of the event, in order to cater for the largest possible TV audience.

While the technology requires a special television set for home viewing, it is possible that the event could be screened in pubs and cinemas for mass public consumption in 3D.

Jana Bennett, director of BBC Vision, said early meetings had taken place with other broadcasters and she was aware of interest in using 3D technology. She said: ‘We are already planning with the other broadcasters so I know about the 3D thing as well. That is obviously of some interest but our responsibility is to bring things everybody can see on air and 3D has a very limited footprint.’

She added the royal couple were in ‘their own time, their own space and we shouldn’t make assumptions yet about what our coverage should amount to.’

There has been speculation on several technology websites that Sky is considering 3D coverage of the event, but a spokeswoman for the broadcaster said it was ‘speculation at this stage’.

Prince William and Kate Middleton, both 28, announced their engagement last week, nine years after they met as students at St Andrews University.
Source: dailymail.co.uk

MPA Europe takes down Pirate Bay operators

A press release from MPA Europe, which represents the Hollywood majors in their battle against those seeking to enrich themselves by trading in intellectual property that is not their own:

STOCKHOLM, SWEDEN — The Court of Appeals in Sweden this afternoon upheld the criminal convictions for copyright infringement against three of the individuals in The Pirate Bay case. The three, Frederik Neij, Peter Sunde and Carl Lundström, had appealed their convictions for copyright infringement imposed by the Stockholm District Court in April 2009.

Following this afternoon’s announcement, Chris Marcich, President and Managing Director of the MPA Europe said

“Now that a Swedish Court has declared the operators of The Pirate Bay guilty of copyright infringement for a second time, we hope the relevant authorities will take the appropriate action to ensure that the site ceases its illegal activities. The Pirate Bay has flaunted the law while continuing to cause serious harm to the creative economy globally, generating substantial revenues for its operators. The decision of the Swedish Court of Appeals today upholding the criminal convictions of the Pirate Bay operators is very much welcomed. This confirms that such activities are illegal and if you engage in them, you run the risk of very significant consequences.

The Pirate Bay’s sole purpose is to facilitate and promote the unlawful dissemination of copyrighted content for the profit of the site operators. The entire business model is built upon copyright infringement. Preventing illegal distribution of copyrighted material on the internet is central to protecting the rights of copyright holders, and also to supporting the continued investment in new online services and the creation of new films and television programmes. “

Note: The fourth defendant Gottfrid Svartholm was also convicted of the same offence and also appealed. His appeal was postponed due to his ill-health and is yet to be heard.

Following the appeal by the defendants against their convictions, rights-holders appealed the decision of the District Court in relation to the damages awarded against the operators for their infringing activities. In a welcome move, the Court of Appeal increased the amount of damages payable to 46 million SEK (up from 32m SEK).

The Court of Appeals did, however, revise the term of the prison sentences against each of the appellants based on their level of participation: Neij was sentenced to 10 months, Sunde to 8 months and Lundström to 4 months. Each was originally sentenced to a one-year term.

BACKGROUND: In February 2009 four defendants – Frederik Neij, Gottfrid Svartholm, Peter Sunde and Carl Lundström – were charged with contributing to copyright infringement by facilitating the illegal distribution of copyrighted material through the unauthorized online distribution service The Pirate Bay. All four were convicted on April 17, 2009 and sentenced to one year’s imprisonment. Substantial damages were also awarded against them.

This was an important decision for rights-holders, underlining their right to have their creative works protected against illegal exploitation and to be fairly rewarded for their endeavours.

The four immediately appealed both their criminal conviction and the damages award. (The one year’s prison sentences were delayed pending the appeal).

source: variety.com

UPDATE: 11.27.10

A Swedish appeals court Friday shortened the prison terms of two founders and a financier of Swedish filesharing site The Pirate Bay, but increased the damages to be paid to movie and music firms.

“The Appeals Court, like the district court, finds that The Pirate Bay service makes possible illegal filesharing in a way that entails a punishable offense for those who run the service,” the court said in its ruling.

Three founders of the site – Peter Sunde and Fredrik Neij, both 32, and Gottfrid Svartholm Warg, 26 – were found guilty in April 2009 of promoting copyright infringement with the website.

The verdict, considered an important symbolic victory for the movie and recording industry, handed the three founders along with an important financier of the site, 50-year-old Carl Lundstroem, sentences of one year in prison.

On Friday, the Svea Appeals Court shortened Neij’s sentence to 10 months, Sunde’s to eight months and Lundstroem’s to four months.

Warg, the third co-founder, received the same lower court sentence as the others, but did not take part in the appeals trial due to illness. He will face a separate trial probably next year.

“Unlike the lower court, the appeals court does not believe one can make such a collective decision entailing that everyone carries the same responsibility for what is done within the framework of The Pirate Bay,” the court explained.

However, it ruled that instead of paying around 32 million kronor (3.4 million euros, 4.5 million dollars) in damages to the movie and recording industries, the amount should be hiked to 46 million kronor.

“This is because the Appeals Court to a larger extent than the district court has accepted the plaintiffs’ presented evidence of their losses,” the court said.

Founded in 2003, The Pirate Bay, which claims to have more than 23 million users, makes it possible to skirt copyright fees and share music, film and computer game files using bit torrent technology, or peer-to-peer links offered on the site.

Source: breitbart.com

Sony aims for new camera to compete with the Red One

Extending itself further into the independent filmmaking arena, Sony has unveiled its first professional handheld digital production camera with a Super 35mm imager. Dubbed the PMW-F3, the camcorder will ship in February 2011 at a list price of $16,000 for the body only, or $23,000 for a kit that includes three Sony-branded T2.0 PL-mount prime lenses at 35mm, 50mm and 85mm.

Sony is positioning the camera as a bridge between high-end ENG acquisition and feature filmmaking. It’s based on the company’s XDCAM EX platform and has been designed to support high-end workflows with a Super 35mm-sized CMOS sensor and optional dual-link HD-SDI output.

The camera made its U.S. debut on November 17 at a gathering of students and pros at USC’s School of Cinematic Arts. Next month it will be shown at NYU. Why the academic flavor? “There are future Oscar winners in this room,” said Alec Shapiro, senior veep at Sony’s pro solutions group.

Former Panavision exec Andy Romanoff believes that, in addition to indies, studio filmmakers might end up using the F3 as a B camera. However, “the typical studio film will stick with higher-end cameras for most of their photography,” he said.

Naturally, Sony hopes those first-unit cameras will be its F35s or SRW-9000s.

source: variety.com

Sony’s new PMW-F3 is the company’s third 35mm CineAlta digital camcorder and is based on the XDCAM EX platform. Specs of the F35 and SRW-9000PL still exceed those of the F3, we’re told, but this camera is no slouch, and footage shot with all three cameras should intercut. The F3, above all, is still a handheld camcorder. It doesn’t sit on your shoulder. It weighs about 5 pounds, which is lighter than many of the lenses you’ll be using. There’s a tilting viewfinder at the rear of the top handle. It looks similar to the HVR-Z7U finder: about 1.2 million pixels. An LCD monitor pivots out from the camera’s left side.

The new F3 camcorder is based on Sony’s XDCAM EX technology. There are two SxS ExpressCard slots at the back. The Super 35mm CMOS imager promises high sensitivity and low noise levels. The ballpark sensitivity rating is ISO 800, and unconfirmed reports hint at an exposure range greater than 13 stops. The adventure continues.

There are HD-SDI dual-link outputs at the rear of the F3 for external recording (4:2:2 1080 50/59.94P normal; and RGB 1080 23.98/25/29.97PsF as an option). You’ll be able to select S-Log and Hyper Gamma to seriously increase the dynamic range. S-Log is Sony’s take on RAW “Digital Negatives.” The image, uncorrected, looks pale and washed out (like a negative), but when a Look-Up Table (LUT) is applied, shows the full dynamic range of the image, giving you greater flexibility for color and contrast correction in post.
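The log-then-LUT idea can be illustrated with a toy transfer curve. The curve below is a made-up log function, not Sony's published S-Log formula; the point is only how a 1D look-up table maps flat, washed-out code values back toward scene-linear light for grading:

```python
import math

def toy_log_encode(linear):
    # illustrative log curve (NOT Sony's actual S-Log): maps 0..1 -> 0..1,
    # lifting the shadows so highlight detail isn't clipped
    return math.log10(1 + 99 * linear) / 2

def toy_log_decode(v):
    # exact inverse of the toy curve above
    return (10 ** (2 * v) - 1) / 99

def build_lut(decode, size=1024):
    # a 1D LUT tabulates the inverse transform once, so each pixel
    # becomes a cheap table lookup instead of a math call
    return [decode(i / (size - 1)) for i in range(size)]

def apply_lut(lut, code_value):
    return lut[round(code_value * (len(lut) - 1))]

lut = build_lut(toy_log_decode)
```

Uncorrected, a mid-grey of 0.5 encodes to roughly 0.85 (the "pale and washed out" look); running it back through the LUT restores the original value to within the table's quantization error.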

The F3 records natively onto SxS cards at 35 Mbps, 4:2:0 8-bit, in XDCAM EX format. The SxS cards are formatted in the standard FAT file format; a 32 GB card will record 100 minutes at the highest quality. Many users will be happy with this. But, like Oliver Twist, many will want more – and they can have more, with the ability to use the onboard SxS cards as immediately editable proxies while simultaneously recording to a higher standard. That might include 4:4:4 10-bit S-Log over HD-SDI dual link to an SRW-1/SRPC-1 SR tape recorder at visually lossless 440 and 880 Mbps or (next year) 1 TB solid-state memory cards at 220 and 440 Mbps.
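The quoted figure – roughly 100 minutes on a 32 GB card at 35 Mbps – can be sanity-checked with simple arithmetic. The back-of-envelope sketch below assumes a guessed 85% usable-capacity fraction to cover filesystem and indexing overhead; that fraction is an assumption, not a Sony figure:

```python
def record_minutes(card_gb, bitrate_mbps, usable_fraction=0.85):
    """Rough recording-capacity estimate for a memory card.
    usable_fraction is a guess at formatting overhead."""
    usable_bits = card_gb * 1e9 * 8 * usable_fraction
    return usable_bits / (bitrate_mbps * 1e6) / 60
```

At 32 GB and 35 Mbps this lands near 104 minutes, consistent with the quoted 100. Run the same card against the SR deck's 440 Mbps stream and you get well under ten minutes, which is why tape or much larger solid-state memory is needed at the high end.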

Recording formats include 1920 x 1080, 1440 x 1080, and 1280 x 720 at 23.98/25/29.97p, 50/59.94i and, in DVCAM mode, 25/29.97PsF and 50/59.94i. Under- and overcranking is called S & Q for “slow” and “quick” recording, from 1 to 30 fps at 1920 x 1080 (17 to 30 fps in dual-link mode) and 1 to 60 fps at 1280 x 720 (17 to 60 fps in dual-link mode).
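The effect of the S & Q under- and overcranking above is just a frame-rate ratio: footage captured at one rate is played back at the project rate, so motion is stretched or compressed accordingly. A minimal sketch, with a hypothetical helper name:

```python
def playback_speed(capture_fps, project_fps):
    """Off-speed ('S & Q') recording plays back at project_fps,
    so motion is slowed (or sped up) by capture_fps / project_fps.
    A ratio of 2.4 means action takes 2.4x longer on screen."""
    return capture_fps / project_fps
```

For example, 720p material overcranked at 60 fps in a 25p project yields 2.4x slow motion, while undercranking at 12 fps in a 24p project doubles the apparent speed.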

Who’s going to shoot with Sony’s F3 – and how? If you’re a student or independent, you’ll probably take the simplest package possible: a zoom or primes, record to onboard SxS cards, and go direct to edit. Of course, you’ll be sure to diligently back up those SxS cards using Sony’s PXU-MS240 Mobile Storage Device, which not only backs up the cards but also carefully checks the data to be sure it’s all there (parity). Next, you’ll copy the SxS card onto your Avid or Final Cut Pro system. Go to www.sony.com/cinemon to download the Sony Cinemon plug-in: it lets FCP treat the MPEG-4 material transparently as QuickTime. You’ll be able to edit natively in FCP, with drag and drop, and all files instantly viewable on a Mac. Avid’s AMA (Avid Media Access) plug-in mounts the XDCAM EX files directly into Avid Media Composer.

If you’re shooting documentaries, commercials or TV, you might follow a similar path. Of course, you will not reformat your SxS Cards until the job is safely completed and many archives and copies have been cloned. Cards are relatively cheap. The dreaded word “Oops” is very expensive when a once-in-a-lifetime scene is re-formatted.

High end productions, recording to SR tape or memory, should soon have native support of SR codec on Avid and Final Cut Pro. The HD-SDI outputs of the Sony F3 will be eyed with great interest by the high-end after-market storage gurus at Codex, Cinedeck and elsewhere.

source: fdtimes.com

CineForm teams up with AJA to offer stereo workflow for the KONA 3G

CineForm®, Inc., creators of high-fidelity compression-based workflow solutions for the post production marketplace, announced today that it has teamed up with AJA to offer full stereo 3D workflow support for the newly launched KONA 3G card, the multi-format SD/HD/Dual Link/3G/2K video I/O hardware for Mac.

As part of this cooperative effort, AJA released its updated version 8.1 KONA software, which adds 3D video controls to the KONA 3G’s Control Panel software interface, enabling direct ingest into, and playout of, CineForm 3D files and further simplifying production workflows for customers working with 3D content. During ingest, KONA 3G enables simultaneous real-time capture of separate left-eye and right-eye sources through HD-SDI – including sources previously recorded in stereo mode on HDCAM SR – directly into CineForm 3D files. The two eyes are multiplexed together into a single CineForm 3D file that is available for immediate editing with CineForm’s Neo3D software when used in combination with Apple Final Cut Pro, Adobe Premiere Pro and other compatible software applications. The new KONA software also adds support for recording and playout of CineForm 4:2:2 2D media.
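CineForm's file format itself is proprietary, but the idea of multiplexing two per-eye captures into one container that an editor can address is simple to sketch in the abstract (all names here are hypothetical, not CineForm's API):

```python
def mux_stereo(left_frames, right_frames):
    """Interleave per-eye frames into one stream, tagging each
    entry with its eye so an editing tool can pull either view
    (or both) out of a single file."""
    assert len(left_frames) == len(right_frames), "eyes must be in sync"
    stream = []
    for n, (l, r) in enumerate(zip(left_frames, right_frames)):
        stream.append({"frame": n, "eye": "L", "data": l})
        stream.append({"frame": n, "eye": "R", "data": r})
    return stream
```

Keeping both eyes in one file, rather than as two loose streams, is what lets the editor treat a stereo clip as a single asset with no transcode step.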

The AJA KONA 3G card featuring support for CineForm Neo3D is available immediately.

“One of our primary goals with the KONA 3G was to deliver a solution that could handle just about anything our customers are dealing with today. Stereo 3D, and our ability to support it, is rising to the top of the list for many customers,” said Nick Rashby, President, AJA. “Through our collaboration with CineForm, our customers who wish to work with stereo 3D in post can eliminate the time consuming step of transcoding material after ingest and instead be ready to edit immediately when using CineForm Neo3D.”

CineForm Neo3D is CineForm’s award-winning 3D post production workflow solution that enables users to edit 3D projects in real time with full frame rate playback to an external 3D monitor. With CineForm First Light 3D as the enabling 3D workflow and production engine, Neo3D users are provided comprehensive control of the 3D image processing workflow.

The new AJA KONA 3G provides professional editors with the utmost in workflow flexibility, supporting a broad range of video formats including: 10-bit uncompressed video 3G/HD/SD SDI I/O, new HDMI 1.4a output for stereoscopic monitoring to consumer 3D displays, 8-channel AES digital audio I/O (16-channel AES with optional K3G-Box) and 16-channel SDI embedded audio I/O, real-time hardware-based up/down/cross conversion to support a range of SD and HD formats, dual-link HD, even 2K formats, a hardware-based downstream keyer and more.

source: www.aja.com

If you are old enough to remember the Viewmaster

Hasbro Inc. is betting that iPod and iPhone users want 3-D viewing on the go.

The nation’s second-largest toy maker is set to unveil to investors on Tuesday a handheld device called My3D that attaches to the two Apple Inc. devices. It promises three-dimensional content that offers a 360-degree experience in gaming, virtual travel experiences and entertainment content. It’s aimed at both children and adults.

The device, which resembles a pair of binoculars with a slot in which users insert their iPod or iPhone, will be priced at $30. It will be available starting next spring at stores where Apple’s iPhones and iPod Touches are available.

Shoppers can then visit Apple’s App Store to browse for additional My3D content. Content varies in price; some apps will be free.

Hasbro said it was guided by Apple during development and believes there’s nothing available that matches the quality and 3-D experience on the iPhone or iPod Touch.

If it catches on, it has big potential. More than 125 million iPod Touches and iPhones have shipped, according to Shaw Wu, senior research analyst at Kaufman Bros. L.P. He predicts that number will hit 200 million by the end of 2011.

“The issue with this is whether they are going to get enough content for it,” Wu said.

Hasbro is confident it will and says it has teamed up with Dreamworks Animation, whose movie “Megamind” hit theaters last weekend, to develop material.

Separately, Hasbro’s My3D will use content from a 3-D television network from Discovery, Sony and Imax scheduled to make its debut next year. Viewers will be able to see trailers and exclusive behind-the-scenes snippets from films for up to 20 minutes. Hasbro says the device will be a key way to market its own brands in a 3-D experience, though details haven’t been set.

Meanwhile, Hasbro worked with LA Inc., the Los Angeles Convention and Visitors Bureau, to create virtual travel experiences that include visits to the Wax Museum and the Santa Monica Pier.

Through other apps, users can feel like they’re immersed in deep water, exploring coral reefs or playing a shark attacking a tuna, while all along learning facts about sea life. There are also shooter games in a virtual galaxy.

“The idea of being able to be somewhere in Los Angeles, in this 360-degree environment, to be in the shark tank, to be able to swim with the fish and chase after the fish. These are really breakthrough immersive experiences,” said Brian Goldner, president and CEO of Hasbro.

Source: yahoo.com

Werner Herzog’s ‘Cave of Forgotten Dreams’ in 3D

The legendary German auteur Werner Herzog presented his newest film, “Cave of Forgotten Dreams,” in 3D, to kick off the brand new DOC NYC festival. New York’s avant-garde silver fox David Byrne and his pal, Annie Clark (a.k.a. St. Vincent), donned bulky 3D specs along with the assembled crowd of NYU film students and cinephiles in NYU’s Skirball Center to take Herzog’s three-dimensional tour of France’s Chauvet caves.

Discovered in 1994, the caves contain perfectly preserved paintings done during the ice age, over 32,000 years ago – the earliest known images of mankind. Herzog is one of only a handful of people who have been granted access.

“Cave of Forgotten Dreams” includes the cheeky commentary you expect from Herzog as well as the breathtaking beauty. When at one point in the film a scientist demonstrates Cro-Magnon spear-throwing technology, Herzog remarks, “I think you would not kill a horse throwing that way.” Early in the film, he sets the caves’ striking images of warring bison, mating lions and galloping horses to the sounds of a human heartbeat.

In Herzog’s version of a twist ending, the director imagines that crocodiles have given birth to albino offspring due to the nuclear power plant nearby, then ponders future crocodiles’ perceptions of Chauvet’s cave paintings. Fans who saw Herzog’s “Bad Lieutenant: Port of Call New Orleans” will note a trend in the director’s new reptile obsession.

In the Q&A that followed the screening, Herzog played the jovial provocateur, commenting that he is wary of labeling himself an artist, as he sees the art world as corrupt and misguided with its brokers and general interest in turning a profit. The statement drew applause.

Defending his use of 3D technology, which some film enthusiasts regard as a gimmick, Herzog declared, "A film like this absolutely must be in 3D." He noted that while he's very skeptical of the technology and its trendiness and overuse, it would have been impossible to capture the beauty of the stalagmites, stalactites, calcified bones and paintings of the Chauvet caves in any other format. "You need fireworks like 'Avatar'," he conceded. But the studios' real interest in 3D movies? Herzog says it's all about the profits. "3D films are impossible to pirate," he said.

Next up for the prolific director? Herzog will narrate a shortened version of Russian filmmaker Dmitry Vasyukov's four-hour-long black-and-white documentary about hunters in Siberia.

Source: wsj.com

Fraunhofer Institute offers STAN, an essential tool for stereographers

The Stereoscopic Analyzer (STAN) combines real-time image analysis with visualization tools to assist camera operators and post-production staff in shooting correct stereo content.

Shooting and processing high-quality 3D content is a huge challenge for production teams. A wide range of parameters like color matching, stereo geometry and the orientation of the two cameras may vary from scene to scene depending on content, near and far objects, the convergence plane and the depth of focus.

Developed by the Fraunhofer Heinrich Hertz Institute, the Stereoscopic Analyzer assistance system supports camera operators and stereographers in computing the stereo parameters and camera settings critical for stereo quality. STAN feeds these parameters directly to both cameras so that incorrect settings can be identified and adjusted, either manually or, on a motorized stereo rig, automatically.

STAN captures and analyses stereo images in real time. Metadata can be generated and saved for a streamlined post-production process. Corresponding feature points in the scene are matched automatically to determine the given disparity range, and to compute stereo calibration data. Using actuators, the stereo baseline and other mechanical parameters of the stereo rig can be adjusted automatically so that the specified disparity range is not exceeded. Residual distortions in color and stereo geometry can be corrected using real-time color matching.
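The core idea of the disparity-range check described above can be sketched in a few lines. This is an illustrative simplification, not STAN's actual API: the function names, the coordinate lists and the example depth budget are all ours. Given feature points matched between the left and right camera images, the horizontal offset of each pair gives its disparity, and the scene's minimum and maximum disparities must fit within a chosen budget.

```python
# Hypothetical sketch of disparity-range analysis from matched features.

def disparity_range(left_pts, right_pts):
    """Min and max horizontal disparity (left x minus right x)
    over pairs of matched feature points given as (x, y) tuples."""
    disparities = [lx - rx for (lx, _), (rx, _) in zip(left_pts, right_pts)]
    return min(disparities), max(disparities)

def within_budget(d_min, d_max, budget=(-30, 60)):
    """True if the scene's disparities fit an allowed on-screen range
    (the budget values here are purely illustrative, in pixels)."""
    return budget[0] <= d_min and d_max <= budget[1]

# Matched feature coordinates from the left/right cameras (illustrative)
left = [(120, 40), (300, 80)]
right = [(110, 40), (280, 80)]
d_min, d_max = disparity_range(left, right)  # (10, 20)
```

On a motorized rig, a result outside the budget would trigger an adjustment of the stereo baseline or convergence until the range fits.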

STAN uses a touch screen and viewing tools such as crop/opacity overlay, side-by-side, checkerboard or anaglyph stereo to analyze stereo quality, while tools like RGB parade, signal waveforms and color histograms assist with color control. Pixel-by-pixel disparity maps visualize the depth structure of the scene. Basic stereo parameters such as the convergence plane can be adjusted manually and the results previewed simultaneously; the related shift-crop-scale processing is done on the fly.
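Of the viewing modes listed above, the anaglyph preview is the simplest to illustrate. The sketch below is our own minimal version, not STAN's implementation: a red-cyan anaglyph takes the red channel from the left eye's image and the green/blue channels from the right eye's, so misalignment between the eyes shows up as colored fringing.

```python
import numpy as np

def anaglyph(left, right):
    """Red-cyan anaglyph from two RGB frames of equal shape:
    red channel from the left eye, green/blue from the right eye."""
    out = right.copy()
    out[..., 0] = left[..., 0]  # replace red channel with left eye's
    return out

# Two tiny 2x2 RGB frames standing in for the camera pair
left = np.zeros((2, 2, 3), dtype=np.uint8)
left[..., 0] = 200   # left eye contributes red
right = np.zeros((2, 2, 3), dtype=np.uint8)
right[..., 1] = 150  # right eye contributes green/blue

mix = anaglyph(left, right)
```

Viewed through red-cyan glasses, each eye then sees only its own camera's image, which is why the mode works as a quick stereo-quality check on an ordinary monitor.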

STAN was developed by the Fraunhofer Heinrich Hertz Institute, Berlin in association with KUK Film Produktion, Munich as part of the German interdisciplinary project PRIME funded by the Federal Ministry of Economics and Technology (BMWi).


About 3D & Digital Cinema

If you are a tech head, cinema-phile, movie geek or digital imaging consultant, then we'd like to hear from you. Join us in our quest to explore all things digital and beyond. Of particular interest is how a product or new technology can be deployed and how it impacts storytelling. It may be something that affects how we download and enjoy filmed entertainment. It may pertain to how primary and secondary color grading will enhance a certain tale. The most important thing is that you are in the driver's seat as far as what you watch and how you choose to consume it.