Friday, February 25, 2011

The ScentScape: The Way of Smell


THE SCENTSCAPE, A VISION OF SMELL


Introducing the ScentScape, a new device for fans of new entertainment experiences. The worlds of cinema and video games have always tried to pull the viewer into a realm of reality: widescreen LCDs, 5.1 and 7.1 surround sound, or the recent arrival of 3D. Yet among all these devices, nobody has ever thought about smell.

The U.S. company ScentSciences has developed a device that could change the way people watch movies and play video games: the ScentScape. This device is able to reproduce the smells of our movies and games, transporting us into the world of the film or game.

The developers say: "Imagine you're watching Avatar. Imagine the feeling of being able to smell the flora while touring the planet of Pandora, experiencing odors you have never sensed before, from a planet different from ours."

This device is already on the market. It may seem silly at first glance, but it may be the first step towards the entertainment of the future.

At home, the intensity and limited variety of odors might make the device monotonous, even annoying. But imagine it in theaters: if several devices were installed and a fantastic movie like Avatar, The Lord of the Rings, or any other film that transports you to another world of sensations were playing, it could be a fantastic experience. We don't go to the movies every day, so the sensation would never become routine and boring; that's the key!
Right now there are 22 scents on the market, and the device is compatible with films and games such as Avatar and World of Warcraft, although many more smells and compatibility with other titles are in development.
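To make the idea of "compatibility" a little more concrete, here is a minimal sketch of how scent cues could be synchronized with a film or game timeline. Everything in it (the cue format, the cartridge indices, the function names) is a hypothetical illustration of mine, not the actual ScentSciences SDK.

```python
# Purely hypothetical sketch: a "scent track" keyed to playback time.
# None of these names come from ScentSciences; they only illustrate how
# media-to-scent synchronization could work in principle.

from dataclasses import dataclass

@dataclass
class ScentCue:
    start_s: float     # playback time (seconds) at which the scent is released
    duration_s: float  # how long the scent stays active
    cartridge: int     # index into the 22-scent cartridge pack
    intensity: float   # 0.0 (off) to 1.0 (maximum)

# Example track for a jungle scene followed by a bonfire scene.
scent_track = [
    ScentCue(start_s=120.0, duration_s=30.0, cartridge=4, intensity=0.6),  # flora
    ScentCue(start_s=300.0, duration_s=15.0, cartridge=9, intensity=1.0),  # smoke
]

def cues_active_at(track, playback_s):
    """Return the cues that should be playing at the given media time."""
    return [c for c in track
            if c.start_s <= playback_s < c.start_s + c.duration_s]

if __name__ == "__main__":
    for cue in cues_active_at(scent_track, 130.0):
        # A real player would hand this to the USB device driver here.
        print(f"emit cartridge {cue.cartridge} at intensity {cue.intensity}")
```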

The device, which connects via USB, costs $70 and comes with the pack of 22 scents, each cartridge lasting 200-350 hours at maximum intensity. The price of replacement cartridges is still unclear. It went on sale on February 22.

This device is bound to divide opinion, and it may be one step closer to virtual reality.

I invite you to share your opinion about this product, or to send any question or issue you would like to see addressed.

This is the developer's website:
http://www.scentsciences.com/products.html.

Greetings, and see you later!

Josep

Angeloud@hotmail.com

Wednesday, February 23, 2011

New technologies in cinema

In the last of our series on the tech behind the major studios, we switch gears to facial animation and Dr. Mark Sagar of Weta Digital. Mark is a back-to-back 2010 and 2011 winner of the Sci-Tech Award (Scientific and Engineering Award) from the Academy. At Weta Digital he has directed the development of the performance-driven facial animation system for Avatar and King Kong.
Mark specializes in facial motion capture, animation and rendering technologies and is currently focusing on bio-mechanical simulation of the face. Mark was a Post-Doctoral Fellow at the Massachusetts Institute of Technology and holds a Ph.D. in Engineering from The University of Auckland, New Zealand, where he worked on Virtual Reality Surgical Simulation and Anatomic Modeling with Peter Hunter’s Bioengineering group.

Here’s a rundown of Mark’s background:


Pacific Title Mirage / LifeFX
Mark was the technology co-founder and co-director of Research and Development for LifeFX which set milestones in realism for digital humans for film at Pacific Title Mirage. He also developed interactive Internet based technologies for eCommerce and other Web based applications such as FaceMail at LifeFX Inc.

Siggraph Electronic Theatre
The short film “The Jester” was selected for the Siggraph Electronic Theatre in 1999, showing the LifeFX technology. This is considered by the graphics community to be a milestone in computer generated humans.

Watch ‘The Jester’
In 2000, “Young At Heart” was selected for the Siggraph Electronic Theatre, pushing the technology further by putting a fully digital face in a standard dramatic context and also demonstrating digital aging – creating an 80-year-old version of a 20-year-old actress (to put it in context, this was 10 years before Benjamin Button).
Watch ‘Young at Heart’
Sony Pictures Imageworks

Prior to Weta, Mark was R&D Supervisor at Sony Pictures Imageworks and developed the Image Based Rendering system for the Doctor Octopus and Peter Parker faces in Spider-Man 2 using Paul Debevec’s Lightstage, and the Performance Driven Facial Animation system for Monster House.
In 2004, Monster House (not released until July 2006) was the first use for film production of the Facial Action Coding System (FACS) for performance capture and mapping onto an arbitrary character face. The system analyzes the motion capture data for facial expression (rather than skin motion) and represents this as the fundamental information of the performance, and then translates this expression data onto an arbitrary digital character.
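As a rough illustration of that idea, here is a minimal sketch of FACS-style expression retargeting, assuming the capture stage has already produced per-frame action-unit (AU) weights for the actor. The array names, sizes, and placeholder rig are assumptions for illustration only, not the actual Imageworks or Weta pipeline.

```python
# Minimal sketch of FACS-style expression retargeting. Assumes the capture
# stage already produced per-frame action-unit (AU) weights for the actor.
# Array names, sizes and the random placeholder rig are illustrative only.

import numpy as np

N_VERTS = 5000   # vertices in the target character's face mesh (assumed)
N_AUS = 40       # number of FACS action units tracked (assumed)

# Target character's rig: a neutral mesh plus one blendshape delta per AU,
# authored for *that* character's own proportions.
neutral = np.zeros((N_VERTS, 3))
blendshape_deltas = np.random.rand(N_AUS, N_VERTS, 3) * 0.01  # placeholder rig

def retarget(au_weights, neutral_mesh, deltas):
    """Apply the actor's AU weights to an arbitrary character's blendshapes.

    Because the performance is stored as expression (AU activations) rather
    than raw skin motion, the same weights can drive any rig that exposes the
    same AU set, regardless of face proportions.
    """
    au_weights = np.clip(au_weights, 0.0, 1.0)          # AUs activate in [0, 1]
    offset = np.tensordot(au_weights, deltas, axes=1)   # weighted sum of deltas
    return neutral_mesh + offset

# One captured frame: a single AU (say, a lip-corner pull) firing strongly.
frame_weights = np.zeros(N_AUS)
frame_weights[12] = 0.8
posed_mesh = retarget(frame_weights, neutral, blendshape_deltas)
print(posed_mesh.shape)  # (5000, 3)
```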

Weta Digital
In 2005, FACS was implemented at Weta Digital for King Kong and was able to capture everything from extremely subtle to highly dramatic expressions and faithfully translate Andy Serkis’s performance onto King Kong’s face.
For Avatar, Weta knew that James Cameron wanted to use helmet cameras for motion capture, so Mark did an initial proof-of-concept test to see whether they could use a single video camera and 2D image-based tracking for facial motion capture rather than 3D tracking. They used the FACS system to solve and map the performance to Gollum for the test.



The FACS system was used for all the Na’vi character performance capture in Avatar. The Real-time Facial Motion Capture framework uses video from a small, helmet-mounted camera to record the actor’s facial performance in real time. New custom facial tracking software based on FACS rapidly tracks the movements of the face and maps them onto a rigged model that replicates the actor’s expressions on a facial puppet in the virtual monitor used by the director on the virtual stage.
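The tracking step itself can be pictured with a generic sketch like the one below, which follows facial points from frame to frame using pyramidal Lucas-Kanade optical flow from OpenCV. Weta's tracker was custom, FACS-based software; this is only a stand-in to show the kind of 2D point tracking a single head-mounted camera makes possible, and the file name and parameters are assumptions.

```python
# Generic stand-in for per-frame 2D facial point tracking, using pyramidal
# Lucas-Kanade optical flow from OpenCV. The input file name and parameters
# are assumptions; Weta's tracker was custom FACS-based software.

import cv2

def track_landmarks(prev_gray, next_gray, prev_pts):
    """Track facial landmark points from one video frame to the next.

    prev_pts: float32 array of shape (N, 1, 2) holding landmark positions in
    the previous frame (e.g. markers painted on the actor's face).
    """
    next_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, prev_pts, None,
        winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1          # keep only points tracked successfully
    return next_pts[ok], ok

# Usage sketch: read two frames from the helmet-camera feed and track points.
cap = cv2.VideoCapture("helmet_cam.mp4")   # hypothetical input clip
got0, frame0 = cap.read()
got1, frame1 = cap.read()
if got0 and got1:
    g0 = cv2.cvtColor(frame0, cv2.COLOR_BGR2GRAY)
    g1 = cv2.cvtColor(frame1, cv2.COLOR_BGR2GRAY)
    # Seed points would normally come from detecting the painted markers;
    # strong image corners stand in for them here.
    pts = cv2.goodFeaturesToTrack(g0, maxCorners=60, qualityLevel=0.01,
                                  minDistance=5)
    if pts is not None:
        tracked, mask = track_landmarks(g0, g1, pts)
```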

The facial expression solver is a real-time version of the system developed for King Kong. This new system takes a live stream from the tracking workstation and determines which muscle groups, or FACS units, are being used to make the actor’s expression. The system automatically maps these FACS poses onto a puppet’s face. By using the standardised ‘language’ of the FACS poses, the system could drive a rig capable of 10,000 different expressions, with the detail needed to capture the subtle eye and mouth movement necessary to bring the characters to life. It also enabled faithful representation of the original actor’s expression on a face with quite different proportions (for example, the Na’vi), allowing the actor to drive the virtual puppet with their performance.
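One common way to frame the "which FACS units are firing?" step is as a non-negative least-squares solve over a linear basis of landmark displacements. The sketch below illustrates that framing; the sizes and random placeholder data are my assumptions, not Weta's actual solver.

```python
# Illustrative sketch of solving for FACS action-unit (AU) activations from
# tracked 2D landmark offsets, framed as non-negative least squares over a
# linear basis. Sizes and random placeholder data are assumptions, not
# Weta's actual solver.

import numpy as np
from scipy.optimize import nnls

N_LANDMARKS = 60   # tracked facial landmarks (assumed)
N_AUS = 40         # FACS action units exposed by the rig (assumed)

# Basis matrix B: column j holds the stacked 2D landmark displacement produced
# by activating AU j at full strength on a calibration face.
B = np.random.rand(2 * N_LANDMARKS, N_AUS)   # placeholder calibration data

def solve_frame(landmark_offsets, basis):
    """Estimate AU activations for one video frame.

    landmark_offsets: (N_LANDMARKS, 2) displacements from the neutral pose,
    as delivered by the tracking workstation.
    """
    d = landmark_offsets.reshape(-1)       # flatten to a single vector
    weights, _residual = nnls(basis, d)    # AUs cannot activate negatively
    return np.clip(weights, 0.0, 1.0)

# Fake tracked frame; a real system streams this live from the helmet camera.
offsets = np.random.rand(N_LANDMARKS, 2) * 0.05
au_weights = solve_frame(offsets, B)
# These weights would then drive the character rig (for example with the
# retargeting sketch shown earlier) to pose the facial puppet in real time.
print(au_weights[:5])
```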

Sagar’s awards

2010 Sci-Tech Award (known by some as a Technical Oscar)
Scientific and Engineering Academy Award – Dr. Mark Sagar, Paul Debevec, Tim Hawkins and John Monos for the design and engineering of the Light Stage capture devices and the image-based facial rendering system developed for character relighting in motion pictures.

2011 Sci-Tech Award
Scientific and Engineering Academy Award – Dr. Mark Sagar, for his early and continuing development of influential facial motion retargeting solutions.


Sagar’s Credits
AVATAR – Special Projects Supervisor
(2009) Director: James Cameron (20th Century Fox)
KING KONG – Special Projects Supervisor
(2005) Director: Peter Jackson (Universal)
MONSTER HOUSE – CG Special Projects Supervisor
(2006) Director: Gil Kenan (Sony Pictures Imageworks)
SPIDER-MAN 2 – CG Special Projects Supervisor
(2004) Director: Sam Raimi (Sony Pictures Imageworks)

The latest new technology: The iHam



Nothing to say!

Wednesday, February 9, 2011