Image Metrics

An actress driving the facial performance of a 3D avatar

Image Metrics is a 3D facial animation and virtual try-on company headquartered in El Segundo, California, with offices in Las Vegas and research facilities in Manchester. Image Metrics makes the Live Driver and Portable You SDKs for software developers and provides facial animation software and services to the visual effects industry.

About

The Image Metrics proprietary facial animation system is a markerless motion capture method in which an actor's performance is filmed and a 3D animated model is generated directly from the raw images. The process uses pre-existing or newly recorded video of an actor's facial performance shot with a video or high-definition camera. All detail seen in the recorded video is then analyzed and mapped onto a computer-generated 3D model, including the detailed movements of teeth, tongue, lips and eyes.
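
The sketch below illustrates the general markerless idea only, not Image Metrics' proprietary pipeline: detect 2D facial landmarks in each video frame with an off-the-shelf detector (MediaPipe is used here purely as a stand-in) and solve for blendshape weights that could drive a 3D face rig. The files `blendshape_basis.npy`, `neutral_landmarks.npy` and `performance.mp4`, and the least-squares solve itself, are illustrative assumptions.

```python
# Illustrative sketch of markerless facial capture: per-frame 2D landmark
# detection followed by a least-squares blendshape solve. This is NOT the
# Image Metrics pipeline; the input files and basis matrix are hypothetical.
import cv2                    # video decoding
import numpy as np
import mediapipe as mp        # off-the-shelf face landmark detector

# Hypothetical blendshape basis: each column is the landmark displacement
# produced by one expression shape (rows = 2 * n_landmarks).
basis = np.load("blendshape_basis.npy")      # shape (2*468, n_shapes), assumed
neutral = np.load("neutral_landmarks.npy")   # shape (2*468,), assumed

face_mesh = mp.solutions.face_mesh.FaceMesh(static_image_mode=False,
                                            max_num_faces=1)
cap = cv2.VideoCapture("performance.mp4")    # pre-recorded actor footage

weights_per_frame = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    result = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if not result.multi_face_landmarks:
        continue
    # Flatten the detected (x, y) landmark coordinates into one vector.
    landmarks = result.multi_face_landmarks[0].landmark
    observed = np.array([[p.x, p.y] for p in landmarks]).ravel()
    # Solve basis @ w ~= observed - neutral for the blendshape weights w.
    w, *_ = np.linalg.lstsq(basis, observed - neutral, rcond=None)
    weights_per_frame.append(np.clip(w, 0.0, 1.0))

cap.release()
np.save("solved_weights.npy", np.array(weights_per_frame))
```

A production system would additionally estimate head pose, regularize the solve, and track teeth, tongue and eye motion separately, as noted above.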

The Image Metrics animation process captures facial details such as eye movement, lip and tongue syncing, subtle expressions and skin textures that can be compromised by traditional motion capture methods. Advanced mapping technology allows Image Metrics to deliver facial animation results up to five times faster than motion capture methods and ten times faster than keyframe animation (although this figure is subjective, since keyframe animation can be used to produce very different results).

History

Image Metrics was founded in 2000 in Manchester by Gareth Edwards, Kevin Walker and Alan Brett. The Image Metrics office in Santa Monica was opened six years later.

The Image Metrics performance capture animation process was developed over seven years by an in-house team of physicists, programmers and animators. Since its inception, Image Metrics facial animation technology has been applied in over 40 video games, movies and commercials worldwide.[1] In 2006, The New York Times heralded Image Metrics' techniques as "technology that captures the soul".[2]

Image Metrics technology was also used to animate the deceased actor Richard Burton for a computer-generated encore performance. The company provided the modeling and facial animation for a photo-realistic, 11-foot 3D hologram of the late Burton for the Live on Stage! production of "Jeff Wayne's Musical Version of The War of The Worlds". Image Metrics animated a Richard Burton in his mid-30s to recreate his role as the journalist George Herbert, completing a total of 23 minutes of facial animation synchronized to Burton's original audio recording.

Credits

Image Metrics has created facial animation for feature films including The Wolfman (2010), Meet Dave (2008), Foodfight! (2012) and Harry Potter and the Order of the Phoenix (2007). In addition to its work in feature films, Image Metrics has created facial animation for a number of popular video games, including Red Dead Redemption, Devil May Cry 4, SOCOM U.S. Navy SEALs: Combined Assault, SOCOM U.S. Navy SEALs: Fireteam Bravo 2, Bully, Grand Theft Auto: Vice City Stories, Syphon Filter: Dark Mirror, 24: The Game, Call of Duty 2, Grand Theft Auto: Liberty City Stories, The Warriors, Midnight Club 3: DUB Edition, Metal Gear Solid 3: Snake Eater, The Getaway: Black Monday, Grand Theft Auto: San Andreas, and Grand Theft Auto IV. Its technology is also an integral component of Sony's SOEmote feature, which debuted in the MMORPG EverQuest II in August 2012.

Related Research Articles

Computer animation

Computer animation is the process used for digitally generating animations. The more general term computer-generated imagery (CGI) encompasses both static scenes and dynamic images, while computer animation only refers to moving images. Modern computer animation usually uses 3D computer graphics to generate a three-dimensional picture. The target of the animation is sometimes the computer itself, while at other times it is film.

Stop motion

Stop motion is an animated filmmaking technique in which objects are physically manipulated in small increments between individually photographed frames so that they will appear to exhibit independent motion or change when the series of frames is played back. Any kind of object can thus be animated, but puppets with movable joints or plasticine figures are most commonly used. Puppets, models or clay figures built around an armature are used in model animation. Stop motion with live actors is often referred to as pixilation. Stop motion of flat materials such as paper, fabrics or photographs is usually called cutout animation.

Animator

An animator is an artist who creates multiple images, known as frames, which give an illusion of movement called animation when displayed in rapid sequence. Animators can work in a variety of fields including film, television, and video games. Animation is closely related to filmmaking and like filmmaking is extremely labor-intensive, which means that most significant works require the collaboration of several animators. The methods of creating the images or frames for an animation piece depend on the animators' artistic styles and their field.

Visual effects is the process by which imagery is created or manipulated outside the context of a live-action shot in filmmaking and video production. The integration of live-action footage with other live-action footage or CGI elements to create realistic imagery is called VFX.

Motion capture

Motion capture is the process of recording the movement of objects or people. It is used in military, entertainment, sports, medical applications, and for validation of computer vision and robots. In filmmaking and video game development, it refers to recording actions of human actors and using that information to animate digital character models in 2D or 3D computer animation. When it includes face and fingers or captures subtle expressions, it is often referred to as performance capture. In many fields, motion capture is sometimes called motion tracking, but in filmmaking and games, motion tracking usually refers more to match moving.
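
As a hedged illustration of the "use recorded motion to animate a digital character" step described above, the sketch below applies recorded joint angles to a toy 2D skeleton with forward kinematics. The bone lengths and the "captured" angle data are invented example values; real pipelines use 3D rotations and full skeleton definitions.

```python
# Toy forward-kinematics example: drive a 2D arm "character" with captured
# joint angles. Bone lengths and the angle data are invented for illustration.
import numpy as np

bone_lengths = [0.30, 0.25, 0.10]          # shoulder->elbow->wrist->fingertip (m)

def pose_joints(joint_angles):
    """Return the 2D position of each joint for one frame of capture data."""
    positions = [np.zeros(2)]
    total_angle = 0.0
    for length, angle in zip(bone_lengths, joint_angles):
        total_angle += angle               # angles are relative to the parent bone
        offset = length * np.array([np.cos(total_angle), np.sin(total_angle)])
        positions.append(positions[-1] + offset)
    return np.stack(positions)

# Hypothetical "captured" data: 120 frames of three joint angles (radians).
captured = np.stack([np.array([0.2, 0.4, 0.1]) * np.sin(t / 10.0)
                     for t in range(120)])

animation = np.stack([pose_joints(frame) for frame in captured])
print(animation.shape)                     # (120, 4, 2): frames x joints x xy
```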

Ragdoll physics

Ragdoll physics is a type of procedural animation used by physics engines, which is often used as a replacement for traditional static death animations in video games and animated films. As computers increased in power, it became possible to do limited real-time physical simulations, which made death animations more realistic.
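
A minimal sketch of the underlying idea, assuming a simple particle-and-constraint model rather than any specific engine: body parts are point masses integrated under gravity, with rigid distance constraints holding the "joints" together (the Verlet-based approach long used for character physics). All figures below are illustrative.

```python
# Minimal ragdoll-style simulation: point masses under gravity connected by
# distance constraints, integrated with Verlet steps. A toy model of what a
# physics engine does when a character "goes ragdoll".
import numpy as np

# A tiny stick figure: head, torso, hip, left foot, right foot (x, y).
pos = np.array([[0.0, 2.0], [0.0, 1.5], [0.0, 1.0], [-0.3, 0.2], [0.3, 0.2]])
prev = pos.copy()                       # previous positions for Verlet integration
bones = [(0, 1), (1, 2), (2, 3), (2, 4)]
rest = [np.linalg.norm(pos[a] - pos[b]) for a, b in bones]
gravity = np.array([0.0, -9.81])
dt = 1.0 / 60.0

for step in range(300):                 # ~5 seconds of simulated time
    # Verlet integration: new = 2*pos - prev + a*dt^2
    pos, prev = 2 * pos - prev + gravity * dt**2, pos.copy()
    # Satisfy each distance constraint a few times per step.
    for _ in range(4):
        for (a, b), length in zip(bones, rest):
            delta = pos[b] - pos[a]
            dist = np.linalg.norm(delta) or 1e-9
            correction = 0.5 * (dist - length) / dist * delta
            pos[a] += correction
            pos[b] -= correction
        pos[:, 1] = np.maximum(pos[:, 1], 0.0)   # crude ground plane at y = 0

print(pos)                              # final resting pose of the "ragdoll"
```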

Grand Theft Auto IV

Grand Theft Auto IV is a 2008 action-adventure game developed by Rockstar North and published by Rockstar Games. It is the sixth main entry in the Grand Theft Auto series, following 2004's Grand Theft Auto: San Andreas, and the eleventh instalment overall. Set within the fictional Liberty City, based on New York City, the single-player story follows Eastern European war veteran Niko Bellic and his attempts to escape his past while under pressure from high-profile criminals. The open world design lets players freely roam Liberty City, consisting of three main islands, and the neighbouring state of Alderney, which is based on New Jersey.

Computer facial animation is primarily an area of computer graphics that encapsulates methods and techniques for generating and animating images or models of a character face. The character can be a human, a humanoid, an animal, a legendary creature or character, etc. Due to its subject and output type, it is also related to many other scientific and artistic fields from psychology to traditional animation. The importance of human faces in verbal and non-verbal communication and advances in computer graphics hardware and software have caused considerable scientific, technological, and artistic interests in computer facial animation.
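
One common technique from this area, shown here as a hedged sketch: a face mesh is animated by blending a neutral mesh with weighted expression targets (blendshapes). The four-vertex "mesh" and the two expression targets below are made-up example data.

```python
# Blendshape facial animation in one line of linear algebra:
# animated_face = neutral + sum_i w_i * (target_i - neutral).
# The toy mesh and expression offsets are invented for illustration.
import numpy as np

neutral = np.array([[0.0, 0.0, 0.0],     # four vertices of a toy face mesh
                    [1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0],
                    [1.0, 1.0, 0.0]])

# Expression targets stored as per-vertex offsets from the neutral mesh.
smile  = np.array([[0.0, 0.1, 0.0], [0.0, 0.1, 0.0],
                   [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]])
blink  = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 0.0],
                   [0.0, -0.05, 0.0], [0.0, -0.05, 0.0]])
deltas = np.stack([smile, blink])        # shape (n_shapes, n_verts, 3)

def animate(weights):
    """Blend the neutral mesh with weighted expression offsets."""
    weights = np.asarray(weights)[:, None, None]   # broadcast over vertices
    return neutral + (weights * deltas).sum(axis=0)

print(animate([0.8, 0.2]))               # 80% smile, 20% blink
```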

Virtual cinematography

Virtual cinematography is the set of cinematographic techniques performed in a computer graphics environment. It covers a wide variety of subjects, such as photographing real objects, often with a stereo or multi-camera setup, in order to recreate them as three-dimensional objects, and algorithms for the automated creation of real and simulated camera angles. Virtual cinematography can be used to shoot scenes from otherwise impossible camera angles, create the photography of animated films, and manipulate the appearance of computer-generated effects.

Facial motion capture is the process of electronically converting the movements of a person's face into a digital database using cameras or laser scanners. This database may then be used to produce computer graphics (CG), computer animation for movies, games, or real-time avatars. Because the motion of CG characters is derived from the movements of real people, it results in a more realistic and nuanced computer character animation than if the animation were created manually.

Digital puppetry is the manipulation and performance of digitally animated 2D or 3D figures and objects in a virtual environment that are rendered in real time by computers. It is most commonly used in filmmaking and television production, but has also been used in interactive theme park attractions and live theatre.

Human image synthesis is technology that can be applied to make believable and even photorealistic renditions of human likenesses, moving or still. It has effectively existed since the early 2000s. Many films using computer-generated imagery have featured synthetic images of human-like characters digitally composited onto the real or other simulated film material. Towards the end of the 2010s, deep learning artificial intelligence was applied to synthesize images and video that look like humans, without need for human assistance once the training phase has been completed, whereas the old-school 7D route required massive amounts of human work.

Euphoria was a game animation middleware created by NaturalMotion based on Dynamic Motion Synthesis, NaturalMotion's proprietary technology for animating 3D characters on-the-fly "based on a full simulation of the 3D character, including body, muscles and motor nervous system". Instead of using predefined animations, the characters' actions and reactions are synthesized in real-time; they are different every time, even when replaying the same scene. While it is common for current video games to use limp "ragdolls" for animations generated on the fly, Euphoria employed a more complex method to animate the entirety of physically bound objects within the game environment. The engine was to be used in an Indiana Jones game that was later cancelled. According to its web site, Euphoria ran on the Microsoft Windows, OS X, Linux, PlayStation 3, PlayStation 4, Xbox 360, Xbox One, iOS and Android platforms and was compatible with all commercial physics engines.

iClone is a real-time 3D animation and rendering software program. Real-time playback is enabled by using a 3D videogame engine for instant on-screen rendering.

The history of computer animation began as early as the 1940s and 1950s, when people began to experiment with computer graphics – most notably John Whitney. It was only by the early 1960s, when digital computers had become widely established, that new avenues for innovative computer graphics blossomed. Initially, uses were mainly for scientific, engineering and other research purposes, but artistic experimentation began to make its appearance by the mid-1960s – most notably by Dr Thomas Calvert. By the mid-1970s, many such efforts were beginning to enter into public media. Much computer graphics at this time involved 2-dimensional imagery, though increasingly, as computer power improved, efforts to achieve 3-dimensional realism became the emphasis. By the late 1980s, photo-realistic 3D was beginning to appear in cinema, and by the mid-1990s it had developed to the point where 3D animation could be used for entire feature film production.

Computer-generated imagery

Computer-generated imagery (CGI) is a specific technology or application of computer graphics for creating or improving images in art, printed media, simulators, videos and video games. These images are either static or dynamic. CGI refers to both 2D and 3D computer graphics used for designing characters, virtual worlds, or scenes and special effects. The application of CGI for creating or improving animations is called computer animation, or CGI animation.

Mixamo

Mixamo is a 3D computer graphics technology company. Based in San Francisco, the company develops and sells web-based services for 3D character animation. Mixamo's technologies use machine learning methods to automate steps of the character animation process, from 3D modeling to rigging and 3D animation.

CrazyTalk is Reallusion's brand name for its 2D animation software. The product series includes CrazyTalk, a 2D facial animation software tool, and CrazyTalk Animator, a face and body 2D animation suite.

Faceware Technologies is an American facial animation and motion capture development company. The company was established under Image Metrics and became its own company at the beginning of 2012.

Visage SDK

Visage SDK is a multi-platform software development kit (SDK) created by Visage Technologies AB. Visage SDK allows software programmers to build facial motion capture and eye tracking applications.

References

  1. "With Image Metrics (Sorted by Popularity Ascending)". IMDb .
  2. "Cyberface: New Technology That Captures the Soul (Published 2006)". The New York Times . Archived from the original on January 26, 2021.