Faceware Technologies

Formerly: Image Metrics
Industry: Entertainment, software development
Founded: 2012
Products: Facial animation and motion capture (mocap) technology
Website: facewaretech.com

Faceware Technologies is an American facial animation and motion capture development company. [1] [2] The company was established under Image Metrics and became an independent company at the beginning of 2012. [3] [4]


Faceware produces software used to capture an actor's performance and transfer it onto an animated character, as well as the hardware needed to capture the performances. [5] The software line includes Faceware Analyzer, Faceware Retargeter, and Faceware Live. [6] [7]

Faceware software is used by film studios and video game developers including Rockstar Games, Bungie, Cloud Imperium Games, and 2K in games such as Grand Theft Auto V, Destiny, Star Citizen, and Halo: Reach. [6] [8] [9] [10] [11] [12]

Through its application in the video game industry, Faceware won the Develop Award for Technical Innovation in 2008, while it was still part of Image Metrics. [13] It won the Develop Award again in 2014, for Creative Contribution: Visuals. [14] Faceware received Best of Show recognition at the Game Developers Conference 2011 in San Francisco [15] as well as Computer Graphics World's Silver Edge Award at SIGGRAPH 2014 [16] and 2016. [17] Faceware also won the XDS Gary Award for its contributions to a joint Faceware-EA presentation at the 2016 XDS Summit.

History

Image Metrics, founded in 2000, is a provider of facial animation and motion capture technology for the video game and entertainment industries. [1] [3] In 2008, Image Metrics offered a beta version of its facial animation technology to visual effects and film studios. The technology captured an actor's performance on video, analyzed it, and mapped it onto a CG model. [1] [2] The beta release allowed studios to incorporate facial animation technology into their internal pipelines rather than sending work to the Image Metrics studio as they had in the past. [2] [18] The first studio to beta test Image Metrics' software, in 2009, was the London-based visual effects studio Double Negative. [18]

In 2010, Image Metrics launched the facial animation technology platform Faceware. Faceware focused on increasing creative control, efficiency, and production speed for animators. The software could be integrated into any pipeline or used with any game engine, and Image Metrics provided training on the platform. The first studio to sign on as a Faceware customer was Bungie, which incorporated the software into its in-house production. [5] Also in 2010, Image Metrics acquired FacePro, a company that provided automated lip synchronization whose output could be adjusted for accurate results, and integrated the acquired technology into its facial animation software. [19] [20] Image Metrics also bought Character-FX, a character animation company, that year. [8] [21] [22] [23] Character-FX produced tools for Autodesk's Maya and 3ds Max that aid in the creation of character facial rigs, using an automated weighting transfer system that rapidly shifts facial features on a character to create lifelike movement. [4] [8] [23]

Image Metrics raised $8 million in funding and went public through a reverse merger in 2010 with International Cellular Industries. Image Metrics became wholly owned by International Cellular Industries, which changed its name and took on facial animation technology as its sole line of business. [6] [24] [25] Faceware 3.0 was announced in March 2011. The upgrade included auto-pose, a shared pose database, and curve refinement. [26] Image Metrics led a workshop and presentation about Faceware 3.0 at the CTN Animation Expo 2011 titled "Faceware: Creating an Immersive Experience through Facial Animation." [27] Faceware's technology was demonstrated at Edinburgh Interactive in August 2011, showing its ability to add player facial animation from a webcam or Kinect sensor into a game in real time. [28] [29]

Image Metrics sold the Faceware software to its spinoff company, Faceware Technologies, in January 2012. [3] [4] Following the spinoff, Faceware Technologies focused on producing and distributing its technology to professional animators. [3] The technology was tested at universities, including the University of Portsmouth. [30]

Faceware launched its 3D facial animation tools, the software packages Faceware Analyzer and Faceware Retargeter, along with the Head-Mounted Camera System (HMCS). Analyzer tracks and processes live footage of an actor, and Retargeter transfers that movement onto the face of a computer-generated character. The Head-Mounted Camera System is not required to use the software, and up to six actors can be captured simultaneously. [31]

Faceware Live was shown for the first time at SIGGRAPH 2013. [32] It was created to enable the real-time capture and retargeting of facial movements. The live capture of facial performance can use any video source to track and translate facial expressions into a set of animation values and transfer the captured data onto a 3D animated character in real time. [7] In 2014, Faceware released Faceware Live 2.0. The update included the option to stream multiple characters simultaneously, instant calibration, improved facial tracking, consistent calibration, and support for high-frame-rate cameras. [33]

In 2015, Faceware launched a plugin for Unreal Engine 4 called Faceware Live. The company co-developed the plugin with Australia-based Opaque Multimedia. [34] It makes motion capture of expressions and other facial movements possible with any video camera through Faceware's markerless 3D facial motion capture software. [35]

In 2016, Faceware announced the launch of Faceware Interactive, a division focused on developing software and hardware for creating digital characters with whom real people can interact. [36]

Partners

Faceware Technologies partnered with Binari Sonori in 2014 to develop a video-based localization service. [37] [38] Also in 2014, Faceware Technologies entered a global partnership with Vicon, a company focused on motion capture. The partnership would focus on developing new technology to expand into full-body motion capture data. The first step of the integration was to make the Faceware software compatible with Vicon's head rig, Cara, to allow data acquired from Cara to be processed and transferred into Faceware products. [39] [40]

Overview

Faceware Technologies' facial animation software has two main components. [6]

Faceware Analyzer is stand-alone, single-camera facial tracking software that converts videos of facial motion into files usable by Faceware Retargeter. The Lite version automatically tracks facial movements, which can then be applied to 3D models with Faceware Retargeter. The Pro version can perform shot-specific custom calibrations, import and export actor data, and automatically indicate tracking regions, and it offers server and local licensing options. The data captured by Faceware Analyzer is then processed in Faceware Retargeter. [6]

Faceware Retargeter 4.0 was announced in 2014. Faceware Retargeter uses the facial tracking data created in Analyzer to produce facial animation in a pose-based workflow. The upgrade added a plug-in for Autodesk animation tools, advanced character expression sets, visual tracking data, shared pose thumbnails, and batch processing. The Lite version of Retargeter transfers actors' performances onto animated characters and reduces and smooths keyframes. The Pro version includes custom poses, intelligent pose suggestions, shared pose libraries, and the ability to back up and restore jobs. [6]

Faceware Live aims to create natural-looking faces and facial expressions in real time. Any video source can be used with the software's one-button calibration. The captured video is transferred onto a 3D animated character, a process that combines image processing and data streaming to translate facial expressions into a set of animation values. [7] [33]

Faceware offers hardware options that can be rented or purchased. Available hardware includes the entry-level GoPro Headcam Kit and the Professional Headcam System. [6] The Indie Facial Mocap package includes the hardware, a camera and headmount, and the tools to use them. [41]

Selected works

Faceware's software is used by companies such as Activision Blizzard, Bethesda, Ubisoft, Electronic Arts, Sony, Cloud Imperium Games, and Microsoft. Rockstar Games used the software in games such as Grand Theft Auto V and Red Dead Redemption, and Bungie used Faceware in games including Destiny and Halo: Reach. Faceware has also been used in other games such as XCOM 2, Dying Light: The Following, Hitman, EA Sports UFC 2, Fragments for Microsoft's HoloLens, DOOM, Mirror's Edge Catalyst, Kingsglaive, F1 2016, ReCore, Destiny: Rise of Iron, Mafia III, Call of Duty: Infinite Warfare, Killzone: Shadow Fall, NBA 2K10 through NBA 2K17, Sleeping Dogs, Crysis 2 and 3, and Star Citizen, and in films such as The Curious Case of Benjamin Button and Robert Zemeckis's The Walk. [6] [8] [9] [10] [31] [42] [43] [44]


References

  1. "Image Metrics Launches Beta Version of Facial Animation Solution". Red Orbit. August 14, 2008. Retrieved May 5, 2015.
  2. "Image Metrics launches beta of its facial animation system". CG Press. August 13, 2008. Retrieved May 5, 2015.
  3. Dean Takahashi (January 19, 2012). "Image Metrics sells its Faceware face animation technology to spin-off". VentureBeat. Retrieved May 5, 2015.
  4. Rachel Weber (January 19, 2012). "Image Metrics launches spin-off company". GamesIndustry. Retrieved May 5, 2015.
  5. Eric Caoili (March 8, 2010). "Pre-GDC: Image Metrics Launches Faceware, Signs Bungie". Gamasutra. Retrieved May 5, 2015.
  6. Randall Newton (October 3, 2013). "Faceware upgrades facial motion capture product line". Graphic Speak. Retrieved May 5, 2015.
  7. "Face Off With Faceware Live". Organic Motion. Retrieved May 5, 2015.
  8. Dean Takahashi (June 30, 2010). "Image Metrics Acquires Character-FX to build realistic 3D game avatars". VentureBeat. Retrieved May 5, 2015.
  9. David Veselka (March 20, 2014). "Destiny to Use Faceware Tech For Increased Character Detail". MP1st. Retrieved May 5, 2015.
  10. Alex Osborn. "Interview: Faceware's Peter Busch Talks Motion Capture for Destiny". Game Revolution. Retrieved May 5, 2015.
  11. Alex Osborn. "Interview: Faceware's Peter Busch Talks Motion Capture for Destiny". Game Revolution. Retrieved May 6, 2015.
  12. "Cloud Imperium Games Adds Faceware Player-Driven Facial Animation To 'Star Citizen'". Tom's Hardware. August 25, 2017. Retrieved August 26, 2017.
  13. "Develop Awards 2008". GamesIndustry. July 31, 2008. Retrieved April 21, 2015.
  14. Rob Crossley (August 11, 2011). "'Your face is animation's future' – Image Metrics". Develop. Retrieved May 6, 2015.
  15. David Scarpitta (April 14, 2011). "Image Metrics' FACEWARE 3.0 Named 'Best of Show' at GDC 2011". Das Reviews. Retrieved April 20, 2015.
  16. "CGW Announces Silver Edge Winners from SIGGRAPH 2014". Computer Graphics World. August 20, 2014. Retrieved April 20, 2015.
  17. "Computer Graphics World Selects SIGGRAPH 2016 Best of Show". Computer Graphics World. Retrieved December 13, 2016.
  18. "Image Metrics and Double Negative testing markerless facial capture". Variety. March 3, 2009. Retrieved May 5, 2015.
  19. "Image Metrics Acquires Leading Lip Sync Technology Company FacePro". .NET Developer's Journal. August 5, 2010. Retrieved May 5, 2015.
  20. Tom McLean (August 6, 2010). "Image Metrics Seals Deal for Lip-Syncers FacePro". Animation Magazine. Retrieved May 5, 2015.
  21. Kathleen Maher (March 29, 2011). "Image Metrics looks for new opportunities". Graphic Speak. Retrieved May 5, 2015.
  22. David Jenkins (July 1, 2010). "Image Metrics acquires facial animation firm Character-FX". GamesIndustry. Retrieved May 5, 2015.
  23. "Character-FX sold to Image Metrics". Broadcast Now. July 6, 2010. Retrieved May 5, 2015.
  24. "What is the history of Image Metrics Inc and the latest information about Image Metrics Inc?". Beaver Creek News. Retrieved May 6, 2015.
  25. David Jenkins (March 11, 2010). "Image Metrics settles $8 million funding deal". GamesIndustry. Retrieved May 6, 2015.
  26. "Image Metrics' Faceware 3.0 Facial Animation Software Now Available". Computer Graphics World. March 1, 2011. Retrieved May 6, 2015.
  27. "Image Metrics to Host Faceware Demos and Workshop at CTN Animation Expo 2011". Develop. Retrieved April 21, 2015.
  28. Connor Beaton (August 11, 2011). "Image Metrics To Reinvent Facial Animation". zConnection International. Retrieved May 6, 2015.
  29. James Batchelor (April 29, 2014). "Almost 140 companies shortlisted for the 2014 Develop Awards across 24 categories – are you among them?". Develop. Retrieved May 6, 2015.
  30. Chris McMahon. "Faceware rolls out new educational program". 3D Artist Online. Retrieved May 5, 2015.
  31. Jim Thacker (May 31, 2012). "Faceware launches facial animation tools line-up". CG Channel. Retrieved May 5, 2015.
  32. Jon K. Carroll (July 29, 2013). "SIGGRAPH 2013: Faceware Technologies Debuts Faceware Live". Tom's Hardware. Retrieved May 5, 2015.
  33. Jim Thacker (August 8, 2014). "Faceware Technologies ships Faceware Live 2.0". CG Channel. Retrieved May 5, 2015.
  34. Kevin Carbotte (August 11, 2015). "Facial Motion Capture In Unreal Engine Can Now Be Done With Any Camera". Tom's Hardware. Retrieved January 5, 2022.
  35. Kevin Carbotte (August 11, 2015). "Facial Motion Capture In Unreal Engine Can Now Be Done With Any Camera". Tom's Hardware. Retrieved September 16, 2015.
  36. Adriene Hurst. "Faceware's Interactive Division to Focus on Live-to-Virtual Interactions". Digital Media World. Retrieved December 13, 2016.
  37. "Faceware Technologies Partners with Binari Sonori to Offer New Video-Based Localization Service". GamingBolt. April 30, 2014. Retrieved May 5, 2015.
  38. "Faceware & Binari Sonori Talk the Talk on Video Voice Localization". Digital Media World. Archived from the original on March 20, 2015. Retrieved May 5, 2015.
  39. "Faceware and Vicon Partner to Improve Facial & Full-Body Mocap". Digital Media World. January 28, 2014. Archived from the original on May 30, 2015. Retrieved May 5, 2015.
  40. Brittany Hillen (January 28, 2014). "Faceware and Vicon pair up motion capture products in new partnership". SlashGear. Retrieved May 5, 2015.
  41. Will Freeman (April 29, 2014). "The facial animation firm has conceived a new offering for indies and mobile teams. Peter Busch tells Develop what it delivers". Develop. Retrieved May 6, 2015.
  42. "Step-by-step: a walkthrough of The Walk". fxguide. October 11, 2015. Retrieved December 13, 2016.
  43. "Faceware Launches the Indie Facial Mocap Package for Mobile Game Devs". Game Academy. March 5, 2014. Retrieved May 5, 2015.
  44. Jennifer Wolfe (March 6, 2014). "Faceware Tech Launches New Indie Facial Mocap Package". Animation World Network. Retrieved May 5, 2015.