GestureTek
Founded: 1986
Founders: Canadians Vincent John Vincent and Francis MacDougall
Headquarters: Silicon Valley, California
Website: gesturetek.com

GestureTek is an American interactive technology company headquartered in Silicon Valley, California, with offices in Toronto and Ottawa, Ontario, and in Asia. [1]


Founding

Founded in 1986 by Canadians Vincent John Vincent [2] and Francis MacDougall, [3] this privately held company develops and licenses gesture recognition software based on computer vision techniques. The partners invented video gesture control in 1986 and received their base patent in 1996 for the GestPoint video gesture control system. GestPoint is a camera-enabled video tracking software system that translates hand and body movement into computer control. [4] The system enables users to navigate and control interactive multimedia and menu-based content, engage in virtual reality game play, experience immersion in an augmented reality environment, or interact with a consumer device (such as a television, mobile phone or set-top box) without using touch-based peripherals. [5] [6] [7]
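The core idea behind camera-based gesture control of this kind can be illustrated with a minimal sketch: frame differencing finds the pixels that moved between two video frames, and the centroid of those pixels is scaled to screen coordinates to drive a pointer. This is a generic, illustrative example only, not GestureTek's patented algorithm; the function names and the tiny synthetic "frames" are invented for the sketch.

```python
# Illustrative frame-differencing sketch (NOT GestureTek's actual method):
# moving pixels between two grayscale frames are located, and their
# centroid is mapped to a pointer position on a screen.

def motion_centroid(prev_frame, curr_frame, threshold=30):
    """Return the (row, col) centroid of pixels whose intensity changed
    by more than `threshold` between two grayscale frames (lists of
    lists of 0-255 ints), or None if nothing moved."""
    row_sum = col_sum = count = 0
    for r, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for c, (p, q) in enumerate(zip(prev_row, curr_row)):
            if abs(p - q) > threshold:
                row_sum += r
                col_sum += c
                count += 1
    if count == 0:
        return None
    return (row_sum / count, col_sum / count)

def to_screen(centroid, frame_size, screen_size):
    """Scale a frame-space centroid to screen coordinates."""
    fr, fc = frame_size
    sr, sc = screen_size
    r, c = centroid
    return (r * sr / fr, c * sc / fc)

# A "hand" appears in the lower-right quadrant of a 4x4 frame.
prev = [[0] * 4 for _ in range(4)]
curr = [[0] * 4 for _ in range(4)]
curr[2][2] = curr[2][3] = curr[3][2] = curr[3][3] = 255

centroid = motion_centroid(prev, curr)
print(centroid)                                    # (2.5, 2.5)
print(to_screen(centroid, (4, 4), (1080, 1920)))   # (675.0, 1200.0)
```

A production system would add background modeling, noise filtering, and per-body-part tracking, but the pipeline — detect motion, localize it, map it to interface coordinates — is the same in spirit.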

Technology

GestureTek's gesture interface applications include multi-touch and 3D camera tracking. GestureTek's multi-touch technology powers the multi-touch table in Melbourne's Eureka Tower. [8] A GestureTek multi-touch table with object recognition is found at the New York City Visitors Center. [9] Telefónica has a multi-touch window with technology from GestureTek. [10] GestureTek's 3D tracking technology is used in a 3D television prototype from Hitachi and various digital signage and display solutions based on 3D interaction. [11]

Patents

GestureTek has been awarded eight patents, including: 5,534,917 [12] (Video Gesture Control Motion Detection); 7,058,204 [13] (Multiple Camera Control System, Point to Control Base Patent); 7,421,093 [14] (Multiple Camera Tracking System for Interfacing With an Application); 7,227,526 [15] (Stereo Camera Control, 3D-Vision Image Control System); 7,379,563 [16] (Two-Handed Movement Tracker Tracking Bi-Manual Movements); 7,379,566 [17] (Optical Flow-Based Tilt Sensor For Phone Tilt Control); 7,389,591 [18] (Phone Tilt for Typing & Menus/Orientation-Sensitive Signal Output); 7,430,312 [19] (Five Camera 3D Face Capture).

GestureTek's software and patents have been licensed by Microsoft for the Xbox 360, [20] Sony for the EyeToy, [21] NTT DoCoMo for their mobile phones [22] and Hasbro for the ION Educational Gaming System. [23] In addition to software provision, GestureTek also fabricates interactive gesture control display systems with natural user interface for interactive advertising, games and presentations. [24]

In addition, GestureTek's natural user interface virtual reality system has been the subject of research by universities and hospitals for its application in both physical therapy [25] and physical rehabilitation. [26]

In 2008, GestureTek received the Mobile Innovation Global Award [27] from the GSMA for its software-based, gesture-controlled user interface for mobile games and applications. The technology is used by Java platform integration providers [28] and mobile developers. [29] Katamari Damacy is one example of a gesture control mobile game powered by GestureTek software. [30]

Competitors

Other companies in the interactive projection industry for marketing and retail experiences include Po-motion Inc., [31] Touchmagix [32] and LM3LABS. [33]


References

  1. "Gesture Recognition & Computer Vision Control Technology & Motion Sensing Systems for Presentation & Entertainment". Gesturetek.com. Retrieved October 20, 2011.
  2. "Vincent John Vincent". Vjvincent.com. Retrieved October 20, 2011.
  3. "ATIS TechThink". Techthink.org. Retrieved October 20, 2011.
  4. "Putting our arms around the future of touch". Archived from the original on April 26, 2009. Retrieved May 7, 2009.
  5. "GestureTek brings 3D and gestures together for remote control". Engadget. January 5, 2009. Retrieved October 20, 2011.
  6. "Watch out, Surface; GestureTek is straight frontin' | TechCrunch". Crunchgear.com. November 12, 2008. Retrieved October 20, 2011.
  7. "GestureTek Mobile is Overall Winner at the 2008 Mobile Innovation Global Awards at the GSMA's Mobile World Congress in Barcelona – Feb 13, 2008". Mobileworldcongress.mediaroom.com. February 13, 2008. Archived from the original on October 7, 2011. Retrieved October 20, 2011.
  8. "Microsoft Surface versus GestureTek's Illuminate Table". Aboutmicrosoftsurface.com. Archived from the original on October 9, 2011. Retrieved October 20, 2011.
  9. "The New York Center Information Center Installation". Svconline.com. Archived from the original on September 28, 2011. Retrieved October 20, 2011.
  10. Chris Morrison (December 12, 2007). "GestureTek receives investment from Telefonica | VentureBeat". Deals.venturebeat.com. Retrieved October 20, 2011.
  11. "LCD TV". Lcdtvreviews.org.uk. Archived from the original on October 9, 2011. Retrieved October 20, 2011.
  12. "Video image based control system – Very Vivid, Inc". Freepatentsonline.com. Retrieved October 20, 2011.
  13. "Multiple camera control system – GestureTek, Inc". Freepatentsonline.com. Retrieved October 20, 2011.
  14. "Multiple camera control system – GestureTek, Inc". Freepatentsonline.com. Retrieved October 20, 2011.
  15. "Video-based image control system – US Patent 7227526 Abstract". Patentstorm.us. Archived from the original on June 12, 2011. Retrieved October 20, 2011.
  16. "Tracking bimanual movements – US Patent 7379563 Abstract". Patentstorm.us. Archived from the original on June 12, 2011. Retrieved October 20, 2011.
  17. "Optical flow based tilt sensor – Patent # 7379566". PatentGenius. Retrieved October 20, 2011.
  18. "Orientation-sensitive signal output – GestureTek, Inc". Freepatentsonline.com. Retrieved October 20, 2011.
  19. "Creating 3D images of objects by illuminating with infrared patterns – GestureTek, Inc". Freepatentsonline.com. Retrieved October 20, 2011.
  20. "News – Q&A: GestureTek Talks Xbox 360 Camera Innovation". Gamasutra. October 11, 2006. Retrieved October 20, 2011.
  21. "GestureTek Grants Patent License to Sony Computer Entertainment America for EyeToy and PlayStation2 Game Development | Business Wire". Find Articles. February 18, 2005. Retrieved October 20, 2011.
  22. "Success Stories". Archived from the original on March 3, 2016. Retrieved May 12, 2009.
  23. "News – GestureTek Preparing 'Wii-like' Control Wand". Gamasutra. February 15, 2008. Retrieved October 20, 2011.
  24. "Gesturetek's interactive digital signage lineup | Digital Signage Today". Archived from the original on February 1, 2010. Retrieved May 7, 2009.
  25. Weiss, P. L.; Rand, D; Katz, N; Kizony, R (2004). "Video capture virtual reality as a flexible and effective rehabilitation tool". Journal of NeuroEngineering and Rehabilitation. 1 (1): 12. doi: 10.1186/1743-0003-1-12 . PMC   546410 . PMID   15679949.
  26. "Retrieved on 2009-05-07". Hw.haifa.ac.il. Retrieved October 20, 2011.
  27. "FirstNews – February 13, 2008". Wireless Week. February 13, 2008. Archived from the original on June 7, 2009. Retrieved October 20, 2011.
  28. Archived July 7, 2011, at the Wayback Machine
  29. "Gaming News – Get the latest updates on the gaming industry". gamezone.com. October 3, 2011. Retrieved October 20, 2011.
  30. Hall, L.E. (2018). "Keita's Mixed Media". Katamari Damacy. Boss Fight Books. ISBN   978-1-940535-17-3.
  31. Po-motion website
  32. Touchmagix website
  33. LM3LABS blog