SixthSense

Steve Mann wearing a camera+projector dome in 1998, which he used as one node of the collaborative Telepointer system [1]
Pranav Mistry wearing a similar device in 2012, which he, Maes, and Chang named "WUW" (Wear yoUr World). [2]

SixthSense is a gesture-based wearable computer system developed at the MIT Media Lab: Steve Mann built headworn gestural-interface versions in 1994 and 1997 and a neckworn version in 1998, and Pranav Mistry (also at the MIT Media Lab) developed it further in 2009. Both developed hardware and software for headworn and neckworn versions. The system comprises a headworn or neck-worn pendant that contains both a data projector and a camera. The headworn versions Mann built at the MIT Media Lab in 1997 combined cameras and illumination systems for interactive photographic art and also included gesture recognition (e.g. finger-tracking using colored tape on the fingers). [3] [4] [5] [6]
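The colored-tape finger tracking mentioned above can be approximated by simple color segmentation: threshold each camera frame for the marker color and take the centroid of the matching pixels as the fingertip position. The following is a minimal sketch using NumPy on a synthetic frame; the marker color and tolerance are illustrative assumptions, not values from the original system.

```python
import numpy as np

def track_marker(frame, marker_rgb, tol=30):
    """Return the (row, col) centroid of pixels close to marker_rgb, or None."""
    # Sum of absolute per-channel differences from the marker color.
    diff = np.abs(frame.astype(int) - np.array(marker_rgb)).sum(axis=2)
    mask = diff < tol
    if not mask.any():
        return None  # marker not visible in this frame
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())

# Synthetic 64x64 RGB frame with a red "tape" patch at rows 10-14, cols 20-24.
frame = np.zeros((64, 64, 3), dtype=np.uint8)
frame[10:15, 20:25] = (255, 0, 0)
print(track_marker(frame, (255, 0, 0)))  # (12.0, 22.0)
```

A real implementation would also smooth the centroid across frames and work in a lighting-robust color space, but the per-frame step is essentially this.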

1994 prototype of headworn SixthSense gesture-based wearable computing apparatus invented, designed, built, and worn by Steve Mann, MIT Media Lab. [7] Finger-pointing gesture to outline and select a physical object. [8] Front-view shows cameras attached to head-mounted display with wireless communications antennae on helmet.

SixthSense is a name for extra information supplied by a wearable computer, such as the devices called EyeTap (Mann), Telepointer (Mann), and "WUW" (Wear yoUr World) by Pranav Mistry. [9] [10]

Origin of the name

Sixth Sense technology (a camera combined with a light source) was developed in 1997 as a headworn device and in 1998 as a neckworn device, but the name was not published until 2001, when Mann coined the term "Sixth Sense" to describe such devices. [11] [12]

Mann referred to this wearable computing technology as affording a "Synthetic Synesthesia of the Sixth Sense", believing that wearable computing and digital information could act in addition to the five traditional senses. [13] Ten years later, Pattie Maes, also with MIT Media Lab, used the term "Sixth Sense" in this same context, in a TED talk.

Similarly, other inventors have used the term sixth-sense technology to describe new capabilities that augment the traditional five human senses. For example, in U.S. Patent No. 9,374,397, Timo Platt et al. refer to their communications invention as creating a new social and personal sense, i.e., a "metaphorical sixth sense", enabling users (while retaining their privacy and anonymity) to sense and share the "stories" and other attributes and information of those around them.

Related Research Articles

Ubiquitous computing is a concept in software engineering, hardware engineering and computer science where computing is made to appear anytime and everywhere. In contrast to desktop computing, ubiquitous computing can occur using any device, in any location, and in any format. A user interacts with the computer, which can exist in many different forms, including laptop computers, tablets, smartphones and terminals in everyday objects such as a refrigerator or a pair of glasses. The underlying technologies that support ubiquitous computing include the Internet, advanced middleware, operating systems, mobile code, sensors, microprocessors, new I/O and user interfaces, computer networks, mobile protocols, location and positioning, and new materials.

Wearable computer Small computing devices worn with clothing

A wearable computer, also known as a wearable or body-borne computer, is a computing device worn on the body. The definition of 'wearable computer' may be narrow or broad, extending to smartphones or even ordinary wristwatches.

Steve Mann (inventor) Professor and wearable computing researcher

William Stephen George Mann is a Canadian engineer, professor, and inventor who works in augmented reality, computational photography, particularly wearable computing, and high-dynamic-range imaging. Mann is sometimes labeled the "Father of Wearable Computing" for early inventions and continuing contributions to the field. He cofounded InteraXon, makers of the Muse brain-sensing headband, and is also a founding member of the IEEE Council on Extended Intelligence (CXI). Mann is currently CTO and cofounder at Blueberry X Technologies and Chairman of MannLab. Mann was born in Canada, and currently lives in Toronto, Canada, with his wife and two children.

EyeTap

An EyeTap is a concept for a wearable computing device that is worn in front of the eye that acts as a camera to record the scene available to the eye as well as a display to superimpose computer-generated imagery on the original scene available to the eye. This structure allows the user's eye to operate as both a monitor and a camera as the EyeTap intakes the world around it and augments the image the user sees allowing it to overlay computer-generated data over top of the normal world the user would perceive.

Computer-mediated reality Ability to manipulate one's perception of reality through the use of a computer

Computer-mediated reality refers to the ability to add to, subtract information from, or otherwise manipulate one's perception of reality through the use of a wearable computer or hand-held device such as a smartphone.

Thad Eugene Starner is a founder and director of the Contextual Computing Group at Georgia Tech's College of Computing, where he is a full professor. He is a pioneer of wearable computing as well as human-computer interaction, augmented environments, and pattern recognition. Starner is a strong advocate of continuous-access, everyday-use systems, and has worn his own customized wearable computer continuously since 1993. His work has touched on handwriting and sign-language analysis, intelligent agents and augmented realities. He also helped found Charmed Technology.

Handheld projector Image projector in a handheld device

A handheld projector is an image projector in a handheld device. It was developed as a computer display device for compact portable devices such as mobile phones, personal digital assistants, and digital cameras, which have sufficient storage capacity to handle presentation materials but are too small to accommodate a display screen that an audience can see easily. Handheld projectors involve miniaturized hardware, and software that can project digital images onto a nearby viewing surface.

Gesture recognition Topic in computer science and language technology

Gesture recognition is a topic in computer science and language technology with the goal of interpreting human gestures via mathematical algorithms. It is a subdiscipline of computer vision. Gestures can originate from any bodily motion or state but commonly originate from the face or hand. Current focuses in the field include emotion recognition from the face and hand gesture recognition. Users can use simple gestures to control or interact with devices without physically touching them. Many approaches have been made using cameras and computer vision algorithms to interpret sign language. However, the identification and recognition of posture, gait, proxemics, and human behaviors is also the subject of gesture recognition techniques. Gesture recognition can be seen as a way for computers to begin to understand human body language, thus building a richer bridge between machines and humans than primitive text user interfaces or even GUIs, which still limit the majority of input to keyboard and mouse; gestures let users interact naturally without any mechanical devices.
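At its simplest, interpreting a gesture "via mathematical algorithms" can mean classifying the net displacement of a tracked point. The following sketch (an illustrative toy, not any particular system's algorithm) labels a 2D touch or fingertip trajectory as a directional swipe; the `min_dist` threshold is an assumed parameter for rejecting accidental motion.

```python
def classify_swipe(points, min_dist=20):
    """Classify a 2D trajectory as 'left'/'right'/'up'/'down', or None if too small.

    points: list of (x, y) samples from first to last tracked position,
    in screen coordinates (y grows downward).
    """
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if max(abs(dx), abs(dy)) < min_dist:
        return None  # motion too small to count as a gesture
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

print(classify_swipe([(0, 0), (40, 5), (90, 8)]))  # right
```

Real recognizers add temporal models (e.g. hidden Markov models or neural networks) on top of such per-frame features, but the feature-then-classify structure is the same.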

Equiveillance is a state of equilibrium, or a desire to attain a state of equilibrium, between surveillance and sousveillance. It is sometimes confused with transparency. The balance (equilibrium) provided by equiveillance allows individuals to construct their own cases from evidence they gather themselves, rather than merely having access to surveillance data that could possibly incriminate them.

Rosalind Picard American computer scientist

Rosalind Wright Picard is an American scholar and inventor who is Professor of Media Arts and Sciences at MIT, founder and director of the Affective Computing Research Group at the MIT Media Lab, and co-founder of the startups Affectiva and Empatica. In 2005, she was named a Fellow of the Institute of Electrical and Electronics Engineers for contributions to image and video analysis and affective computing. In 2019 she received one of the highest professional honors accorded an engineer, election to the National Academy of Engineering for her contributions on affective computing and wearable computing.

Multi-touch Technology

In computing, multi-touch is technology that enables a surface to recognize the presence of more than one point of contact with the surface at the same time. The origins of multitouch began at CERN, MIT, University of Toronto, Carnegie Mellon University and Bell Labs in the 1970s. CERN started using multi-touch screens as early as 1976 for the controls of the Super Proton Synchrotron. Capacitive multi-touch displays were popularized by Apple's iPhone in 2007. Plural-point awareness may be used to implement additional functionality, such as pinch to zoom or to activate certain subroutines attached to predefined gestures.
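The "plural-point awareness" behind pinch to zoom reduces to comparing the distance between two contact points over time: the ratio of the current separation to the starting separation gives the zoom factor. A minimal sketch (illustrative only, not any vendor's implementation):

```python
import math

def pinch_zoom(t1_start, t2_start, t1_now, t2_now):
    """Zoom factor implied by two touch points moving apart (>1) or together (<1)."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return dist(t1_now, t2_now) / dist(t1_start, t2_start)

# Two fingers start 100 px apart and end 200 px apart: zoom in by 2x.
print(pinch_zoom((0, 0), (100, 0), (0, 0), (200, 0)))  # 2.0
```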

Humanistic intelligence

Humanistic Intelligence (HI) is defined, in the context of wearable computing, by Marvin Minsky, Ray Kurzweil, and Steve Mann, as follows:

Humanistic Intelligence [HI] is intelligence that arises because of a human being in the feedback loop of a computational process, where the human and computer are inextricably intertwined. When a wearable computer embodies HI and becomes so technologically advanced that its intelligence matches our own biological brain, something much more powerful emerges from this synergy that gives rise to superhuman intelligence within the single “cyborg” being.

Lifelog Personal record of one’s daily life

A lifelog is a personal record of one’s daily life in a varying amount of detail, for a variety of purposes. The record contains a comprehensive dataset of a human's activities. The data could be used to increase knowledge about how people live their lives. In recent years, some lifelog data has been automatically captured by wearable technology or mobile devices. People who keep lifelogs about themselves are known as lifeloggers.

Cyborg Being with both organic and biomechatronic body parts

A cyborg —a portmanteau of cybernetic and organism—is a being with both organic and biomechatronic body parts. The term was coined in 1960 by Manfred Clynes and Nathan S. Kline.

Pranav Mistry Indian computer scientist (born 1981)

Pranav Mistry is a computer scientist and inventor. He was the President and CEO of STAR Labs. He is best known for his work on SixthSense, Samsung Galaxy Gear and Project Beyond.

Telepointer

Telepointer is a neck-worn gestural interface system developed by MIT Media Lab student Steve Mann in 1998. Mann originally referred to the device as "Synthetic Synesthesia of the Sixth Sense". In the 1990s and early 2000s Mann used this project as a teaching example at the University of Toronto.

Scratch input

In computing, scratch input is an acoustic method of human-computer interaction (HCI) that takes advantage of the characteristic sound produced when a fingernail or other object is dragged over a surface, such as a table or wall. The technique is not limited to fingers; a stick or writing implement can also be used. The sound is often inaudible to the unaided ear, but specialized microphones can digitize it for interactive purposes. Scratch input was invented by Mann et al. in 2007, though the term was first used by Chris Harrison et al.
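The detection step of such an acoustic interface can be sketched as short-time energy thresholding: split the digitized microphone signal into windows and flag those whose energy exceeds a noise floor. This is a deliberate simplification (real systems also examine the spectral signature of scratching); the window size and threshold below are assumed values.

```python
import numpy as np

def detect_scratch(signal, window=256, threshold=0.01):
    """Return indices of fixed-size windows whose mean energy exceeds threshold."""
    n = len(signal) // window
    frames = signal[: n * window].reshape(n, window)
    energy = (frames ** 2).mean(axis=1)  # short-time energy per window
    return np.nonzero(energy > threshold)[0].tolist()

# Synthetic signal: silence, with a noisy "scratch" burst in the third window.
rng = np.random.default_rng(0)
sig = np.zeros(1024)
sig[512:768] = 0.5 * rng.standard_normal(256)
print(detect_scratch(sig))  # [2]
```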

SPARSH is a novel interaction method for transferring data between digital devices with simple touch gestures. The Sparsh prototype system was designed and developed by Pranav Mistry and Suranga Nanayakkara of the MIT Media Lab. Sparsh lets the user touch whatever data item he or she wants to copy from a device; at that moment, the data item is conceptually saved in the user. Next, the user touches the other device into which he or she wants to paste the saved content. Sparsh uses touch-based interactions as indications of what to copy and where to pass it. Technically, the actual transfer of media happens via the information cloud, and user authentication is achieved by face recognition, fingerprint detection, or a username-password combination. Sparsh thus lets the user conceptually transfer media from one digital device to one's body and pass it to another digital device by simple touch gestures. At present, the Sparsh system supports the Android and Windows platforms.
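Since the transfer actually happens via the cloud, the conceptual model above amounts to a per-user clipboard in a shared store: "copy" saves an item under the authenticated user's identity, and "paste" on any device retrieves it. A hypothetical sketch (class and method names are ours, not from the Sparsh codebase; authentication is reduced to a string user ID):

```python
class CloudClipboard:
    """Sketch of SPARSH-style copy/paste through a shared cloud store.

    In the real system the user's identity is established by face recognition,
    fingerprint, or password; here it is simply a string key.
    """

    def __init__(self):
        self._store = {}

    def copy(self, user_id, item):
        # "Touch" on the source device: save the item under the user's identity.
        self._store[user_id] = item

    def paste(self, user_id):
        # "Touch" on the target device: retrieve whatever this user last copied.
        return self._store.get(user_id)

cloud = CloudClipboard()
cloud.copy("alice", {"type": "photo", "url": "http://example.com/pic.jpg"})
print(cloud.paste("alice"))  # the photo item, on any of alice's devices
```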

Egocentric vision or first-person vision is a sub-field of computer vision that entails analyzing images and videos captured by a wearable camera, which is typically worn on the head or on the chest and naturally approximates the visual field of the camera wearer. Consequently, visual data capture the part of the scene on which the user focuses to carry out the task at hand and offer a valuable perspective to understand the user's activities and their context in a naturalistic setting.

References

  1. "Telepointer: Hands-Free Completely Self Contained Wearable Visual Augmented Reality without Headwear and without any Infrastructural Reliance", IEEE International Symposium on Wearable Computing (ISWC00), pp. 177, 2000, Los Alamitos, CA, USA
  2. "WUW – wear Ur world: a wearable gestural interface", Proceedings of CHI EA '09 Extended Abstracts on Human Factors in Computing Systems Pages 4111-4116, ACM New York, NY, USA
  3. IEEE Computer, Vol. 30, No. 2, February 1997, Wearable Computing: A First Step Toward Personal Imaging, pp25-32
  3. "Professor Steve Mann: Society of Sensularity with a Sixth Sense", https://blog.metavision.com/professor-steve-mann-society-of-sensularity-with-a-sixth-sense/
  4. "Sixth Sense Technology", International Journal of Science and Research (IJSR), ISSN 2319-7064, https://www.ijsr.net/archive/v3i12/U1VCMTQ1Nzc=.pdf
  5. Kedar Kanel, "Sixth Sense Technology", 2014, Centria University of Applied Sciences
  7. Wearable, tetherless computer–mediated reality, Steve Mann. February 1996. In Presentation at the American Association of Artificial Intelligence, 1996 Symposium; early draft appears as MIT Media Lab Technical Report 260, December 1994
  8. IEEE Computer, Vol. 30, No. 2, February 1997, Wearable Computing: A First Step Toward Personal Imaging, pp25-32
  9. "IEEE ISWC P. 177" (PDF). Retrieved 2013-10-07.
  10. "Cyborg: Digital Destiny and Human Possibility in the Age of the Wearable Computer", Steve Mann with Hal Niedzviecki, ISBN 0-385-65825-7 (hardcover), Random House Inc, 304 pages, 2001.
  11. Cyborg, 2001
  12. Geary 2002
  13. "An Anatomy of the New Bionic Senses" (hardcover), by James Geary, 2002, 214 pp.
