SixthSense

Image: Steve Mann wearing a camera-and-projector dome in 1998, which he used as one node of the collaborative Telepointer system. [1]
Image: Pranav Mistry wearing a similar device in 2012, which he and Maes and Chang named "WUW", for Wear yoUr World. [2]

SixthSense is a gesture-based wearable computer system developed at the MIT Media Lab by Steve Mann, first as a 1994 prototype, then as a headworn gestural interface in 1997 and a neckworn version in 1998, and further developed by Pranav Mistry (also at the MIT Media Lab) in 2009; both developed hardware and software for headworn and neckworn versions of it. It comprises a headworn or neck-worn pendant that contains both a data projector and a camera. Headworn versions built by Mann at the MIT Media Lab in 1997 combined cameras and illumination systems for interactive photographic art, and also included gesture recognition (e.g. finger-tracking using colored tape on the fingers). [3] [4] [5] [6]
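
The colored-tape finger-tracking mentioned above can be illustrated with a minimal sketch (not the original SixthSense code): segment the marker color in each camera frame and report the centroid of the largest blob. The HSV color range, the OpenCV 4.x API, and the webcam source below are assumptions made for illustration.

```python
# Minimal colored-marker fingertip tracking sketch (illustrative; assumes OpenCV 4.x).
import cv2
import numpy as np

# Hypothetical HSV range for a red marker; a real system would calibrate this to the tape used.
LOWER = np.array([0, 120, 120])
UPPER = np.array([10, 255, 255])

def track_marker(frame_bgr):
    """Return the (x, y) centroid of the largest marker-colored blob, or None if absent."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))

cap = cv2.VideoCapture(0)           # any webcam stands in for the worn camera
ok, frame = cap.read()
if ok:
    print(track_marker(frame))      # e.g. (312, 207) if the marker is visible
cap.release()
```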

Gesture recognition

Gesture recognition is a topic in computer science and language technology with the goal of interpreting human gestures via mathematical algorithms. Gestures can originate from any bodily motion or state but commonly originate from the face or hand. Current focuses in the field include emotion recognition from the face and hand gesture recognition. Users can use simple gestures to control or interact with devices without physically touching them. Many approaches have been made using cameras and computer vision algorithms to interpret sign language. However, the identification and recognition of posture, gait, proxemics, and human behaviors is also the subject of gesture recognition techniques. Gesture recognition can be seen as a way for computers to begin to understand human body language, thus building a richer bridge between machines and humans than primitive text user interfaces or even GUIs, which still limit the majority of input to keyboard and mouse. Using gesture recognition, it is possible to point a finger at the screen and have the cursor move accordingly, which could make conventional input devices such as the mouse, keyboard, and even touch-screen redundant.
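
As a minimal illustration of interpreting gestures "via mathematical algorithms", the sketch below classifies a tracked fingertip trajectory as a directional swipe. The distance threshold, gesture names, and function name are illustrative assumptions, not part of any system described in this article.

```python
# Illustrative swipe classification from a sequence of tracked fingertip positions.
from typing import List, Optional, Tuple

def classify_swipe(path: List[Tuple[float, float]],
                   min_dist: float = 80.0) -> Optional[str]:
    """Map a fingertip path to 'left'/'right'/'up'/'down', or None if the motion is too small."""
    if len(path) < 2:
        return None
    dx = path[-1][0] - path[0][0]
    dy = path[-1][1] - path[0][1]
    if max(abs(dx), abs(dy)) < min_dist:
        return None                      # motion too small to count as a gesture
    if abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

print(classify_swipe([(10, 100), (60, 102), (140, 98)]))  # -> 'right'
```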

Wearable computer body-wearable miniature electronic devices

Wearable computers, also known as wearables or body-borne computers, are small computing devices that are worn under, with, or on top of clothing.

MIT Media Lab interdisciplinary research laboratory at the Massachusetts Institute of Technology

The MIT Media Lab is an interdisciplinary research laboratory at the Massachusetts Institute of Technology, growing out of MIT's Architecture Machine Group in the School of Architecture. Its research is not restricted to fixed academic disciplines, but draws from technology, media, science, art, and design. As of 2014, the Media Lab's research groups include neurobiology, biologically inspired fabrication, socially engaging robots, emotive computing, bionics, and hyperinstruments.


Images: 1994 prototype of the headworn SixthSense gesture-based wearable computing apparatus invented, designed, built, and worn by Steve Mann, MIT Media Lab. [7] Finger-pointing gesture to outline and select a physical object. [8] The front view shows cameras attached to the head-mounted display with wireless communications antennae on the helmet.

SixthSense is a name for extra information supplied by a wearable computer, such as the devices called EyeTap (Mann), Telepointer (Mann), and "WuW" (Wear yoUr World) by Pranav Mistry. [9] [10]

Pranav Mistry Indian computer scientist

Pranav Mistry is a computer scientist and inventor. He is the Global Senior Vice President of Research at Samsung and the head of the Think Tank Team. He is best known for his work on SixthSense, the Samsung Galaxy Gear, and Project Beyond. The World Economic Forum honored Mistry as one of its Young Global Leaders in 2013.

Origin of the name

Sixth Sense technology (a camera combined with a light source) was developed in 1997 as a headworn device and in 1998 as a neckworn object, but the name for this work was not published until 2001, when Mann coined the term "Sixth Sense" to describe such devices. [11] [12]

Mann referred to this wearable computing technology as affording a "Synthetic Synesthesia of the Sixth Sense", believing that wearable computing and digital information could act in addition to the five traditional senses. [13] Ten years later, Pattie Maes, also with MIT Media Lab, used the term "Sixth Sense" in this same context, in a TED talk.

Sense Physiological capacity of organisms that provides data for perception

A sense is a physiological capacity of organisms that provides data for perception. The senses and their operation, classification, and theory are overlapping topics studied by a variety of fields, most notably neuroscience, cognitive psychology, and philosophy of perception. The nervous system has a specific sensory nervous system, and a sense organ, or sensor, dedicated to each sense.

Pattie Maes Belgian computer scientist

Pattie Maes is a professor in MIT's program in Media Arts and Sciences. She founded and directed the MIT Media Lab's Fluid Interfaces Group. Previously, she founded and ran the Software Agents group. She served for several years as both the head and associate head of the Media Lab's academic program. Prior to joining the Media Lab, Maes was a visiting professor and a research scientist at the MIT Artificial Intelligence Lab. She holds bachelor's and PhD degrees in computer science from the Vrije Universiteit Brussel in Belgium.

Similarly, other inventors have used the term sixth-sense technology to describe new capabilities that augment the traditional five human senses. For example, in U.S. patent no. 9,374,397, Timo Platt et al. refer to their new communications invention as creating a new social and personal sense, i.e., a "metaphorical sixth sense", enabling users (while retaining their privacy and anonymity) to sense and share the "stories" and other attributes and information of those around them.

Related Research Articles

Steve Mann Professor and wearable computing researcher

Steven Mann is a Canadian researcher and inventor best known for his work on augmented reality, computational photography (particularly high dynamic range imaging), and wearable computing.

Sousveillance recording of an activity by a participant in the activity, typically by way of small wearable or portable personal technologies

Sousveillance is the recording of an activity by a participant in the activity, typically by way of small wearable or portable personal technologies. The term "sousveillance", coined by Steve Mann, stems from the contrasting French words sur, meaning "above", and sous, meaning "below", i.e. "surveillance" denotes the "eye-in-the-sky" watching from above, whereas "sousveillance" denotes bringing the camera or other means of observation down to human level, either physically, or hierarchically.

EyeTap

An EyeTap is a device worn in front of the eye that acts as a camera to record the scene available to the eye as well as a display to superimpose computer-generated imagery on that scene. This structure allows the user's eye to operate as both a monitor and a camera: the EyeTap takes in the world around it and augments the image the user sees, overlaying computer-generated data on top of the normal world the user would perceive. The EyeTap is a hard technology to categorize under the three main headers for wearable computing, because while it is in theory a constancy technology, it also has the ability to augment and mediate the reality the user perceives.
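
The superimposition step described above can be illustrated with a minimal, hedged sketch: alpha-blend a computer-generated overlay onto the camera's view of the scene. This shows only the compositing idea, not EyeTap's actual optical or electronic design; the function name and blend weight are assumptions.

```python
# Illustrative alpha-blend of generated imagery over a camera frame.
import numpy as np

def superimpose(scene: np.ndarray, overlay: np.ndarray, alpha: float = 0.4) -> np.ndarray:
    """Alpha-blend an overlay image onto the scene image (same shape, uint8)."""
    blended = (1.0 - alpha) * scene.astype(np.float32) + alpha * overlay.astype(np.float32)
    return blended.clip(0, 255).astype(np.uint8)

scene = np.zeros((240, 320, 3), dtype=np.uint8)          # stand-in for a camera frame
overlay = np.full((240, 320, 3), 255, dtype=np.uint8)    # stand-in for generated imagery
print(superimpose(scene, overlay)[0, 0])                 # -> [102 102 102]
```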

Thad Eugene Starner is a founder and director of the Contextual Computing Group at Georgia Tech's College of Computing, where he is a Full Professor. He is a pioneer of wearable computing as well as human-computer interaction, augmented environments, and pattern recognition. Starner is a strong advocate of continuous-access, everyday-use systems, and has worn his own customized wearable computer continuously since 1993. His work has touched on handwriting and sign-language analysis, intelligent agents, and augmented realities. He also helped found Charmed Technology.

Handheld projector image projector in a handheld device

A handheld projector is an image projector in a handheld device. It was developed as a computer display device for compact portable devices such as mobile phones, personal digital assistants, and digital cameras, which have sufficient storage capacity to handle presentation materials but are too small to accommodate a display screen that an audience can see easily. Handheld projectors involve miniaturized hardware, and software that can project digital images onto a nearby viewing surface.
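
One common software step in projecting onto a nearby, possibly off-axis surface is geometric correction. The sketch below is a minimal illustration rather than any particular product's implementation: it warps an image so it fits a quadrilateral region on the viewing surface using OpenCV, with made-up corner coordinates.

```python
# Illustrative perspective warp of a frame onto a target quadrilateral on the surface.
import cv2
import numpy as np

image = np.full((480, 640, 3), 200, dtype=np.uint8)   # frame to be projected

src = np.float32([[0, 0], [640, 0], [640, 480], [0, 480]])       # image corners
dst = np.float32([[40, 20], [600, 60], [620, 470], [30, 440]])   # assumed target quad on the surface

H = cv2.getPerspectiveTransform(src, dst)
warped = cv2.warpPerspective(image, H, (640, 480))
print(warped.shape)   # -> (480, 640, 3)
```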

Rosalind Picard American computer scientist

Rosalind Wright Picard is an American scholar who is Professor of Media Arts and Sciences at MIT, founder and director of the Affective Computing Research Group at the MIT Media Lab, and co-founder of the startups Affectiva and Empatica. In 2005, she was named a Fellow of the Institute of Electrical and Electronics Engineers for contributions to image and video analysis and affective computing. In 2019 she received one of the highest professional honors accorded an engineer, election to the National Academy of Engineering for her contributions on affective computing and wearable computing.

Multi-touch

In computing, multi-touch is technology that enables a surface to recognize the presence of more than one point of contact with the surface. The origins of multitouch began at CERN, MIT, University of Toronto, Carnegie Mellon University and Bell Labs in the 1970s. Multi-touch was in use as early as 1985. Apple popularized the term "multi-touch" in 2007. Plural-point awareness may be used to implement additional functionality, such as pinch to zoom or to activate certain subroutines attached to predefined gestures.
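
The "pinch to zoom" functionality mentioned above reduces to a simple computation once two contact points are tracked: the zoom factor is the ratio of the fingers' current separation to their separation when the gesture began. The sketch below is illustrative; the function and parameter names are assumptions.

```python
# Illustrative pinch-to-zoom factor from two tracked contact points.
import math
from typing import Tuple

Point = Tuple[float, float]

def pinch_zoom_factor(start: Tuple[Point, Point], now: Tuple[Point, Point]) -> float:
    """Return >1 when the fingers spread apart (zoom in), <1 when they pinch together."""
    def dist(a: Point, b: Point) -> float:
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return dist(*now) / dist(*start)

print(pinch_zoom_factor(((100, 100), (200, 100)), ((80, 100), (240, 100))))  # -> 1.6
```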

Humanistic intelligence

Humanistic Intelligence (HI) is defined, in the context of wearable computing, by Marvin Minsky, Ray Kurzweil, and Steve Mann, as follows:

Humanistic Intelligence [HI] is intelligence that arises because of a human being in the feedback loop of a computational process, where the human and computer are inextricably intertwined. When a wearable computer embodies HI and becomes so technologically advanced that its intelligence matches our own biological brain, something much more powerful emerges from this synergy that gives rise to superhuman intelligence within the single “cyborg” being.

Lifelog

A lifelog is a detailed chronicle of a person's life involving large amounts of data. In recent years the data is usually captured automatically by wearable technology or mobile devices. People who keep lifelogs about themselves are known as lifeloggers.

Cyborg being with both organic and biomechatronic body parts

A cyborg, short for "cybernetic organism", is a being with both organic and biomechatronic body parts. The term was coined in 1960 by Manfred Clynes and Nathan S. Kline.

The Mouseless is a proposed input device for personal computers, designed as a prototype by Pranav Mistry of the MIT Media Lab. Mouseless replaces the conventional hardware mouse with an infrared laser strobe, an infrared camera, and image-recognition software. The laser beam is optically split into a wide beam illuminating an imaginary plane above the working desk. The camera captures the pattern of invisible infrared light as it illuminates the user's hand. The user rests the palm on the desk and commands the system in the same way as he or she would with a conventional mouse.
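
After the camera and image-recognition software have located the hand, a system like the one described would still need to map the hand's position over the desk to an on-screen cursor position. The sketch below illustrates one simple linear mapping; the desk-region bounds, screen size, and function name are assumptions, not details from Mistry's prototype.

```python
# Illustrative mapping from a detected hand position on the desk to screen coordinates.
from typing import Tuple

DESK = (100, 80, 540, 400)       # assumed x_min, y_min, x_max, y_max of the tracked desk area (px)
SCREEN = (1920, 1080)            # assumed target screen resolution

def hand_to_cursor(hand: Tuple[float, float]) -> Tuple[int, int]:
    """Linearly map a hand position inside DESK to a screen coordinate, clamped to the desk bounds."""
    x0, y0, x1, y1 = DESK
    u = min(max((hand[0] - x0) / (x1 - x0), 0.0), 1.0)
    v = min(max((hand[1] - y0) / (y1 - y0), 0.0), 1.0)
    return int(u * (SCREEN[0] - 1)), int(v * (SCREEN[1] - 1))

print(hand_to_cursor((320, 240)))   # -> (959, 539)
```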

Telepointer

Telepointer is a neck-worn gestural interface system developed by MIT Media Lab student Steve Mann in 1998. Mann originally referred to the device as "Synthetic Synesthesia of the Sixth Sense". In the 1990s and early 2000s Mann used this project as a teaching example at the University of Toronto.

Mobile Cloud Computing (MCC) is the combination of cloud computing, mobile computing and wireless networks to bring rich computational resources to mobile users, network operators, as well as cloud computing providers. The ultimate goal of MCC is to enable execution of rich mobile applications on a plethora of mobile devices, with a rich user experience. MCC provides business opportunities for mobile network operators as well as cloud providers. More comprehensively, MCC can be defined as "a rich mobile computing technology that leverages unified elastic resources of varied clouds and network technologies toward unrestricted functionality, storage, and mobility to serve a multitude of mobile devices anywhere, anytime through the channel of Ethernet or Internet regardless of heterogeneous environments and platforms based on the pay-as-you-use principle."

Jocelyn Scheirer

Jocelyn Scheirer is an American entrepreneur, scientist, and artist who has been working in wearable technology since the late 1990s. Her research focuses on affective computing, which she pursued while working toward her PhD (pending) in the MIT Media Lab's Affective Computing Group with Rosalind Picard. Scheirer invented and, along with MIT, patented the Galvactivator glove, which measured skin conductance through sensors on the palm and relayed the varying intensity through an LED display. She founded the intercommunication equipment and systems company Empathyx, Inc. in 2006 and co-founded the emotional analytics company Affectiva in 2009, serving as their director of operations until 2010. Scheirer has also created several visual and performance art pieces that have been featured in several galleries in Massachusetts, including the MIT Museum, the Galatea Fine Art Gallery, and the Bromfield Gallery. She currently serves as CEO of the wearable company Bionolux Labs, LLC.

Muse (headband)

Muse is a wearable brain-sensing headband. The device measures brain activity via 4 electroencephalography (EEG) sensors. An accompanying mobile app converts the EEG signal into audio feedback that is fed to the user via headphones. Muse is manufactured by InteraXon, a company founded in 2007 by Ariel Garten, Trevor Coleman, Chris Aimone, and Steve Mann, originally at 330 Dundas Street West in Toronto, Ontario, Canada. Development of the Muse product began in 2003, and after several rounds of fundraising, the product was released to the public in May 2014.
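
The EEG-to-audio signal path described above can be illustrated with a minimal, hedged sketch: estimate power in one frequency band from a short window of samples and map it to an audio-feedback level. The band choice, sample rate, window length, and scaling below are illustrative assumptions, not InteraXon's actual processing.

```python
# Illustrative EEG band-power estimate mapped to an audio-feedback level.
import numpy as np

FS = 256                      # assumed EEG sample rate in Hz

def band_power(samples: np.ndarray, lo: float, hi: float, fs: int = FS) -> float:
    """Mean spectral power of the signal between lo and hi Hz."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    band = (freqs >= lo) & (freqs <= hi)
    return float(spectrum[band].mean())

def feedback_volume(samples: np.ndarray) -> float:
    """Map alpha-band (8-12 Hz) power to a 0..1 audio volume via a monotone squash."""
    p = band_power(samples, 8.0, 12.0)
    return float(p / (p + 1.0))

window = np.sin(2 * np.pi * 10 * np.arange(FS) / FS)   # 1 s of synthetic 10 Hz "EEG"
print(round(feedback_volume(window), 3))                # close to 1.0
```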

Egocentric vision or first-person vision is a sub-field of computer vision that entails analyzing images and videos captured by a wearable camera, which is typically worn on the head or on the chest and naturally approximates the visual field of the camera wearer. Consequently, visual data capture the part of the scene on which the user focuses to carry out the task at hand and offer a valuable perspective to understand the user's activities and their context in a naturalistic setting.

References

  1. "Telepointer: Hands-Free Completely Self Contained Wearable Visual Augmented Reality without Headwear and without any Infrastructural Reliance", IEEE International Symposium on Wearable Computing (ISWC00), pp. 177, 2000, Los Alamitos, CA, USA
  2. "WUW – wear Ur world: a wearable gestural interface", Proceedings of CHI EA '09 Extended Abstracts on Human Factors in Computing Systems Pages 4111-4116, ACM New York, NY, USA
  3. IEEE Computer, Vol. 30, No. 2, February 1997, Wearable Computing: A First Step Toward Personal Imaging, pp25-32
  4. "Sensularity with a Sixth Sense", https://blog.metavision.com/professor-steve-mann-society-of-sensularity-with-a-sixth-sense/
  5. "Sixth Sense Technology", International Journal of Science and Research (IJSR), ISSN 2319-7064, https://www.ijsr.net/archive/v3i12/U1VCMTQ1Nzc=.pdf
  6. Kedar Kanel, SIXTH SENSE TECHNOLOGY, 2014, CENTRIA UNIVERSITY OF APPLIED SCIENCES
  7. Wearable, tetherless computer–mediated reality, Steve Mann. February 1996. In Presentation at the American Association of Artificial Intelligence, 1996 Symposium; early draft appears as MIT Media Lab Technical Report 260, December 1994
  8. IEEE Computer, Vol. 30, No. 2, February 1997, Wearable Computing: A First Step Toward Personal Imaging, pp25-32
  9. "IEEE ISWC P. 177" (PDF). Retrieved 2013-10-07.
  10. "Cyborg: Digital Destiny and Human Possibility in the Age of the Wearable Computer", Steve Mann with Hal Niedzviecki, ISBN   0-385-65825-7 (Hardcover), Random House Inc, 304 pages, 2001.
  11. Cyborg, 2001
  12. Geary 2002
  13. An Anatomy of the New Bionic Senses [Hardcover], by James Geary, 2002, 214pp

Further reading