SixthSense is a gesture-based wearable computer system developed at the MIT Media Lab: Steve Mann built a headworn gestural interface in 1994 and 1997 and a neckworn version in 1998, and Pranav Mistry (also at the MIT Media Lab) developed the system further in 2009. Both developed hardware and software for the headworn and neckworn versions. The system comprises a headworn or neck-worn pendant that contains both a data projector and a camera. The headworn versions Mann built at the MIT Media Lab in 1997 combined cameras and illumination systems for interactive photographic art, and also included gesture recognition (e.g. finger-tracking using colored tape on the fingers).
Gesture recognition is a topic in computer science and language technology with the goal of interpreting human gestures via mathematical algorithms. Gestures can originate from any bodily motion or state but commonly originate from the face or hands. Current focuses in the field include emotion recognition from the face and hand gesture recognition. Users can use simple gestures to control or interact with devices without physically touching them. Many approaches have been made using cameras and computer vision algorithms to interpret sign language; however, the identification and recognition of posture, gait, proxemics, and human behaviors are also subjects of gesture recognition techniques. Gesture recognition can be seen as a way for computers to begin to understand human body language, thus building a richer bridge between machines and humans than primitive text user interfaces or even GUIs, which still limit the majority of input to keyboard and mouse. It allows people to interact naturally with machines without any mechanical devices: using gesture recognition, it is possible to point a finger at the screen so that the cursor moves accordingly, which could make conventional input devices such as the mouse and keyboard redundant.
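A common entry point to camera-based gesture recognition, and one mentioned above in connection with SixthSense, is tracking colored markers (tape) on the fingertips. The sketch below is a minimal, illustrative version of that idea, not any actual SixthSense code: each pixel is classified by color distance to a target marker color, and the centroid of the matching pixels is taken as the fingertip position. The frame format (a grid of RGB tuples) and the tolerance value are assumptions made for the example.

```python
def track_marker(frame, target, tolerance=30):
    """Return the (row, col) centroid of pixels near `target` color.

    `frame` is a 2-D grid of (r, g, b) tuples. Returns None if no pixel
    is within `tolerance` of the target color on every channel.
    """
    row_sum, col_sum, count = 0, 0, 0
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if (abs(r - target[0]) <= tolerance and
                    abs(g - target[1]) <= tolerance and
                    abs(b - target[2]) <= tolerance):
                row_sum += y
                col_sum += x
                count += 1
    if count == 0:
        return None  # marker not visible in this frame
    return (row_sum / count, col_sum / count)


# Tiny synthetic 4x4 frame: a 2x2 red "tape" blob in the top-left corner.
BLACK, RED = (0, 0, 0), (255, 0, 0)
frame = [[BLACK] * 4 for _ in range(4)]
frame[0][0] = frame[0][1] = frame[1][0] = frame[1][1] = RED

print(track_marker(frame, RED))  # -> (0.5, 0.5), the blob's centroid
```

A real system would run this per video frame (typically with a vision library rather than pixel loops) and feed the sequence of centroids into a gesture classifier.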
Wearable computers, also known as wearables or body-borne computers, are small computing devices that are worn under, with, or on top of clothing.
The MIT Media Lab is a research laboratory at the Massachusetts Institute of Technology, growing out of MIT's Architecture Machine Group in the School of Architecture. Its research is not restricted to fixed academic disciplines but draws from technology, media, science, art, and design. As of 2014, the Media Lab's research groups include neurobiology, biologically inspired fabrication, socially engaging robots, emotive computing, bionics, and hyperinstruments.
SixthSense is a name for extra information supplied by a wearable computer, such as the devices EyeTap (Mann), Telepointer (Mann), and "WuW" (Wear yoUr World) by Pranav Mistry.
Pranav Mistry is a computer scientist and inventor. He has been President and CEO of Samsung STAR Labs since October 2019. He is best known for his work on SixthSense, Samsung Galaxy Gear, and Project Beyond.
Sixth Sense technology (a camera combined with a light source) was developed in 1997 as a headworn device, and in 1998 as a neckworn object, but the Sixth Sense name for this work was not coined and published until 2001, when Mann coined the term "Sixth Sense" to describe such devices.
Mann referred to this wearable computing technology as affording a "Synthetic Synesthesia of the Sixth Sense", believing that wearable computing and digital information could act in addition to the five traditional senses. Ten years later, Pattie Maes, also of the MIT Media Lab, used the term "Sixth Sense" in this same context in a TED talk.
A sense is a physiological capacity of organisms that provides data for perception. The senses and their operation, classification, and theory are overlapping topics studied by a variety of fields, most notably neuroscience, cognitive psychology, and philosophy of perception. The nervous system has a specific sensory nervous system, and a sense organ, or sensor, dedicated to each sense.
Pattie Maes is a professor in MIT's program in Media Arts and Sciences. She founded and directed the MIT Media Lab's Fluid Interfaces Group. Previously, she founded and ran the Software Agents group. She served for several years as both the head and associate head of the Media Lab's academic program. Prior to joining the Media Lab, Maes was a visiting professor and a research scientist at the MIT Artificial Intelligence Lab. She holds bachelor's and PhD degrees in computer science from the Vrije Universiteit Brussel in Belgium.
Similarly, other inventors have used the term sixth-sense technology to describe new capabilities that augment the traditional five human senses. For example, in U.S. patent no. 9,374,397, Timo Platt et al. refer to their new communications invention as creating a new social and personal sense, i.e., a "metaphorical sixth sense", enabling users (while retaining their privacy and anonymity) to sense and share the "stories" and other attributes and information of those around them.
An image is an artifact that depicts visual perception, such as a photograph or other two-dimensional picture, that resembles a subject—usually a physical object—and thus provides a depiction of it. In the context of signal processing, an image is a distributed amplitude of color(s).
Sousveillance is the recording of an activity by a participant in the activity, typically by way of small wearable or portable personal technologies. The term "sousveillance", coined by Steve Mann, stems from the contrasting French words sur, meaning "above", and sous, meaning "below", i.e. "surveillance" denotes the "eye-in-the-sky" watching from above, whereas "sousveillance" denotes bringing the camera or other means of observation down to human level, either physically, or hierarchically.
An EyeTap is a wearable computing device that is worn in front of the eye that acts as a camera to record the scene available to the eye as well as a display to superimpose computer-generated imagery on the original scene available to the eye. This structure allows the user's eye to operate as both a monitor and a camera as the EyeTap intakes the world around it and augments the image the user sees allowing it to overlay computer-generated data over top of the normal world the user would perceive.
Thad Eugene Starner is a founder and director of the Contextual Computing Group at Georgia Tech's College of Computing, where he is a Full Professor. He is a pioneer of wearable computing as well as human-computer interaction augmented environments and pattern recognition. Starner is a strong advocate of continuous-access, everyday-use systems, and has worn his own customized wearable computer continuously since 1993. His work has touched on handwriting and sign-language analysis, intelligent agents and augmented realities. He also helped found Charmed Technology.
A handheld projector is an image projector in a handheld device. It was developed as a computer display device for compact portable devices such as mobile phones, personal digital assistants, and digital cameras, which have sufficient storage capacity to handle presentation materials but are too small to accommodate a display screen that an audience can see easily. Handheld projectors involve miniaturized hardware, and software that can project digital images onto a nearby viewing surface.
Rosalind Wright Picard is an American scholar who is Professor of Media Arts and Sciences at MIT, founder and director of the Affective Computing Research Group at the MIT Media Lab, and co-founder of the startups Affectiva and Empatica. In 2005, she was named a Fellow of the Institute of Electrical and Electronics Engineers for contributions to image and video analysis and affective computing. In 2019 she received one of the highest professional honors accorded an engineer, election to the National Academy of Engineering for her contributions on affective computing and wearable computing.
In computing, multi-touch is technology that enables a surface to recognize the presence of more than one point of contact with the surface. The origins of multitouch began at CERN, MIT, University of Toronto, Carnegie Mellon University and Bell Labs in the 1970s. Multi-touch was in use as early as 1985. Apple popularized the term "multi-touch" in 2007. Plural-point awareness may be used to implement additional functionality, such as pinch to zoom or to activate certain subroutines attached to predefined gestures.
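The pinch-to-zoom behavior mentioned above illustrates how plural-point awareness translates into functionality: with two contact points tracked simultaneously, the zoom factor can be taken as the ratio of the current distance between the fingers to their distance when the gesture began. The sketch below shows that calculation only; the touch coordinates are invented for illustration, and real multi-touch stacks add gesture detection, smoothing, and event dispatch on top.

```python
import math

def distance(p, q):
    """Euclidean distance between two (x, y) touch points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def pinch_zoom_factor(start_touches, current_touches):
    """Scale factor implied by a two-finger pinch: >1 zooms in, <1 zooms out."""
    if len(start_touches) != 2 or len(current_touches) != 2:
        raise ValueError("a pinch gesture requires exactly two contact points")
    d0 = distance(*start_touches)
    if d0 == 0:
        raise ValueError("initial contact points coincide")
    return distance(*current_touches) / d0

# Fingers start 100 px apart and spread to 200 px apart: zoom in 2x.
print(pinch_zoom_factor([(0, 0), (100, 0)], [(0, 0), (200, 0)]))  # -> 2.0
```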
Microsoft's SenseCam is a lifelogging camera with fisheye lens and trigger sensors, such as accelerometers, heat sensing, and audio, invented by Lyndsay Williams, patent granted in 2009. Usually worn around the neck, Sensecam is used for the MyLifeBits project, a lifetime storage database. Early developers were James Srinivasan and Trevor Taylor.
Humanistic Intelligence (HI) is defined, in the context of wearable computing, by Marvin Minsky, Ray Kurzweil, and Steve Mann, as follows:
Humanistic Intelligence [HI] is intelligence that arises because of a human being in the feedback loop of a computational process, where the human and computer are inextricably intertwined. When a wearable computer embodies HI and becomes so technologically advanced that its intelligence matches our own biological brain, something much more powerful emerges from this synergy that gives rise to superhuman intelligence within the single “cyborg” being.
A lifelog is a personal record of one’s daily life in a varying amount of detail, for a variety of purposes. The record contains a comprehensive dataset of a human's activities. The data could be used to increase knowledge about how people live their lives. In recent years, some lifelog data has been automatically captured by wearable technology or mobile devices. People who keep lifelogs about themselves are known as lifeloggers.
A cyborg, short for "cybernetic organism", is a being with both organic and biomechatronic body parts. The term was coined in 1960 by Manfred Clynes and Nathan S. Kline.
In computing, a natural user interface, or NUI, or natural interface is a user interface that is effectively invisible, and remains invisible as the user continuously learns increasingly complex interactions. The word natural is used because most computer interfaces use artificial control devices whose operation has to be learned.
The Mouseless is a proposed input device for personal computers, with a prototype designed by Pranav Mistry of the MIT Media Lab. Mouseless replaces the conventional hardware mouse with an infrared laser strobe, an infrared camera, and image recognition software. The laser beam is optically split into a wide beam illuminating an imaginary plane above the working desk. The camera captures the pattern of invisible infrared light as it illuminates the user's hand. The user rests a palm on the desk and commands the system in the same way as he or she would with a conventional mouse.
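For a system like Mouseless to drive a cursor, the hand position detected in the infrared camera's image must be mapped into screen coordinates. The sketch below shows the simplest such mapping, a linear scaling with clamping to the screen bounds; the camera and display resolutions are illustrative assumptions, not specifications of the actual prototype.

```python
CAM_W, CAM_H = 320, 240          # assumed IR camera resolution
SCREEN_W, SCREEN_H = 1920, 1080  # assumed display resolution

def camera_to_screen(cx, cy):
    """Map a hand-blob centroid in camera pixels to a screen cursor position."""
    sx = cx * SCREEN_W / CAM_W
    sy = cy * SCREEN_H / CAM_H
    # Clamp so the cursor never leaves the visible screen area.
    sx = min(max(sx, 0), SCREEN_W - 1)
    sy = min(max(sy, 0), SCREEN_H - 1)
    return (round(sx), round(sy))

print(camera_to_screen(160, 120))  # hand at camera center -> (960, 540)
```

A practical system would also smooth the trajectory across frames and calibrate the mapping to the user's working area on the desk.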
Telepointer is a neck-worn gestural interface system developed by MIT Media Lab student Steve Mann in 1998. Mann originally referred to the device as "Synthetic Synesthesia of the Sixth Sense". In the 1990s and early 2000s Mann used this project as a teaching example at the University of Toronto.
Jocelyn Scheirer is an American entrepreneur, scientist, and artist who has been working in wearable technology since the late 1990s. Her research focuses on affective computing, which she pursued during her PhD work (pending) in the MIT Media Lab's Affective Computing Group with Rosalind Picard. Scheirer invented and, along with MIT, patented the Galvactivator glove, which measured skin conductance through sensors on the palm and relayed the varying intensity through an LED display. She founded the intercommunication equipment and systems company Empathyx, Inc. in 2006 and co-founded the emotional analytics company Affectiva in 2009, serving as their director of operations until 2010. Scheirer has also created several visual and performance art pieces that have been featured in several galleries in Massachusetts, including the MIT Museum, the Galatea Fine Art Gallery, and the Bromfield Gallery. She currently serves as CEO of the wearable company Bionolux Labs, LLC.
Egocentric vision or first-person vision is a sub-field of computer vision that entails analyzing images and videos captured by a wearable camera, which is typically worn on the head or on the chest and naturally approximates the visual field of the camera wearer. Consequently, visual data capture the part of the scene on which the user focuses to carry out the task at hand and offer a valuable perspective to understand the user's activities and their context in a naturalistic setting.
Sajid Sadi is a computer scientist and inventor. He is the Vice President of Research and Head of the Think Tank Team at Samsung Research America. He is best known for his work on Samsung Galaxy Gear, Samsung Gear 360 Round and Samsung Bot Chef.