EyeTap

Man wearing a one-eyed injection-molded EyeTap
EyeTap inventor Steve Mann wearing a metal frame Laser EyeTap (computer-controlled laser light source run from "GlassEye" camera)

An EyeTap [1] [2] [3] is a concept for a wearable computing device worn in front of the eye that acts both as a camera, recording the scene available to the eye, and as a display, superimposing computer-generated imagery on that same scene. [3] [4] This structure allows the user's eye to operate as both a monitor and a camera: the EyeTap takes in the world around the user and augments what the user sees by overlaying computer-generated data on the normal view of the world.

To capture what the eye sees as accurately as possible, an EyeTap uses a beam splitter [5] to send the same scene (with reduced intensity) to both the eye and a camera. The camera digitizes the reflected image of the scene and sends it to a computer. The computer processes the image and sends it to a projector, which projects the image onto the other side of the beam splitter so that this computer-generated image is reflected into the eye and superimposed on the original scene. Stereo EyeTaps modify the light passing through both eyes, but many research prototypes tap only one eye, mainly for ease of construction.
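
Functionally, this is a per-frame pipeline: capture the scene, compute an augmented image, and send the result back toward the eye. The following sketch illustrates that loop in Python with OpenCV, using an ordinary webcam as a stand-in for the EyeTap camera and a desktop window as a stand-in for the projector; the overlay text and window name are illustrative assumptions, not part of any actual EyeTap implementation.

```python
import cv2  # OpenCV: a webcam stands in for the EyeTap camera

def run_eyetap_loop(camera_index: int = 0) -> None:
    """Capture -> process -> display loop, mimicking camera -> computer -> projector."""
    cap = cv2.VideoCapture(camera_index)
    if not cap.isOpened():
        raise RuntimeError("No camera available")
    try:
        while True:
            ok, frame = cap.read()  # scene as digitized by the camera
            if not ok:
                break
            # "Mediation" step: here just a text overlay; a real EyeTap could
            # add tracking boxes, measurements, or other computer-generated data.
            cv2.putText(frame, "EyeTap overlay (demo)", (10, 30),
                        cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
            cv2.imshow("projector stand-in", frame)  # window stands in for the projector
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()

if __name__ == "__main__":
    run_eyetap_loop()
```

On real EyeTap hardware the processing step would also register the generated imagery to the scene so that the augmentation stays aligned with what the eye sees.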

EyeTap is also the name of an organization founded by inventor Steve Mann [6] [7] [8] [9] to develop and promote EyeTap-related technologies such as wearable computers. [4] [10]

Possible uses

Inventor Steve Mann using weather-resistant EyeTap together with a hydraulophone

An EyeTap is somewhat like a head-up display (HUD). The important difference is that the scene available to the eye is also available to the computer that projects the head-up display, which enables the EyeTap to modify the computer-generated scene in response to the natural scene. One use, for instance, would be a sports EyeTap: in a stadium, the wearer could follow a particular player on the field and have the EyeTap display statistics relevant to that player as a floating box above the player. Another practical use would be on a construction site, where the EyeTap could let the user compare blueprints, even in a 3D manner, with the current state of the building, display a list of materials and their current locations, and perform basic measurements. In the business world, the EyeTap also has potential, since it could deliver constantly updated information on the stock market, the user's corporation, and meeting statuses. On a more day-to-day basis, some of Steve Mann's first uses for the technology were keeping track of names of people and places, his to-do lists, and other daily tasks. [11] The EyeTap Criteria[clarification needed] are an attempt to define how close a real, practical device comes to such an ideal. EyeTaps could have great use in any field where the user would benefit from real-time, interactive information that is largely visual in nature. This is sometimes referred to as computer-mediated reality, [12] [13] commonly known as augmented reality. [14]
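
The sports example amounts to anchoring computer-generated annotations to an object's position in the camera frame. Below is a minimal sketch of that idea in Python with OpenCV, assuming a hypothetical tracker has already produced a bounding box for the player; the player name, statistics, and bounding box are illustrative placeholders, not part of any published EyeTap software.

```python
import numpy as np
import cv2

def draw_player_overlay(frame, bbox, label):
    """Draw a floating stats box anchored above a tracked player's bounding box.

    bbox is (x, y, w, h) in pixel coordinates; label is the text to float above it.
    """
    x, y, w, h = bbox
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)  # player box
    (tw, th), _ = cv2.getTextSize(label, cv2.FONT_HERSHEY_SIMPLEX, 0.6, 1)
    top = max(y - th - 10, 0)  # keep the floating box on screen
    cv2.rectangle(frame, (x, top), (x + tw + 8, top + th + 8), (0, 0, 0), -1)
    cv2.putText(frame, label, (x + 4, top + th + 2),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (255, 255, 255), 1)
    return frame

# Usage with a synthetic frame and a hand-picked bounding box:
frame = np.zeros((480, 640, 3), dtype=np.uint8)
draw_player_overlay(frame, (250, 200, 60, 120), "Player 9: 12 pts, 3 ast")
```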

EyeTap has been explored as a potential tool for individuals with visual disabilities because of its ability to direct visual information to parts of the retina that function well. [15] EyeTap's role in sousveillance has also been explored by Mann, Jason Nolan and Barry Wellman. [16] [17] [18]

Possible side effects

Users may experience side effects such as headaches, as well as difficulty sleeping if the device is used shortly before sleep.[citation needed] Mann finds that, due to his extensive use of the device, he feels "nauseous, unsteady, naked" when he removes it. [2]

Cyborglogs & EyeTaps

The EyeTap has applications in the world of cyborg logging, as it allows the user to capture their daily life visually, in real time, from their own point of view. In this way, the EyeTap could be used to create a lifelong cyborg log, or "glog", of the user's life and the events they participate in, potentially recording enough media to allow producers centuries in the future to present the user's life as interactive entertainment (or historical education) to consumers of that era.

History

Steve Mann created the first version of the EyeTap, which consisted of a computer carried in a backpack wired to a camera and its viewfinder, which in turn was rigged to a helmet. Since that first version, the EyeTap has gone through multiple models as wearable computing has evolved, allowing it to shrink into smaller and lighter forms.

The current EyeTap consists of an eyepiece used to display the images, a keypad through which the user interfaces with the EyeTap and directs it to perform the desired tasks, a CPU that can be attached to most articles of clothing, and in some cases a Wi-Fi device so the user can access the Internet and online data.
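
As a rough way to picture how these parts fit together, the sketch below models the component list as a simple data structure; the field names and example values are illustrative assumptions rather than a specification of actual EyeTap hardware.

```python
from dataclasses import dataclass

@dataclass
class EyeTapConfig:
    """Illustrative model of the components described above (not a real spec)."""
    eyepiece: str      # display that presents images to the eye
    input_device: str  # keypad used to interface with the device
    cpu: str           # body-worn processing unit, attachable to clothing
    wifi: bool = False # optional Wi-Fi link for Internet access

# Example instance, with placeholder component names:
config = EyeTapConfig(
    eyepiece="beam-splitter eyepiece",
    input_device="wearable keypad",
    cpu="clothing-mounted computer",
    wifi=True,
)
```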

Principle of operation

The EyeTap is essentially a half-silvered mirror placed in front of the user's eye, reflecting some of the incoming light into a sensor. The sensor sends the image to the aremac, a display device capable of presenting the image at any appropriate focal depth. The output rays from the aremac are reflected off the half-silvered mirror back into the eye of the user, along with the original light rays.
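
Under the idealized assumption of a lossless half-silvered mirror with reflectance r, the light budget works out simply: a fraction r of each scene ray is diverted to the sensor, the remaining 1 - r passes through to the eye, and a fraction r of the aremac's output is reflected into the eye along the same path. The short sketch below works through that arithmetic for a 50/50 mirror; the reflectance and intensity values are illustrative only.

```python
def eyetap_light_budget(scene, aremac, r=0.5):
    """Idealized intensity bookkeeping for a lossless half-silvered mirror.

    scene  -- intensity of an incoming scene ray
    aremac -- intensity of the corresponding ray emitted by the aremac
    r      -- fraction of light the mirror reflects (0..1)
    Returns (intensity reaching the sensor, intensity reaching the eye).
    """
    to_sensor = r * scene                  # diverted toward the camera
    to_eye = (1 - r) * scene + r * aremac  # transmitted scene light plus reflected aremac light
    return to_sensor, to_eye

# With a 50/50 mirror, a scene ray of intensity 1.0 and an aremac ray of 0.8:
print(eyetap_light_budget(1.0, 0.8))  # (0.5, 0.9)
```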

A conceptual diagram of an EyeTap viewing infrared light, showing the overall design schematic of how the EyeTap manipulates light rays. [19]

Components

CCD (charge-coupled device) cameras are the most common type of digital camera used today.[citation needed]

See also

Related Research Articles

<span class="mw-page-title-main">Wearable computer</span> Small computing device worn on the body

A wearable computer, also known as a body-borne computer, is a computing device worn on the body. The definition of 'wearable computer' may be narrow or broad, extending to smartphones or even ordinary wristwatches.

<span class="mw-page-title-main">Augmented reality</span> View of the real world with computer-generated supplementary features

Augmented reality (AR) is an interactive experience that combines the real world and computer-generated content. The content can span multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory. AR can be defined as a system that incorporates three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects. The overlaid sensory information can be constructive, or destructive. This experience is seamlessly interwoven with the physical world such that it is perceived as an immersive aspect of the real environment. In this way, augmented reality alters one's ongoing perception of a real-world environment, whereas virtual reality completely replaces the user's real-world environment with a simulated one.

<span class="mw-page-title-main">Steve Mann (inventor)</span> Professor and wearable computing researcher

William Stephen George Mann is a Canadian engineer, professor, and inventor who works in augmented reality, computational photography, particularly wearable computing, and high-dynamic-range imaging. Mann is sometimes labeled the "Father of Wearable Computing" for early inventions and continuing contributions to the field. He cofounded InteraXon, makers of the Muse brain-sensing headband, and is also a founding member of the IEEE Council on Extended Intelligence (CXI). Mann is currently CTO and cofounder at Blueberry X Technologies and Chairman of MannLab. Mann was born in Canada, and currently lives in Toronto, Canada, with his wife and two children. In 2023, Mann unsuccessfully ran for mayor of Toronto.

<span class="mw-page-title-main">Stereoscopy</span> Technique for creating or enhancing the illusion of depth in an image

Stereoscopy is a technique for creating or enhancing the illusion of depth in an image by means of stereopsis for binocular vision. The word stereoscopy derives from Greek στερεός (stereos) 'firm, solid', and σκοπέω (skopeō) 'to look, to see'. Any stereoscopic image is called a stereogram. Originally, stereogram referred to a pair of stereo images which could be viewed using a stereoscope.

<span class="mw-page-title-main">Sousveillance</span> Recording of an activity by a participant

Sousveillance is the recording of an activity by a member of the public, rather than a person or organisation in authority, typically by way of small wearable or portable personal technologies. The term, coined by Steve Mann, stems from the contrasting French words sur, meaning "above", and sous, meaning "below", i.e. "surveillance" denotes the "eye-in-the-sky" watching from above, whereas "sousveillance" denotes bringing the means of observation down to human level, either physically or hierarchically.

<span class="mw-page-title-main">Computer-mediated reality</span> Ability to manipulate ones perception of reality through the use of a computer

Computer-mediated reality refers to the ability to add to, subtract information from, or otherwise manipulate one's perception of reality through the use of a wearable computer or hand-held device such as a smartphone.

A keyer is an electronic device used for signaling by hand, by way of pressing one or more switches. The technical term keyer has two very similar but distinct meanings: one for telegraphy and the other for accessory devices built for computer-human communication.

<span class="mw-page-title-main">Head-mounted display</span> Type of display device

A head-mounted display (HMD) is a display device, worn on the head or as part of a helmet, that has a small display optic in front of one or each eye. An HMD has many uses including gaming, aviation, engineering, and medicine. Virtual reality headsets are HMDs combined with IMUs. There is also an optical head-mounted display (OHMD), which is a wearable display that can reflect projected images and allows a user to see through it.

<span class="mw-page-title-main">Virtual retinal display</span> Display technology

A virtual retinal display (VRD), also known as a retinal scan display (RSD) or retinal projector (RP), is a display technology that draws a raster display directly onto the retina of the eye.

Equiveillance is a state of equilibrium, or a desire to attain a state of equilibrium, between surveillance and sousveillance. It is sometimes confused with transparency. The balance (equilibrium) provided by equiveillance allows individuals to construct their own cases from evidence they gather themselves, rather than merely having access to surveillance data that could possibly incriminate them.

High dynamic range (HDR), also known as wide dynamic range, extended dynamic range, or expanded dynamic range, is a dynamic range higher than usual.

<span class="mw-page-title-main">Lifestreaming</span> Act of documenting and sharing aspects of ones daily experiences online

Lifestreaming is an act of documenting and sharing aspects of one's daily experiences online, via a lifestream website that publishes things of a person's choosing.

<span class="mw-page-title-main">Lifelog</span> Personal record of ones daily life

A lifelog is a personal record of one's daily life in a varying amount of detail, for a variety of purposes. The record contains a comprehensive dataset of a human's activities. The data could be used to increase knowledge about how people live their lives. In recent years, some lifelog data has been automatically captured by wearable technology or mobile devices. People who keep lifelogs about themselves are known as lifeloggers.

<span class="mw-page-title-main">Cyborg</span> Being with both organic and biomechatronic body parts

A cyborg —a portmanteau of cybernetic and organism—is a being with both organic and biomechatronic body parts. The term was coined in 1960 by Manfred Clynes and Nathan S. Kline. In contrast to biorobots and androids, the term cyborg applies to a living organism that has restored function or enhanced abilities due to the integration of some artificial component or technology that relies on feedback.

<span class="mw-page-title-main">SixthSense</span> Gesture-based wearable computer system

SixthSense is a gesture-based wearable computer system developed at the MIT Media Lab by Steve Mann in 1994, 1997, and 1998, and further developed by Pranav Mistry in 2009; both developed hardware and software for head-worn and neck-worn versions of it. It comprises a head-worn or neck-worn pendant that contains both a data projector and a camera. Head-worn versions built at the MIT Media Lab in 1997 combined cameras and illumination systems for interactive photographic art, and also included gesture recognition.

In computing, a natural user interface (NUI) or natural interface is a user interface that is effectively invisible, and remains invisible as the user continuously learns increasingly complex interactions. The word "natural" is used because most computer interfaces use artificial control devices whose operation has to be learned. Examples include voice assistants such as Alexa and Siri, touch and multitouch interactions on today's mobile phones and tablets, and touch interfaces invisibly integrated into textiles and furniture.

<span class="mw-page-title-main">Golden-i</span>

The Golden-i platform consists of multiple mobile wireless wearable headset computers operated by voice commands and head movements. It was developed at Kopin Corporation by a team led by Jeffrey Jacobsen, chief Golden-i architect and senior advisor to the CEO. Utilizing a speech-controlled user interface and head-tracking functionality, Golden-i enables the user to carry out common computer functions while keeping their hands free.

<span class="mw-page-title-main">Smartglasses</span> Wearable computers glasses

Smartglasses or smart glasses are eye or head-worn wearable computers that offer useful capabilities to the user. Many smartglasses include displays that add information alongside or to what the wearer sees. Alternatively, smartglasses are sometimes defined as glasses that are able to change their optical properties, such as smart sunglasses that are programmed to change tint by electronic means. Alternatively, smartglasses are sometimes defined as glasses that include headphone functionality.

Cyborg data mining is the practice of collecting data produced by an implantable device that monitors bodily processes for commercial interests. Whereas an android is a human-like robot, a cyborg is an organism whose physiological functioning is aided by or dependent upon a mechanical or electronic device that relies on some sort of feedback.

Egocentric vision or first-person vision is a sub-field of computer vision that entails analyzing images and videos captured by a wearable camera, which is typically worn on the head or on the chest and naturally approximates the visual field of the camera wearer. Consequently, visual data capture the part of the scene on which the user focuses to carry out the task at hand and offer a valuable perspective to understand the user's activities and their context in a naturalistic setting.

References

  1. "Seattle band has already benefited by using ringtone". The Seattle Times. 18 April 2005. Retrieved 3 September 2009.
  2. 1 2 "Why life as a cyborg is better". Daily Times. 19 January 2004. Retrieved 3 September 2009.
  3. 1 2 Bergstein, Brian (12 January 2004). "Professor's 25 years of cyborg life mirrors tech advances". USA Today. Associated Press . Retrieved 2 September 2009.
  4. 1 2 "The ultimate wearable computer". USA Today. 25 June 2001. Retrieved 3 September 2009.
  5. Grieser, Andy (20 June 2001). "Now computers are built to suit Wearable technology has a few wrinkles, but usage is expanding". Chicago Tribune. Retrieved 3 September 2009.
  6. "Being Steve Mann: Cyberwear pioneer alters his reality". The Atlanta Journal-Constitution . 26 March 2000. p. A17. Retrieved 3 September 2009.
  7. Brad King (12 March 2002). "Part Man, Part Film, All Mann". Wired.com. Retrieved 3 September 2009.
  8. "Cyborg genius claims he's the next step in human evolution". The Jamaica Observer . 3 September 2009. Archived from the original on 4 March 2016. Retrieved 24 February 2013.
  9. Schechter, Bruce (25 September 2001). "SCIENTIST AT WORK: STEVE MANN; Real-Life Cyborg Challenges Reality With Technology". The New York Times. Retrieved 3 September 2009.
  10. Shinn, Eric (8 July 2001). "Part man, part machine – all nerd ; 'Wearable computer' pioneer Steve Mann keeps one eye locked on the future". Toronto Star. Retrieved 3 September 2009.
  11. Mann, Steve; James Fung; Chris Aimone; Anurag Sehgal; Daniel Chen. "Designing EyeTap Digital Eyeglasses for Continuous Lifelong Capture and Sharing of Personal Experiences" (PDF). EyeTap Personal Imaging Lab. Retrieved 14 March 2012.
  12. McCullagh, Declan (27 August 2003). "Newsmaker: Cyborgs unite!". CNET News. Retrieved 6 September 2009.
  13. Brian, Bergstein (15 January 2004). "Computer's eye view". CJOnline.com. Retrieved 3 September 2009.
  14. McCullagh, Declan (26 October 2004). "Snap photo first, answer questions later". CNET News. Retrieved 6 September 2009.
  15. Nolan, Jason. "Blind photographer Arun Blake consulting on Eyetap and blindness in 2004". Lemmingworks.org. Retrieved 6 September 2009.
  16. Butler, Don (5 February 2009). "Part VI: Everyone's watching". Ottawa Citizen. Retrieved 2 September 2009.
  17. Mann, Steve; Jason Nolan; Barry Wellman (2003). "Sousveillance: Inventing and Using Wearable Computing Devices for Data Collection in Surveillance Environments" (PDF). Surveillance & Society. 1 (3). ISSN 1477-7487. Archived from the original (PDF) on 25 March 2009. Retrieved 6 September 2009.
  18. Haines, Lester (15 January 2004). "Captain Cyborg faces Canadian challenge". The Register. Retrieved 2 September 2009.
  19. "EyeTap: The eye itself as display and camera". EyeTap Personal Imaging Lab. 2004. Retrieved 14 March 2012.