Hands-free computing

Hands-free computing is any computer configuration in which a user can interact without using their hands, which most human interface devices, such as the mouse and keyboard, otherwise require. Hands-free computing is important because it is useful to both able-bodied and disabled users. Speech recognition systems can be trained to recognize specific commands, and, upon confirmation of correctness, instructions can be given to systems without the use of hands. This may be useful while driving, or to an inspector or engineer in a factory environment. Likewise, disabled persons may find hands-free computing important in their everyday lives, just as visually impaired people have found computers useful in theirs. [1]
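The command-and-confirm pattern described above can be sketched as a small dispatcher. This is a minimal illustration, not any particular product's behavior: the command phrases and actions are invented, and a real system would feed the dispatcher transcripts from a speech recognition engine.

```python
# Hypothetical command dispatcher for a hands-free speech interface.
# The phrases and actions below are illustrative; a real system would
# pass in transcripts produced by a speech recognition engine.

COMMANDS = {
    "open report": lambda: "report opened",
    "next page": lambda: "advanced one page",
    "stop": lambda: "stopped",
}

def handle_utterance(utterance, confirm):
    """Execute a spoken command only after the user confirms it.

    `confirm` is a callable that asks the user to verify the
    recognized text (for example by saying "yes") and returns
    True or False; here it is simulated with a lambda.
    """
    action = COMMANDS.get(utterance.strip().lower())
    if action is None:
        return "unrecognized command"
    if not confirm(utterance):
        return "cancelled"
    return action()

# Simulate one confirmed and one rejected recognition.
print(handle_utterance("Next Page", confirm=lambda text: True))    # advanced one page
print(handle_utterance("open report", confirm=lambda text: False)) # cancelled
```

The confirmation step mirrors the "confirmation of correctness" mentioned above: it guards against misrecognized speech triggering an unintended action.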

This can range from using the tongue, lips, mouth, or movement of the head, to voice-activated interfaces that use speech recognition software with a microphone or Bluetooth technology.

Examples of available hands-free computing devices include mouth-operated joystick types such as the TetraMouse, the QuadJoy, the Jouse2, the QuadStick, and the IntegraMouse; camera-based head tracking systems such as SmartNav, Tracker Pro, FreeTrack, HeadMouse Extreme, HeadMaster, KinesicMouse [2] and Smyle Mouse [3]; and speech recognition specialized for disabilities, such as Voice Finger. The joystick types require no physical connections to the user and enhance the user's feeling of independence. Camera types require targets mounted on the user, usually with the help of a caregiver, that are sensed by the camera and associated software. Camera types are sensitive to ambient lighting; the mouse pointer may drift, and head movements not intended as mouse movements can cause inaccuracies. Other examples of hands-free mice are units operated using switches that may be actuated by the feet (or other parts of the body), such as the NoHands Mouse and the switch-adapted TetraMouse. Speech recognition specialized for disabilities and hands-free computing focuses more on low-level control of the keyboard and mouse than on usual areas such as dictation.
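One common mitigation for the pointer drift described above is a dead zone: head offsets below a threshold are ignored, so small unintentional movements do not move the pointer. The sketch below is a generic illustration of that idea, not the algorithm of any product named above; the threshold and gain values are assumptions.

```python
# Minimal sketch of mapping a tracked head offset to pointer motion,
# with a dead zone that suppresses small unintentional movements.
# DEAD_ZONE and GAIN are illustrative values, not from any real device.

DEAD_ZONE = 0.05   # normalized head offset below which no motion is sent
GAIN = 400         # pixels of pointer travel per unit of head offset

def head_to_pointer(dx, dy, dead_zone=DEAD_ZONE, gain=GAIN):
    """Convert a normalized head offset (dx, dy) into pointer deltas."""
    def axis(v):
        if abs(v) < dead_zone:
            return 0
        # Re-zero at the dead-zone edge so motion ramps up smoothly.
        return round((v - dead_zone * (1 if v > 0 else -1)) * gain)
    return axis(dx), axis(dy)

print(head_to_pointer(0.02, 0.01))   # (0, 0): within the dead zone
print(head_to_pointer(0.15, -0.25))  # (40, -80): deliberate movement
```

Re-zeroing at the dead-zone edge avoids a sudden pointer jump the moment the offset crosses the threshold.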

Related Research Articles

Assistive technology: Assistive devices for people with disabilities

Assistive technology (AT) is a term for assistive, adaptive, and rehabilitative devices for people with disabilities and the elderly. Disabled people often have difficulty performing activities of daily living (ADLs) independently, or even with assistance. ADLs are self-care activities that include toileting, mobility (ambulation), eating, bathing, dressing, grooming, and personal device care. Assistive technology can ameliorate the effects of disabilities that limit the ability to perform ADLs. Assistive technology promotes greater independence by enabling people to perform tasks they were formerly unable to accomplish, or had great difficulty accomplishing, by providing enhancements to, or changing methods of interacting with, the technology needed to accomplish such tasks. For example, wheelchairs provide independent mobility for those who cannot walk, while assistive eating devices can enable people who cannot feed themselves to do so. Due to assistive technology, disabled people have an opportunity of a more positive and easygoing lifestyle, with an increase in "social participation," "security and control," and a greater chance to "reduce institutional costs without significantly increasing household expenses." In schools, assistive technology can be critical in allowing students with disabilities to access the general education curriculum. Students who experience challenges writing or keyboarding, for example, can use voice recognition software instead. Assistive technologies assist people who are recovering from strokes and people who have sustained injuries that affect their daily tasks.

Computer mouse: Pointing device used to control a computer

A computer mouse is a hand-held pointing device that detects two-dimensional motion relative to a surface. This motion is typically translated into the motion of a pointer on a display, which allows a smooth control of the graphical user interface of a computer.

The graphical user interface (GUI) is a form of user interface that allows users to interact with electronic devices through graphical icons and audio indicators such as primary notation, instead of text-based UIs, typed command labels, or text navigation. GUIs were introduced in reaction to the perceived steep learning curve of CLIs, which require commands to be typed on a computer keyboard.

Pointing device: Human interface device for computers

A pointing device is a human interface device that allows a user to input spatial data to a computer. CAD systems and graphical user interfaces (GUI) allow the user to control and provide data to the computer using physical gestures by moving a hand-held mouse or similar device across the surface of the physical desktop and activating switches on the mouse. Movements of the pointing device are echoed on the screen by movements of the pointer and other visual changes. Common gestures are point and click and drag and drop.

Game controller: Device used with games or entertainment systems

A game controller, gaming controller, or simply controller, is an input device used with video games or entertainment systems to provide input to a video game, typically to control an object or character in the game. Before the seventh generation of video game consoles, plugging a controller into one of a console's controller ports was the primary means of using a game controller; since then, they have largely been replaced by wireless controllers, which do not require controller ports on the console but are battery-powered. USB game controllers can also be connected to a computer with a USB port. Input devices that have been classified as game controllers include keyboards, mice, gamepads, and joysticks. Special-purpose devices, such as steering wheels for driving games and light guns for shooting games, are also game controllers.

Accessibility: Design of products, services, and environments for usability by disabled people

Accessibility is the design of products, devices, services, vehicles, or environments so as to be usable by people with disabilities. The concept of accessible design and practice of accessible development ensures both "direct access" and "indirect access" meaning compatibility with a person's assistive technology.

Computer accessibility refers to the accessibility of a computer system to all people, regardless of disability type or severity of impairment. The term accessibility is most often used in reference to specialized hardware or software, or a combination of both, designed to enable the use of a computer by a person with a disability or impairment. Computer accessibility often has direct positive effects on people with disabilities.

Screen magnifier

A screen magnifier is software that interfaces with a computer's graphical output to present enlarged screen content. By enlarging part of a screen, people with visual impairments can better see words and images. This type of assistive technology is useful for people with some functional vision; people with visual impairments and little or no functional vision usually use a screen reader.

Gesture recognition: Topic in computer science and language technology

Gesture recognition is a topic in computer science and language technology with the goal of interpreting human gestures via mathematical algorithms. It is a subdiscipline of computer vision. Gestures can originate from any bodily motion or state, but commonly originate from the face or hand. Current focuses in the field include emotion recognition from the face and hand gesture recognition. Users can use simple gestures to control or interact with devices without physically touching them. Many approaches have been made using cameras and computer vision algorithms to interpret sign language, but the identification and recognition of posture, gait, proxemics, and human behaviors is also the subject of gesture recognition techniques. Gesture recognition can be seen as a way for computers to begin to understand human body language, building a better bridge between machines and humans than older text user interfaces or even GUIs, which still limit the majority of input to keyboard and mouse; it allows users to interact naturally without any mechanical devices.
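The basic idea of mapping tracked motion to a symbolic gesture can be shown with a toy classifier. This is a deliberately simplified sketch: real systems use cameras and far richer models, and the swipe labels, threshold, and coordinate convention here are assumptions for illustration only.

```python
# Toy gesture recognition: classify a tracked hand trajectory as a
# left/right/up/down swipe from its net displacement. The 50-unit
# travel threshold is an arbitrary illustrative value.

def classify_swipe(points, min_travel=50):
    """points: list of (x, y) positions sampled over time."""
    if len(points) < 2:
        return "none"
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if max(abs(dx), abs(dy)) < min_travel:
        return "none"                      # too small to be deliberate
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"      # screen y grows downward

print(classify_swipe([(0, 0), (40, 5), (120, 10)]))  # right
print(classify_swipe([(0, 0), (3, -2)]))             # none
```

Even this crude example shows the two halves of the problem the paragraph describes: tracking a motion over time, then mapping it to a discrete command.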

A voice-user interface (VUI) makes spoken human interaction with computers possible, using speech recognition to understand spoken commands and answer questions, and typically text to speech to play a reply. A voice command device (VCD) is a device controlled with a voice user interface.

Apricot Portable

The Apricot Portable was a personal computer manufactured by ACT Ltd. and released to the public in November 1984. It was ACT's first attempt at manufacturing a portable computer, a category that was gaining popularity at the time. Compared to other portable computers of its era, such as the Compaq Portable and the Commodore SX-64, the Apricot Portable was the first system to have an 80-column, 25-line LCD screen and the first with a speech recognition system.

A text entry interface or text entry device is an interface that is used to enter text information in an electronic device. A commonly used device is a mechanical computer keyboard. Most laptop computers have an integrated mechanical keyboard, and desktop computers are usually operated primarily using a keyboard and mouse. Devices such as smartphones and tablets mean that interfaces such as virtual keyboards and voice recognition are becoming more popular as text entry systems.

FreeTrack

FreeTrack is a general-purpose optical motion tracking application for Microsoft Windows, released under the GNU General Public License, that can be used with common inexpensive cameras. Its primary focus is head tracking with uses in virtual reality, simulation, video games, 3D modeling, computer aided design and general hands-free computing to improve computer accessibility. Tracking can be made sensitive enough that only small head movements are required so that the user's eyes never leave the screen.

In computing, 3D interaction is a form of human-machine interaction where users are able to move and perform interaction in 3D space. Both human and machine process information where the physical position of elements in the 3D space is relevant.

Input device: Provides data and signals to a computer

In computing, an input device is a piece of equipment used to provide data and control signals to an information processing system, such as a computer or information appliance. Examples of input devices include keyboards, mice, scanners, cameras, joysticks, and microphones.

Vancouver Adapted Music Society (VAMS) is a not-for-profit organization that encourages, supports and promotes musicians with physical disabilities in Vancouver, Canada.

Microsoft Tablet PC

Microsoft Tablet PC is a term coined by Microsoft for tablet computers conforming to hardware specifications devised by Microsoft and announced in 2001 for a pen-enabled personal computer, running a licensed copy of the Windows XP Tablet PC Edition operating system or a derivative thereof.

Golden-i

The Golden-i platform consists of multiple mobile wireless wearable headset computers operated by voice commands and head movements. It was developed at Kopin Corporation by a team led by Jeffrey Jacobsen, chief Golden-i architect and senior advisor to the CEO. Utilizing a speech-controlled user interface and head-tracking functionality, Golden-i enables the user to carry out common computer functions while keeping their hands free.

References

  1. Burgy, Christian (March 17–18, 2005). "Speech-Controlled Wearables: Are We There, Yet?". 2nd International Forum on Applied Wearable Computing. Switzerland: VDE VERLAG. pp. 17–27. Retrieved June 12, 2012.
  2. KinesicMouse: a Kinect-based camera mouse for hands-free computing that detects more than 50 facial expressions. http://kinesicmouse.com
  3. Smyle Mouse: head-mouse software for hands-free computer control via face gestures captured by a webcam. https://smylemouse.com

See also