Tactile technology

Tactile technology is the integration of multi-sensory triggers within physical objects, allowing "real world" interactions with technology. It is similar to haptic technology, as both focus on touch interactions with technology, but whereas haptic is simulated touch, tactile is physical touch. Rather than using a digital interface to interact with the physical world, as augmented reality does, tactile technology involves a physical interaction that triggers a digital response.

Benefits

The word "tactile" means "related to the sense of touch" [1] or "that can be perceived by the touch; tangible". [2] Touch is incredibly important to human communication and learning, but increasingly, most of the content people interact with is purely visual. Tactile technology presents a way to use advances in technology and combined with touch.

Tactile learning is an approach that engages a learner's sense of touch to explore and understand the world around them. It is grounded in the understanding that sensory experiences are crucial to cognitive development, particularly in the early years. This learning style involves the direct handling and manipulation of objects, allowing learners to experience concepts with their hands as well as their minds. [3]

Studies indicate that humans work and learn better in multi-sensory environments. According to multisensory learning theory, something as simple as having toys (like the fidget spinner) in the workplace, or using physical props to teach children in schools, can have a significant impact on productivity and information retention.

As stated in one article, "Many teachers are turning to tactile learning and evolving technologies as a way to engage students across different learning styles and needs. As part of a multi-sensory learning approach, tactile technology can help students across a range of skill development areas and a broad range of subjects". [4] [ better source needed ]

Implementations

Buttons

At the simplest level, a physical trigger for a technological reaction is nothing new: it can be as basic as a button or a switch.
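
As an illustrative sketch of this trigger-response pattern (not tied to any particular product), the following assumes a Raspberry Pi with the gpiozero library and a momentary push button wired to GPIO pin 17; the pin number and the response are placeholders:

```python
# A physical press (tactile input) firing a digital response.
# Assumes a Raspberry Pi with gpiozero installed and a momentary
# button wired between GPIO 17 and ground.
from signal import pause

from gpiozero import Button

button = Button(17)  # gpiozero enables the internal pull-up by default

def on_press():
    # Stand-in for any digital response: play a sound, log an event,
    # send a network request, update a display, and so on.
    print("Physical press detected -> digital response triggered")

button.when_pressed = on_press
pause()  # keep the script alive, waiting for presses
```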

More modern descendants of the button include conductive paint [5] and projection systems [6] [7] - both can make a non-digital surface act like a touchscreen, turning anything from tables to sculptures into interactive displays.
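
Conductive paint typically works by capacitive sensing: the painted shape becomes an electrode whose capacitance changes when touched. A rough sketch, assuming an MPR121 capacitive touch controller (used, for example, in Bare Conductive's Touch Board) read through Adafruit's CircuitPython driver:

```python
# Reading painted electrodes as touch inputs. Assumes an MPR121
# 12-channel capacitive touch controller on the I2C bus and the
# adafruit-circuitpython-mpr121 driver.
import time

import adafruit_mpr121
import board
import busio

i2c = busio.I2C(board.SCL, board.SDA)
touch = adafruit_mpr121.MPR121(i2c)

while True:
    for electrode in range(12):      # the MPR121 exposes 12 electrodes
        if touch[electrode].value:   # True while the paint is being touched
            print(f"Electrode {electrode} touched")
    time.sleep(0.1)
```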

Gaming

Games are an example of a field that was once entirely tactile, became largely digital, and is now trending back toward a more multi-sensory experience. Video game players still want the element of touch that controllers provide, and researchers suggest that incorporating an element of physical interaction into digital games for children may mitigate concerns about excessive screen time. Examples include: [8]

  - Microsoft Research's Project Zanzibar, a flexible mat that can sense and track physical toys placed on it [9]
  - Lego Fusion, which combined physical brick-building with companion game apps [10]
  - touchscreens with inflatable, physically raised buttons [11]
  - the Logitech Craft keyboard, whose large tactile input dial controls on-screen software [12]

Textiles

Technology is increasingly being incorporated into physical objects we already use, and one of the most significant examples is the textile industry. Companies are creating curtains that control light or detect smoke, clothing that monitors temperature, and fabric with integrated lighting - a swiftly growing field of "smart" apparel and home goods. [13] [14] [15] [16] Such products are also examples of wearable technology.

Art installations and museums

Art galleries and museums are increasingly incorporating technology to communicate information and to make art and education more immersive and personalized. [17] Examples include the teamLab Borderless digital art museum in Tokyo [18] and Random International's Rain Room installation at the Barbican, [19] [20] and brands have staged similar interactive digital art experiences to engage audiences at marketing events. [21] [22] [23]

Related Research Articles

<span class="mw-page-title-main">User interface</span> Means by which a user interacts with and controls a machine

In the industrial design field of human–computer interaction, a user interface (UI) is the space where interactions between humans and machines occur. The goal of this interaction is to allow effective operation and control of the machine from the human end, while the machine simultaneously feeds back information that aids the operators' decision-making process. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools, heavy machinery operator controls and process controls. The design considerations applicable when creating user interfaces are related to, or involve such disciplines as, ergonomics and psychology.

<span class="mw-page-title-main">Stylus</span> Writing utensil or small tool for marking or shaping

A stylus is a writing utensil or a small tool for some other form of marking or shaping, for example, in pottery. It can also be a computer accessory that is used to assist in navigating or providing more precision when using touchscreens. It usually refers to a narrow elongated staff, similar to a modern ballpoint pen. Many styluses are heavily curved to be held more easily. Another widely used writing tool is the stylus used by blind users in conjunction with the slate for punching out the dots in Braille.

Haptic technology is technology that can create an experience of touch by applying forces, vibrations, or motions to the user. These technologies can be used to create virtual objects in a computer simulation, to control virtual objects, and to enhance remote control of machines and devices (telerobotics). Haptic devices may incorporate tactile sensors that measure forces exerted by the user on the interface. The word haptic, from the Greek: ἁπτικός (haptikos), means "tactile, pertaining to the sense of touch". Simple haptic devices are common in the form of game controllers, joysticks, and steering wheels.
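
On the actuation side, a crude sketch of vibration feedback, assuming a small vibration motor driven through a transistor from a Raspberry Pi pin using gpiozero's PWM support (pin number and timings are illustrative):

```python
# Pulsing a small vibration motor with PWM to produce a haptic "tick".
# Assumes a Raspberry Pi, gpiozero, and a motor driver on GPIO 18.
from time import sleep

from gpiozero import PWMOutputDevice

motor = PWMOutputDevice(18, frequency=200)  # 200 Hz PWM carrier

def pulse(strength=0.8, duration=0.15):
    """One haptic tick: strength is the PWM duty cycle in [0, 1]."""
    motor.value = strength
    sleep(duration)
    motor.value = 0.0

# Two quick ticks, like a notification buzz.
pulse()
sleep(0.1)
pulse()
```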

<span class="mw-page-title-main">Touchscreen</span> Input and output device

A touchscreen is a type of display that can detect touch input from a user. It consists of both an input device and an output device. The touch panel is typically layered on the top of the electronic visual display of a device. Touchscreens are commonly found in smartphones, tablets, laptops, and other electronic devices.

<span class="mw-page-title-main">Tangible user interface</span>

A tangible user interface (TUI) is a user interface in which a person interacts with digital information through the physical environment. The initial name was Graspable User Interface, which is no longer used. The purpose of TUI development is to empower collaboration, learning, and design by giving physical forms to digital information, thus taking advantage of the human ability to grasp and manipulate physical objects and materials.

<span class="mw-page-title-main">Interactive whiteboard</span> Large interactive display

An interactive whiteboard (IWB), also known as interactive board or smart board, is a large interactive display board in the form factor of a whiteboard. It can either be a standalone touchscreen computer used independently to perform tasks and operations, or a connectable apparatus used as a touchpad to control computers from a projector. They are used in a variety of settings, including classrooms at all levels of education, in corporate board rooms and work groups, in training rooms for professional sports coaching, in broadcasting studios, and others.

<span class="mw-page-title-main">Multi-touch</span> Technology

In computing, multi-touch is technology that enables a surface to recognize the presence of more than one point of contact with the surface at the same time. Multi-touch originated at CERN, MIT, the University of Toronto, Carnegie Mellon University, and Bell Labs in the 1970s. CERN started using multi-touch screens as early as 1976 for the controls of the Super Proton Synchrotron. Capacitive multi-touch displays were popularized by Apple's iPhone in 2007. Multi-touch may be used to implement additional functionality, such as pinch to zoom, or to activate certain subroutines attached to predefined gestures using gesture recognition.
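
A pinch-to-zoom gesture, for instance, reduces to the ratio of the distance between two touch points across successive frames. A minimal sketch, using plain (x, y) tuples rather than any particular framework's touch events:

```python
# Pinch-to-zoom from two tracked touch points. The (x, y) tuples are
# illustrative; real frameworks deliver equivalent touch events.
import math

def distance(p1, p2):
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def zoom_factor(prev_touches, curr_touches):
    """Ratio > 1: fingers spread apart (zoom in); < 1: pinched (zoom out)."""
    prev_d = distance(*prev_touches)
    curr_d = distance(*curr_touches)
    return curr_d / prev_d if prev_d else 1.0

# Fingers move from 100 px apart to 150 px apart -> 1.5x zoom.
print(zoom_factor(((100, 100), (200, 100)), ((75, 100), (225, 100))))
```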

A touch user interface (TUI) is a computer-pointing technology based upon the sense of touch (haptics). Whereas a graphical user interface (GUI) relies upon the sense of sight, a TUI not only enables the sense of touch to activate computer-based functions, it also allows users, particularly those with visual impairments, an added level of interaction based upon tactile or Braille input.

<span class="mw-page-title-main">Microsoft PixelSense</span> Interactive surface computing platform by Microsoft

Microsoft PixelSense was an interactive surface computing platform that allowed one or more people to use and touch real-world objects, and share digital content at the same time. The PixelSense platform consisted of software and hardware products that combined vision-based multitouch PC hardware, 360-degree multiuser application design, and Windows software to create a natural user interface (NUI).

Machine perception is the capability of a computer system to interpret data in a manner similar to the way humans use their senses to relate to the world around them. Computers take in and respond to their environment through attached hardware. Until recently, input was limited to a keyboard or a mouse, but advances in both hardware and software have allowed computers to take in sensory input in a way similar to humans.

Surface computing is the use of a specialized computer GUI in which traditional GUI elements are replaced by intuitive, everyday objects. Instead of a keyboard and mouse, the user interacts with a surface. Typically the surface is a touch-sensitive screen, though other surface types like non-flat three-dimensional objects have been implemented as well. It has been said that this more closely replicates the familiar hands-on experience of everyday object manipulation.

Hands-on computing is a branch of human-computer interaction research which focuses on computer interfaces that respond to human touch or expression, allowing the machine and the user to interact physically. Hands-on computing can make complicated computer tasks more natural to users by attempting to respond to motions and interactions that are natural to human behavior. Thus hands-on computing is a component of user-centered design, focusing on how users physically respond to virtual environments.

In computing, a natural user interface (NUI) or natural interface is a user interface that is effectively invisible, and remains invisible as the user continuously learns increasingly complex interactions. The word "natural" is used because most computer interfaces use artificial control devices whose operation has to be learned. Examples include voice assistants such as Alexa and Siri, touch and multi-touch interactions on today's mobile phones and tablets, and touch interfaces invisibly integrated into textiles and furniture.

<span class="mw-page-title-main">Stylus (computing)</span> Pen-shaped instrument used as a human-computer interface

In computing, a stylus is a small pen-shaped instrument whose tip position on a computer monitor can be detected. It is used to draw, or make selections by tapping. While devices with touchscreens such as newer computers, mobile devices, game consoles, and graphics tablets can usually be operated with a fingertip, a stylus provides more accurate and controllable input. The stylus has the same function as a mouse or touchpad as a pointing device; its use is commonly called pen computing.

<span class="mw-page-title-main">Tactile sensor</span>

A tactile sensor is a device that measures information arising from physical interaction with its environment. Tactile sensors are generally modeled after the biological sense of cutaneous touch which is capable of detecting stimuli resulting from mechanical stimulation, temperature, and pain. Tactile sensors are used in robotics, computer hardware and security systems. A common application of tactile sensors is in touchscreen devices on mobile phones and computing.
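
As an illustration, a force-sensitive resistor (one common tactile sensor) sampled through an analog-to-digital converter can be mapped to touch states with simple thresholds; read_adc() below is a hypothetical stand-in for the real ADC driver:

```python
# Classifying readings from a force-sensitive resistor (FSR).
# Thresholds are in raw ADC counts and must be calibrated for the
# actual sensor and voltage divider.
LIGHT_TOUCH = 200
FIRM_PRESS = 700

def classify_touch(raw: int) -> str:
    if raw < LIGHT_TOUCH:
        return "no contact"
    if raw < FIRM_PRESS:
        return "light touch"
    return "firm press"

def read_adc() -> int:
    # Hypothetical stand-in for the real driver call (e.g. an MCP3008
    # over SPI); returns raw counts in the range 0-1023.
    return 512

print(classify_touch(read_adc()))  # -> "light touch"
```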

A virtual touch screen (VTS) is a user interface system that augments virtual objects into reality either through a projector or optical display using sensors to track a person's interaction with the object. For instance, using a display and a rear projector system a person could create images that look three-dimensional and appear to float in midair. Some systems utilize an optical head-mounted display to augment the virtual objects onto the transparent display utilizing sensors to determine visual and physical interactions with the virtual objects projected.

<span class="mw-page-title-main">Skinput</span> Input technology

Skinput is an input technology that uses bio-acoustic sensing to localize finger taps on the skin. When augmented with a pico-projector, the device can provide a direct-manipulation graphical user interface on the body. The technology was developed by Chris Harrison, Desney Tan, and Dan Morris at Microsoft Research's Computational User Experiences Group. Skinput represents one way to decouple input from electronic devices with the aim of allowing devices to become smaller without simultaneously shrinking the surface area on which input can be performed. While other systems, like SixthSense, have attempted this with computer vision, Skinput employs acoustics, taking advantage of the human body's natural sound-conducting properties. This allows the body to be annexed as an input surface without the need for the skin to be invasively instrumented with sensors, tracking markers, or other items.
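
A toy sketch of the classification step, with synthetic "acoustic features" and a scikit-learn support vector machine standing in for the original system's sensing and learning pipeline:

```python
# Mapping acoustic features of a skin tap to a tap location.
# All data here is synthetic; the real system extracted features from
# an armband of vibration sensors.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
locations = ["wrist", "forearm", "palm"]

# 30 synthetic taps per location, 8 features each, with each class
# shifted so the locations are separable.
X = np.vstack([rng.normal(loc=i, scale=0.5, size=(30, 8)) for i in range(3)])
y = np.repeat(locations, 30)

clf = SVC().fit(X, y)

new_tap = rng.normal(loc=1, scale=0.5, size=(1, 8))
print(clf.predict(new_tap))  # most likely ['forearm']
```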

<span class="mw-page-title-main">PrimeSense</span> Former Israeli company

PrimeSense was an Israeli 3D sensing company based in Tel Aviv. PrimeSense had offices in Israel, North America, Japan, Singapore, Korea, China and Taiwan. PrimeSense was bought by Apple Inc. for $360 million on November 24, 2013.

Sensory design aims to establish an overall diagnosis of the sensory perceptions of a product, and define appropriate means to design or redesign it on that basis. It involves an observation of the diverse and varying situations in which a given product or object is used in order to measure the users' overall opinion of the product, its positive and negative aspects in terms of tactility, appearance, sound and so on.

<span class="mw-page-title-main">Force Touch</span> Force-sensing touch technology developed by Apple Inc.

Force Touch is a haptic pressure-sensing technology developed by Apple Inc. that enables trackpads and touchscreens to sense the amount of force being applied to their surfaces. Software that uses Force Touch can distinguish between various levels of force for user interaction purposes. Force Touch was first unveiled on September 9, 2014, during the introduction of Apple Watch. Starting with the Apple Watch, Force Touch has been incorporated into many Apple products, including MacBooks and the Magic Trackpad 2.

References

  1. "TACTILE". Cambridge English Dictionary. Retrieved 2018-11-12.
  2. "Tactile definition and meaning". Collins English Dictionary. Retrieved 2018-11-12.
  3. "Tactile learning" . Retrieved 27 April 2024.
  4. "How tactile technology can help those with learning disabilities". the3doodler.com. March 2017. Retrieved 19 November 2020.
  5. "Ingenious 'Electric Paint' Lets You Paint Wires That Can Conduct Electricity". My Modern Met. 2018-05-14. Retrieved 2018-11-12.
  6. "Fujitsu makes paper interactive with touchscreen interface". Digital Trends. 2013-04-15. Retrieved 2018-11-12.
  7. "Lü - Interactive Playground". Lü - Interactive Playground. Retrieved 2018-11-12.
  8. Bolat, John. "Tactile Installation". Retrieved 8 March 2022.
  9. "Project Zanzibar - Microsoft Research". Microsoft Research. Retrieved 2018-11-12.
  10. "Lego Fusion combines brick-building with game apps". Engadget. Retrieved 2018-11-12.
  11. "Your Next Touchscreen Might Bulge With Inflatable Buttons". Popular Science. Retrieved 2018-11-12.
  12. "The Logitech Craft keyboard's giant button is a tactile dream". The Verge. Retrieved 2018-11-12.
  13. "Tactile Technology". International Interior Design Association. Retrieved 2018-11-12.
  14. Gaddis, Rebecca. "What Is The Future Of Fabric? These Smart Textiles Will Blow Your Mind". Forbes. Retrieved 2018-11-12.
  15. "U.S. Textile Industry Turns to Tech as Gateway to Revival". The New York Times. April 2016. Retrieved 2018-11-12.
  16. "Technological Advances in the Textile Industry". Study.com. Retrieved 2018-11-12.
  17. "The how and why behind a multisensory art display". ACM Interactions. Retrieved 2018-11-12.
  18. "teamLab Borderless Tokyo Official Site :MORI Building Digital Art Museum". borderless.teamlab.art. Retrieved 2018-11-12.
  19. "Rain Room, Random International". Barbican. Retrieved 2018-11-12.
  20. "'Rain Room' at London's Barbican". Digital meets Culture. Retrieved 2018-11-12.
  21. "Five Brands Using Digital Art Experiences to Engage Fans". Event Marketer. 2018-09-20. Retrieved 2018-11-12.
  22. "Lessons in Launch Events from Perrier's Flavorful Pop-Up". Event Marketer. 2017-08-07. Retrieved 2018-11-12.
  23. Rosenborg, Rutger Ansley (2018-03-23). "Creative Tech Roundup: SXSW Edition". Exponential Creativity Ventures. Retrieved 2018-11-12.