Tangible user interface

[Image: Reactable, an electronic musical instrument and an example of a tangible user interface]
[Image: SandScape device installed in the Children's Creativity Museum in San Francisco]

A tangible user interface (TUI) is a user interface in which a person interacts with digital information through the physical environment. The initial name was Graspable User Interface, which is no longer used. The purpose of TUI development is to empower collaboration, learning, and design by giving physical forms to digital information, thus taking advantage of the human ability to grasp and manipulate physical objects and materials. [1]

History

The concept was first conceived by Radia Perlman as a new programming language that, like Logo, would teach programming to much younger children, but using special "keyboards" and input devices. Another pioneer in tangible user interfaces is Hiroshi Ishii, a professor at MIT who heads the Tangible Media Group at the MIT Media Lab. His particular vision for tangible UIs, called Tangible Bits, is to give physical form to digital information, making bits directly manipulable and perceptible. Tangible Bits pursues a seamless coupling between physical objects and virtual data.

Characteristics

There are several frameworks describing the key characteristics of tangible user interfaces. Brygg Ullmer and Hiroshi Ishii describe the following characteristics concerning representation and control: [2]

  1. Physical representations are computationally coupled to underlying digital information.
  2. Physical representations embody mechanisms for interactive control.
  3. Physical representations are perceptually coupled to actively mediated digital representations.
  4. Physical state of tangibles embodies key aspects of the digital state of a system.

Eva Hornecker and Jacob Buur describe a structured framework with four themes: [3]

  1. Tangible manipulation: material representations with distinct tactile qualities, which are typically physically manipulated. A typical example is haptic direct manipulation: can the user grab, feel, and move the important elements of the interface?
  2. Spatial interaction: tangible interaction is embedded in real space; interaction occurs as movement in this space. An example is full body interaction: can the user make use of their whole body?
  3. Embodied facilitation: the configuration of material objects and space affects how multiple users interact jointly with the tangible user interface. Examples include multiple access points: can all users in the space see what is going on and interact with central elements of the interface?
  4. Expressive representation: expressiveness and legibility of material and digital representations employed by tangible interaction systems. An example is representational significance: do physical and digital representations have the same strength and salience?

According to Mi Jeong Kim and Mary Lou Maher, the five basic defining properties of tangible user interfaces are as follows: [4]

  1. Space-multiplex both input and output.
  2. Concurrent access and manipulation of interface components.
  3. Strong specific devices.
  4. Spatially aware computational devices.
  5. Spatial re-configurability of devices.

Comparison with graphical user interfaces

A tangible user interface must be differentiated from a graphical user interface (GUI). A GUI exists only in the digital world, whereas a TUI connects the digital with the physical world. In a GUI, for example, a screen displays the digital information and a mouse allows us to interact with it. [5] A tangible user interface instead represents the input directly in the physical world and makes the digital information directly graspable. [6]

A tangible user interface is usually built for one specific target group because of its narrow range of possible application areas. The design of the interface must therefore be developed together with the target group to ensure a good user experience. [7]

In comparison with a TUI, a GUI supports a wide range of uses within one interface and therefore targets a large group of possible users. [7]

One advantage of a TUI is the user experience, because physical interaction occurs between the user and the interface itself (e.g., SandScape, where users build their own landscape out of sand). Another advantage is usability: the user intuitively knows how to use the interface by knowing the function of the physical object, and therefore does not need to learn its functionality. This is why tangible user interfaces are often used to make technology more accessible for elderly people. [6]

Comparison of tangible and graphical user interfaces: [7]

  1. Range of possible application areas: built for one specific application area (TUI) versus built for many kinds of application areas (GUI).
  2. How the system is driven: by physical objects, such as a mouse or a keyboard (TUI), versus by graphical bits, such as pixels on the screen (GUI).
  3. Coupling between cognitive bits and the physical output: unmediated connection (TUI) versus indirect connection (GUI).
  4. How user experience is driven: the user already knows the function of the interface from the function of the physical objects (TUI), versus the user explores the functionality of the interface (GUI).
  5. User behavior when approaching the system: intuition (TUI) versus recognition (GUI).

Examples

A simple example of a tangible UI is the computer mouse: dragging the mouse over a flat surface moves a pointer on the screen accordingly. There is a clear relationship between the movements of the mouse and the behavior shown by the system. Other examples include:

  1. The marble answering machine (1992), in which physical marbles represent incoming voice messages. [8]
  2. Topobo, a construction kit with kinetic memory. [9]
  3. jive, a social networking device aimed at elderly users. [10]

Several approaches have been made to establish a generic middleware for TUIs. They aim at independence from particular application domains as well as flexibility in terms of the deployed sensor technology. For example, Siftables provides an application platform in which small gesture-sensitive displays act together to form a human-computer interface.

To support collaboration, TUIs have to allow spatial distribution, asynchronous activities, and dynamic modification of the TUI infrastructure, to name the most prominent requirements. One approach meets these requirements with a framework based on the LINDA tuple space concept. The implemented TUIpist framework deploys arbitrary sensor technology for any type of application and actuators in distributed environments. [11]
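The following is a minimal sketch of the tuple-space idea underlying such middleware, assuming a hypothetical in-process TupleSpace class (TUIpist's actual API is not reproduced here): sensor wrappers publish tuples describing the state of physical tokens, and application or actuator components consume them by pattern matching, regardless of where the components run.

```python
import threading
from typing import Any, List, Optional, Tuple

class TupleSpace:
    """Minimal in-process tuple space: out() publishes a tuple, rd() reads a
    matching tuple without removing it, inp() removes and returns a match."""

    def __init__(self) -> None:
        self._tuples: List[Tuple[Any, ...]] = []
        self._lock = threading.Lock()

    def out(self, tup: Tuple[Any, ...]) -> None:
        with self._lock:
            self._tuples.append(tup)

    @staticmethod
    def _matches(pattern: Tuple[Any, ...], tup: Tuple[Any, ...]) -> bool:
        # None acts as a wildcard field in the pattern.
        return len(pattern) == len(tup) and all(
            p is None or p == t for p, t in zip(pattern, tup)
        )

    def rd(self, pattern: Tuple[Any, ...]) -> Optional[Tuple[Any, ...]]:
        with self._lock:
            return next((t for t in self._tuples if self._matches(pattern, t)), None)

    def inp(self, pattern: Tuple[Any, ...]) -> Optional[Tuple[Any, ...]]:
        with self._lock:
            for i, t in enumerate(self._tuples):
                if self._matches(pattern, t):
                    return self._tuples.pop(i)
        return None

# A sensor wrapper publishes the state of a physical token; any component
# interested in that token consumes it via a wildcard pattern.
space = TupleSpace()
space.out(("token", "building-3", 0.42, 0.77, 90.0))          # id, x, y, angle
event = space.inp(("token", "building-3", None, None, None))  # wildcard match
print(event)
```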

State of the art

Interest in tangible user interfaces (TUIs) has grown steadily since the 1990s, and new tangible systems appear every year. A 2017 white paper outlines the evolution of TUIs for touch-table experiences and raises new possibilities for experimentation and development. [12]

In 1999, Gary Zalewski patented a system of moveable children's blocks containing sensors and displays for teaching spelling and sentence composition. [13]

Tangible Engine is a proprietary authoring application used to build object-recognition interfaces for projected-capacitive touch tables. The Tangible Engine Media Creator allows users with little or no coding experience to quickly create TUI-based experiences.

The MIT Tangible Media Group, headed by Hiroshi Ishii, is continuously developing and experimenting with TUIs, including many tabletop applications. [14]

The Urp [15] system and the more advanced Augmented Urban Planning Workbench [16] allow digital simulations of air flow, shadows, reflections, and other data based on the positions and orientations of physical building models on the table surface.

Newer developments go one step further and incorporate the third dimension by allowing a user to form landscapes with clay (Illuminating Clay [17]) or sand (SandScape [18]). Again, different simulations allow the analysis of shadows, height maps, slopes, and other characteristics of the interactively formable landmasses.

InfrActables is a back-projected collaborative table that allows interaction by using TUIs that incorporate state recognition. Adding different buttons to the TUIs enables additional functions associated with them. Newer versions of the technology can even be integrated into LC displays [19] by using infrared sensors behind the LC matrix.

The Tangible Disaster [20] allows the user to analyze disaster measures and simulate different kinds of disasters (fire, flood, tsunami, etc.) and evacuation scenarios during collaborative planning sessions. Physical objects allow positioning disasters by placing them on the interactive map, and parameters (e.g., scale) can additionally be tuned using dials attached to them.

The commercial potential of TUIs has been identified recently. The repeatedly awarded Reactable, [21] an interactive tangible tabletop instrument, is now distributed commercially by Reactable Systems, a spinoff company of Pompeu Fabra University, where it was developed. With the Reactable, users can set up their own instrument interactively by physically placing different objects (representing oscillators, filters, modulators, etc.) and parametrising them by rotating them and using touch input.

Microsoft has been distributing its Windows-based platform Microsoft Surface [22] (now Microsoft PixelSense) since 2009. Besides multi-touch tracking of fingers, the platform supports the recognition of physical objects by their footprints. Several applications, mainly for use in commercial spaces, have been presented. Examples range from designing an individual graphical layout for a snowboard or skateboard to studying the details of a wine in a restaurant by placing the bottle on the table and navigating through menus via touch input. Interactions such as the collaborative browsing of photographs from a camcorder or cell phone that connects seamlessly once placed on the table are also supported.

Another notable interactive installation is instant city [23] that combines gaming, music, architecture and collaborative aspects. It allows the user to build three-dimensional structures and set up a city with rectangular building blocks, which simultaneously results in the interactive assembly of musical fragments of different composers.

The development of the Reactable and the subsequent release of its tracking technology reacTIVision [24] under the GNU GPL, as well as the open specification of the TUIO protocol, have triggered an enormous number of developments based on this technology.

In the last few years, many amateur and semi-professional projects outside academia and commerce have been started. Due to open-source tracking technologies (reacTIVision [24]) and the ever-increasing computational power available to end consumers, the required infrastructure is now accessible to almost everyone. A standard PC, a webcam, and some handicraft work allow individuals to set up tangible systems with minimal programming and material effort. This opens doors to novel ways of perceiving human-computer interaction and new forms of creativity for the public to experiment with. [citation needed]
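As an illustration of how little code such a webcam-based setup needs, the sketch below (assuming the third-party python-osc package and reacTIVision's default TUIO settings on UDP port 3333) prints the position and angle of tracked fiducial markers; it reads only the "set" messages of the TUIO 1.1 /tuio/2Dobj profile and is not a complete TUIO client.

```python
# Minimal TUIO listener sketch: reacTIVision sends OSC bundles to UDP port
# 3333 by default; each /tuio/2Dobj "set" message carries a fiducial marker's
# session id, marker id, normalized position (0..1), and rotation angle.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_2dobj(address, *args):
    if args and args[0] == "set":
        _session_id, marker_id, x, y, angle = args[1:6]
        print(f"marker {marker_id}: pos=({x:.2f}, {y:.2f}) angle={angle:.2f}")

dispatcher = Dispatcher()
dispatcher.map("/tuio/2Dobj", on_2dobj)

server = BlockingOSCUDPServer(("0.0.0.0", 3333), dispatcher)
server.serve_forever()  # start reacTIVision with a webcam and watch events arrive
```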

It is difficult to keep track of the rapidly growing number of these systems and tools, but while many of them seem only to utilize the available technologies and are limited to initial experiments and tests of some basic ideas, or merely reproduce existing systems, a few of them open out into novel interfaces and interactions and are deployed in public spaces or embedded in art installations. [25]

The Tangible Factory Planning [26] is a tangible table based on reacTIVision [24] that allows users to collaboratively plan and visualize production processes in combination with plans for new factory buildings; it was developed within a diploma thesis.

Another example of the many reacTIVision-based tabletops is the ImpulsBauhaus Interactive Table, [27] which was exhibited at the Bauhaus-University in Weimar to mark the 90th anniversary of the founding of the Bauhaus. Visitors could browse and explore the biographies, complex relations, and social networks between members of the movement.

Using principles derived from embodied cognition, cognitive load theory, and embodied design, TUIs have been shown to increase learning performance by offering multimodal feedback. [28] However, these benefits for learning require forms of interaction design that leave as much cognitive capacity as possible for learning.

Physical icon

A physical icon, or phicon, is the tangible computing equivalent of an icon in a traditional graphical user interface, or GUI. Phicons hold a reference to some digital object and thereby convey meaning. [29] [30] [31]
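A minimal sketch of that mapping, using hypothetical names, is a registry that resolves the marker identifier reported by a tracking system to the digital object the phicon stands for:

```python
# Hypothetical phicon registry: each physical token carries an identifier
# (e.g., a fiducial marker ID) that the system resolves to digital content.
from dataclasses import dataclass

@dataclass
class Phicon:
    marker_id: int        # ID reported by the tracking system
    digital_object: str   # reference to the digital content it represents

registry = {
    12: Phicon(12, "map:campus"),
    27: Phicon(27, "model:great-dome"),
}

def on_marker_detected(marker_id: int) -> None:
    phicon = registry.get(marker_id)
    if phicon is not None:
        print(f"display {phicon.digital_object} at the token's position")

on_marker_detected(12)
```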

History

Physical icons were first used as tangible interfaces in the metaDesk project built in 1997 by Professor Hiroshi Ishii's tangible bits research group at MIT. [32] [33] The metaDesk consisted of a table whose surface showed a rear-projected video image. Placing a phicon on the table triggered sensors that altered the video projection. [34]


Related Research Articles

<span class="mw-page-title-main">User interface</span> Means by which a user interacts with and controls a machine

In the industrial design field of human–computer interaction, a user interface (UI) is the space where interactions between humans and machines occur. The goal of this interaction is to allow effective operation and control of the machine from the human end, while the machine simultaneously feeds back information that aids the operators' decision-making process. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools, heavy machinery operator controls and process controls. The design considerations applicable when creating user interfaces are related to, or involve such disciplines as, ergonomics and psychology.

Interaction design, often abbreviated as IxD, is "the practice of designing interactive digital products, environments, systems, and services." While interaction design has an interest in form, its main area of focus rests on behavior. Rather than analyzing how things are, interaction design synthesizes and imagines things as they could be. This element of interaction design is what characterizes IxD as a design field, as opposed to a science or engineering field.

<span class="mw-page-title-main">WIMP (computing)</span> Style of human-computer interaction

In human–computer interaction, WIMP stands for "windows, icons, menus, pointer", denoting a style of interaction using these elements of the user interface. Other expansions are sometimes used, such as substituting "mouse" and "mice" for menus, or "pull-down menu" and "pointing" for pointer.

<span class="mw-page-title-main">Gesture recognition</span> Topic in computer science and language technology

Gesture recognition is an area of research and development in computer science and language technology concerned with the recognition and interpretation of human gestures. A subdiscipline of computer vision, it employs mathematical algorithms to interpret gestures.

In computer science, interactive computing refers to software which accepts input from the user as it runs.

A voice-user interface (VUI) enables spoken human interaction with computers, using speech recognition to understand spoken commands and answer questions, and typically text to speech to play a reply. A voice command device is a device controlled with a voice user interface.

In artificial intelligence, an embodied agent, also sometimes referred to as an interface agent, is an intelligent agent that interacts with the environment through a physical body within that environment. Agents that are represented graphically with a body, for example a human or a cartoon animal, are also called embodied agents, although they have only virtual, not physical, embodiment. A branch of artificial intelligence focuses on empowering such agents to interact autonomously with human beings and the environment. Mobile robots are one example of physically embodied agents; Ananova and Microsoft Agent are examples of graphically embodied agents. Embodied conversational agents are embodied agents that are capable of engaging in conversation with one another and with humans employing the same verbal and nonverbal means that humans do.

<span class="mw-page-title-main">Hiroshi Ishii (computer scientist)</span> Japanese computer scientist

Hiroshi Ishii is a Japanese computer scientist. He is a professor at the Massachusetts Institute of Technology. Ishii pioneered the Tangible User Interface in the field of Human-computer interaction with the paper "Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms", co-authored with his then PhD student Brygg Ullmer.

<span class="mw-page-title-main">Deutsch limit</span>

The Deutsch limit is an aphorism about the information density of visual programming languages, originated by L. Peter Deutsch.

In computing, post-WIMP comprises work on user interfaces, mostly graphical user interfaces, which attempt to go beyond the paradigm of windows, icons, menus and a pointing device, i.e. WIMP interfaces.

A projection augmented model is an element sometimes employed in virtual reality systems. It consists of a physical three-dimensional model onto which a computer image is projected to create a realistic looking object. Importantly, the physical model is the same geometric shape as the object that the PA model depicts.

A smart object is an object that enhances the interaction with not only people but also with other smart objects. Also known as smart connected products or smart connected things (SCoT), they are products, assets and other things embedded with processors, sensors, software and connectivity that allow data to be exchanged between the product and its environment, manufacturer, operator/user, and other products and systems. Connectivity also enables some capabilities of the product to exist outside the physical device, in what is known as the product cloud. The data collected from these products can be then analyzed to inform decision-making, enable operational efficiencies and continuously improve the performance of the product.

In computing, 3D interaction is a form of human-machine interaction where users are able to move and perform interaction in 3D space. Both human and machine process information where the physical position of elements in the 3D space is relevant.

<span class="mw-page-title-main">Human–computer interaction</span> Academic discipline studying the relationship between computer systems and their users

Human–computer interaction (HCI) is research in the design and the use of computer technology, which focuses on the interfaces between people (users) and computers. HCI researchers observe the ways humans interact with computers and design technologies that allow humans to interact with computers in novel ways. A device that allows interaction between a human being and a computer is known as a "human–computer interface".

Sonic interaction design is the study and exploitation of sound as one of the principal channels conveying information, meaning, and aesthetic/emotional qualities in interactive contexts. Sonic interaction design is at the intersection of interaction design and sound and music computing. If interaction design is about designing objects people interact with, and such interactions are facilitated by computational means, in sonic interaction design, sound is mediating interaction either as a display of processes or as an input medium.

Sergi Jordà is a Catalan innovator, installation artist, digital musician and Associate Professor at the Music Technology Group, Universitat Pompeu Fabra in Barcelona. He is best known for directing the team that invented the Reactable. He is also a trained Physicist.

<span class="mw-page-title-main">Wendy Mackay</span> Computer Scientist

Wendy Elizabeth Mackay is a Canadian researcher specializing in human-computer interaction. She has served in all of the roles on the SIGCHI committee, including Chair. She is a member of the CHI Academy and a recipient of a European Research Council Advanced grant. She was a visiting professor at Stanford University between 2010 and 2012, and received the ACM SIGCHI Lifetime Service Award in 2014.

Sharon Oviatt is an internationally recognized computer scientist, professor and researcher known for her work in the field of human–computer interaction on human-centered multimodal interface design and evaluation.

Responsive computer-aided design is an approach to computer-aided design (CAD) that utilizes real-world sensors and data to modify a three-dimensional (3D) computer model. The concept is related to cyber-physical systems through blurring of the virtual and physical worlds, however, applies specifically to the initial digital design of an object prior to production.

<span class="mw-page-title-main">Bruno Zamborlin</span> Italian researcher, entrepreneur and artist

Bruno Zamborlin is an AI researcher, entrepreneur and artist based in London, working in the field of human-computer interaction. His work focuses on converting physical objects into touch-sensitive, interactive surfaces using vibration sensors and artificial intelligence. In 2013 he founded Mogees Limited, a start-up to transform everyday objects into musical instruments and games using a vibration sensor and a mobile phone. With HyperSurfaces, he converts physical surfaces of any material, shape and form into data-enabled interactive surfaces using a vibration sensor and a coin-sized chipset. As an artist, he has created art installations around the world, with his most recent work comprising a unique series of "sound furniture" that was showcased at the Italian Pavilion of the Venice Biennale 2023. He regularly performed with UK-based electronic music duo Plaid. He is also honorary visiting research fellow at Goldsmiths, University of London.

References

  1. Ishii, Hiroshi (2008). "Tangible bits". Proceedings of the 2nd international conference on Tangible and embedded interaction - TEI '08. pp. xv. doi:10.1145/1347390.1347392. ISBN   978-1-60558-004-3. S2CID   18166868.
  2. Ullmer, Brygg; Ishii, Hiroshi (2000). "Emerging Frameworks for Tangible User Interfaces" (PDF). IBM Systems Journal. 39 (3–4): 915–931. Retrieved 17 February 2024.
  3. Hornecker, Eva; Buur, Jacob (2006). "Getting a Grip on Tangible Interaction: a Framework on Physical Space and Social Interaction" (PDF). Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. CHI (06). doi:10.1145/1124772.1124838 . Retrieved 17 February 2024.
  4. Kim, Mi Jeong; Maher, Mary Lou (30 May 2008). "The Impact of Tangible User Interfaces on Designers' Spatial Cognition". Human–Computer Interaction. 23 (2): 101–137. doi:10.1080/07370020802016415. S2CID   1268154.
  5. http://tmg-trackr.media.mit.edu:8020/SuperContainer/RawData/Papers/485-Radical%20Atoms%20Beyond%20Tangible/Published/PDF Archived 19 September 2012 at the Wayback Machine [ full citation needed ]
  6. Ishii, Hiroshi (2007). "Tangible User Interfaces". The Human-Computer Interaction Handbook. pp. 495–514. doi:10.1201/9781410615862-35. ISBN 978-0-429-16397-5.
  7. Campbell, John; Carandang, Xharmagne (29 July 2012). "Comparing Graphical and Tangible User Interfaces for a Tower Defense Game". AMCIS 2012 Proceedings. CiteSeerX 10.1.1.924.6112.
  8. "Internet-of-Things answering machine from 1992, with marbles / Boing Boing". boingboing.net. 21 March 2013.
  9. "Topobo construction kit with kinetic memory". www.topobo.com.
  10. "jive - social networking for your gran". jive.benarent.co.uk.
  11. http://www.cs.rit.edu/~pns6910/docs/Tuple%20Space/A%20Tuple-Space%20Based%20Middleware%20for%20Collaborative%20Tangible%20User%20Interfaces.pdf [full citation needed]
  12. "The Evolution of Tangible User Interfaces on Touch Tables | Ideum". Ideum - exhibit design | touch tables | interactive exhibits. Retrieved 31 October 2017.
  13. "Wireless I/O apparatus and method of computer-assisted instruction".
  14. "Tangible Media". www.media.mit.edu. MIT Media Lab. Retrieved 10 December 2014.
  15. Underkoffler, John; Ishii, Hiroshi (1999). "Urp". Proceedings of the SIGCHI conference on Human factors in computing systems the CHI is the limit - CHI '99. pp. 386–393. doi:10.1145/302979.303114. ISBN   978-0-201-48559-2. S2CID   52817952.
  16. Ishii, H.; Underkoffler, J.; Chak, D.; Piper, B.; Ben-Joseph, E.; Yeung, L.; Kanji, Z. (2002). "Augmented urban planning workbench: Overlaying drawings, physical models and digital simulation". Proceedings. International Symposium on Mixed and Augmented Reality. pp. 203–211. CiteSeerX   10.1.1.19.4960 . doi:10.1109/ISMAR.2002.1115090. ISBN   978-0-7695-1781-0. S2CID   2303022.
  17. Piper, Ben; Ratti, Carlo; Ishii, Hiroshi (2002). "Illuminating clay". Proceedings of the SIGCHI conference on Human factors in computing systems Changing our world, changing ourselves - CHI '02. p. 355. doi:10.1145/503376.503439. ISBN   978-1-58113-453-7. S2CID   7146503.
  18. Ishii, Hiroshi (June 2008). "The tangible user interface and its evolution". Communications of the ACM. 51 (6): 32–36. doi:10.1145/1349026.1349034. S2CID   29416502.
  19. Hofer, Ramon; Kaplan, Patrick; Kunz, Andreas (2008). "Mighty Trace". Proceeding of the twenty-sixth annual CHI conference on Human factors in computing systems - CHI '08. p. 215. doi:10.1145/1357054.1357091. hdl:20.500.11850/9226. ISBN   978-1-60558-011-1. S2CID   12977345.
  20. Alexa, Marc (5 August 2007). "Tangible user interface for supporting disaster education". ACM SIGGRAPH 2007 posters. Siggraph '07. pp. 144–es. doi:10.1145/1280720.1280877. ISBN   9781450318280. S2CID   1851821.
  21. Jordà, Sergi; Geiger, Günter; Alonso, Marcos; Kaltenbrunner, Martin (2007). "The reacTable". Proceedings of the 1st international conference on Tangible and embedded interaction - TEI '07. p. 139. CiteSeerX 10.1.1.81.1645. doi:10.1145/1226969.1226998. ISBN 978-1-59593-619-6. S2CID 17384158.
  22. Wall, Josh (2009). "Demo I Microsoft Surface and the Single View Platform". 2009 International Symposium on Collaborative Technologies and Systems. pp. xxxi–xxxii. doi:10.1109/CTS.2009.5067436. ISBN   978-1-4244-4584-4.
  23. Hauert, Sibylle; Reichmuth, Daniel; Böhm, Volker (2007). "Instant city". Proceedings of the 7th international conference on New interfaces for musical expression - NIME '07. p. 422. doi:10.1145/1279740.1279846. S2CID   22458111.
  24. Kaltenbrunner, Martin; Bencina, Ross (2007). "ReacTIVision". Proceedings of the 1st international conference on Tangible and embedded interaction - TEI '07. p. 69. doi:10.1145/1226969.1226983. ISBN 978-1-59593-619-6. S2CID 459304.
  25. "Sourceforge TUIO User Exhibition".
  26. Tangible Factory Planning, Diploma Thesis, Daniel Guse, http://www.danielguse.de/tangibletable.php Archived 9 July 2010 at the Wayback Machine
  27. "Interactive Table with reacTIVision : ImpulsBauhaus".
  28. Skulmowski, Alexander; Pradel, Simon; Kühnert, Tom; Brunnett, Guido; Rey, Günter Daniel (January 2016). "Embodied learning using a tangible user interface: The effects of haptic perception and selective pointing on a spatial learning task". Computers & Education. 92–93: 64–75. doi:10.1016/j.compedu.2015.10.011. S2CID   10493691.
  29. Fidalgo, F., Silva, P., Realinho, V.: "Ubiquitous Computing and Organizations", page 201. Current Developments in Technology-Assisted Education, 2006
  30. Michitaka Hirose (2001). Human-computer Interaction: INTERACT '01 : IFIP TC.13 International Conference on Human-Computer Interaction, 9th-13th July 2001, Tokyo, Japan. IOS Press. pp. 337–. ISBN   978-1-58603-188-6.
  31. Hamid Aghajan; Juan Carlos Augusto; Ramon Lopez-Cozar Delgado (25 September 2009). Human-Centric Interfaces for Ambient Intelligence. Academic Press. pp. 15–. ISBN   978-0-08-087850-8.
  32. Howard Rheingold (21 March 2007). Smart Mobs: The Next Social Revolution. Basic Books. pp. 104–. ISBN   978-0-465-00439-3.
  33. Paul Dourish (2004). Where the Action is: The Foundations of Embodied Interaction. MIT Press. pp. 45–. ISBN   978-0-262-54178-7.
  34. Mary Beth Rosson; John Millar Carroll (2002). Usability Engineering: Scenario-based Development of Human-computer Interaction. Morgan Kaufmann. pp. 316–. ISBN   978-1-55860-712-5.