Hardware interface design

Dieter Rams, and by extension Braun, produced minimal yet tactile hardware interfaces for a variety of products such as this Braun T1000CD.

Hardware interface design (HID) is a cross-disciplinary design field that shapes the physical connection between people and technology in order to create new hardware interfaces that transform purely digital processes into analog methods of interaction. It employs a combination of filmmaking tools, software prototyping, and electronics breadboarding.

Through this parallel visualization and development, hardware interface designers are able to shape a cohesive vision alongside business and engineering that more deeply embeds design throughout every stage of the product. The development of hardware interfaces as a field continues to mature as more things connect to the internet.

Hardware interface designers draw upon industrial design, interaction design and electrical engineering. Interface elements include touchscreens, knobs, buttons, sliders and switches as well as input sensors such as microphones, cameras, and accelerometers.
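The physical elements listed above are typically read by a microcontroller and translated into interface state. As a minimal, hypothetical sketch (the ADC range, debounce interval, and function names are illustrative assumptions, not from any particular platform), a knob reading can be mapped to a parameter and a push button can be debounced in software:

```python
# Hypothetical sketch: turning raw hardware inputs into interface values.
# The 10-bit ADC range and 50 ms debounce window are illustrative assumptions.

def knob_to_volume(adc_reading: int, adc_max: int = 1023) -> float:
    """Map a 10-bit potentiometer reading to a 0.0-1.0 volume level."""
    return round(adc_reading / adc_max, 3)

class DebouncedButton:
    """Ignore state changes that arrive faster than the debounce window."""

    def __init__(self, debounce_ms: int = 50):
        self.debounce_ms = debounce_ms
        self.state = False
        self.last_change_ms = -debounce_ms

    def update(self, raw_state: bool, now_ms: int) -> bool:
        # Accept a new state only if the debounce window has elapsed.
        if raw_state != self.state and now_ms - self.last_change_ms >= self.debounce_ms:
            self.state = raw_state
            self.last_change_ms = now_ms
        return self.state

print(knob_to_volume(512))    # mid-travel knob reads as 0.5
button = DebouncedButton()
print(button.update(True, 10))   # press accepted -> True
print(button.update(False, 30))  # bounce within 50 ms, ignored -> True
```

On real hardware the same logic would run against the platform's ADC and GPIO APIs; the debounce window exists because mechanical contacts physically chatter for a few milliseconds on each press.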

The Teenage Engineering OP-1 combines a mixture of hardware buttons, knobs, and a color-coded OLED display.
An iPod, an iconic and revolutionary hardware interface that re-imagined the jog wheel

History

In the last decade a trend has evolved in the area of human–machine communication, taking the user experience from haptic, tactile and acoustic interfaces to a more graphical, digital approach. Important tasks that had previously been assigned to industrial designers have instead moved into fields such as UI and UX design and usability engineering. Creating good user interaction became more a question of software than of hardware. Mechanical, haptic relics such as tape-recorder buttons that had to be pressed in pairs to pop back out, or the cradle of older telephones, have long since found their digital counterparts and are gradually disappearing.

However, this pervasive use of GUIs has arguably placed a growing burden on human cognitive capabilities.[citation needed] Visual interfaces are approaching the limit of what added fidelity can improve. Even though the resolution of new screens is constantly rising, a change of direction is visible: away from descriptive, self-explanatory design and toward natural interface strategies based on learnable habits (Google's Material Design, Apple's iOS flat design, Microsoft's Metro design language). Several of the more important commands are not shown directly but are accessed by dragging, holding and swiping across the screen; gestures that have to be learned once but feel very natural afterwards and are easy to remember.
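The "learnable habits" mentioned above are usually recognized from raw touch samples. As a hypothetical sketch (the 50-pixel threshold and function name are illustrative assumptions), a sequence of touch coordinates can be classified as a tap or a directional swipe:

```python
# Hypothetical sketch: classifying a touch gesture from sampled (x, y)
# screen coordinates. The 50-pixel swipe threshold is an illustrative assumption.

def classify_gesture(points, swipe_threshold: float = 50.0) -> str:
    """Classify a sequence of (x, y) touch samples as a tap or a swipe."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    # Small total movement in both axes counts as a tap.
    if abs(dx) < swipe_threshold and abs(dy) < swipe_threshold:
        return "tap"
    # Otherwise report the dominant axis of motion
    # (screen y grows downward, so negative dy means "up").
    if abs(dx) >= abs(dy):
        return "swipe-right" if dx > 0 else "swipe-left"
    return "swipe-down" if dy > 0 else "swipe-up"

print(classify_gesture([(0, 0), (120, 5)]))  # swipe-right
print(classify_gesture([(10, 10), (12, 11)]))  # tap
```

Production gesture recognizers also consider velocity, timing, and multi-finger input, but the core idea is the same: continuous sensor data is reduced to a small vocabulary of learned gestures.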

In the area of controlling these systems, there is a need to move away from GUIs and instead find other means of interaction that use the full capabilities of all our senses. Hardware interface design addresses this by taking physical forms and objects and connecting them with digital information, letting the user control virtual data flow by grasping, moving and manipulating those physical forms.

If one sees classic industrial hardware interface design as an "analog" method, it finds its digital counterpart in the HID approach. Instead of translating analog methods of control into a virtual form via a GUI, the tangible user interface (TUI) can be seen as an approach that does the exact opposite: translating purely digital processes into analog methods of interaction. [1] [ unreliable source ]
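The TUI idea described above can be sketched in code. In this minimal, hypothetical example (the tag identifiers, action names, and class are illustrative assumptions, not any real system's API), physical tokens placed on a sensing surface are mapped to digital state:

```python
# Hypothetical sketch: a tangible interface that maps physical tokens
# (identified, say, by RFID tags) to digital actions. Tag IDs and action
# names are illustrative assumptions.

class TangibleSurface:
    """Track physical tokens on a sensing surface and expose their
    arrangement as digital state."""

    def __init__(self, token_actions):
        self.token_actions = token_actions  # tag id -> named digital action
        self.placed = {}                    # tag id -> (x, y) position

    def place(self, tag_id: str, x: float, y: float):
        """A token was set down on the surface at (x, y)."""
        self.placed[tag_id] = (x, y)

    def remove(self, tag_id: str):
        """A token was picked up off the surface."""
        self.placed.pop(tag_id, None)

    def active_actions(self):
        """Digital state derived from the current physical arrangement."""
        return sorted(self.token_actions[t] for t in self.placed
                      if t in self.token_actions)

surface = TangibleSurface({"tag-a": "play", "tag-b": "loop"})
surface.place("tag-a", 0.2, 0.5)
surface.place("tag-b", 0.7, 0.5)
print(surface.active_actions())  # both tokens active
surface.remove("tag-b")
print(surface.active_actions())  # picking up a token changes digital state
```

Grasping, placing, and removing the tokens is the "analog" interaction; the resulting list of active actions is the purely digital process it controls.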

Examples

Example hardware interfaces include a computer mouse, TV remote control, kitchen timer, control panel for a nuclear power plant [2] and an aircraft cockpit. [3]


Related Research Articles

Graphical user interface

A graphical user interface, or GUI, is a form of user interface that allows users to interact with electronic devices through graphical icons and visual indicators such as secondary notation. In many applications, GUIs are used instead of text-based UIs, which are based on typed command labels or text navigation. GUIs were introduced in reaction to the perceived steep learning curve of command-line interfaces (CLIs), which require commands to be typed on a computer keyboard.

Computer-aided design

Computer-aided design (CAD) is the use of computers to aid in the creation, modification, analysis, or optimization of a design. This software is used to increase the productivity of the designer, improve the quality of design, improve communications through documentation, and to create a database for manufacturing. Designs made through CAD software help protect products and inventions when used in patent applications. CAD output is often in the form of electronic files for print, machining, or other manufacturing operations. The terms computer-aided drafting (CAD) and computer-aided design and drafting (CADD) are also used.

User interface

In the industrial design field of human–computer interaction, a user interface (UI) is the space where interactions between humans and machines occur. The goal of this interaction is to allow effective operation and control of the machine from the human end, while the machine simultaneously feeds back information that aids the operators' decision-making process. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools, heavy machinery operator controls and process controls. The design considerations applicable when creating user interfaces are related to, or involve such disciplines as, ergonomics and psychology.

Embedded system

An embedded system is a specialized computer system—a combination of a computer processor, computer memory, and input/output peripheral devices—that has a dedicated function within a larger mechanical or electronic system. It is embedded as part of a complete device often including electrical or electronic hardware and mechanical parts. Because an embedded system typically controls physical operations of the machine that it is embedded within, it often has real-time computing constraints. Embedded systems control many devices in common use. In 2009, it was estimated that ninety-eight percent of all microprocessors manufactured were used in embedded systems.

Usability

Usability can be described as the capacity of a system to provide a condition for its users to perform tasks safely, effectively, and efficiently while enjoying the experience. In software engineering, usability is the degree to which software can be used by specified consumers to achieve quantified objectives with effectiveness, efficiency, and satisfaction in a quantified context of use.

ISO 9241 is a multi-part standard from the International Organization for Standardization (ISO) covering ergonomics of human-system interaction and related, human-centered design processes. It is managed by the ISO Technical Committee 159. It was originally titled Ergonomic requirements for office work with visual display terminals (VDTs). From 2006 onwards, the standards were retitled to the more generic Ergonomics of Human System Interaction.

Interaction design, often abbreviated as IxD, is "the practice of designing interactive digital products, environments, systems, and services." While interaction design has an interest in form, its main area of focus rests on behavior. Rather than analyzing how things are, interaction design synthesizes and imagines things as they could be. This element of interaction design is what characterizes IxD as a design field, as opposed to a science or engineering field.

WIMP (computing)

In human–computer interaction, WIMP stands for "windows, icons, menus, pointer", denoting a style of interaction using these elements of the user interface. Other expansions are sometimes used, such as substituting "mouse" and "mice" for menus, or "pull-down menu" and "pointing" for pointer.

The following outline is provided as an overview of and topical guide to human–computer interaction:

An output device is any piece of computer hardware that converts information or data into a human-perceptible form or, historically, into a physical machine-readable form for use with other non-computerized equipment. It can be text, graphics, tactile, audio, or video. Examples include monitors, printers and sound cards.

Human-centered computing (HCC) studies the design, development, and deployment of mixed-initiative human-computer systems. It emerged from the convergence of multiple disciplines that are concerned both with understanding human beings and with the design of computational artifacts. Human-centered computing is closely related to human-computer interaction and information science. Human-centered computing is usually concerned with systems and practices of technology use, while human-computer interaction is more focused on ergonomics and the usability of computing artifacts, and information science is focused on practices surrounding the collection, manipulation, and use of information.

Tangible user interface

A tangible user interface (TUI) is a user interface in which a person interacts with digital information through the physical environment. The initial name was Graspable User Interface, which is no longer used. The purpose of TUI development is to empower collaboration, learning, and design by giving physical forms to digital information, thus taking advantage of the human ability to grasp and manipulate physical objects and materials.

Cognitive ergonomics is a scientific discipline that studies, evaluates, and designs tasks, jobs, products, environments and systems and how they interact with humans and their cognitive abilities. It is defined by the International Ergonomics Association as "concerned with mental processes, such as perception, memory, reasoning, and motor response, as they affect interactions among humans and other elements of a system." Cognitive ergonomics is responsible for how work is done in the mind, meaning the quality of work is dependent on the person's understanding of situations. Situations could include the goals, means, and constraints of work. The relevant topics include mental workload, decision-making, skilled performance, human-computer interaction, human reliability, work stress and training as these may relate to human-system design. Cognitive ergonomics studies cognition in work and operational settings, in order to optimize human well-being and system performance. It is a subset of the larger field of human factors and ergonomics.

Hardware architecture

In engineering, hardware architecture refers to the identification of a system's physical components and their interrelationships. This description, often called a hardware design model, allows hardware designers to understand how their components fit into a system architecture and provides to software component designers important information needed for software development and integration. Clear definition of a hardware architecture allows the various traditional engineering disciplines to work more effectively together to develop and manufacture new machines, devices and components.

User experience design defines the experience a user goes through when interacting with a company, its services, and its products. It is a user-centered design approach because it considers the user's experience when using a product or platform. In UX design, research, data analysis, and test results drive design decisions rather than aesthetic preferences and opinions; this practice is known as UX design research. Unlike user interface design, which focuses solely on the design of a computer interface, UX design encompasses all aspects of a user's perceived experience with a product or website, such as its usability, usefulness, desirability, brand perception, and overall performance. UX design is also an element of the customer experience (CX) and encompasses all design aspects and design stages surrounding a customer's experience.

Graphical system design (GSD) is a modern approach to designing measurement and control systems that integrates system design software with COTS hardware to dramatically simplify development. This approach combines user interfaces, models of computation, math and analysis, input/output signals, technology abstractions, and various deployment targets. It allows domain experts, or non-implementation experts, to access design capabilities for which they would traditionally need to outsource to a system design expert.

A projection augmented model is an element sometimes employed in virtual reality systems. It consists of a physical three-dimensional model onto which a computer image is projected to create a realistic looking object. Importantly, the physical model is the same geometric shape as the object that the PA model depicts.

In computing, a natural user interface (NUI) or natural interface is a user interface that is effectively invisible, and remains invisible as the user continuously learns increasingly complex interactions. The word "natural" is used because most computer interfaces use artificial control devices whose operation has to be learned. Examples include voice assistants, such as Alexa and Siri, touch and multitouch interactions on today's mobile phones and tablets, but also touch interfaces invisibly integrated into the textiles of furniture.

Human–computer interaction

Human–computer interaction (HCI) is research in the design and the use of computer technology, which focuses on the interfaces between people (users) and computers. HCI researchers observe the ways humans interact with computers and design technologies that allow humans to interact with computers in novel ways. A device that allows interaction between a human being and a computer is known as a "human-computer interface".

SHELL model

In aviation, the SHELL model is a conceptual model of human factors that helps to clarify the location and cause of human error within an aviation environment.

References

  1. "Human factors and ergonomics of future Smarthome Appliances". Protonet. Retrieved 16 January 2016.
  2. E.E. Shultz; G.L. Johnson (1988). "User interface design in safety parameter display systems: direction for enhancement". Conference Record for 1988 IEEE Fourth Conference on Human Factors and Power Plants. Lawrence Livermore Nat. Lab. pp. 165–170. doi:10.1109/HFPP.1988.27496.
  3. Lance Sherry; Peter Polson; Michael Feary. "Designing User-Interfaces for the Cockpit" (PDF). Society of Automotive Engineers. Retrieved 28 June 2011.