Gulf of execution

In human–computer interaction, the gulf of execution is the gap between a user's goal and the actions available for achieving it. [1] One of the primary goals of usability is to narrow this gap by removing roadblocks and extra steps that demand additional thought and action, distract the user's attention from the intended task, interrupt the flow of work, and reduce the chance of completing the task successfully. [2] The complementary gulf of evaluation is the gap between an external stimulus and the moment a person understands what it means. [3] Both terms were first introduced in Donald Norman's 1986 book User Centered System Design: New Perspectives on Human-computer Interaction. [1] [4]

Example

This can be illustrated by the problem of programming a VCR. Suppose a user wants to record a television show. They see the solution as simply pressing the Record button. In reality, however, recording a show on a VCR requires several actions:

  1. Press the record button.
  2. Specify time of recording, usually involving several steps to change the hour and minute settings.
  3. Select the channel to record, either by entering the channel number or by selecting it with the up/down buttons.
  4. Save the recording settings, perhaps by pressing an "OK" or "menu" or "enter" button.

The difference between the user's perceived execution actions and the required actions is the gulf of execution.
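The mismatch above can be sketched in code. This is an illustrative model, not anything from the source: it treats the gulf of execution as the set of required steps the user did not anticipate.

```python
# Illustrative sketch: the gulf of execution as the mismatch between
# the actions a user expects to perform and the actions the system
# actually requires. The step names are invented for this example.

perceived_actions = ["press record"]

required_actions = [
    "press record",
    "set recording time",
    "select channel",
    "save settings",
]

# The "gulf" here is simply the required steps the user did not anticipate.
gulf_of_execution = [a for a in required_actions if a not in perceived_actions]

print(gulf_of_execution)
```

A wider gulf (more unanticipated steps) corresponds to an interface that is harder to use; usability work aims to shrink this list toward what the user already expects.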

Example of the gulf of execution

Suppose a user wants to save a document in a word processor but is unsure how to access the "Save" feature, or cannot find it easily because of unclear labeling or hidden menus. The user's intention to save the document is not matched by an obvious action in the interface; this mismatch is a gulf of execution, and it frustrates the user and makes the task harder to complete.

Example of the gulf of evaluation

An example of the Gulf of Evaluation can be seen in the context of a voice-controlled virtual assistant, such as Amazon Alexa or Google Assistant. Imagine a user giving a command to the virtual assistant to play a specific song from their music library. After issuing the command, the virtual assistant responds by playing a different song or fails to understand the command altogether.

In this scenario, the Gulf of Evaluation is wide because the user may have difficulty understanding why the virtual assistant played the wrong song or why it didn't recognize the command. The user's mental model of the system's response and behavior may not align with the actual outcome, leading to frustration and confusion. The system's feedback, in this case, is not adequately helping the user evaluate whether their desired action was successful or not.

To bridge the Gulf of Evaluation, designers could improve the feedback provided by the virtual assistant. For example, the assistant could respond with a confirmation message, such as "Playing song 'X' from your library" to ensure the user understands what action it will take. If the assistant misinterprets the command, it could provide an informative error message, such as "I'm sorry, I couldn't find the song you requested. Please try again."

By providing clear and meaningful feedback, the Gulf of Evaluation can be narrowed, enabling users to better understand the system's response and assess the success of their actions.
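The feedback pattern described above can be sketched as a small function. This is a hypothetical illustration, not real assistant code: the song library and the matching rule are invented, and a real assistant would involve speech recognition and fuzzy matching.

```python
# Hypothetical sketch: an assistant that always reports what it
# understood, narrowing the gulf of evaluation. The library contents
# and matching rule are invented for illustration.

library = {"blue monday", "take on me", "tainted love"}

def handle_play_command(requested_song: str) -> str:
    """Return explicit feedback so the user can evaluate the outcome."""
    song = requested_song.lower().strip()
    if song in library:
        # Confirmation message: the user knows exactly what will happen.
        return f"Playing '{requested_song}' from your library"
    # Informative error: the user knows the command failed, and why.
    return f"I'm sorry, I couldn't find '{requested_song}'. Please try again."

print(handle_play_command("Blue Monday"))
print(handle_play_command("Yesterday"))
```

The key design choice is that both branches return explicit feedback: the user never has to guess whether the system understood the command.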

Related Research Articles

Context menu

A context menu is a menu in a graphical user interface (GUI) that appears upon user interaction, such as a right-click mouse operation. A context menu offers a limited set of choices that are available in the current state, or context, of the operating system or application to which the menu belongs. Usually the available choices are actions related to the selected object. From a technical point of view, such a context menu is a graphical control element.

Pointing device gesture

In computing, a pointing device gesture or mouse gesture is a way of combining pointing device or finger movements and clicks that the software recognizes as a specific computer event and responds to accordingly. They can be useful for people who have difficulties typing on a keyboard. For example, in a web browser, a user can navigate to the previously viewed page by pressing the right pointing device button, moving the pointing device briefly to the left, then releasing the button.

In object-oriented programming, the command pattern is a behavioral design pattern in which an object is used to encapsulate all information needed to perform an action or trigger an event at a later time. This information includes the method name, the object that owns the method and values for the method parameters.
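The description above can be made concrete with a minimal sketch of the command pattern (the class and method names here are illustrative, not from the source): the command object bundles the receiver, the method name, and the parameter values so the action can be invoked later.

```python
# Minimal command pattern sketch: a Command object encapsulates the
# receiver, the method name, and the method parameters, deferring the
# actual invocation until execute() is called.

class Light:
    """An example receiver with one action."""
    def __init__(self):
        self.level = 0

    def dim(self, level):
        self.level = level

class Command:
    """Encapsulates a receiver, a method name, and its parameters."""
    def __init__(self, receiver, method_name, *args):
        self.receiver = receiver
        self.method_name = method_name
        self.args = args

    def execute(self):
        # Look up the method on the receiver and call it with the
        # stored arguments.
        getattr(self.receiver, self.method_name)(*self.args)

light = Light()
cmd = Command(light, "dim", 50)   # nothing happens yet
cmd.execute()                     # the deferred action is triggered
print(light.level)                # -> 50
```

Because the command carries everything needed to run, it can be queued, logged, or undone later without the caller knowing anything about the receiver.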

In computer science, human–computer interaction, and interaction design, direct manipulation is an approach to interfaces which involves continuous representation of objects of interest together with rapid, reversible, and incremental actions and feedback. As opposed to other interaction styles, for example, the command language, the intention of direct manipulation is to allow a user to manipulate objects presented to them, using actions that correspond at least loosely to manipulation of physical objects. An example of direct manipulation is resizing a graphical shape, such as a rectangle, by dragging its corners or edges with a mouse.

Menu (computing)

In user interface design, a menu is a list of options presented to the user.

Windows key

The Windows logo key is a keyboard key which was originally introduced on Microsoft's Natural Keyboard in 1994. This key became a standard key on PC keyboards. In Windows, pressing the key brings up the start menu. Ctrl+Esc performs the same function, in case the keyboard lacks this key.

ISO 9241 is a multi-part standard from the International Organization for Standardization (ISO) covering ergonomics of human-computer interaction. It is managed by the ISO Technical Committee 159. It was originally titled Ergonomic requirements for office work with visual display terminals (VDTs). From 2006 onwards, the standards were retitled to the more generic Ergonomics of Human System Interaction.

Common User Access (CUA) is a standard for user interfaces to operating systems and computer programs. It was developed by IBM and first published in 1987 as part of their Systems Application Architecture. Used originally in the MVS/ESA, VM/CMS, OS/400, OS/2 and Microsoft Windows operating systems, parts of the CUA standard are now implemented in programs for other operating systems, including variants of Unix. It is also used by Java AWT and Swing.

The human action cycle is a psychological model which describes the steps humans take when they interact with computer systems. The model was proposed by Donald A. Norman, a scholar in the discipline of human–computer interaction. The model can be used to help evaluate the efficiency of a user interface (UI). Understanding the cycle requires an understanding of the user interface design principles of affordance, feedback, visibility and tolerance.

The cognitive walkthrough method is a usability inspection method used to identify usability issues in interactive systems, focusing on how easy it is for new users to accomplish tasks with the system. A cognitive walkthrough is task-specific, whereas heuristic evaluation takes a holistic view to catch problems not caught by this and other usability inspection methods. The method is rooted in the notion that users typically prefer to learn a system by using it to accomplish tasks, rather than, for example, studying a manual. The method is prized for its ability to generate results quickly with low cost, especially when compared to usability testing, as well as the ability to apply the method early in the design phases before coding even begins.

A voice-user interface (VUI) enables spoken human interaction with computers, using speech recognition to understand spoken commands and answer questions, and typically text to speech to play a reply. A voice command device is a device controlled with a voice user interface.

User interface design

User interface (UI) design or user interface engineering is the design of user interfaces for machines and software, such as computers, home appliances, mobile devices, and other electronic devices, with a focus on maximizing usability and the user experience. In software design, UI design primarily concerns information architecture: building interfaces that clearly communicate to the user what is important. The term covers graphical user interfaces as well as other forms of interface design. The goal of user interface design is to make the user's interaction as simple and efficient as possible in terms of accomplishing the user's goals.

The Windows shell is the graphical user interface for the Microsoft Windows operating system. Its readily identifiable elements consist of the desktop, the taskbar, the Start menu, the task switcher and the AutoPlay feature. On some versions of Windows, it also includes Flip 3D and the charms. In Windows 10, the Windows Shell Experience Host interface drives visuals like the Start Menu, Action Center, Taskbar, and Task View/Timeline. However, the Windows shell also implements a shell namespace that enables computer programs running on Windows to access the computer's resources via the hierarchy of shell objects. "Desktop" is the top object of the hierarchy; below it there are a number of files and folders stored on the disk, as well as a number of special folders whose contents are either virtual or dynamically created. Recycle Bin, Libraries, Control Panel, This PC and Network are examples of such shell objects.

In computer science, the gulf of evaluation is the degree to which the system or artifact provides representations that can be directly perceived and interpreted in terms of the expectations and intentions of the user. Or put differently, the gulf of evaluation is the difficulty of assessing the state of the system and how well the artifact supports the discovery and interpretation of that state. According to Donald Norman's The Design of Everyday Things "The gulf is small when the system provides information about its state in a form that is easy to get, is easy to interpret, and matches the way the person thinks of the system".

Seven stages of action is a term coined by the usability consultant Donald Norman. The phrase appears in chapter two of his book The Design of Everyday Things, describing the psychology of a person performing a task.

The term natural mapping refers to arrangements in which the layout and movement of controls correspond naturally to their effects in the world. The purpose of natural mappings is to reduce the amount of information a user must recall to perform a task. The term is widely used in human–computer interaction (HCI) and interaction design. Mapping helps bridge the gulf of evaluation, the gap between the user's understanding of the system and its actual state, and the gulf of execution, the gap between the user's goal and how to achieve it with the interface. When controls mirror the real world, the user can more easily form a mental model of a control and use it to achieve the desired intention.

A context-sensitive user interface offers the user options based on the state of the active program. Context sensitivity is ubiquitous in current graphical user interfaces, often in context menus.

Interaction technique

An interaction technique, user interface technique or input technique is a combination of hardware and software elements that provides a way for computer users to accomplish a single task. For example, one can go back to the previously visited page on a Web browser by either clicking a button, pressing a key, performing a mouse gesture or uttering a speech command. It is a widely used term in human-computer interaction. In particular, the term "new interaction technique" is frequently used to introduce a novel user interface design idea.

In computing, 3D interaction is a form of human-machine interaction where users are able to move and perform interaction in 3D space. Both human and machine process information where the physical position of elements in the 3D space is relevant.

References

  1. "Gulf of Evaluation and Gulf of Execution". The Interaction Design Foundation.
  2. "The Two UX Gulfs: Evaluation and Execution". Nielsen Norman Group. Retrieved 2019-03-18.
  3. Gazdecki, Gabriella (2016-11-01). "The Gulf of Execution (and Evaluation)". Medium. Retrieved 2019-03-18.
  4. Norman, Don (1986). User Centered System Design: New Perspectives on Human-computer Interaction. CRC. ISBN 978-0-89859-872-8.