Mouse tracking


Mouse tracking (also known as cursor tracking) is the use of software to collect users' mouse cursor positions on the computer. [1] The goal is to automatically gather richer information about what people are doing, typically to improve the design of an interface. Mouse tracking is often done on the Web and can supplement eye tracking in some situations.


When mouse tracking takes place without the user's consent, for example on a website, there may be privacy implications.

History

The computer mouse was first publicly demonstrated in 1968 by Douglas Engelbart. [2] The term mouse tracking originally referred to how movements were captured and transmitted to the computer. For example, the original trackball mouse used a metal bearing pressed against two rollers to track movement. [2] Much research and engineering has gone into determining what type of tracking mechanism provides the most accurate depiction of the user's movement.

With the advent of the World Wide Web, mouse tracking was expanded to include click data. Researchers and developers would track and record each time a user clicked something on a website, as well as the location of the event. Web developers use these mouse clicks to assess what information users are interested in and how they interact with a page. Advertisers are also interested in click data for banner advertisements, in particular where to place ads on pages to get the most click-throughs.

More recently, the term mouse tracking has broadened further to cover a much larger area of research aimed at understanding human–computer interaction (HCI). This development began with eye tracking. While eye tracking has been around since the 1800s, it was not used in HCI until the 1980s, primarily to help answer questions about how users search for commands in computer menus and to develop systems to help disabled users. [3] More recently, eye tracking has been used in usability testing on web pages to understand a user's point of focus as well as to test the usability of different features of a site, such as dropdown menus. [4] This information can influence Web design so that it meets the researcher's goals yet remains user friendly.

The problem with using eye tracking in usability testing is the required hardware and its expense. Additionally, eye tracking is limited to small sample sizes and unnatural browsing environments. Mouse tracking, on the other hand, is inexpensive, and the data can be collected from any computer. It is in this capacity that mouse tracking was re-invented in HCI research. Eye tracking researchers in the late 1990s noticed patterns between eye and mouse movements. [5] Based on these findings, researchers who had been tracking click data realized there might be more to learn from the mouse. In 2001, Mon-Chu Chen, John Anderson, and Myeong-Ho Sohn at Carnegie Mellon University began explicitly investigating whether tracking mouse movements could be used as a proxy for tracking eye movements. This research has continued through the 2000s and to the present. [6] [7] [8] [9]

The general finding of this research is that the correlation is not one to one, but there is a relationship between eye and mouse movements, which in turn suggests that mouse movements can be used to determine a user's focus of attention. More recent research has shown that the correlation depends strongly on the user's behavior at the time, such as whether the user is reading with the mouse, moving it to perform a click, or leaving it idle. [10] Furthermore, the mouse position correlates better with past eye-gaze positions: people typically look at a location roughly 700 ms before moving the mouse there. [10] In general, tracking mouse positions leads to a much better understanding of the user than relying on mouse clicks alone. Click data informs researchers of a user's primary focus of attention, or their end choice, whereas the full record of mouse movements can also reveal options that interested the user but were never clicked, leading to a better overall understanding of the user's thought process.

The most recent research in this area uses this knowledge to improve websites and applications. Specifically, researchers are trying to parse out what different individual movements mean, and are beginning to use mouse tracking in usability testing to improve products and pages. [11]

Mouse tracking technology and techniques

JavaScript

JavaScript is a scripting language that supports multiple programming styles. It typically runs client-side, so it does not require constant downloads from the website. JavaScript is implemented as part of a Web browser and is supported by all the major web browsers, including Internet Explorer, Firefox and Safari.

Using this language, Web developers can therefore track users' mouse movements simply by adding a few lines of code to a page. No additional software has to be installed on the user's computer; users only need to have JavaScript enabled for the researcher to collect data from the webpage. Mouse tracking using JavaScript has been deployed on high-traffic websites such as search engines [12] to collect mouse movement data without affecting the user's computer performance.
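A minimal client-side sketch of this approach is shown below. It is illustrative only: the /track collection endpoint is a hypothetical name, and the 50 ms sampling interval and five-second flush period are arbitrary choices. The script samples mousemove events, buffers the coordinates, and periodically sends them to the server so tracking does not interrupt the user.

// Minimal mouse-tracking sketch (illustrative only; /track is a hypothetical endpoint).
// Coordinates are buffered and flushed in batches so tracking does not slow the page down.
(function () {
  var buffer = [];
  var lastSample = 0;

  document.addEventListener('mousemove', function (event) {
    var now = Date.now();
    if (now - lastSample < 50) return;   // sample at most once every 50 ms
    lastSample = now;
    buffer.push({ t: now, x: event.pageX, y: event.pageY, type: 'move' });
  });

  document.addEventListener('click', function (event) {
    buffer.push({ t: Date.now(), x: event.pageX, y: event.pageY, type: 'click' });
  });

  // Flush the buffer every 5 seconds and whenever the page becomes hidden.
  function flush() {
    if (buffer.length === 0) return;
    var payload = JSON.stringify(buffer);
    buffer = [];
    // sendBeacon queues the request so it survives page unload.
    navigator.sendBeacon('/track', payload);
  }
  setInterval(flush, 5000);
  document.addEventListener('visibilitychange', function () {
    if (document.visibilityState === 'hidden') flush();
  });
})();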

Data

Current mouse tracking tools provide a variety of data, including the location of the mouse (in pixels), time stamps, hovers over links of interest, mouse clicks, time spent in areas of interest, and the duration of hovers. Additionally, some tracking tools provide higher-level analyses, such as heat maps and playbacks that can retrace the mouse's trajectory. [6] [13] [14] An example of an output log is below: [15]

141.84.8.77 2006-09-01,18:44:07 serverdata 8
141.84.8.77 2006-09-01,19:44:08 8 load size=1047x529
141.84.8.77 2006-09-01,19:44:08 8 mousemove coord=283,2
141.84.8.77 2006-09-01,19:44:09 8 mousemove coord=257,125
141.84.8.77 2006-09-01,19:44:10 8 mouseover coord=247,152 name=f dom=abae
141.84.8.77 2006-09-01,19:44:13 8 select radio id=lgr value=lr%3Dlang_de dom=abaecabaac
141.84.8.77 2006-09-01,19:44:16 8 click coord=374,187 name=q dom=abaecaabb
141.84.8.77 2006-09-01,19:44:17 8 keyPress key=H
141.84.8.77 2006-09-01,19:44:17 8 keypress key=a
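Each entry combines a client address, a timestamp, a session identifier, an event type, and event-specific attributes. The sketch below (illustrative only; the field names are assumptions based on the per-event lines above, not an official log schema) parses such lines into structured records that downstream analyses, such as hover-duration or heat-map computations, could consume.

// Parse log lines of the form shown above into structured records.
// (Illustrative sketch; field names are assumptions, not an official schema.)
function parseLogLine(line) {
  var parts = line.trim().split(/\s+/);
  var record = {
    ip: parts[0],
    timestamp: parts[1],
    sessionId: parts[2],
    event: parts[3],
    attrs: {}
  };
  // Remaining tokens are key=value pairs such as coord=283,2 or name=f.
  for (var i = 4; i < parts.length; i++) {
    var kv = parts[i].split('=');
    record.attrs[kv[0]] = kv.slice(1).join('=');
  }
  return record;
}

// Example:
// parseLogLine('141.84.8.77 2006-09-01,19:44:08 8 mousemove coord=283,2')
//   -> { ip: '141.84.8.77', timestamp: '2006-09-01,19:44:08', sessionId: '8',
//        event: 'mousemove', attrs: { coord: '283,2' } }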

Applications

Usability testing

Mouse movements can be used to infer a user's intent and focus while browsing a website. By using mouse movements in usability testing, researchers can determine whether users are confused, whether their expectations are met, where their attention is focused, and much more. This can be especially beneficial in conjunction with other usability-testing techniques, such as think-aloud protocols, as the combined information can lead to a better model of mouse movement.
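As one simple illustration (a rough heuristic chosen for this sketch, not a method described in the cited studies), hesitation can be estimated from a recorded movement trace by flagging long gaps during which the cursor barely moves:

// Rough hesitation heuristic over a movement trace (an illustrative assumption, not a published method).
// points: [{ t: ms, x, y }, ...] in chronological order.
function findHesitations(points, pauseMs, radiusPx) {
  var hesitations = [];
  for (var i = 1; i < points.length; i++) {
    var dt = points[i].t - points[i - 1].t;
    var dx = points[i].x - points[i - 1].x;
    var dy = points[i].y - points[i - 1].y;
    var dist = Math.sqrt(dx * dx + dy * dy);
    // A long gap with little displacement suggests the user paused, perhaps reading or unsure what to do.
    if (dt >= pauseMs && dist <= radiusPx) {
      hesitations.push({ at: points[i - 1], durationMs: dt });
    }
  }
  return hesitations;
}

// Example: findHesitations(trace, 2000, 10) flags pauses of two seconds or more within a 10-pixel radius,
// which a researcher might review alongside think-aloud recordings.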

Real-time website adaptation

Tracking mouse movements can be used to adapt interfaces in real time based on users' interests. Researchers can use information such as where users hold the mouse for an extended period of time, and the trajectory of the mouse, to assess their level of interest in an object. [12] [16] [17] This knowledge can be used to re-sort search results based on individual relevance and to suggest other objects, products, or information that might interest the user.
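A minimal sketch of measuring hover dwell time per element is shown below. It assumes elements of interest are marked with a data-track attribute, a naming choice made purely for illustration; the resulting dwell times could then feed a re-ranking or recommendation step.

// Accumulate how long the cursor dwells over elements marked data-track (attribute name is an assumption).
var dwellTimes = {};   // element identifier -> accumulated hover time in ms

document.querySelectorAll('[data-track]').forEach(function (el) {
  var enteredAt = null;
  el.addEventListener('mouseenter', function () { enteredAt = Date.now(); });
  el.addEventListener('mouseleave', function () {
    if (enteredAt === null) return;
    var id = el.getAttribute('data-track');
    dwellTimes[id] = (dwellTimes[id] || 0) + (Date.now() - enteredAt);
    enteredAt = null;
  });
});

// Items with the longest dwell times can be treated as more relevant,
// e.g. sorted to the top of a result list or used to suggest related items.
function rankByDwell() {
  return Object.keys(dwellTimes).sort(function (a, b) {
    return dwellTimes[b] - dwellTimes[a];
  });
}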

Web design and evaluation

Mouse tracking allows Web developers to view the behavior of actual users in their natural browsing environment instead of in a laboratory. By tracking where the mouse is located, designers can evaluate the ease of use of their websites. Specifically, they can see how difficult it is for users to find and use certain features, such as scroll bars or dropdown menus, or to locate important links. Additionally, developers can see which parts of a page users are most interested in, which can influence the page layout if users are not focusing where the designer intends. [13]
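Aggregate views of this kind, such as the heat maps mentioned earlier, can be approximated by bucketing recorded cursor positions into a coarse grid of counts. The sketch below is a simplification; production tools typically also weight by dwell time and smooth the result before rendering it as an overlay.

// Bucket recorded cursor positions into a coarse grid to approximate a heat map (simplified sketch).
function buildHeatGrid(points, pageWidth, pageHeight, cellSize) {
  var cols = Math.ceil(pageWidth / cellSize);
  var rows = Math.ceil(pageHeight / cellSize);
  var grid = [];
  for (var r = 0; r < rows; r++) grid.push(new Array(cols).fill(0));

  points.forEach(function (p) {
    var col = Math.min(cols - 1, Math.floor(p.x / cellSize));
    var row = Math.min(rows - 1, Math.floor(p.y / cellSize));
    grid[row][col] += 1;   // each sample in a cell increases its "heat"
  });
  return grid;             // high-count cells mark areas where the cursor spent the most samples
}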

Online security and biometrics

Each computer user has their own unique way of using the mouse, which can be used as a biometric identifier. [18] [19] [20] An example of how mouse movements can be used for online security is as follows. Some people rarely engage the mouse until they need it to complete an action, while others are very active with their mouse and use it to read along with text on a page. For users who are active with their mouse, researchers have been able to “learn” a user’s typical behavior through a supervised learning method. [21] Once this behavior is learned, it can be linked to an individual’s account. If the behavior of a user deviates significantly from that user's learned, typical behavior, they can be locked out of the system until their identity is verified. This is another way of ensuring that a user is who they claim to be.
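As a simplified illustration of the kinds of features such a learner might use (the specific measures below are generic mouse-dynamics features chosen for this sketch, not necessarily those of the cited study), speed, variability, and pause statistics can be computed from a recorded trace and compared against a stored profile:

// Compute simple mouse-dynamics features from a trace (illustrative; not the cited study's exact feature set).
// points: [{ t: ms, x, y }, ...] in chronological order.
function mouseFeatures(points) {
  var speeds = [];
  var pauses = 0;
  for (var i = 1; i < points.length; i++) {
    var dt = points[i].t - points[i - 1].t;
    var dx = points[i].x - points[i - 1].x;
    var dy = points[i].y - points[i - 1].y;
    var dist = Math.sqrt(dx * dx + dy * dy);
    if (dt > 0) speeds.push(dist / dt);          // pixels per millisecond
    if (dt > 1000 && dist < 5) pauses += 1;      // long gap with almost no movement
  }
  var mean = speeds.reduce(function (a, b) { return a + b; }, 0) / (speeds.length || 1);
  var variance = speeds.reduce(function (a, b) { return a + (b - mean) * (b - mean); }, 0) / (speeds.length || 1);
  return { meanSpeed: mean, speedVariance: variance, pauseCount: pauses };
}

// Feature vectors like this, collected per session, can be fed to a supervised classifier trained on a
// user's past sessions; sessions whose features deviate strongly can trigger re-verification.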

Education

Mouse tracking has been used in education to help understand the impact of reading on a computer as opposed to on paper, and to propose ways that on-screen reading could be adapted to make understanding and learning easier. [22] It has also been used to identify off-task behavior in tutoring settings and, in physics, to understand how students perceive and process multimedia representations of real experiments. [23] [24]


Related Research Articles

Pointing device gesture

In computing, a pointing device gesture or mouse gesture is a way of combining pointing device or finger movements and clicks that the software recognizes as a specific computer event and responds to accordingly. They can be useful for people who have difficulties typing on a keyboard. For example, in a web browser, a user can navigate to the previously viewed page by pressing the right pointing device button, moving the pointing device briefly to the left, then releasing the button.

Pointing device – Human interface device for computers

A pointing device is a human interface device that allows a user to input spatial data to a computer. CAD systems and graphical user interfaces (GUI) allow the user to control and provide data to the computer using physical gestures by moving a hand-held mouse or similar device across the surface of the physical desktop and activating switches on the mouse. Movements of the pointing device are echoed on the screen by movements of the pointer and other visual changes. Common gestures are point and click and drag and drop.

Fitts's law – Predictive model of human movement

Fitts's law is a predictive model of human movement primarily used in human–computer interaction and ergonomics. The law predicts that the time required to rapidly move to a target area is a function of the ratio between the distance to the target and the width of the target. Fitts's law is used to model the act of pointing, either by physically touching an object with a hand or finger, or virtually, by pointing to an object on a computer monitor using a pointing device. It was initially developed by Paul Fitts.

Keystroke logging, often referred to as keylogging or keyboard capturing, is the action of recording (logging) the keys struck on a keyboard, typically covertly, so that a person using the keyboard is unaware that their actions are being monitored. Data can then be retrieved by the person operating the logging program. A keystroke recorder or keylogger can be either software or hardware.

Personal information management (PIM) is the study and implementation of the activities that people perform in order to acquire or create, store, organize, maintain, retrieve, and use informational items such as documents, web pages, and email messages for everyday use to complete tasks and fulfill a person's various roles; it is information management with intrapersonal scope. Personal knowledge management is by some definitions a subdomain.

WIMP (computing) – Style of human-computer interaction

In human–computer interaction, WIMP stands for "windows, icons, menus, pointer", denoting a style of interaction using these elements of the user interface. Other expansions are sometimes used, such as substituting "mouse" and "mice" for menus, or "pull-down menu" and "pointing" for pointer.

A recommender system, or a recommendation system, is a subclass of information filtering system that provides suggestions for items that are most pertinent to a particular user. Recommender systems are particularly useful when an individual needs to choose an item from a potentially overwhelming number of items that a service may offer.

Scrolling – Sliding motion vertically or horizontally over display devices

In computer displays, filmmaking, television production, and other kinetic displays, scrolling is sliding text, images or video across a monitor or display, vertically or horizontally. "Scrolling," as such, does not change the layout of the text or pictures but moves the user's view across what is apparently a larger image that is not wholly seen. A common television and movie special effect is to scroll credits, while leaving the background stationary. Scrolling may take place completely without user intervention or, on an interactive device, be triggered by touchscreen or a keypress and continue without further intervention until a further user action, or be entirely controlled by input devices.

Banner blindness – Tendency to ignore banner-size notices

Banner blindness is a phenomenon in web usability where visitors to a website consciously or unconsciously ignore banner-like information. A broader term covering all forms of advertising is ad blindness, and the mass of banners that people ignore is called banner noise.

GOMS is a specialized human information processor model for human-computer interaction observation that describes a user's cognitive structure in terms of four components. In the book The Psychology of Human Computer Interaction, written in 1983 by Stuart K. Card, Thomas P. Moran and Allen Newell, the authors introduce: "a set of Goals, a set of Operators, a set of Methods for achieving the goals, and a set of Selection rules for choosing among competing methods for goals." GOMS is widely used by usability specialists and computer system designers because it produces quantitative and qualitative predictions of how people will use a proposed system.

Exploratory search is a specialization of information exploration which represents the activities carried out by searchers who are:

In computing, post-WIMP comprises work on user interfaces, mostly graphical user interfaces, which attempt to go beyond the paradigm of windows, icons, menus and a pointing device, i.e. WIMP interfaces.

In computing, 3D interaction is a form of human-machine interaction where users are able to move and perform interaction in 3D space. Both human and machine process information where the physical position of elements in the 3D space is relevant.

GroupLens Research – Computer science research lab

GroupLens Research is a human–computer interaction research lab in the Department of Computer Science and Engineering at the University of Minnesota, Twin Cities specializing in recommender systems and online communities. GroupLens also works with mobile and ubiquitous technologies, digital libraries, and local geographic information systems.

Human–computer interaction – Academic discipline studying the relationship between computer systems and their users

Human–computer interaction (HCI) is research in the design and the use of computer technology, which focuses on the interfaces between people (users) and computers. HCI researchers observe the ways humans interact with computers and design technologies that allow humans to interact with computers in novel ways. A device that allows interaction between a human being and a computer is known as a "Human-computer Interface (HCI)".

Urban computing is an interdisciplinary field which pertains to the study and application of computing technology in urban areas. This involves the application of wireless networks, sensors, computational power, and data to improve the quality of densely populated areas. Urban computing is the technological framework for smart cities.

In web analytics, a session, or visit is a unit of measurement of a user's actions taken within a period of time or with regard to completion of a task. Sessions are also used in operational analytics and provision of user-specific recommendations. There are two primary methods used to define a session: time-oriented approaches based on continuity in user activity and navigation-based approaches based on continuity in a chain of requested pages.

Animal–computer interaction (ACI) is a field of research for the design and use of technology with, for and by animals covering different kinds of animals from wildlife, zoo and domesticated animals in different roles. It emerged from, and was heavily influenced by, the discipline of Human–computer interaction (HCI). As the field expanded, it has become increasingly multi-disciplinary, incorporating techniques and research from disciplines such as artificial intelligence (AI), requirements engineering (RE), and veterinary science.

Shumin Zhai – Human–computer interaction research scientist

Shumin Zhai is a Chinese-born American Canadian human–computer interaction (HCI) research scientist and inventor. He is known for his research on input devices and interaction methods, swipe-gesture-based touchscreen keyboards, eye-tracking interfaces, and models of human performance in human-computer interaction. His studies have contributed to both foundational models and understandings of HCI and to practical user interface designs and flagship products. He previously worked at IBM, where he invented the ShapeWriter text entry method for smartphones, a predecessor to the modern Swype keyboard. Dr. Zhai's publications have won the ACM UIST Lasting Impact Award and the IEEE Computer Society Best Paper Award, among others. He is currently a Principal Scientist at Google, where he leads and directs research, design, and development of human-device input methods and haptics systems.

Click tracking is the collection of user click behavior or user navigational behavior in order to derive insights and fingerprint users. Click behavior is commonly tracked using server logs which encompass click paths and clicked URLs. This log is often presented in a standard format including information like the hostname, date, and username. However, as technology develops, new software allows for in-depth analysis of user click behavior using hypervideo tools. Given that the internet can be considered a risky environment, research strives to understand why users click certain links and not others. Research has also been conducted to explore the user experience of privacy, for example by anonymizing users' personal identification information and improving how data collection consent forms are written and structured.

References

1. Lopez, Richard B.; Stillman, Paul E.; Heatherton, Todd F.; Freeman, Jonathan B. (2018). "Minding One's Reach (To Eat): The Promise of Computer Mouse-Tracking to Study Self-Regulation of Eating". Frontiers in Nutrition. 5: 43. doi:10.3389/fnut.2018.00043. ISSN 2296-861X. PMC 5972293. PMID 29872661.
2. Edwards, Benj (2008-12-08). "The computer mouse turns 40". Macworld. Retrieved 2012-02-23.
3. Jacob, Robert J.K.; Karn, Keith S. (2003). "Eye Tracking in Human-Computer Interaction and Usability Research". The Mind's Eye. Elsevier. pp. 573–605. doi:10.1016/b978-044451020-4/50031-1. ISBN 978-0-444-51020-4. Retrieved 2020-11-21.
4. Schiessl; Duda; Thoelke; Fischer. "Eye tracking and its application in usability and media research" (PDF). MMI Interaktiv. Retrieved 2013-10-18.
5. Byrne, Michael D.; Anderson, John R.; Douglass, Scott; Matessa, Michael (1999). "Eye tracking the visual search of click-down menus". Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: The CHI Is the Limit – CHI '99. ACM. p. 402. doi:10.1145/302979.303118. ISBN 0-201-48559-1. S2CID 2212549.
6. Mueller, Florian; Lockerd, Andrea (2001-03-31). "Cheese". CHI '01 Extended Abstracts on Human Factors in Computing Systems – CHI '01. ACM. p. 279. doi:10.1145/634067.634233. ISBN 1-58113-340-5. S2CID 6301468.
7. Guo, Qi; Agichtein, Eugene (2010). "Towards predicting web searcher gaze position from mouse movements". Proceedings of the 28th International Conference Extended Abstracts on Human Factors in Computing Systems – CHI EA '10. ACM. p. 3601. doi:10.1145/1753846.1754025. ISBN 978-1-60558-930-5. S2CID 16330552.
8. Chen, Mon Chu; Anderson, John R.; Sohn, Myeong Ho (2001-03-31). "What can a mouse cursor tell us more?". CHI '01 Extended Abstracts on Human Factors in Computing Systems – CHI '01. ACM. p. 281. doi:10.1145/634067.634234. ISBN 1-58113-340-5. S2CID 16969703.
9. Rodden, Kerry; Fu, Xin; Aula, Anne; Spiro, Ian (2008). "Eye-mouse coordination patterns on web search results pages". Proceedings of the Twenty-Sixth Annual CHI Conference Extended Abstracts on Human Factors in Computing Systems – CHI '08. p. 2997. doi:10.1145/1358628.1358797. ISBN 978-1-60558-012-8. S2CID 1759484.
10. "User See, User Point: Gaze and Cursor Alignment in Web Search" (PDF).
11. "Google nabs patent to monitor your cursor movements". TechEye.net. 20 July 2010. Retrieved 2013-10-18.
12. "No Clicks, No Problem: Using Cursor Movements to Understand and Improve Search" (PDF).
13. Arroyo, Ernesto; Selker, Ted; Wei, Willy (2006-04-21). "Usability tool for analysis of web designs using mouse tracks". CHI '06 Extended Abstracts on Human Factors in Computing Systems – CHI EA '06. ACM. p. 484. doi:10.1145/1125451.1125557. ISBN 1-59593-298-4. S2CID 7684333.
14. Atterer; Wnuk; Schmidt. "Knowing the User's Every Move – User Activity Tracking for Website Usability Evaluation and Implicit Interaction" (PDF). Retrieved 2013-10-18.
15. "UsaProxy – Usability Proxy for Websites". Fnuked.de. Retrieved 2012-02-23.
16. Chris Crum (13 July 2010). "Google Eyes Mouse Movement as Possible Search Relevancy Signal". WebProNews. Retrieved 2012-02-23.
17. Guo, Qi; Agichtein, Eugene (2008-07-20). "Exploring mouse movements for inferring query intent". Proceedings of the 31st Annual International ACM SIGIR Conference on Research and Development in Information Retrieval – SIGIR '08. ACM. p. 707. doi:10.1145/1390334.1390462. ISBN 978-1-60558-164-4. S2CID 2334939.
18. Jorgensen, Zach; Yu, Ting (2011). "On mouse dynamics as a behavioral biometric for authentication". Proceedings of the 6th ACM Symposium on Information, Computer and Communications Security.
19. Weiss, Adam; et al. (2007). "Mouse movements biometric identification: A feasibility study". Proc. Student/Faculty Research Day, CSIS, Pace University, White Plains, NY.
20. Agustin, Francis. "Amazon reportedly wants to track its customer service employees by their keyboard strokes and mouse movements". Business Insider. Retrieved 2021-11-22.
21. Pusara, Maja; Brodley, Carla E. (2004-10-29). "User re-authentication via mouse movements". Proceedings of the 2004 ACM Workshop on Visualization and Data Mining for Computer Security – VizSEC/DMSEC '04. ACM. p. 1. doi:10.1145/1029208.1029210. ISBN 1-58113-974-8. S2CID 1016649.
22. "CHI 97: A Comparison of Reading Paper and On-Line Documents". Sigchi.org. Retrieved 2012-02-23.
23. Cetintas; Luo; Yan; Hord; Dake (2009). "Learning to Identify Students' Off-Task Behavior in Intelligent Tutoring Systems". Proceedings of the 2009 Conference on Artificial Intelligence in Education: Building Learning Systems that Care: From Knowledge Representation to Affective Modelling. IOS Press, Amsterdam. pp. 701–703. ISBN 978-1-60750-028-5.
24. Voßkühler, Adrian; Nordmeier, Volkhard; Kuchinke, Lars; Jacobs, Arthur M. (2008). "OGAMA (Open Gaze and Mouse Analyzer): Open-source software designed to analyze eye and mouse movements in slideshow study designs". Behavior Research Methods. 40 (4): 1150–62. doi:10.3758/BRM.40.4.1150. PMID 19001407.