Persuasive technology

Persuasive technology is broadly defined as technology that is designed to change users' attitudes or behaviors through persuasion and social influence, but not through coercion. [1] Such technologies are regularly used in sales, diplomacy, politics, religion, military training, public health, and management, and may potentially be used in any area of human-human or human-computer interaction. Most self-identified persuasive technology research focuses on interactive, computational technologies, including desktop computers, Internet services, video games, and mobile devices, [2] but the field incorporates and builds on the results, theories, and methods of experimental psychology, rhetoric, [3] and human-computer interaction. The design of persuasive technologies can be seen as a particular case of design with intent. [4]

Taxonomies

Functional triad

Persuasive technologies can be categorized by their functional roles. B. J. Fogg proposes the functional triad as a classification of three "basic ways that people view or respond to computing technologies": persuasive technologies can function as tools, media, or social actors – or as more than one at once. [5]

Direct interaction v. mediation

Persuasive technologies can also be categorized by whether they change attitudes and behaviors through direct interaction or through a mediating role: [13] do they persuade, for example, through human-computer interaction (HCI) or computer-mediated communication (CMC)? Technologies that persuade through direct interaction with the user fall into the former category, but there are many examples of the latter. Communication technologies can persuade, or amplify the persuasion of others, by transforming the social interaction, [14] [15] providing shared feedback on interaction, [16] or restructuring communication processes. [17]

Persuasion design

Persuasion design is the design of persuasive messages by analyzing and evaluating their content using established psychological research theories and methods. Andrew Chak [18] argues that the most persuasive web sites focus on making users feel comfortable about making decisions and on helping them act on those decisions. During the clinical encounter, clinical decision support tools (CDSTs) are widely used to improve patients' satisfaction with medical decision-making shared with physicians. [19] The comfort that a user feels is generally registered subconsciously. [20]

Persuasion by social motivators

Previous research has also drawn on social motivators, such as competition, for persuasion. By connecting a user with other users, [21] coworkers, [22] or friends and family, [23] a persuasive application can apply social motivators to promote behavior change. Social media platforms such as Facebook and Twitter also facilitate the development of such systems. Social influence has been shown to produce greater behavior change than interventions in which the user acts alone. [24]
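
As a loose illustration of this idea, and not a description of any particular system, the sketch below shows how an application might frame a user's daily progress relative to connected peers. The user names, the step-count metric, and the message wording are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class StepLog:
        """Hypothetical daily step count for one user."""
        user: str
        steps: int

    def social_comparison_message(me: StepLog, peers: list[StepLog]) -> str:
        """Frame the user's progress relative to connected peers (competition as a social motivator)."""
        ranked = sorted(peers + [me], key=lambda log: log.steps, reverse=True)
        rank = ranked.index(me) + 1
        if rank == 1:
            return f"{me.user}, you lead the group with {me.steps} steps today!"
        gap = ranked[0].steps - me.steps
        return f"{me.user}, you are #{rank} of {len(ranked)}; {gap} more steps to catch {ranked[0].user}."

    # Example usage
    print(social_comparison_message(StepLog("alex", 6200), [StepLog("sam", 8100), StepLog("kim", 5400)]))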

Persuasive strategies

Halko and Kientz conducted an extensive search of the literature for persuasive strategies and methods used in the field of psychology to modify health-related behaviors. [25] They concluded that there are eight main types of persuasive strategies, which can be grouped into the following four categories, each containing two complementary approaches (a schematic encoding of this taxonomy appears after the list of strategies below).

Instruction style

Authoritative

This persuades the technology user through an authoritative agent, for example, a strict personal trainer who instructs the user to perform the task that will meet their goal.[ citation needed ]

Non-authoritative

This persuades the user through a neutral agent, for example, a friend who encourages the user to meet their goals. Another example of this instruction style is customer reviews; a mix of positive and negative reviews gives a neutral perspective on a product or service. [26]

Social feedback

Cooperative

This persuades the user through the notion of cooperating and teamwork, such as allowing the user to team up with friends to complete their goals.[ citation needed ]

Competitive

This persuades the user through the notion of competing. For example, users can play against friends or peers and be motivated to achieve their goal by winning the competition.[ citation needed ]

Motivation type

Extrinsic

This persuades the user through external motivators, for example, winning a trophy as a reward for completing a task.[ citation needed ]

Intrinsic

This persuades the user through internal motivators, such as the good feeling a user would have for being healthy or for achieving a goal.[ citation needed ]

It is worth noting that intrinsic motivators can be subject to the overjustification effect, [27] which holds that when an intrinsically motivated activity becomes associated with a reward and the reward is later removed, intrinsic motivation tends to diminish. Depending on how the reward is perceived, the activity can become linked to extrinsic rather than intrinsic motivation. Badges, prizes, and other award systems can increase intrinsic motivation if they are seen as reflecting competence and merit.

In 1973, Lepper et al. conducted a foundational study of the overjustification effect. [28] Their team brought magic markers to a preschool and created three test groups of children who were intrinsically motivated to play with them. The first group was told that if they used the markers they could receive a "Good Player Award." The second group was not promised a reward for using the markers, but was given one after playing. The third group was neither promised nor given an award. A week later, all students could play with the markers without any reward. The students who had originally received the "Good Player Award" showed half as much interest as at the start of the study. Other psychologists later repeated the experiment and concluded that rewards create short-term motivation but undermine intrinsic motivation.

Reinforcement type

Negative reinforcement

This persuades the user by removing an unpleasant stimulus. For example, a brown and dying nature scene might turn green and healthy as the user practices more healthy behaviors.[ citation needed ]

Positive reinforcement

This persuades the user by adding a positive stimulus. For example, adding flowers, butterflies, and other nice-looking elements to an empty nature scene as a user practices more healthy behaviors.[ citation needed ]
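
A minimal sketch of how this taxonomy might be encoded, assuming a designer wants to enumerate the possible combinations of strategies; the dictionary keys mirror the categories above, but the data structure and helper function are illustrative and not part of Halko and Kientz's work.

    from itertools import product

    # The four categories from Halko and Kientz, each pairing two complementary strategies.
    PERSUASIVE_STRATEGIES = {
        "instruction style": ("authoritative", "non-authoritative"),
        "social feedback": ("cooperative", "competitive"),
        "motivation type": ("extrinsic", "intrinsic"),
        "reinforcement type": ("negative reinforcement", "positive reinforcement"),
    }

    def design_space():
        """Enumerate all 2**4 = 16 strategy combinations a designer could pick from."""
        categories = list(PERSUASIVE_STRATEGIES)
        for combo in product(*(PERSUASIVE_STRATEGIES[c] for c in categories)):
            yield dict(zip(categories, combo))

    for design in design_space():
        print(design)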

Logical fallacies

More recently, Lieto and Vernero [29] [30] have shown that arguments reducible to logical fallacies are a class of widely adopted persuasive techniques in both web and mobile technologies. These techniques have also shown their efficacy in large-scale studies of persuasive news recommendation [31] as well as in the field of human-robot interaction. [32] A 2021 report by the RAND Corporation [33] describes the use of logical fallacies as one of the rhetorical strategies used by Russia and its agents to influence online discourse and spread subversive information in Europe.

Reciprocal equality

One feature that distinguishes persuasion technology from familiar forms of persuasion is that the individual being persuaded often cannot respond in kind. This is a lack of reciprocal equality. For example, when a conversational agent persuades a user using social influence strategies, the user cannot also use similar strategies on the agent. [1]

Health behavior change

While persuasive technologies are found in many domains, considerable recent attention has focused on behavior change in health domains. Digital health coaching is the use of computers as persuasive technology to augment the personal care delivered to patients, and is employed in numerous medical settings. [34]

Numerous scientific studies show that online health behavior change interventions can influence users' behaviors. Moreover, the most effective interventions are modeled on health coaching, in which users are asked to set goals, are educated about the consequences of their behavior, and are then encouraged to track their progress toward their goals. Sophisticated systems even adapt to users who relapse by helping them get back on track. [35]
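
A minimal sketch of such a coaching loop, assuming a weekly goal and a simple relapse rule; the class name, goal metric, and messages are hypothetical and only illustrate the set-goal / track-progress / respond-to-relapse pattern described above.

    from dataclasses import dataclass, field

    @dataclass
    class CoachingPlan:
        """Hypothetical digital-coaching state: a weekly goal plus logged progress."""
        goal_per_week: int                                   # e.g. exercise sessions per week
        history: list[int] = field(default_factory=list)     # sessions completed each week

        def log_week(self, completed: int) -> str:
            self.history.append(completed)
            if completed >= self.goal_per_week:
                return "Goal met - keep the streak going."
            if len(self.history) >= 2 and self.history[-2] >= self.goal_per_week:
                # A lapse after earlier success: respond supportively rather than punitively.
                return "You slipped this week; let's restart with a smaller target."
            return f"{self.goal_per_week - completed} session(s) short - here is why consistency matters."

    plan = CoachingPlan(goal_per_week=3)
    for done in (3, 1, 2):
        print(plan.log_week(done))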

Maintaining behavior change over the long term is one of the challenges of behavior change interventions. For instance, non-adherence rates for chronic illness treatment regimens have been reported to be as high as 50% to 80%. Common strategies shown by previous research to increase long-term adherence to treatment include extended care, skills training, social support, treatment tailoring, self-monitoring, and multicomponent approaches. However, even though these strategies have been demonstrated to be effective, there are barriers to implementing such programs: limited time and resources, as well as patient factors such as embarrassment about disclosing health habits. [36]

To make behavior change strategies more effective, researchers have also been adapting well-known and empirically tested behavior change theories into practice. The most prominent theories implemented in health-related behavior change research have been self-determination theory, the theory of planned behavior, social cognitive theory, the transtheoretical model, and the social ecological model. Each theory analyzes behavior change in a different way and considers different factors to be more or less important. Research suggests that interventions based on behavior change theories tend to yield better results than interventions that do not employ such theories, although their effectiveness varies: social cognitive theory, proposed by Bandura and incorporating the well-known construct of self-efficacy, has been the most widely used in behavior change interventions as well as the most effective at maintaining long-term behavior change. [37]

Although the healthcare discipline has produced a plethora of empirical behavior change research, other scientific disciplines are also adapting such theories to induce behavior change. For instance, behavior change theories have been applied to sustainability, such as saving electricity, [38] and to lifestyle, such as helping people drink more water. [39] This research suggests that theories already proven useful in healthcare can be similarly effective at promoting behavior change in other fields.[ citation needed ]

Some studies have offered further insight, showing that behavior change is a complex chain of events: a study by Chudzynski et al. found that the reinforcement schedule has little effect on maintaining behavior change. [40] A study by Wemyss et al. noted that even when people who have maintained a behavior change for the short term revert to baseline, their perception of the change can differ: they may still believe they have maintained it even though they have not. [38] It is therefore possible that self-report measures are not always the most reliable way of evaluating the effectiveness of an intervention.[ citation needed ]

Promote sustainable lifestyles

Previous work has also shown that people are receptive to changing their behaviors toward more sustainable lifestyles. This result has encouraged researchers to develop persuasive technologies to promote, for example, green travel [41] and waste reduction. [22]

One common technique is to raise people's awareness of the benefits of performing eco-friendly behaviors. For example, a review of over twenty studies exploring the effects of feedback on household electricity consumption showed that feedback on consumption patterns typically results in savings of 5–12%. [42] Besides environmental benefits such as CO2 savings, health benefits and cost savings are also often used to promote eco-friendly behaviors. [41]
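
A minimal sketch of this kind of consumption feedback, assuming a weekly meter reading compared against a personal baseline; the price and emission factors are illustrative placeholders rather than real tariffs, and the function itself is hypothetical.

    def eco_feedback(week_kwh: float, baseline_kwh: float,
                     price_per_kwh: float = 0.25, kg_co2_per_kwh: float = 0.4) -> str:
        """Frame electricity-consumption feedback in environmental and monetary terms."""
        saved = baseline_kwh - week_kwh
        if saved <= 0:
            return f"You used {-saved:.1f} kWh more than your baseline this week."
        return (f"You saved {saved:.1f} kWh this week (~{saved * kg_co2_per_kwh:.1f} kg CO2, "
                f"~${saved * price_per_kwh:.2f}) compared with your baseline.")

    print(eco_feedback(week_kwh=42.0, baseline_kwh=50.0))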

Research challenges

Despite the promising results of existing persuasive technologies, three main challenges remain.

Technical challenges

Persuasive technologies rely on self-report or on automated systems that monitor human behavior using sensors and pattern recognition algorithms. Several studies in the medical field have noted that self-report is subject to bias, recall errors, and low adherence rates. The physical world and human behavior are both highly complex and ambiguous, so using sensors and machine learning algorithms to monitor and predict human behavior remains a challenging problem, especially since many persuasive technologies require just-in-time intervention.
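
As a rough sketch of what a just-in-time intervention trigger could look like, assuming a minute-level activity signal derived from a wearable sensor; the class name, window sizes, and prompt text are hypothetical.

    from collections import deque

    class SedentaryNudge:
        """Hypothetical just-in-time trigger: prompt the user to move when recent activity is low."""

        def __init__(self, window_minutes: int = 60, min_active_minutes: int = 5):
            self.window = deque(maxlen=window_minutes)   # 1 = active minute, 0 = inactive
            self.min_active = min_active_minutes

        def observe_minute(self, active: bool):
            """Feed one minute of sensor-derived activity; return a prompt or None."""
            self.window.append(1 if active else 0)
            if len(self.window) == self.window.maxlen and sum(self.window) < self.min_active:
                self.window.clear()                      # avoid re-prompting every minute
                return "You've been still for a while - time for a short walk?"
            return None

    nudge = SedentaryNudge(window_minutes=10, min_active_minutes=2)
    for minute in range(10):
        message = nudge.observe_minute(active=False)
        if message:
            print(f"minute {minute}: {message}")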

Difficulty in studying behavior change

In general, understanding behavior change requires long-term studies, because multiple internal and external factors (such as personality type, age, income, and willingness to change) can influence it. As a result, it is difficult to understand and measure the effect of persuasive technologies. Furthermore, meta-analyses of the effectiveness of persuasive technologies have shown that the behavior change evidence collected so far is at least controversial, since it is rarely obtained by randomized controlled trials (RCTs), [43] the "gold standard" in causal inference. In particular, because of the practical challenges of performing strict RCTs, [44] most of the above-mentioned empirical trials on lifestyles rely on voluntary, self-selected participants. If such participants were already systematically adopting the desired behaviors before entering the trial, self-selection biases would occur, weakening the behavior change effects found in the trials. Analyses aimed at identifying the presence and extent of self-selection biases in persuasive technology trials are not yet widespread. A study by Cellina et al. on an app-based behavior change trial in the mobility field found evidence of no self-selection bias. [45] However, further evidence needs to be collected in different contexts and with different persuasive technologies in order to generalize (or refute) their findings.[ citation needed ]
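
One simple way such an analysis might look, assuming pre-trial (baseline) measurements are available for both enrolled participants and a reference sample from the broader population; the variable names and numbers below are made-up illustrations, not data from any cited study.

    from statistics import mean
    from scipy import stats  # two-sample Welch t-test

    # Illustrative, made-up baseline values (e.g. weekly kilometers travelled by bicycle).
    participants_baseline = [12.1, 10.4, 9.8, 11.5, 10.9]   # self-selected enrollees
    population_baseline = [7.2, 8.1, 6.5, 9.0, 7.8, 8.4]    # reference sample

    t_stat, p_value = stats.ttest_ind(participants_baseline, population_baseline, equal_var=False)
    print(f"means: {mean(participants_baseline):.1f} vs {mean(population_baseline):.1f}, p = {p_value:.3f}")
    # A significantly higher baseline among enrollees would suggest they already favored
    # the target behavior before the trial - i.e. a self-selection bias.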

Ethical challenges

The question of manipulating feelings and desires through persuasive technology remains an open ethical debate. User-centered design guidelines should be developed that encourage ethically and morally responsible designs and provide a reasonable balance between the pros and cons of persuasive technologies. [46]

In addition to encouraging ethically and morally responsible designs, Fogg believes education, such as through the journal articles he writes, is a panacea for concerns about the ethical challenges of persuasive computers. [5] Fogg notes two fundamental distinctions regarding the importance of education in engaging with ethics and technology: "First, increased knowledge about persuasive computers allows people more opportunity to adopt such technologies to enhance their own lives, if they choose. Second, knowledge about persuasive computers helps people recognize when technologies are using tactics to persuade them." [5]

Another ethical challenge for persuasive technology designers is the risk of triggering persuasive backfires, where the technology triggers the bad behavior that it was designed to reduce. [47]

See also

Other subjects which have some overlap or features in common with persuasive technology include:

Related Research Articles

Persuasion

Persuasion or persuasion arts is an umbrella term for influence. Persuasion can influence a person's beliefs, attitudes, intentions, motivations, or behaviours.

The Stanford Behavior Design Lab is a research organization at Stanford University that advances behavior change methods and models. Founded in 1998 and directed by B. J. Fogg, the lab is a team of Stanford students, recent graduates, and quantitative researchers who study factors that influence human behavior and conduct IRB-approved research. The lab developed a systematic approach to designing for behavior change called "Behavior Design." The lab manager is Tanna Drapkin.

The elaboration likelihood model (ELM) of persuasion is a dual process theory describing the change of attitudes. The ELM was developed by Richard E. Petty and John Cacioppo in 1980. The model aims to explain different ways of processing stimuli, why they are used, and their outcomes on attitude change. The ELM proposes two major routes to persuasion: the central route and the peripheral route.

B. J. Fogg

Brian Jeffrey Fogg is an American social scientist and author who is a research associate and adjunct professor at Stanford University. He is the founder and director of the Stanford Behavior Design Lab, formerly known as the Persuasive Technology Lab.

Behavioural change theories are attempts to explain why human behaviours change. These theories cite environmental, personal, and behavioural characteristics as the major factors in behavioural determination. In recent years, there has been increased interest in the application of these theories in the areas of health, education, criminology, energy and international development with the hope that understanding behavioural change will improve the services offered in these areas. Some scholars have recently introduced a distinction between models of behavior and theories of change. Whereas models of behavior are more diagnostic and geared towards understanding the psychological factors that explain or predict a specific behavior, theories of change are more process-oriented and generally aimed at changing a given behavior. Thus, from this perspective, understanding and changing behavior are two separate but complementary lines of scientific investigation.

Captology is the study of computers as persuasive technologies. This area of inquiry explores the overlapping space between persuasion in general and computing technology. This includes the design, research, and program analysis of interactive computing products created for the purpose of changing people's attitudes or behaviors.

The unified theory of acceptance and use of technology (UTAUT) is a technology acceptance model formulated by Venkatesh and others in "User acceptance of information technology: Toward a unified view". The UTAUT aims to explain user intentions to use an information system and subsequent usage behavior. The theory holds that there are four key constructs: 1) performance expectancy, 2) effort expectancy, 3) social influence, and 4) facilitating conditions.

Inoculation theory is a social psychological/communication theory that explains how an attitude or belief can be made resistant to persuasion or influence, in analogy to how a body gains resistance to disease. The theory uses medical inoculation as its explanatory analogy but instead of applying it to disease, it is used to discuss attitudes and other positions, like opinions, values, and beliefs. It has applicability to public campaigns targeting misinformation and fake news, but it is not limited to misinformation and fake news.

Source credibility is "a term commonly used to imply a communicator's positive characteristics that affect the receiver's acceptance of a message." Academic studies of this topic began in the 20th century and were given special emphasis during World War II, when the US government sought to use propaganda to influence public opinion in support of the war effort. Psychologist Carl Hovland and his colleagues worked on this at the War Department during the 1940s and then continued experimental studies at Yale University. They built upon the work of researchers in the first half of the 20th century who had developed a Source-Message-Channel-Receiver model of communication and, with Muzafer Sherif, developed this as part of their theories of persuasion and social judgment.

Gamification

Gamification is the attempt to enhance systems, services, organizations, and activities by simulating experiences similar to those experienced when playing games in order to motivate and engage users. This is generally accomplished through the application of game design elements and game principles in non-game contexts.

The use of electronic and communication technologies as a therapeutic aid to healthcare practices is commonly referred to as telemedicine or eHealth. The use of such technologies as a supplement to mainstream therapies for mental disorders is an emerging mental health treatment field which, it is argued, could improve the accessibility, effectiveness and affordability of mental health care. Mental health technologies used by professionals as an adjunct to mainstream clinical practices include email, SMS, virtual reality, computer programs, blogs, social networks, the telephone, video conferencing, computer games, instant messaging and podcasts.

In social psychology, the Yale attitude change approach is the study of the conditions under which people are most likely to change their attitudes in response to persuasive messages. This approach to persuasive communications was first studied by Carl Hovland and his colleagues at Yale University during World War II. The basic model of this approach can be described as "who said what to whom": the source of the communication, the nature of the communication and the nature of the audience. According to this approach, many factors affect each component of a persuasive communication. The credibility and attractiveness of the communicator (source), the quality and sincerity of the message, and the attention, intelligence and age of the audience can influence an audience's attitude change with a persuasive communication. Independent variables include the source, message, medium and audience, with the dependent variable the effect of the persuasion.

In social psychology, the boomerang effect, also known as "reactance", refers to the unintended consequences of an attempt to persuade resulting in the adoption of an opposing position instead. It is sometimes also referred to as "the theory of psychological reactance", stating that attempts to restrict a person's freedom often produce an "anticonformity boomerang effect". In other words, the boomerang effect is a situation where people tend to pick the opposite of what something or someone is saying or doing because of how it is presented to them. Typically, the more aggressively a position is presented to someone, the more likely they are to adopt an opposing view.

Behavioural design

Behavioural design is a sub-category of design, which is concerned with how design can shape, or be used to influence human behaviour. All approaches of design for behaviour change acknowledge that artifacts have an important influence on human behaviour and/or behavioural decisions. They strongly draw on theories of behavioural change, including the division into personal, behavioural, and environmental characteristics as drivers for behaviour change. Areas in which design for behaviour change has been most commonly applied include health and wellbeing, sustainability, safety and social context, as well as crime prevention.

Mindfulness and technology is a movement in research and design, that encourages the user to become aware of the present moment, rather than losing oneself in a technological device. This field encompasses multidisciplinary participation between design, psychology, computer science, and religion. Mindfulness stems from Buddhist meditation practices and refers to the awareness that arises through paying attention on purpose in the present moment, and in a non-judgmental mindset. In the field of Human-Computer Interaction, research is being done on Techno-spirituality — the study of how technology can facilitate feelings of awe, wonder, transcendence, and mindfulness and on Slow design, which facilitates self-reflection. The excessive use of personal devices, such as smartphones and laptops, can lead to the deterioration of mental and physical health. This area focuses on redesigning and creating technology to improve the wellbeing of its users.

Social navigation is a form of social computing introduced by Paul Dourish and Matthew Chalmers in 1994, who defined it as when "movement from one item to another is provoked as an artifact of the activity of another or a group of others". According to later research in 2002, "social navigation exploits the knowledge and experience of peer users of information resources" to guide users in the information space, and that it is becoming more difficult to navigate and search efficiently with all the digital information available from the World Wide Web and other sources. Studying others' navigational trails and understanding their behavior can help improve one's own search strategy by guiding them to make more informed decisions based on the actions of others.

Animal–computer interaction (ACI) is a field of research for the design and use of technology with, for and by animals covering different kinds of animals from wildlife, zoo and domesticated animals in different roles. It emerged from, and was heavily influenced by, the discipline of Human–computer interaction (HCI). As the field expanded, it has become increasingly multi-disciplinary, incorporating techniques and research from disciplines such as artificial intelligence (AI), requirements engineering (RE), and veterinary science.

Feminist HCI is a subfield of human-computer interaction (HCI) that applies feminist theory, critical theory and philosophy to social topics in HCI, including scientific objectivity, ethical values, data collection, data interpretation, reflexivity, and unintended consequences of HCI software. The term was originally used in 2010 by Shaowen Bardzell, and although the concept and original publication are widely cited, as of 2020 Bardzell's proposed frameworks have been rarely used since.

A Behavioral Change Support System (BCSS) is any information and communications technology (ICT) tool, web platform, or gamified environment which targets behavioral changes in its end-users. BCSS are built upon persuasive systems design techniques.

Antonio Lieto is an Italian cognitive scientist and computer scientist at the University of Salerno and a research associate at the Institute of High Performance Computing of the Italian National Research Council focusing on cognitive architectures and computational models of cognition, commonsense reasoning and models of mental representation, and persuasive technologies. He teaches Artificial Intelligence and "Design and Evaluation of Cognitive Artificial Systems" at the Department of Computer Science of the University of Turin.

References

  1. Fogg 2003a, p. [ page needed ].
  2. Oinas-Kukkonen et al. 2008, p. [ page needed ].
  3. Bogost 2007, p. [ page needed ].
  4. Lockton, Harrison & Stanton 2010.
  5. Fogg 1998.
  6. Fogg 2003b.
  7. Fogg 2003c.
  8. Reeves & Nass 1996, p. [ page needed ].
  9. Turkle 1984, p. [ page needed ].
  10. Fogg 2003d.
  11. Fogg & Nass 1997b.
  12. Moon 2000.
  13. Oinas-Kukkonen & Harjumaa 2008.
  14. Licklider & Taylor 1968.
  15. Bailenson et al. 2004.
  16. Dimicco, Pandolfo & Bender 2004.
  17. Winograd 1986.
  18. Perfetti 2003.
  19. Yang et al. 2020.
  20. Spahn 2012.
  21. De Oliveira, Cherubini & Oliver 2010.
  22. Thieme et al. 2012.
  23. Caraban et al. 2015.
  24. Chiu et al. 2009.
  25. Halko & Kientz 2010.
  26. Wixom & Todd 2005.
  27. "APA Dictionary of Psychology".
  28. Greene, David; Lepper, Mark R. (1974). "Effects of Extrinsic Rewards on Children's Subsequent Intrinsic Interest". Child Development. 45 (4): 1141–1145. doi:10.2307/1128110.
  29. Lieto & Vernero 2013.
  30. Lieto & Vernero 2014.
  31. Gena et al. 2019.
  32. Augello et al. 2021.
  33. Matthews et al. 2021.
  34. Elton 2007.
  35. Cugelman, Thelwall & Dawes 2011.
  36. Middleton, Anton & Perri 2013.
  37. Joseph et al. 2016.
  38. Wemyss et al. 2019.
  39. Dhar & Putnam-Farr 2017.
  40. Chudzynski et al. 2015.
  41. Froehlich et al. 2009.
  42. Fischer 2008.
  43. Hamari, Koivisto & Pakkanen 2014.
  44. Bhushan, Steg & Albers 2018.
  45. Cellina, Vittucci Marzetti & Gui 2021.
  46. Ijsselsteijn et al. 2006.
  47. Stibe & Cugelman 2016.

Sources