A Special Interest Group on Designed and Engineered Friction in Interaction

Sandy J.J. Gould, Lewis L. Chuang, Ioanna Iacovides, Diego Garaialde, Marta E. Cecchinato, Benjamin R. Cowan, Anna L. Cox

Introduction

Human-computer interactions are typically designed to be smooth and efficient. The implicit objective is to enhance performance, improve safety, and promote satisfaction in use. Few designers would intentionally create systems that induce frustration, or that are inefficient or even dangerous. Nonetheless, optimising usability can lead to automatic and thoughtless behaviour. In other words, over-optimising for performance and satisfaction can encourage behaviours that compromise individual users and their communities.

Frictions, changes to an interaction that make it more taxing in some way, are one potential solution to the risks of over-optimisation and over-proceduralisation. The content warnings placed on social media posts on platforms like Facebook and Twitter are an example of friction. These frictions have been added in response to particularly ‘risky’ scenarios where, for instance, widespread misinformation may significantly influence democratic processes. Twitter added friction to the process of ‘retweeting’ (i.e., relaying a message to other users) for certain messages: if a user tried to retweet a message containing a link without having opened the link, Twitter would produce an interstitial dialog asking whether they wanted to read the link before retweeting (Hutchinson 2020).
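
To make the mechanism concrete, a minimal sketch of this kind of interstitial logic is given below. It is illustrative only: the types, function names, and dialog wording are our assumptions, not Twitter’s actual implementation.

```typescript
// Illustrative sketch of a 'read before you retweet' friction.
// All names here are hypothetical; this is not Twitter's code.

interface Tweet {
  id: string;
  containsLink: boolean;
}

// Track which tweets' links the user has actually opened.
const openedLinks = new Set<string>();

function onLinkOpened(tweetId: string): void {
  openedLinks.add(tweetId);
}

function onRetweetClicked(tweet: Tweet): void {
  // Interpose only when the tweet has a link the user hasn't opened.
  if (tweet.containsLink && !openedLinks.has(tweet.id)) {
    showInterstitial(tweet);
  } else {
    retweet(tweet);
  }
}

function showInterstitial(tweet: Tweet): void {
  // The friction: an extra, deliberate step before the retweet goes out.
  const readFirst = window.confirm(
    'Want to read the article before retweeting?'
  );
  if (!readFirst) {
    retweet(tweet); // the user can still proceed; it is friction, not a block
  }
}

function retweet(tweet: Tweet): void {
  console.log(`Retweeted ${tweet.id}`);
}
```

Note that the friction does not prevent the retweet; it only inserts a decision point where automatic behaviour would otherwise carry the user straight through.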

In this short paper, we consider how different academic disciplines account for (and make use of) tensions between automatic and deliberate behaviour. We explore the limits of theoretical frameworks that can plausibly describe the mechanisms of designed frictions. Following this, we enumerate some effective designs for intentional frictions in human-computer interactions, identify abstract principles from their real-world use, and expand on how they could be generalised to drive innovation in designed frictions. Finally, we hope to address how current practices for evaluating usability can be modified to consider the potential costs of automatic behaviour, and how those costs could be mitigated with designed frictions.

Relevant Theoretical Frameworks

Human Factors

Rasmussen’s skill-rule-knowledge (SRK) framework (Rasmussen 1983) is well suited to considering how behavioural automaticity varies as a function of a user’s expertise. When a new task is encountered for which existing procedures do not exist, “the control of performance must move to a higher conceptual level” (Rasmussen 1983, 259). Skilled performance does not have this quality, rolling “along without conscious attention or control” (Rasmussen 1983, 259). This transition from knowledge-based to rule-based to skill-based behaviour is critical for effective performance in many tasks. But it also risks a kind of mindless interaction that causes people to act in ways that do not fit their larger goals and preferences.

People’s ability to habituate and adapt, turning knowledge-based interactions into skill-based ones, is part of the motivation for developing frictions, but these same abilities are what make developing frictions particularly challenging. Dialogs on GUI systems are a kind of friction that, say, let users know of security issues when they try to visit a website. But people become habituated to these kinds of messages (Garaialde et al. 2020), reducing their effectiveness (Egelman, Cranor, and Hong 2008). People also adapt their behaviour to frictions: Gould et al. (2016) found that a task lockout (a form of friction that encourages people to check before proceeding with a task) needs to be carefully calibrated in length. Too short, and it is almost imperceptible. Too long, and people start switching to other activities, defeating the purpose of the friction. Determining the ‘right’ amount of friction for an interaction is highly contextually contingent, but it may still be possible to develop heuristics based on the effort the friction is designed to elicit and the cognitive workload of the primary task.
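
A rough sketch of a lockout of this kind, assuming a simple web form, follows; the duration constant and element names are illustrative, not a recommendation from Gould et al. (2016).

```typescript
// Illustrative sketch of a task lockout friction: after the user
// finishes entering input, the submit control is disabled briefly so
// there is room to check before proceeding. Names and timing are ours.

const LOCKOUT_MS = 2500; // too short is imperceptible; too long invites task switching

function applyLockout(submitButton: HTMLButtonElement): void {
  submitButton.disabled = true;
  submitButton.textContent = 'Check your input…';

  // Re-enable the control once the lockout elapses.
  setTimeout(() => {
    submitButton.disabled = false;
    submitButton.textContent = 'Submit';
  }, LOCKOUT_MS);
}

// Usage: trigger the lockout when input is complete.
const button = document.querySelector<HTMLButtonElement>('#submit');
if (button) {
  applyLockout(button);
}
```

Calibration then reduces to choosing LOCKOUT_MS for the context; the finding from Gould et al. (2016) suggests the usable values sit between the two failure modes noted in the comment above.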

Sociology

End users are the most immediately affected by the introduction of frictions, for better or worse: frictions may irritate people, but they may also save them from costly errors. But there are other stakeholders whose goals mean that many of our interactions with technology are steered toward a kind of habituated instinct. Dark patterns (Gray et al. 2018; Bösch et al. 2016) rely on these kinds of semi-automatic behaviours. Friction might provide a way to break people out of automatic responses, but to some groups these responses are valuable, either directly or indirectly. Some of the contextualising questions about frictions and their role in interaction therefore require us to consider the wider political economy of interaction, and the consequences of continually seeking to minimise friction in interactions.

Psychology

Psychology has developed many tasks that contrast involuntary and voluntary behaviour. Dual-process accounts of cognition (Kahneman 2011), like related formulations such as the ‘want/should self’ (Bitterly et al. 2014) or model-free and model-based systems (Gläscher et al. 2010), hold that decision making is driven by a fast, automatic process (System 1) and a deliberative process (System 2). System 1 is a fast decision-making system that drives the execution of repeated, habituated decisions with little deliberation; it is highly impulse-oriented and requires few cognitive resources (Botvinick and Weinstein 2014). System 2 is a slower, more deliberative process that allows for planning and intentionality (Kahneman 2011; Evans and Stanovich 2013). Because it demands significant resources, System 2 tends to be used sparingly (O’Hara and Payne 1998). Dual-process accounts have been used to explain concepts such as impulse control (Hofmann, Friese, and Strack 2009). Moving people to a more deliberative mode of thinking, to get them to consider what they are doing, requires getting them to switch out of the automatic, fast processes they normally use and to engage in more System 2 based decision making. The critical questions for the design of frictions are how designing for this switch from System 1 to System 2 impacts interaction, the contexts in which it would be useful, and what tools can be used to cause such a switch.

Design

The idea of slowing interactions down to influence how people experience them has been used frequently in design work. Slow design (Grosse-Hering et al. 2013; Strauss and Fuad-Luke 2008), slow technology, and designing for slowness (Cheng et al. 2011) all recognise that increasing the speed and reducing the effort of interactions can deny people the mental time and space for reflection. Generally, these design approaches represent an entire orientation to an interaction, rather than a specific, friction-creating stage within one.

Other design approaches, like pleasurable troublemakers (Hassenzahl and Laschke 2015) and uncomfortable interactions (Benford et al. 2012), aim to create interactions that run counter to the principles of speed and effortlessness in interaction design. Again, the goal is to produce interactions that are out of the ordinary, that elicit reflection or simply novel experiences.

Reflection is a critical aspect of several of these design approaches, and it is here that there is the greatest overlap between design accounts and cognitive accounts of interaction. Hassenzahl and Laschke’s (2015) work tries to mesh these accounts. Both traditions stand to benefit from mutual awareness, and one of the goals of the SIG is to identify more links between different research traditions; there is certainly a phenomenon of mutual interest here.

Examples of Designed Frictions

There are many examples where automatic and reflexive behaviour is explicitly discouraged, and design frictions are frequently referenced by industrial user-experience professionals (Kollin 2018; Babich 2017). There is no clear definition (or definitions) of what frictions are, though, or whether a proposed friction needs to be effective to qualify as one. In this section we describe some examples that we think have the qualities of frictions: additional steps added to an interaction that are intended to slow things down and give room for deliberative thinking.

Wang et al. (2014) found that adding a countdown-timer friction between users of Facebook clicking ‘post’ and a post actually being published avoided accidental posts, but left participants split over its usefulness: some liked it, others found it irritating. Users could force a message to be sent before the countdown completed, but this still added an extra step, an extra friction, to the task.
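
The essential mechanism is a delayed commit with an optional override. A minimal sketch, with timing and names that are our assumptions rather than Wang et al.’s implementation, might look like this:

```typescript
// Illustrative sketch of a countdown friction for posting.
// The delay and function names are hypothetical.

const COUNTDOWN_MS = 10_000;

function schedulePost(
  message: string,
  publish: (msg: string) => void
): { cancel: () => void; postNow: () => void } {
  // The friction: hold the post for a delay before publishing,
  // leaving the user room to reconsider.
  const timer = setTimeout(() => publish(message), COUNTDOWN_MS);

  return {
    // The user can withdraw the post during the countdown...
    cancel: () => clearTimeout(timer),
    // ...or deliberately override the friction and post immediately.
    postNow: () => {
      clearTimeout(timer);
      publish(message);
    },
  };
}

// Usage:
const pending = schedulePost('Hello, world!', (msg) => console.log(msg));
// During the countdown: pending.cancel() or pending.postNow().
```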

Twitter, understanding the potential for its social network to be a conduit for misinformation, put in place several frictions specifically for the 2020 US presidential election¹. For messages the platform considers misleading, a warning is placed over the tweet content, and users are required to click through the warning to see the information. This extra step adds friction to the interaction, making misinformation more effortful to share and less likely to be shared mindlessly.

Pop-up dialogs that check that users really want to quit before exiting, or that inform them of a security issue with a website they’re trying to access (Egelman, Cranor, and Hong 2008), act as frictions, introducing an extra stage to a task that is intended to get a user to stop and think. Do they work? Unfortunately, people become habituated to these kinds of messages, to the point that clicking through them becomes a proceduralised aspect of the task and they cease to have an impact (Amran, Zaaba, and Singh 2018). Understanding cognition and context is critical to understanding what makes frictions effective.

Open Questions

There are a number of open questions about the use of frictions. One of the goals of the SIG is to determine which are most pressing. As we see it, the most important questions about frictions are:

  • What kinds of interactional contexts are frictions most suited to?
  • What are the most effective ways to get people to switch to a slower, more deliberative way of thinking?
  • How quickly do people become habituated to frictions, and how do we manage and/or mitigate the effects of friction habituation?
  • Should we be focusing on changing people’s behaviour instead of steering them with frictions?
  • How do we calibrate frictions so that they give people space to think, but are not excessively frustrating or detrimental to the user experience?

Need for a SIG

Many people in the HCI community are thinking about the deleterious effects of mindless interactions with technology (Cox et al. 2016), whether for individuals, larger groups, or the environment. As HCI research and methods have substantially enhanced the capability to build faster, less effortful interactions, the community also has a responsibility to understand, and where it makes sense to ameliorate, the negative effects (or potential negative effects) of those interactions.

Some researchers are using behaviour change methods to try to change people’s automatic processes in scenarios where permanent change is needed. But this may be an unnecessarily or impractically complex approach for quickly getting people to deliberate on a particular stage of an interaction before they proceed (Pinder et al. 2018). Researchers and designers have identified that frictions might provide the room required for these deliberations. However, what an effective friction looks like, why it is effective, and the kinds of contexts that frictions lend themselves to are not well understood. We propose a special interest group to stimulate discussion about the most pressing priorities for new knowledge generation in this space.

The special interest group is designed to be of general interest to CHI attendees, but should be of particular interest to those interested in cognition, design, and their confluence. Frictions necessarily draw on both domains: an understanding from cognition of how attention works and how it can be co-opted, but also a more design-oriented feeling for how to introduce frictions into interactions in a way that makes them feel like an authentic addition.

References

Amran, Ammar, Zarul Fitri Zaaba, and Manmeet Kaur Mahinderjit Singh. 2018. “Habituation Effects in Computer Security Warning.” Information Security Journal: A Global Perspective 27 (4): 192–204. https://doi.org/10.1080/19393555.2018.1505008.

Babich, Nick. 2017. “When Friction In Design Is Good For UX.” http://babich.biz/friction-in-design/.

Benford, Steve, Chris Greenhalgh, Gabriella Giannachi, Brendan Walker, Joe Marshall, and Tom Rodden. 2012. “Uncomfortable Interactions.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2005–14. CHI ’12. New York, NY, USA: ACM. https://doi.org/10.1145/2207676.2208347.

Bitterly, T. Bradford, Robert Mislavsky, Hengchen Dai, and Katherine L. Milkman. 2014. “Dueling with Desire: A Synthesis of Past Research on Want/Should Conflict.” SSRN Scholarly Paper ID 2403021. Rochester, NY: Social Science Research Network. https://doi.org/10.2139/ssrn.2403021.

Botvinick, Matthew, and Ari Weinstein. 2014. “Model-Based Hierarchical Reinforcement Learning and Human Action Control.” Philosophical Transactions of the Royal Society B: Biological Sciences 369 (1655): 20130480. https://doi.org/10.1098/rstb.2013.0480.

Bösch, Christoph, Benjamin Erb, Frank Kargl, Henning Kopp, and Stefan Pfattheicher. 2016. “Tales from the Dark Side: Privacy Dark Strategies and Privacy Dark Patterns.” Proceedings on Privacy Enhancing Technologies 2016 (4): 237–54.

Cheng, Justin, Akshay Bapat, Gregory Thomas, Kevin Tse, Nikhil Nawathe, Jeremy Crockett, and Gilly Leshed. 2011. “GoSlow: Designing for Slowness, Reflection and Solitude.” In CHI ’11 Extended Abstracts on Human Factors in Computing Systems, 429–38. CHI EA ’11. New York, NY, USA: ACM. https://doi.org/10.1145/1979742.1979622.

Cox, Anna L., Sandy J. J. Gould, Marta E. Cecchinato, Ioanna Iacovides, and Ian Renfree. 2016. “Design Frictions for Mindful Interactions: The Case for Microboundaries.” In Proceedings of the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, 1389–97. New York, NY, USA: Association for Computing Machinery. https://doi.org/10.1145/2851581.2892410.

Egelman, Serge, Lorrie Faith Cranor, and Jason Hong. 2008. “You’ve Been Warned: An Empirical Study of the Effectiveness of Web Browser Phishing Warnings.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1065–74. CHI ’08. New York, NY, USA: ACM. https://doi.org/10.1145/1357054.1357219.

Evans, Jonathan St. B. T., and Keith E. Stanovich. 2013. “Dual-Process Theories of Higher Cognition: Advancing the Debate.” Perspectives on Psychological Science 8 (3): 223–41. https://doi.org/10.1177/1745691612460685.

Garaialde, Diego, Christopher P. Bowers, Charlie Pinder, Priyal Shah, Shashwat Parashar, Leigh Clark, and Benjamin R. Cowan. 2020. “Quantifying the Impact of Making and Breaking Interface Habits.” International Journal of Human-Computer Studies 142: 102461. https://doi.org/10.1016/j.ijhcs.2020.102461.

Gläscher, Jan, Nathaniel Daw, Peter Dayan, and John P. O’Doherty. 2010. “States Versus Rewards: Dissociable Neural Prediction Error Signals Underlying Model-Based and Model-Free Reinforcement Learning.” Neuron 66 (4): 585–95. https://doi.org/10.1016/j.neuron.2010.04.016.

Gould, Sandy J. J., Anna L. Cox, Duncan P. Brumby, and Alice Wickersham. 2016. “Now Check Your Input: Brief Task Lockouts Encourage Checking, Longer Lockouts Encourage Task Switching.” In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, 3311–23. CHI ’16. New York, NY, USA: ACM. https://doi.org/10.1145/2858036.2858067.

Gray, Colin M., Yubo Kou, Bryan Battles, Joseph Hoggatt, and Austin L. Toombs. 2018. “The Dark (Patterns) Side of UX Design.” In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 534:1–14. CHI ’18. New York, NY, USA: ACM. https://doi.org/10.1145/3173574.3174108.

Grosse-Hering, Barbara, Jon Mason, Dzmitry Aliakseyeu, Conny Bakker, and Pieter Desmet. 2013. “Slow Design for Meaningful Interactions.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 3431–40. CHI ’13. New York, NY, USA: ACM. https://doi.org/10.1145/2470654.2466472.

Hassenzahl, Marc, and Matthias Laschke. 2015. “Pleasurable Troublemakers.” In The Gameful World. Approaches, Issues, Application, edited by Steffen P. Walz and Sebastian Deterding, 167–96. Cambridge, MA, USA: MIT Press.

Hofmann, Wilhelm, Malte Friese, and Fritz Strack. 2009. “Impulse and Self-Control From a Dual-Systems Perspective.” Perspectives on Psychological Science 4 (2): 162–76. https://doi.org/10.1111/j.1745-6924.2009.01116.x.

Hutchinson, Andrew. 2020. “Twitter Tests Making ’Quote Tweet’ the Default Retweeting Option.” Social Media Today. https://www.socialmediatoday.com/news/twitter-tests-making-quote-tweet-the-default-retweeting-option/586621/.

Kahneman, Daniel. 2011. Thinking, Fast and Slow. New York, NY, US: Farrar, Straus and Giroux.

Kollin, Zoltan. 2018. “Designing Friction For A Better User Experience.” Smashing Magazine. https://www.smashingmagazine.com/2018/01/friction-ux-design-tool/.

O’Hara, Kenton P., and Stephen J. Payne. 1998. “The Effects of Operator Implementation Cost on Planfulness of Problem Solving and Learning.” Cognitive Psychology 35 (1): 34–70. https://doi.org/10.1006/cogp.1997.0676.

Pinder, Charlie, Jo Vermeulen, Benjamin R. Cowan, and Russell Beale. 2018. “Digital Behaviour Change Interventions to Break and Form Habits.” ACM Transactions on Computer-Human Interaction 25 (3): 15:1–66. https://doi.org/10.1145/3196830.

Rasmussen, J. 1983. “Skills, Rules, and Knowledge; Signals, Signs, and Symbols, and Other Distinctions in Human Performance Models.” IEEE Transactions on Systems, Man, and Cybernetics SMC-13 (3): 257–66. https://doi.org/10.1109/TSMC.1983.6313160.

Strauss, Carolyn, and Alastair Fuad-Luke. 2008. “The Slow Design Principles.” In Proceedings of Changing the Change.

Wang, Yang, Pedro Giovanni Leon, Alessandro Acquisti, Lorrie Faith Cranor, Alain Forget, and Norman Sadeh. 2014. “A Field Trial of Privacy Nudges for Facebook.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2367–76. CHI ’14. New York, NY, USA: ACM. https://doi.org/10.1145/2556288.2557413.

Zoltan Kollin. 2018. “Designing Friction For A Better User Experience.” Smashing Magazine. https://www.smashingmagazine.com/2018/01/friction-ux-design-tool/.


  1. https://blog.twitter.com/en_us/topics/company/2020/2020-election-changes.html