
An Overview of Choice Architecture and Persuasive Technology


Do you ever get frustrated when you go to the grocery store and discover that the product you’ve always bought is no longer stocked? When the shelf space has been given over to some variation, and you have to go to another store to purchase the item you actually use?

You have just experienced retail choice architecture. Retailers and manufacturers optimize shelf space and item placement to maximize revenue, given the mix of buyers at a particular store.

We navigate these changes in the physical world all the time as empowered consumers. When we have agency and access to resources, we can drive to another location or shop online and pay a premium to get exactly the product we want — shipped to our doorstep. In these instances, our high level of agency overcomes the constrained choices imposed on us.

Persuasive Information Systems and Digital Choice Architecture

But what if these choices are designed for you through digital technologies? What if your choices are constrained based on someone else’s idea about what’s good for you or what you should buy? What if choices are uniquely tailored to you, based on AI models and your online history?

This is the essence of a relatively new area of IT research described as “persuasive information systems.” At HICSS 2021, Winikoff et al. presented a paper evaluating the effectiveness of the Microsoft MyAnalytics platform as an example of a persuasive information system: “The Advent of Digital Productivity Assistants: The Case of Microsoft MyAnalytics.” Several points in this paper can help us understand how persuasive technologies may show up in the workplace of the future. For example:

This class of [persuasive information] systems can be seen as having strong contemporary salience due to the increasing governmental interest in changing attitudes and behaviours. However, unlike traditional policy tools, which use mandates, bans and incentives, nudges in persuasive information systems alter choice architecture to make it easy to accept default options. (339)

I was introduced to this concept over ten years ago during a project in healthcare. We were developing behavioral nudges to support changes in adherence patterns for patients. The idea was to make the right choice the easy choice for patients who had suffered a cardiac event by presenting the healthiest options for diet and exercise in the best light through a series of gently applied interaction patterns. When this work is performed ethically and altruistically, managing or limiting choices for the end-user can be justified, but it can also lead to consumer frustration about missing options.

So, What Is Choice Architecture?

Broadly, choice architecture involves designing how (and what) choices are presented to end-users in order to effect a specific outcome. It’s why a lot of kids’ cereals are shelved closer to the ground, why certain products are prominently featured at the entrance to a store, and why cigarettes come with warning labels (and graphic images in many countries outside the United States). Most of us are also familiar with choice architecture online: the layout of every product page you see when you shop is heavily shaped by it.

If you’re purchasing shoes, for example, you may only see options that exactly match what you searched for (even though a more highly rated, similar shoe may exist for you to buy on the site). The default color may be chosen for you — and it may also happen to be the one that the manufacturer has a surplus of. More expensive shoes or shoes that are more profitable for the manufacturer may be displayed prominently as “related products.”

All the choices you have to make to purchase a product are engineered to achieve some specific goal. That’s why choice architecture is sometimes called “choice engineering,” “choice management,” “choice limitation,” “choice constraint,” or other similar terms.
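
To make this concrete, here is a minimal, hypothetical sketch in TypeScript of where those choice-shaping decisions can live in code. The types, fields, and sorting rules are invented for illustration and do not come from any real storefront.

```typescript
// A hypothetical product catalog, reduced to the fields that matter
// for the choice-architecture decisions described above.
interface ShoeVariant {
  color: string;
  inventory: number; // units on hand
  margin: number;    // profit per unit, in dollars
}

interface Shoe {
  name: string;
  rating: number;    // average customer rating, 0-5
  variants: ShoeVariant[];
}

// Choice architecture decision #1: which variant is pre-selected.
// Here the "default" color is simply the one with the largest surplus,
// not necessarily the one the shopper is most likely to want.
function defaultVariant(shoe: Shoe): ShoeVariant {
  return [...shoe.variants].sort((a, b) => b.inventory - a.inventory)[0];
}

// Helper: the most profitable variant of a shoe.
function bestMargin(shoe: Shoe): number {
  return Math.max(...shoe.variants.map((v) => v.margin));
}

// Choice architecture decision #2: how "related products" are ordered.
// Sorting by margin rather than by rating quietly privileges the
// retailer's goal over the shopper's.
function relatedProducts(candidates: Shoe[]): Shoe[] {
  return [...candidates].sort((a, b) => bestMargin(b) - bestMargin(a));
}
```

Changing the sort key from margin to rating is a one-line edit, which is exactly why decisions like these deserve an explicit, transparent conversation rather than a quiet default.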

What Are the Differences Between Choice Architecture, Behavioral Nudges and Persuasive Technology?

Many of the terms related to choice architecture are used interchangeably, but subtle differences exist between these concepts.

Choice Architecture

Choice architecture is the wider framework for any system, physical or digital, that shapes people’s choices, and it exists at nearly every scale. Laws are a choice architecture for society, shaping the individual behavior of most people. The number of fields an organization requires in a form is choice architecture. Even the weather can act as a form of choice architecture, because it directly affects how people make decisions about their behavior.

Whenever people are making a choice, choice architecture underlies how, and how many, options have been presented to them.

Importantly, choice architecture exists whether or not the designers of those choices have explicitly and transparently defined that architecture.

Behavioral Nudges

Nudges are neutral interventions or systems that guide people in a particular direction. A notification on your phone is a nudge: it provides information that you can choose to act on or not. Having a default option checked on a web form is a nudge: you can choose to keep it selected or deselect it. When a navigation app provides directions as you drive, those instructions are a nudge: you can follow them or take a different route, especially if you know the area or encounter an obstacle, such as construction.

If an intervention or system includes incentives or disincentives for the people making choices, it is not a nudge. If a choice is made for you, that’s not a nudge either. The key to a nudge is that it is neutral and leaves users with full agency.
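
To see where that line sits, here is a minimal, hypothetical sketch in TypeScript of the pre-checked opt-in mentioned above. The form, fields, and function names are invented for illustration; the point is that the nudge lives entirely in a default the user can freely reverse.

```typescript
// A hypothetical signup form with a pre-checked newsletter opt-in.
interface SignupFormState {
  email: string;
  newsletterOptIn: boolean;
}

// The nudge is only the initial state: a default, not a mandate.
function initialFormState(): SignupFormState {
  return { email: "", newsletterOptIn: true };
}

// The user can toggle the default at any time; nothing penalizes them for it.
function toggleNewsletter(state: SignupFormState): SignupFormState {
  return { ...state, newsletterOptIn: !state.newsletterOptIn };
}

// Submission does not depend on the opt-in value, so the choice stays neutral.
function submit(state: SignupFormState): void {
  console.log(`Signing up ${state.email}, newsletter: ${state.newsletterOptIn}`);
}
```

The moment submission is blocked, penalized, or rewarded based on that checkbox, the intervention is no longer a nudge; it has become an incentive, a disincentive, or a constraint.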

Persuasive Technology

Persuasive technology primarily refers to digital technologies that are designed to persuade and influence users. Persuasive technologies, or “persuasive information systems,” include websites, apps, video games, computers, smartphones, operating systems, voice assistants, smartwatches and nearly every other piece of tech that we interact with on a daily basis.

In short, a framework of choice architecture guides how persuasive technologies engage with users. The tools of persuasive technology and persuasive information systems can include nudges, incentives, disincentives, defaults, descriptive language, graphics, color and a host of other design elements and interventions.

The Ethics of Persuasive Technology and Choice Architecture

Any time we are attempting to influence the opinions or actions of others, whether through technology or otherwise, ethics should, at the very least, be part of the conversation. And that conversation should be transparent and explicit rather than siloed or implied.

To facilitate conversations about the ethics involved in choice architecture and persuasive technology, consider the following questions:

What general and industry-specific regulations will factor into designing the choice architecture? For example, healthcare organizations need to abide by HIPAA, businesses serving European consumers need to abide by GDPR, and so on.

Beyond formal regulations, if your profession, organization, or team has defined mission statements or core values, those principles should also shape the choice architecture. Examples range from the Hippocratic Oath to Patagonia’s core values: build the best product, cause no unnecessary harm, use business to protect nature, and don’t be bound by convention.

It’s also worth remembering that, traditionally, the term “persuasive technology” only covers systems that leverage nudges, persuasion, and social influence to affect people’s ideas and actions, as opposed to coercion.

What individuals and groups will be affected by the design of this choice architecture?

Often, decision-makers focus solely on the end-user and the organization creating a persuasive information system. But the effects of a particular choice can reach much further. Featuring specific products (rather than, say, product categories) on a web page necessarily advantages some teams within an organization, or some vendors of those products, and disadvantages others. (It may also affect contractual agreements with those parties.)

Individual end-users don’t exist in a vacuum, and their choices can also affect their families and communities.

What alternative strategies could we leverage to facilitate choice?

Many teams get stuck reusing the interaction design patterns that have worked for them in the past. Or, worse, they simply use whatever methods their team is capable of implementing. Without breadth and depth of experience with alternative approaches (and the capability to invent new ones), teams struggle to evaluate the ethics of a particular choice architecture decision, let alone align their approach with all the ethical considerations they have identified.

Are we creating needless suffering?

This is not as intuitive a question as many expect. Making forms longer than necessary can discourage people from completing them; if those forms are meant to help those people receive some kind of benefit, what does that indicate about the choice architecture design? If too many choices are provided, people become overloaded and end up abandoning the process or, worse, choosing random options that aren’t aligned with their best interests. On the other hand, if too few choices are provided, people can feel limited and constrained. They lose trust in the system or they actively seek to frustrate it with their choices.

Of course, in addition to the simple experience of choosing, the outcomes based on any choices or choice architecture can also lead to unintended (but rarely unforeseeable) negative effects.

Vervint: Leveraging Human-Centered Design to Create Ethical Persuasive Technology

At Vervint, we use human-centered design to help organizations build persuasive technology ethically and effectively. If you want help designing, developing, and fully leveraging persuasive technology, all you need to do is start a conversation. We look forward to hearing from you!

About the Author


Jim VanderMey

Chief Innovation Officer, Vervint

Jim VanderMey is the Chief Innovation Officer for Vervint. He has provided technical leadership and product strategy for the organization since the very beginning, and he is a technology visionary who sets the long- and short-term direction for Vervint. As the company has gained an international reputation, Jim has taught and spoken at conferences on a wide variety of topics across Europe, Japan, and North America.