
Designing Better Technologies with BE

Written by Kelly Peters | Oct 15, 2020 5:28:37 PM

Kelly Peters' keynote talk at BlockHack Global focused on the topic of consumer technology adoption. In many cases, developers and organizations build a solution anticipating droves of consumer interest, only to find that unanticipated barriers are curbing uptake rates.

In her talk, she provides an overview of BE for blockchain developers looking to overcome adoption challenges at the outset of the conceptual stage. She shares key behavioural insights that can be used to understand how people typically make decisions, and how developers can apply these lessons to overcome the most common pitfalls facing adoption today.

Watch the full recording below

 

Here are some excerpts from the talk: 

 


A small, simple change in our language can change our thinking. When we start off a sentence with “I think that...”, we state it as if it is fact. It carries baggage (our opinion, our ego) and it inhibits our ability to receive feedback. By contrast, when we start off a sentence with “it is my hypothesis that...”, it implies that we are still gathering evidence and that our minds can be changed based on new evidence. This framing allows people to go deeper in their thinking and encourages a data-seeking orientation.


Traditional economics is based on the assumption that we are rational and self-maximizing, and that people always have willpower. This view treats people as robots who would predictably act a particular way given a particular stimulus, but as we know from behavioural economics, people are not truly maximizers. Maybe early adopters of blockchain are more likely to be maximizing beings, but the general population is likely less so.

Knowing this, when designing products and services for blockchain, designers should hold on to blockchain's value propositions, such as decentralization, like a Hippocratic Oath, but also appreciate that most people are boundedly rational.


Behavioural Economics is not saying that we’re either rational or irrational, but rather that we’re boundedly rational: we can be limited by constraints, such as time and energy. Consequently, we use heuristics (i.e., mental shortcuts) to inform our decisions and shape our understanding of what is good or bad, easy or hard, trustworthy or untrustworthy.

Heuristics can lead us to bias. Take buying a car, for example. On one hand, we can set up a spreadsheet to objectively evaluate a car based on its specifications. This is called System 2 decision making, which is objective, deliberate decision making. On the other hand, we can act on our heart or gut, or use heuristics: we look for cues in our environment to guide our decision making, such as what is popular or what experts recommend. This is called System 1 decision making, which is heuristic-driven thinking.


Designers should consider the user experience and should have a segmentation model of their users. What assumptions are you making about your users? Are you assuming your users are relying on System 2 thinking, where they evaluate things and review all the information? If so, then you should provide the information users need, and perhaps even education.

If you assume your users are relying on System 1, then they are relying on heuristics when engaging with your platform. However, heuristics are not always rational, and they are not always the best guides for our decision-making.

For example, people may be prone to the frequency illusion. When you are looking to buy a car, you may “see” that car more often. This can lead to confirmation bias, confirming your preconceptions about the car's popularity, although in reality the car may not be as popular as you think. Alternatively, you may seek the expertise of a friend. However, there may be a halo effect, in which you view their opinion more positively, or they may have an undisclosed conflict of interest.

Thus, when designing, you need to keep in mind the type of thinking users are engaging in. If users are engaging in System 1 thinking, then they may be prone to biases. If users are engaging in System 2 thinking, then designs must account for users’ limited cognitive abilities.
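As a rough sketch of how this segmentation assumption might be made explicit in code (the segment names, fields, and onboarding flags below are illustrative, not something from the talk):

```typescript
// Minimal sketch of a user-segmentation assumption in an onboarding flow.
// Segment names and helper functions are illustrative, not from the talk.

type ThinkingStyle = "system1" | "system2";

interface OnboardingPlan {
  showDetailedDocs: boolean;   // full information and education for deliberate users
  useSimpleDefaults: boolean;  // pre-selected, safe options for heuristic users
  surfaceSocialProof: boolean; // cues like "most users choose X"
}

function planOnboarding(style: ThinkingStyle): OnboardingPlan {
  if (style === "system2") {
    // Deliberate users: give them the information they need to evaluate.
    return { showDetailedDocs: true, useSimpleDefaults: false, surfaceSocialProof: false };
  }
  // Heuristic users: reduce cognitive load and rely on carefully chosen cues.
  return { showDetailedDocs: false, useSimpleDefaults: true, surfaceSocialProof: true };
}

console.log(planOnboarding("system1"));
```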


For example, we can use biases to help people save more for retirement. With retirement savings plans, it makes sense for people to invest in their savings because of tax reductions, and some employers will even match contributions. But participation is low, at about 15-20%. How come?

We have to look into the human experience of thinking about retirement and savings. There is cognitive overload when enrolling in these programs: so many documents to go through and forms to fill out. The traditional approach is to educate workers through seminars. These seminars seem to be effective at changing workers’ beliefs about retirement savings; in one study, more than 80% of attendees said yes to enrolling. However, most did not follow through.

When it comes to retirement savings, people are prone to the status quo bias, where they are more fearful of an error of commission than an error of omission. More people are afraid of making a mistake through action than through inaction.

Thus, a proven strategy is to use defaults with the option to opt out, which helps people save while maintaining the autonomy of choice. As it turns out, the use of defaults resulted in over 90% of people staying enrolled. From a libertarian standpoint, this strategy is ethical, as people still have the free choice to opt out.
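As a rough illustration of a default-with-opt-out design (the field names, contribution rate, and enrollment flow here are invented for the example, not taken from the talk):

```typescript
// Illustrative opt-out default: new employees start enrolled at a modest
// contribution rate and can opt out or change the rate at any time.

interface SavingsPlanEnrollment {
  employeeId: string;
  enrolled: boolean;         // default is true (opt-out, not opt-in)
  contributionRate: number;  // fraction of salary, e.g. 0.03 = 3%
}

function createDefaultEnrollment(employeeId: string): SavingsPlanEnrollment {
  // The default does the helpful thing; autonomy is preserved via optOut().
  return { employeeId, enrolled: true, contributionRate: 0.03 };
}

function optOut(plan: SavingsPlanEnrollment): SavingsPlanEnrollment {
  // One explicit action is enough to leave; no forms or seminars required.
  return { ...plan, enrolled: false, contributionRate: 0 };
}

const plan = createDefaultEnrollment("emp-123");
console.log(plan.enrolled);          // true by default
console.log(optOut(plan).enrolled);  // false after a single opt-out action
```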


Another powerful tool that can lead to adoption is social proof. One example of social proof comes from Duncan Watts and colleagues. In their Music Lab study, they created an artificial music market. In this market, they allowed users to download and rate the music, which determined the music’s popularity.

Subsequently, they reversed the ratings of the songs, so that the song ranked #1 was now last, the song ranked #2 was now second-last, and so forth. Then the researchers exposed a new group of participants to this music market with the reversed list.

In this new social world, people were influenced by the ratings: the songs shown as popular were downloaded the most, and the songs shown as least popular were downloaded the least. However, over the long run, the songs that were popular in the original ranking regained popularity, while the songs that were least popular in the original ranking became less popular again. This study demonstrated the powerful influence social context can have on our decisions.


Behavioural Diagnostics 

In this phase, we try to understand and segment users based on their decision-making and behaviours. We can design products and services by segmenting users into rational thinkers or heuristic thinkers.

Choice Architecture

Leveraging biases, such as the status quo bias, and tools such as defaults, to nudge people to behave in a particular manner.

Experimentation

This is where we generate evidence to test our hypothesis.  
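As an illustrative sketch of what this phase might look like in practice, here is a toy A/B comparison using a two-proportion z-test; the conversion numbers and the 95% threshold are invented for the example:

```typescript
// Toy two-proportion z-test for an A/B experiment: does the new design
// (variant) convert better than the current one (control)?
// The data and the 1.96 cutoff (two-sided, 95% confidence) are illustrative.

interface Arm { conversions: number; visitors: number; }

function twoProportionZ(control: Arm, variant: Arm): number {
  const p1 = control.conversions / control.visitors;
  const p2 = variant.conversions / variant.visitors;
  const pooled =
    (control.conversions + variant.conversions) /
    (control.visitors + variant.visitors);
  const se = Math.sqrt(
    pooled * (1 - pooled) * (1 / control.visitors + 1 / variant.visitors)
  );
  return (p2 - p1) / se;
}

const control: Arm = { conversions: 150, visitors: 1000 };
const variant: Arm = { conversions: 190, visitors: 1000 };

const z = twoProportionZ(control, variant);
console.log(`z = ${z.toFixed(2)}; significant at 95%? ${Math.abs(z) > 1.96}`);
```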


The TAM (Technology Acceptance Model) framework is broken down into two main parts. The first is the utility of the technology, which is perceived usefulness. The second is the cost of use, which is perceived ease of use. These two factors influence attitudes toward using the technology, and users’ attitudes help determine adoption.
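As a toy illustration of the TAM structure (the 1-7 scales, weights, and threshold below are invented for the example; TAM defines the constructs and their relationships, not these numbers):

```typescript
// Toy illustration of the TAM structure: perceived usefulness and perceived
// ease of use feed into attitude, which feeds into adoption.
// Scales, weights, and threshold are invented for illustration only.

interface TamRatings {
  perceivedUsefulness: number; // 1 (useless) to 7 (very useful)
  perceivedEaseOfUse: number;  // 1 (very hard) to 7 (very easy)
}

function attitudeScore(r: TamRatings): number {
  // Kept to a simple weighted sum for the sketch; full TAM also models
  // ease of use contributing indirectly through usefulness.
  return 0.6 * r.perceivedUsefulness + 0.4 * r.perceivedEaseOfUse;
}

function likelyToAdopt(r: TamRatings, threshold = 4.5): boolean {
  return attitudeScore(r) >= threshold;
}

console.log(likelyToAdopt({ perceivedUsefulness: 6, perceivedEaseOfUse: 3 })); // true  (score 4.8)
console.log(likelyToAdopt({ perceivedUsefulness: 3, perceivedEaseOfUse: 5 })); // false (score 3.8)
```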

 

Perception of Risk  
If people view the technology as risky, they would be less likely to adopt it. We can counter high levels of perceived risk by leveraging self-efficacy. This can be done by facilitating a sense of mastery, using baby steps and feedback. For example, Dropbox’s introduction to its services breaks things down for users to help them “master” Dropbox.

Lack of Trust  

If people do not trust the technology, they would be less likely to adopt it. We can counter a lack of trust with operational transparency, where users are shown the hidden processes underlying a service or product. For example, a study using kayak.com found that when the search process was slowed down to show the effort involved, users’ perception of value went up.
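As a rough sketch of operational transparency in a search flow (the step messages, delays, and function names are invented for the example):

```typescript
// Sketch of operational transparency: surface the hidden work behind a search
// instead of returning results silently. Step names and delays are invented.

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

async function searchWithTransparency(query: string): Promise<string[]> {
  const steps = [
    "Checking 140 airlines...",
    "Comparing fares across dates...",
    "Filtering out sold-out itineraries...",
  ];
  for (const step of steps) {
    console.log(step); // in a real UI this would update a progress panel
    await sleep(400);  // brief pause so users can see the effort involved
  }
  // Placeholder results; a real implementation would call the search backend.
  return [`Result A for "${query}"`, `Result B for "${query}"`];
}

searchWithTransparency("YYZ to SFO").then((results) => console.log(results));
```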

Status Quo 

As mentioned, people like to stick with the status quo. One way to counter the status quo bias is with habit disruption: introduce something new at a moment when people’s habits are already changing.


Q (1:03:00): There are so many choices in blockchain. Is it on us as developers to weed out the many options?

A: When there are so many choices, there is a risk of exposing users to choice overload, which dissuades them from making a decision. 

Iyengar led a study about choice overload using jams. Note that this specific study did not replicate, but it is relevant to our discussion.

This study was set up in a grocery store, where customers could sample jams. In one setup there were 6 choices of jam, and in a separate setup there were 24 choices. It was found that significantly more people bought a jam when they were exposed to 6 jams as opposed to 24. So more people acted when there were fewer choices.

So how can this be applied? You can use clustering and a triage mechanism, in which people can choose based on the features of a product. For example, jams can be split into sugar and sugar-free, and so forth. It is the same thing in the investment world: with so many names, people either don’t choose at all or they choose investments based on names, which is not rational.
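As a minimal sketch of such a triage mechanism (the product fields and data below are invented for the example):

```typescript
// Minimal triage sketch: instead of showing 24 jams at once, narrow the list
// one feature at a time. The product fields and data are invented examples.

interface Jam {
  name: string;
  sugarFree: boolean;
  organic: boolean;
}

const jams: Jam[] = [
  { name: "Classic Strawberry", sugarFree: false, organic: false },
  { name: "Organic Raspberry", sugarFree: false, organic: true },
  { name: "Sugar-Free Apricot", sugarFree: true, organic: false },
  { name: "Sugar-Free Organic Blueberry", sugarFree: true, organic: true },
];

// Each triage question removes options instead of piling them all on at once.
function triage(items: Jam[], answers: Partial<Pick<Jam, "sugarFree" | "organic">>): Jam[] {
  return items.filter((jam) =>
    (answers.sugarFree === undefined || jam.sugarFree === answers.sugarFree) &&
    (answers.organic === undefined || jam.organic === answers.organic)
  );
}

// "Do you want sugar-free?" -> yes; "Organic?" -> not answered yet.
console.log(triage(jams, { sugarFree: true }).map((j) => j.name));
// [ 'Sugar-Free Apricot', 'Sugar-Free Organic Blueberry' ]
```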

 

Q (1:08:50): What role does FOMO (Fear Of Missing Out) play in influencing our decisions? It is not rational, but it is a very human trait.

A: This is a classic example of the scarcity effect. Salespeople use this tactic to drive sales by leveraging limited time. Scarcity is based on loss aversion: people typically are driven by not wanting to lose something, and that is a powerful motivator. There are psychological taxes that can influence behaviour, such as fear, guilt, regret, and shame. These behavioural influences should be used for good, ethical outcomes.

 

Q (1:11:00): Are there traps and pitfalls that you’ve noticed that designers fall into when designing Blockchain products and services? 

A: Behavioural Economics can be a powerful tool. My intuition is that consumers are often viewed as rational actors who will act according to information and incentives. However, we know people are boundedly rational, so designers need to leverage the tools from psychology to help drive better design. We need to use BE to build and scale in order to gain adoption.