As designers, we are constantly refining our design process and trying new and different methodologies to improve our work. This involves research, analyzing findings, developing various solutions, listening to feedback, and iterating upon our designs.
Over the years there have been many discussions about how to improve each step of our process and the methods associated with it, but designers often forget the importance of staying neutral and patient throughout the entire process.
Biases often find their way into our design decisions. Even the best designers may sometimes find themselves jumping to conclusions, taking shortcuts, or letting their personal experiences influence their work. Staying on course is even more difficult when working as part of a team, because everyone on the team has their own views, which might not align with everyone else's. While multiple points of view enrich the design process, they also increase the chances of biases creeping into the decisions the team makes over the course of the project.
In our pursuit to be great designers, we need to work toward avoiding such biases and understanding the importance of remaining neutral when we’re evaluating someone else’s opinions. Remember that acknowledging a problem is the first step toward fixing it.
- Confirmation bias
Confirmation bias stems from our preexisting beliefs and theories. Our minds tend to look for evidence that supports existing beliefs while ignoring evidence that contradicts them. Once we form an opinion, we embrace the information that supports it and tend to ignore, or not properly weigh, the information that goes against it. This process is motivated by wishful thinking.
For example, if you are excited about buying a product and read a negative review about it, you may be more inclined to somewhat disregard the reviewer’s issues with the product or rationalize them in a way that puts the reviewer at fault.
Often during the design process, designers get stuck on beliefs formed during initial research, which can steer a project in a completely different direction than it should be headed. The more time and effort you invest in something, the harder it becomes to go back and reconsider the decisions you've made. We hold our beliefs close to our hearts and begin to ignore any facts against them, which actually strengthens them. This is known as the backfire effect.
The backfire effect is actually a subtype of confirmation bias. It describes how we tend to reject evidence that challenges our beliefs and, in the process, strengthen our support for our initial stance. This means that showing people evidence which disproves their opinions is often ineffective and can actually cause them to hold those opinions even more strongly.
People experience the backfire effect because of what happens in our brains when we are presented with information that conflicts with our preexisting beliefs.
Essentially, when information suggests that what we currently believe is wrong, we tend to feel threatened. This generates negative emotions and leads us to desperately protect our existing beliefs. It is especially likely to occur when the belief we're protecting is particularly important to us.
Here are some tips for avoiding confirmation bias:
- Question your opinions at every step of the way and discuss them with others.
- Surround yourself with a diverse group of people, pay attention to their opinions, and weigh each one equally.
- Actively direct your attention instead of letting it be directed by your existing beliefs.
- Framing bias
Framing represents the context in which choices are presented to us. Framing bias occurs when someone makes a decision based on the way their choices were framed instead of on their own merits.
For example, subjects in a study were asked to decide whether they would undergo surgery; some were told that the survival rate was 90%, while others were told that the mortality rate was 10%. Framing the choice in terms of the survival rate increased acceptance among participants, even though both groups were presented with exactly the same situation.
It's important to remember that framing can affect both users and designers. One of the most important problems faced by designers is narrow framing.
In the finance industry, investors are said to suffer from narrow framing when they make investment decisions without considering the context of their entire portfolio. This bias isn't limited to finance; designers can fall prey to it as well.
Here’s another example:
- 80% of users successfully found the bookmark button
- 20% of users were unable to find the bookmark button
The first framing casts the situation in a positive light, while the second makes it look like there is an issue that needs to be fixed soon.
Here are a few ways to avoid framing bias:
- Always try to frame things in at least two different ways to see whether they represent the situation the same way. This will ensure that you are not the one creating the bias. Do the same when receiving and evaluating information.
- Ask for more information in order to avoid ambiguity. The more you know about the context, the better your decisions will be.
- Avoid jumping to conclusions. Try not to rush decisions and invest time and thought into understanding the situation.
- False consensus bias
False consensus bias occurs when we assume that other people share the opinions and beliefs that we hold. Because our own beliefs are so accessible to us, we tend to extend them to others.
For example, if you are in favor of women having the right to abort pregnancies and opposed to capital punishment, you are likely to think that most other people also share your beliefs.
The people we surround ourselves with tend to share most of our beliefs and opinions; otherwise, we probably wouldn't surround ourselves with them. Because of this, however, we end up living in a bubble that deceives us into thinking that the majority of people are like us and those around us. We are also more likely to notice when others share our beliefs than when they don't, again leading us to overestimate just how common these beliefs are.
This occurs more often than we think when building a new product or implementing a new feature with a team. We often need to make decisions based on a limited amount of information and this involves making certain assumptions which may very well be wrong.
Here are a few ways of ensuring that your assumptions are valid and not just based on your own beliefs:
- List every finding or assumption that was made with limited or no research.
- Walk your colleagues through your research and don’t leave out any details when explaining everything to them. Encourage them to be critical about your findings and always try to articulate how you reached your conclusions.
- Conduct more research to validate or invalidate your assumptions with a larger sample of your target audience.
- Availability heuristic
The availability heuristic refers to how we sometimes unknowingly give more importance to whatever information we can recall first or most easily. We tend to overestimate the impact of events which have happened recently or are frequently talked about.
In the context of design, we often become more sensitive to certain user pain points that we have heard about most recently and end up prioritizing them over others which may be more pressing. This issue worsens when there are several stakeholders each giving their own input.
Here are some tips for avoiding the availability heuristic:
- Conduct research into multiple sets of users or look at usage analytics to uncover user pain points instead of just listening to the opinions of stakeholders.
- Gathering insights from research can be tricky, since the availability heuristic can influence us here as well. For example, if the analysis isn't thorough enough, we may give more weight to the most recent or most memorable interviews. Design tools such as personas, customer journey maps, and empathy maps can help us approach research data from a broader point of view.
The availability heuristic can also help us improve the user experience. If you remind users of a problem they are facing, they will consider it a problem that is worth solving. Here are a few things you can try:
- Focus your landing page’s design and copy on the problems that your product can solve, instead of what it does. “Get faster dinner delivery” is better than “online fast food delivery.”
- Give your users positive feedback when they solve problems and remind them what those problems were. “Congratulations, your inbox is empty!” is better than “Congratulations!”
- Remove any irrelevant information from your design. The more irrelevant information you include, the more diluted the relevant information will be.
- Curse of knowledge bias
This cognitive bias occurs when we unknowingly assume that others have the same information and knowledge that we do, and can therefore understand a topic or situation as well as we can.
If you’ve ever joined a new project and found it difficult to keep up with what everyone’s saying, it might not be your fault. The other people involved may think that their knowledge of the topic is obvious and that you should already be able to understand everything, but that may not be the case. This is commonly referred to as the curse of knowledge.
For example, if you’ve been working on a system for a long time, you’ve likely become very familiar with its terminology. You begin using that terminology in your design work as if it were self-explanatory, when in reality others don’t really understand it. Similar situations can also arise during knowledge transfers.
Here are a few tips for avoiding this bias:
- When you’ve been working with a system for a long time, you tend to become biased, so it’s always a good idea to take a step back and think about things from a different perspective. Don’t assume your audience always knows what you’re talking about if they are unfamiliar with the system.
- Get a fresh perspective from someone who is unfamiliar with the system. Show them your wireframes and prototypes before finalizing your design.
- Leverage insights from newcomers on the team and actively ask for feedback from first-time users of your product. Feedback from existing users may also help you uncover gaps in the product.
- Research your target audience and find out how familiar they are with the jargon used in your industry, company, and product.
- Bias blind spot
The bias blind spot refers to recognizing biases in other people’s judgment while being unable to see them in your own. It also describes our tendency to overlook the impact of our own biases on the decisions we make. You might not think it, but this is actually fairly common: in a 2015 study of 661 adults, only one participant said they consider themselves more biased than the average person.
The term was coined by Emily Pronin, a social psychologist at Princeton, together with her colleagues Daniel Lin and Lee Ross. The bias blind spot appears to be a genuine blind spot, in that it is not related to our actual ability to make decisions: regardless of their decision-making ability, people tend to believe that they are less biased than others.
Learning more about cognitive biases comes with the responsibility of working toward being honest with ourselves about our decisions, actions, and behaviors, and assessing whether they are biased or not.
Once we know about the various mental traps we can fall into, we can take measures to protect ourselves and our work from them.