COGNITIVE BIAS IN SOFTWARE TESTING

It is an acknowledged fact that we perceive errors in the work of others more readily than in our own.

– Leonardo da Vinci

Software testing has evolved over time from merely identifying and reporting bugs to preventing them. Testing early in the software development life cycle builds quality into software products and minimizes the risk of failure in the market. As developers, testers, analysts, project managers, software specialists, and even senior managers, we can all certainly agree on that.

We also have plenty of high-quality, smart testing tools, development and testing processes, rules, and methodologies to identify defects and fix bugs. Yet many defects are still “missed” and only found after a product goes live or is deployed to production. Two questions remain: why or how did we miss a particular defect, and whose fault is it?

Smart teams will review their processes, recalibrate their testing tools, find the gaps, perhaps provide the necessary training to their testing and development teams, and take every possible action to avoid missing the bug in the next deployment. That is the easy part. Smarter team members will also look deeper within themselves, which is the difficult part. An introspective approach, however, is particularly effective for improving software testing, because it gives test professionals clues about the individual issues they are up against, one of which is the topic of this article: cognitive biases.

Wikipedia defines cognitive biases as “systematic patterns of deviation from norm or rationality in judgment.” Cognitive biases affect how we think, feel, communicate, interact with friends, buy, sell, use products, and so on. Often, we are unaware of our own cognitive biases and how they shape our lives. If you disagree with that last sentence, that is your “bias blind spot” at work: a cognitive bias that keeps you from recognizing the impact of biases on your own judgment while you readily spot it in others.

The software testing process is affected, mostly negatively, by a team's or an individual's biases. Every team member, even the most experienced, has biases that they cannot control and are sometimes not even aware of. Needless to say, an ideal work environment is free of bias. The problem is that this is simply unachievable because of human nature: bias is a natural tendency to view the world through a lens colored by past experience. Modifying our own biases is very difficult. What is easier is to acknowledge the biases present in the team in order to reduce their negative impact on product quality.

Software testing is both an art and a science: a profession that consists primarily of making choices, assessments, evaluations, and conclusions about software features. Software testing is also subject to cognitive bias throughout the phases of test creation, execution, and consumption of results. There are many types of cognitive bias that can affect software testing processes and testing and QA teams. Below I summarize four of them that we have encountered during our consultancy work:

1.   Confirmation Bias

Definition: The tendency to search for, interpret, focus on and remember information in a way that confirms one’s preconceptions.

Confirmation bias skews the selection of test cases toward those that testers predict will execute according to their expectations. A good way to counter this urge is to keep reviewing all assumptions and explore ways to test “outside the box,” for example with negative testing, as sketched below.
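As an illustration, here is a minimal sketch in Python with pytest. The validate_age function and its rules are hypothetical; the point is only to show expectation-confirming cases sitting next to negative cases that try to break the same assumptions.

    import pytest

    def validate_age(value):
        # Hypothetical function under test: accepts integer ages from 0 to 130.
        if not isinstance(value, int) or isinstance(value, bool):
            raise TypeError("age must be an integer")
        if value < 0 or value > 130:
            raise ValueError("age out of range")
        return value

    # Expectation-confirming cases: inputs we predict will pass.
    @pytest.mark.parametrize("age", [0, 18, 65, 130])
    def test_valid_ages_are_accepted(age):
        assert validate_age(age) == age

    # Negative cases: inputs chosen to challenge our assumptions.
    @pytest.mark.parametrize("age", [-1, 131, 10**9])
    def test_out_of_range_ages_are_rejected(age):
        with pytest.raises(ValueError):
            validate_age(age)

    @pytest.mark.parametrize("age", ["18", 18.5, None, True])
    def test_non_integer_ages_are_rejected(age):
        with pytest.raises(TypeError):
            validate_age(age)

A suite written under confirmation bias tends to contain only the first group; the second and third groups are the ones that usually find something.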

2.   Bandwagon Effect

Definition: The tendency to do (or believe) things because many other people do (or believe) the same. Related to herd behavior and groupthink.

When a certain number of people believe something, the probability that the next person will believe it too increases automatically. This happens many times in our day-to-day lives. A common example is the way we buy products: rather than selecting a product independently, we usually go along with the beliefs of others. (That is how we as humans created “vogue,” or “fashion.”)

Exactly the same behavior shows up in the testing world. If a team member feels that a particular module is defect-free, we unknowingly tend to believe the same, especially if we have a long history of working together as peers.

3.   Congruence Bias

Definition: The tendency to test hypotheses exclusively through direct testing, instead of testing possible alternative hypotheses.

This is similar to confirmation bias: testers plan and execute tests only against the written expected behavior. It can cause problems, especially on major releases that contain new features and/or require user training, and it mostly occurs when not all negative or alternative flows are spelled out in the requirements document. A sketch of testing beyond the documented behavior follows below.
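As an illustration, here is a minimal sketch in Python with pytest. The apply_coupon function and its discount rule are hypothetical; the first test is the direct hypothesis taken straight from the imagined requirement, while the remaining tests probe alternative hypotheses the requirement never mentions.

    def apply_coupon(price, coupon):
        # Hypothetical function under test: the code "SAVE10" gives 10% off, anything else gives none.
        if coupon == "SAVE10":
            return round(price * 0.9, 2)
        return price

    # Direct hypothesis, straight from the requirement: a valid coupon gives 10% off.
    def test_valid_coupon_gives_discount():
        assert apply_coupon(100.0, "SAVE10") == 90.0

    # Alternative hypotheses the requirement does not spell out.
    def test_unknown_coupon_gives_no_discount():
        assert apply_coupon(100.0, "SAVE99") == 100.0

    def test_empty_coupon_gives_no_discount():
        assert apply_coupon(100.0, "") == 100.0

    def test_lowercase_coupon_is_not_silently_accepted():
        # Is the code case sensitive? The requirement is silent; this test forces the question.
        assert apply_coupon(100.0, "save10") == 100.0

Only the first test confirms the documented behavior; the others ask what else might be true, which is exactly what congruence bias makes us skip.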

4.   Inattentional Blindness

Definition: Inattentional blindness, also known as perceptual blindness, is a psychological lack of attention that is not associated with any vision defects or deficits. It may be further defined as the event in which an individual fails to perceive an unexpected stimulus that is in plain sight.

This is not exactly a bias but rather a lack of attention: the tendency of us test professionals to miss the most obvious defects when we are not looking for them.

To relate this to the testing world: in an enhancement project, for example, where one of the screens is newly developed, testers naturally tend to focus on the new screen and miss critical integrations or changes on the other screens. Regression tests can help with this particular issue, as sketched below.
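As an illustration, here is a minimal regression sketch in Python with pytest. The screen names and the registry of stub checks are hypothetical; the idea is that the suite keeps exercising every existing screen and the integration points, even while everyone's attention is on the newly added one.

    import pytest

    # Hypothetical screen registry: "gift_wrap" is the newly developed screen that
    # attracts all the attention, but the suite still covers the older screens.
    SCREENS = {
        "login":     lambda: "ok",
        "catalog":   lambda: "ok",
        "checkout":  lambda: "ok",
        "gift_wrap": lambda: "ok",  # new screen in this release
    }

    @pytest.mark.parametrize("screen", sorted(SCREENS))
    def test_screen_smoke(screen):
        # In a real project this would drive the UI or call each screen's API;
        # here every entry is just a stub that returns "ok".
        assert SCREENS[screen]() == "ok"

    def test_checkout_still_works_with_gift_wrap_present():
        # Integration between an old screen and the new one: the kind of spot
        # where inattentional blindness tends to hide defects.
        assert SCREENS["checkout"]() == "ok"
        assert SCREENS["gift_wrap"]() == "ok"

Running the whole suite on every release, not only the tests for the new screen, is what catches the changes nobody was looking for.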

Conclusion:

According to scientists, researchers, and psychologists, there are several dozen biases that affect our daily lives and that exist to make decision making easier for us. Many of these biases affect belief formation, business and economic decisions, and human behavior in general. As testers and QA team members, we have no easy way to overcome them. After all, we are human beings, not robots. What we can do is know and acknowledge these biases, stay open-minded about the outcomes of irrational behavior, and be willing to make changes as conditions require.


Further reading:

[1]: Yalansavar website

http://yalansavar.org/

[2]: 18 Cognitive Biases You Can Use for Conversion Optimization. Shanelle Mullin, October 15, 2015.

http://conversionxl.com/blog/cognitive-biases-in-cro/

[3]: List of cognitive biases, Wikipedia.

http://en.wikipedia.org/wiki/List_of_cognitive_biases

[4]: Fuqun Huang (June 21st, 2017). Human Error Analysis in Software Engineering, Theory and Application on Cognitive Factors and Risk Management, Fabio De Felice and Antonella Petrillo, IntechOpen, DOI: 10.5772/intechopen.68392. Available from:

http://www.intechopen.com/books/theory-and-application-on-cognitive-factors-and-risk-management-new-trends-and-procedures/human-error-analysis-in-software-engineering

[5]: Oswald, Margit E.; Grosjean, Stefan (2004). “Confirmation Bias”. In Pohl, Rüdiger F. Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory. Hove, UK: Psychology Press. pp. 79–96. ISBN 978-1-84169-351-4. OCLC 55124398.