Try a little thought experiment with me.
Take out a piece of paper. Call to mind the last two digits of your cell phone number. Write them down. Now think about the answer to this question: How many African member nations are in the United Nations? Write this answer too.
In the early years of this century, a professor at Harvard Business School regularly began her course in behavioral economics with this exercise, collecting the slips of paper from her students and tallying the results. Invariably, a student's estimate of the second figure would be biased by his or her knowledge of the first, even though the two are unrelated. Students with low-digit phone numbers would estimate a lower number of countries, and vice versa. In psychological terms, the students' estimates had been anchored.
This concept of anchoring is important in business intelligence. Everything we do is about information, and if two seemingly unrelated pieces of data can impact each other, shouldn’t we understand what’s happening?
Let me explain what’s going on.
The anecdote above is recounted by Michael Lewis in his most recent book, The Undoing Project. Readers (and moviegoers) will remember Lewis for his earlier work, including The Big Short, The Blind Side, and Moneyball.
In these books and others, Lewis illuminates current societal trends, big ideas, and sometimes just interesting quirks of human nature, by giving us an intimate look at the people behind them. Moneyball was as much an introduction to the managers, coaches, and players of the Oakland Athletics as it was an account of how baseball experts--any experts--can allow their judgment to be warped by their own minds.
In the same way, The Undoing Project is a look at the underlying causes of the flaws and biases at the root of human errors in judgment, and a portrait of the two Israeli scientists who tested and described them decades earlier.
For people in our line of work—that is, providing decision makers with the data they need to make judgments about an organization's direction—this topic should be of more than a little interest. Business intelligence has at its core the premise that data-driven decision making is superior to gut instinct.
The Undoing Project provides a great introduction to the studies showing why this is true, and some of the unexpected pitfalls we may run into. It is also a beautifully written story of collaboration and friendship.
Daniel Kahneman and Amos Tversky were an unlikely pair. Danny was a childhood fugitive from the Nazis, an introvert, by all accounts full of doubts. For the most part self-taught in the subject, he nonetheless became, at age 20, the Israel Defense Force's expert on psychological matters, designing a personality test composed of a simple series of factual questions.
The resulting "Kahneman score" was so predictive of a recruit's future success that it is, with very few modifications, still in use today. Amos Tversky was a native Israeli, his mother a member of the young country's first parliament. He was a paratrooper, a leader, a war hero, an extrovert, brash, brilliant, and argumentative.
Danny's first instinct, on the other hand, was not to argue, but to question and make sense. "When someone says something," he would say, "don't ask yourself if it is true. Ask what it might be true of."
Yet somehow these two became the closest of friends, collaborators whose ideas were so intertwined that when they published their first paper together, they simply flipped a coin to determine which one would be credited with lead authorship, alternating on every subsequent work.
Kahneman and Tversky's joint work on decision making and risk became a foundation of behavioral economics. It became the basis of the study of heuristics in human decision making, of how gut feelings can interfere with judgment. They studied and tested human biases (the endowment effect, confirmation bias, present bias, hindsight bias, recency bias, vividness bias, and others) not just to catalog them but to explore the human mind and the errors it can fall prey to.
As anyone in business intelligence knows, presenting accurate information so that it is understood as intended can be challenging.
Lewis explains why this happens: "Were they investigating the biases or the heuristics? The errors enabled you to offer at least a partial description of the mechanism: The bias was the footprint of the heuristic."
Advances in our understanding of how people behave in uncertain situations and with limited information are rooted in their original studies.
In the end, their work is not a dismissal of human instinct, but a caution. Lewis writes, "In their talks and writings, Danny and Amos explained repeatedly that the rules of thumb that the mind used to cope with uncertainty often worked well. But sometimes they didn't; and these specific failures were both interesting in and of themselves and revealing about the mind's inner workings."
Amos Tversky's advice: "Always keep one hand firmly on data." And this is where we in the BI world come in. People in business intelligence have been providing decision makers with data for decades, with the underlying media maturing from green-bar reports to KPIs to visualizations to dashboards and beyond.
I believe that giving people immediate access to the enterprise data they need, and making information as ubiquitous in their business lives as it is in their daily online lives, is the key to reducing unintended bias. People today would be far less likely to fall prey to the anchoring bias in our professor's exercise if getting answers to questions didn't feel like a game of telephone.
I recommend you read the book yourself; it contains a great deal that is relevant to the BI practitioner.
And by the way, there are 54 African member nations in the UN.