This is a vitally important article that will probably take you a few days to work through, a major section at a time. Don’t rush it, and do the exercises, including watching the videos and doing the additional reading we recommend. This topic is one of the most important cornerstones of improving your critical thinking skills.
Our brains are the most powerful computers on the planet (if you’re reading this in the archives after the year 2045, please don’t snigger). To work well, our brains have developed many different shortcuts for making decisions. Although we manage some of these shortcuts consciously, most happen without our knowledge at a subconscious level.
There’s nothing wrong with this – in fact, if our brains didn’t do this they wouldn’t work for us at all. We get way too many inputs and there are way too many things for us to consider when making decisions, so our brains have to summarise, select and shortcut in order to get to any sort of coherent thought or outcome. This normally serves us well.
The problem comes when these shortcuts lead us to make mistakes, errors or bad decisions. We call these cognitive biases.
Example 1: When you are ready, click to open this textbox, look at the list of characters for 5 seconds, try to memorise as many of them as you can, and then click to close the textbox again.
Answer: Now close the textbox or look away from the screen and try to repeat as many of the characters as you can. When you’ve done that, click to open the textbox below to see an explanation.
I suspect that the most commonly remembered characters would be the “$”, the “9” and the “DOGS” at the end. The “$” and “9” stand out because they’re different, and your mind gives them more attention. The letters “DOGS” stand out because they form a word you already know, so your brain only needs one placeholder for the pattern rather than four individual letters. Some people might have remembered “RT” because it was repeated. Our brains tend to focus on and remember unusual information and patterns better than random or unremarkable information. We can end up giving more attention and value to things that stand out, and this can influence decision making.
Example 2: Are words in English more likely to start with the letter “k”, or to have “k” as their third letter?
A: They are twice as likely to have “k” as the third letter. But our minds find it easier to think of words that start with “k”, so we default to that as our answer.
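You can check this sort of claim against any word list. Here is a minimal Python sketch; the sample list below is purely illustrative, and to see the real ratio you would point it at a full dictionary file (for example /usr/share/dict/words on many Unix systems):

```python
def compare_k_positions(words):
    """Count words starting with 'k' versus words with 'k' as third letter."""
    starts = sum(1 for w in words if w.lower().startswith("k"))
    third = sum(1 for w in words if len(w) >= 3 and w[2].lower() == "k")
    return starts, third

# A tiny illustrative sample; run this over a real word list
# to test the "twice as likely" claim for yourself.
sample = ["kitchen", "keep", "ask", "awkward", "bike", "lake", "take", "acknowledge"]
print(compare_k_positions(sample))  # (2, 6)
```

Even in this toy sample, “k”-third words outnumber “k”-first words, but the words beginning with “k” are the ones that spring to mind.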
Example 3: Here is a multiplication quiz. You need to do it in your head, and take less than 5 seconds to complete it. Unless you’re a mathematics genius, you’re going to have to guess and provide just an estimate. Are you ready? Go…
Multiply these numbers together:
9 X 8 X 7 X 6 X 5 X 4 X 3 X 2 X 1
Ok, what’s your best guess at the answer?
- Is it about 230,000?
- Or 200,000?
- Or about 170,000?
- Or something else?
Click here for the answer and explanation to example 3:
A: What was your answer? Would it surprise you to discover that the real answer is actually 362,880?
This is an example of anchoring. Unless you’re a mathematics whizz kid, the fact that I suggested some answers for you “anchored” your most likely response. Because I anchored you lower than the real number and then reduced it even further, you probably selected a number much lower than the actual 362,880. If you were close to the real number, you might have second-guessed yourself and written down a number lower than your actual estimate. If I had anchored you at about half a million, you’d have guessed too high.
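The exact answer is simply 9 factorial (9!), which a couple of lines of Python confirm:

```python
import math

# Multiply 9 x 8 x ... x 1 step by step.
product = 1
for n in range(1, 10):
    product *= n

print(product)                       # 362880
print(product == math.factorial(9))  # True
```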
How did you do with these simple tests? And what did you think of how your brain can be manipulated and fooled?
You might realise immediately that marketers and salespeople are pretty good at using these techniques to influence you. We’ll talk about that more below. But this topic is much more important than just a few sales techniques.
In our increasingly complex, ambiguous and fast-paced world, unfortunately our brains are proving to be a little bit outdated. We’re in need of an upgrade. The easiest upgrade available is to identify, recognise and accept our cognitive biases, and then develop skills and techniques to overcome them. Our brains can be trained to do this.
What goes wrong
While we’d like to imagine that our beliefs are rational, logical and objective, the fact is that our ideas are often based on incomplete information, faulty memories or incorrect analysis of the information that is available.
Cognitive bias errors can be classified as either belief perseverance (our inability to accept new information that contradicts existing beliefs) or processing errors (failure to manage and organize information properly, either through errors of judgement or lack of capacity to compute and analyse data). Note that we’re not talking about logical fallacies – these stem from an error in a logical argument.
Cognitive bias is rooted in thought processing errors, where the logic might be correct, but the selection or weighting of data leads one person to one outcome and another to something completely different.
Cognitive biases are not necessarily all bad. In fact, they’re necessary: they allow us to reach decisions quickly, or even reach decisions at all. Since attention is a limited resource, we have to be selective about what we pay attention to, and this allows biases to creep in and influence the way we see and think about the world. If we had to think about every possible option when making a decision, it would take too much time to make even the simplest choices. Shortcuts are necessary. These mental shortcuts are technically known as heuristics. While they can often be surprisingly accurate, they can also lead to errors in thinking – and they often do. In addition, cultural prejudices, social pressures, individual motivators, emotions and limits on our mind’s ability to process information can also contribute to these biases.
Watch this 6 minute video to alert yourself to the impact cognitive biases can have in the workplace:
If you’d like more information at this point, here are two videos of about 15 minutes each that introduce the topic further. The first has a psychology-based focus, and the second comes from the world of investing and finance.
You should also add Daniel Kahneman’s excellent book, “Thinking, Fast and Slow”, to your reading list. Along with his colleague Amos Tversky, Kahneman identified that our brains have two distinct ‘operating systems’: System 1 is fast, gullible and jumps to conclusions. System 2 is slow, laborious, painstaking and is “in charge of doubting and unbelieving” but “sometimes busy and often lazy.” It’s this mix of System 1’s speed and System 2’s “laziness” that causes cognitive bias.
Kahneman gives a frightening real-world example: in an analysis of over 1,000 decisions made by Israeli parole board judges, it was discovered that the judges freed prisoners in up to 65% of cases straight after meals and snacks, but the rate fell close to zero just before breaks. The last prisoner evaluated before a refreshment break was never released. How crazy is that? Being tired or hungry reduces our capacity to think, so we revert to easy options and defaults. And change people’s lives because of it.
Someone’s life might not depend on the next decision you make, but it’s still important to recognise, understand and deal with cognitive biases when we can.
Part 2: Common Cognitive Biases, and how to deal with them
There are literally hundreds of cognitive biases (see the Wikipedia list here and the RationalWiki list here). In this lesson, we will deal with just a few of the most important categories. For each one, I’ll give a definition, some examples, implications and some suggestions for dealing with it.
Confirmation bias is the tendency for us to look for information that agrees with what we already believe and to give more credit to evidence that supports those beliefs. Changing our viewpoints is hard cognitive work, and our brains tend to avoid doing it whenever possible. This is an often unconscious act of referencing only those perspectives that support and confirm our pre-existing views, while at the same time ignoring or dismissing opinions and information that threaten our views.
A 4 minute video to watch: Confirmation bias and the primacy effect as it relates to interacting with other people
Example: Imagine that a person holds a belief that red-headed people are more creative than anyone else. Whenever this person encounters someone who is both red-headed and creative, they place greater importance on this “evidence” that supports what they already believe. They might even perceive a red-headed person they meet as more creative than they really are – so powerful are confirmation biases in our minds.
The confirmation bias is related to something called our reticular activating system. For example, if you are considering buying a particular make of new car you suddenly see people driving that model of car everywhere. If you or your partner become pregnant, you’ll suddenly feel that the world is filled with toddlers, babies and pregnant women. Confirmation bias is seeing the world through a filter of what interests your brain right now, and what you believe about the world.
Confirmation biases do not merely impact how we gather information, but also how we interpret and recall information.
In many cases, people on two sides of an issue can listen to the same story, and each will walk away with a different interpretation that they feel validates their existing point of view – the confirmation bias is working to “bias” their opinions.
Confirmation bias also underpins a very powerful marketing technique: marketers raise the profile of a few key characteristics of a product and then ensure you experience those benefits as early as possible in your use of the product. Once you’ve decided it’s a good product, you continually look for reasons to confirm this belief, and loyalty is much more likely.
Another application of an understanding of confirmation bias is how we motivate ourselves when learning something. I am busy teaching my youngest daughter to ride a bicycle. If I said to her, as I gave her a helpful shove, “Don’t fall off”, I would have put that thought in her head, and her whole body would keep telling her that she is most likely to fall off. The slightest wobble of the bike would reinforce the thought, and she’d fall off within a few seconds. It is much better to provide positive and affirming inputs such as “Keep your balance” or “Stay upright”.
So, really, it’s true – the motivational gurus have it right: visualisations and positive self talk are important. There is a reason that one of the most common characteristics of successful people is that they are optimistic. They believe they’re going to succeed and their whole system looks for confirmation of this success – all the time.
How to avoid bad confirmation bias:
- Develop a good understanding of the scientific method. The key to this approach is going through the process of attempting to falsify your discovery. Scientists don’t stop when they’ve shown something to be true: they then try to disprove it, actively looking for contradictory evidence. We should do the same.
- Be receptive and open to criticism. This is easier said than done, but the best first step is to ask whoever is giving you criticism to be specific, rational and attempt to remove emotion. Just asking a critic to clarify their criticism can reduce the emotion and help you process their critique.
- Challenge your own thoughts.
- Read more widely. Especially seek out articles, websites and news sources that you don’t agree with.
- When you disagree with someone, attempt to see the world from their perspective.
- Ask for opinions from outside, and collect information from diverse backgrounds. Having diversity on your team is a critical resource.
- Have a devil’s advocate in your group.
- Work in a diverse group, and actively involve other people in decision making.
The anchoring, recency and availability biases
When trying to make a decision, we tend to be overly influenced by the first piece of information we hear, focusing too much on a single piece of information rather than all the available information. This often happens with the first piece of information you received (anchoring bias), but can also happen with the most recent information you received (recency bias), or the most emotionally charged information in your memory (availability bias).
These biases have to do with what we remember and bring to mind, and are obviously influenced by many things, including culture, beliefs, expectations, emotions and feelings, as well as our frequency of exposure to information. In today’s world, this is fuelled by media and social media.
Example: Are more people killed by donkeys or sharks each year? Think about your answer, and then click this textbox to find out more.
When rare events occur they become very visible as they receive heavy coverage in the media. This means that later we are more likely to recall this rare event, and our minds will suggest that it is not, in fact, rare. We too easily assume that our recollections and memories are representative and true, and we discount events that are outside of our immediate memory.
But it can be even worse than just a memory problem. Researchers have found that having participants generate a completely random number can influence their answers to unrelated questions. Participants were asked to guess how many countries there are in Africa. But before answering the question, they spun a wheel with the numbers 1–100 on it, generating a random number. People who spun a low number gave a lower answer to the question than those who spun a higher number. The correlation was significant – a random number directly influenced the answer people gave to the question.
Example (taken from https://www.fs.blog/2011/08/mental-model-availability-bias/):
“Many life decisions are affected by the vividness of information. Although most people recognize that AIDS is a devastating disease, many individuals ignore clear data about how to avoid contracting AIDS. In the fall of 1991, however, sexual behavior in Dallas was dramatically affected by one vivid piece of data that may or may not have been true. In a chilling interview, a Dallas woman calling herself C.J. claimed she had AIDS and was trying to spread the disease out of revenge against the man who had infected her. After this vivid interview made the local news, attendance at Dallas AIDS seminars increased dramatically. Although C.J.’s possible actions were a legitimate cause for concern, it is clear that most of the health risks related to AIDS are not a result of one woman’s actions. There are many more important reasons to be concerned about AIDS. However, C.J.’s vivid report had a more substantial effect on many people’s behavior than the mountains of data available.”
In our work environments, one of the most important problems with these biases arises when conducting performance appraisals. Vivid instances of an employee’s behaviour (either positive or negative) are most easily recalled, appear more numerous than commonplace incidents, and are therefore weighted more heavily in the appraisal. Add to this that the most recent engagements with an employee will further colour the appraisal, and you can see why some people argue that we should get rid of any staff assessments based on memory.
Example: A 5 minute video on anchoring. (You’ll recognise a similar multiplication exercise to the one we used in Part 1 of this article.)
This bias is used very often in sales negotiations. It’s easiest to see in some restaurant menus: they feature very expensive meals while also including more (apparently) reasonably priced ones. This is done with wine as well. The higher the price of the most expensive meal or wine on the menu, the higher the average spend is. The higher anchor drags all the other numbers higher. This anchoring bias is also why, when given a choice, we tend to pick the middle option — not too expensive, and not too cheap. Sellers and agents will intentionally quote exorbitant prices for their products. They do not intend to sell at those prices, but they set the price anchor at a high level so they can lower it a little later during negotiation.
Example: In 1984, Kahneman demonstrated the retrievability bias with a study that asked participants to estimate the frequency of seven-letter words with the letter “n” in the sixth position. Participants estimated such words to be less common than seven-letter words ending in the more memorable “ing”. That estimate is logically impossible: every seven-letter word ending in “ing” also has an “n” in the sixth position, so the first category must be at least as large. The “ing” words are simply easier to recall.
How to avoid the anchoring and recency biases and availability heuristic:
- Work from data as far as possible – research and study everything, and be evidence-based.
- Improve your understanding of statistics, especially of probability.
- Be skeptical of yourself.
- Slow down and reflect.
- Specifically try and recall instances of the issue that aren’t so memorable, and actively search out disproving examples (this helps counter confirmation bias as you’re doing this).
- Ask more questions, and better questions. Ask harder questions.
- A great question is: “Walk me through a scenario where this issue could be bad for you.”
- Work in a diverse group, and actively involve other people in decision making.
The framing bias is when we process the same information differently depending on how it is presented and received. Does it matter whether your doctor tells you that you have a 15% chance of dying from a disease, or an 85% chance of surviving it? It actually does make a difference.
Lawyers are extremely good at using the framing effect to phrase questions designed to lead to specific answers. The classic example of this is the great question: “Have you stopped beating your wife yet?” There isn’t a way to answer this question without implying your guilt in beating your wife. You have to nullify the question, not answer it.
Consider this example, which researchers have replicated over and over again:
Example: You work for the Center for Disease Control and there is an outbreak of a deadly disease in a town of 600 people. All 600 people in the town are expected to die if you do nothing. Let’s say you have come up with two different programs designed to fight the disease:
With Program 1: 200 people in the town will definitely be saved.
With Program 2: There is a 1/3rd probability that all 600 people will be saved, and a 2/3rds probability that no people will be saved.
In the study, 72 per cent of the subjects picked Program 1.
Now consider the same scenario worded differently:
With Program 3: 400 people in the town will definitely die.
With Program 4: There is a 1/3rd probability that nobody will die, and a 2/3rds probability that 600 people will die.
Now which do you pick?
In the study, 78 per cent of the subjects picked Program 4.
But here’s the problem: the net result of the second set of choices is exactly the same as the first set. Programs 1 and 3 mean the same thing, and Programs 2 and 4 mean the same thing. Depending on how the question is asked, the answers differ completely. The framing bias has changed the answers dramatically, simply based on how the issue has been presented.
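In expected-value terms, the four programs are identical, which a few lines of Python make explicit (exact fractions are used to avoid floating-point rounding; the numbers come straight from the scenario above):

```python
from fractions import Fraction

TOWN = 600
p = Fraction(1, 3)  # probability of the good outcome in Programs 2 and 4

# Program 1: 200 people definitely saved.
saved_1 = 200
# Program 2: 1/3 chance all 600 saved, 2/3 chance none saved.
saved_2 = p * TOWN + (1 - p) * 0
# Program 3: 400 people definitely die, so 200 are saved.
saved_3 = TOWN - 400
# Program 4: 1/3 chance nobody dies, 2/3 chance all 600 die.
deaths_4 = p * 0 + (1 - p) * TOWN
saved_4 = TOWN - deaths_4

# All four programs have the same expected outcome: 200 saved, 400 lost.
print(saved_1, saved_2, saved_3, saved_4)  # 200 200 200 200
```

Only the wording changes between the two versions of the question, never the arithmetic.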
Recognising the framing bias for what it is can be frightening, because you realise that we can be so easily manipulated simply based on how a question is asked. Politics is filled with the framing bias.
How to avoid the framing and priming biases:
- When you know that someone has a reason to distort your opinion (such as political messages, lawyers, etc), be very aware of the danger of framing.
- When you hear statistics or numbers being used to bolster an argument, always look for potential framing issues. Ask how the numbers could be presented differently, and whether this makes a difference.
- Especially keep your eyes open for negative numbers that can be reframed positively – do this reframing and see if it changes your perception (e.g. your doctor says you have a 20% chance of not surviving cancer, versus if your doctor had said you have an 80% chance of surviving it).
- Ask better questions – ask open ended questions, and don’t imply the answer in the way you ask the question.
- Refuse to answer “either/or” type questions.
- Always interrogate the assumption implicit in how a question is asked.
- Work in a diverse group, and actively involve other people in decision making.
Additional Reading on Cognitive Laziness: The Framing Effect Bias: Improving Decision Making Skills for Cognitive Misers
Part 3: The bandwagon effect or conformity bias
This cognitive bias refers to the tendency we have to adopt the same beliefs as the people around us, or to assume that other people are making the right decision and that we should go along with them. This cognitive bias is closely related to a psychological phenomenon known as herd mentality. We experience the bandwagon effect or conformity bias when we place much greater value on decisions that are likely to conform to current trends or will please individuals within our existing (or desired) peer group.
Researchers have looked at this often, devising many different types of exercises to prove the point. My favourite example is described by Dan Ariely, author of “Predictably Irrational” – it’s a version of the famous Asch Experiment (see below).
Five students are seated at a table and are asked questions by an examiner. They answer the questions in order, from student 1 to student 5. What student 5 doesn’t know is that everyone else is part of the research team – they are the only one being tested. The examiner asks a simple question, like “What is 4 times 12?”. Each of the first four students gives the same wrong answer, let’s say “44”. The final student, the real subject of the test, almost always goes along with the same answer as everyone else, even though the surprise on their face makes it obvious they expected a different one. They typically assume they misheard the question.
Example: When a crowd of people is queuing to get through a limited number of entrances, sometimes a long line will form at one entrance while the one next to it is completely free. Each new person shows up and just assumes that the second entrance is broken, closed or not working. If no one decides to test this assumption, then the line will get longer and longer for no good reason.
Though we’re often unconscious of it, we love to go with the flow of the crowd. When lots of people start to pick a winner or a favourite, our individual brains start to shut down and we enter into a kind of “groupthink” space. It doesn’t actually have to be a large crowd – it can include small groups, like a family or even a small group of co-workers. The bandwagon effect is what often causes behaviours, social norms and memes to propagate among groups of individuals.
The bandwagon effect is something we see every day. It’s an effect that enhances popularity by virtue of already-existing popularity. Marketers take advantage of this by using social proof. The best example is on Amazon.com, where you are told all the time about ratings, “people who bought what you’ve just bought also bought these other items”, and post-purchase affirmations that you’re part of a group of people who made a good decision to buy.
As with all these biases, it’s important to realise that the bandwagon bias is not always incorrect. In fact, quite often, the group is correct in its decisions. The danger is those few times when it’s not. If you find yourself feeling good because everyone around you is thinking just like you, you should consider that a big sign of a potential problem. It may well be that people in your team are self-censoring for fear of exclusion or retribution.
How to avoid the bandwagon effect:
- When you have unanimity and consensus, be suspicious. Ask questions, and invite contradictory or questioning opinions from the group.
- You often need to remove the loudest voices or the people with the most authority from the room in order to help others be free to express opposing opinions.
- Hire people who share your values, but not your views. Don’t be scared to include people who disagree with you in your team.
- Do a pre-mortem. Have a meeting where you assume your decision was a disaster a year from now, and describe how this disaster unfolded. See details of this technique explained by Daniel Kahneman in this 3 minute video. (Here’s a more detailed overview of the workshop structure if you need it – click here).
- Having a diverse group is critical to ensuring we don’t fall into a herd mentality. Watch this video if you didn’t watch it above, and listen especially to the variations of the Asch Experiment where the subject was given a partner or there was diversity in the answers the group gave. Having different views expressed in your group is the best way to ensure you don’t fall into the bandwagon effect.
The authority bias
This bias is the tendency to attribute greater weight and accuracy to the opinion of an authority figure.
Example: For many years, advertisers would put people selling toothpaste (and other household products) in white lab coats. The perceived authority of the doctor or lab technician was being used to influence our acceptance of the product.
Example: Probably the most famous authority bias experiment of all time was conducted by Stanley Milgram, a Psychology Professor at Yale University in 1961-62.
Prof. Milgram placed an ad in the paper offering $4 for volunteers to participate in a memory and learning study. When people showed up, Milgram performed an experiment that had nothing to do with memory and learning.
Under the watchful gaze of Milgram, the volunteer – labeled “the teacher” — read out a string of words to his partner, labeled “the learner.” The learner was hooked up to an electric shock machine in another room. If the learner made a mistake while repeating the words, the teacher would deliver an electric shock of increasing intensity. The starting shock was 15 volts and the maximum shock – a lethal dose – was 450 volts. Some people refused to do it and stopped the experiment early. Other people continued with Milgram urging them that “the experiment must go on.”
Amazingly, in one version of the experiment, 26 of the original 40 participants proceeded to the maximum shock level – long after the learner had pleaded for mercy, warned the teacher of an apparent heart condition, or appeared to be either unconscious or dead after a “fatal” shock. All it took was encouragement from the researcher: an explanation beforehand of why the experiment was so important, and reminders during the experiment to carry on whenever the participant started to hesitate.
The most interesting part is that the participants believed the electric shocks were real until they emerged from the lab and were told the truth. The cries of pain were recorded, and the learner, Jim McDonough, was in on it.
This same pattern of authority bias is very evident in the corporate world. The highest-ranking, most highly paid or most experienced person (however that is defined in the room) often has extraordinary influence on everyone else, and their opinions can easily sway the whole group.
How to avoid the authority bias:
- In Milgram’s experiments, people were much more likely to be defiant when the experimenter wasn’t in the same room as them. Creating distance between yourself and the authority figure can help you take a stand against bad authority. So make sure you are aware of who has explicit and implicit authority in a group (or over yourself), and ask them to leave the room during critical decision-making moments.
- Seek disagreement. Alfred Sloan, President, Chairman and CEO of General Motors for many years, believed that decisions shouldn’t be made until someone had expressed why the “preferred” option might not be the right one. If everyone in the room agreed, he would postpone decisions so that disagreements could be developed, presented, discussed and debated.
- Have someone play Devil’s Advocate in every discussion.
- Ask the following questions:
- Is the person making this statement an authority in this area?
- Is this person being paid or rewarded for making this statement?
- What expertise does this person have in this area?
The self-serving bias
This bias is the tendency people have to blame external forces when bad things happen and to give themselves the credit when good things happen.
Example: When we win a poker hand it is due to our skill at reading the other players and knowing the odds, but when we lose, it is due to getting dealt a poor hand.
Example: A student thinking about their grades: When I get a good grade on a test I attribute it to my intelligence, or good study habits. When I get a bad grade I attribute it to a bad professor, or poorly set exam.
This bias does serve an important role: it helps protect our self-esteem, builds our confidence and helps us persevere in the face of adversity. However, it is more often damaging as it leads to faulty attributions, analysis and actions. This is very common in the business world (and elsewhere) as people regularly take credit for successes but do not accept responsibility for failures.
Interestingly, experts suggest that this bias is much more widespread in Western cultures – especially the United States – than it is in Eastern cultures such as China and Japan. This is because individualist cultures place a greater emphasis on personal achievement and self-esteem, so protecting the self from feelings of failure matters more. Collectivist cultures, such as those found in many Eastern and African societies, are more likely to attribute personal success to luck and failure to lack of talent.
How to avoid the self-serving bias:
- Value failure and take accountability – the self-serving bias is often a way of escaping failure.
- Find ways to give others credit.
- Take less credit when things go well. Acknowledge the role of luck and circumstance.
- Examine and know your strengths and weaknesses.
- Look for opportunities to improve rather than confirmation of good performance.
- Take the time to evaluate outcomes after the fact.
Using Cognitive Bias for Your Benefit
While we definitely want to learn how to identify and overcome cognitive biases in our own thinking, we can of course use our knowledge of cognitive biases to engage and influence other people. Here’s an interesting website giving insights on how to do this for marketing and sales purposes: https://getuplift.co/cognitive-biases-psychological-triggers-cheat-sheet/
The following strategies will be helpful in general for all of the above cognitive biases, and many more besides:
- Practice mindfulness. Be more aware of yourself, your thinking and your emotions as you process information.
- Consider who is impacted by your decision or lack of decision. Sometimes, looking at how others will be impacted by a decision will help to clarify the decision.
- Seek out potentially relevant or new information, and especially focus on disconfirming evidence.
- Actively seek diverse outside opinions to counter your overconfidence.
- Work hard to increase the diversity of your team.
- Reframe or flip problems on their head to see if you are viewing the issue only one of many possible ways.
- Many of the biases listed above occur because we work quickly and rely on intuition. Give your rational mind some data to work with, give it some time and consciously reframe what you know and you might be surprised to see a different point of view emerging.
- Reflect on the past. Look back on your decision-making history, and ask if you’ve ever been in a situation like this before. How was that situation similar to the current one? How was it different? What did you do? What were the outcomes? What would you do differently?
Although these practices might take some time to master, they will ultimately save you time and energy, and will lead to better decision making.
A group process for overcoming cognitive bias:
The cognitive biases above are common, but they represent only a small fraction of all the documented cognitive biases. Don’t become despondent or disillusioned at how easy it is for us to be fooled. But equally, don’t ignore this, thinking it’s too big an issue to do anything about. Understanding these biases is very helpful in learning how they can lead us to poor decisions, and even just knowing about them and acknowledging them will make a big difference in overcoming them. It will be one of the most important things you can do.
Additional resources:
- A superb summary
- [Video] Cognition and why your brain lets you down (11 minutes)
- [Video] 12 cognitive biases explained, with excellent examples (10 minutes)
- A Study.com online course (free signup required)
- 58 Cognitive Biases That Screw Up Everything We Do
- Book: Daniel Kahneman, Thinking, Fast and Slow (also see https://en.wikipedia.org/wiki/Thinking,_Fast_and_Slow – includes a nice list of key biases)
- Book: Dan Ariely, Predictably Irrational (also see https://en.wikipedia.org/wiki/Predictably_Irrational)
- Book: Malcolm Gladwell, Blink: The Power of Thinking without Thinking (see https://en.wikipedia.org/wiki/Blink:_The_Power_of_Thinking_Without_Thinking)
- Book: David McRaney, You Are Not So Smart: Why You Have Too Many Friends on Facebook, Why Your Memory Is Mostly Fiction, and 46 Other Ways You’re Deluding Yourself (see: https://youarenotsosmart.com/the-book/)
- Book: Cordelia Fine, A Mind of Its Own: How Your Brain Distorts and Deceives