The topic of how we think, reason and make decisions is an important one, as the outcomes of our decision-making influence the trajectory of our own lives and the lives of others.

According to Daniel Kahneman, there are two types of thinking: system 1 and system 2. System 1 thinking is fast, unconscious and automatic. It helps us make quick decisions and make sense of our world effortlessly. However, system 1 is also the reason we fall prey to biases and failures of logical reasoning, because it relies on heuristics (rules of thumb) to reason and make decisions.
System 2, on the other hand, is slow, conscious and deliberate. It engages formal logic, such as syllogisms. But system 2 is effortful and puts a load on our working memory, so we are less likely to use it.

System 1 helps us understand and make sense of the world by finding patterns; however, this can also lead to a bias where we misjudge the probability of something happening. When asked which of these sequences they would prefer for a lotto ticket, most people do not choose the top one.

1, 2, 3, 4, 5, 6
12, 29, 64, 20, 15, 19
10, 60, 42, 28, 30, 16
55, 18, 88, 13, 25, 20

This is because system 1 perceives the bottom three as ‘random’, despite the fact that all four sequences are equally likely to come up in a lotto draw.
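If system 2 needs convincing, a quick calculation does the job. The sketch below (in Python) assumes a hypothetical draw of 6 numbers from a pool of 1–90, a pool size I have chosen only so that every ticket above is valid; whatever the real pool, each specific combination has exactly the same chance of being drawn.

```python
# A minimal sketch: every specific set of six numbers is equally likely,
# no matter how "patterned" it looks. The pool size of 90 is an assumption.
from math import comb

POOL_SIZE = 90
NUMBERS_DRAWN = 6

tickets = [
    [1, 2, 3, 4, 5, 6],
    [12, 29, 64, 20, 15, 19],
    [10, 60, 42, 28, 30, 16],
    [55, 18, 88, 13, 25, 20],
]

# Each unordered combination of 6 distinct numbers wins with probability 1 / C(90, 6).
p = 1 / comb(POOL_SIZE, NUMBERS_DRAWN)

for ticket in tickets:
    print(ticket, f"P(win) = {p:.2e}")
```

Every ticket prints the same probability; the ‘patterned’ one is no worse a bet than any of the others.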

Another bias that influences how we make decisions is the anchoring bias. This is where our decisions are influenced by our previous answers to a question or by information provided to us. That information acts as an anchor against which we judge all other information. Sale prices on price tags act as an anchor and can make the normal price seem reasonable even when it is outlandish. Checking source articles and using multiple data points (checking whether your information matches other sources) can help make your initial assumptions as accurate as possible.

Sometimes we hear people making claims or defending a decision based on their own experience or that of a couple of friends. This glitch in reasoning is called insensitivity to sample size. When only a few people are surveyed, there is large variation in that sample and we can’t be confident about drawing conclusions. For example, imagine a jar filled with lollies where 1/3 are one colour and 2/3 are another.
One person draws out 5 lollies, of which 4 are red and 1 is white. Another person draws out 20 lollies, of which 12 are red and 8 are white. Most people say the first person has more evidence that most of the lollies are red; however, the second person’s larger sample actually provides stronger evidence. Newspapers are littered with this type of fallacy, for example a headline like “schools are failing students in numeracy” based on one person saying their child isn’t learning the basics in maths at school. That is just one person and one child, who may not be achieving for numerous reasons. We can’t make any inference about the state of numeracy in schools from it, but these types of articles can and do influence us because of system 1 thinking.
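One way to put numbers on the lolly-jar example (my framing, not part of the original example) is to ask how much more likely each sample is if the jar is mostly red (2/3 red) than if it is mostly white (1/3 red), treating each draw as independent:

```python
# Rough sketch: compare how strongly each sample favours "the jar is mostly red".
# Draws are treated as independent (a binomial approximation of drawing
# without replacement from a large jar).
from math import comb

def binom_prob(n, k, p):
    """Probability of drawing k red lollies in n draws when each draw is red with probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def evidence_for_mostly_red(n, k):
    """Likelihood ratio: P(sample | jar is 2/3 red) / P(sample | jar is 1/3 red)."""
    return binom_prob(n, k, 2/3) / binom_prob(n, k, 1/3)

print(evidence_for_mostly_red(5, 4))    # 4 red out of 5   -> about 8
print(evidence_for_mostly_red(20, 12))  # 12 red out of 20 -> about 16
```

The 12-out-of-20 sample is about sixteen times more likely under ‘mostly red’ than under ‘mostly white’, compared with eight times for the 4-out-of-5 sample, so the bigger sample is the stronger evidence even though its proportion of red is lower.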

How we frame a question or problem can influence how we make decisions. Tversky and Kahneman (1981) asked participants in a study to choose between two options in a disaster-planning situation. If option A is implemented, 200 people will be saved. If option B is implemented, there is a 1/3 probability that 600 people will be saved and a 2/3 probability that no one will be saved. 72% of participants preferred option A and 28% option B, even though both options save the same number of people on average. The point here is that sometimes we may need to re-frame how we talk to people to get them to see the true benefits of differing options. As mentors, teachers or coaches we can re-frame how we talk to students to get them to try new and different things.
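A quick expected-value check (my working, not part of the original study write-up) shows the two options are statistically identical:

```python
# Expected number of people saved under each option in the framing example.
saved_a = 200                      # option A: certain outcome
saved_b = (1/3) * 600 + (2/3) * 0  # option B: averages out to 200
print(saved_a, saved_b)
```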

One of the hardest biases for us to shake when making decisions would have to be confirmation bias. This is when we select information that affirms our current beliefs.
Confirmation bias can lead us to draw conclusions that are not correct. Even scientists can fall prey to this bias, despite using the scientific method, which is designed to remove subjectivity. Constantly questioning our own beliefs and being open to change can help reduce the effects of this rather noxious bias. You can read more about confirmation bias in this blog post.

System 1 and system 2 thinking is food for thought for educators too. We want students to engage in more system 2 thinking so they will make good decisions, but system 2 thinking is effortful. How can we encourage more system 2 thinking?

  • One idea could be to give students more time to slow their thinking down: give them more time to answer questions and work on problems in class.
  • Making students aware that thinking is a deliberate process that requires effort can help develop the mental model that we sometimes have to keep persisting to get to the answer.
  • Teaching critical thinking skills so that students question and reflect on their own beliefs and the information presented to them.
  • Encouraging the use of statistics and probability in everyday thinking can help kids smell a rat when system 1 throws up a hurried answer.
  • Giving students time to revisit decisions and reflect on how they got to their answers can help them become better decision makers.

Can you think of other ways that we can encourage system two thinking?

 
