Very few of us come to work intent on doing harm. Yet despite that, we all keep making mistakes. Most of them pass unnoticed and do little harm, but we are all aware of the times they don't, and it is not only the patient who suffers.
So why do mistakes keep happening? We often find a myriad of excuses and apportion blame to those we feel performed poorly, but that is to ignore how susceptible we all are. The human capacity for error can be attributed to the two systems our brains use to function, and by appreciating the interaction between these systems and their faults we can both prevent mistakes and design better systems in the future.
To illustrate these two conflicting systems at work here are two quick experiments to try on yourself.
Exp 1. A bat and a ball cost £1.10 together. The bat costs £1 more than the ball. How much does the ball cost?
Go on, do it!
For most people doing this puzzle, the answer that initially jumps to mind (system 1 – intuition) will be 10p. However, when you think about it (system 2 – analytic), a 10p ball would leave a difference of only 90p, so it must be wrong. You will then reach the true answer of 5p. It is not that we are all innately bad at maths; it is just that system 1 produces the quickest, easiest answer so you can get on with more important tasks.
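Spelled out, the system 2 check is a single line of algebra (writing the ball's price as x, so the bat costs x + £1):

```latex
x + (x + 1.00) = 1.10 \;\Longrightarrow\; 2x = 0.10 \;\Longrightarrow\; x = 0.05
```

So the ball is 5p and the bat £1.05, making the difference exactly £1 – and the total exactly £1.10.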
Exp 2. Watch the video and count how many times the team in WHITE pass the ball
For those of you who have never seen this before and were concentrating hard on counting the passes, you will quite happily have missed the gorilla walking across the screen (my children certainly failed to spot it, even when incentivised with chocolate). This is what happens when you engage system 2: your brain puts so much processing power into the task at hand that you ignore the obvious.
System 1 vs System 2
System 1 is low-effort, intuitive thought. It's fast, it allows us to multitask, and it works well most of the time. However, it is full of bias and error – yet without it we would not be able to function. System 2, analytical thought, is by contrast accurate, but at the cost of speed, attention to other stimuli, and the ability to multitask. To function successfully we must use both systems, as we have limited mental processing power. We do, however, need to be mindful of which system we are using, and of its intrinsic flaws, to avoid error.
System 1 errors – Short-cutting
Because system 1 is so easy, your brain will default to it subconsciously whenever possible. But its short-cutting manifests itself in many errors that are obvious (in hindsight):
- Over-confidence: You don't sense-check the calculation or diagnosis (as in experiment 1).
- Avoidance: You send a patient to CDU/MAU because you don't want the fight with the specialties, or request a random troponin because it's "safe".
- Belief: You take information at face value because challenging it is difficult.
- Assumption: You assume someone else will do it, take responsibility, or knows what's happening.
There are many other examples, but common to all is your brain trying to preserve bandwidth so you can continue to multitask by taking the “easy” but often suboptimal option.
System 2 errors – Hyperfocus
With effortful system 2 you are unlikely to have problems with the specific task you are engaged in; beyond that task, however, you are vulnerable to several errors.
- Ignorance: If you are trying to reduce a difficult shoulder, you'll happily ignore that the patient has stopped bothering to breathe. Or you may miss the STEMI on the ECG thrust under your nose during a resus, because all your processing power has been diverted into the task at hand.
- System 1 error: While engaged in system 2 thought, if you try to perform any other task (such as giving advice) your brain cannot expend further resource on it, and you will inevitably default to system 1 and its inherent problems.
Recognising when you’re at risk of making mistakes
As demonstrated above, we are all at risk of bias in our decision making, leaving us vulnerable to mistakes. There are, however, high-risk situations you should learn to recognise (in yourself and in others).
System 1 – Short-cutting
- Busy/Cognitive overload: When you're too busy multitasking to engage system 2, you will subconsciously short-cut decisions, defaulting to what is easiest or most expedient (probably not what is right).
- Tired/Fatigued/Hungry: When you're tired, or you've missed your break, you'll take any short-cut available. An experiment on Israeli judges making parole decisions demonstrates this nicely: the judges' easiest, "safest" decision is to deny parole, while granting parole takes significant mental effort. As the results showed, in the run-up to breaks the chance of parole fell significantly (so if you're up for parole, get your hearing just after lunch).
- Belief: Humans are hardwired to believe (skepticism is effortful). This makes the handed-over patient, or a diagnosis suggested by a patient, nurse, or doctor, very risky: without putting significant effort in, you are likely to accept what you are told.
- Low motivation/Too relaxed: If you're not invested in a decision you'll rely on system 1, as system 2 is just too much effort.
- Stereotyping/Emotion: Beware of the "typical" patient, or the patient you either dislike or like too much. You're liable to make presumptive decisions and fail to engage system 2 thinking.
System 2 – Hyperfocus
- Task focused – This could be a task such as intubation or running a resuscitation. All your attention will be taken by the task at hand, and what would normally seem obvious will be missed (just like my kids and the gorilla).
- Time pressured/Stressed – You're trying to perform against a target (the 4-hour wait, sepsis treatment), and your focus is consumed by that target, leaving all other decisions to system 1 and ignoring what in hindsight is common sense.
What can we do?
There are no simple answers, as we are hardwired to prefer seductive, effortless system 1 thinking whenever possible, reserving our effortful system 2 thinking for what we deem important.
1. Training
Training and repetitive exposure can be used to alter what is intuitive (system 1), getting us to perform better in defined situations. Consider what happens when there is a loud bang: a civilian will immediately turn to look, while a combat veteran will drop to the floor. Neither response took any thought, but they are very different because of training and experience. However, without engaging system 2, the veteran's intuitive response to drop is as inappropriate in a supermarket as the civilian's is in a combat zone.
Klein's work on decision making shows that when confronted with a problem we quickly draw on similar scenarios from previous experience or training. This drawing on previous experience is intuitive system 1 processing, and liable to bias. To ensure good decision making, system 2 then has to check whether your previous experience applies to the current situation, and if not, either modify your plan of action or look for a better model. So even with good training, if system 2 does not engage we will inevitably make poor decisions.
2. Checklists
Checklists have been used in many settings (e.g. intubation, discharge), inserting a hard stop and check on our natural "laissez-faire" system 1 processing. However, simply introducing a checklist is of little use unless it is adopted and enforced by the majority: by necessity a checklist is cumbersome and gets in the way of "efficient" system 1 working, so it is liable to be dropped at the very times it is most needed.
3. Decision Tools
More than 60 years ago, Meehl demonstrated that experts (that's what we are supposed to be) become very good at predicting the outcome of decisions whose immediate results we see. However, our ability to predict (or diagnose) accurately when we don't regularly or immediately see the results is poor, and simple statistical models perform as well as or better than we do. Once decision tools become accepted (such as the Wells score), however, we change our internal predictive models to reflect these scoring systems, improving our accuracy.
A further benefit of using decision tools is that they can remove some of the system 2 burden from individuals, freeing them to make more useful decisions.
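To see why such tools lighten the system 2 load, here is a minimal sketch of how a points-based decision tool works. The criteria and weights below are modelled loosely on the two-level Wells score for pulmonary embolism, but they are illustrative only and not for clinical use:

```python
# Illustrative sketch of a points-based decision tool, loosely modelled on
# the two-level Wells score for PE. Weights and threshold are for
# illustration only -- NOT for clinical use.

WELLS_PE = {
    "clinical_signs_of_dvt": 3.0,
    "pe_most_likely_diagnosis": 3.0,
    "heart_rate_over_100": 1.5,
    "recent_immobilisation_or_surgery": 1.5,
    "previous_dvt_or_pe": 1.5,
    "haemoptysis": 1.0,
    "active_malignancy": 1.0,
}

def wells_score(findings):
    """Sum the weights of all positive findings."""
    return sum(points for name, points in WELLS_PE.items() if findings.get(name))

def pe_likely(findings, threshold=4.0):
    """Two-level rule: PE is 'likely' if the score exceeds the threshold."""
    return wells_score(findings) > threshold

patient = {"heart_rate_over_100": True, "previous_dvt_or_pe": True}
print(wells_score(patient))  # 3.0
print(pe_likely(patient))    # False
```

The point is that the clinician supplies only the yes/no findings (system 1 pattern recognition); the arithmetic and the cut-off – the error-prone system 2 work – are fixed in advance.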
4. De-biasing exercises
De-biasing exercises are useful. S.P.I.T. is just one of many mental games that gets you to think about a diagnosis and what you might be missing, by making you activate system 2. Ask yourself:
- What is Serious?
- What is Probable?
- What is Interesting?
- What is Treatable?
5. Take a break
Just like the Israeli judges, you are at your best after a break. Other studies have also shown that subjects who were forced to repeatedly engage system 2 thought persisted longer and performed better when they were given sugar. So eat something.
6. Supportive Environment
Evidence shows people make better decisions, and improve more, in supportive environments where they are not in fear of being berated or embarrassed. That is not to say people should not have responsibility, however: without responsibility, we naturally allow system 1 to rule.
7. Keep a healthy skepticism
Question diagnoses or "facts" that are given to you. Look for errors and try to avoid confirmation bias. All of this is effortful, but remember: to ASSUME makes an ASS of U and ME.
8. Recognise your limitations & the limitations of others
Your capacity for analytical system 2 thought is finite – be aware of when it’s running out and get help. Equally, recognise that others are reaching their limits and support them. This may be as simple as waiting to ask advice, or taking the cannula out and discharging the patient yourself.
What can the department/trust do?
The department and trust can have a hand in the solutions above, but can also support us in other ways.
1. Adequate staffing
We must recognise that individuals have a limited capacity for system 2 thought. Without appropriate staffing, and the time to perform the tasks required, individuals will inevitably perform poorly.
2. Improve processes
Most of our processes have evolved naturally, leaving us with sub-optimal processes that put unnecessary strain on the individuals expected to perform within them. When designing processes we need to make them as simple as possible, so that they support as much system 1 thought as possible, leaving individuals plenty of bandwidth to solve the problems that will inevitably crop up.