Create a just culture
I recently teamed up with Nick Roder - an experienced Mobile Intensive Care Ambulance (MICA) paramedic (and my uncle) - to give a talk at Agile Australia where we explored what organisations can learn from paramedics about teamwork, collaboration and leadership. We shared how paramedics deal with their unique, complex environment, and how they create a collaborative culture that enables effective teams.
This is the second in a series of four articles, detailing the insights and practical techniques from paramedicine that you can use to help your people do their best work, every day. The other articles in the series are available here:
Continuous improvement requires Psychological Safety
Before I get to Just Culture, let’s talk about continuous improvement for a moment. Continuous improvement - making incremental improvements to processes and products over time - gets a lot of focus in organisations. And rightly so. When done correctly it can unlock immense value and efficiency. All of those little changes add up and help to create an organisation that learns and grows.
However, it’s important to recognise that continuous improvement is more than just processes and tools. You can’t simply tell teams to do retrospectives or to implement tools like lean A3s and leave it at that. Rather, you unlock the true power of continuous improvement when you create a culture where people are comfortable talking about and learning from their mistakes. A culture where people are empowered to change the way they work and to challenge existing practices.
My conversations with Nick taught me that continuous improvement is not just important for paramedics, it’s non-negotiable. For them, raising concerns can lead to valuable changes in practice that improve outcomes for patients and paramedics. This is only possible in an organisation that both makes it safe for employees to speak up, and is prepared to listen. In other words, an organisation that has psychological safety.
Amy Edmondson defines psychological safety as “the belief that the work environment is safe for interpersonal risk taking”1. It is the belief that you won’t be punished or humiliated for speaking up, taking risks, or making mistakes. Studies have shown that teams and organisations with psychological safety are more likely to continuously improve; in most cases, they can’t improve without it.
Just culture means taking a systems thinking view
I was interested to learn that paramedics take a systems thinking view. They focus on creating a Just Culture, which means being able to raise concerns, mistakes or challenges within the organisation in an environment that treats employees with due respect. A just culture recognises that human error is inevitable and that complex systems designed by humans will fail2. There is shared accountability between the organisation (the system) and the individual when things go wrong.
Safety-sensitive environments like paramedicine and aviation have developed this culture to allow employees to self-identify mistakes without fear of a punitive response. They can expect a respectful restorative process that seeks to correct errors and avoid them happening again.
When things go wrong, don’t blame or punish individuals whose actions or decisions were commensurate with their experience and training. Instead, seek to understand the broader context for their actions so that you can identify improvements. This is not the same as a no-blame culture: people are still held accountable for their actions, and negligence and wilful misconduct are not tolerated.
Take this example from healthcare. A nurse selects the wrong vial of intravenous medication from the dispensing system, administers the drug, and the patient has an adverse reaction. In a traditional organisation, the nurse may be reprimanded or punished. In a just culture, we might instead discover that two different medications look very similar, and make changes to better distinguish them and so avoid similar mistakes in the future.
The global IT outage in July 2024 caused by CrowdStrike illustrates the point. If CrowdStrike had simply identified and punished the individual responsible, it would have done nothing to prevent the same failure from happening again. Fortunately, they appear to be taking steps to address the systemic causes3 (fingers crossed).
How to foster a Just Culture
There are three simple things that you can do to foster a Just Culture in your organisation4:
- Support the people involved
- Manage your biases
- Consider the system
Figure 1: Venn diagram showing the things you can do to foster a just culture in your team or organisation
Support the people involved
When something goes wrong, your first priority is to support the people involved. Making a mistake is rarely a positive experience; however, a supported employee is more likely to process the experience productively, and to learn and grow as a result. They are also more likely to talk about mistakes in the future.
This applies as much in organisations as it does in paramedicine. Sure, the stakes are often lower in organisations, but the stress and human impact are very real. I once worked with an IT security team member who accidentally took down a production site while doing penetration testing. He was so stressed by his mistake that he was unable to work effectively to rectify it. Fortunately, his team lead was empathetic to the situation: he not only worked to get the site back up, but also prioritised his team member’s wellbeing. He recognised that the team member had made an honest mistake and provided appropriate support and counselling.
Manage your biases
Could another person with similar knowledge and experience make the same decision in the same situation?
Managing your biases means giving people the benefit of the doubt by assuming that they set out to do their best and achieve a good outcome. Remember, nobody signs on in the morning thinking "you know what, I'm going to stuff up today". When someone makes a mistake, put yourself in their shoes to ensure that you are empathetic to them.
Try asking yourself this question: Could another person with similar knowledge and experience make the same decision in the same situation?
Here are some common biases to be aware of when reviewing issues and pursuing continuous improvement4:
- Confirmation bias: favouring information that confirms our existing opinions and ignoring information that contradicts them.
- Availability bias: relying on information that comes to mind quickly and easily. We are more likely to consider something true if the information supporting it comes easily to mind.
- Recency bias: giving more importance to recent events than to historic ones.
- Representative bias: making judgments based on how well people or situations match particular stereotypes.
- Hindsight bias (retrospective coherence): creating a coherent narrative of past events based on what we know now. Complex situations that had multiple possible outcomes appear deterministic and predictable in hindsight, with events unfolding the way they ‘had to’.
- Outcome bias: judging the quality of a decision or action by its outcome, rather than by what was known at the time and the quality of the decision-making process.
- Fundamental attribution error: attributing other people’s actions to their character or other intrinsic qualities, rather than considering the influence of external factors.
Consider the system
"Errors are seen as consequences rather than causes, having their origins not so much in the perversity of human nature as in 'upstream' systemic factors"
Considering the system means that rather than focusing solely on individuals and their actions, you need to recognise that they are working within a broader context. Remember, a Just Culture recognises that there is shared accountability between the organisation (the system) and the individual when things go wrong. So ask: What aspects of the system contributed to the actions they took, and the decisions they made? Why did it make sense to them at the time?
Figure 2: The Swiss Cheese Model of accident causation
The image above depicts the Swiss Cheese Model of accident causation, which conceptualises how mistakes are made. It is used in paramedicine and aviation to illustrate that incidents are often the result of an alignment of systemic, structural (equipment) and human failures.
In this model, "errors are seen as consequences rather than causes, having their origins not so much in the perversity of human nature as in 'upstream' systemic factors"5.
The central premise is that our processes, technology and systems have multiple layers of defence (the ‘cheese slices’) to prevent mistakes and accidents. However, each layer has ‘holes’ that could potentially lead to an incident if not caught by a subsequent layer. It is when these ‘holes’ align that we get bad outcomes.
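As a toy illustration of this premise, you can think of each layer of defence as independently catching an error with some probability, with an incident occurring only when the error slips through the ‘hole’ in every layer. The layer names and probabilities below are illustrative assumptions, not data from any real system:

```python
# Toy sketch of the Swiss Cheese Model: an error becomes an incident only
# when it passes through the 'hole' in every layer of defence.
# All layer names and catch probabilities are hypothetical.

layers = {
    "code review": 0.70,      # probability this layer catches the error
    "automated tests": 0.80,
    "staged rollout": 0.90,
}

# The chance an error slips through one layer is (1 - catch probability).
# Treating the layers as independent, multiply the slip-through chances.
p_incident = 1.0
for name, p_catch in layers.items():
    p_incident *= (1 - p_catch)

print(f"Chance an error becomes an incident: {p_incident:.3f}")
```

The point of the sketch is that each additional imperfect layer multiplies the incident probability down (here 0.3 × 0.2 × 0.1 ≈ 0.006), which is why defence in depth, rather than individual vigilance alone, is what keeps complex systems safe.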
When you consider the system, you are encouraged to think that ‘there might be more to this error than just people’.
Conclusion
Approaches like agile, lean and other modern ways of working have given us many effective tools and processes that can improve how we collaborate and help us achieve better business outcomes. However, simply applying these tools and processes without addressing the underlying culture and mindset in an organisation severely limits their effectiveness.
This is why I think this concept of Just Culture is just as relevant in organisations as it is in paramedicine. When we foster a culture that recognises that there is shared accountability between the organisation and the individual when things go wrong, we empower people to talk about mistakes, issues and potential improvements. We create an environment where both people and the system they work within can learn, grow and improve.
References
- A. C. Edmondson, The Fearless Organization, 2019
- Safer Care Victoria, Just culture in adverse event reviews factsheet, accessed 30 Sep 2024
- ABC News, CrowdStrike releases root cause analysis of the global Microsoft breakdown, accessed 30 Sep 2024
- Safer Care Victoria, Just culture guide for health services, accessed 30 Sep 2024
- J. Reason, Human error: models and management, BMJ, 2000