Posted on 27th April, 2011 by LEO Learning Web Team
This post first appeared on the Epic blog on 27th April 2011.
Patrick Thomas, Key Account Director Energy and Mining, shares his views on the importance of learning from safety incidents in the Energy sector.
In January the report Deepwater Horizon: Report to the President was made public, with clearly damning criticism of the Energy industry’s ability to learn from previous mistakes. And by mistakes, I don’t only mean the huge headline-making catastrophes, also known as “low probability, high impact events.” Crucially, the industry has also failed to learn from those all-too-common “high probability, low impact” incidents that can be precursors to major catastrophic events: operators taking short cuts when starting equipment, evacuation practice procedures not being carried out, alarms that trip so often they are ignored, or that small blip in the data that doesn’t make any sense and so is discarded. By themselves, these are seemingly innocuous incidents that amount to little more than annoyances for the individual. When they occur regularly, however, these high probability incidents can line up to create the one improbable event that results in injury or death.
The full report is expertly written in a narrative format that you won’t be able to put down. The political motivation for writing the report for mass consumption underlines the US Government’s desire to keep the nation engaged and appalled.
There are two observations in the report that jump out at me, both of which underscore the need for the industry to approach learning from incidents (LFI) systemically:
“As the Board that investigated the loss of the Columbia Space Shuttle noted, ‘complex systems almost always fail in complex ways.’ Though it is tempting to single out one crucial misstep or point the finger at one bad actor as the cause of the Deepwater Horizon explosion, any such explanation provides a dangerously incomplete picture of what happened – encouraging the very kind of complacency that led to the accident in the first place.”
“There are recurring themes of missed warning signals, failure to share information, and a general lack of appreciation for the risks involved. In the view of the Commission, these findings highlight the importance of organizational culture and a consistent commitment to safety by industry, from the highest management levels on down.”
When organisations deal with LFI, the tendency is to look for human error at the “sharp end” of the problem by focussing on the last person who touched it. The reflex reaction is to educate, reassign or dismiss the person, people or department thought to be behind the accident, and to presume that the organisation has learned from its mistake. However, as Report to the President indicates, no single error and no single person is at fault. The failure lies in the inability of every company implicated in the Deepwater Horizon disaster to adopt and nurture a process safety culture.
Organisational cultures are complex. How an organisation is perceived internally and externally, how it operates, how it treats its employees and how it learns are all indicators of its culture. Making a fundamental change to that culture, for example ensuring a “consistent commitment to safety by industry, from the highest management levels on down”, requires a systemic approach.
One clear way to gauge an organisation’s culture and its tendency to learn from mistakes is to listen to the conversations happening within it. What is the front line saying? Is it different from what management is saying? How are mistakes treated? Is there a fear of reprisal? Are people being authentic?
Organisational cultures are indeed complex, but they can change, and that change can be led from the centre. The conversations taking place are not only a gauge of an organisation’s culture; they are also where you can start to change it.
The start of a culture change programme doesn’t have to be onerous. You can start by targeting a particular area and build from there. For example, we are currently developing a programme that encourages challenging conversations between Front Line Leaders (FLLs) and their Operators and Technicians. By providing the teams with a common language around safety, for example using accepted safety models, the conversations become less personal, less risky to the individual.
The team will use the model to see their place in the overall system, and each individual team member will plot their role in preventing latent incidents. The shift to a common language enables individuals to disclose safety concerns, and the team then actions the solutions together. In this programme, the FLL is tasked with ensuring the group’s actions are delivered and reported back to the team. Delivered monthly, these facilitated team discussions help to fundamentally change the safety culture at the ‘sharp end’.
Although developed from the corporate centre, these facilitated discussions feel local and organic, and so are readily accepted by the Operators and Technicians. The discussions play to the FLLs’ strengths: they have come up through the ranks and are still “one of them”. They also support the FLLs in their role as managers.
When the FLLs start to share their successes online with the larger global FLL community, a relatively simple programme starts to have real, transformational results. This programme demonstrates that culture change programmes can start small and targeted, and still have big impacts. Deepwater Horizon put a bullseye on Transocean, BP, Halliburton, and Anadarko, but it has also challenged the energy and mining sectors as a whole to take a hard look at their safety culture leadership. Process safety training does not work as a single training initiative. It must be culturally ingrained, and it therefore needs to be considered systemically: from Operators and Technicians all the way through the global management teams.