The oil industry leaders/workers/people on the payroll who triggered the BP oil spill disaster knew the rules, knew what needed to be done, knew what could go wrong. Yet the disaster happened.
On a small scale, in our Viral Safety™ programmes, people say: ‘we know all that, yet we do things in a different, unsafe way’.
People in banking who created the Big Crisis and the Small Crisis knew the rules, knew the boundaries, had a code of ethics, a code of conduct. Yet, they did it, they crossed the line.
These two disastrous extremes highlight the difference between the knowing and the behaving.
Companies’ default response to problems is to assume that there is something wrong with the ‘knowing’. When the knowing is the focus, the solution is ‘knowing more’, which equals training.
It would be ludicrous to say that the BP people need re-training. It is equally ludicrous to expect bankers to go ‘back to training’, yet this is what they are doing en masse, kidding themselves.
The problem is culture. Cultures are not created by training. Cultures are behavioural fabrics, created as a social phenomenon (Homo Imitans).
Training is hardly the solution. On the contrary, it is a distraction and an alibi.
‘Render unto Training the things that are Training, and unto Culture the things that are Behavioural’.
The laws of knowing and behaving are as different as night and day. (Viral Change™)
It seems that in the two issues you cite, the BP oil spill and the financial collapse brought on by certain rule-bending banking practices, the problem is not general culture but specifically the incentive system. I guess you count that as part of the culture, but it’s sort of the 800-pound gorilla, which in most cases, over the long term, in big organizations, tends to trump all the other aspects of culture.
From what I have read, the oil-rig workers and their immediate managers were under great pressure to get the well online, on schedule, and to keep it pumping at full capacity. And, oh yeah, you should obey the safety rules and not get caught if you have to bend them. So if they didn’t have the correct part, or had to choose between doing an extra inspection and keeping the well running, the choice was clear. Output was well rewarded, and failure was a low-probability threat that might, at worst, lead to dismissal. Human behavior in that situation is fairly predictable — the occasional heroes who nag everyone else about “safety first” notwithstanding.
And in the financial firms, the rules put in place during the 1930s to regulate banking had not been extended to cover all the newer banking-like activities that these firms engaged in, the regulations that did exist were not aggressively enforced, and if the house of cards came down, people had a pretty strong expectation that neither the firms nor the employees would be punished for it. In the meantime, there were billions of dollars to be made by doing things that were locally profitable but dangerous to the long-term health of the system. Again, human behavior was predictable, and in this case not helped by a culture that admired people who could maximize income by creatively bending the rules.
I’m sure that there are many aspects of culture that are worth examining in such situations, but the first place to look is at the incentive system — the de facto one, understood by the insiders, and not the one that is proclaimed to outside inspectors.