Adaptation and complexity in safety

I was listening to David Woods this morning on Todd Conklin's podcast, discussing his early work on emergency response in nuclear power stations. The most effective responses occurred when operators changed their perspective based on new data as the situation progressed – in other words, they were adaptive. Notably, the test scenarios were deliberately designed to be dynamic and evolving, rather than a single major event, so that operators faced a genuinely complex problem.

This reminded me of Philip Tetlock's work on forecasting. He identified that the best forecasters (so-called superforecasters) consistently outperformed expert analysts for a number of reasons, but primarily because of their willingness to change their view as new information arose, rather than being wedded to a particular idea and falling prey to confirmation bias. Expert forecasters, by contrast, tended to hold tight to their preferred theory in spite of evidence that it might not be entirely accurate. Again, this was for complex, difficult-to-predict scenarios such as long-term political or macroeconomic trends.

Dave Snowden, in his work on complex socioeconomic and sociotechnical systems, talks frequently about adaptation, and goes further to discuss exaptation as well. Exaptation is a term from evolutionary biology for when an organism uses a feature for something other than what it originally evolved to do – feathers that evolved for warmth, for example, turn out to be helpful for flight. People are very good at exaptation: a lot of innovation occurs by repurposing something that already exists, rather than through a blinding flash of inspiration that produces a brand-new concept. It is also something we often outlaw in safety rules – only use the correct tool for the job.

This is work going back decades. The results are in. Adaptation is effective – even imperative. Those of us in health and safety who still think a fixed system with set procedures and a compliance lens is the way to go are waaaay out of date.

There is a caveat here, though. This is true for complex systems. A lot of the work we do is complex, but not all of it.

Some of our work can be quite simple. It often takes place in a complex environment, but the specific task itself is simple. Where things are simple, fixed rules work. Take wearing a harness when working at height. This is a simple task, and it is perfectly reasonable to say you must clip it on to an appropriate anchor point. There can be complexity around that – how do I know what's appropriate? Have the anchor points been tested? What else is happening that may make me forget, or clip on to something that won't hold my weight? – but the actual action of clipping on is very simple. In most circumstances, there is no need to be adaptive here to get the best outcome (not always, of course; these are somewhat fuzzy boundaries).

So, sometimes rules and procedures are appropriate and sometimes they are not (see Rules – who needs them). The key point is to know the difference.

This is where Snowden's Cynefin framework is useful. It is a sensemaking framework that helps you understand which of four domains you are in when considering a particular problem – Clear (previously called Obvious, and before that Simple), Complicated, Complex and Chaotic. Once you recognise which domain you are in, you can determine whether you need fixed processes to manage the problem, or whether experimentation, adaptation and emergence are the path forward. I'm not going to go into it in any detail here – there is a huge amount of information online (About – Cynefin Framework – The Cynefin Co) – but it is something that is vastly under-utilised in safety management.

I think this is where some of the 'whose idea is best' arguments in safety come from (see safety wars). In sweeping generalisation terms, Safety I looks through a simple lens (the Clear domain): safety is simple, follow the rules, be compliant. Safety II (and other 'new view' variants) looks through a complexity lens: work is complex, and people need to adapt to manage variability. The stark realisation from looking at this via Cynefin is that both are useful, just in different circumstances. It's almost like Erik Hollnagel titled his book Safety I AND Safety II for a reason.
