If it looks like a duck (or a snake)….

“The first rule of snakes [problems] is, if you see a snake, you kill it… Just take care of it”

Jim Barksdale, former CEO of Netscape

It’s rare for an event to be truly unexpected.
We know that our personal habits affect our health. We know that incorrect use of tools and machinery can cause injury. We know that small-scale corner-cutting leads to more serious infringements. We know that running complex systems – like drilling rigs or nuclear power stations – beyond established safe parameters can be catastrophic.
Yet, time after time when something goes wrong, we hear a variation of ‘we had no idea that this could happen’. This is despite the mountains of evidence from previous events clearly showing how a specific set of actions led to the incident or issue.
In the majority of the incidents, issues or crises I have been involved in or studied, the warning signs were all there and had been identified before the event.

For the situations where I played a part, I wish I could claim to have been the one who always called these out. I wasn’t. But I can say with certainty that the signs were almost always there.

As risk managers, especially those with a governance position, a big part of our role is to be the voice that speaks out. We should be evidence-based and objective, so if we see something that looks like a duck, walks like a duck and quacks like a duck, we should call it a duck. We should also be the ones making sure that everyone else is calling it a duck unless there is a better explanation.
However, what often happens is that we see a duck but, instead of dealing with it, we convene a committee to study the duck (delay), spend a lot of time arguing that it isn’t a duck (denial) and eventually decide that even if it is a duck, we don’t need to do anything (dismiss).
You will have seen these three tactics – delay, deny, dismiss – used countless times, I’m sure. But now ask yourself: how often did the problem disappear?
I’m going to guess, rarely.
In fact, I’m willing to bet that instead of disappearing, the problem got worse over time. So instead of dealing with the problem when it is probably still solvable, a combination of delay, denial and dismissal means things end up being much worse. But why?
Sometimes the rewards are too attractive.
That’s why Wells Fargo overlooked unauthorized accounts being opened. And why markets ignored the fragility of trading mortgage-backed securities prior to the 2008 US financial meltdown. And why Uber’s board allowed a toxic culture to continue. The rewards were simply too big.
At other times, it’s because the organization knows that this is a nasty problem but they don’t want to deal with it. Unfortunately, as Colin Powell said, “Bad news isn’t wine – it doesn’t improve with age”. Turning a blind eye today simply allows the problem to grow tomorrow. When you finally have to confront the problem, it is much worse. Facebook is discovering this now as concerns over users’ privacy, election interference and transparency are coming to a head with US Congressional investigations and a massive drop in share price.
In many of these cases, you will hear people say ‘we had no idea’ or ‘this was a black swan event’*, but this simply isn’t true. The warning signs were all there. They just weren’t acted upon.
But please don’t get me wrong. I understand that there are sometimes compelling reasons to do nothing and dealing with problems is always hard and often unpleasant. I get that. But my point is that once you have identified the problem, you have to tackle it because it isn’t going anywhere.
So no matter how sophisticated or detailed our tools are, sometimes you need to step back and look at the big picture. Use your common sense and your intuition, and trust the most straightforward explanation.
So if it looks like a duck, or a snake, call it what it is and deal with it. Sometimes, there isn’t a more complicated explanation and a problem today won’t be any easier to deal with tomorrow.

* The term comes from Nassim Nicholas Taleb’s book, ‘The Black Swan: The Impact of the Highly Improbable’. Taleb uses the example of a survey that only finds white swans to explain that although this confirms that white swans are most common, it doesn’t disprove the existence of black swans. His point is that predictive models based on previous observations of what is ‘normal’ can be highly misleading.
However, there are very few true black swan events. Most of the time the warning signs are there but were ignored.

What do you think?