
It's time we stopped idolising failure in innovation

Failure, it seems, is in vogue.

We're told to "fail forward fast", "celebrate failure", give our teams "permission to fail", etc.

But have we gone too far? Have we inadvertently started idolising failure?

It is true that failure is inevitable in innovation. Just as risk is inevitable in earning investment returns.

But we should not lose sight of the fact that our ultimate objective is success rather than failure; returns rather than risk. Failure and risk are means to an end, not ends in themselves.

I recently heard of an organisation that set a KPI target that "at least 90% of innovations must fail". I think I know what they were trying to achieve. They wanted their staff to be bolder. Less incremental. And that would mean tolerating higher levels of failure.

But to actually encourage - mandate even - more failure is perverse. The obvious unintended consequence is that staff will be encouraged to sabotage perfectly good innovations!

Catchy phrases lauding failure had a purpose. That purpose was to break the mindset that failure was unacceptable. To reintroduce failure as an acceptable cost of innovation. But now I fear they have taken on a life of their own. And it is not good for business. Or for innovation.

To quote Frederic Etiemble, "a good idea doesn't have to become a dogma". (Source)

The original intent of those catchy phrases was to enable learning by doing. It's time we focused our attention back on that original purpose.

Innovation does not require failure. Innovation requires you to run experiments. Experiments don't succeed or fail. They produce results from which you can learn.

What we really want is a more scientific approach to innovation.

In science, experiments don't fail. They either prove or disprove an hypothesis. Or they're inconclusive. Either way, we learn something.

Scientists don't just throw random chemicals into test tubes and hope something interesting happens. Research programmes are carefully planned and structured.

So how should we go about innovating in a more scientific way?
  1. Be very clear on your goals.
  2. Break those goals down into the smallest testable experiments.
  3. Start with the experiments that address the greatest uncertainty with the greatest potential impact.
  4. For each experiment, set a clear hypothesis. Before you start, know (1) what data you're going to collect and (2) how you're going to collect it to confirm or disconfirm the hypothesis (a minimal sketch of such a test follows this list).
  5. Make sure you have a control group. You need to know whether the hypothesis was confirmed or disconfirmed because of the experiment and not because of some other factor.
  6. Experiments are not commitments. Make sure you can stop the experiment any time you want.
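
Purely as an illustration of points 4 and 5, here is a minimal sketch in Python of how one such experiment might be evaluated. The onboarding-flow hypothesis, the sample sizes and the conversion counts are all hypothetical, and the simple two-proportion z-test shown is just one of many ways to compare a treatment group against a control group.

    from math import sqrt
    from statistics import NormalDist

    def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
        """Return the z statistic and two-sided p-value for the difference
        between two conversion rates (e.g. treatment vs. control)."""
        p_a = successes_a / n_a
        p_b = successes_b / n_b
        pooled = (successes_a + successes_b) / (n_a + n_b)
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_a - p_b) / se
        p_value = 2 * (1 - NormalDist().cdf(abs(z)))
        return z, p_value

    # Hypothesis, stated before the experiment: the new onboarding flow
    # (treatment) lifts sign-up conversion versus the current flow (control).
    # The counts below are invented for illustration only.
    z, p = two_proportion_z_test(successes_a=118, n_a=1000,   # treatment group
                                 successes_b=90,  n_b=1000)   # control group
    print(f"z = {z:.2f}, p = {p:.3f}")
    # A small p-value supports the hypothesis; a large one disconfirms it or
    # leaves it inconclusive. Either way, there is a result to learn from.
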
Words matter. Our focus on failure will lead to failure. Let's change the language. Let's focus on success. Let's focus on learning. Let's focus on a scientific approach to innovation.

We need to move the narrative from:

  • "we tried a, b and c and failed - awesome job everyone!"
    to
  • "we tried a, b, and c, and learned x, y and z." 

The successful innovators already get this. It is the unsuccessful ones, the not-yet-successful ones, who will be misled by lazy, populist slogans.


Addendum

(June 2023)

In her book, "The Right Kind of Wrong", Amy Edmondson distinguishes between three kinds of failures:

  1. The potentially intelligent ones. These teach us things in the midst of genuine uncertainty, when you can’t know the answer before you experiment.
  2. The basic ones, which we wish we had never experienced and would love a ‘do over’ on.
  3. The preventable complex-systems failures. Those where we ought reasonably to have known things could go wrong and could have prevented them.

In her post on the subject, Rita McGrath puts the loss of the Titan in this third category.

I would add the collapse of FTX to this category. In fact, listening to videos of Stockton Rush of OceanGate talking about risk was very much like listening to Caroline Ellison of Alameda Research talking about risk.
