The Truth, The Whole Truth

This article was first published by the Alliance for Useful Evidence.

Thomas Edison failed more than 1,000 times before he eventually found a successful design for a lightbulb. When asked about it, he said:
“I have not failed 1,000 times. I have successfully discovered 1,000 ways to not make a lightbulb.”

Useful evidence concerns not only innovations or interventions which work, but also those which don’t. Indeed, perhaps we learn more from those which don’t work than from those that do.

So it’s lamentable and dangerous that evidence about failures in ‘social purpose organisations’ (a good phrase encompassing charities, philanthropy organisations, social enterprises and international development agencies) is almost entirely suppressed. Many social purpose programmes fail. This is visible to the naked eye since the problems they are ostensibly solving manifestly persist despite them. In any case, in a sector which trumpets its penchant for innovating (meaning, doing things for which there’s no evidence yet), simple probability implies that many won’t work.

The important lessons from these failings are invisible because the organisations have no incentive to share them – and indeed normally a disincentive since publicising failings will hamper fundraising. As a result, if the literature is to be believed, practically everybody is in the top quartile. A miracle!

So hats off to the few brave souls who do publicly confess their failings and thereby enable the rest of us to learn.

Oxfam GB is the most recent joiner. It randomly selected 26 of its 362 completed programmes to include in a meta-review* of its work. The ‘review of reviews’ it recently published discusses a number of programmes which it believes worked and several which it thinks didn’t.

Engineers Without Borders Canada began publishing “Failure Reports” in 2008 and has now published four. They run to 30 pages and, as you might expect of engineers’ reports, are rather forensic about the reasons for the failures.

The incentives are easier on philanthropic foundations, which don’t have to raise money from outsiders. So it’s disappointing that so few of them publish about their failures, or even candidly about lessons learned. Two stand out as exceptions. The giant Hewlett Foundation is so aware of the tendency for failures to be concealed that it has an annual award for the staff member who made its ‘worst grant’. Failure is hard to learn from if it’s undiscussable.

The stand-out leader is surely the Shell Foundation, which, as it approached its 10th anniversary in 2010, wondered whether it was doing a good job. It couldn’t find out, as it said in this rather excellent understatement:

‘This report was triggered by a simple question: “Has our performance to date in achieving scale been good, average or poor when compared with our peers?” Given the lack of other published information around performance – including both success and failure – from peer organisations, this proved to be a very difficult question to answer. That is surprising given the billions of dollars managed by foundations.’

Surprising indeed. So Shell Foundation went it alone, publishing a warts-and-all account of its own performance, including citing the amount of money it felt it had effectively wasted. To my knowledge, the report is unparalleled in the philanthropy world. Crucially, Shell Foundation had tracked its performance as it went along, so it was able to change its strategy and improve – moving through three quite different operating models during that decade. It thus swapped an 80% failure rate (yes, really) for an 80% hit rate.

In improving its performance, Shell Foundation found that data about the locations and causes of its failures were the most useful evidence of them all.

*not a meta-analysis in the statistical sense.
