Making charities’ research more findable and useful

Quite possibly, some NGO has discovered a great way to, say, prevent re-offending or improve literacy, but nobody else knows about it, so their genius innovation doesn’t spread. Surely this is unacceptable.

Giving Evidence has been exploring whether this risk could be reduced if research by charities (including ‘monitoring and evaluation’) were easier to find and clearer. We started with a suspicion that (i) some charity research is published but only in places that few people would know to look, such as on a small organisation’s website, and (ii) some of it could be clearer about what the intervention actually was, or what research they did, or what the results were.

We started in UK criminal justice, and consulted many experts, funders and practitioners on two proposals: (i) creating a repository to hold charities’ research, and (ii) creating a short checklist of items for charities’ research to detail: the intervention; the research question; the research method and how it was used (e.g., if 20 people were interviewed, how were those 20 chosen?); and the findings.

The short-form checklist, suitable for nonprofits in any sector, is here.

Giving Evidence’s mission and work

Many thanks to the Social Progress Imperative!

More videos on our insights and approach are here.


Assessing Funders’ Performance: Five Easy Tools

This article was first published in the Stanford Social Innovation Review

Measuring impact is so tough that many funders give up, but there are some insightful and actionable tools for funders that aren’t daunting.

When I was a charity CEO, we approached a family foundation. There was no formal application process. Instead, we had to write various emails, and I had to attend various meetings (not unusually, the foundation wanted to see only the CEO, the highest-paid staff member). A physicist by background, I kept a tally of the time all this took and the implied cost. Eventually we got a grant, of £5,000. This required that we (I) attend more meetings—for “grantee networking,” meeting the family, and so on. We noted the cost of those too. Towards the grant’s end, the foundation asked us to compile a short report on what we’d done with the grant. By now, the tally stood at £4,500. I felt like saying: “What grant? Honestly, you spent it all yourselves.”

One hears worse. A physicist at Columbia University has calculated that some grants leave him worse off. And I’ve heard of a heritage funder requiring that applications have input from consultants; this made the cost of applying £100,000, though the eventual grant was just £50,000.
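The arithmetic behind these anecdotes is worth making explicit. Here is a minimal sketch, using the figures from the stories above; the function name is ours, invented for illustration, not any standard grant-making tool:

```python
# Illustrative sketch: the "net grant" a charity actually receives once
# application and reporting costs are counted. Figures come from the
# anecdotes above; net_grant is a hypothetical helper, not a real tool.

def net_grant(grant, application_cost, reporting_cost=0):
    """Return what a grant is worth after the costs of winning and servicing it."""
    return grant - application_cost - reporting_cost

# The family-foundation example: a £5,000 grant that cost £4,500 to win and service.
print(net_grant(5_000, 4_500))       # 500 — a £500 net benefit

# The heritage-funder example: a £50,000 grant that cost £100,000 to apply for.
print(net_grant(50_000, 100_000))    # -50000 — the applicant ends up worse off
```

Trivial as the sum is, few funders or grantees appear to run it; doing so is one of the easiest ways to spot when a funding process destroys more value than it transfers.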


Easy ways for philanthropic donors to see if they’re doing well

This article was first published by the Social Impact Analysts Association.

Some skiers are better than others. Some singers are better than others. The same for teaching, nursing and curling. So it seems reasonable to suppose that some people are better at supporting charities than others.

But how do you tell? Curlers can easily see if they beat their opponents, and surgeons see if patients live or die, but success in philanthropy is less evident. Whereas businesses get feedback immediately and constantly – unpopular or over-priced products don’t sell – donors don’t. They can’t rely on charities telling them, since they daren’t bite the hand that feeds them. Steve Jobs cited the difficulty of knowing if you’re giving well or badly as deterring him from giving much at all.

Happily, it is possible – and not terribly hard. Giving Evidence, a consultancy and campaign which helps donors to give well by using sound evidence, has found various tools which help almost any donor to understand their performance. They’re collated in a new white paper, and are simple: they may even seem rather obvious, but they have proven useful to individuals, companies, and foundations who give.


Philanthropy in transition

Caroline Fiennes was one of 11 leaders interviewed by The Guardian for the Philanthropy in Transition series. 

A new generation of donors wants impact and engagement

Out of the dot.com boom came a new breed of donors for whom good intentions are not enough and evidence is key.

How do you think philanthropy is changing, and what’s driving those changes?

The most obvious changes in the past 15 years are the arrival of many new donors, new ways of giving, and much higher profile.

It started with the dot.com boom: money from eBay, Microsoft, Google et al. They brought tools common in business but not used in philanthropy: high engagement, a focus on results, and financial instruments beyond grants, such as loans and quasi-equity investments. We often think of them as flashy, and while that’s true of some, there are major European and Asian donors who keep out of sight.

Growth is driven by self-made wealth. The UK’s rich list shows that 50 years ago most wealth was inherited; now it is self-made. This has brought an urgency about getting things done, which has spurred interest in new ways of engaging.


Are we relying on unreliable research?

“Ask an important question and answer it reliably” is a fundamental tenet of clinical research. And you’d hope so: you’d hope that medics don’t waste time on questions that don’t matter or which have been answered already, and you’d hope that their research yields robust guidance on how to treat us*. Does research in our sector aimed at understanding the effects of our interventions adhere to that tenet?

We suspect not. It’s a problem because poor-quality research leads us to use our resources badly. The example of microloans to poor villagers in Northeastern Thailand illustrates why. In evaluations which compared the outcomes (such as the amounts that households save, the time they spend working or the amount they spend on education) of people who got loans with those of people who didn’t, the loans looked pretty good. But those evaluations didn’t take account of possible ‘selection bias’ in the people who took the loans: perhaps only the richer or better-networked people wanted them or were allowed to have them. A careful study which did correct for selection bias found that in fact the loans made no difference. The authors conclude that “‘naive’ estimates significantly overestimate impact.”


Assessing impact needs a reliable comparison group

This letter discusses an article in Stanford Social Innovation Review and was first published there.

“Dressed to Thrive” [in Stanford Social Innovation Review, Winter 2013] describes the work of Fitted For Work (FFW) in helping women into work. By way of demonstrating FFW’s effectiveness, it reports that “75 percent of women who received wardrobe support and interview coaching from FFW find employment within three months… In comparison…about 48 percent of women who rely on Australian federal job agencies find work within a three-month period.”

But the comparison isn’t valid, and doesn’t demonstrate anything about FFW’s effect. This is because women who get FFW’s support differ from those who don’t in (at least) two respects. First, they found out about FFW and chose to approach it for help. It’s quite possible that the women who do this are better networked and more motivated than those who don’t. That would be a ‘selection bias’ in the women whom FFW serves. And second, of course, the women who come to FFW get FFW’s support. The comparison doesn’t show how much of the difference is due to the selection effect versus how much is due to FFW’s support.


Why policy change takes more than just funding research

This article, written with Annie Duflo, was first published in Alliance Magazine. A pdf version is here.

Don’t just tell me what to do, come and help me do it!’ said an Indian government official to a researcher bearing results from studies into effective aid programmes. His response is salutary: there is much work now on increasing the use of evidence in public policy, so we need to understand what policymakers actually need and want, and what will help them be more evidence-driven. For foundations there is a clear message: it isn’t enough just to fund research. You have to make sure it reaches the relevant policymakers and in a form that is useful to them.

Over ten years, Innovations for Poverty Action (IPA) has run more than 350 studies in 51 countries to find what works in alleviating poverty. We have had some success in influencing policies of governments, NGOs, foundations and others. Here’s what we have found.


It’s hard to make evidence-driven decisions if loads of data are missing, or garbage

First, missing data. Philanthropic donors, operational charities and others often have to deal with this. Hence unearthing the missing data is a theme in Giving Evidence’s work: 

  • Massive emergency aid is now flowing to the Philippines following Typhoon Haiyan. Operational NGOs and government aid agencies can only make evidence-based decisions about what’s needed and where to prioritise if they all share data about their activities and plans – in real-time and in a machine-readable format. Generally they don’t: so after the Asian tsunami, for example, some children got vaccinated three times and others (presumably) not at all. Owen Barder (who invented such a format), others and I had a letter in the Financial Times requesting that these data be shared in order to avoid such nonsense in the Philippines. Owen was then on BBC Radio 4’s PM programme and Newsnight.
  • International development in general could be better based on evidence of need and supply if donors disclosed what they’re doing and where. Hence my rant in The Economist in support of Publish What You Fund, which finds that most donors don’t.
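Why does the machine-readable part matter? Once each agency publishes its planned activities in a common structured format, a trivial merge reveals exactly the double-coverage and gaps described above. A minimal sketch, with invented records and field names (this is not the real schema of any aid-data standard):

```python
# Hypothetical sketch: merging published activity plans from two agencies
# to spot villages covered twice and villages missed entirely.
# Records and field names are invented for illustration.

agency_a = [{"village": "A", "activity": "measles vaccination"},
            {"village": "B", "activity": "measles vaccination"}]
agency_b = [{"village": "B", "activity": "measles vaccination"},
            {"village": "B", "activity": "measles vaccination"}]  # duplicated plan

all_villages = {"A", "B", "C"}  # the villages that actually need coverage

# Count how many planned activities each village receives across agencies.
coverage = {}
for record in agency_a + agency_b:
    coverage[record["village"]] = coverage.get(record["village"], 0) + 1

duplicated = [v for v, n in coverage.items() if n > 1]
missed = sorted(all_villages - coverage.keys())

print("Covered more than once:", duplicated)  # like the thrice-vaccinated children
print("Not covered at all:", missed)          # like those never vaccinated
```

The analysis is elementary; the binding constraint is that agencies rarely publish comparable data at all, which is precisely the point of the letter and of Publish What You Fund.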


Shameful story of Rockefeller and Einstein

This was first published by the Huffington Post.

100 years old this year, The Rockefeller Foundation likes to tell the tale of its founder’s responsiveness and foresight:

‘When a young Albert Einstein sent a request for $500 to John D. Rockefeller’s top lieutenant, Rockefeller instructed his deputy, “Let’s give him $1,000. He may be onto something.” It was bold and daring, intrepid and risk-taking.’

Time is important, as Einstein of all people taught us, so it matters when that story took place. The answer is astonishing: it was 1924. The ‘young’ Einstein was 45 years old. He’d won a Nobel Prize three years before, for work published 19 years earlier which laid the foundations of quantum theory by explaining the photo-electric effect. [The Nobel cheque, below, even says ‘Professor’ on it.] The request came 19 years after his special theory of relativity, which shows among other things that E=mc². It was also 19 years after he’d explained the bouncing around of gas atoms that you probably saw down a microscope at school. (1905 was a big year for Einstein: one of the most significant for any scientist, ever.) It was seven years after publication of his general theory of relativity about the nature of space-time: arguably the greatest achievement of the human mind, and five years after observational corroboration of a major prediction of general relativity, hailed by The Times newspaper as a ‘Revolution in Science – New Theory of the Universe’.
