Who’s not applying themselves? Donors need to know

I figured that writing and publishing a book which should improve charitable giving made me a social entrepreneur. I needed money – all that editing and design doesn’t come cheap – so I approached UnLtd, the fund for social entrepreneurs set up with £100m left over from the Millennium Commission and run by lots of McKinsey people.

I’d have to complete a form – you can imagine how much entrepreneurs love a form – to apply for a grant of up to £10k. The odds are reportedly about one in five, the average grant about £5k*, and the form pretty complicated. I’d also need to be interviewed and assessed … all occupying time which I could be spending actually doing my entrepreneurial work.

UnLtd’s killer question was how I would assess the impact of my ‘project’ (let’s come back to the term ‘project’). It’s a book! What kind of answer are you expecting? That I’ll do an annual survey of charitable giving, identifying changes over time, and run some kind of regression analysis to isolate the book’s contribution to them? Or maybe I should run a randomised controlled trial, comparing donors who read my book with those who didn’t and tracking their relative performance over time? UnLtd might supply £3k of the ~£12k it’ll cost, even before accounting for my time writing and organising publication, so UnLtd gets to ‘count’ a quarter of whatever impact that experiment identifies?

What rot. Answering the question with any sensible degree of rigour would require work and cost which dwarf those of creating the book itself. So I gave up and went elsewhere. Two private donors, each of whom I’d met only once, obliged because the book is so obviously a good idea.

The story is salutary for several reasons. For one, UnLtd will never know about the entrepreneurs it fails to attract. Though it probably tracks its impact, I bet that tracking only includes the entrepreneurs it does support. There’s an outside chance that it considers the entrepreneurs who apply but are rejected – though very few funders even go that far. The ‘ones that got away’ – those who were deterred before the process even started – are never considered.

This ‘silent witness’ is important because if a funder is ostensibly trying to support the best entrepreneurs in the land, it needs at least to find them. Yet UnLtd’s ‘help’desk didn’t even take my name, so they have no way of researching this silent constituency.

UnLtd is not alone in this. Donors of all descriptions should actively research the organisations which their processes fail to reach. There’s a ‘selection bias’ even before the application forms arrive. Usually these processes select (i.e., preclude everybody except) organisations which are able to discover that the funder exists (watch out if you’ve got no network) and organisations able to complete forms (watch out if you’ve got poor literacy, bad English and/or no internet access); they also lose organisations and people who are well-enough connected to source what they need from less bureaucratic sources.

So donors who are ambitious, and hence passionately chasing the best uses of their resources, would do well to consider the important silent witnesses who don’t register on their processes. A little research is required – and possibly just a little imagination.

