UK charitable foundation staff and trustees are very white and very male. They are also often senior in years, and pretty posh. None of those characteristics is necessarily a problem in itself, but (a) the homogeneity risks a lack of diversity of views, experiences and perspectives, and many studies have shown that greater diversity leads to better decisions and better results. And (b) foundation staff and boards may collectively have little experience, and hence little understanding, of the problems they seek to solve, or insight into the communities they seek to serve.
Funded by a group of UK grant-making foundations, Giving Evidence is rating UK foundations on their diversity. We are also looking at their transparency – e.g., how easy it is to find out what they fund and when – and their accountability practices – e.g., whether they have a complaints process, whether they cite the basis on which they assess applications and make decisions, whether they publish any analyses of their own effectiveness. (Read an article in Alliance Magazine about this project.)
We have consulted on the criteria: our ‘starting criteria’ are below. Our criteria are drawn in part from other rating schemes, such as: GlassPockets’ Transparency Standard, Give.Org’s BBB Standards for Charity Accountability and the Funders Collaborative Hub’s Diversity, Equity and Inclusion Data Standard.
Giving Evidence will do and publish this rating annually – until the money runs out! We will publish all our data and analyses each time – because we are scientists and believe in the power of openness and transparency, not least to identify inaccuracies and areas for improvement. Our research and assessment will use only a foundation’s public materials (website, annual reports etc.) because these are normally all that is available to a prospective grantee – who may be deterred if a foundation looks too different from them.
Please note that this is a rating: it is not a ranking (which compares one organisation with another and shows who is top), nor an index (which shows how the whole pack moves over time). Unlike a ranking or index, a rating is an absolute measure: it is quite possible for all entities to achieve the top level, and it is also possible for them all to be in the bottom drawer. There is no pre-determined distribution across the ‘grades’: see graph below.
What are the criteria, and how can a foundation do well on them?
We are publishing our ‘starting criteria’, downloadable here. These are the criteria with which we are starting: we fully expect that some will turn out to be ambiguous, duplicative, or unworkable, so the final list may vary a bit. That document includes the provenance of each criterion. We will publish the eventual list that we actually use.
We are also publishing guidance for any foundation on doing well on the rating. For example, one criterion concerns having an accessible website, so we include guidance on how to do that.
Whose idea was this and who is funding it?
The project is funded by various UK foundations, led by Friends Provident Foundation. Other funders include: Barrow Cadbury Trust, The Blagrave Trust, Esmee Fairbairn Foundation, John Ellerman Foundation, Joseph Rowntree Reform Trust, Lankelly Chase Foundation and Paul Hamlyn Foundation. The group of funders meets periodically to advise on the project, and the Association of Charitable Foundations joins those meetings. The funders do not have any ‘editorial influence’ e.g., over the results of the research. The project draws on various social movements that have gained momentum over the years, for example, Black Lives Matter, Equal Pay, Disability and LGBTQ+ rights.
Giving Evidence has worked on these issues before. For example, in this study, we looked at how many UK and US charities and grant-making foundations have public meetings (e.g., Annual General Meetings) or decision-making meetings in public. (Answer: very few.) We were inspired by the 800-year-old City Bridge Trust in London, all of whose grant-making meetings are in public, and Global Giving, which has a public AGM. Caroline also wrote in the Financial Times about the lack of demographic diversity on foundation boards and some ideas for improving that.
Will this include all UK grant-making foundations?
No, simply because there are more of them than our resources allow. We will assess:
- All the foundations funding this project. They are all trying to learn and improve their own practice. This project is not about anybody hectoring anybody else. Currently 10 foundations are funding this work. And:
- The five largest foundations in the UK (by grant budget). This is because they are so large relative to overall grant-making: the UK’s largest 10 foundations give over a third of the total given by the UK’s largest 300 or so foundations, and giving by the Wellcome Trust alone is 12% of that. We also know that many charities deal with foundations across the full range of sizes.
- A stratified random subset of other foundations. We are including all types of charitable grant-making foundation, e.g., endowed foundations, fund-raising foundations, family foundations, corporate foundations and community foundations. We have not included public grant-making agencies (e.g., local authorities or the research councils) because they have other accountability mechanisms. Our set is a mix of sizes: a fifth from the top quintile, a fifth from the second quintile, and so on. The set is chosen at random from the list of the UK’s largest 300 foundations* as published in the Foundation Giving Trends report 2019 published by the Association of Charitable Foundations, plus the UK’s community foundations.
*In fact, that report details 338 foundations. We are looking at those, plus 45 community foundations (the 47 listed by UK Community Foundations minus the two for whom no financial information is given), i.e., 383 foundations in total.
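For illustration, the quintile-based selection described above could be sketched like this in Python. The foundation names and grant figures below are invented stand-ins, not the actual ACF dataset, and the sample sizes are arbitrary:

```python
import random

# Invented stand-in data: 300 foundations with illustrative grant budgets (£m).
# The real study draws on the ACF Foundation Giving Trends 2019 list.
foundations = [(f"Foundation {i}", random.uniform(1, 100)) for i in range(300)]

def stratified_sample(items, n_strata=5, per_stratum=4, seed=42):
    """Rank by grant budget, split into equal-size strata (quintiles),
    then draw the same number at random from each stratum."""
    rng = random.Random(seed)
    ranked = sorted(items, key=lambda x: x[1], reverse=True)
    stratum_size = len(ranked) // n_strata
    sample = []
    for s in range(n_strata):
        stratum = ranked[s * stratum_size:(s + 1) * stratum_size]
        sample.extend(rng.sample(stratum, per_stratum))
    return sample

subset = stratified_sample(foundations)
print(len(subset))  # 20: four foundations from each size quintile
```

Sampling within strata, rather than from the whole list, is what guarantees that each size quintile is equally represented in the final set.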
I’m a UK foundation. Can I opt into this?
Yes, by becoming a funder of the project. As mentioned above, all the project’s funders are included. Beyond them, our selection will be random.
Where do the criteria already under consideration come from?
They come from two sources. First, our own experience over many years with many funders (and in many countries). Second, we are borrowing some ideas from similar initiatives such as Transparency International’s Corporate Political Engagement Index, the “Who Funds You?” initiative for think tank funding transparency, the Equileap Global Gender Equality Index, ACF Transparency & Engagement, the Stonewall Workplace Equality Index and ShareAction’s annual ranking of fund managers.
Who is conducting the research and analysis? How will it be done?
Giving Evidence is doing the research and analysis. We have a team of researchers, all with experience of charities and funding, and with diverse backgrounds; they are also demographically diverse. Each foundation is assessed on the criteria by two researchers independently. Foundations will be allocated to researchers at random. The whole research process is overseen by Dr Sylvia McLain, a chemist who formerly ran a research group and taught at the University of Oxford. Where the two researchers’ scorings differ, they discuss the difference with Sylvia to understand and resolve it; sometimes a third researcher must assess the foundation on the disputed criteria, and if so, they are ‘blind’ to their colleagues’ assessments. This is a standard process used in many situations, such as marking university exams and assessing research applications. The whole project is overseen by Giving Evidence’s Director Caroline Fiennes. Details of our team are here.
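A simplified sketch of that two-rater process in Python. The criteria names and 0/1 scores below are invented for illustration, and the sketch skips straight to the blind third rater, whereas in practice disagreements are first discussed and only sometimes escalated:

```python
# Invented illustrative scores: 1 = criterion met, 0 = not met.
rater_a = {"accessible website": 1, "complaints process": 0, "publishes criteria": 1}
rater_b = {"accessible website": 1, "complaints process": 1, "publishes criteria": 1}
rater_c = {"accessible website": 1, "complaints process": 1, "publishes criteria": 0}

def resolve(a_scores, b_scores, c_scores):
    """Keep scores where the two independent raters agree; where they
    differ, take the third rater's score for that criterion. The third
    rater scores without seeing the first two assessments ('blind')."""
    final = {}
    for criterion in a_scores:
        if a_scores[criterion] == b_scores[criterion]:
            final[criterion] = a_scores[criterion]
        else:
            final[criterion] = c_scores[criterion]  # blind tie-break
    return final

print(resolve(rater_a, rater_b, rater_c))
# {'accessible website': 1, 'complaints process': 1, 'publishes criteria': 1}
```

The key design choice is that the third score is collected blind, so it cannot simply side with whichever colleague's view it has seen.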
When does Giving Evidence expect to publish the findings from the first year’s analysis?
We expect to publish the first year’s ratings around March 2022: see the timetable below.
How do I sign up to get updates on this project?
What is the timetable on what will happen next?
First, we defined the criteria. We ran a public consultation and did considerable work to trial the various suggestions to see which ones are actually workable.
Second, we are doing the research. We expect that to be during August – October 2021. During November and December 2021, each included foundation is given the data about itself to check.
Third, we will do the analysis. We expect that to be during December 2021/January 2022.
Consequently, allowing a bit of time for contingency, preparing communications etc. – and given that we intend to publish in print, which has a lead-time – we expect to publish the first year’s ratings around March 2022.
Can Giving Evidence do this analysis for foundations based in other countries?
Of course – if somebody wants to fund us to do it. Contact us here.
Further information is available on Friends Provident Foundation’s website.