Rating UK foundations on their transparency, accountability and diversity

UK charitable foundation staff and trustees are very white and very male. They’re also often senior in years, and pretty posh. None of those characteristics is necessarily a problem in itself, but (a) the homogeneity creates a risk of lacking diversity of views, experiences and perspectives. Many studies, in many settings, have shown that increasing diversity leads to better decisions and better results – including on climate. And (b) foundation staff and boards may collectively have little experience of (and hence little understanding of) the problems they seek to solve, and little insight into the communities they seek to serve.

Funded by a group of UK grant-making foundations, Giving Evidence has rated UK foundations on their diversity. We also looked at their transparency – e.g., how easy it is to find out what they fund and when – and their accountability practices – e.g., whether they have a complaints process, whether they cite the basis on which they assess applications and make decisions, whether they publish any analyses of their own effectiveness. (Read an article in Alliance Magazine announcing this project, and a second article with some early findings.)

The 2022 results are now public! They are summarised here, and the full report is here. Watch the launch event in which we discuss the results, here. We have run the analysis again during Autumn 2022 and will publish the Year Two results in Spring 2023.

The criteria for Year Two (research in 2022) are the same as in 2021: details here.

We consulted on the criteria. Our criteria are drawn in part from other rating schemes, such as GlassPockets’ Transparency Standard, Give.Org’s BBB Standards for Charity Accountability and the Funders Collaborative Hub’s Diversity, Equity and Inclusion Data Standard. They are also informed by a public consultation we ran in summer 2021. For Year Two (2022/23) we ran a second public consultation. Below are the criteria we started with – though some turned out not to work.

Giving Evidence will do and publish this rating annually – until the money runs out! Our research and assessment use only a foundation’s public materials (specifically, its website and annual reports). This is because we are taking the stance of a prospective grantee, who may be deterred if a foundation looks too different from them.

Please note that this is a rating. It is not a ranking. That is so that improvement isn’t zero-sum: everybody can rise or everybody can fall. Equally, everybody can get an A or everybody can get a D. It is also not an index (which shows how the whole pack moves over time). Unlike a ranking or index, a rating is an absolute measure. There is no pre-determined distribution across the ‘grades’ because that would make it zero-sum. This is the same as for commercial bond ratings: see graph below.

What are the criteria, and how can a foundation do well on them?

Before we gathered any data for Year One (2021), we published our ‘starting criteria’, downloadable here. These are the criteria with which we started: we expected that some would turn out to be ambiguous, duplicative, or unworkable, so the final list might vary a bit from the starting one – which it did. That document includes the provenance of each criterion. The eventual list that we actually used is in the report above, and here.

We also published guidance for any foundation on doing well on the rating. For example, one criterion is around having an accessible website, so we included guidance on how to do that.

Foundations are exempt from questions irrelevant to them: e.g., a foundation with no or few staff is exempt from reporting pay-gap data, and a foundation with only small investments is exempt from reporting its investment policy.

Whose idea was this and who is funding it?

The project is funded by various UK foundations, led by Friends Provident Foundation. Other funders include: Barrow Cadbury Trust, The Blagrave Trust, Esmee Fairbairn Foundation, John Ellerman Foundation, Joseph Rowntree Reform Trust, Lankelly Chase Foundation, Indigo Trust, John Lyon’s Charity, City Bridge Trust, and Paul Hamlyn Foundation. The group of funders meets periodically to advise on the project, and the Association of Charitable Foundations joins those meetings. The funders do not have any ‘editorial influence’ e.g., over the results of the research. The project draws on various social movements that have gained momentum over the years, for example, Black Lives Matter, Equal Pay, Disability and LGBTQ+ rights.

Giving Evidence has worked on these issues before. For example, in this study, we looked at how many UK and US charities and grant-making foundations have public meetings (e.g., Annual General Meetings) or decision-making meetings in public. (Answer: very few.) We were inspired by the 800-year-old City Bridge Trust in London, all of whose grant-making meetings are in public, and Global Giving, which has a public AGM. Caroline also wrote in the Financial Times about the lack of demographic diversity on foundation boards and some ideas for improving that.

Will this include all UK grant-making foundations? 

No, simply because there are more of them than our resources allow. Each year, we assess 100 foundations. They are:

  1. All the foundations funding this project. They are all trying to learn and improve their own practice. This project is not about anybody hectoring anybody else. In Year One, 10 foundations funded the work. And: 
  2. The five largest foundations in the UK (by grant budget). This is because they are so large relative to the overall size of grant-making: the UK’s largest 10 foundations give over a third of the total given by the UK’s largest 300 or so foundations, and giving by the Wellcome Trust alone is 12% of that. We also know that many charities’ funding relationships span the full range of foundation sizes.
  3. A stratified random subset of other foundations. We are including all types of charitable grant-making foundation, e.g., endowed foundations, fund-raising foundations, family foundations, corporate foundations and community foundations. We have not included public grant-making agencies (e.g., local authorities or the research councils) because they have other accountability mechanisms. Our set is a mix of sizes: a fifth from the top quintile, a fifth from the second quintile, and so on (a sketch of this sampling is below). The set is chosen at random from the list of the UK’s largest 300 foundations* as published in the Foundation Giving Trends report 2019 published by the Association of Charitable Foundations, plus the UK’s community foundations.

*In fact, that report details 338 foundations. We are looking at those, plus 45 community foundations (the 47 listed by UK Community Foundations minus the two for whom no financial information is given), i.e., 383 foundations in total.
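
For illustration, here is a minimal Python sketch of the quintile-stratified sampling described in point (3). The data structure (a list of dicts with a grant_budget field) and the function name are our assumptions for the example, not the actual code used for the rating.

```python
import random

def stratified_sample(foundations, n_per_quintile, seed=None):
    """Pick an equal number of foundations at random from each size
    quintile. `foundations` is a list of dicts, each with a
    'grant_budget' field (an assumption for this sketch)."""
    rng = random.Random(seed)
    ranked = sorted(foundations, key=lambda f: f["grant_budget"], reverse=True)
    step = len(ranked) // 5
    sample = []
    for q in range(5):
        # The last stratum absorbs any remainder after integer division.
        stratum = ranked[q * step:] if q == 4 else ranked[q * step:(q + 1) * step]
        sample.extend(rng.sample(stratum, n_per_quintile))
    return sample
```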

For subsequent years, we will rate foundations in categories (1) and (2) above, and will do a fresh sample for (3).

I’m a UK foundation. Can I opt into this?

Yes, by becoming a funder of the project. As mentioned above, all the project’s funders are included. Beyond them, our selection will be random. 

Some foundations have asked whether they can pay to have this assessment even if they are not in the set to be rated. If you are interested in this, please get in touch.

Where do the criteria already under consideration come from? 

They come from two sources. First, our own experience over many years with many funders (and in many countries). Second, we are borrowing some ideas from similar initiatives such as Transparency International’s Corporate Political Engagement Index, the “Who Funds You?” initiative for think tank funding transparency, the Equileap Global Gender Equality Index, ACF Transparency & Engagement, the Stonewall Workplace Equality Index and ShareAction’s annual ranking of fund managers.

Who conducts the research and analysis? How is it done?

Giving Evidence does the research and analysis. We have a team of researchers, all with experience of charities and funding, and with diverse backgrounds. They are also demographically diverse. Each foundation is assessed on the criteria by two researchers independently. Foundations are allocated to researchers at random. The whole research process is overseen by Dr Sylvia McLain, a chemist who formerly ran a research group and taught at the University of Oxford. Where the two researchers’ scorings differ, there is a discussion between them and Sylvia to understand and resolve the difference: it is sometimes necessary for a third researcher to assess the foundation on the disputed criteria, and if so, they are ‘blind’ to their colleagues’ assessments. This is a standard process used in many situations, e.g., for marking university exams and assessing research applications.
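
In code terms, the adjudication step might look something like the sketch below. This is only an illustration: the moderated discussion cannot be captured in code, and the majority rule shown here is our assumption about how a blind third rating settles a disagreement, not a documented part of the method.

```python
def resolve_score(score_a, score_b, blind_third_rating):
    """Resolve two independent ratings of one criterion.

    `blind_third_rating` is a callable returning a third rating made
    without sight of the first two (a hypothetical helper for this
    sketch). Returns the agreed score, or None to flag the criterion
    for further discussion."""
    if score_a == score_b:
        return score_a
    score_c = blind_third_rating()
    if score_c in (score_a, score_b):
        return score_c  # two of the three raters agree
    return None  # no majority: escalate to the research overseer
```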

We then sent each included foundation the data about itself for it to check, using the contact details that it provides publicly. Once we had a data-set that we were happy was correct, we turned the data into scores: each foundation gets a numerical score on each of the three pillars. The total points available varies between foundations because of their exemptions. From that score and the total available, we calculate a percentage score on each pillar for that foundation, and that translates into their A/B/C/D rating on that pillar. The combination of those pillar scores translates into the foundation’s overall rating.
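
A minimal sketch of that scoring step is below. The grade bands shown are placeholders of our own – the real cut-offs are set out in the report – but the structure (exempt criteria shrink the denominator, and the percentage maps to a letter grade) follows the description above.

```python
# Placeholder grade bands -- the real cut-offs are in the FPR report.
BANDS = [(80, "A"), (60, "B"), (40, "C"), (0, "D")]

def pillar_rating(criteria):
    """criteria: {name: (points_earned, points_available)}.
    Exempt criteria are simply omitted, so the total available
    varies between foundations."""
    earned = sum(e for e, _ in criteria.values())
    available = sum(a for _, a in criteria.values())
    pct = 100 * earned / available
    return pct, next(grade for cutoff, grade in BANDS if pct >= cutoff)

# Example (hypothetical criteria names): a foundation exempt from some
# criteria is scored only out of the points that apply to it.
pct, grade = pillar_rating({"complaints process": (2, 2),
                            "publishes grants data": (1, 2),
                            "accessible website": (2, 2)})
```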

The whole project is overseen by Giving Evidence’s Director Caroline Fiennes. Details of our team are here. Masses of detail about the method are in this Twitter thread and in the report above.

When does Giving Evidence expect to publish the findings from the Year One analysis? 

The Year One results are out: summarised in a Twitter thread here, and the full report is here. The Year Two (2022/23) results will publish in around March 2023.

How do I sign up to get updates on this project?

Sign up to Giving Evidence’s newsletter, here, and watch the Twitter feeds of Giving Evidence’s Director Caroline Fiennes here and the Foundation Practice Rating here.

What is the timetable for what happens next?

For Year Two, we ran a consultation, inviting comments and suggestions on any aspect of the FPR. We then published the list of criteria for Year Two. The research for Year Two is complete, and we expect to publish the results in Spring 2023.

Can Giving Evidence do this analysis for foundations based in other countries? 

Actually, it needs to be run by somebody in-country who really knows the foundations, data and regulatory requirements in that country. We’re happy to help you do it. You will need funding! Contact us here.

Further information is available on the Foundation Practice Rating website.
