I know. You’ve never heard of the Flemish Red Cross. You realise that such a thing must exist, but you’d never given it a moment’s thought, right?
This is an operational charity: it is neither a funder nor fundamentally a research operation. Yet in this year alone (it’s still only November), the Flemish Red Cross (properly called the Belgian Red Cross-Flanders, BRC-F) has produced 27 academic papers(!), and since this work began in earnest 12 years ago, it has produced 92 peer-reviewed papers. All its research answers genuinely operational questions that arise from its on-the-ground work.
As I say, this is an operational charity. It runs the blood-banking service in Flanders (the Dutch-speaking part of Belgium, population 6.5m), so its HQ has half a floor where people donate blood (see right), a lab full of centrifuges, test tubes and god-knows-what medical kit for screening it, and vans for transporting it. There’s a big book stack for its service taking library books to people in hospital (see photo below). It runs first aid training, does humanitarian work through the Red Cross network, and has 249 ‘chapters’ across Flanders which do earthy things like befriending isolated elderly people. It runs refugee reception centres – one of which was set on fire the week I wrote this article.
Once upon a time, a Dutch person turned up at one of its first aid training sessions, and asked why BRC-F taught a method for doing CPR (cardiopulmonary resuscitation, to resuscitate someone whose heart has stopped) different from that taught in the Netherlands. One method will be better than the other, and they can be compared because CPR – more than almost any other intervention on Earth – has a super-clear outcome which is evident rapidly. This made BRC-F realise that it needed to ensure that its first aid advice and training manuals are based on the best available evidence. It began to systematically collect scientific studies on CPR and other first aid topics. It transpired that much first aid advice is based more on folklore and tradition than on proper evidence(!). It is (obviously!) important that first aid is as effective as possible, so the BRC-F has developed evidence-based first aid guidelines for the whole of Europe.
For each first aid topic, BRC-F does a systematic review (i.e., finds and synthesises all the existing rigorous evidence), and creates or amends its first aid guidelines accordingly. If no adequate rigorous trials exist, it creates them. In first aid, it has produced 7 systematic reviews and 448 evidence summaries(!!): the picture shows its training manuals for first aiders, and the massive stack behind them is the systematic reviews and evidence summaries which underpin them. Its recent studies have looked at first aid responses to poisoning and to snake bites (separately!), and BRC-F has recently become the Cochrane centre on first aid. An example RCT that it is running is about methods for teaching first aid.
Clearly many of its findings are applicable internationally, and the BRC-F has worked with regional experts to produce guidelines for first aid in India and Africa. (Local context is important: for example, Oral Rehydration Solution is the standard response to diarrhoea in developed countries, but BRC-F did a review to create guidelines for responding when it is not available.) Also, the International Federation of Red Cross and Red Crescent Societies uses the BRC-F’s evidence summaries for its international first aid guidelines, which are shared with 190 Red Cross/Red Crescent National Societies globally.
On the blood side, blood-banking services are legally required to decline (‘defer’) various types of donor, on grounds of the safety of the person receiving the blood. These rules vary by country – oddly – but can cover people with epilepsy, people who’ve had tattoos or piercings, men who have sex with men, and transgender people. Knowing which people to defer is an operational issue for the BRC-F. It has done systematic reviews of the evidence for nine of these deferral reasons, each of which has been published in a peer-reviewed academic journal. For some reasons, it found no scientific basis for the exclusion, so it has got the law changed to match the science. An example was changing the law in 2016 to end the exclusion of people with haemochromatosis.
It also does ‘bench’ research related to the blood work (lab, pipettes: the kind of research that you mustn’t drop on the floor; the photo shows one of the team), e.g., into how platelets behave when being separated from everything else in your blood. And on the humanitarian side, it has looked at how much water is needed per person per day in a disaster situation – for drinking, cooking, washing – and at forecasting first aid need using weather forecasts of things like floods, extreme cold or cyclones.
Also very unusually, it is interested in the evidence base for management practices, so that it can run itself in an evidence-based fashion. For instance, the refugee influx in 2015 gave it an opportunity to investigate the effect of recruitment practices. It had to more than double its number of centres (from 11 to 23), sometimes with only a week’s notice that a new centre was needed. The time pressure meant that it could only run a stripped-down recruitment process. It realised that this was a ‘natural experiment’, and compared the performance of people recruited through the speedy process against that of people recruited normally. And it put the results in an academic journal, for good measure. (More on that here.)
Furthermore, most of this research effort is funded from the charity’s own operational funds. Very little of the research is externally funded.
This is all remarkably different from how most charities operate. The average charity does not use academic research at all to design its activities: in our recent study of UK charities and donors, half of survey respondents said that they use academic journal articles ‘never’ or ‘hardly ever’. Charities produce ‘monitoring and evaluation’ (which is research, despite rarely being framed as such) mainly at the behest of funders: in a study in 2012, only 7% of UK charities said that their monitoring and evaluation was driven by the desire to improve their services, whereas over half said that it was driven by funders. It’s mainly to ‘prove’ their impact. The research methods used are often unsuitable for answering the question: most are case studies or pre-post studies, which cannot demonstrate causation.
I have never come across an operational charity which routinely uses systematic reviews: normally I have to explain what they are. Still less an operational charity which produces systematic reviews where none already exist. And still less an operational charity which has a dozen researchers on permanent staff producing systematic reviews worthy of A-grade academic journals, and using them to inform its own work and other organisations’ work, and to change the law. Funded almost entirely from its own operational budget.
Pull up a chair and watch the Flemish Red Cross. It’s like going to Mars.
This lovely 3-minute video explains it: