Charities need a better way to show evidence of impact to the public

16 Dec 2015 Voices

The infrastructure for evidencing the effectiveness of social programmes is fundamentally flawed, says Genevieve Maitland Hudson.

The charity reaction to last week’s report by the True and Fair Foundation highlighted the disastrous lack of agreement around the ‘right’ ways of accounting for what charities do. Overhead is not the answer, certainly, but the kerfuffle around it only demonstrates the need for better quality shared evidence.

Despite good intentions, particularly from increasing numbers of providers on the frontline, we have a system that is poorly conceived for improving services for those we aim to help. This is in large part because of the incompatibility between the product we need – good quality useful data – and our methods of production – ‘unique’ evaluation frameworks and reports.  

Improving social programmes from evidence works best when datasets are relatively large, reliable, as longitudinal as possible and shared. What we produce is relatively small, of poor quality, short-term and private.

Why is the current system so bad, and how can we create a better evidence infrastructure for the future?

There are several intertwined elements that conspire to embed the current system. Here are three:

  • Misaligned incentives: commissioners and funders treat evidence as a form of compliance or a contractual safeguard; providers respond in kind and collect poor-quality, largely unhelpful data that is fed up the chain and too often left unanalysed by either party. Intention and collection stand in for usefulness.
  • The consultancy market: the way evidence collection is incentivised combines with the relative lack of appropriate skillsets in provider settings to promote the buying-in of evaluation expertise. That market makes it hard to argue for collective, shared datasets, since consultancy tends to promote the opposite. By arguing for shared data, evidence experts would risk an established income stream.
  • The culture of innovation: the emphasis on innovation that has driven so much funding and commissioning, just as the evidence agenda has become established, has made collective and shared approaches to data more difficult to implement.

The key facets of innovation are change and a unique selling point. The best indicators for programme improvement are relatively constant across providers. The two are often in direct opposition to one another. And data tends to make a social entrepreneur feel a lot less special, however awesome their digital dashboard.

In order to make the substantive improvements that would cut through these systemic difficulties, we need a very different attitude to evidence. We need to start understanding data collection as a shared, collective endeavour rather than an individual exercise in self-promotion. Producing evidence should be different in kind to marketing or fundraising.

We should also think harder about the infrastructure itself. Who collects what kind of data, and where does it sit? How can data collection be made cheaper and easier? Above all, how can it become useful to frontline delivery and improve services for those we want to help?

There are some hopeful signs. The Ministry of Justice Data Lab is an excellent example of shared outcome data that is useful to social programmes and can be provided at minimal cost from information already collected. Launched as a pilot in 2013, it has now been confirmed as a permanent service. An Education Data Lab has also been launched to help social programmes access data from the National Pupil Database. Research is in progress for Employment and Health Data Labs, which would continue this work.

In Scotland the new Looked After Children Data Strategy aims to broaden the evidence base for children in, and leaving, care by linking existing data sources to healthcare, justice and employment data. This initiative will aim to tackle the specific problems that come from aligning data sources whilst protecting privacy. There will be much to learn from this newly launched initiative.

The John Jay College of Criminal Justice in New York has established the Evidence Generation initiative to support social programmes working in youth justice to produce better quality evidence of their effectiveness. Evidence Generation gets beyond the current UK paradigm, which matches a social research student with a social programme for a short-term placement (evidence for the usefulness of this approach is unclear). Short-term evaluation is not the core of Evidence Generation’s work. Instead it aims to build internal capacity for data collection and measurement within partner agencies in order to meet the requirements of larger systematic reviews.

These are a few approaches that could help rebuild the infrastructure of evidence and improve programmes for those we want to help. Each is one part of the story, one building block.

To get a better system we need to consider where each effort works best, and bring them together in a shared and collaborative effort. There are significant challenges to overcome. Sharing information is risky. But in this case it is a risk worth taking. 

Genevieve Maitland Hudson is a director of impact measurement consultancy OSCA