
Hitting the target but missing the point?

03 Sep 2015 Voices

Sally Cupitt explains why the evaluation of grant funding by charities needs to be more flexible and innovative.


Funders and commissioners need to monitor and evaluate how their funding is used; not just to check that money is being spent correctly, but also to demonstrate the difference their money makes and to share learning. Many voluntary organisations understand why their funders need to ask for information, and are happy to provide it.

But while some funders are doing really well in this area, others have not yet got it right. There is good evidence that many voluntary organisations are still being asked for too much information, or the wrong information.

Too much monitoring?

One of the key principles of good funding practice is that monitoring requirements should be proportionate to the level and type of funding. This sounds obvious, but at NCVO Charities Evaluation Services we still hear of cases where the monitoring requirements for a £10,000 grant are similar to those for a £1,000,000 grant.

Of course, funders may want to increase the level of evaluation of innovative or pilot projects, especially those where there is potential for rolling out the intervention. This is sensible and advisable, but that additional monitoring and evaluation must be supported by the funder.

Our 2008 Accountability and Learning research reported that many voluntary organisations found their monitoring and evaluation requirements burdensome. More worryingly, there was good evidence that too much of the information requested wasn’t being used by the funder. A good rule of thumb is that if you won’t use the data, don’t collect it.

The same research found that, paradoxically, a heavy focus on reporting may actually make voluntary organisations less likely to use monitoring and evaluation for learning and development, as they begin to associate the activity with external compliance rather than internal improvement.

Collecting the wrong data

Some voluntary organisations tell us that the data they are required to collect for their funders is of little use to their own organisation. Others note that different funders ask for different things, or ask in different ways (often using different language for key terms), resulting in an increased monitoring burden.

Because many voluntary organisations rely heavily on project funding, this problem is compounded: a lack of time or planning leads some organisations to develop multiple reporting systems to satisfy the needs of a range of funders.

There remains a heavy focus within our sector on output-based reporting, geared towards scrutiny rather than outcomes - perhaps exacerbated in the current climate of austerity. Yet it is widely accepted that voluntary organisations and their funders should be focusing on outcomes too. The problem is made worse by short-term funding and reporting timescales, under which there simply hasn't been sufficient time for outcomes or impact to be achieved.

Demonstrating impact

There are some interesting tensions in impact reporting for funders, about which further debate would be useful. If a funder wants to be outcomes-based, how much flexibility should they give the funded organisation? A full outcomes approach may mean not being prescriptive about activities, recognising that as long as outcomes are achieved for the intended beneficiaries, it does not actually matter how they are achieved.

Such flexibility may need to extend to the choice of outcome indicators; one might argue that these should be locally identified. But such flexibility may later lead to difficulties in aggregating non-comparable data.

Going forward

While we understand the difficulties funders face, a degree of flexibility in their requirements would be very helpful. Alongside proportionate and appropriate monitoring and evaluation requirements, such flexibility might include shared reporting with other funders, fitting in with an organisation's existing monitoring and evaluation systems, or moderating reporting frequency according to the nature of the funded work.

We recognise, as do many funders, that the quality of outcomes data collected by some voluntary organisations needs to improve if funders are to use it well to demonstrate their impact. While funders need to ensure their requirements are reasonable, the voluntary sector needs to fulfil its part of the bargain and ensure that the quality of its data is sufficiently good.

If you want to read more on good practice for funders, see the NCVO KnowHow NonProfit pages on commissioning and evaluation, and the Inspiring Impact funder pages.

Sally Cupitt is the Head of NCVO Charities Evaluation Services