Not everything that matters can be measured

24 Feb 2014 Voices

Debra Allcock Tyler rails against the growing demands to provide EBOs (evidence of the bleedin’ obvious).

I frequently hear nowadays that our sector doesn’t actually know if we’re doing any good.

This makes me really cross. There’s something ridiculously patronising about the notion that day after day a bunch of charity sector workers and volunteers turn up to work with no idea at all if what they’re doing is helping!

Charities do know whether or not the work they’re engaged in is any good. The real problem is not that they don’t know, or even that they don’t measure it; it’s that they don’t measure it using the same metrics, or language, as the people exhorting them to ‘demonstrate’ their impact.

There’s a small charity I know on the south coast which received a modest grant from the local authority to provide a volunteer outreach service to elderly folk who were still living in their own homes but were unable to get out much and no longer had family around them. The purpose of the work was to alleviate some of the profound loneliness experienced by the beneficiaries as a result of their situation. The volunteer programme basically consisted of willing folk visiting these elderly people in their homes to provide company and comfort.
 
They were told by the local authority that unless they could demonstrate the impact of the outreach volunteer programme, funding would be withdrawn.

The charity already monitors the work of its volunteers; they are debriefed on their visits, asked what further support they or the elderly person might need and so on.  All this is recorded.  But this apparently wasn’t sufficient.   So what are they supposed to do?  Ask Mr Jones, aged 87, to complete a questionnaire: “How lonely were you before the visit?  How lonely were you afterwards?”  Ridiculous.  

This is typical of what is so often demanded. Not useful reporting, in the charity’s own words, of what they are doing.  But what I call EBOs (evidence of the bleedin’ obvious).  It’s bleedin’ obvious that there is a real benefit to the lonely elderly having companionship.  Is it really worth wasting valuable funds, time, energy and effort creating some ‘measurement’ models that can be put on a piece of paper and fed into a computer?

Further – the cost of doing some form of ‘formal’ measurement that suited the local authority’s requirements was clearly disproportionate to the size of the grant.  

The whole notion of impact measurement, as it is currently perceived, sits uneasily with me.  Don’t get me wrong.  Of course charities should know what they achieve and be able to demonstrate it – but for the purpose of delivering for beneficiaries, not to satisfy some poorly thought-through requirement of a (usually statutory) funder. There are those who would say it’s unarguable that charities should produce impact reports.  But let’s ask - what do funders do with these ‘impact’ reports they insist upon receiving?  Because, let’s be honest, an awful lot of them probably don’t even get read at all, let alone acted upon.  What’s wrong with accepting what’s said in the annual report?

So – let me outline the issues with the current concept of ‘impact measurement’ as I see it.

  1. Who pays for it?  In my experience conservatively-minded funders are loath to pay for what can be seen as ‘administration’ costs undertaken by those mythical gremlins of the charity world – ‘backroom staff’. If funders demand measurement without offering the funds to cover it there is a real danger that it will divert resources from frontline work.
  2. How do you know if what you are asking to be measured is actually what matters?  How do you set your criteria or monitor largely intangible outcomes? How long is that proverbial piece of string? I’m a huge fan of data and evidence generally – but we need to get it in perspective.  
  3. What about the ‘hidden’ unmeasurable impact – which is arguably more important?  Human lives are a dynamic system where the actions or decisions of one person can have profound and unforeseen effects on their future.  For example, how do you account for the interventions of a charity in the life of a child? During the formative years of childhood we are surrounded by ideas and choices which shape who we become later in life. Take a seemingly benign intervention in the form of an after-school club.  The real impact is not something that can be measured short term.  It’s long term and largely unmeasurable (at least affordably).  Do the children stay longer in education; are they less likely to end up in prison; more likely to earn more?  How on earth would you know?  It’s an EBO.
  4. Who is the measurement actually for?  Far too often it’s not actually about gathering evidence to improve a service – it’s about satisfying the requirements of a funder who is asking for it because someone somewhere told them to.  The assumption being that everyone can measure, ergo everyone should measure.   The problem is that if you can’t measure the work you do, the temptation is to do the work you can measure.
  5. Is it actually worth it?  Is the measurement going to tell you anything you don’t already know?  Or put a disproportionate burden on smaller charities when funders begin to expect the same level of detail in impact reports that they require from larger charities? For a community centre seeking a grant for a new roof the benefit is obvious – what ‘impact’ could they possibly demonstrate on a piece of paper that’s worth the effort?  “98 per cent of our visitors no longer get wet.  We know this because we asked them to fill in a form”?!

Perhaps this is our own fault.  Do we just blindly go along with whatever a funder asks for in order to secure the money – and end up doing stuff we know isn’t worth doing for the sake of a quiet life?

I think we should challenge them at the start.  Ask them: who is this for?  Who is going to read it?  What are you going to do with it?  I suspect that more than half the time it’s just a tick-box exercise – in which case we might just as well make it up.  

Charities should not just roll over and invent ever more complex ways of reporting on their work to satisfy half-thought-through demands from some funders.  We need to engage in dialogue with them.  Find out what they want – then show them how we know what we do matters.

And let’s never forget that old mantra – ‘not everything that can be measured matters, and not everything that matters can be measured’.

Debra Allcock Tyler is chief executive of the Directory of Social Change
