Ian Allsop demonstrates why a lot of research isn’t worth the paper it’s written on.
There was a time when finding out the answers to seemingly trivial questions could take a lot of painstaking research. Nowadays you are but a Google click away from finding the solution you need, and sometimes it may even be correct.
As an example, I will ask you now: “Who has scored the most own goals in their career?” See if you can find out the answer by the time you reach the end of this column.
While the internet, and technology generally, has made it far easier to capture and analyse data and knowledge, it has also exponentially increased the amount of information out there, making it harder to distinguish what is useful from what is, at best, ‘back of a fag packet’.
So many surveys
As a journalist, I am inundated with the results of one survey or another – indeed civilsociety.co.uk itself has been known to add to the oeuvre, though at the very thorough and credible end of the spectrum. But a lot of surveys are poorly researched, statistically insignificant and often self-serving.
Indeed some 74 per cent of all published research is known to be instigated and presented to prove a preconceived position, or to sell something.
I know this as I researched it myself, using a carefully designed questionnaire that would give me the precise result I needed to make this point.
The reason I have been pondering this is because last week I had another phone call from a research company having a nose around people’s perceptions of charities.
Admittedly, during a random sampling exercise I am as likely to be contacted as the next bloke, despite having quite a lot of knowledge about the area being researched, but three calls in one year seems high.
Indeed, I asked this latest caller whether it was an entirely random call, and whether the fact that I worked in a field where knowing people’s perceptions of charities was vital might disqualify me from taking part. Or was that in fact why I had been selected?
She dutifully went off to talk to her supervisor and then returned to the phone. She didn’t answer any of my queries (these researchers don’t like it when you turn it round on them) but said it was OK for her to continue to cross-examine me.
Her first question was: “Name as many charities as you can”. Now, I am not professing to know the names of all the 180,000-odd, but I do know a shedload. I explained, while suppressing a chuckle: “Well, as I just told you, my job is as a charity sector journalist (I may have even elevated myself to commentator) so I know literally thousands. Do you want me to list all of them?”
In the way that indicates the interrogator is reading from a script and not really listening, she said: “Yes, just list as many as you can”. I said: “We could be here all day.”
I did consider at this point merely repeating here in print the ones I mentioned over the next ten minutes, to fill space until the end of the column and avoid me having to think much more about how I am going to draw it to a conclusion. But I will stick to edited highlights.
Baffling the researcher
I started with the usual household names: Oxfam, National Trust, Cancer Research UK.
But then I got bored and thought I would have some fun and go a bit off-piste: Manchester Grammar School, Eton College, Charity of Mrs Mary Ann Blakemore Minor for Gentlewomen in Distressed Circumstances, Charifaith Common Investment Fund (other pooled investment vehicles aimed at charities are available).
I did regularly ask her if I had listed enough, but she kept encouraging me to continue. I threw in some exempt ones, some excepted ones, some small local obscure ones. What about Scottish ones, I asked? Do they count? I even – and I am not proud of this – made one up just to see what I could get away with.
Eventually, she could sense I was running out of steam and said I could stop, but then asked me some more targeted questions about charity brand recognition in certain spheres of charitable activity.
As I said goodbye I did wonder what future initiative, or promotion, my responses would be used to influence, and about the problems of drawing conclusions, and even formulating policy, on the basis of such research.
Ultimately, I suppose, it comes down to whether anyone is researching the research. And, if they are, whether they themselves are doing it properly.
Oh, and the answer to the research question I posed at the beginning? Why, William Shawcross of course.