
Making an impact

The methodology and dissemination of economic impact studies need examining, says Richard Fletcher

I bet you’re thinking about economic impact studies. There can’t be many in the public or not-for-profit sector that aren’t. You might have a great artistic track record or be renowned for social and community work – but what seems to get support these days? Jobs, tourism, income. Underlying philosophical arguments aside, the arts must have some kind of role to play in economics, even if this isn’t, or shouldn’t be, their strongest suit.

Economic studies of the arts may have an air of novelty, having been avoided in the past because of general resistance from the sector. John Myerscough's 1988 report 'The economic importance of the arts in Britain' is considered the first study of its kind in the UK, and it seems the debate has not yet been satisfactorily concluded. For example, a recent summary of Museums, Libraries and Archives Council case studies found that, of 274 study themes, the majority were social or educational and only 34 could be classed as having economic, tourism or regeneration themes. What might this balance look like in five years?

Have we shied away from investigation for fear that our impact is negligible, or even negative? Little growth, few jobs created, not much earned? Or because we see ourselves inhabiting a fundamentally different world? We do not want to accidentally give ammunition to the unnervingly familiar criticisms of wasteful extravagance, or of the poor subsidising the leisure of the rich. The arts world is often unfamiliar with what such studies need to uncover, and may be openly suspicious of the results and their applied value. I recently helped carry out such an investigation, working with De Montfort University, Creative Leicestershire and Performing Arts Leicester, and since then have further considered the use, value and limits of economic studies.

A persistent criticism is bias: you wanted a result, and you found it. The report gets 'PR-ed', polished and cut down for wider consumption. One element of a strong, effective study is the clear application of a 'scientific' approach: show your methodology and make your estimates explicit, even if people like me are the only ones who look for them. Take off your 'art is irreducible' hat for now. Fast but shallow studies may lead to Pyrrhic victories: as more of them emerge, the shortcomings become more apparent. Once the buzz has quietened down, we realise that not much forward movement has actually occurred; only suspicion and, worse, indifference have grown.

Everyone likes the headlines of quantitative research, even if the process is loathed. Data gives you rock-solid facts to get started with. If we consider wealth as a flow, how far can you, or should you, follow its impact? X% of visitors went to a pub or restaurant and spent an average of £Y, making £Z a year and keeping Q people employed?
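To make that chain of assumptions concrete, here is a minimal sketch of the arithmetic, using entirely hypothetical figures of my own invention rather than numbers from the Leicester study or any other:

```python
# Purely illustrative arithmetic for the 'flow of spend' claim sketched above.
# Every figure here is a hypothetical placeholder, not data from any real study.

annual_visitors = 50_000        # hypothetical venue audience per year
share_visiting_pub = 0.40       # the "X%" who also visit a pub or restaurant
average_extra_spend = 12.50     # the "£Y" average additional spend per head
spend_per_job = 30_000.00       # assumed local turnover needed to sustain one job

# The "£Z": additional local spend attributed to visitors each year
extra_spend_per_year = annual_visitors * share_visiting_pub * average_extra_spend

# The "Q": jobs notionally supported by that spend
jobs_supported = extra_spend_per_year / spend_per_job

print(f"Additional local spend: £{extra_spend_per_year:,.0f} a year")
print(f"Jobs notionally supported: {jobs_supported:.1f}")
```

Even a back-of-the-envelope version like this shows how heavily the headline figure leans on the assumed percentages and ratios, which is exactly why those estimates need to be stated openly.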

Creation and dissemination of the final report present their own problems. The terms wealth, spend, value and impact all sound wonderful, but they do not mean the same thing! Try to give scale to huge figures. Look out for weasel words, generalisations and unnecessary padding. In the final stages it can be tempting to show something more elaborate and fantastic than a couple of pages of tables and figures. The pressure may be to keep digging, obfuscating your methods, until you come away with headlines. Why not? Who's going to check? But this risks puncturing the impartiality that is central to effective research.

Am I arguing against passion for research? No, it's great that there is a desire to look deeper. Academically, we see a new frontier out there, and we hold our share of responsibility for moving past instrumentalism and into fully fledged cultural economics. See 'Measuring the value of culture: a report to the DCMS' by Dr Dave O'Brien for a recent review of the struggle to evaluate culture, which references similar issues in other policy areas such as the environment and health. On a more practical note, 'Making sense of statistics' by the charity Sense about Science is a great primer, particularly for understanding that the 'what' of statistics can only work in combination with the 'how' and 'why'.

This research lark shouldn't only be for researchers and policy-makers. I have to thank the clients and practitioners who, at the least, give time to our interventions and, at best, provide inspiration and a genuine desire to critically examine their work. I hope those who continue to do so are satisfied not only with the modest direct benefits to their organisation, but can also see how each study represents an opportunity to take the whole field a small step forward, or back.