The data illusion
Funders are increasing their demands for data from arts organisations while only paying lip service to quality in their own statistics. It’s time to stop indulging them, says Jonathan Knott.
A recent report from Nesta argued there is a need to develop new ways of measuring the value of culture beyond narrow economic metrics. The authors say that “within government there are still far more developed means for measuring investment in physical things – railways, roads, airports – than in intangibles like reducing social isolation, improving health or the arts”.
To illustrate its point, the report highlights research where people were asked how much they would pay to save a local museum from closure, or a cathedral or historic city from damage by climate change. By averaging these answers, the researchers reached a numerical figure supposedly indicating how much people valued these assets. This kind of approach, it says, should form part of a commitment from Government to invest in “establishing the value related to cultural and heritage activity”.
The authors’ desire that Government should consider more than market value when deciding what to invest in is fair enough. Hard numbers do carry power for those making the case for arts funding. But the paper is part of a growing trend towards quantification in the arts, and if the sector indulges the expectation that it must communicate its value in metrics and numbers, it could end up paying a heavy price.
Misleading
Funders such as Arts Council England (ACE) have been at the forefront of this trend: last year ACE introduced its Impact and Insight Toolkit – previously known as Quality Metrics – requiring many National Portfolio Organisations (NPOs) to submit quarterly data on how people rate the artistic quality of their work.
This approach has faced significant criticism from academics and practitioners alike, not least because of concerns over its flaky methodology and the proposed aggregation of the data being collected.
ACE’s calls to supply data would carry more weight if the funder’s own use of statistics were not so questionable. In its latest Annual Report and Accounts, ACE reported a huge drop in visitor numbers at NPOs. It now emerges that the reported trend was largely down to poor methodology, leading ACE apparently to overstate a UK-wide drop in exhibition visitors by around 350%. Yet despite the UK Statistics Authority backing up the concerns AP raised about the figures, ACE still refuses to engage in a good-faith dialogue about them.
The Arts Council of Northern Ireland also made a major blunder recently. A report that informed its new five-year strategy appeared to show that the total income of its portfolio organisations had risen by 16% over three years, despite core funding cuts of more than 40%. The figures didn’t make sense, however, and when AP asked the funder, it admitted that “the total income figures do not add up correctly. This is due to an error in the Excel formula used to calculate the total Income cells”. The corrected figures showed a fall in total income. The good news is that the funder at least fessed up to the mistake.
These were major oversights, included in important publications. And the fact that both mistakes came to light within the past year suggests that this kind of issue is far from rare. If we (generously) discount the possibility that the funders are being intentionally misleading, then the worrying conclusion must be that the people who sign off on these reports either don’t read, or don’t understand, their own data.
Performative
So why are funders so keen to request data from the sector, while apparently being so careless with their own? The answer may be in the symbolic role of data. The academic Eleonora Belfiore believes that “the taking part in the auditing process itself becomes a performative act: it is the very fact of gathering data and publishing, more than the concern for what the data tell you, or the rigour (or lack thereof) of their collection that becomes paramount”.
Constantly demanding data, while changing formats, metrics, methodology and requirements every few years, creates the illusion of order and control while actually making meaningful insight harder to achieve. The situation suits funders: it reinforces their power and makes it more difficult to hold their own performance to account. It also provides useful work for consultants and researchers. For arts organisations themselves, however, the advantages are far less obvious.