Photo: giladlotan / Visual hunt / CC BY-NC
Does more data mean more progress?
ArtsProfessional’s recent revelations about Culture Counts and Quality Metrics prompt Richard Fletcher to question how we manage information in the arts. Is it time to speak up about what we want and need?
ArtsProfessional's recent exclusive on Quality Metrics and Culture Counts is a useful example of how troubling issues of transparency in the sector are linked to wider questions of information management. I was also pleased to see it picked up in the latest issue of Private Eye.
As a researcher, I am always optimistic about what big data and more standardised approaches to applied research can do for the sector. It is challenging enough to get organisations talking the same language of demographics and segmentation through something like Audience Finder. While some may be understandably sceptical, I even think some kind of shared definition of ‘quality’ through the quality metrics can be valid and of some use in decision making. More broadly, plenty of other platforms and standards are emerging besides Audience Finder and Culture Counts: any one of a number of ‘do-it-all’ event management and box office systems, and, audiences aside, Julie’s Bicycle for environmental data or financial benchmarking by MyCake.
The real novelty of these new platforms comes from the range and number of users, the agreed standards and methods
Does more data necessarily mean more progress, and is information management only a technological problem? This is where my scepticism kicks into gear. The fears and questions we hear repeated about all of these platforms are familiar: how much will it cost, both now and in the future? Will we be obliged to participate? Do we have to do it every year? We already have our own systems set up and our own set of priorities. Is there any chance of influencing the direction in future? Can we tweak some of the questions, or include our own approaches? While some of these are more easily dispelled than others, behind all of them lies a nervousness about handing over data or signing up to a system in which the end user has very little ownership or influence. Nor is it the same as handing over the details of your personal life to Facebook: whole organisations, future budgets and grants will be affected by your decision.
ArtsProfessional rightly highlights the risk to Arts Council England (ACE), and of course the wider sector, from investing cash, time and indeed data itself in a system in which it seems to have little to no hard stake in the future. And if ACE itself has limited influence over the system, do you think your organisation will? Why should it take a Freedom of Information (FOI) request to even get this far? (I could also recommend AP readers make an FOI request if they are interested in the bid documents related to the 2013 UK City of Culture shortlist.)
People used to scoff a little at technocratic phrases like ‘data is the new oil’, but we are surely starting to see it play out. £2,000 a year for about 700 National Portfolio Organisations (NPOs) adds up to quite a bit: roughly £1.4m a year across the sector. Virtually all of these new platforms are based on a software-as-a-service (SaaS) business model, with cheaper (often free) upfront costs but enhanced subscription elements running year after year. SaaS can still have open-source roots, though: look at WordPress, which you can pay to have hosted or host yourself, with all the features effectively available to you either way.
Even the use of that word ‘data’ can be a handy smokescreen. Data sounds like a neutral, unrefined commodity that only becomes ‘information’ or ‘knowledge’ once it has been processed. The implicit pitch is: raw data has no value, so give it to us for free so we can process it and sell it back to you. Are we in a situation where one (mainly) publicly funded body regularly charges another (mainly) publicly funded body for access to its own data, holding even its own ‘knowledge’ to ransom?
I won’t deny there are genuinely novel, big-picture analyses going on with these platforms, and they do help speed up analysis. We have to admit, though, that a huge amount of the technical side is large-scale but fundamentally quite simple calculation. The real novelty of these new platforms comes from the range and number of users, and the agreed standards and methods. I used to think that data-sharing and all this exciting big-picture work didn’t happen because we literally did not have the tools, funding or standards to do it easily. Now that these things seem possible, it is all the more sad that they still don’t happen, because the data is mostly siloed away.
Shared standards, in cases where they make sense and are seen to be valuable by all parties, are great, but there can never be a one-size-fits-all approach, and organisations will always want to prioritise some areas over others. This is already hard enough for relatively discrete, concrete topics such as demographics, and surely even more challenging for abstract issues of artistic quality. Even if ‘we’ in the arts had well-worn, sector-wide standards that ‘we’ all agreed on, what about local authority demands, sponsors, businesses and who knows how many other perspectives? And what if we step outside the UK, to wider Europe or the rest of the world? I couldn’t agree more with the quote from a research company: “Any software tool designed for continuous use creates barriers to exit for clients.” I would add that this sentiment is equally concerning if we consider not only the practical effects but the more intellectual ‘barriers to exit’ too: what gets asked, what doesn’t, and who decides what’s important?
AP points out this specific case may even be in breach of state aid rules. Arguably it also misses the point of the Government IT Strategy, which encourages greater uptake of open-source alternatives to commercial solutions. While technology itself is not the driving challenge of good information management, for completeness’ sake I should at least point out that the things these platforms do are not exclusive to commercial providers. At the DIY, open-source end there are the likes of Open-Geodemographics for segmentation and Open Data Kit for data collection. The UK Data Service is an excellent example of resources made available to government, business and citizens alike; in a similar vein, so is the international Open Government Partnership. On the design side, good-looking, dynamic and comprehensible dashboard-style data viewers are increasingly available to organisations of every budget. Once again, it is not the technology that is the critical element: it is the users, their data, and the support and community around these platforms that really give the value.
I would be more reassured if any of the organisations who are increasingly taking ownership of the sector’s data would state their long-term plans for opening up access, if indeed they are planning to at any point. We have to ask ourselves whether this information would have more value to society if it were not only free as in ‘free beer’ but also free as in ‘free speech’. Given the already established, and most likely ongoing, public cost of these platforms, the need is all the greater. If these formulas, calculators, metrics and datasets are all building up to something we ultimately want to call ‘truth’, then transparency, access, understanding and discussion must take place as widely as possible.
Richard Fletcher is Research Assistant in Arts and Festivals Management at De Montfort University.
E [email protected]