Articles

Culture-led funding or funding-led culture?

There is a crucial disconnect between how the sector evaluates the impact of culture and how that evidence is used to influence policy. It’s time to reset the wheel, say Ben Walmsley and Emma McDowell.

Ben Walmsley and Emma McDowell
6 min read

In early 2020 we ran a sector survey to better understand how and where the Centre for Cultural Value could add the most value to the cultural sector. We gathered 311 responses from cultural practitioners and researchers from across the UK and beyond, and these insights complemented the face-to-face events that we ran earlier the same year. Several key findings emerged across the centre’s three core activities: research, evaluation, and policy engagement. In this article, we summarise these findings and reflect on them in light of our recent ‘What value culture?’ festival.

Hunger for research

One of the centre’s core objectives is to make existing research more relevant and accessible, so we were delighted to see a real hunger across the cultural sector to engage with research findings. Over the next four years we hope to work with as many cultural practitioners, funders, policymakers and researchers as possible. We have a growing library of research digests, podcasts, webinars, essential reads, how-to guides and policy reviews. However, we know we need to do more: as well as summarising existing research, we must foster long-term, strategic partnerships between our key stakeholder groups and combine, or at least coordinate, our research efforts. The centre can offer valuable support here, from training academics and cultural practitioners through to supporting the co-commissioning of research. We will do this via our new Collaborate fund launching in the spring and a free online learning course planned for 2023.

We hear loud and clear the call from the sector for a more profound shift in our shared understanding of evaluation practice – a shift that focuses on process over outcome and places creative, innovative methodologies, grounded in inclusive, diverse and representative practice, at its centre. This perhaps reflects the question that underscores all attempts to capture culture’s value: who decides what is valuable? In other words, who holds the power and who doesn’t? Whose cultural values matter most and why?

Defining cultural value

Our survey highlighted a huge variety of definitions and conceptions of evaluation. Many respondents saw it as a process of assessing success and making judgements about the efficacy of different projects and practices. The idea that evaluation involves measuring and proving impact was equally common, as was comparing outputs from projects and activities to their original objectives. However, many respondents also emphasised how evaluation can offer a deeper understanding and enhance future practice. As current research by two of our associate directors, Leila Jancovich and David Stevenson, reveals, definitions of success and failure depend greatly on who is doing the evaluation. This was echoed by cultural consultant Harpreet Kaur at one of our festival events. Harpreet argued that evaluation done well enables learning and growth, and urged participants to be less afraid of failure and to admit when things don't go to plan.

Cultural evaluation is beset with ingrained challenges and tensions. Funders and policymakers often bemoan the sector’s inability to move beyond advocacy and produce rigorous evaluation, while practitioners vent their frustration at funders’ inability to standardise evaluation protocols and engage with research findings in a meaningful way. Our survey certainly highlighted significant skills gaps – especially in relation to analysing and interpreting quantitative and qualitative data. We hosted two evaluation events during our festival that suggested a shared vision for all parties based on proportionality and reflective learning. Despite their preference for traditional quantitative and even econometric data, it seems that policymakers too are keen to engage with a range of methods such as qualitative, creative, process-focussed, and even ‘non-rational’ research and evaluation.

These methods must be at the heart of cultural evaluation principles if we are to stop the damage caused by focussing on the least important impacts of cultural activity. As one survey respondent put it: “We feel that funders are often more concerned with the quantitative evidence which makes for ‘easy’ reading for their stakeholders [and] that we are more interested in a deep interrogation of the approaches used in any given project, rather than just looking at measurement-centred notions of social and economic impact.” Only a fifth of respondents to our survey felt their priorities for evaluation practice matched the outcomes their funders expected. It is clear that evaluation offers a rich process of learning that isn’t taking place currently. We don’t have to reinvent the wheel – but we do need to reset it. 

Engaging policymakers

While 73% of survey respondents felt explaining activity and impact to funders was a priority in evaluation, only 19% said influencing policy was important, even when considering the perceived priorities of funders. This discrepancy highlights a crucial disconnect between our understanding of evaluation as the practical process of explaining activity and impact to stakeholders on the one hand and influencing policy on the other. It also raises the question of what – and who – evaluation is for. There may be many reasons for practitioners’ disillusionment with policy. For some, the frustration lies with not knowing what funders and policymakers actually do with all the data that is painstakingly produced by the sector.

We tried to demystify policymaking a little in our festival. Panellists such as Harman Sagger from DCMS clarified that politicians and ministers tend to like stories of impact more than analysts and civil servants do, providing a useful reminder that policy engagement is nuanced and relational. Another key learning was that researchers need the skills to ‘translate’ and ‘mediate’ their findings into policy if they are to have an impact.

However, our survey respondents and festival participants made a crucial point: this is not just a problem of training or a lack of appropriate skills. Many socially and community-engaged practitioners are already demonstrating their value on a daily basis, and they want greater recognition of cultural value at a grassroots level. As one participant put it: “We seem to have a problem with funding-led culture, rather than culture-led funding.”

Ben Walmsley is Director of the Centre for Cultural Value and Professor of Cultural Engagement at the University of Leeds. Emma McDowell is an audience researcher and arts marketer studying for a PhD in arts marketing and audience engagement at the University of Leeds.
@BenWalmsley | @emmamcdoofus

Download the full summary report of the Centre for Cultural Value's sector survey.

This article, sponsored and contributed by the Centre for Cultural Value, is part of a series supporting an evidence-based approach to examining the impacts of arts, culture and heritage on people and society.