Articles

Creating space for collective learning

Evaluation reports in the cultural sector can be packed full of learning. Emma McDowell explores how we might unearth this existing knowledge.

Emma McDowell
5 min read

When it comes to evaluation, it’s easy to be drawn to the next new thing. Or to feel like what you’re already doing is not good enough. But the arts, culture and heritage sector is brimming with examples of good practice. So why do we hear so little about them?

In a 2020 study conducted by the Centre for Cultural Value (the Centre), we found that only 18% of cultural organisations routinely share their evaluations beyond the inner circle of their immediate funders and stakeholders. Often, the learning captured in these reports collects dust on shelves as staff members move on and funding conditions and policies shift. 

Some of this learning may only be relevant to the immediate contexts in which the evaluations were produced. Nevertheless, given that funders and stakeholders continually ask for evaluations, it’s hard not to regret the wealth of untapped potential contained within them. Surely, we could all learn a lot about both cultural practice and evaluation approaches from this hard work and investment. 

So many reports, so little time

When communicating evaluation findings, most organisations produce a written report. This format can lead to learning being buried deep in lengthy narratives or obscured behind paywalls. Many reports are also kept confidential out of an understandable fear of exposing inner workings or perceived failures.

Even when reports are more widely accessible, few people have the time or resource to trawl through page after page of analysis to locate learning that is specifically relevant to them. 

Of course, one solution is to explore different formats and techniques for understanding and communicating findings. For example, the Centre has produced guidance for developing learning case studies and has shared real-life examples. These resources include the Creative People and Places project Back to Ours examining its experience of learning through dialogue, and Birds of Paradise Theatre Company reflecting on learning from mistakes. Other creative methods, such as those explored in this Evaluation Principles workshop, can be used successfully to deepen and share understanding.

That said, there’s a reason why reports remain the evaluation output of choice. Their flexible structure gives space to fully explore both the detail and context of projects. They can also be enriched with additional media to document impacts and are often considered resource-efficient.

For all these reasons, more traditional reports are here to stay. So, how can we make them more valuable to the wider sector? 

Learning from the large scale

Unearthing the insights in existing evaluations is a central driver of our latest project: the Evaluation Learning Space. The first resources in this new online hub focus on what we can learn from the evaluations of UK-based Cities and Capitals of Culture from 1990 to 2021. 

Of course, the sheer size and complexity of these city-wide programmes arguably make them somewhat unique case studies.

Yet, when we began to develop the resources, it became increasingly clear that there was rich, widely applicable learning. As Jonothan Neelands from Warwick Business School and the Coventry 2021 Monitoring and Evaluation team says:

“[Cities of Culture are] a massive, unprecedented investment in culture in place. It’s absolutely fundamental that the rest of the UK gains from that … From what you discover about levers of change and what you discover about what works and doesn't work.” 

We have reviewed and analysed a plethora of reports and have spoken to some of the people behind them. Through this process we’ve surfaced learning about the make-up of evaluation teams and governance structures. We’ve also provided insight into the methods and frameworks used to conduct evaluation activity and the challenges encountered along the way.

Learning from our peers

Access to the work and learning of others in the sector can only deepen our understanding of our own practice. It certainly can be a more productive use of time than starting from scratch. But it’s more than that. 

If we are to contribute to a wider ecosystem of learning at all levels, we need to pay more attention to what has already been done and learn more from what has already been written.

Productive evaluations result in our failures becoming crucial data to inform future actions. We can then design frameworks that are meaningful to our staff. We can articulate our impact using language that resonates with the artists and communities with whom we work. We can stop pretending we have to generate all the answers ourselves. 

As the Centre’s Evaluation Principles remind us, purposeful evaluation is about much more than simply serving the needs of funders and stakeholders. And, as evaluator Dawn Cameron argues in a recent article, it’s a process, not an endpoint.

The Evaluation Learning Space can now support this collective endeavour to learn from one another. We plan to continue sharing resources that distil learning in ways that are relevant, accessible and representative of the diverse sector in which we work. We’d love to hear what you think.

Emma McDowell is a Postdoctoral Researcher at the Centre for Cultural Value.
culturalvalue.org.uk
@valuingculture | @emmamcdoofus

Funded by the Esmée Fairbairn Foundation, the Evaluation Learning Space is led by the Centre for Cultural Value, in partnership with CultureHive, the Arts Marketing Association’s knowledge hub.

This article, sponsored and contributed by the Centre for Cultural Value, is part of a series supporting an evidence-based approach to examining the impacts of arts, culture and heritage on people and society.