
Evaluation principles

We want to build a shared understanding of the differences that arts, culture, heritage and screen make to people’s lives and to society.​

These collaboratively produced evaluation principles bring together shared ideas to inform how evaluation is carried out and used in the cultural sector.


Introduction

We talked to lots of people working in the cultural sector about what they would find useful. A primary request was for ‘evaluation principles’: a sharing of ideas to inform how evaluation is carried out and used within the sector. ​

How did we develop the principles?

As well as gathering input via a series of public events, we’ve been developing the principles with a working group of over 40 representatives from across the sector. The group members cover a range of roles and perspectives on evaluation (including those who do it, who use it, and whose work is evaluated).​

Read more about the global context behind the principles, and the key ideas and current challenges that informed their development.

The work was led by Dr Beatriz Garcia from the University of Liverpool and Oliver Mantell from The Audience Agency.

Download a PDF version of the principles


Your feedback – what do you think?

The principles are the starting point of a conversation and will evolve over the next 12 months (Oct 2021 – Oct 2022). Collective agreement is central to their value and purpose, so we will revise them in response to your feedback.

You can give feedback via email (ccv@leeds.ac.uk) or by taking part in online events over the next year – sign up to our mailing list to be the first to hear about these.

Evaluation should be:

Beneficial

Committed to learning and/or change

This includes

  • Where possible, prioritising the learning goals of evaluation (and learning what to do differently) over its monitoring goals, and understanding over judgement.
  • Incorporating previous learning into activity and evaluation.
  • Building understanding before, during and after the activity (including baselines or comparisons where possible).
  • Being prepared to challenge projects’/activities’ purpose, where relevant.
  • Using iterative approaches where possible, to make use of learnings within a project.

How could this be applied in practice

Organisation​

  • Start evaluation early enough to capture baselines – or align with existing data.​

Evaluator​

  • Explore why and how things happen, as well as what (which may require more qualitative approaches).​
  • Share results during a project, even where they are more provisional (e.g. due to smaller samples, or only some activity having taken place).​
  • Discuss with the team the desired learning outcomes from the evaluation (what do they want to discover?).

Funder​

  • Allow for longer-term follow-up where possible.​

Ethical

This includes

  • Not being exploitative (e.g. regarding participants’ time or money).
  • Ensuring the evaluation provides benefits as defined by all major stakeholders (especially participants).
  • Delivering the evaluation with integrity, aligned with your own principles.
  • Meeting relevant ethical and professional standards (e.g. university ethics guidelines, Market Research Society guidelines, standards set by the programme being evaluated).

How could this be applied in practice

Organisation​

  • Consider whether it would be appropriate to pay participants for their time.​

Evaluator​

  • Ensure that the input of participants can lead to substantive change and/or learnings.​
  • Avoid asking questions that can’t be used (including testing surveys and other activities in advance).
  • Ask different groups what they would like to get out of the evaluation (including beneficiaries).​
  • Share findings of evaluation with those who have taken part, where appropriate.​
  • Check in advance which standards are relevant and share these with the evaluation team.

Applicable

This includes

  • Being clear about whether/how the evaluation will make a difference (and therefore whether it is worth undertaking).​
  • Where possible, including actionable, realistic and relevant recommendations.​
  • Making it useful for (and transmissible to) the full range of stakeholders (not least participants/beneficiaries, but also other sectors and wider stakeholders).
  • Where possible, ensuring that it does inform future decisions/plans (including consideration of how findings will be communicated and adopted).​

How could this be applied in practice

Organisation​

  • Frame initial conversations about evaluation in terms of the difference it will make.​

Evaluator​

  • Think through the ‘route to impact’ of the evaluation and whether there are ways to make it work more smoothly.​
  • Discuss recommendations with those who would need to act on them, to ensure that they are feasible, including the plans and processes they need to feed into.​
  • Tailor communication of results to different audiences, including the format, length and content.​

Funder​

  • Look for opportunities to share findings beyond the obvious, to connect with other sectors.​

Robust

Rigorous

This includes

  • Grounding evaluation in empirical research (not just opinion or anecdote).​
  • Supporting assertions with evidence adequate to the conclusions drawn (e.g. regarding methods, quality of delivery, samples, uncertainty etc.).
  • Using suitable methods, and considering mixed methods (i.e. combinations of qualitative and quantitative research) to enrich understanding.
  • Applying methods appropriately and according to established standards.​
  • Clearly articulating methods, so they are replicable.​

How could this be applied in practice

Organisation​

  • When using the evaluation, don’t claim more than the data can validly support.

Evaluator​

  • Keep a file of key sources/evidence on a variety of topics for evaluation.​
  • Be clear about what the evidence available does – and does not – allow you to say.​
  • Differentiate between description of evidence and interpretation.​
  • Consider alternative interpretations of the data, as well as what data might be missing.​
  • Include enough detail in descriptions of methods to allow replication, e.g. how respondents were selected, responses coded/filtered, analysis applied etc.

Open-minded

This includes

  • Being willing to incorporate unintended outcomes and new viewpoints.​
  • Being receptive to innovation, including creative methods.​
  • Being prepared to learn from others and to critically engage with both new learnings and prior beliefs and assumptions.
  • Including failure as well as success.​
  • Not being partisan or setting out to advocate for the arts, heritage and creative sectors.​
  • Understanding and interrogating your own position, and always striving to act with independence.

How could this be applied in practice

Organisation​

  • Try to set evaluation briefs based on what needs evaluating, rather than specifying methods, to allow for innovation.

Evaluator​

  • Reflect on your expectations before you have data, consider whether they reflect bias, and ensure you are collecting (and will listen to) evidence that could contradict any prior beliefs.
  • Continually refresh your knowledge of evaluation techniques, including examples of innovative practice and where they are most suitable.

Funder​

  • Make clear that you value a diversity of approaches.
  • Review evaluations looking for (and valuing) evidence of learning from failure.

Proportionate

This includes

  • Taking practical and pragmatic approaches, with scope, scale and cost tailored to circumstances.​
  • Ensuring that it is sufficient, but not excessive, to achieve its purpose (and only done when useful).​
  • Having high enough quality of delivery to ensure that conclusions drawn are valid.
  • Being open and realistic about possible limitations.​

How could this be applied in practice

Evaluator

  • Focus on areas of greatest potential learning.
  • Consider in advance what range of responses is possible with the method used, how interpretations could differ, and the potential validity of those interpretations.

Funder

  • Request evaluations only where they will be useful (and consider whether what needs evaluating is [a sample of] the funding programme, or the specific grant).
  • Consider providing examples of the type or level of evaluation expected for different funding amounts.

People-centred

Empathetic

This includes

  • Taking into account what matters to people, especially participants/beneficiaries.​
  • Where possible, using a range of methods and creative approaches, tailored to people, place and activity.​
  • Respecting and enhancing the autonomy, dignity and agency of participants/beneficiaries.​
  • Where possible, ensuring that evaluation doesn’t diminish or disrupt participants’ experience.​

How could this be applied in practice

Evaluator

  • Ask participants what is important to them about the activity (not just what it is intended to do).
  • Treat responses as true to a viewpoint, even where you may disagree with what respondents are saying.
  • Consider who is and isn’t being made available for involvement in evaluation, and whose feedback is and isn’t received.
  • Proactively attempt to reach those not usually available (e.g. review the scope set: is it inclusive of all activity?).
  • Provide alternative ways to engage, so people can choose what they prefer.
  • Give special consideration to responses which are outliers (rather than excluding them as ‘exceptions’): consider if they are clues to perspectives otherwise being missed.
  • Consider the potential impact of the evaluation process on participants’ experience.​

Many-voiced

This includes

  • Seeking out, listening to and including different voices and perspectives.​
  • Allowing many perspectives to shape the scope of the evaluation, including what is treated as valuable.​
  • Including multiple perspectives in reporting and dissemination.​
  • Avoiding a falsely unifying authorial stance.​

How could this be applied in practice

Organisation

  • Consult during the formation of a project about what value it could have for participants, and use that to inform its purpose (and hence its evaluation).

Evaluator

  • Ensure that reporting retains multiple voices (e.g. using people’s own words, avoiding ‘tidying up’ expression or ignoring non-consensus opinions, and making source data available where possible).
  • Avoid implying that viewpoints/opinions were unanimous if only expressed by some people.

Socially-engaged

This includes

  • Where possible, co-creating with representatives of beneficiary groups to determine the evaluation’s purpose, method and interpretation.​
  • Ensuring that evaluation seeks out and engages with a diverse range of perspectives, experiences and opinions – especially from groups which have traditionally been marginalised.​
  • Enabling beneficiaries’ self-defined interests to inform development of future cultural activity.​

How could this be applied in practice

Evaluator

  • Consider using approaches where participants can respond to evaluation findings/ interpretation.​
  • Make sure you know the profile of who the activity is intended to reach, and check that marginalised groups aren’t under-represented in the activity or the evaluation responses (and that their contributions aren’t restricted to their membership of that group).
  • Seek out direct responses in preference to people ‘representing’ others’ views.​

Funder

  • Use feedback from participants to shape the priorities of funding programmes.

Connected

Transparent

This includes

  • Being open with method, data and interpretation (in ways suitable for different groups).​
  • Acknowledging limitations in the evaluation process and results.​
  • Sharing findings publicly, where possible.​

How could this be applied in practice

Organisation

  • Share versions of evaluation findings in different, accessible, formats.​
  • Publish findings/evaluation reports online.

Evaluator

  • Include permissions for sharing of (non-personal) data at the point of data collection.
  • Publish data on open data platforms.

Funder

  • Include the sharing of findings as a condition of funding.

Aware

This includes

  • Having sufficient understanding of context, activity, stakeholders and impacts (including previous research and evaluation).​
  • Using comparisons, benchmarks and standard measures for context where possible and relevant.​
  • Being conscious of own role, position and agency and how it might affect the evaluation.​

How could this be applied in practice

Organisation

  • Ask evaluators about existing research.

Evaluator

  • Carry out background research/a literature review, to identify what new knowledge you should focus on.
  • Consider key issues or news that are affecting particular participant groups and how they may affect their experience of the activity/evaluation.

Funder

  • Request use of comparable tools and benchmarks where relevant.

Shared

This includes

  • Communicating findings to those involved in the project/activity, with the opportunity for feedback.​
  • Making the evaluation available to others who could benefit from the learnings.​
  • Considering the benefits of conducting joint evaluation activity.​
  • Participating in peer learning around evaluation.​

How could this be applied in practice

Organisation

  • Share findings with relevant sector organisations, in a simple summary format (with access to more detail where desired). ​

Evaluator

  • Incorporate a participant feedback stage where possible (this may also help ensure the evaluation fairly reflects the views and voices of participants).

 
