- Where possible, prioritise the learning goals of evaluation (including learning what to do differently) over its monitoring goals, and understanding over judgement.
- Incorporate previous learning into activity and evaluation.
- Seek understanding before, during and after the activity (including baselines or comparisons where possible).
- Be prepared to challenge the purpose of projects/activities, where relevant.
- Use iterative approaches where possible, to make use of learnings within a project.
In practice
Organisations:
- Start evaluation early enough to capture baselines – or align with existing data.
Evaluators:
- Explore why and how things happen, as well as what (which may require more qualitative approaches).
- Share results during a project, even where they are more provisional (e.g. due to smaller samples, or only some activity having taken place).
- Discuss with the team the desired learning outcomes from the evaluation (what do they want to discover?).
Funders:
- Allow for longer-term follow-up where possible.
- Don’t be exploitative (e.g. regarding participants’ time and money).
- Ensure the evaluation provides benefits as defined by all major stakeholders (especially participants).
- Deliver the evaluation with integrity, aligned with your own principles.
- Meet relevant ethical and professional standards (e.g. university ethics guidelines, Market Research Society guidelines, standards set by the programme being evaluated).
In practice
Organisations:
- Consider whether it would be appropriate to pay participants for their time.
Evaluators:
- Ensure that the input of participants can lead to substantive change and/or learnings.
- Avoid asking questions whose responses can’t be used (including by testing surveys and other activities in advance).
- Ask different groups what they would like to get out of the evaluation (including beneficiaries).
- Share findings of evaluation with those who have taken part, where appropriate.
- Check in advance which standards are relevant and share them with the evaluation team.
- Be clear about whether/how the evaluation will make a difference (and therefore whether it is worth undertaking).
- Where possible, include actionable, realistic and relevant recommendations.
- Make the evaluation useful for (and transmissible to) the full range of stakeholders (not least participants/beneficiaries, but also other sectors and stakeholders).
- Where possible, ensure that the evaluation does inform future decisions/plans (including considering how findings will be communicated and adopted).
In practice
Organisations:
- Frame initial conversations about evaluation in terms of the difference it will make.
Evaluators:
- Think through the ‘route to impact’ of the evaluation and whether there are ways to make it work more smoothly.
- Discuss recommendations with those who would need to act on them, to ensure that they are feasible and suited to the plans and processes they need to feed into.
- Tailor communication of results to different audiences, including the format, length and content.
Funders:
- Look for opportunities to share findings beyond the obvious, to connect with other sectors.
- Ground the evaluation in empirical research (not just opinion or anecdote).
- Support assertions with evidence adequate to the conclusions drawn (e.g. regarding methods, quality of delivery, samples, uncertainty, etc.).
- Use suitable methods, including a range where possible (ideally a mixture of qualitative and quantitative research), to enrich understanding.
- Apply methods appropriately and according to established standards.
- Clearly articulate methods, so they are replicable.
In practice
Organisations:
- Don’t exceed what the data can validly say when using the evaluation.
Evaluators:
- Keep a file of key sources/evidence on a variety of topics for evaluation.
- Be clear about what the evidence available does – and does not – allow you to say.
- Differentiate between description of evidence and interpretation.
- Consider alternative interpretations of the data, as well as what data might be missing.
- Include enough detail in descriptions of methods to allow replication, e.g. how respondents were selected, how responses were coded/filtered, and what analysis was applied.
- Be willing to incorporate unintended outcomes and new viewpoints.
- Be receptive to innovation, including creative methods.
- Be prepared to learn from others and to critically engage with both new learnings and prior beliefs and assumptions.
- Include failure as well as success.
- Don’t be partisan or set out to advocate for the arts, heritage and creative sectors.
- Understand and interrogate your own position and always strive to act with independence.
In practice
Organisations:
- Try to set evaluation briefs based on what needs evaluating rather than specifying methods, to allow for innovation.
Evaluators:
- Reflect on your expectations before you have data, consider whether they reflect bias, and ensure you are collecting (and will listen to) evidence that could contradict any prior beliefs.
- Continually refresh your knowledge of evaluation techniques, looking for examples of innovative practice and where they are most suitable.
Funders:
- Make clear that you value a diversity of approaches.
- Review evaluations looking for (and valuing) evidence of learning from failure.
- Take practical and pragmatic approaches, with scope, scale and cost tailored to circumstances.
- Ensure that the evaluation is sufficient, but not excessive, to achieve its purpose (and only done when useful).
- Ensure the quality of delivery is high enough that the conclusions drawn are valid (with openness and realism about limitations).
In practice
Evaluators:
- Focus on areas of greatest potential learning.
- Consider in advance what range of responses is possible with the method used, how interpretations could differ, and how valid those interpretations would be.
Funders:
- Request evaluations only where they will be useful (and consider whether what needs evaluating is [a sample of] the funding programme, or the specific grant).
- Consider providing examples of the type or level of evaluation expected for different funding amounts.
- Take into account what matters to people, especially participants/beneficiaries.
- Where possible, use a range of methods and creative approaches, tailored to people, place and activity.
- Respect and enhance the autonomy, dignity and agency of participants/beneficiaries.
- Where possible, ensure that evaluation doesn’t diminish or disrupt participants’ experience.
In practice
Evaluators:
- Ask participants what is important to them about the activity (not just what it is intended to do).
- Treat responses as true to a viewpoint, even where you disagree with what is being said.
- Consider who is and isn’t being made available for involvement in evaluation, and whose feedback is and isn’t received.
- Proactively attempt to reach those not usually available (e.g. review the scope set: is it inclusive of all activity?).
- Provide alternative ways to engage, so people can choose what they prefer.
- Give special consideration to responses which are outliers (rather than excluding them as ‘exceptions’): consider if they are clues to perspectives otherwise being missed.
- Consider the potential impact of the evaluation process on participants’ experience.
- Seek out, listen to and include different voices and perspectives.
- Allow many perspectives to shape the scope of the evaluation, including what is treated as valuable.
- Include multiple perspectives in reporting and dissemination.
- Avoid a falsely unifying authorial stance.
In practice
Organisations:
- Consult during the formation of a project about what value it could have for participants, and use that to inform its purpose (and hence the evaluation).
Evaluators:
- Ensure that reporting retains multiple voices (e.g. by using people’s own words, avoiding ‘tidying up’ expression or ignoring non-consensus opinions, and making source data available where possible).
- Avoid implying that viewpoints/opinions were unanimous if only expressed by some people.
- Where possible, co-create with representatives of beneficiary groups to determine the evaluation’s purpose, method and interpretation.
- Ensure that evaluation seeks out and engages with a diverse range of perspectives, experiences and opinions – especially from groups which have traditionally been marginalised.
- Enable beneficiaries’ self-defined interests to inform development of future cultural activity.
In practice
Evaluators:
- Consider using approaches where participants can respond to evaluation findings/interpretation.
- Make sure you know the profile of who the activity is intended to reach, and check that marginalised groups aren’t under-represented in activity or evaluation responses (and that their contributions aren’t restricted to their membership of that group).
- Seek out direct responses in preference to people ‘representing’ others’ views.
Funders:
- Use feedback from participants to shape the priorities of funding programmes.
- Be open with method, data and interpretation (in ways suitable for different groups).
- Acknowledge limitations in the evaluation process and results.
- Share findings publicly, where possible.
In practice
Organisations:
- Share versions of evaluation findings in different, accessible formats.
- Publish findings/evaluation reports online.
Evaluators:
- Include permissions for sharing of (non-personal) data at the point of data collection.
- Publish data on open data platforms.
Funders:
- Include sharing information as a condition of funding.
- Have sufficient understanding of context, activity, stakeholders and impacts (including previous research and evaluation).
- Use comparisons, benchmarks and standard measures for context where possible and relevant.
- Be conscious of your own role, position and agency, and how they might affect the evaluation.
In practice
Organisations:
- Ask evaluators about existing research.
Evaluators:
- Carry out background research/a literature review, to identify what new knowledge you should focus on.
- Consider key issues or news that are affecting particular participant groups and how they may affect their experience of the activity/evaluation.
Funders:
- Request use of comparable tools and benchmarks where relevant.
- Communicate findings to those involved in the project/activity, with the opportunity for feedback.
- Make the evaluation available to others who could benefit from the learnings.
- Consider the benefits of conducting joint evaluation activity.
- Participate in peer learning around evaluation.
In practice
Organisations:
- Share findings with relevant sector organisations, in a simple summary format (with access to more detail where desired).
Evaluators:
- Incorporate a participant feedback stage where possible (this may also help ensure the evaluation fairly reflects the views and voice of participants).