This month I gave a presentation on “evaluation findings – types and influences” at the Swiss national health promotion conference. Some of the key points I raised were:
- Use may not be as instrumental and direct as expected
- Stakeholder involvement is critical to use
- Organisations can influence evaluation use
- Use can be unpredictable, opportunistic & unexpected
Curious? View my slides below!
A new e-learning course is available from TRAASS International: Cutting-Edge M&E: A Guide for Practitioners. The course is taught by Colin Jacobs, a senior trainer with more than 25 years’ experience in international development. Colin’s recent roles include President of the UK Evaluation Society and Head of Governance and Civil Society at the British Council.
This online course lays the groundwork for Monitoring and Evaluation (M&E) to make vital contributions to incentivising change and measuring performance. The course considers challenges in current M&E practice, introduces a toolbox of evaluation techniques and shows where these can best be applied. Ways of promoting early participation and the engagement of key stakeholders are explored, and a step-by-step action plan to improve M&E practice is provided. Further information>>
Full disclosure: I also present an e-learning course for TRAASS International, Effective and Creative Evaluation Report Writing.
Here is an interesting new publication, “Advocating for Evaluation: A toolkit to develop advocacy strategies to strengthen an enabling environment for evaluation” (pdf), from EvalPartners. The focus of the toolkit is on how to advocate for a supportive environment for evaluation.
A very interesting event is scheduled for February 20–21, 2017 in London: MERL Tech, on the future of technology for monitoring, evaluation, research and learning. Learn more about the event>>
A very interesting report is just out from Innovation Network on evaluation capacity and practice in the US nonprofit sector (pdf).
Here are some excerpts on resources and evaluation:
- 99% of organisations have someone responsible for evaluation
- 84% of organisations spend less than 5% of their budget on evaluation
- 16% spend nothing at all on evaluation (!)
There are also further interesting findings on evaluation use and on the barriers to and supporting factors for evaluation – view the report here (pdf)>>
ALNAP has recently released their Evaluation of Humanitarian Action Guide.
The guide was six years in the making and contains detailed advice and tips on evaluating humanitarian action. Even if your focus is not on evaluating humanitarian activities, Chapter 17 on Communicating and Reporting Findings and Results is well worth a read.
The UK Government Communication Service has produced a framework for evaluating communications (pdf).
The framework provides an overview of an integrated approach to evaluating communication activities and sets out eight golden rules for communication evaluation:
1. Set SMART objectives well before the start of your activity
2. Think carefully about who your target audience is when selecting relevant metrics from each of the five disciplines*
3. Ensure you adopt an integrated channel approach when evaluating your communications activity
4. Collect baselines and benchmarks where possible
5. Include a mix of qualitative and quantitative evidence
6. Regularly review performance
7. Act on any insight to drive continuous improvement and inform future planning
8. Make the link between your activity and its impact on your organisational goals or KPIs
*Media, digital, marketing, stakeholder engagement, internal communications
Are there any more to add? I would add the need to integrate evaluation into the daily work of communication professionals – so that it is thought about before activities start and while they are under way.
View the complete guide here (pdf)>>