Posts filed under ‘Evaluation use’
Video: Seven new ways to present evaluation findings
Further to my earlier post on my presentation at the recent EES conference, "Seven new ways to present evaluation findings", a video was made of the presentation, which you can view below.
Thanks to Denis Bours of SEA Change Community of Practice for filming me!
Seven new ways to present evaluation findings
As regular readers will know, I am very interested in how findings of evaluations are presented and used, as I’ve written about before. I’ve put together a brief presentation on this subject (see below) entitled “Seven new ways to present evaluation findings” that I’m presenting today at the European Evaluation Society Conference in Helsinki, Finland. Comments and other ideas welcome!
Using video to communicate evaluation results
I’ve written a brief post on the Climate-Eval blog about using video to communicate evaluation reports. Read the post here.
Understanding and use of evaluation – new report
Here is an interesting paper from ALNAP looking at how the understanding and use of evaluation in humanitarian action can be improved:
Harnessing the Power of Evaluation in Humanitarian Action: An initiative to improve understanding and use of evaluation (pdf)
The paper sets out a framework for improving the understanding and use of evaluation in four key areas:
Capacity Area 1: Leadership, culture and structure
• Ensure leadership is supportive of evaluation and monitoring
• Promote an evaluation culture
• Increase the internal demand for evaluation information
• Create organisational structures that promote evaluation
Capacity Area 2: Evaluation purpose and policy
• Clarify the purpose of evaluation (accountability, audit, learning)
• Clearly articulate evaluation policy
• Ensure evaluation processes are timely and form an integral part of the decision-making cycle
• Emphasise quality not quantity
Capacity Area 3: Evaluation processes and systems
• Develop a strategic approach to selecting what should be evaluated
• Involve key stakeholders throughout the process
• Use both internal and external personnel to encourage a culture of evaluation
• Improve the technical quality of the evaluation process
• Assign high priority to effective dissemination of findings, including through new media (video, web)
• Ensure there is a management response to evaluations
• Carry out periodic meta-evaluations and evaluation syntheses, and review recommendations
Capacity Area 4: Supporting processes and mechanisms
• Improve monitoring throughout the programme cycle
• Provide the necessary human resources and incentive structures
• Secure adequate financial resources
• Understand and take advantage of the external environment:
– Use peer networks to encourage change
– Engage with media demands for information
– Engage with donors on their evaluation needs
Why are evaluation results not used…?
I’ve written previously on the issue of how to make sure that evaluation results are used (or at least considered…). Here is a new publication, Making Evaluations Matter: A Practical Guide for Evaluators (pdf), from the Centre for Development Innovation that explores this issue in depth.
They state four general reasons why evaluation results are often not used. Evaluations:
- Fail to focus on intended use by intended users and are not designed to fit the context and situation
- Do not focus on the most important issues – resulting in low relevance
- Are poorly understood by stakeholders
- Fail to keep stakeholders informed and involved during the process and when design alterations are necessary.
I think the first and last reasons are particularly pertinent. We often don’t have enough insights into how evaluation results will be used – and we also fail to inform and involve stakeholders during the actual evaluation.
Presenting evaluation results in photostories
I am always interested in new ways to present evaluation results.
Here is a very engaging and accessible format to present evaluation results – photostories.
This photostory (pdf) tells the story of an evaluation of a reconciliation programme in Kenya.
Summarizing evaluation reports
As I’ve written about previously, evaluation reports are notoriously under-read and underutilized. Aside from the executive summary, evaluators need to find ways of presenting their key findings in a summarized format that makes them attractive to their audiences.
Aside from the predictable Powerpoint summary (which can still serve a purpose), some of the techniques I have used – and that were well received by audiences – are as follows:
Multimedia video: using interviews, graphs and quotes in a video to bring the evaluation results “to life” (see this post for an example)
Scorecard or “snapshot”: highlighting the key findings graphically in one page. See this example:
Summary sheet: summarizing the main findings, conclusions and recommendations in a fact sheet of 2-4 pages. See this example: Summary Sheet (pdf)
Findings table: summarizing the main findings, particularly useful where the evaluation is responding to pre-set objectives and indicators, as per this example:
I’m always interested to learn of new methods to summarize evaluation findings, so if you have any more ideas, please share them!
Presenting evaluation results in multimedia video
As I’ve written about before, the way in which we present evaluation findings – usually in a long, indigestible report – certainly has its limitations. For some time I’ve been thinking that, with the developments in multimedia, there must be better ways than the written document to communicate evaluation findings – and here it is! We’ve just completed a multimedia video report on the evaluation of the LIFT France conference:
This is certainly the way forward. Thanks to Patricia (concept & interviews), Thierry (filming & production), Benchpoint (survey) and Yona (graphics).
What to avoid when writing evaluation reports
I’ve written previously about what is recommended in putting together a *good* evaluation report.
I came across an interesting fact sheet from the Bruner Foundation on “Using evaluation findings (pdf)”. On page three the authors list eight points to avoid in writing evaluation reports, summarised as follows:
1. Avoid including response rates and problems with your methodology as part of your findings.
2. Avoid reporting both numbers and percentages unless one is needed to make the other clear.
3. Avoid listing, in a sentence or a table, all of the response choices for every question on a survey or record review protocol.
4. Avoid reporting your results with excessive precision.
5. Avoid feeling compelled to keep your results in the same order as they appeared on the survey or the interview protocol.
6. Avoid compartmentalizing your results.
7. Avoid feeling compelled to use all of the information you collected.
8. Avoid including any action steps or conclusions that are not clearly developed from your findings.
Getting the final evaluation report right / write

For many evaluation projects, an important “deliverable” is the final evaluation report, which contains the findings, conclusions and recommendations of the evaluation. Having been through many evaluations, as part of a team or as an individual, I am surprised at how often this important step gets neglected or simply messed up. Following are some recommendations on putting together a final evaluation report:
- Link the findings to the original evaluation questions: Not my own idea, but something I’ve seen others do well – structure the findings of the evaluation around the original questions from the brief that defined the evaluation. In this way, people reading the report can make the connection between the questions asked and what was found out.
- Summarise the key findings in one diagram or table: Aside from reading the executive summary, people often appreciate grasping the key results in one view. Without oversimplifying the findings, I find it useful to summarise the key findings visually. You can see an example of this idea (called a “snapshot”) on page five of this evaluation report (pdf).
- Separate the recommendations from the findings: Often you see recommendations spread throughout the main body of the report. I find this confusing and believe it is easier to review recommendations when they come after the findings (while still making clear reference to the findings).
- Make the executive summary a summary: An executive summary should be just that – a summary. I’m surprised at how many reports actually include new information in their executive summaries that is not found elsewhere in the report. I recommend summarising the main findings and touching on the recommendations if space allows.
- Include all the details for the really interested and pedantic: There will be a small number of your readers who will love to look further into the details – read the thousands of responses to the open questions, study the way the sample was selected, etc. For these readers, I recommend including the details of the evaluation as annexes. These details – the survey questions, interview guidelines, description of methodology, further analysis of demographics, existing research consulted, etc. – will only strengthen your report and answer some questions for a select group of readers.
Related to this topic, I’ve also written previously about how to ensure that your results are used and how to present monitoring and evaluation results.
And if you want to read further, here are some very comprehensive guidelines from the World Bank on Presenting Results (pdf).
Glenn

