Posts filed under ‘Evaluation reporting’
The voices of affected populations in evaluation
The notion of listening to the voices of affected populations is nothing new in humanitarian evaluation. However, in the past there has been a lot of talk and little action. The Listening Project is one of the first structured, global initiatives to look at this issue – not only from the evaluation perspective but more broadly – and has recently produced a summary study, Time to Listen: Hearing People on the Receiving End of International Aid (pdf), based on discussions with almost 6,000 people in 20 countries. You can also read a news report about this issue on IRIN news.
As part of stakeholder consultations I’ve been involved with for the Joint Standards Initiative, we’ve also been listening to affected populations – from Senegal to Pakistan to Mexico. The video below provides short excerpts of interviews with affected populations, as well as with humanitarian workers, from these consultations.
Blogging the evaluation process
Blogging and other social media are often used as part of communicating evaluation results – that is, once the evaluation is finished. However, blogging can also be useful for communicating the evaluation process – that is, while the evaluation is collecting data. I’ve recently been involved in a stakeholder consultation for the Joint Standards Initiative, where, as part of communicating the progress of the consultation, the other team members and I have been blogging “snapshots from the consultation” – from locations as diverse as Beirut, Juba and Richard Toll (Senegal).
We found this useful for providing stakeholders with an update on our work and for offering some insights into our initial findings.
(The image above is taken from a discussion in Cairo by team member Inji El Abd)
How to use data to improve communications
For those interested in how data can be used to improve communications – and how communications products can make data more accessible – here is an interesting presentation from Timo Lüge of Social Media for Good, with examples from the non-profit sector:
Evaluation the lowest priority for US non-profits
The US-based Innovation Network has published a very interesting study on the State of Evaluation in US non-profit organisations.
The study, based on a survey of some 550 non-profits in the US, produced some interesting findings, including the headline above – admittedly the most pessimistic of the following:
- 90% of organizations report evaluating their work (up from 85% in 2010)
- 100% (!) of organizations reported using and communicating their evaluation findings
- Budgeting for evaluation is still low: more than 70% of organizations spend less than 5% of their organizational budget on evaluation
- On average, evaluation (and its close relation, research) continues to be the lowest priority (compared to fundraising, financial management, communications, etc.)
I find it incredible that 100% report using and communicating their evaluations – if only this were “significant” usage, we would all be happy…
Video: Seven new ways to present evaluation findings
Further to my earlier post about my presentation at the recent EES conference on “seven new ways to present evaluation findings”, the presentation was filmed and you can view the video below.
Thanks to Denis Bours of SEA Change Community of Practice for filming me!
Seven new ways to present evaluation findings
As regular readers will know, I am very interested in how findings of evaluations are presented and used, as I’ve written about before. I’ve put together a brief presentation on this subject (see below) entitled “Seven new ways to present evaluation findings” that I’m presenting today at the European Evaluation Society Conference in Helsinki, Finland. Comments and other ideas welcome!
Using video to communicate evaluation results
I’ve written a brief post on the Climate-Eval blog about using video to communicate evaluation reports. Read the post here>>.
Advocacy evaluation: influencing climate change policy
Often I don’t get to share the findings of the evaluations I undertake, but in the case of this advocacy evaluation – an area I’ve written about before – the findings are public and can be shared.
I was part of a team that evaluated phase 1 of an advocacy/research project – the Africa Climate Change Resilience Alliance (ACCRA). ACCRA aims to increase governments’ and development actors’ use of evidence in designing and implementing interventions that increase communities’ capacity to adapt to climate hazards, variability and change. Advocacy plays a large role in trying to influence governments and development actors in this project. You can read more in the Executive_Summary (pdf) of the evaluation findings.
The evaluation also produced five case studies highlighting successful advocacy strategies:
- Capacity building and district planning
- Secondment to a government ministry
- Reaching out to government and civil society in Uganda
- Disaster risk profiling in Ethiopia
- Exchanging views and know-how between ACCRA countries
The case studies can be viewed on the ACCRA Eldis community blog (n.b. you have to join the Eldis community to view the case studies; it’s free of charge).
To disseminate the evaluation findings widely we also produced a multimedia clip, as featured below.
Workshop: Integrating communications in evaluation
Together with Raj Rana, I will be running a workshop on communications and evaluation this coming November in Bern, Switzerland. Further information:
Integrating communications in evaluation
Date and place: 10-11 November 2011, Bern
Organisers: University of Fribourg & Swiss Evaluation Society
An often-overlooked step of evaluation is ensuring that findings are communicated, understood and acted upon. Communicating throughout the evaluation process equally poses many challenges. Communicating effectively implies using different means, messages and methods to reach different stakeholder groups, with very different needs and expectations.
A mix of presentations, case studies and practical exercises will be used to promote new approaches for communicating results, including social media, interactive presentations and data visualization. The workshop delivery will include a mix of facilitation techniques to introduce effective means of engaging stakeholders in the evaluation process (World Café methodology, buzz groups, visualization techniques, developing consensus, etc.). Participants are encouraged to bring examples of evaluations they have commissioned or implemented, to be used as case studies during the workshop.
More information & registration>>
Glenn
Understanding and use of evaluation – new report
Here is an interesting paper from ALNAP looking at how the understanding and use of evaluation in humanitarian action can be improved:
Harnessing the Power of Evaluation in Humanitarian Action: An initiative to improve understanding and use of evaluation (pdf)
The paper sets out a framework for improving the understanding and use of evaluation in four key areas:
Capacity Area 1: Leadership, culture and structure
• Ensure leadership is supportive of evaluation and monitoring
• Promote an evaluation culture
• Increase the internal demand for evaluation information
• Create organisational structures that promote evaluation
Capacity Area 2: Evaluation purpose and policy
• Clarify the purpose of evaluation (accountability, audit, learning)
• Clearly articulate evaluation policy
• Ensure evaluation processes are timely and form an integral part of the decision-making cycle
• Emphasise quality not quantity
Capacity Area 3: Evaluation processes and systems
• Develop a strategic approach to selecting what should be evaluated
• Involve key stakeholders throughout the process
• Use both internal and external personnel to encourage a culture of evaluation
• Improve the technical quality of the evaluation process
• Assign high priority to effective dissemination of findings, including through new media (video, web)
• Ensure there is a management response to evaluations
• Carry out periodic meta-evaluations and evaluation syntheses, and review recommendations
Capacity Area 4: Supporting processes and mechanisms
• Improve monitoring throughout the programme cycle
• Provide the necessary human resources and incentive structures
• Secure adequate financial resources
• Understand and take advantage of the external environment:
– Use peer networks to encourage change
– Engage with media demands for information
– Engage with donors on their evaluation needs