Posts filed under ‘communicating evaluation results’
A very interesting report on evaluation capacity and practice in the US nonprofit sector is just out from the Innovation Network (pdf).
Here are some excerpts on resources and evaluation:
- 99% of organisations have someone responsible for evaluation
- 84% of organisations spend less than 5% of their budget on evaluation
- 16% spend nothing at all on evaluation (!)
There are also further interesting findings on evaluation use and on the barriers to, and supporting factors for, evaluation – view the full report here (pdf).
ALNAP has recently released their Evaluation of Humanitarian Action Guide.
The guide was six years in the making and contains detailed advice and tips on evaluating humanitarian action. Even if your focus is not on evaluating humanitarian activities, Chapter 17, on communicating and reporting findings and results, is well worth a read.
Here is a very interesting webinar presented by the UN Evaluation Group on communications and evaluation use:
- Communications and Evaluation Use, 8 November 2016, 9.30 am EST (USA)
They have also produced a working paper on the subject:
How can communications improve engagement with evaluation users, and eventually increase evaluation use?
How can evaluators improve communications to increase evaluation use?
You are cordially invited to attend a UNEG webinar on “Communications and Evaluation Use”. This webinar is organized by UNEG working groups under Strategic Objective Two for Use of Evaluation.
Please join us on 8 November to hear views from evaluators and communication experts from UN Women, the Global Environment Facility (GEF), and the World Bank.
When: Tuesday, 8 November, from 9:30 to 10:30 am (New York time).
Speakers:
- Marco Segone, Director, Independent Evaluation Office of UN Women, and Alexandra Capello, Evaluation Specialist, Independent Evaluation Office of UN Women
- Geeta Batra, Chief Evaluation Officer and Deputy Director, Independent Evaluation Office of the GEF, and Kseniya Temnenko, Knowledge Management Officer, Independent Evaluation Office of the GEF
- Daniel Musiitwa, Senior Communications Officer, Independent Evaluation Group of the World Bank, and Kristen Milhollin, Online Communications Officer, Independent Evaluation Group of the World Bank
Here is a great example (pdf) from the ILO evaluation office of using cartoons to explain an evaluation process or service. It shows that visual methods can be used not only to communicate evaluation results but also to explain evaluation processes and services.
“The integrated evaluation framework will guide you through the process from aligning objectives to establishing a plan, setting targets and then measuring the outputs, outtakes, outcomes and impact of your work.”
The new framework also comes with a taxonomy that describes, for each step of the process, the key activities required, the metrics and milestones, and the methods that should be considered.
Here is a useful article from the Independent Evaluation Group of the World Bank. It highlights five tips to make your evaluation more influential, as illustrated in the infographic below. I certainly agree with all the tips; I’d just add that influence may not be immediate and direct – it may take some years to manifest itself, and often in unexpected ways (to be explained in a future post!).
The UN Joint Inspection Unit (JIU), an external oversight body of the United Nations system, has just produced a very interesting study on Public information and communications policies and practices in the United Nations system (pdf).
Aside from providing an interesting and critical view of communications in the UN, the report also looks at the monitoring and evaluation (M&E) of communication activities, concluding that M&E needs to feed better into management direction and decision-making. Here are some key findings from the report on M&E and communications:
- Only half of UN agencies have included M&E in their communication frameworks
- Indicators used are predominantly output-based
- There is an absence of an evaluation culture among communication staff
- Existing monitoring systems (e.g. for media coverage) are largely descriptive, rarely analysed, and do not feed into decision-making
View the full report here (pdf) – the M&E aspects are discussed from page 22 onwards.