A new discussion paper has been released by ALNAP entitled “Addressing causation in humanitarian evaluation: A discussion on designs, approaches and examples”.
The paper discusses evaluation designs and approaches that can help provide credible answers to questions of causation, using examples from Oxfam, WFP, UNHCR and NRC.
The paper cites an evaluation I worked on with NRC in the realm of advocacy evaluation.
A challenge of advocacy evaluation is analysing and interpreting data and information in a systematic and rigorous manner. For a recent advocacy evaluation I carried out with my colleague Patricia for the Norwegian Refugee Council, we used a simplified content analysis to assist us with this task.
In carrying out this analysis, we asked four questions:
1) What were the policy changes desired by NRC (“the asks”)?
2) What were the reactions of targeted institutions, individuals and allies to these asks?
3) What was the level/significance of policy change (if any)?
4) What was the role of NRC in any change seen?
We then summarised this in a table, listing the roughly 30 (!) policy asks of NRC’s advocacy. Here is an extract:
| Ask | Reaction | Change seen | Role of NRC advocacy |
| --- | --- | --- | --- |
| UNICEF and partners need to adapt RRMP to include assessment of protection needs. | UNICEF deployed protection specialist for six months to work with RRMP. | High | High |
| Organisations need to ensure that pressure to move quickly does not marginalise commitment to longer-term work with more impact. | This and broader thinking of report taken on board in creation of DRC DMG network. | Medium | Medium (NRC advocacy was one of many influences on DMG) |
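As a minimal sketch of how such a coded table can be handled, here is a short Python example. The data and field names are hypothetical illustrations, not the actual evaluation dataset; the real exercise covered some 30 asks, each answered against the four questions above.

```python
from collections import Counter

# Hypothetical coded records for two of the policy asks (illustrative only).
asks = [
    {
        "ask": "Adapt RRMP to include assessment of protection needs",
        "reaction": "UNICEF deployed protection specialist for six months",
        "change_seen": "High",
        "nrc_role": "High",
    },
    {
        "ask": "Ensure speed does not marginalise longer-term work",
        "reaction": "Thinking taken on board in creation of DRC DMG network",
        "change_seen": "Medium",
        "nrc_role": "Medium",
    },
]

def summarise(records):
    """Count how many asks fall into each level of change seen
    and each level of NRC's role in that change."""
    change = Counter(r["change_seen"] for r in records)
    role = Counter(r["nrc_role"] for r in records)
    return change, role

change, role = summarise(asks)
print("Change seen:", dict(change))
print("NRC role:", dict(role))
```

Tabulating the coded asks this way makes it easy to report, for example, how many of the 30 asks saw high versus low change, alongside the narrative table.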
Rhonda Schlangen and Jim Coe (independent evaluation consultants) have just published a very interesting paper “The value iceberg: weighing the benefits of advocacy and campaigning” on the BetterEvaluation website.
The paper looks at how concepts of ‘value’ and ‘results’ are being applied to advocacy and campaigning and presents some alternative strategies for assessing advocacy. You can see the “value iceberg” below.
In a recent blog post, Ann K. Emery sets out 6 great ideas for displaying qualitative data:
- Word clouds
- Showcasing Open-Ended Survey Data Beside Closed-Ended Data (see example below)
- Photos Beside Participants’ Responses
- Icons Beside Descriptions and Responses
- Diagrams to Explain Concepts and Processes
- Graphic Timelines
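The first idea, word clouds, rests on simple word-frequency counts: each word is sized in proportion to how often it appears in the responses. A minimal sketch using only Python's standard library, with made-up survey responses for illustration:

```python
import re
from collections import Counter

# Made-up open-ended survey responses (illustrative only).
responses = [
    "The training was practical and the facilitator was engaging",
    "Very practical sessions, though more time for practice would help",
    "Engaging facilitator and practical examples",
]

# A small illustrative stopword list; real tools ship longer ones.
STOPWORDS = {"the", "was", "and", "for", "would", "though", "more", "very"}

def word_frequencies(texts):
    """Tokenise the responses, drop stopwords, and count the rest.
    A word-cloud tool would scale each word's font size by its count."""
    words = re.findall(r"[a-z']+", " ".join(texts).lower())
    return Counter(w for w in words if w not in STOPWORDS)

freqs = word_frequencies(responses)
print(freqs.most_common(3))
```

These counts are exactly what dedicated word-cloud libraries consume; the visual layout is then just a rendering of the frequency table.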
Example of point 2 from Ann K. Emery:
Here is an excellent presentation on network analysis and evaluation originally presented at the American Evaluation Association conference 2014:
I’m currently in Uganda where I’ve been conducting a workshop on “communicating evaluation findings effectively” as part of the GIZ project on Evaluation Capacity Development in Uganda.
I also gave a presentation for the Uganda Evaluation Association as part of their Kampala Evaluation Talk series, focusing on “Four challenges and opportunities to communicating evaluation findings”, which can be seen below.
Thanks to the participants of both the workshop and the talk for their enthusiasm and interest!