The latest edition of Evaluation Connections (pdf), newsletter of the European Evaluation Society, has an interesting article “Advocacy evaluation: lessons from Brazil (and the internet)” by William N. Faulkner.
The article describes some of the newer methods the evaluation team used for qualitative analysis, such as Sankey diagrams and mind mapping.
I’ve reproduced the Sankey diagram from the article in this post; it shows the flow from “outputs” to “outcomes” for the advocacy work – quite a good visualisation of the information.
View the article in this pdf (go to page 7) >>
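A Sankey diagram of this kind is built from weighted links between nodes. As a minimal, illustrative sketch (not the article’s actual data), the snippet below aggregates hypothetical output→outcome pairs into the source/target/value lists that Sankey plotting tools (such as plotly’s Sankey trace) expect:

```python
# Aggregate output -> outcome links into the node/link lists a Sankey
# diagram is drawn from. All labels and pairs here are illustrative.
from collections import defaultdict

# Hypothetical advocacy records: one (output, outcome) pair per observed link
records = [
    ("Policy brief", "Policy debate"),
    ("Policy brief", "Policy debate"),
    ("Media campaign", "Policy debate"),
    ("Media campaign", "Policy change"),
    ("Policy debate", "Policy change"),
]

# Build a label index, then weight each distinct link by how often it occurs
labels = sorted({node for pair in records for node in pair})
index = {label: i for i, label in enumerate(labels)}
weights = defaultdict(int)
for output, outcome in records:
    weights[(index[output], index[outcome])] += 1

sources = [s for s, _ in weights]
targets = [t for _, t in weights]
values = [weights[k] for k in weights]
print(labels)
print(list(zip(sources, targets, values)))
```

The three parallel lists (`sources`, `targets`, `values`) can then be handed to a plotting library to render the flows, with link width proportional to the weight.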
Some interesting courses coming up in London, UK:
Results in Advocacy
It is increasingly important for organisations to effectively measure the impact of their advocacy work, not only to inform programming but also for accountability to a range of stakeholders, including beneficiaries, donors, staff and volunteers. With competition for limited resources a reality, organisations that fail to demonstrate their impact are arguably at increased risk of funding shortfalls and of limiting their ability to learn, adapt and evolve.
London – 13-15 April 2015
Participatory Action Research, Planning and Evaluation
This three-day workshop in London (UK) engages people in hands-on learning and practice using flexible and rigorous PAR tools developed and tested by an international community of practice in settings around the world. Participants will learn and apply practical tools for exploring problems, knowing the actors and assessing options. These tools and associated skills provide people with opportunities to mobilize knowledge from various sources and support collaborative thinking and planning for social change. Coaching support as follow-up to the workshop will help participants adapt and sequence tools for specific organizational contexts including monitoring and evaluation and grassroots capacity building initiatives (campaigns, advocacy, etc.).
London – 22-24 April 2015
A new discussion paper has been released by ALNAP entitled “Addressing causation in humanitarian evaluation: A discussion on designs, approaches and examples”.
The paper discusses evaluation designs and approaches that can help provide credible answers to questions of causation, using examples from Oxfam, WFP, UNHCR and NRC.
The paper cites an evaluation I worked on with NRC in the realm of advocacy evaluation.
A challenge of advocacy evaluation is in analysing and interpreting data and information in a systematic and rigorous manner. For a recent advocacy evaluation I carried out with my colleague Patricia for the Norwegian Refugee Council, we used a simplified content analysis to assist us with this task.
In carrying out this analysis, we asked four questions:
1) What were the policy changes desired by NRC (“the asks”)?
2) What was the reaction of targeted institutions, individuals and allies to these asks?
3) What was the level/significance of policy change (if any)?
4) What was the role of NRC in any change seen?
We then summarised this in a table listing the roughly 30 (!) policy asks of NRC’s advocacy. Here is an extract:
| Ask | Reaction | Change seen | Role of NRC advocacy |
| --- | --- | --- | --- |
| UNICEF and partners need to adapt RRMP to include assessment of protection needs. | UNICEF deployed a protection specialist for six months to work with RRMP. | High | High |
| Organisations need to ensure that pressure to move quickly does not marginalize commitment to longer-term work with more impact. | This and the broader thinking of the report were taken on board in the creation of the DRC DMG network. | Medium | Medium (NRC advocacy was one of many influences on DMG) |
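With some 30 asks rated in this way, it can help to tally the ratings to see the overall spread of change significance and of NRC’s role. A minimal sketch, using illustrative rows rather than NRC’s actual data (the third row is entirely hypothetical):

```python
# Tally the "Change seen" and "Role of NRC advocacy" ratings across the
# summary table. Rows here are illustrative stand-ins, not the real data.
from collections import Counter

# Each tuple mirrors one table row: (ask, reaction, change_seen, nrc_role)
rows = [
    ("Adapt RRMP to include protection needs",
     "Protection specialist deployed", "High", "High"),
    ("Protect longer-term work from pressure to move quickly",
     "Taken on board in DMG network", "Medium", "Medium"),
    ("Hypothetical ask 3", "No observable reaction", "None", "Low"),
]

change_counts = Counter(change for _, _, change, _ in rows)
role_counts = Counter(role for _, _, _, role in rows)
print(change_counts)
print(role_counts)
```

A tally like this gives a quick, defensible summary line for the report (e.g. how many asks saw high change, and in how many of those NRC’s role was also rated high).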
Rhonda Schlangen and Jim Coe (independent evaluation consultants) have just published a very interesting paper “The value iceberg: weighing the benefits of advocacy and campaigning” on the BetterEvaluation website.
The paper looks at how concepts of ‘value’ and ‘results’ are being applied to advocacy and campaigning and presents some alternative strategies for assessing advocacy. You can see the “value iceberg” below.
In a recent blog post, Ann K. Emery sets out 6 great ideas for displaying qualitative data:
- Word clouds
- Showcasing Open-Ended Survey Data Beside Closed-Ended Data (see example below)
- Photos Beside Participants’ Responses
- Icons Beside Descriptions and Responses
- Diagrams to Explain Concepts and Processes
- Graphic Timelines
Example of point 2 from Ann K. Emery:
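Word clouds (point 1 above) are typically driven by nothing more than term frequencies. As a minimal stdlib sketch with illustrative survey responses, the counting step underneath a word cloud looks like this; real word-cloud tools then layer sizing and layout on top of counts like these:

```python
# Count term frequencies across open-ended responses, dropping common
# stopwords. The responses and stopword list here are illustrative.
import re
from collections import Counter

responses = [
    "The workshop helped me plan advocacy work",
    "Great practical tools for advocacy and evaluation",
    "Evaluation tools were practical and easy to apply",
]

stopwords = {"the", "and", "for", "me", "to", "were", "was"}
words = [
    w for response in responses
    for w in re.findall(r"[a-z']+", response.lower())
    if w not in stopwords
]
freq = Counter(words)
print(freq.most_common(3))
```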