Posts filed under ‘Evaluation methodology’
How to – outcome monitoring and harvesting
It is often useful during data collection in an evaluation to define, or check, what the intervention sought to achieve in terms of outcomes – both intended and unintended. One approach that has become popular is outcome monitoring and harvesting. The World Bank has produced a very useful guide with practical guidance in this area – do take a look!

Nudge theory and evaluation
There has been a lot of talk about “nudge theory” – the idea that people are guided, encouraged and nudged towards the right decision rather than being told what to do – and now the evaluation unit of the UN agency WIPO has produced a guide on how nudge theory can apply to evaluation. Interesting reading!

Implications of COVID-19 on evaluation
The ILO have published useful guidelines, “Implications of COVID-19 on evaluations in the ILO: Practical tips on adapting to the situation”. The guidelines are well worth a read, as they provide guidance for many of us carrying out evaluations remotely these days.
CART principles for monitoring
Innovations for Poverty Action have developed some useful guidance on activity monitoring and evaluation based on their own CART principles: credible, actionable, responsible, and transportable (see summary graphic below).
It is particularly useful for those interested in monitoring, an issue many organisations find challenging. Read more here>>
Evaluation during COVID-19 – infographic
Here is an informative infographic from the UNDP Independent Evaluation Office on evaluation during COVID-19 (view the PDF version here).
Evaluation and COVID-19
There have been many useful and interesting articles on how evaluation can adapt and cope with the current COVID-19 pandemic. Here is a collection of what I’ve found to date:
Practical tools/advice:
Conducting phone-based surveys during COVID-19
A quick primer on running online events and meetings
Covid-19 crisis: how to adapt your data collection for monitoring and accountability
Think pieces:
Zenda Ofir: Evaluation in times of COVID19 outbreak
World Bank: Conducting evaluations in times of COVID-19
Chris Lysy: The Evaluation Mindset: Evaluation in a Crisis
M Q Patton: Evaluation Implications of the Coronavirus Global Health Pandemic Emergency
New and better evaluation criteria
The OECD/DAC evaluation criteria – the main guidance used for most evaluations – have been revised. Following a broad consultation, the revised criteria have been published (pdf).
The main changes are the addition of a new criterion, Coherence; a clearer explanation of how to use the criteria; and a recognition that the criteria are used across many sectors, not only development. The explanatory document (pdf) is well worth a read.
A framework for context analysis
Here is an interesting tool to help with context analysis: the ‘Context Matters’ framework. The tool is interactive, and you can view its different elements from various perspectives. Designed to support the use of knowledge in policy-making, it could also be of interest to researchers and evaluators as an analytical tool for examining contexts.
View the interactive framework here>>
Read more about the framework here>>
Thanks to Better Evaluation for introducing this new resource to me.
New e-learning course: Real-time evaluation and adaptive management
My friends at TRAASS have launched a new e-learning course on Real-time evaluation and adaptive management:
“What exactly is an RTE/AM approach and how can it help in unstable or conflict affected situations? Do M&E practitioners need to ditch their standard approaches in jumping on this latest bandwagon? What can you do if there is no counterfactual or dataset? This modular course covers these challenges and more.”
Tips for young / emerging evaluators
The Evaluation for Development blog from Zenda Ofir has been collating tips for young and emerging evaluators – tips that even experienced evaluators will find interesting. Here are some highlights:
From Zenda herself:
Top Tip 1. Open your mind. Read.
Top Tip 2. Be mindful and explicit about what frames and shapes your evaluative judgments.
Top Tip 3. Be open to what constitutes “credible evidence”.
Top Tip 4. Focus a good part of your evaluative activities on “understanding”.
Top Tip 5. Be or become a systems thinker who can also deal with some complexity concepts.
Read more about these tips>>
From Juha Uitto:
Top Tip 1. Think beyond individual interventions and their objectives.
Top Tip 2. Understand, deal with and assess choices and trade-offs made or that should have been made.
Top Tip 3. Methods should not drive evaluations.
Top Tip 4. Think about our interconnected world, and implore others to do the same.
Read more about these tips>>
From Benita Williams:
Top Tip 1. The cruel tyranny of deadlines.
Top Tip 2. Paralysis from juggling competing priorities.
Top Tip 3. Annoyance when you are the messenger who gets shot at.
Top Tip 4. Working with an evaluand that affects you emotionally.
Top Tip 5. Feeling rejected if you do not land an assignment.
Top Tip 6. Feeling demoralized when you work with people who do not understand evaluation.
Top Tip 7. Feeling discouraged because of wasted blood, sweat and tears.
Top Tip 8. Feeling lazy if you try to maintain work-life balance when other consultants seem to work 24/7.
Top Tip 9. Feeling overwhelmed by all of the skills and knowledge you should have.
Read more about these tips>>
And from Michael Quinn Patton, just one tip:
Top tip 1: Steep yourself in the classics.
Read more about this tip>>