Webinar – Humanitarian Standards – too much of a good thing? on Feb 28
For those readers interested in humanitarian assessment and evaluation, this webinar that I am co-hosting may be of interest:
Webinar – Humanitarian Standards – too much of a good thing? on Feb 28, 2013 2:00 PM GMT
Are you interested in driving up the quality and accountability of humanitarian action? The Joint Standards Initiative (JSI) is an exciting collaboration between HAP International, the Sphere Project and People In Aid to work out how to improve standards coherence and in turn to improve the quality of humanitarian programmes. This webinar is part of a series of stakeholder consultation events to hear the humanitarian community’s views on the use, utility and relevance of humanitarian standards. John Cosgrave will present highlights from two related papers he has written for JSI on this subject and Robert Schofield (JSI Coordinator) and Glenn O’Neil (JSI Consultant) will facilitate a discussion with webinar participants.
Register here for the webinar:
https://attendee.gotowebinar.com/register/8403836000444936448
Download John Cosgrave’s thinkpiece (pdf): Humanitarian Standards – too much of a good thing?
http://pool.fruitycms.com/humanitarianstandards/Humanitarian-standards-too-much-of-a-good-thing-John-Cosgrave-Feb-2013.pdf
After registering, you will receive a confirmation email containing information about joining the webinar.

The ultimate social network mapping software?
It could have been predicted… the most sophisticated tool for mapping social networks has been developed for espionage…
“A multinational security firm has secretly developed software capable of tracking people’s movements and predicting future behaviour by mining data from social networking websites.”
Visualizing Information for Advocacy

Just came across this interesting guide (pdf) to using visuals for advocacy from the Tactical Technology Collective – here is an explanation from the authors:
Visualising Information for Advocacy (pdf): An Introduction to Information Design is a manual aimed at helping NGOs and advocates strengthen their campaigns and projects through communicating vital information with greater impact. This project aims to raise awareness, introduce concepts, and promote good practice in information design – a powerful tool for advocacy, outreach, research, organisation and education. Through examples, the booklet demonstrates how to use innovative visual graphics to tell a complex and powerful story in a snapshot.
How to use data to improve communications
For those interested in how data can be used to improve communications – and how communications products can make data more accessible – here is an interesting presentation from Timo Lüge of Social Media for Good; the examples are from the non-profit sector:
Best Social Media Metrics?
Further to my post of last week, here is an interesting take on the issue of social media metrics from Avinash Kaushik, a leading expert in web metrics.
Avinash sets out metrics for the following areas:
- Conversation
- Amplification
- Applause
- Economic Value
View his concept here, it’s interesting reading…
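To make these four concrete, here is a minimal Python sketch – my own illustration, not code from Avinash – computing the rates as he defines them: conversation is comments per post, amplification is shares/retweets per post, applause is likes/favourites per post, and economic value is the revenue and cost savings you attribute to social media. The post data below is invented for the example.

# Minimal sketch of Avinash Kaushik's social media metrics.
# The counts below are invented, purely for illustration.
posts = [
    {"comments": 4, "shares": 12, "likes": 30},
    {"comments": 1, "shares": 3, "likes": 9},
    {"comments": 7, "shares": 20, "likes": 55},
]
n = len(posts)

conversation = sum(p["comments"] for p in posts) / n   # comments per post
amplification = sum(p["shares"] for p in posts) / n    # shares/retweets per post
applause = sum(p["likes"] for p in posts) / n          # likes/favourites per post
economic_value = 1250.00  # estimated revenue + cost savings, not a count

print(f"Conversation: {conversation:.1f}, Amplification: {amplification:.1f}, "
      f"Applause: {applause:.1f}, Economic value: ${economic_value:,.2f}")

The point of the first three rates is that they are normalised per post, so you can compare weeks or channels with different posting volumes.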
Measuring reach to engagement on social media
The #SMMStandards Initiative, which I’ve written about before, is a cross-industry effort to simplify and unify the measurement of social media.
Digging deeper into their proposed standards, I was interested by their preliminary guidance on how to “standardise” the following terms, which I summarise here:
Reach & impressions: how to compare visits/viewers/circulation?
They caution against using any type of “multiplier”, since multipliers probably overestimate the number of people actually “viewing” content, e.g. an estimated 10% of your “friends” see your average Facebook post – so a page with 5,000 fans might expect only around 500 people to actually view a given post.
Opinion/advocacy: this needs to be broken down into types, e.g. “opinions” (it’s good), “recommendations” (try it), “feelings” (makes me feel good) and “intended actions” (going to do it).
Influence: it is multi-level and multi-dimensional – difficult to capture with an automated measure alone.
Engagement: occurs after reach; consider it at different levels (a toy scoring sketch follows this list):
Low: Facebook “like”, Twitter “follows”
Medium: Blog/video comments, Twitter “retweets”
High: Facebook “shares”, original content/video posts created by users.
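To illustrate how these levels might feed into a single figure, here is a toy Python sketch that weights each action by its level. The 1/3/5 weights are my own assumption for illustration, not values proposed by #SMMStandards.

# Toy engagement score: weight actions by the low/medium/high levels above.
# The 1/3/5 weights are illustrative assumptions, not #SMMStandards values.
WEIGHTS = {
    "like": 1, "follow": 1,            # low
    "comment": 3, "retweet": 3,        # medium
    "share": 5, "original_post": 5,    # high
}

def engagement_score(actions):
    """Sum weighted counts of engagement actions; unknown actions score 0."""
    return sum(WEIGHTS.get(a, 0) * n for a, n in actions.items())

week = {"like": 120, "follow": 15, "comment": 18, "retweet": 9,
        "share": 12, "original_post": 2}
print(engagement_score(week))  # 286

Whatever weights you choose, the key point from the guidance stands: a “like” and a user-created post are not the same thing and should not be counted as such.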
The #SMMStandards Initiative have also proposed a Sources and Methods Transparency Table for use when analysing/collecting social media data which is now open for comments.
Read more about the #SMMStandards Initiative here >>
Evaluation the lowest priority for US non-profits
The US-based Innovation Network has published a very interesting study on the State of Evaluation in US non-profit organisations.
The study, based on a survey of some 550 non-profits in the US, produced some interesting findings, including the headline above, which is admittedly the most pessimistic of the following:
- 90% of organizations report evaluating their work (up from 85% in 2010)
- 100% (!) of organizations reported using and communicating their evaluation findings
- Budgeting for evaluation is still low. More than 70% of organizations are spending less than 5% of organizational budgets on evaluation
- On average, evaluation – and its close relation, research – continue to be the lowest priorities (compared to fundraising, financial management, communications, etc.)
I find it incredible that 100% report using and communicating their evaluations – if only this were “significant” usage, we would all be happy…
BetterEvaluation – great resources for M&E
Here is a new website (well, new to me) that I recently discovered:
“An international collaboration to improve evaluation practice and theory by sharing information about options (methods or tools) and approaches.”
There are many resources on the website; for example, if you are interested in advocacy evaluation, there is helpful material on “process tracing”, a method well suited to this area.
Seven things an evaluator should avoid saying…
Having worked for some years as an evaluator, been in many different teams and seen other evaluators in action, I’ve had the benefit of seeing how people do evaluation in so many different ways. I’ve also had occasion to see evaluators do things – well – not in quite the right way, shall we say: acting a little pompous, like in the illustration above. So I’ve put together the following list of seven things an evaluator should avoid saying… of course, I’ve never been guilty of any of these :~}
1. “For me, the Terms of Reference are only a rough guide for us” I once heard an evaluator say this and the client nearly fell off their chair. Of course, terms of reference have to be commented on and modified, normally in the inception report, but they are key to the evaluation. No one likes “evaluation creep”, where the evaluation goes everywhere but fails to answer the questions that are, oh, often right there in the terms of reference.
2. “I’ve already written the report in my mind” Ah, the number of times I’ve heard this gem when coming out of a first meeting with a client… even before a scrap of evidence has been collected…
3. “Interesting, in my opinion this is what happened…” I’ve been guilty of this one, where the evaluator elaborates their theory of what works and what doesn’t to a poor interviewee. Whenever I’ve tried it, nine times out of ten the person has replied “that’s not how it happened…”.
4. “Don’t worry, I’ve got a long plane trip coming up, I’ll write your report then…” We are all short of time, but a client expects work to be done seriously… even if you are catching up on the report during that flight, should you say it out loud?
5. “So our initial results are…” This is not so much the evaluator’s fault as the pressure on evaluators to deliver initial results before the findings are in. We should avoid jumping to conclusions in the early days of an evaluation, as often our initial hunches turn out to be wrong…
6. “This program is so #%$%! Who is running this thing?” As an evaluator you may come across programmes and projects that are less than ideally run. But it helps to be a little diplomatic, as you may be talking to someone who set up and/or manages what you are evaluating. There may even be connections to the programme or project that you are not aware of.
7. “The way this evaluation is managed is just rubbish!” And I’ve also heard this – the evaluator openly and widely criticising the evaluation commissioner who has… employed them… In general, I think part of the success of an evaluation comes down to good collaboration between the evaluators, the commissioner and the programme/project being evaluated.
Know of any more things to avoid saying? Please send them in!
The fabulous drawing above, entitled “pompous bastard”, is by TannerMorrow.
Glenn
Video: Seven new ways to present evaluation findings
Further to my earlier post on my presentation at the recent EES conference on “seven new ways to present evaluation findings”, a video was made of the presentation, which you can view below.
Thanks to Denis Bours of SEA Change Community of Practice for filming me!