Evaluating conferences and events: new approaches and initiatives
Today at the European Evaluation Society Conference in Helsinki, Finland, I am chairing a session on conference evaluation, and I'm happy to share my introductory slides below.
Evaluation of communication activities of international and non-governmental organisations: A 15-year systematic review
As part of my PhD studies, I have undertaken a systematic review of how international and non-governmental organisations are evaluating their communication activities. I'm presenting a summary of this today at the European Evaluation Society Conference in Helsinki, Finland. Below are the slides; I hope you find them interesting.
Seven new ways to present evaluation findings
As regular readers will know, I am very interested in how evaluation findings are presented and used. I've put together a brief presentation on this subject (see below), entitled "Seven new ways to present evaluation findings", which I'm presenting today at the European Evaluation Society Conference in Helsinki, Finland. Comments and other ideas welcome!
New advocacy evaluation guide
Bond, the UK alliance of NGOs, has produced an interesting guide on advocacy evaluation:
Assessing effectiveness in influencing power holders (pdf)
The guide looks at the challenges of influencing power holders (usually done through activities grouped under the umbrella of “advocacy”) but comes to the conclusion that evaluation is feasible:
it is possible to tell a convincing story of an organisation’s contribution to change through their influencing and campaigning work by breaking down the steps of the process that led to change, and looking at how an organisation has created change at each step.
The guide also sets out these steps and provides examples of advocacy evaluation tools from NGOs including Oxfam, CARE and Transparency International.
Measuring the impact of journalism
A great deal has been written and researched about the impact of communications, but little thought has been given to measuring the impact of journalism: how can the media measure the impact of their own work?
Two recent posts explore this issue:
Ethan Zuckerman writes about how to measure the civic impact of journalism; one of his conclusions is:
“A possible metric – the efficacy of a story in connecting people to community organizations, volunteering opportunities, and other forms of civic engagement.”
He goes on to conclude:
“If we measure only how many people view, like or tweet, but not how many people learn more, act or engage, we run the risk of serving only the market and forsaking our civic responsibilities, whether we’re editing a newspaper or writing a blog.”
Jonathan Stray writes about the metrics of journalism and says:
“The first challenge may be a shift in thinking, as measuring the effect of journalism is a radical idea. The dominant professional ethos has often been uncomfortable with the idea of having any effect at all, fearing “advocacy” or “activism.” While it’s sometimes relevant to ask about the political choices in an act of journalism, the idea of complete neutrality is a blatant contradiction if journalism is important to democracy. Then there is the assumption, long invisible, that news organizations have done their job when a story is published. That stops far short of the user, and confuses output with effect.”
Both posts make interesting reading and propose useful ideas, and both reach a similar conclusion: the need to go beyond output metrics and look at the impact of journalism on events, individuals and policies. There are also some interesting parallels with advocacy evaluation – food for thought!
Survey: key drivers and patterns in Corporate Communications Strategic Management
Calling all communication professionals: please assist PhD student Lukasz Bochenek by completing a brief survey on key drivers and patterns in Corporate Communications Strategic Management:
https://www.surveymonkey.com/s/strategiccomms
Thank you!
Using video to communicate evaluation results
I've written a brief post on the Climate-Eval blog about using video to communicate evaluation reports. Read the post here.
Understanding public attitudes to aid and development
Here is a fascinating research paper on Understanding public attitudes to aid and development (pdf) from the UK-based ODI and IPPR.
Relevant to monitoring and evaluation, it recommends:
“Campaigns should do more to communicate how change can and does happen in developing countries, including the role aid can play in catalysing or facilitating this change. Process and progress stories about how development actually happens may be more effective communication tools than campaigns focused straightforwardly on either inputs (such as pounds spent) or outputs (such as children educated).”
This is a weakness of much campaigning about development and aid: the steps towards change are not explained. The so-called "theory of change" – "if we do this, it will lead to that" – remains a mystery to the public, and often those running the programmes have not fully thought it through either.
Thanks to the Thoughtful Campaigner blog for bringing this to my attention.
Standards for social media measurement?
At the recent AMEC Measurement Summit there was an interesting discussion on setting standards for social media measurement. A group of specialists has been working on this for the past few years, as more and more companies use social media and wonder how to measure the outcomes; some common standards would clearly be useful. View the presentation below for an update on the latest developments:
Overall spending on PR flat but evaluation up by 5%

Spending on PR/communications in companies and organisations is flat, but the share of budgets devoted to communication evaluation is up by five percentage points, according to a new study of senior-level PR/communication practitioners in the USA.
USC Annenberg's Generally Accepted Practices (GAP) in Public Relations study found that, compared to 2009, total spending on evaluation in PR/communication budgets jumped from 4% to 9% in 2012 – even though some 80% of practitioners reported overall PR/communication budgets as flat or decreasing.
The study also found a shift in focus towards "outcome" measures, such as influence on reputation, attitudes and awareness, and away from "output" measures such as clip counting and media coverage.