Posts filed under ‘Campaign evaluation’

Resource: Evaluating Policy Influence and Advocacy


The website Better Evaluation has many great resources and explanations of evaluation approaches, processes and methods.

I just came across this page on Evaluating Policy Influence and Advocacy, which gives a good overview of the methods and types of advocacy/policy influence evaluation – well worth a read!

March 7, 2017 at 10:00 am

My PhD on communication evaluation in 10 slides…

Earlier this year I had the opportunity to present the findings of my PhD in 30 minutes (!) to the Geneva Communicators Network. I titled my presentation "Communication evaluation: challenges and complexities" – you can view it below; it's a very condensed version of my PhD! If you are really keen, you can view the full PhD thesis here.


November 22, 2016 at 10:47 am

Tracking Use of Campaign Evaluation Findings

This week I gave a presentation at the European Evaluation Society conference on a tracking study I carried out on the use of campaign evaluation findings. For those interested in this subject, my slides are here for your consumption!

September 29, 2016 at 12:05 pm

Communication evaluation event in Zurich – 28 April 2016

For any readers in the Zurich, Switzerland area, I will be giving a presentation for the EMScom Alumni Association (of which I am an alumnus) on communication evaluation; here is a short description:

Evaluation of communication activities is consistently named as one of the top concerns of communication professionals. Yet, paradoxically, reportedly not even half undertake any evaluation. Drawing on his recent PhD studies and over a decade of experience in evaluating communication campaigns and programmes, Glenn O’Neil will set out the challenges and complexities of evaluation and offer insights into solutions and approaches to ensure that evaluation brings value to communication professionals and their organisations.

Thursday, April 28, 2016, 18h30-21h00
Widder Hotel, Zürich
Cost: 50 CHF (free for EMScom alumni)

Hope to see some of you there! Further information >>

Register also by email: emscomalumni@usi.ch

Glenn

April 6, 2016 at 4:18 pm

Beyond online vanity metrics

Here is a very interesting study (pdf) from the Mobilisation Lab on which online metrics count for campaigns and which don’t.

The study looks at what they call “vanity metrics” for online campaigns, which they define as “data that are easily manipulated, are biased toward the short-term, often paint a rosy picture of program success, or do not help campaigners make wise strategic decisions”. Examples of vanity metrics include: number of petition signatures; web traffic; and number of “opens” (of emails, I guess).

So what do they recommend campaigns should be measuring?

They have plenty of good suggestions and insights. Here are some of the metrics they mentioned that could be more significant (and possible to measure online):

  • Monthly members returning for action
  • Actions per member (rather than size of lists)
  • Number of members actively part of a campaign
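For campaigners who keep their own action logs, these member-centric metrics are straightforward to compute. Here is a minimal sketch in Python – the data, field names and time frame are hypothetical illustrations of mine, not from the study:

```python
# Hypothetical campaign log: member id -> months in which that member took an action
actions = {
    "m1": ["2015-05", "2015-06", "2015-07"],
    "m2": ["2015-06"],
    "m3": ["2015-06", "2015-07"],
}

# Actions per member (rather than raw list size, the "vanity" denominator)
total_actions = sum(len(months) for months in actions.values())
actions_per_member = total_actions / len(actions)

# Monthly members returning for action: acted this month AND the month before
returning_july = sum(
    1 for months in actions.values()
    if "2015-07" in months and "2015-06" in months
)

print(actions_per_member, returning_july)  # 2.0 2
```

The point of the calculation is the denominator: dividing by active members rather than list size rewards engagement, not list-building.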

View the study here (pdf)>>

July 28, 2015 at 8:28 am

10 elements of success for advocacy

The US-based Media Impact Funders have produced a ten-point list of elements of success for policy change (what I’ve labeled “advocacy”); I comment on three of its points below.

Although it takes a US perspective on influencing policy through advocacy, what struck me was that many of the points are just as relevant to advocacy done globally or in other countries/regions. For example:

Point 1: Solutions – when evaluating advocacy initiatives and talking with policy-makers, a common complaint I have heard is that advocacy is not “solution focused”, i.e. it makes valid points about the given issues (which policy-makers are often also aware of) but doesn’t necessarily put forward possible solutions to them.

Point 3: Agility – the ability to be flexible is so important – to take advantage of opportunities that arise that were not necessarily included in the original advocacy plan. I saw this recently in an evaluation I carried out for Oxfam on global development policy, where they had enough flexibility to move resources as issues peaked and new opportunities emerged.

Point 5: Humanity – advocacy often focuses on the “facts”, but what can also make an impact is the “human factor”. One effective example of this was the creation of the Arms Trade Treaty, where state representatives drafting the treaty were directly confronted by survivors of armed violence – certainly bringing a human face to the dry, legalistic treaty process and language.


July 14, 2015 at 11:28 am

Monitoring and Evaluating Advocacy toolkit

I recently came across this resource from UNICEF: their guide to monitoring and evaluating advocacy (pdf). It’s a companion to their larger advocacy toolkit (pdf).

It’s a very comprehensive guide and I’d add it to my previous list of best resources for advocacy evaluation>>

May 21, 2015 at 6:20 am

Two New Advocacy Evaluation Tools

Here are two new advocacy evaluation tools from the Center for Evaluation Innovation:

The Advocacy Strategy Framework (pdf): presents a simple one-page tool for thinking about the theories of change that underlie policy advocacy strategies. Check out the “interim outcomes and indicators” on the last page – a very good range of advocacy outcomes/indicators.

Four Tools for Assessing Grantee Contribution to Advocacy Efforts (pdf): offers funders practical guidance on how to assess a grantee’s contribution to advocacy outcomes. The four tools include:
1. A question bank
2. Structured grantee reporting
3. An external partner interview guide
4. Contribution analysis


April 1, 2015 at 4:49 pm

Advocacy evaluation using contribution analysis

A challenge of advocacy evaluation is analysing and interpreting data and information in a systematic and rigorous manner. For a recent advocacy evaluation I carried out with my colleague Patricia for the Norwegian Refugee Council, we used a simplified contribution analysis to assist us with this task.

In carrying out this analysis, we asked four questions:

1) What were the policy changes desired by NRC (“the asks”)?

2) What were the reactions of targeted institutions, individuals and allies to these asks?

3) What was the level/significance of policy change (if any)?

4) What was the role of NRC in any change seen?

We then summarised this in a table, listing some 30 (!) policy asks of NRC’s advocacy; here is an extract:

Ask: UNICEF and partners need to adapt RRMP to include assessment of protection needs.
Reaction: UNICEF deployed protection specialist for six months to work with RRMP.
Change seen: High | Role of NRC advocacy: High

Ask: Organisations need to ensure that pressure to move quickly does not marginalize commitment to longer-term work with more impact.
Reaction: This and broader thinking of report taken on-board in creation of DRC DMG network.
Change seen: Medium | Role of NRC advocacy: Medium (NRC advocacy was one of many influences on DMG)
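When an evaluation covers dozens of asks, a small script can help tally the ratings once they are recorded. Below is a hypothetical sketch of such a tally – the structure and names are mine for illustration, not part of NRC’s or our actual method:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class PolicyAsk:
    ask: str        # the policy change desired ("the ask")
    reaction: str   # reaction of targeted institutions, individuals and allies
    change: str     # level/significance of change seen: "High", "Medium", "Low", "None"
    role: str       # role of the advocacy in any change seen

# The two rows from the extract above
asks = [
    PolicyAsk(
        ask="UNICEF and partners need to adapt RRMP to include assessment of protection needs.",
        reaction="UNICEF deployed protection specialist for six months to work with RRMP.",
        change="High",
        role="High",
    ),
    PolicyAsk(
        ask="Organisations need to ensure that pressure to move quickly does not "
            "marginalize commitment to longer-term work with more impact.",
        reaction="This and broader thinking of report taken on-board in creation of DRC DMG network.",
        change="Medium",
        role="Medium",
    ),
]

# Tally how much change was seen, and how strong the advocacy role was, across all asks
change_counts = Counter(a.change for a in asks)
role_counts = Counter(a.role for a in asks)
print(change_counts)  # Counter({'High': 1, 'Medium': 1})
```

Separating the “change seen” rating from the “role of advocacy” rating is what makes this a contribution analysis: change can be high even where the campaign’s role was marginal, and the tally keeps the two judgements distinct.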

View the full report here (pdf) (see annex 1 for the contribution analysis table, page 27)>>

January 6, 2015 at 8:35 am

New resource: “The value iceberg: weighing the benefits of advocacy and campaigning”

Rhonda Schlangen and Jim Coe (independent evaluation consultants) have just published a very interesting paper “The value iceberg: weighing the benefits of advocacy and campaigning” on the BetterEvaluation website.

The paper looks at how concepts of ‘value’ and ‘results’ are being applied to advocacy and campaigning and presents some alternative strategies for assessing advocacy. You can see the “value iceberg” below.

View the paper here>>

December 19, 2014 at 4:37 pm


