E-Newsletter on advocacy evaluation

For those interested in advocacy evaluation, here is an interesting e-newsletter, “Advocacy Evaluation Update”, published roughly four times a year by the Innovation Network and the Center for Evaluation Innovation.

You can also read the latest edition (pdf) here>>

October 3, 2010 at 5:13 pm Leave a comment

Webinars on Development Evaluation

UNICEF, the Rockefeller Foundation, Claremont Graduate University, IOCE and DevInfo are offering a series of live webinars on “Emerging Practices in Development Evaluation”.

Following are the dates, speakers and subjects of the webinars:

13 October 2010
Using a Developing Country Lens in Evaluation
Zenda Ofir, Independent Consultant; Former President, African Evaluation Association
Shiva Kumar, Independent Consultant, India

16 November 2010
Emerging Practices in Evaluating Policy Influence
Fred Carden, Director, Evaluation Unit, International Development Research Center (IDRC)

7 December 2010
Evaluating Networks and Partnerships
Jared Raynor, Senior Consultant at TCC Group

January 2011
Evaluating Capacity Development
Peter Morgan, Independent Consultant

February 2011
Evaluating Organizational Performance
Charles Lusthaus, Co-Founder and Chairman of the Board of Directors, Universalia Management Group; and Associate Professor, McGill University

March 2011
Evaluating Innovation
Steve Rochlin, Director and U.S. Representative for AccountAbility

April 2011
Evaluating Sustainable Development
Alastair Bradstock, Business Development Director, International Institute for Environment and Development (IIED)

For further information>>

September 24, 2010 at 7:25 am 4 comments

New evaluation website

A new evaluation website has been launched: Eval Central. It brings together, in one place, feeds from different blogs and sites that focus on evaluation (including this one!). Here is an explanation from its creator:

“This experimental site integrates feeds from a variety of evaluation blogs in order to develop a single evaluation news source that can run with very little overhead.  Essentially this site is based on a directory but by opting for aggregate feeds, rather than a static list of links, the site becomes a dynamic source for readers on the lookout for new evaluation content.  When a reader clicks on any single post, they are taken directly to the source blog.”

Visit Eval Central>>

September 15, 2010 at 6:18 am 1 comment

New book on PR measurement

I’m just reading my copy of the new publication “A Practitioner’s Guide to Public Relations Research, Measurement and Evaluation”. The book, by Drs. Stacks and Michaelson, is a no-nonsense guide to PR measurement and evaluation that’s well worth a read. I like how they stress the importance of “measurable” objectives and endeavour to move the focus from measuring “outputs” to “outcomes”. They recommend three steps that are essential for evaluating PR and communication programmes, which I summarise as follows:

– Set clear and well-defined research objectives
– Apply a rigorous research design that enables reliable results
– Provide detailed documentation with full transparency

I couldn’t agree more…

You can learn more about the book here>>

Please note, this blog has no commercial interest in this publication, we just believe it’s a good read!

September 9, 2010 at 7:15 pm 1 comment

“Masterclasses” on evaluation

Two interesting “Masterclasses” in the evaluation field are coming up from the European Association of Development Research and Training Institutes (EADI) in Bonn, Germany:

Complexity in Project Management and Evaluation, 16 – 17 September 2010, Bonn
The Masterclass is designed for experienced evaluation practitioners and researchers who are able to bring examples of their own practice and speak about them during the workshop. Participants will be exposed to ideas derived from the complexity sciences, addressing a range of methods and, in particular, the importance of reflection and reflexivity in research. The workshop will be led by Dr. Chris Mowles, who has worked in international development for thirty years and currently combines consultancy practice with directing the innovative doctoral programme at the Complexity Research Group, University of Hertfordshire.
More information>>

Practical Skills for Conducting Impact Evaluations, 4 – 5 November, Bonn
As an interactive workshop, this Masterclass will focus on training participants to conduct impact evaluations in the context of development projects and programmes. Apart from addressing key challenges and methodological approaches, the event will also include discussions based on a comparative analysis of methods used in impact evaluation practice. Several exercises during the workshop will help participants build practical skills for their future work.
More information >>

August 30, 2010 at 12:05 pm Leave a comment

Barcelona Declaration of Measurement Principles – Final Version

The Barcelona Declaration of Measurement Principles for communication evaluation, which I wrote about previously, has now been finalised. Here are the principles:

Seven Principles

1. Importance of Goal Setting and Measurement
2. Measuring the Effect on Outcomes is Preferred to Measuring Outputs
3. The Effect on Business Results Can and Should Be Measured Where Possible
4. Media Measurement Requires Quantity and Quality
5. AVEs are not the Value of Public Relations
6. Social Media Can and Should be Measured
7. Transparency and Replicability are Paramount to Sound Measurement

The explanatory text (pdf) is well worth a read as it explains the thinking behind the above principles. As I pointed out before, two of the seven principles focus on media measurement, indicating our sector’s preoccupation with this area, which I consider a distraction from the real “outcome” evaluation that communications needs.

The International Association for Measurement and Evaluation of Communication (AMEC) has set up taskforces to find answers to two questions related to principles 5 and 6, namely:

1) What are the “validated metrics” to replace AVEs?
2) How do you get started in measuring social media, and what are the definitions of relevant metrics?

I look forward to learning more about their work.

Read about this initiative further>>

August 24, 2010 at 7:56 am 3 comments

2010 Claremont evaluation debate – webcast – 21 August

Claremont Graduate University (USA) is hosting its annual Claremont Evaluation Debate, which can be viewed online free of charge.

“Using Systems Theories to Improve Evaluation Practice: Promise and Pitfalls”
August 21, 2010, 12:15 pm (Pacific time)

Systems theories have been put forth as one of the latest innovations to improve evaluation practice. What are they? How can they be used to improve evaluation practice? When should they be used? When should they be avoided? What are their limitations? Is systems thinking in evaluation the latest short-lived fad or a legitimate breakthrough? The answers to these questions will be discussed and debated head-to-head in Claremont.

The Promise of Systems Theories: Dr. Michael Quinn Patton and Dr. Bob Williams
The Pitfalls of Systems Theories: Dr. Michael Scriven and Dr. Stewart Donaldson

More information>>

August 16, 2010 at 7:45 pm Leave a comment

Outgoing UN evaluation chief urges evaluators to “tell it as it is”

Below is an extract from the farewell speech of Ms. Inga-Britt Ahlenius, the UN’s Under-Secretary-General for Internal Oversight Services. I particularly like her statement that if evaluators do not “tell it as it is”, then no one else will…

“I am aware that some of you are facing challenges to the independence of your work; management in some cases would like to continue to maintain control over the ambit of your work. They want good news, not bad news. So when you have bad news, you learn to tell the bad news in clever ways. Let me tell you a little story.

There is the old story of the Lion King who calls all his subjects to his rather smelly cave and asks them to tell him how his room smells. Nobody dares to do anything, until the dog steps up, sniffs the room and tells the King honestly that it smells. The King devours the dog for his insolence. The monkey then decides to be smarter and tells the King the room smells like roses. The King devours the monkey for his dishonesty and sycophancy. Lastly, with all else in the room trembling with fear, the sly fox steps up and tells the King that he has had a cold for the past few days and cannot smell. The King rewards the fox by making him Prime Minister of his Kingdom.

Now, regardless of the moral of this story – we in this room are NOT to be sly foxes. We are mandated to be dogs! So the question is – how do we survive as dogs when the King asks you if his room smells?

To those of you who are facing hard challenges to your operational independence, and to your professional integrity as evaluators, I would like to remind you of a quote by Dag Hammarskjöld which I now and then have reason to repeat. You will find it engraved in the pavement of Dag Hammarskjöld Plaza at 47th Street and First Avenue –

“Never for the sake of peace and quiet, deny your own experience or convictions”.

Because if you, in your position as the United Nations’ evaluators do not “tell it as it is”, what you believe to be correct, then it is unlikely that anybody else in the UN will. I urge you – do not deny your convictions as evaluators!”

August 12, 2010 at 9:04 am Leave a comment

The theory of change explained…

Using a “theory of change” in evaluation has proven very useful for me: it maps out, from activities through to impact, how a given intervention is expected to bring about change.

For those interested in knowing more, Organizational Research have produced a two-page guide (pdf) to the theory of change: all you need to know, in brief!
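For readers who like to see the idea written out step by step, here is a minimal sketch of a theory of change expressed as a simple results chain, moving from activities through outputs and outcomes to impact, with the assumption that links each step to the next. The example programme and the assumptions are entirely hypothetical, my own illustration rather than anything taken from the guide.

```python
# A minimal, illustrative sketch of a theory of change as a results chain.
# The programme, steps and assumptions below are hypothetical examples only.

results_chain = [
    # (step, description, assumption that must hold for the next step to follow)
    ("Activities", "Run training workshops for local health workers",
     "Workshops reach the intended participants"),
    ("Outputs", "200 health workers trained in new protocols",
     "Trained workers apply the protocols in their clinics"),
    ("Outcomes", "Clinics adopt the new protocols routinely",
     "Adoption is sustained and reaches enough of the population"),
    ("Impact", "Improved community health indicators", None),
]

def print_theory_of_change(chain):
    """Print the chain from activities to impact, showing the assumption
    that connects each step to the next one."""
    for step, description, assumption in chain:
        print(f"{step}: {description}")
        if assumption:
            print(f"  -> assuming: {assumption}")

if __name__ == "__main__":
    print_theory_of_change(results_chain)
```

Writing the chain down like this, with the assumptions made explicit, is precisely what makes a theory of change useful for evaluation: each assumption becomes something you can test.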

August 6, 2010 at 9:44 pm Leave a comment

Measuring long term impact of conferences

We often evaluate conferences with their participants just after the event, measuring mostly reactions and learning, as I’ve written about previously.

Wouldn’t it be more interesting to try to measure the longer-term impact of a conference? This is what the International AIDS Society has done for one of its international conferences, measuring longer-term impact 14 months after the event. You can view the report (pdf) here.

Their overall assessment of impact was as follows:

“AIDS 2008 had a clear impact on delegates’ work and on their organizations, and that the conference influence has extended far beyond those who attended, thanks to networking, collaboration, knowledge sharing and advocacy at all levels.”

July 8, 2010 at 7:52 pm Leave a comment
