Posts filed under ‘Evaluation reporting’

Why are evaluation results not used…?

I’ve written previously on the issue of how to make sure that evaluation results are used (or at least considered…). Here is a new publication, Making Evaluations Matter: A Practical Guide for Evaluators (pdf), from the Centre for Development Innovation that goes into considerable depth on this issue.

They identify four general reasons why evaluation results are often not used – evaluations that:

  • Fail to focus on intended use by intended users and are not designed to fit the context and situation
  • Do not focus on the most important issues – resulting in low relevance
  • Are poorly understood by stakeholders
  • Fail to keep stakeholders informed and involved during the process and when design alterations are necessary

I think the first and last reasons are particularly pertinent. We often don’t have enough insight into how evaluation results will be used – and we also fail to inform and involve stakeholders during the evaluation itself.

Read the publication here (pdf)>>

May 24, 2011 at 12:34 pm Leave a comment

How do evaluation results influence policy?

Here is an interesting paper from the International Initiative for Impact Evaluation that focuses on how the results of impact evaluations influence policy:

“Sound expectations: from impact evaluations to policy change” (pdf)

A main conclusion of the paper is as follows:

“The paper concludes that, ultimately, the fulfillment of policy change based on the results of impact evaluations is determined by the interplay of the policy influence objectives with the factors that affect the supply and demand of research in the policymaking process.”

View the paper (pdf) here>>

May 3, 2011 at 8:14 am Leave a comment

Presenting evaluation results in photostories

I am always interested in new ways to present evaluation results.

Here is a very engaging and accessible format to present evaluation results – photostories.

This photostory (pdf) tells the story of an evaluation of a reconciliation programme in Kenya.

December 10, 2010 at 10:11 pm 1 comment

Presenting evaluation results – interactive mapping

One of the challenges faced in evaluation is presenting evaluation findings in a way that facilitates their use, as I’ve written about before.

Now here’s an interesting idea – presenting evaluation results in an interactive map. This example comes from the monitoring, evaluation and communications of an agriculture development program in Afghanistan.

View the interactive map>>

Interactive map produced by Jasha Levenson of Cartametrix.
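For anyone curious to experiment with this format, here is a minimal sketch of how such a map could be built, assuming Python and the open-source folium library (the sites and findings below are invented for illustration – this is not how the Cartametrix map was produced):

    # Minimal sketch: plotting evaluation findings on an interactive map.
    # Assumes Python with the folium library installed (pip install folium).
    # The sites and findings below are invented for illustration.
    import folium

    # Centre the map roughly on Afghanistan
    fmap = folium.Map(location=[34.5, 66.0], zoom_start=6)

    # Hypothetical programme sites, each with a one-line finding
    sites = [
        ("Herat", 34.35, 62.20, "Crop yields up against the baseline"),
        ("Kabul", 34.53, 69.17, "Farmer training targets met"),
        ("Kunduz", 36.73, 68.87, "Irrigation component delayed"),
    ]

    for name, lat, lon, finding in sites:
        folium.Marker(location=[lat, lon], popup=f"{name}: {finding}").add_to(fmap)

    # Writes a self-contained HTML file that opens in any web browser
    fmap.save("evaluation_map.html")

Each marker opens a pop-up with the finding for that site, so readers can explore the results geographically rather than scrolling through a report.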

January 21, 2010 at 1:41 pm Leave a comment

Summarizing evaluation reports

As I’ve written about previously, evaluation reports are notoriously under-read and underutilized. Aside from the executive summary, evaluators need to find ways of presenting their key findings in a summarized format that makes them attractive to their publics.

Aside from the predictable PowerPoint summary (which can still serve a purpose), some of the techniques I have used – and that were well received by publics – are as follows:

Multimedia video: using interviews, graphs and quotes in a video to bring the evaluation results “to life” (see this post for an example)

Scorecard or “snapshot”: highlighting the key findings graphically on one page.

Summary sheet: summarizing the main findings, conclusions and recommendations in a fact sheet of 2-4 pages. See this example: Summary Sheet (pdf)

Findings table: summarizing the main findings in a table, particularly useful where the evaluation is responding to pre-set objectives and indicators, as sketched below.
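To give an idea of the format, here is a hypothetical sketch of such a findings table (the objectives, indicators and findings are invented for illustration):

    Objective              Indicator                   Finding
    ---------------------  --------------------------  --------------------------
    Raise awareness        % recalling key messages    62% recall (target: 50%)
    Increase engagement    Enquiries received          Doubled over the period
    Build partnerships     New partners signed up      3 of 5 target reached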

I’m always interested to learn of new methods to summarize evaluation findings, so if you have any more ideas, please share them!

November 25, 2009 at 12:57 pm 1 comment

Workshop on communications evaluation

I recently conducted a one-day training workshop for the staff of Gellis Communications on communications evaluation. We looked at several aspects including:

  • How to evaluate communication programmes, products and campaigns;
  • How to use the “theory of change” concept;
  • Methods specific to communication evaluation including expert reviews, network mapping and tracking mechanisms;
  • Options for reporting evaluation findings;
  • Case studies and examples on all of the above.

Gellis Communications and I are happy to share the presentation slides used during the workshop (these were combined with practical exercises – write to me if you would like copies).

November 12, 2009 at 9:17 pm 2 comments

Presenting evaluation results in multimedia video

As I’ve written about before, the way in which we present evaluation findings – usually in a long, indigestible report – certainly has its limitations. For some time I’ve been thinking that, with the developments in multimedia, there must be better ways than the written document to communicate evaluation findings – and here it is! We’ve just completed a multimedia video report on the evaluation of the LIFT France conference:

This is certainly the way forward. Thanks to Patricia (concept & interviews), Thierry (filming & production), Benchpoint (survey) and Yona (graphics).

October 2, 2009 at 2:31 pm 5 comments

What to avoid when writing evaluation reports

I’ve written previously about what is recommended when putting together a *good* evaluation report.

I came across an interesting fact sheet from the Bruner Foundation on “Using evaluation findings (pdf)”. On page three the authors list eight points to avoid when writing evaluation reports, summarised as follows:

1. Avoid including response rates and problems with your methodology as part of your findings.

2. Avoid reporting both numbers and percentages unless one is needed to make the other clear.

3. Avoid listing, in a sentence or a table, all of the response choices for every question on a survey or record review protocol.

4. Avoid reporting your results with excessive precision.

5. Avoid feeling compelled to keep your results in the same order as they appeared on the survey or the interview protocol.

6. Avoid compartmentalizing your results.

7. Avoid feeling compelled to use all of the information you collected.

8. Avoid including any action steps or conclusions that are not clearly developed from your findings.

View the fact sheet (pdf)

March 24, 2009 at 10:10 am 2 comments

95 theses on evaluation

Disturbed by the state of affairs in evaluation, Professor Cronbach and colleagues wrote 95 theses on reform in evaluation (inspired by Martin Luther’s 95 theses of 1517). They speak of the need for:

“A thoroughgoing transformation. Its priests and patrons have sought from evaluation what it cannot, probably should not, give.”

Although written 28 years ago, the 95 theses (pdf) make many pertinent points that are still valid today.

Here are several favourites that have stood the test of time (no. 75 is my favourite):

9. Commissioners of evaluations complain that the messages from evaluations are not useful, while evaluators complain that the messages are not used.

35. “Evaluate this program” is often a vague charge because a program or a system frequently has no clear boundaries.

49. Communication overload is a common fault; many an evaluation is reported with self-defeating thoroughness.

75. Though the information from an evaluation is typically not used at a foreseeable moment to make a foreseen choice, in many evaluations a deadline set at the start of the study dominates the effort.

95. Scientific quality is not the principal standard; an evaluation should aim to be comprehensible, correct and complete, and credible to partisans on all sides.

Read the full 95 theses (pdf) – despite the poor quality of the copy, it’s well worth a read. The 95 theses originally appeared in the book “Toward Reform of Program Evaluation“.

Glenn

November 19, 2008 at 8:19 am Leave a comment

Getting the final evaluation report right / write

For many evaluation projects, an important “deliverable” is the final evaluation report, which contains the findings, conclusions and recommendations of the evaluation. Having been through many evaluations, as part of a team or as an individual, I am surprised at how often this important step gets neglected or simply messed up. Here are some recommendations for putting together a final evaluation report:

  • Link the findings to the original evaluation questions: Not my own idea, but something I’ve seen others do well – structure the findings of the evaluation around the original questions from the brief that defined the evaluation. In this way, people reading the report can make the connection between the questions asked and what was found out.
  • Summarise the key findings in one diagram or table: Aside from reading the executive summary, people often appreciate grasping the key results in one view. Without oversimplifying the findings, I find it useful to summarise the key findings visually. You can see an example of this idea (called a “snapshot”) on page five of this evaluation report (pdf).
  • Separate the recommendations from the findings: Often you see recommendations spread throughout the main body of the report. I find this confusing and believe it is easier to digest the recommendations when they come after the findings (while still making clear reference to the findings).
  • Make the executive summary a summary: An executive summary should be just that – a summary. I’m surprised at how many reports include new information in their executive summaries that is not found elsewhere in the report. I recommend summarising the main findings and touching on the recommendations if space allows.
  • Include all the details for the really interested and pedantic: A small number of your readers will love to look further into the details – to read the thousands of responses to the open questions, study the way the sample was selected, and so on. For these readers, I recommend including the details of the evaluation as annexes. These details – the survey questions, interview guidelines, description of the methodology, further analysis of demographics, existing research consulted, etc. – will only strengthen your report and answer questions for a select group of readers.

Related to this topic, I’ve also written previously about how to ensure that your results are used and how to present monitoring and evaluation results.

And if you want to read further, here are some very comprehensive guidelines from the World Bank on Presenting Results (pdf).

Glenn

February 18, 2008 at 9:52 pm 2 comments
