Network mapping on LinkedIn

[Image: my LinkedIn network map]
The online networking site LinkedIn has introduced an interesting new feature that lets you make a network map of your contacts (mine is shown above). It works by assigning colors based on how the people in your network, such as people you went to school with, friends or colleagues, are interconnected, so the different colors represent your main “groups”.

You can try it out here >>
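For the technically curious, what is going on under the hood is what network analysts call community detection: contacts who are densely connected to one another form a cluster, and each cluster gets its own color. Here is a minimal sketch of the idea in Python using the networkx library, with entirely made-up contact names; LinkedIn’s actual algorithm has not been published.

```python
# A minimal sketch of the idea behind a map like this, assuming the contact
# data below (which is entirely invented); LinkedIn's actual algorithm has
# not been published. Requires: pip install networkx
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Build a graph where nodes are contacts and edges mean "know each other".
G = nx.Graph()
G.add_edges_from([
    ("Ann", "Bob"), ("Bob", "Cara"), ("Ann", "Cara"),  # e.g. colleagues
    ("Dev", "Eli"), ("Eli", "Fay"), ("Dev", "Fay"),    # e.g. schoolmates
    ("Cara", "Dev"),                                   # a bridging contact
])

# Community detection: densely interconnected contacts end up in the same
# group, and each group would get its own color on the map.
for color, group in enumerate(greedy_modularity_communities(G)):
    print(f"color {color}: {sorted(group)}")
```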

February 12, 2011 at 5:45 pm Leave a comment

Measuring success in online communities

At the Lift conference this week in Geneva, I heard a lot of speakers mention the need to measure and evaluate how online tools are being used, for what purpose and with what impact (about time!).

One speaker, Tiffany St James, spoke on “How to encourage involvement in online communities”. The above illustration shows the main aspects of her presentation, in which she suggested some key performance indicators for measuring online communities (a toy calculation of the “outputs” level follows the list below), notably:

Outputs: how many visits, referrals, subscribers, loyalty, web analytics, bounce rates

Outtakes: messages and experience for user satisfaction, measuring change of attitude

Outcomes: action (what do you want the user to do?)
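To make the “outputs” level concrete, here is a toy calculation of two of those indicators, bounce rate and returning-visitor loyalty, from a made-up visit log. Field names and figures are invented for illustration; real web-analytics tools report these out of the box.

```python
# A toy calculation of two "outputs" indicators from an invented visit log.
visits = [
    {"user": "u1", "pages_viewed": 1},  # single-page visit = a bounce
    {"user": "u2", "pages_viewed": 5},
    {"user": "u1", "pages_viewed": 3},  # u1 came back: counts towards loyalty
    {"user": "u3", "pages_viewed": 1},
]

# Bounce rate: share of visits that saw only one page.
bounce_rate = sum(v["pages_viewed"] == 1 for v in visits) / len(visits)

# Loyalty: share of unique users who visited more than once.
users = [v["user"] for v in visits]
loyalty = len({u for u in users if users.count(u) > 1}) / len(set(users))

print(f"bounce rate: {bounce_rate:.0%}")     # 50%
print(f"returning visitors: {loyalty:.0%}")  # 33%
```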

You can view a video of Tiffany’s presentation here>>

(illustration fabulously done by Sabine Soeder of Alchemy).

February 6, 2011 at 8:22 pm 1 comment

Communication evaluation lecture at Bournemouth University, UK

On Tuesday, 15 March 2011, our friends and colleagues Professor Tom Watson and Professor Dr Ansgar Zerfass will give two lectures at Bournemouth University: “The evolution of evaluation – Public Relations’ erratic path to the measurement of effectiveness” and “Corporate communications revisited – How communication drives corporate value and legitimisation”.

The lecture is part of Bournemouth University’s free lecture series, open to everyone, giving you a unique insight into the knowledge and research that they are renowned for. Previous lectures attracted a varied audience, including fellow academics, businesses and members of the local community who learned how BU’s research is helping to change the world for the better.

You are welcome to attend as many lectures as you wish. Register online >>

Lectures will be held in the Executive Business Centre, 89 Holdenhurst Road, Bournemouth BH8 8EB, UK (Lansdowne Campus) with registration and refreshments available from 5pm.

February 2, 2011 at 7:31 pm Leave a comment

Realist evaluation workshop, 29 March 2011, the Netherlands

Here is an interesting workshop on “Realist evaluation”, taking place in Wageningen, the Netherlands on 29 March 2011.  More information from the organisers:

Realist evaluation – understanding how programs work in their context.

Realist evaluation (Pawson and Tilley, 1997) is one type of theory-based evaluation. It aims to explore “what works, for whom, in what contexts, to what extent and how”. It adopts a particular understanding of how programs work, and uses a particular format for program theories to help guide evaluation design, data collection and analysis.

Realist evaluation has a particular focus on understanding the interactions between programs and their contexts and the ways that these influence how programs work. Evaluation expert Dr. Gill Westhorp will discuss the concepts and assumptions that underpin this theory-based evaluation approach. What is it that realist evaluation brings to the table of evaluating development programs? How is it different from existing approaches to evaluation in development? How does it understand, and deal with, complexity? What new insights can help strengthen the utility of evaluation for development?
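For readers unfamiliar with the approach: the “particular format for program theories” mentioned above is usually the context-mechanism-outcome (CMO) configuration from Pawson and Tilley’s work. As a rough sketch, with an entirely invented example program:

```python
# A rough sketch of the realist "program theory" format: in context C,
# mechanism M produces outcome O. The example configuration is invented.
from dataclasses import dataclass

@dataclass
class CMOConfiguration:
    context: str    # for whom, in what circumstances
    mechanism: str  # how the program is reasoned to work
    outcome: str    # what it is then expected to produce

theory = CMOConfiguration(
    context="smallholder farmers with mobile phone access",
    mechanism="market price alerts reduce information asymmetry with traders",
    outcome="farmers negotiate better sale prices",
)
print(theory)
```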

During the morning, Gill will introduce the basic assumptions and key concepts in realist evaluation. She will also briefly demonstrate how these ideas can be built into other evaluation models using two examples. These models – realist action research and realist program logic – are participatory models which were designed for use in settings where limited resources, lack of capacity to collect outcomes data, complex programs, and (sometimes) small participant numbers make evaluation difficult. In the afternoon, the practical implications for evaluation design, data collection and analysis will be discussed. Examples and practical exercises will be included throughout the day.

More information and registration>>

January 31, 2011 at 12:56 pm 3 comments

New US policy on evaluation

USAID, the US government body responsible for foreign aid programs, has issued a new policy on evaluation. According to USAID itself, the new policy “seeks to redress the decline in the quantity and quality of USAID’s recent evaluation practice”. They highlight six key points of this policy:

1. Defining impact evaluation and performance evaluation and requiring at least one performance evaluation for each major program and any untested and innovative interventions, and encouraging impact evaluation for each major development objective in a country program, especially for new or untested approaches and interventions;

2. Calling for evaluation to be integrated into programs when they are designed;

3. Requiring sufficient resources be dedicated to evaluation, estimated at approximately three percent of total program dollars (so, for example, roughly $300,000 of evaluation work on a $10 million program);

4. Requiring that evaluations use methods, whether qualitative or quantitative, that generate the highest quality evidence linked to the evaluation questions and that can reasonably be expected to be reproducible, yielding similar findings if applied by a different team of qualified evaluators;

5. Building local capacity by including local evaluators on evaluation teams and supporting partner government and civil society capacity to undertake evaluations; and

6. Insisting on transparency of findings with the presumption of full and active disclosure barring principled and rare exceptions.

View the new policy here (pdf) >>

January 28, 2011 at 4:19 pm 2 comments

Data visualisation – the many possibilities

I am always interested in learning about different ways to represent data visually.

Well, here is something that will fascinate you if you are also interested in the many possibilities of displaying data.

Visual-literacy.org have produced a fantastic “Periodic Table of Visualization Methods” (reduced version shown above). Inspired by the standard chemistry periodic table, they have listed virtually every possible type of data visualization and categorised them. The only type I can see missing is the “word cloud”.

View the “Periodic Table of Visualization Methods” >>
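Speaking of word clouds, they are easy enough to generate yourself. Here is a quick sketch using the third-party wordcloud Python package; the sample text is invented for illustration.

```python
# A quick sketch of the missing entry: generating a word cloud, where word
# size reflects frequency. Uses the third-party wordcloud package
# (pip install wordcloud); the sample text is invented.
from wordcloud import WordCloud

text = (
    "evaluation monitoring impact outcomes data data data "
    "visualisation indicators evidence learning evaluation"
)

cloud = WordCloud(width=800, height=400, background_color="white").generate(text)
cloud.to_file("word_cloud.png")  # writes the image to disk
```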

January 24, 2011 at 11:05 am Leave a comment

Online campaigning handbook

Here is a newish (well, I just discovered it...) online campaigning handbook (pdf) from Publiczone.

Point 10 of the handbook “Keeping track of what you are doing” focuses on monitoring and evaluating online campaigns. Here is an extract of what they recommend:

Effective monitoring and evaluation can make the difference between an average and an amazing campaign. Monitor and evaluate as you go along and you’ll keep finding new opportunities to optimise your campaigning…The trick is to design your evaluation before you start, paying close attention to how you are going to collect data. Too often, charities leave evaluation to the end, only to discover they can only form a patchy picture of their campaign due to an absence of data.
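One practical way to “design your evaluation before you start” is to write the data-collection plan down, before launch, as something the whole team can see. A minimal sketch, with entirely invented indicators and sources:

```python
# A minimal sketch of designing the evaluation before launch: decide which
# indicators to track, where each comes from, and how often to review them.
# All indicators and sources here are invented examples.
evaluation_plan = [
    {"indicator": "petition signatures", "source": "campaign site database", "review": "weekly"},
    {"indicator": "email open rate",     "source": "mailing list tool",      "review": "per send"},
    {"indicator": "press mentions",      "source": "media monitoring",       "review": "monthly"},
]

for item in evaluation_plan:
    print(f"{item['indicator']:20} from {item['source']:22} (reviewed {item['review']})")
```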

View the manual (pdf)>>

January 18, 2011 at 8:51 am Leave a comment

Seminar with Dr. Patton: ‘Developmental evaluation – new kid on the evaluation block’, 29 March 2011, the Netherlands

Here is information on an upcoming seminar with Dr Michael Quinn Patton – I’ve just bought his new book, which is a very good read:

Date: 29th March 2011 in the Netherlands: ‘Developmental evaluation – new kid on the evaluation block’.

Description: Developmental evaluation is based on insights from complex dynamic systems, uncertainty, nonlinearity, and emergence. World-renowned, award-winning evaluation expert Dr. Michael Quinn Patton will discuss the developmental evaluation framework as detailed in his book ‘Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use’. Patton will illustrate how developmental evaluation can be used for a range of purposes: ongoing program development; adapting effective principles of practice to local contexts; generating innovations and taking them to scale; and facilitating rapid response in crisis situations.

During the morning, Dr. Patton will explain developmental evaluation and illustrate it with many examples from his own experience. In the afternoon, participants will debate the practical application of developmental evaluation in development, based on participants’ existing evaluation questions.

For more info and registration: http://www.cdi.wur.nl/UK/newsagenda/agenda/DevelopmentalEvaluation_MichaelPatton

January 13, 2011 at 7:01 am Leave a comment

Online course: Participatory Monitoring and Evaluation

Here is an interesting online course on Participatory Monitoring and Evaluation from the International Institute for Development and Colorado State University:

Description: This course stresses participatory methods in monitoring and evaluation for community development, where multiple stakeholders are involved in the process of planning, collecting, interpreting, communicating, and using information. This approach emphasizes a regular monitoring process that leads to continuous improvements. The course uses a case study and team discussions to illustrate the participatory monitoring and evaluation process.

The next course starts on 18 March 2011. More information >>

January 10, 2011 at 12:43 pm Leave a comment

A Guide to Actionable Measurement

The Bill & Melinda Gates Foundation has produced a new publication “A Guide to Actionable Measurement” (pdf).

To paraphrase, what they mean is monitoring and evaluation activity whose findings can be used and acted upon.

Following are some excerpts from the guide that are well worth considering:

7 points on what actionable measurement is:

1. Consider measurement needs during strategy development and review
2. Prioritize intended audiences
3. Do not privilege a particular evaluation design or method
4. Focus on a limited set of clearly articulated questions
5. Align results across strategy, initiatives, and grants
6. Obtain information needed to inform decisions in a timely way
7. Allow time for reflection and the development of insight

4 points on evaluation at the strategy level:

1. Measure outcomes more frequently than impact
2. Measure for contribution, not attribution
3. Harmonize and collaborate
4. Limit the tracking of inputs, activities, and outputs at the strategy level

View “A Guide to Actionable Measurement” (pdf)>>

January 6, 2011 at 4:24 pm Leave a comment
