Posts filed under ‘Conference / event evaluation’

conference evaluation and network mapping


Often we attend conferences where one of the stated objectives is to “increase/build/create networking”, yet I have always found it odd that there is rarely any attempt to measure whether networking actually took place.

A possible solution is to map networks created by participants at conferences – and compare these networks to those that existed before the conferences.

This is exactly what I did recently in a network mapping study, which you can view here (pdf – 1 MB); the image above is taken from it. At the LIFT conference of 2007, we mapped the networks of 28 participants (out of 450 total participants) before and after the conference. We found some quite surprising results:

  • These 28 participants had considerable networks prior to the conference, reaching some 30% of all participants.
  • These networks grew after the conference – the 28 people were then connected to some 50% of all participants.
  • Based on the sample of 28 participants, most participants doubled their networks at LIFT07 – e.g. if you went to the conference knowing five people, you would likely meet another five there, doubling your network to ten.

Although this is only a mapping of 28 participants, it provides some insight into conferences and how networks develop – it’s also quite interesting that 28 people can reach 50% (225 people) of the total conference participants in this case.
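The arithmetic behind these figures can be sketched in a few lines. This is purely my own illustration of the numbers quoted above (450 participants, 30% reach before, 50% after); the function name is invented for the example.

```python
# Sketch of the network-reach arithmetic reported above.
# Figures (450 participants, 30% -> 50% reach) come from the post;
# the function itself is only illustrative.

def network_growth(total_participants, reach_before, reach_after):
    """Return absolute reach before/after and the aggregate growth factor."""
    before = round(total_participants * reach_before)
    after = round(total_participants * reach_after)
    return before, after, after / before

before, after, factor = network_growth(450, 0.30, 0.50)
print(before, after, round(factor, 2))   # 135 225 1.67
```

Note that the aggregate reach grows by about 1.67×, less than the per-person doubling, presumably because participants’ new contacts overlap with people already known to others in the sample.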

View the full report here (pdf – 1 MB).

If you are after further information on network mapping, I recommend Rick Davies’ webpage on network mapping. Although it focuses on development projects it contains a lot of useful information on network mapping in general.


January 14, 2008 at 8:20 pm 12 comments

Changing behaviour – immediate responses

Adding to what I wrote last week about measuring behaviour changes that result from communication campaigns – and why I recommend considering immediate responses (or “outtakes”) as an alternative to long-term changes – I can see parallels in areas other than campaigns.

As you may know, a favourite of mine is measuring the impact of conferences and meetings. Industry conferences are traditionally sold as great places to learn something and network, network – and network. But when attending such conferences I am always surprised at how organisers, if they measure anything, focus on measuring reactions to the conference, usually in terms of satisfaction. No attempt is made to measure immediate changes in behaviour (such as extending a network), longer-term behaviour, or impact in general.

But it is certainly possible: this diagram (pdf) illustrates what I did to measure immediate and mid-term changes in behaviour following a conference (LIFT). Despite the limitations of the research, which I explain here, I was able to track some responses following the conference that could largely be attributed to participating in it – such as meeting new people or using new social media in their work. One year after the conference, participants also reported actions that they believed were largely influenced by their participation. These included:
– launching a new project
– launching a new service/product
– establishing a new partnership
– initiating a career change
– receiving invitations for speaking engagements

Some of these actions were anticipated by the conference organisers – but many were not. It shows that it can be done and is certainly worth thinking about in conference evaluation.


September 6, 2007 at 1:57 pm Leave a comment

Evaluation of LIFT07 – can we measure the long term impact of conferences?

I’ve just finished an interesting evaluation study on LIFT07, an international conference on emerging technology and communications held in Geneva in February 2007 – you can view the complete evaluation report here (pdf – 339 kb). Our main evaluation tool was a survey of conference attendees (48% of participants completed the survey).

Apart from providing useful feedback that will assist the organisers in improving future conferences, the study set out to determine the longer-term impact of the first LIFT conference (held in February 2006). By surveying attendees who participated in both the 2006 and 2007 conferences, we were able to “track” some key changes in attitudes and behaviours and the extent to which they could be attributed to the LIFT conference. My findings are summarised in this diagram (pdf).

What I found very interesting is that one year after the conference, 28% of attendees (from a 50-person sample) said they had started new activities partly due to LIFT06, such as forming a new partnership or creating a blog. Further, 90% of attendees said that the conference influenced how they find and exchange information.

Of course, we have to recognise the limitations of the study, notably that it is self-reported (not backed up by independent confirmation) and based on a relatively small sample (17.5% – 50 people out of 285 participants). Nevertheless, we can conclude that the conference did have a longer-term impact in quite specific areas for some participants: establishing new contacts, inspiring new ideas, and suggesting new ways to find and exchange information.
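One way to put a number on the small-sample caveat is a standard 95% confidence interval for a proportion, with a finite-population correction since 50 of only 285 attendees were sampled. This calculation is my own illustration and was not part of the original study:

```python
import math

def moe_95(p, n, N):
    """95% margin of error for a sample proportion p with sample size n,
    including a finite-population correction for population size N."""
    se = math.sqrt(p * (1 - p) / n)        # simple-random-sample standard error
    fpc = math.sqrt((N - n) / (N - 1))     # finite-population correction
    return 1.96 * se * fpc

# The 28% "started new activities" figure: 50 sampled out of 285 attendees
print(round(moe_95(0.28, 50, 285), 3))     # 0.113
```

So the 28% estimate carries a margin of roughly ±11 percentage points, which supports reading the result as indicative rather than precise.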


June 19, 2007 at 8:40 pm 3 comments

Workshop participation & short term impact

An interest of mine is looking at the short- and long-term impact of conferences and workshops. A lot of work has been done on evaluating the impact of training, which I have written about before. Basically, we can look at four levels of impact: 1. Reaction, 2. Learning, 3. Behaviour and 4. Results. A lot of conference/workshop evaluation focuses on the “reaction” aspect – what participants liked or preferred about an event.

But it is more interesting to look at the learning, behaviour and – if possible – results aspects. This usually takes time; however, if we are clear about what a workshop/conference is trying to achieve, we can often identify changes in learning and behaviour in the short term.
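The four levels above can be sketched as a simple mapping from level to an example evaluation question; the question wording here is my own illustration, not taken from any evaluation instrument:

```python
# The four levels of impact mentioned above, each paired with an
# example evaluation question (question wording is illustrative only).
IMPACT_LEVELS = {
    1: ("Reaction",  "Did participants find the event worthwhile?"),
    2: ("Learning",  "What new knowledge or skills did they acquire?"),
    3: ("Behaviour", "Are they doing anything differently afterwards?"),
    4: ("Results",   "Did those changes produce measurable outcomes?"),
}

for level, (name, question) in IMPACT_LEVELS.items():
    print(f"{level}. {name}: {question}")
```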

A practical example: when I ran the “Do-It-Yourself Monitoring and Evaluation” workshop (pictured above – that’s David Washburn and myself) at the LIFT07 conference, my main objective was to get people thinking about how they could integrate monitoring and evaluation into their own projects. Using a basic evaluation framework (pdf), groups worked to break projects down into the main steps needed for evaluation.

So was the “learning” aspect successful? I’d like to think so. Quite a few people told me it got them thinking about monitoring and evaluation and what they could do with their own projects. Several participants also blogged about the workshop – an indication of what they took away, and one that crosses into the “behaviour” area: they processed their thoughts and took the action (behaviour) of writing about it.

What’s more, one participant told me how he used the information from the workshop that same week, which supports my idea about the possibility of identifying short-term impact, even in terms of behaviour:

“When we got back from the workshop, I took out the evaluation framework and sat down with my colleagues and planned out what we were going to monitor and evaluate in our major projects, setting down objectives and evaluation indicators. So we can use the framework as a guide in the coming six months.”


February 23, 2007 at 10:03 pm 2 comments

LIFT 07 – evaluation, networks & social media

The LIFT blog beat me to it in announcing that I will be working with the LIFT team to evaluate the 2007 event, taking place in February 2007. LIFT is an international conference held in Geneva that looks at the relationship between technology and society.

My experience evaluating the reactions to and initial impact of LIFT06 is written up in this journal paper (pdf) or directly on the LIFT website.

In 2007, I hope to go further by exploring the impact of social media on the event setting and looking at networks that develop. It should be fun!


November 6, 2006 at 10:40 pm 2 comments

Evaluation of events and conferences

I’ve written in previous posts about my work in evaluating the impact of events. A very interesting paper on this subject, “A Guide to Measuring Event Sponsorship”, has been published by the US-based Institute for Public Relations. The title is misleading, as the paper focuses on how to measure the effectiveness of an event, not on sponsorship evaluation (a separate subject – don’t get me started…).

The guide states:

“There are four central questions to keep in mind concerning event evaluation:
1. How effective was the event? To what extent did the event impact the target public in the desired manner?
2. Did the event change the targeted public in unexpected ways, whether desirable or undesirable?
3. How cost effective was the event?
4. What was learned that will help improve future events?”

The Guide goes further than I have in event evaluation by looking at calculating ROI and at the impact on sales (applicable to commercially focused events). It also confirms my general opinion on event evaluation: we have to go further than simply counting attendees, general reactions and press coverage – we have to look at the impact on attendees’ knowledge, attitudes, behaviours and anticipated behaviour (e.g. intention to purchase a product).
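The ROI calculation the Guide refers to boils down to simple arithmetic. The figures below are invented purely for illustration and do not come from the Guide or from any event I have evaluated:

```python
def event_roi(value_generated, total_cost):
    """Return event ROI as a fraction: (value - cost) / cost."""
    return (value_generated - total_cost) / total_cost

# Hypothetical commercial event: 60,000 in attributable sales value
# against 40,000 in total costs gives a 50% return.
print(event_roi(60_000, 40_000))   # 0.5
```

The hard part in practice is not the formula but attributing a credible monetary value to the event – which is exactly why behavioural measures such as intention to purchase matter.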


May 12, 2006 at 2:36 pm 2 comments

Evaluation of LIFT06: can we measure the impact of conferences?

I’ve just finished a very interesting project – the evaluation of the impact of the LIFT06 conference that took place in Geneva in February 2006. In a true open source spirit, the evaluation report is available for everyone to consult. With this evaluation, we tried to go beyond the standard assessment of reactions to a conference: we looked at changes to knowledge, attitudes and behaviours. Using a triangulation approach combining quantitative and qualitative research methods, I believe we were able to identify the influence of LIFT06 on these variables. We were aware of the limitations of the evaluation, given that it was a one-off exercise based largely on self-assessment of attitudinal and behavioural changes, as I explain in the report.

What sort of changes could we identify?

Changes to awareness and attitudes: through an online survey, the majority of attendees (82%) agreed that LIFT06 provided them with interesting information on the usage of emerging technologies, and 70% agreed that LIFT06 influenced what they thought about the subject. This quote taken from an attendee’s blog illustrates the point:

“And just think; if I had never gone to Lift06 I would not be feeling anything like this strongly about the issue”

Changes to behaviour: evaluations of conferences are rarely able to show a direct relation between the event and changes in attendees’ behaviour. With LIFT06, some attendees indicated a change in behaviour, such as starting a blog or entering a new partnership. Another key objective of LIFT06 was to “connect” people – 94% of attendees reported that they met new people at LIFT06.

April 27, 2006 at 7:16 pm 1 comment
