Posts filed under ‘Conference / event evaluation’

evaluating events and conferences

I’ve written previously about evaluating events and conferences. In a recent evaluation I undertook for the LIFT09 conference, in addition to measuring attitudes and reactions to the conference, we also looked at the online visibility the conference generated. We found three interesting results, based on qualitative and quantitative analysis of blogs and tweets:

– 22% of the blog posts analysed had embedded videos of conference presentations (or linked to them). This is an indication of the importance of the videos in promoting the conference and its themes.

– 32% of the people blogging about the conference did not actually attend it – an indication of the “reach” of the conference beyond the direct participants.

– The number of tweets on the conference peaked sharply during the three days of the conference (on the second day notably) while blog posts, in smaller numbers, continued to be written about the conference weeks later. The graph below illustrates this point:

[Graph: number of tweets and blog posts about the conference over time]

Graph data was generated by http://www.technorati.com and http://www.hashtags.org.
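As a rough illustration of how the counts behind such a graph can be produced, here is a minimal Python sketch that aggregates timestamped posts into daily counts. The dates below are invented for illustration – they are not the actual LIFT09 data:

```python
from collections import Counter
from datetime import date

# Hypothetical timestamps of tweets and blog posts mentioning a conference
# (illustrative only, not the real LIFT09 figures)
tweets = [date(2009, 2, 25), date(2009, 2, 26), date(2009, 2, 26),
          date(2009, 2, 26), date(2009, 2, 27)]
blog_posts = [date(2009, 2, 26), date(2009, 3, 5), date(2009, 3, 20)]

def daily_counts(dates):
    """Aggregate a list of dates into per-day counts, sorted by day."""
    return sorted(Counter(dates).items())

# Tweets cluster on the conference days; blog posts trail off over weeks
print(daily_counts(tweets))
print(daily_counts(blog_posts))
```

The same aggregation applies whether the timestamps come from Technorati, hashtags.org or any other export – only the parsing of the raw data differs.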

The full conference evaluation report can be viewed here (pdf)>>

May 12, 2009 at 7:26 pm Leave a comment

Blog on event evaluation

This is the first blog I’ve seen that focuses 100% on event evaluation: Constellation Communication blog. Many interesting posts on event evaluation and ROI for events.

Similar to what I do with an “event scorecard”, here is an interesting post with visuals on how to represent event evaluation – the ROI measures are particularly useful.

View the blog here >>

Glenn

October 2, 2008 at 7:46 am Leave a comment

Evaluating events and conferences

Like me, you might be surprised to know that businesses spend an estimated 150 billion US dollars per year on organising meetings and events in the US alone.

But in my experience, organisations rarely measure the impact of meetings and events: how did attending a meeting or event change the performance of individuals and of organisations as a whole?

I’ve done some event evaluation projects over the past few years and have developed an “event scorecard”. To summarise and share my experiences, I have just created a fact sheet, “Evaluating events and conferences (pdf)” – good luck with it!

Glenn

(above photo from Lift conference 2007, Geneva, Switzerland – photo by noneck).

September 24, 2008 at 1:17 pm 2 comments

Event scorecard

In the work I do to evaluate conferences and events, I have put together what I believe is a “neat” way of displaying the main results of an evaluation: an event scorecard. For the evaluation of the LIFT conference, which takes place every year in Geneva, Switzerland, the scorecard summarises both qualitative and quantitative results taken from the survey of attendees. Above you can see a snapshot of the scorecard.

As I have evaluated the conference for three years now, we were also able to show some comparative data as you can see here:

If you are interested, you can view the full scorecard by clicking on the thumbnail image below:

And for the really keen, you can read the full LIFT08 evaluation report (pdf).

Greetings from Tashkent, Uzbekistan from where I write this post. I’m here for an evaluation project and off to Bishkek, Kyrgyzstan now.

Glenn

May 11, 2008 at 10:55 am 7 comments

conference evaluation and network mapping

[Network map: LIFT07 participants after the conference]

Often we attend conferences where one of the stated objectives is to “increase/build/create networking”, and I have always found it odd that there is never any attempt to measure whether networking really took place.

A possible solution is to map networks created by participants at conferences – and compare these networks to those that existed before the conferences.

This is exactly what I have done recently in a network mapping study that you can view here (pdf – 1 MB); the above image is taken from it. For the LIFT conference of 2007, we mapped the networks of 28 participants (out of 450 total participants) before and after the conference. We found some quite surprising results:

  • These 28 participants had considerable networks prior to the conference – reaching some 30% of all participants.
  • These networks increased after the conference – the 28 people were then connected to some 50% of all participants.
  • Based on the sample of 28 participants, most participants doubled their networks at LIFT07 – e.g. if you went to the conference knowing five people, you would likely meet another five people at the conference – thus doubling your network to ten.

Although this is only a mapping of 28 participants, it provides some insight into conferences and how networks develop – it’s also quite interesting that, in this case, 28 people can reach 50% (225 people) of the total conference participants.
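To illustrate how a “reach” figure of this kind can be computed, here is a minimal Python sketch using invented contact lists. The participant names and contact sets are hypothetical – the real study sampled 28 of 450 participants – but the calculation is the same: the share of all participants covered by the sampled people plus everyone they report knowing:

```python
# Hypothetical contact lists for a small sample of participants
# (illustrative data; the real LIFT07 study sampled 28 of 450 people)
TOTAL_PARTICIPANTS = 450

contacts_before = {
    "p1": {"a", "b", "c"},
    "p2": {"b", "d"},
    "p3": {"e"},
}
contacts_after = {
    "p1": {"a", "b", "c", "f", "g"},
    "p2": {"b", "d", "h"},
    "p3": {"e", "i"},
}

def reach(contact_map, total):
    """Share of all participants reachable by the sampled group:
    the sampled people themselves plus the union of their contacts."""
    reached = set(contact_map) | set().union(*contact_map.values())
    return len(reached) / total

print(f"reach before: {reach(contacts_before, TOTAL_PARTICIPANTS):.1%}")
print(f"reach after:  {reach(contacts_after, TOTAL_PARTICIPANTS):.1%}")
```

Comparing the before/after figures gives the kind of growth reported in the study; with real survey data the contact sets would come from participants’ own lists of who they knew.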

View the full report here (pdf – 1 MB).

If you are after further information on network mapping, I recommend Rick Davies’ webpage on network mapping. Although it focuses on development projects it contains a lot of useful information on network mapping in general.

Glenn

January 14, 2008 at 8:20 pm 12 comments

Changing behaviour – immediate responses


Adding to what I wrote last week about measuring the behaviour changes that result from communication campaigns – and why I recommend considering immediate responses (or “outtakes”) as an alternative to long-term changes – I can see parallels in areas other than campaigns.

As you may know, a favourite of mine is measuring the impact of conferences and meetings. Industry conferences are traditionally sold as being great places to learn something and network, network – and network. But I’m always surprised when attending such conferences at how organisers, if they measure something, focus on measuring the reactions to the conferences, usually in terms of satisfaction. No attempt is made to measure immediate changes to behaviour (such as extending a network) or longer term behaviour or impact in general.

But it is certainly possible: this diagram (pdf) illustrates what I did to measure immediate and mid-term changes to behaviour following a conference (LIFT). Despite the limitations of the research, as I explain here, I was able to track some responses following the conference that could be largely attributed to participating in it – such as meeting new people or using new social media in their work. One year after the conference, participants also told us about the types of actions that they believed were influenced largely by their participation. Actions included:
– launching a new project
– launching a new service/product
– establishing a new partnership
– initiating a career change
– receiving invitations for speaking engagements

Some of these actions were anticipated by the conference organisers – but many were not. It shows that it can be done and is certainly worth thinking about in conference evaluation.

Glenn

September 6, 2007 at 1:57 pm Leave a comment

Evaluation of LIFT07 – can we measure the long term impact of conferences?

I’ve just finished an interesting evaluation study on LIFT07, an international conference on emerging technology and communications that was held in Geneva in February 2007 – you can view the complete evaluation report here (pdf – 339 kb). Our main evaluation tool was a survey of the conference attendees (48% of participants completed the survey).

Apart from providing useful feedback that will assist the conference organisers in improving future conferences, the study set out to find out the longer-term impact of the first LIFT conference (held in February 2006). By surveying attendees who participated in both the 2006 and 2007 conferences, we were able to “track” some key changes in attitudes and behaviours, and the extent to which they could be attributed to the LIFT conference. My findings are summarised in this diagram (pdf).

What I found very interesting is that one year after the conference, 28% of attendees (of a 50-person sample) said they had started new activities partly due to LIFT06, such as forming a partnership or creating a blog. Further, 90% of attendees said that the conference influenced them in finding and exchanging information.

Of course, we have to recognise the limitations of the study, notably that it is self-reported (and not backed up by independent confirmation) and based on a relatively small sample (17.5% – 50 people out of 285 participants). Nevertheless, we can conclude that the conference did have a longer-term impact, in quite precise areas, on some participants: establishing new contacts, inspiring new ideas and new ways to find and exchange information.

Glenn

June 19, 2007 at 8:40 pm 3 comments

Workshop participation & short term impact

An interest of mine is looking at the short- and long-term impact of conferences and workshops. A lot of work has been done on evaluating the impact of training, which I have written about before. Basically, we can look at four levels of impact: 1. Reaction, 2. Learning, 3. Behaviour and 4. Results. A lot of conference/workshop evaluation focuses on the “reaction” aspect – what did participants like/prefer about an event.

But it is more interesting to look at the learning, behaviour and – if possible – results aspects. This usually takes time; however, if we are clear about what a workshop/conference is trying to achieve, we can often identify changes in learning/behaviour in the short term.

A practical example. When I ran the “Do-It-Yourself Monitoring and Evaluation” workshop at the LIFT07 conference (pictured above – that’s David Washburn and myself at the workshop), my main objective was to get people thinking about how they could integrate monitoring and evaluation into their own projects. Using a basic evaluation framework (pdf), groups worked to break projects down into the main steps needed for evaluation.

So was the “learning” aspect successful? I’d like to think so. Quite a few people commented to me on how it got them thinking about monitoring/evaluation and what they could do with their own projects. Also, the following participants blogged about the workshop – an indication of what they took away from it, and one that also crosses into the “behaviour” area: they processed some thoughts and took the action (behaviour) of writing about it:

Even better, one participant told me how he had used the information from the workshop that same week, which supports my idea about the possibility of identifying short-term impact, even in terms of behaviour:

“When we got back from the workshop, I took out the evaluation framework and sat down with my colleagues and planned out what we were going to monitor and evaluate in our major projects, setting down objectives and evaluation indicators. So we can use the framework as a guide in the coming six months.”

Glenn

February 23, 2007 at 10:03 pm 2 comments

LIFT 07 – evaluation, networks & social media

The LIFT blog beat me to it in announcing that I will be working with the LIFT team to evaluate the 2007 event, which will take place in February. LIFT is an international conference that takes place in Geneva and looks at the relationship between technology and society.

My experience evaluating the reactions to and initial impact of LIFT 06 is written up in this journal paper (pdf), or directly on the LIFT website.

In 2007, I hope to go further by exploring the impact of social media on the event setting and looking at networks that develop. It should be fun!

Glenn

November 6, 2006 at 10:40 pm 2 comments

Evaluation of events and conferences

I’ve written in previous posts about my work in evaluating the impact of events. A very interesting paper on this subject, “A Guide to Measuring Event Sponsorship”, has been published by the US-based Institute for Public Relations. The title is misleading, as the paper focuses on how to measure the effectiveness of an event and not on sponsorship evaluation (a separate subject – don’t get me started on it…).

The guide states:

“There are four central questions to keep in mind concerning event evaluation:
1. How effective was the event? To what extent did the event impact the target public in the desired manner?
2. Did the event change the targeted public in unexpected ways, whether desirable or undesirable?
3. How cost effective was the event?
4. What was learned that will help improve future events?”

The Guide goes further than I have done in event evaluation by looking at calculating ROI and at the impact on sales (applicable for a commercially focused event). It also confirms my general opinion on event evaluation: we have to go further than simply counting attendees, general reactions and press coverage – we have to look at the impact on attendees’ knowledge, attitudes, behaviours and anticipated behaviour (e.g. intention to purchase a product).

Glenn

May 12, 2006 at 2:36 pm 2 comments
