Posts filed under ‘Web metrics’
The approach taken relates online measurement tools to four levels of assessing the influence of communications on policy (an aim of research communications):
- Management
- Outputs
- Uptake
- Outcomes and impact
The last level, outcomes and impact, is of course the hardest to measure with digital tools. But I think if you have access to your target audiences, this can be done through in-depth interviews or more simply through email surveys asking how they have used the research products – which can then provide an indication of the role those products have played in influencing policy.
Using Google Analytics to track the relative value of your Offer
Some lessons for the communications Evaluation profession
It is some time since I looked at my Google Analytics account. A pity, because it can reveal some dramatic insights into global trends. And the quality and mine-ability of the data is improving month by month.
I wanted to see what was happening in Benchpoint’s main marketplace, which is specialist online surveys of employee opinion in large companies. So I looked up “employee surveys”. I was surprised (and shocked) to see that Google searches for this had declined since their peak in 2004 to virtual insignificance.
This was worrying, because our experience is that the sector is alive and well, with growing competition.
On the whole, we advise against general employee surveys, preferring surveys which gain insight into specific areas.
So I contrasted this with a search for “Employee Engagement”, on its own. The opposite trend! This search term has enjoyed steady growth, with the main interest coming from India, Singapore, South Africa, Malaysia, Canada and the USA, in that order.
“Employee engagement surveys”, which first appeared in Q1 2007, also shows a contrarian trend, with most interest coming from India, Canada, the UK and the USA.
Looking at the wider market, here is the chart for the search term “Surveys” – a steady decline since 2007.
But contrast this with searches for “Survey Monkey”.
Where is all this leading us? Google is remarkably good at recording what’s cool, and what’s not, in great detail and in real time. There are plenty of geeks out there who earn good money doing it for the big international consumer companies. And what it tells us is that, more than ever, positioning is key.
Our own field, “Communications Evaluation”, is fairly uncool. Maybe we need to invent a new sexy descriptor for what we do?
But note, on the chart below, the peaks in the autumn of 2009 and 2010, when the AMEC Measurement Summits were held. Sudden spikes in interest.
This blog and Benchpoint hold the copyright on “Intelligent measurement”, which is holding its own in the visibility and coolness stakes – with this blog giving a boost way back in 2007…
- Get a Google Analytics account and start monitoring the keywords people are using to search for your business activity, and adapt your website accordingly
- As an interest group/profession, we probably need to adopt a different description of what we do if we wish to maintain visibility and influence. Suggestions anyone? Discuss!
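On the first takeaway: once you export keyword-interest data, even a few lines of code can tell you whether a term is rising or falling. Here is a minimal sketch in Python – the CSV data, column names and values are all made up for illustration, not real export figures:

```python
import csv
import io

# Hypothetical weekly relative-interest export for two search terms.
CSV_DATA = """week,employee surveys,employee engagement
2007-01,60,20
2008-01,45,35
2009-01,30,55
2010-01,15,70
"""

def trend_direction(rows, term):
    """Classify a term as 'rising' or 'falling' by comparing first and last values."""
    values = [int(row[term]) for row in rows]
    return "rising" if values[-1] > values[0] else "falling"

rows = list(csv.DictReader(io.StringIO(CSV_DATA)))
for term in ("employee surveys", "employee engagement"):
    print(term, "->", trend_direction(rows, term))
```

A real analysis would of course smooth over more than two data points, but the principle – compare the relative interest in your positioning terms over time – is the same.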
Sorry for such a long post!
When evaluating a communications project, I often consider the web metrics aspect of the project, if a website played an important part in it. Web metrics are statistics generated by tools that measure website traffic, such as how many people visited a web page and where they came from.
Seth Duncan has recently produced for the US-based Institute for PR a very interesting paper on this subject:
The paper focuses on the aspect of referral (e.g. which is the most “efficient” source of traffic for a website) but also contains some intriguing descriptions of advanced statistical methods for web analytics.
Online tools, such as corporate websites, members’ directories or portals, increasingly play an important role in communications strategies. And of course, they are increasingly important to evaluate.
I just concluded an evaluation of an online tool, created to facilitate the exchange of information amongst a specific community. The tool in question, the Central Register of Disaster Management Capacities is managed by the United Nations Office for the Coordination of Humanitarian Affairs.
The methodology I used to evaluate this online tool is interesting, as it combines:
- Content analysis
- Network mapping
- Online survey
- Expert review
- Web metrics
And for once, you can dig into the methodology and findings as the evaluation report is available publicly: View the full report here (pdf) >>
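Of the components listed above, content analysis is perhaps the easiest to illustrate in code. Here is a toy sketch – the responses and the coding frame (category-to-keyword mapping) are entirely invented, and real content analysis would use a properly validated coding frame:

```python
import re
from collections import Counter

# Made-up free-text responses from a hypothetical user survey.
responses = [
    "The register helped us find search and rescue capacity quickly.",
    "We used the register to coordinate with other agencies.",
    "Hard to find the register; coordination improved once we did.",
]

# Hypothetical coding frame: category -> keywords that signal it.
coding_frame = {
    "findability": ["find", "search"],
    "coordination": ["coordinate", "coordination"],
}

counts = Counter()
for text in responses:
    words = re.findall(r"[a-z]+", text.lower())
    for category, keywords in coding_frame.items():
        counts[category] += sum(words.count(k) for k in keywords)

print(dict(counts))  # {'findability': 3, 'coordination': 2}
```

Even a simple count like this, read alongside the web metrics and survey results, helps triangulate what users actually value in a tool.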