Posts filed under ‘General’

Online Surveys and Best Practices

As I set up and run online surveys using the Benchpoint system (commercial plug), I am always interested to see examples of best practice in online surveying. Sometimes we come across examples of worst practice too. Below I’ve copied in a screenshot of a survey I was asked to complete. It breaks a simple rule of surveying that dates back to phone and street interviewing: personal/demographic information should normally be requested at the end of a survey, as people have been found to be more comfortable answering such questions once they are familiar with the survey theme and, when interviewed in person, with the interviewer. The amount of personal information this survey asks for even before you click “start survey” struck me as excessive, and it was the reason I didn’t complete it. Hopefully, general standards in online surveying will emerge to avoid this type of issue.

Glenn

April 21, 2006 at 12:01 pm

Measurement – The Business case for communication in an organisation

Chris Mykrantz of Watson Wyatt writes about his company’s latest communication ROI study on the Simply-Communicate website.

For years internal communications people have been arguing for a seat at the top table where decisions are taken, and several have made it. But, warns Chris,

“Be careful what you wish for. Senior leadership is now asking what value they’re contributing by sitting there”.

He mentions large companies that are actively looking for a financially based ROI measurement of communication – a statement that may have a few communicators who measure by instinct quaking in their shoes!

One of his key points, with which we heartily agree, is:

“Start at the end and think hard cash – the first step in developing a communication strategy that includes measurement should be to envision the successful outcome. What would be different in your organization if the communication was on target? How would it change the business? How would it change employees? Don't settle for awareness or satisfaction. Find a desired state that you can define in dollar terms and design your strategy around that.”

Read the full article here.

Richard

April 12, 2006 at 4:47 pm

Top Ten Excuses for not Evaluating

This post at the IABC measurement blog caught my attention, as its author asks the question:

“So why don’t we measure more? Is it budget, competencies, time or the risk of accountability?”

People give various reasons for not evaluating, but the most common excuses I’ve heard are the following:

  1. “It’s too expensive”.
With the amount of free advice, excellent guidelines and cheap research solutions available, this excuse no longer holds.
  2. “I don’t know how to”.
    Fair enough, but you can learn a lot yourself without having to engage expensive consultants.
  3. “I’m too busy “doing” to be bothered with measuring”.
Frightening. People love doing things; it’s natural. But sometimes you have to stop and take a step back to see what you have achieved.
  4. “What I’m doing couldn’t possibly be measured”.
Often heard from the Creative Type – people who create their own fonts, too-clever campaigns and beautiful artwork that impresses other Creative Types. But my question is: what did you change?
  5. “I don’t see the value of it”.
    How else can you judge the value of your work if you don’t attempt to analyse and assess it?
“I’m scared of what I will find out”.
    But I think it will be scarier for you if you don’t evaluate and someone else does.
  7. “People are fed up with giving their opinion”.
    I don’t think people are – as I’ve written about before.
  8. “My gut feeling tells me I’m doing a good job”.
There is a certain vogue that says our intuition is often our best guide. But research often brings out issues that were not even on your radar.
  9. “All my work is vetoed by the CEO, if s/he’s happy so am I”.
The CEO sees the organisation through the same rose-coloured glasses as you do. In PR, it’s your public’s perception of your communication that counts.
  10. “You can’t prove anything anyway”.
You can rarely obtain 100% proof that your programme caused the change you see. But what you can do is collect evidence that indicates the role your activity played, as I explained further in this post.

Glenn

April 7, 2006 at 12:20 pm

Evaluation – going beyond your own focus

If your focus is on evaluating PR, training or another business competency, it is sometimes helpful to learn more about evaluation by looking beyond your particular field. Take international aid. I just read a review of a new book, The White Man’s Burden – Why the West’s Efforts to Aid the Rest Have Done So Much Ill and So Little Good, by William Easterly. According to the review, Easterly raises two interesting points about evaluation:

Planners vs. Searchers: he says most aid projects take one of these two approaches. He writes, “A Planner thinks he already knows the answer. A Searcher admits he doesn’t know the answers in advance”. He explains that Searchers treat problem-solving as an incremental discovery process, relying on competition and feedback to figure out what works.

Measuring Success: Easterly argues that aid projects rarely get enough feedback, whether from competition or complaint. Instead of introducing outcome evaluation where results are relatively easy to measure (e.g. public health, school attendance), advocates of aid measure success by how much money rich countries spend. He says this is like reviewing movies based on their budgets.

You can read the full review of Easterly’s book on the IHT website.

These are excellent points that we can apply to evaluation across the board. Evaluators can probably classify many of the projects they have evaluated as being of either a Planner or a Searcher nature. The Searcher approach integrates feedback and evaluation throughout the whole process. Take PR activities: we may not know at the outset which communication works best with our target audience, but by integrating feedback and evaluation we can soon find out.

His analogy about reviewing movies also strikes a chord. How many projects are evaluated on expenditure alone?

Aside from the evaluation aspects, his thoughts on international aid are interesting. I spent my formative years as an aid worker in Africa, Eastern Europe and Asia and a lot of what he says confirms my own experiences and conclusions.

Glenn

March 20, 2006 at 10:27 pm

Favourite Quotes on Evaluation and Measurement

In presentations on evaluation and measurement, I find the use of quotes from experts or well-known personalities assists in clarifying the importance of the field for the listeners. Here are some of my favourite quotes:

“True genius resides in the capacity for evaluation of uncertain, hazardous, and conflicting information” 
Winston Churchill

“Fear cannot be banished, but it can be calm and without panic; it can be mitigated by reason and evaluation”
Vannevar Bush

“The only man who behaves sensibly is my tailor; he takes my measurements anew every time he sees me, while all the rest go on with their old measurements and expect me to fit them”  
George Bernard Shaw 

And a favourite from Monitoring and Evaluation News:

Friend to Groucho Marx: “Life is difficult!” 

Marx to Friend: “Compared to what?”


March 17, 2006 at 4:44 pm

Measuring the “Soft Issues” for Investors

One company that is a good example of “intelligent measurement” is Innovest Strategic Value Advisors, Inc., an investment research and advisory firm (disclosure: there are no links between the authors of this blog and Innovest). The company specializes in analyzing “non-traditional” drivers of risk and shareholder value, including companies’ performance on environmental, social and strategic governance issues.

Innovest’s research is focused on those factors which contribute most heavily to financial performance. Environmental and social performance measures are used as leading indicators for management quality and long-term financial performance, not as commentaries on the intrinsic ethical worth of the companies. At the heart of Innovest’s analytical model is the attempt to balance the level of environmentally and socially driven investment risk with the companies’ managerial and financial capacity to manage that risk successfully and profitably into the future.

Environmental assessment criteria:
In total, the Innovest EcoValue’21™ model synthesizes over 60 data points and performance metrics, grouped under six key value drivers: historical contingent liabilities, financial risk assessment, operating risk exposure, sustainability risk, strategic management capability and sustainable profit opportunities.

Social assessment criteria:
Over 50 individual performance indicators are addressed in Innovest’s IVA™ rating model. The principal value drivers are sustainable governance, stakeholder management, human capital management, products and services, and behaviour in relation to oppressive regimes or exploitative labour markets in emerging markets.

The measurement process is complex, intensive and time-consuming. Once the interview and data-gathering process is complete, each company is rated relative to its industry competitors: companies are scored against Innovest’s performance criteria and given a weighted score as well as a letter grade (AAA, BB, etc.). Each factor has an industry-specific weighting, based in part on a regression-based factor attribution analysis of recent (five-year) stock market performance.
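To make the weighted-score-and-letter-grade step concrete, here is a minimal sketch. The criteria, weights, scores and grade bands below are invented for illustration – they are not Innovest’s actual model, which is proprietary and far more detailed.

```python
# Hypothetical sketch of a weighted-score-plus-letter-grade rating.
# The criteria, industry weights and grade bands are invented; they
# are NOT Innovest's actual model.

# Industry-specific weights for each criterion (sum to 1.0)
industry_weights = {
    "historical_liabilities": 0.15,
    "operating_risk": 0.25,
    "sustainability_risk": 0.20,
    "management_capability": 0.25,
    "profit_opportunities": 0.15,
}

# Grade bands mapping a 0-100 weighted score to a letter grade
grade_bands = [(90, "AAA"), (80, "AA"), (70, "A"),
               (60, "BBB"), (50, "BB"), (0, "B")]

def rate_company(scores):
    """Combine per-criterion scores (0-100) into a weighted score and grade."""
    weighted = sum(industry_weights[c] * s for c, s in scores.items())
    grade = next(g for threshold, g in grade_bands if weighted >= threshold)
    return round(weighted, 1), grade

# An invented company profile, scored 0-100 on each criterion
acme = {
    "historical_liabilities": 70,
    "operating_risk": 85,
    "sustainability_risk": 60,
    "management_capability": 90,
    "profit_opportunities": 75,
}
print(rate_company(acme))  # weighted score and letter grade
```

The point of the sketch is simply that the rating is relative and weighted: change the industry weights and the same raw scores produce a different grade, which is why Innovest derives its weightings per industry rather than applying one universal scheme.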

This approach is intelligent measurement at its finest. The methodology is rigorous, and the outputs have enormous strategic value to investors and to the management of any capital-intensive company trying to understand the correlation between investment in management issues traditionally regarded as “soft” and shareholder value.

Which is why Innovest commands a high price for its reports and is generally regarded as the leader in its field.

Richard

March 4, 2006 at 5:26 pm

Intelligent Measurement – what and how?

Intelligent Measurement is about using innovative methodologies and technologies to measure things that are (a) important to measure, and (b) notoriously difficult to measure – normally because of the intangibility of the subject and the subjectivity of the measurers.

Don’t get me wrong. As a declared enemy of bureaucracy, I have an aversion to all forms of obsessive micro management and control freakery.

But I have seen many organisations blunder along in the dark as they attempt to improve their communications or embark on change management or environmental performance programmes. If only they had some objective measurements at the start of the programme, their strategy would be informed by actualities, not anecdote. They could have measured outcomes, not outputs. They could have used the data to frame a sensible budget, and to measure which parts of the budget were wasted and which returned the investment.

I have been to a number of conferences on PR measurement, notably Katie Paine’s “Measurement Summits”. The cry from the floor is “how can I obtain evidence to justify/keep my job/budget/team?” One solution is measurement. But what? And how?

Over the coming months, I will be posting examples of good “intelligent measurement” on this blog. (Please feel free to post any suggestions.)

Richard

February 28, 2006 at 9:46 pm

Measuring Networks

This is an interesting tool from trackingthethreat.com that provides a graphical overview of the Al Qaeda network. The data is collected from thousands of open-source reports, documents and news stories, which are pieced together to establish the network linkages.

I write about this tool because it’s one of the first I have seen that attempts to measure a network. In communications, it is interesting for an organisation to assess the links between its key stakeholders. The theory is that a stakeholder group has more influence over an organisation if it has multiple links with other stakeholders. The structure of the stakeholder network is a good indicator of where power and influence are centred – and this helps organisations prioritise their communication and relationship-building activities with stakeholders.
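The link-counting idea behind that theory can be sketched in a few lines of code. The stakeholder groups and links below are invented purely for illustration; a real analysis would use a mapped network and richer centrality measures.

```python
# A minimal sketch of locating influence in a stakeholder network by
# counting each group's links (degree centrality). The stakeholder
# names and links are invented for illustration.
from collections import defaultdict

links = [
    ("Employees", "Unions"),
    ("Employees", "Local community"),
    ("Unions", "Media"),
    ("Media", "Regulators"),
    ("Media", "Local community"),
    ("Regulators", "Investors"),
]

# Count how many links each stakeholder group has
degree = defaultdict(int)
for a, b in links:
    degree[a] += 1
    degree[b] += 1

# Rank by link count: on this theory, the best-connected groups are
# where power and influence are centred
for group, n in sorted(degree.items(), key=lambda kv: -kv[1]):
    print(group, n)
```

In this toy map the media group tops the ranking because it connects unions, regulators and the local community – exactly the kind of broker position an organisation would want to identify when prioritising its relationship building.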

Interest in the theory and practice of stakeholder networks is growing. If you are interested, read this article (pdf) by Ann Svendsen of the Collaborative Learning and Innovation Group (Simon Fraser University, Canada), in which she explains how organisations actively assess and work with their stakeholder networks.

I learnt about the trackingthethreat.com network tool from the information aesthetics blog, which looks at novel ways of representing data visually.

Glenn

February 15, 2006 at 9:25 am

Context and Evaluation

In a recent evaluation, I noticed the influence that context can have on the findings. Interviewing and surveying people from different countries, we could see how their responses were shaped by their different frames of reference.

If we are aware of context issues (e.g. the setting for a training seminar, the relationship between the respondents and the commissioning organisation, the political climate), we can better estimate their impact on the findings. It also helps us see how we can apply our findings to other contexts.

Most of the recognised standards on evaluation treat analysing the influence of context as a question of “accuracy”. Consult the Program Evaluation Standards of the Evaluation Center at Western Michigan University if you are interested in learning more about context and accuracy in evaluation.

Glenn

January 24, 2006 at 9:12 pm

A measured approach for 2006

“The only man who behaves sensibly is my tailor; he takes my measurements anew every time he sees me, while all the rest go on with their old measurements and expect me to fit them”

George Bernard Shaw

I take the liberty of quoting George Bernard Shaw for my first post – his quote on measurement appeals to me. To “behave sensibly” we need to “measure”, or else we are working with the “old” that no longer fits. In a nutshell, this sums up my approach to evaluation and measurement, and we hope to bring more ideas, discussions and conversations to this area through this blog.

Glenn   

January 9, 2006 at 7:57 pm
