
Ashdown was reacting to the disparity between the polls which asked people how they planned to vote, and those which asked them how they actually had voted.
According to pre-election polls, Labour and the Conservatives would receive roughly equal shares of the vote. The exit polls gave a shock result, showing (correctly, as it turned out) that there would be a Conservative majority.
It is further proof that politicians often fail to understand the widening gap between what people say in research and what they do. It's a gap that we in marketing often get wrong too.
It's not always just that people aren't being honest in research polls and focus groups; it's that people don't really know what they're going to do, and aren't always sure what they've already done.
That’s why the worlds of politics and marketing need to adopt real-world research approaches if they want to really understand what people are likely to do.
Question everything
Real-world research is based on the belief that behavioural intent and reported behaviours are always likely to be suspect.
This has been proven in our own brand research, where over 70% of people who stated their affinity for and intention to fly British Airways were actually flying EasyJet and over 60% of people who stated they didn’t and wouldn’t eat at McDonald’s proved themselves wrong when they kept their own behavioural diaries.
Like some politicians, we show a marked difference between what we say we do and what we actually get up to after a few drinks on a Saturday night.
To get the research results right we need to adopt different approaches and tools: behavioural economics, social media listening, building digital communities, using behavioural diaries, observing actual behaviours and potentially even getting under the skin of our behavioural triggers by using neuroscience.
Find likely behaviours
A different approach that could have predicted actual voting behaviour would have started by recruiting a broad range of people based on the political sentiments they expressed on social media, and giving them tasks that could uncover their likely behaviours.
This could even have gone as far as re-creating that moment of truth: setting up a mock polling station where people were asked to vote and report back on their actual behaviours and the tensions and conflicts they felt when they had to place an ‘X’ in a box.
While politicians would naturally shy away from what could be reported as manipulative techniques, a neuroscience-focused approach to testing how people really responded to the different parties, their messaging and their leaders could have told them what was really going on versus what people were telling them.
The neuroscience approach is probably on the wrong side of what we are comfortable with. Yet our government has been happy to embrace the tools and techniques of behavioural economics, and that might have helped Labour in how it researched and re-framed its challenge – how do we make a change in behaviour easier for people when we know that people aren’t predisposed to change and will even stick with a government that they don’t particularly like?
I’m looking forward to seeing the conclusion of this story, now that a new government is on the way. It’s the ultimate example of the truth that real-world research exposes, that gap between what we say and what we do – will a politician who has told the nation on live TV that he will eat his own hat if proven wrong actually do it?