One of the reasons Donald Trump’s win was so shocking is that the pollsters completely failed to predict he’d end up in the White House, much as they did for Brexit and Cameron’s re-election before it.
It’s an important reminder that what people “say” when surveyed is rarely consistent with what they actually “do.” People are poor predictors of their own future behaviour and poor at recalling past behaviour. And the more channels there are for contacting people to ask what they will do, the more the problem is exacerbated.
It’s clear that the model is broken. But why?
Polls rely on models built on the assumption that people will behave much as they have behaved in the past. In the case of the US election, this meant the turnout of white working-class men was underestimated: it was assumed they would behave as they always had. But they didn’t.
Whatever the reason, the miss was so seismic that it has called into question the entire polling sector, its survey methods and the assumptive nature of its reporting.
There used to be a standard set of insight tools: focus groups and in-depth interviews on the qualitative side and face-to-face, phone or mail-based survey research on the quantitative side. The phone surveys worked because almost everyone had a landline phone and answered it.
But in today’s hyper-connected world, such conventional attitudinal methods simply don’t work.
One of the big issues with attitudinal research is that it looks at life through the lens of an informed elite: people who spend their time debating and cross-analysing in capital-city bubbles, while the wider world has neither the time nor the energy for this sort of nonsense.
For most people it’s hard to keep track of complex issues. So rather than ‘think’ their way through life, they ‘sense’ their way through – because instincts are often the best guide, albeit hard to predict.
Traditional research relied on people self-reporting – on their attitudes, their behaviours, and the reasons for those behaviours. But self-reporting is subject to response bias: a general term for a wide range of cognitive biases that influence participants’ responses and call into question the validity of focus groups, questionnaires and surveys.
We shouldn’t build a sense of sentiment remotely through a sterile survey approach. We need to go there, talk to people and get a feel for what matters to them – to sense it first-hand, not just rely on numbers, social listening or distant analysis. Instinct counts.
The trend toward data-driven strategy presents a whole new array of research options including powerful new text analytics.
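As a rough illustration of what such text analytics involve at their simplest, a lexicon-based sentiment count over open-ended survey responses can be sketched in a few lines of Python. The word lists and responses below are invented for illustration, and real tools are far more sophisticated:

```python
# Minimal sketch of lexicon-based text analytics for open-ended
# survey responses. The word lists and example responses are
# illustrative assumptions, not any specific commercial tool.
import re
from collections import Counter

POSITIVE = {"hope", "better", "trust", "strong"}
NEGATIVE = {"worried", "angry", "broken", "ignored"}

def sentiment_score(text):
    """Return (positive_hits, negative_hits) for one free-text response."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    pos = sum(counts[w] for w in POSITIVE)
    neg = sum(counts[w] for w in NEGATIVE)
    return pos, neg

responses = [
    "I'm worried the system is broken and we are ignored.",
    "Things will get better, I trust the plan.",
]

for r in responses:
    print(sentiment_score(r))
```

Even a toy like this surfaces the appeal of the approach: it scores the language respondents actually use, rather than forcing their views through pre-written survey options.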
On the social web, brands and candidates are built and burnt by conversation. We need to go beyond the usual factors to understand linguistic traits: the language consumers choose, and the language they are exposed to, every day.
We need to shift from siloed research techniques to mixed methodologies. We need to harness innovative, technology-driven research platforms including web-based survey platforms, hosted online communities, online dial testing and so on.
The rise of the social web and the decline of traditional advertising vehicles challenge the control and assumptions of conventional research methods. The ad-led, stimulus-response paradigm is being replaced by a horizontal, word-of-mouth, peer-to-peer paradigm. Market research’s basic societal challenge is the disintegration of the mass market-mass advertising model. Today, people are media.
The focus needs to be on “actionable insights”. We have more data on consumers than ever before, and yet we are still starved of a holistic understanding of our audience. We need to shine new light on old methodologies and embrace advances in behavioural psychology and linguistic research techniques.
As Scottish writer Andrew Lang said: “Some people use statistics as a drunk man uses lamp-posts — for support rather than for illumination.”
Alex Van Gestel is CEO of behavioural communications agency Verbalisation. Verbalisation’s RAID (Rapid Audience Insights Diagnostic) model is designed to use multiple sources, cross-referencing questions and opportunities for respondents to use their own language representing their own attitudes.