by Stephen Tall on July 14, 2013
Earlier this week, I set readers a quiz — The British public, eh? What do they know? — based on research undertaken by Ipsos MORI for the Royal Statistical Society and King’s College London showing the British public is often wrong about what we think we know on a range of current social issues. For instance, we (collectively) massively overestimate the proportion of people in this country who are black or Asian, or who are Muslim, or who are over 65. We also think, again wrongly, that crime is rising, and that more is spent on overseas aid than on pensions.
Over at The Conversation, Bobby Duffy, a senior research fellow at King’s, has outlined the four explanations he thinks do most to account for why the public’s knowledge of facts and figures is so unreliable:
First, there are simple measurement and definitional problems. It’s difficult to get across what can be quite complex and precise issues in simple survey questions.
But probably more importantly, the public are not always thinking about the things we think they are. For example, when we ask people what they were thinking of as benefit fraud when they guessed at its scale, they select items that can’t be counted as actual fraud. In people’s minds, it includes claimants not having paid tax in the past and people having children so they can claim more benefits.
Second, there are a whole range of cognitive errors, simple mistakes we make when answering these types of questions. This includes problems of statistical literacy – for example, we just struggle with very big or very small numbers, and find it hard to distinguish between rates and levels.
But there are also explanations from social psychology on the biases and shortcuts in how we think: for example, we know we’re more likely to focus on and remember negative information.
Third, there is certainly an impact from the media and political discourse. The links are complex and difficult to prove categorically, but the association between attitudes and media coverage is often strong. Of course, the media also reflects our concerns and tastes for types of information: to a large extent we get the media we want. The focus on vivid stories rather than straight facts is because we pay more attention to those vivid stories ourselves (we admit we rely on personal experience and information from those around us more than representative data).
Which leads onto the fourth key explanation – that these misperceptions may be an effect of our concerns rather than a cause. That is, we overestimate partly because we are worried about these things, rather than being worried because we believe we know their full extent. Academics call this “emotional innumeracy”: we’re making a point about what’s worrying us, whether we know it or not.
There is no easy solution to any of this.
Efforts to encourage/shame the media and politicians into using accurate statistics will help. So, too, can ‘citizens’ juries’: “deliberative workshops on tricky policy issues where information is provided, experts give evidence and people have time to reflect on things they don’t normally get the chance to”. Interestingly, this process was used by the Independent Parliamentary Standards Authority (Ipsa) to consult the public on the appropriate level of MPs’ pay. Here’s what they found:
Our work with ComRes in 2012 showed that most members of the public, when initially considering MPs’ pay, did not think it should be increased. In fact, when asked how much MPs should be paid, a plurality pitched it in the £30,000-50,000 range, with the average figure being £49,710. In a citizens’ jury exercise, the participants discuss the issues for several hours and receive more supporting information than in a focus group. This allowed us to see whether attitudes changed with greater understanding of the subject. What we found is that some people did become more inclined towards somewhat higher pay for MPs. The decisive information was seeing what others, whose roles carried significant responsibilities, received.
The problem with ‘citizens’ juries’, of course, is that they’re not scalable. But here Bobby Duffy has an intriguing suggestion:
… it’s obviously not very practical to get the whole population in a workshop for a day or so. However, new communication technology does provide easier ways to do this. It won’t be as cathartic and will only reach a subset, but online mass deliberations by independent organisations (say, the BBC) could play its part in improving our currently badly informed debates.
There’s something in that. Remarkably, the last time this country had a national referendum – on the ‘alternative vote’ – the BBC failed to broadcast a single peak-time debate on the issue, despite the fact that almost 20 million citizens turned out to vote. In the lead-up to the 2015 general election, there is a definite role for broadcasters to use their skills and experience to ensure that, amidst all the heated partisan noise and the temptation to fire up ratings by dumbing down, the public gets the chance to hear the reasoned views of all sides before we make up our minds.