That’s the question the Royal Statistical Society and King’s College London set out to answer with the help of polling firm Ipsos MORI. They asked the British public a range of questions on current social issues, and found that the public mostly gave answers that were factually wrong. Not just fractionally out: a long way out.
See how you do on the 10 most common misperceptions held by the British public…
Teenage pregnancy:
What proportion of girls under 16 do you think become pregnant each year?
a) 0.6%
b) 3.6%
c) 6.3%
Crime:
Is violent crime rising or falling?
a) Falling
b) Rising
c) Neither
Welfare:
How much do you think the UK spends (£bn) on i) pensions, and ii) job-seeker’s allowance?
a) £74.2bn pensions and £4.9bn JSA
b) £47.2bn and £79.4bn
c) £7.9bn and £42.7bn
Benefit fraud:
What proportion of what is spent on benefits payments is claimed fraudulently?
a) 0.7%
b) 5.7%
c) 10.7%
Overseas aid:
What proportion of total public expenditure was spent on overseas aid in 2011-12?
a) 1.1%
b) 3.1%
c) 6.1%
Religion:
What is the proportion of people in England and Wales who say they are i) Muslims, and ii) Christians?
a) 5% Muslims, and 59% Christians
b) 15%, and 35%
c) 25%, and 45%
Immigration and ethnicity:
What proportion of the population is black or Asian?
a) 11%
b) 17%
c) 26%
Age:
What proportion of the population do you think is aged 65 or over?
a) 16%
b) 23%
c) 32%
Welfare bill:
How much public money is it thought will be saved by i) capping benefits at £26,000 per household, and ii) raising the pension age to 66 for both men and women?
a) £290m benefits cap, and £5bn pensions
b) £1.7bn, and £290m
c) £3.2bn, and £1.7bn
Voting:
What proportion of eligible voters turned out to vote at the last general election?
a) 65%
b) 59%
c) 53%
How do you reckon you did?
I’ve posted the answers below, or alternatively you can read the RSS’s press release here:
‘Our data poses real challenges for policymakers,’ said Hetan Shah, executive director of the Royal Statistical Society. ‘How can you develop good policy when public perceptions can be so out of kilter with the evidence? We need to see three things happen. Firstly, politicians need to be better at talking about the real state of affairs of the country, rather than spinning the numbers. Secondly, the media has to try and genuinely illuminate issues, rather than use statistics to sensationalise. And finally, we need better teaching of statistical literacy in schools, so that people get more comfortable in understanding evidence. Our getstats campaign is trying to create change at all of these levels.’
You want the answers?
Then here they are…
If you answered all a) then you got 100%.
Here’s the full skinny:
Teenage pregnancy:
a) 0.6%
“on average, we think teenage pregnancy is 25 times higher than official estimates: we think that 15% of girls under 16 get pregnant each year, when official figures suggest it is around 0.6%.”
Crime:
a) Falling
“51% think violent crime is rising, when it has fallen from almost 2.5 million incidents in 2006/07 to under 2 million in 2012. 58% do not believe that crime is falling, when the Crime Survey for England and Wales shows that incidents of crime were 19% lower in 2012 than in 2006/07 and 53% lower than in 1995.”
Welfare:
a) £74.2bn pensions and £4.9bn JSA
“29% of people think we spend more on JSA than pensions, when in fact we spend 15 times more on pensions (£4.9bn vs £74.2bn).”
Benefit fraud:
a) 0.7%
“people estimate that 34 times more benefit money is claimed fraudulently than official estimates: the public think that £24 out of every £100 spent on benefits is claimed fraudulently, compared with official estimates of £0.70 per £100.”
Overseas aid:
a) 1.1%
“26% of people think foreign aid is one of the top 2-3 items government spends most money on, when it actually made up 1.1% of expenditure (£7.9bn) in the 2011/12 financial year. More people select this as a top item of expenditure than pensions (which cost nearly ten times as much, £74bn) and education in the UK (£51.5bn).”
Religion:
a) 5% Muslims, and 59% Christians
“we greatly overestimate the proportion of the population who are Muslims: on average we say 24%, compared with 5% in England and Wales. And we underestimate the proportion of Christians: we estimate 34% on average, compared with the actual proportion of 59% in England and Wales.”
Immigration and ethnicity:
a) 11%
“the average estimate is that black and Asian people make up 30% of the population, when it is actually 11% (or 14% if we include mixed and other non-white ethnic groups). The public think that 31% of the population are immigrants, when the official figures are 13%. Even estimates that attempt to account for illegal immigration suggest a figure closer to 15%.”
Age:
a) 16%
“we think the population is much older than it actually is – the average estimate is that 36% of the population are 65+, when only 16% are.”
Welfare bill:
a) £290m benefits cap, and £5bn pensions
“people are most likely to think that capping benefits at £26,000 per household will save most money from a list provided (33% pick this option), over twice the level that select raising the pension age to 66 for both men and women or stopping child benefit when someone in the household earns £50k+. In fact, capping household benefits is estimated to save £290m, compared with £5bn for raising the pension age and £1.7bn for stopping child benefit for wealthier households.”
Voting:
a) 65%
“we underestimate the proportion of people who voted in the last general election – our average guess is 43%, when 65% actually did.”
* Stephen was Editor (and Co-Editor) of Liberal Democrat Voice from 2007 to 2015, and writes at The Collected Stephen Tall.
27 Comments
“we underestimate the proportion of people who voted in the last general election – our average guess is 43%, when 65% actually did.”
I’m wondering how this is possible, if the options offered are 65%, 59% and 53% …
Is there a prize, other than a feeling of self-satisfaction that verges on smugness, for getting all 10 right?
Well, very interesting. The one question I got “wrong” was one in which I am a statistic, ie the proportion of over 65s!
Sid, we assume that background polling was carried out in order to set the multiple choice figures. Is that correct, Stephen?
@ Sid – the options are ones I’ve made up for the purposes of this post. Not seen the full data set yet to know how Ipsos MORI asked the Qs.
@ ATF – self-satisfaction is its own reward.
I hope, by the way, that it was your mistake, Stephen, in quoting all the correct answers as the first option (a)? If it wasn’t, it would fail one of the first criteria for a properly designed multiple choice questionnaire! I was beginning to get a bad feeling about it as I did it!
Stephen, sorry, your post and mine crossed.
@ Tim13 – I made ’em all (a) as a little extra test of resilience for those who thought it too easy…
This is good. Although after a while it becomes clear that the answer is A to everything!
Only one I don’t much like is the Benefit fraud one. That’s impossible to know, surely it’s actually the % of people that are caught!?
These are difficult questions anyway. I have great respect for the British public and want to emphasise that I don’t agree with any notion that their intelligence is inferior to that of political activists.
I got 10 out of 10, partly because I knew most of the answers, but also because it became blindingly obvious that all the answers were (a)
“I made ‘em all (a) as a little extra test of resilience for those who thought it too easy…”
Er, right. It was easy enough, but it was made even easier by making them all (a)
The trouble is that if you hear a number being reported, it is probably either a politician cherry-picking some fact, or a journalist rehashing some PR-driven marketing survey. It is much safer for Joe Public to be unshakeable in the face of such “evidence” than it would be to take such things at face value. Unfortunately, when solid evidence does occasionally get reported it gets the same sceptical treatment.
Another feature may be the instrumental approach to surveys by the surveyed. If you want to boycott Nescafe in principle, but not in reality, the least you can do is tell surveyors that you are boycotting them and why. It’s probably more effective than an actual boycott. Similarly, if you want politicians to focus more on crime, say you think crime is going up. If indeed you suspect that only 5% of your taxes are spent on public services, and 95% on duck houses, then it is reasonable to say to the polling people that everything is getting worse and needs more money spent on it, and taxes should be cut too.
Also, in some important sense, many of these numbers don’t matter. If at x%, crime is too high, that doesn’t mean that at some lower y%, that’s all right then. Whatever the crime rate is, we want it to be lower. “High” and “going up” can perhaps be thought of interchangeably in imprecise everyday parlance, exactly as “debt” and “deficit” are. It grievously offends the mathematicians among us, but we’re a minority.
While I’m not surprised at the level of ignorance, some of the findings are still baffling. There are clearly many people who think that there are over 20 million pensioners but that more is paid out in Job Seeker’s Allowance than in pensions.
Does the figure for pregnancies in under-16s include all under-16s (i.e. including those too young to get pregnant) or is there some arbitrary start age (e.g. 11)? I suspect part of the problem (with this and some other findings) is a woeful lack of understanding of percentages. If asked to give an answer in the form “one in x”, would people really have said that they thought one in six under-16s got pregnant in any given year?
@Joe Otten
I’m sure there is more than a little truth in your theory. However, it works both ways. The government has evidence to show that people think that cutting benefits will save a lot more money than is actually the case. Therefore, they cut benefits thinking that it will be popular. The public see the government taking a lot of time and effort to cut benefits so they think it must be saving a lot of money to make that effort worthwhile.
Despite all the answers being a), I knew them anyway as I do look at the factchecks done by Channel 4 and, sometimes, The Grauniad
It is interesting, is it not, that all the errors are in favour of the direction of travel of the Coalition – makes you think, doesn’t it?
I see again today that the Coalition has been playing fast and loose with the numbers – to be accused of this by the Daily Mail is surely an embarrassment – come on, LD, sort it out!
http://politicalscrapbook.net/2013/07/daily-mail-blame-tory-hq-for-giving-them-misleading-benefits-figures/
Stephen – good posting.
I find it all the more interesting having read Daniel Kahneman’s book “Thinking, Fast and Slow”.
Posted too quick!
What would be interesting is for the RSS to rerun the survey using numbers rather than percentages and comparing the results.
Just to add to my earlier comment having now looked at the research. “Under-16s” refers to girls aged 13-15 at the time of conception. Whether the people being surveyed were told that is not clear.
Simon, perhaps, though it is difficult to judge the amount of effort that goes into a proposal – it’s not closely related to the amount of news or controversy that it generates.
One other probable factor behind these results.
If you underestimate the size of, say, benefit fraud, you are naive. If you overestimate the significance of a problem, you are cautious and prudent. If somebody is asking you to estimate the scale of a problem, it is likely to be a significant problem, or they wouldn’t bother asking you about it. These are all good psychological reasons for guessing on the high side when you are asked to guess.
Joe Otten
Why try and invent a psychological reading of these results for which you have no evidence? I think ‘probable’ is a bit strong – a link to evidence for this, please?
Perhaps a simpler explanation is that the media and certain political parties are banging on so much about benefit fraud or JSA, for example, that no one can believe the figures are so low, given that cutting them is supposed to save the economy!
It is the same smoke and mirrors that governs the ‘Laffer Curve’ and the perceived benefit of cutting the top rate of tax – the media are united in support, so there is no counter-argument being heard
@Joe & bcrombie
Your postings remind me of an example Tom Peters used (showing my age here!): children from different backgrounds were asked to draw a ‘dime’ on a blank piece of paper from memory; children from poorer backgrounds tended to draw it much larger than it really was, whereas children from wealthier backgrounds tended to make it smaller…
Roland
errr no.
I look at the numbers and know what is true.
The people who are asked will surely be influenced by the misinformation fed to them – continued use of the 200 million figure when talking about welfare cuts, or the focus on benefit fraud.
It is strange that all the errors seem to be in favour of the Coalition’s narrative and that supported by the right wing tax-dodging media barons
@tom
Actually the benefit fraud one is spot on. They don’t calculate it using the number who are caught – instead they take a large, random sample of benefit claimants and subject each of them to a thorough investigation to find out if they’re making fraudulent claims and then extrapolate that figure across the total number of claimants.
As only 0.7% of a random sample commit benefit fraud then it’s a fairly safe bet that only 0.7% of benefit claimants in general commit fraud.
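The sample-and-extrapolate approach described above is easy to sketch. The figures and function below are purely illustrative (a hypothetical audit of 5,000 claims), not the DWP’s actual numbers or methodology:

```python
import math

def estimate_fraud_rate(sample_fraudulent, sample_size, z=1.96):
    """Estimate a population fraud rate from a random audited sample,
    with a normal-approximation 95% confidence interval."""
    p = sample_fraudulent / sample_size
    se = math.sqrt(p * (1 - p) / sample_size)  # standard error of the proportion
    return p, (p - z * se, p + z * se)

# Hypothetical: 35 fraudulent claims found in a random sample of 5,000
rate, (low, high) = estimate_fraud_rate(35, 5000)
print(f"Estimated fraud rate: {rate:.2%} (95% CI: {low:.2%}-{high:.2%})")
# → Estimated fraud rate: 0.70% (95% CI: 0.47%-0.93%)
```

The point of the confidence interval is that even the official 0.7% figure carries sampling uncertainty, though at a sample of thousands it is far too small to close the gap with the public’s £24-in-£100 guess.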
I would like to add that I think it is unfair to mock the public’s intelligence for not being able to answer questions that require specialist political knowledge.
Stephen says he is not mocking the public’s intelligence but I think he is and I think it is perfectly reasonable for me to say so. I’m not saying the findings should not be published, but they can be communicated better.
@bcrombie
Probably ‘remind’ was a poor choice of word on my part; perhaps more correctly, the example came to mind after reading the article and your response to Joe, where you query Joe’s psychological reading. The example I gave suggests that if people’s perception of something tangible that they are familiar with is influenced by their experience and outlook, then we should expect their perception of things they are not so familiar with, i.e. statistical data, to be similarly influenced. I don’t disagree that the media (and politicians) have probably also assisted in the creation of people’s distorted view of things.
Note, although Stephen presented the findings as a much simplified quiz, the research was intended to investigate people’s perception of statistical data and discovered a high level of misconception. Hence whilst we may feel good at getting the right answers, it wasn’t quite the challenge set by the RSS to the survey participants.
And remind me, where do most people get their facts/ideas/perceptions from? Oh yes, our friends in the press, the barons of ‘freedom’, telling us their distorted view of how things are.
This whole item is a clear argument for demanding more honesty from the people we rely on to tell us what is going on.
The misbeliefs of the general public are not random. In almost every case given here, the errors people are making are errors which are beneficial to the political right.
Of course the RSS, by giving us the 10 most common misconceptions, are playing the same game – inflating a problem to make a news story. What about the top 10 things that people generally get right? We never hear about them, do we?
Come on, RSS, give us a fair random sample of how accurately people guess numbers.