The number of published opinion polls into British general election voting intention hit a twenty-plus-year high in 2009, with 141 polls carried out during the calendar year.
That is the highest figure since at least 1987 (when my records commence)* and more than completes the polling industry’s recovery from its post-1992 nadir. You can see the quarterly trend in this graph:
As I explained previously:
The 1992 general election was a bad one for the British political polling industry. During the campaign the vast majority of polls put Labour ahead. Of the final round of polls, three put Labour ahead, one put Labour and the Conservatives neck-and-neck and only one – Gallup – gave the Conservatives a lead, and even that was a mere 0.5%. The actual result? A Conservative lead of 7.6%.
The response of the polling industry was a series of post-mortems and experiments with changes in methodology. Amongst those who commissioned polls, though, the response was also one of greater scepticism about the value of commissioning polls. Add in first the economic pressures of the 1990s and then the seeming inevitability of a Labour general election victory after Tony Blair became Labour leader, and it is no surprise that during the 1992-97 Parliament the number of opinion polls was consistently lower than in 1987-1992.
The prospect of an election more keenly contested than any since 1992, the declining cost of polling (thanks to phone and then internet polling) and the wider range of outlets commissioning polls (including, in 2009, Political Betting) have resulted in the bumper year of polls.
Whether politics or political commentary is the better for this is another matter, particularly when you bear in mind that the standard margin of error on polls is +/- 3 percent. A sequence of two polls, with the second showing a party one point higher than the first, really tells us very little about any actual change in party support, because the two margins of error overlap so heavily.
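To make that concrete, here is a minimal sketch of the standard margin-of-error calculation (my illustration, not from the article; the sample size of 1,000 and the 35% share are assumed figures typical of a national poll):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a single poll putting a party
    on share p (as a fraction) from a sample of n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# A hypothetical poll of 1,000 respondents putting a party on 35%:
moe = margin_of_error(0.35, 1000)
print(f"Margin of error: +/- {moe:.1%}")  # roughly +/- 3 points

# Two polls one point apart: the confidence intervals overlap
# almost entirely, so the "change" tells us very little.
print(f"Poll 1 (35%): {0.35 - moe:.1%} to {0.35 + moe:.1%}")
print(f"Poll 2 (36%): {0.36 - moe:.1%} to {0.36 + moe:.1%}")
```

Running this shows both polls are consistent with true support anywhere from the low thirties to the high thirties, which is why a one-point move is not news.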
That’s a point often lost in commentary where changes well within that overlap are breathlessly described as party support rising, nose-diving, shifting and responding to events.
But of the 141 polls across 2009, for example, only 13 showed a statistically significant shift** in Conservative support from the previous poll by the same pollster and only 16 showed such a shift for Labour. Just 11 showed a statistically significant shift in Liberal Democrat support.
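The counting exercise above can be sketched in a few lines (a hypothetical illustration: the poll series below is made-up data, and the 4.7-point threshold is the figure from the footnote):

```python
SIGNIFICANCE_THRESHOLD = 4.7  # points; see footnote ** below

def significant_shifts(series, threshold=SIGNIFICANCE_THRESHOLD):
    """Count changes from one poll to the next (same pollster)
    that are large enough to be statistically significant.
    `series` is a list of vote shares in percentage points."""
    return sum(
        1
        for prev, curr in zip(series, series[1:])
        if abs(curr - prev) > threshold
    )

# Made-up sequence of one pollster's figures for a party:
shares = [40, 41, 39, 45, 44, 38, 39]
print(significant_shifts(shares))  # only two of the six changes qualify
```

Most poll-to-poll changes in a real series are, like most of the made-up ones here, well inside the threshold.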
Headlines along the lines of “Sorry, no news from our latest poll as the changes are all too small to be statistically significant” are, though, not exactly common.
As the number of polls looked at grows, conclusions can be drawn with more confidence about changes in party support which would not be significant if they were present in only one poll. But that requires reports to put polls in their proper context alongside other recent polls, something which the traditional media is still very poor at. Hopefully 2010 will see that become the norm, with the traditional media starting to catch up with the standards of political poll reporting which are common across political bloggers.
* Although there are many records of public opinion polls published before then, none that I’ve found provide enough information across all the polling companies to replicate this information in earlier years, even when combining different sources. It’s therefore most likely true that these are the highest figures ‘since records began’ … but if you know otherwise, do let me know.
** Although an individual poll usually has a margin of error of around +/-3% (assuming no systematic errors), for the change between two polls to be statistically significant it has to be greater than approximately 4.7%. That is because, for example, if a party’s support has stayed static at 35%, the first poll may show the party to be at 33% and the second at 37% – an apparent increase of 4% which does not actually signify a statistically significant shift. An explanation of the maths is here.
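The principle behind that threshold can be sketched as follows (my own illustration, not the linked explanation): because the two polls are independent, their margins of error combine in quadrature, not by simple addition. The exact threshold depends on the sample size and the share in question; per-poll margins of about 3.3 points (an assumed figure) land near the 4.7-point value quoted above:

```python
import math

def change_margin(moe1, moe2):
    """Margin of error on the *change* between two independent polls:
    the square root of the sum of the squared individual margins."""
    return math.sqrt(moe1 ** 2 + moe2 ** 2)

# Two polls each with roughly a +/-3.3 point margin (illustrative):
print(f"{change_margin(3.3, 3.3):.1f}")  # about 4.7 points
```

Note that this is larger than either individual margin but smaller than their sum, which is why a 4-point apparent move between two +/-3 point polls can still be statistical noise.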