Algorithms don’t fail people, people do

The inequality in this year’s A-level results has been strongly linked to the performance of an algorithm – the statistical model that the government used to ‘automatically’ upgrade or downgrade results for pupils. While ministers will be called to account, for many it will be the cold, faceless, automated algorithm that is seen as the problem. We, as liberals, must be clear: the A-level disaster is not a programming error; algorithms merely reflect, or even amplify, the biases of their designers.

The Labour Party has said the A-level algorithm was ‘unlawful’, the FT has described how ‘the algorithm went wrong’, and clearly the process had massively unfair outcomes. Yet this wasn’t data science gone rogue like Terminator’s Skynet or the inscrutable AI in Ex Machina: this was a political choice reflecting political biases.

When the government’s advisory body got their Covid-19 models ‘wrong’, it wasn’t because of algorithms but rather a lack of diversity – both in thought and in demographics – within the advisory team. The algorithms and models weren’t ‘wrong’, and they certainly weren’t the reason why, for example, people of Bangladeshi origin were disproportionately affected by the disease – instead, the problem lay in the inherent bias of a skewed group of designers.

In the case of this A-level algorithm, it made good statistical sense to include the historical performance of schools in the prediction model, because that minimises the total error across all results – intuitively, we know that pupils at the exclusive, well-funded Eton get more A* results than pupils at woefully under-resourced Scumbag College. And yet it goes against every British and liberal value of fairness and equality of opportunity to downgrade the brightest lights from a poorer-performing school as part of a mathematical rounding exercise.
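
To see concretely why ‘minimising total error’ and individual fairness pull apart, here is a minimal, purely illustrative sketch. It is not Ofqual’s actual model: the grade scale, the two invented schools, their teacher-assessed grades, their historical distributions and the `moderate` function are all hypothetical, chosen only to show what anchoring grades to a school’s past record does to an outlier pupil.

```python
# Purely illustrative toy model: NOT Ofqual's actual algorithm. The schools,
# grades and distributions below are invented. It shows how forcing this
# year's grades to fit a school's historical distribution keeps the
# aggregate error small while downgrading an outstanding individual pupil.

# Hypothetical teacher-assessed grades (A* = 6 ... U = 0) for two schools.
teacher_grades = {
    "Historically strong school": [6, 6, 5, 5, 5, 4],
    "Historically weak school":   [6, 3, 3, 2, 2, 1],  # one outstanding pupil
}

# Hypothetical historical grade distributions: share of pupils at grades 0..6.
historical_share = {
    "Historically strong school": [0.00, 0.00, 0.05, 0.10, 0.25, 0.30, 0.30],
    "Historically weak school":   [0.10, 0.25, 0.30, 0.20, 0.10, 0.05, 0.00],
}

def moderate(grades, share):
    """Rank pupils by teacher assessment, then hand out grades so the
    class matches the school's historical distribution."""
    n = len(grades)
    slots = [round(p * n) for p in share]       # pupils 'allowed' at each grade
    ranked = sorted(range(n), key=lambda i: grades[i], reverse=True)
    moderated = [0] * n                         # any unplaced pupils default to U
    pupil = 0
    for grade in range(6, -1, -1):              # fill grades from A* downwards
        for _ in range(slots[grade]):
            if pupil < n:
                moderated[ranked[pupil]] = grade
                pupil += 1
    return moderated

for school, grades in teacher_grades.items():
    print(school, grades, "->", moderate(grades, historical_share[school]))
# The strong school's grades barely move, but the weak school's star pupil
# drops from 6 (A*) to 4, because the school's history 'allows' no A*s.
```

The real model was far more elaborate, but the tension described above survives in even this crude sketch: a small total error across the cohort is bought at the cost of the individual outliers.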

Whether we are talking about the life and death of public health or the life-opportunities of our children, the issue isn’t whether algorithms ‘go wrong’ but whether we have fairness and diversity in the teams that build the algorithms – and the ministers signing them off.

The Education Secretary, Gavin Williamson, was warned as far back as July about the problems with this process but decided to go ahead with it anyway. He had also seen the Scottish government forced into an earlier U-turn after similar accusations of unfairness, yet still he backed the algorithm and its outcomes – and, in the case of small (mostly private-school) classes, he supported not using the algorithm at all and allowing grade inflation for the privately educated.

This is not the fault of some cold, faceless bit of technology; this is a clear choice by those in power.

Top Tory advisor Dominic Cummings has spoken about how, in his experience, the Tories don’t care about poorer people or the NHS – did we see this in the A-level algorithm’s design? Is this what happened when Gavin Williamson approved an unfair and unequal outcome?

Perhaps a more pressing question is, will we spot it when it happens again? These biases and political decisions came to light for A-level results because they affected a very large and broad group of our children. Parents of all sorts, pointy-elbowed or otherwise, were acutely aware and raising merry hell. But what will happen when another algorithm, designed by a biased team and approved by a biased minister, negatively impacts a smaller or more hidden group?

Will we even know that the Home Office has designed an algorithm that is sending people to detention centres? Or that the DWP has an algorithm that is stopping benefits for a small group of wrongly suspected cheats? These things will likely go unnoticed by the public, silently approved by departmental ministers and an increasingly hollowed-out and subservient civil service.

We don’t need a ‘regulator for algorithms’ but we should have a regulator or ombudsman for each of our government’s increasingly illiberal ministries.

* Dr Rob Davidson is on the exec of the Association of Lib Dem Engineers and Scientists (ALDES) and the council of the Social Liberal Forum. He founded Scientists for EU and NHS for a People's Vote and was a founding member of the People's Vote campaign. Most recently he has launched Liberation Inc, a platform for liberal startups, and has helped launch the Free Society Centre and relaunch Trade Deal Watch as new liberal organisations.


13 Comments

  • James Belchamber 21st Aug '20 - 11:50am

    Entirely agreed – placing the blame on the algorithm lets the people who designed it that way off the hook. This government pursued a path that would lead to slacking rich kids getting good grades and hard-working poor kids losing out ENTIRELY BY DESIGN.

    Even unconscious biases are laid bare in code.

  • I do not agree with the argument. Dragging lack of advisory team diversity into the problems of Covid and A-level models is a cheap way to make a political point.

    The novel and unexpected characteristics of the Covid-19 pathogen created modelling problems. As in the case of climate models, too many unknowns require guesswork and result in high levels of uncertainty.

    The A-level algorithm was always going to fail. It deals with populations. These can be broken down into ever smaller sub-populations and these can be modified to reflect particular circumstances. But the success of the output depends on the acceptability of individual results. So while the algorithm may have produced a fair result for the population, it failed the students on an individual level.

    The conclusion must be that there is no satisfactory substitute for individual examinations. One would hope that teacher predictions would be almost as good but the results show that these are woefully inadequate and the problem of pass inflation will cause long term damage. Holding examinations in the future will be more of a logistical problem than a statistical one.

  • James Fowler 21st Aug '20 - 12:56pm

    @Rob: Spot on – couldn’t agree more.

  • We need to ask why there is a need for any algorithm at all – except of course those that program the computer to carry out a routine data processing operation and produce output which is useful. The answer seems to be a fear of what is called grade inflation: the theory that if exam results keep rising then something must be wrong, because the children have the same levels of ability each year.
    The evidence for this dogma is lacking, of course, but it fits in with the world view of the people making the decisions.
    The evidence is that the biggest determinant of poor exam results is family poverty. The same applies to poor health.
    Algorithms are processes to ensure we are able to write programs which will do what we want them to.
    Poverty is not solved by algorithms; it is solved by making sure that everyone has enough to eat, somewhere acceptable to live and a security that many in our country do not have.

  • No grade should be adjusted “automatically”, but only when the script has been re-marked by an experienced examiner who deems the original mark inaccurate. End of.

  • @Peter. You are very close to the truth. Exams are a logistical problem. Every year there are thousands of exam papers still unmarked a couple of weeks before results day, because either the boards can’t find enough examiners or some examiners proved so unreliable that the marking had to be redone. The result was panic at the exam board and examiners paid double rates to wade through the outstanding pile of papers. Answer….. get all the marking done by computer and problem solved.
    And I hate to sound grumpy, but there are a lot of people applying their pet theories to a subject about which they seem to have limited understanding. The A-level marks fiasco is nothing to do with some deliberate attempt to do down the poor (even if that is the effect), nor has diversity anything to do with it, although I know it has to be brought into every conversation. The problem started when someone at Ofqual or an exam board thought you could replace humans with computers and save a quid or two.

  • Andrew Toye 21st Aug '20 - 3:41pm

    What normally happens is that there are actual exam papers to compare. In the absence of this, what should have happened would have been to collect random samples of students’ actual work – either from the ‘mock’ papers or examples of coursework – so the moderators could have properly taken teacher bias into account and then predicted what individual students would have achieved had they sat the exams. Applying an algorithm without any real data was just wrong.

  • David Evans 21st Aug '20 - 5:10pm

    Actually Tom Harney, the evidence of grade inflation is there if people only want to see the truth. Indeed York University gave exactly the same exam to its new students in sciences over many years to determine what extra focus would be needed to help them transition from school to University. They dropped it over a decade ago because results had fallen consistently to the level where the exam was no longer useful to determine what help was needed.

    Just think how you would like to tell last year’s students that they are 10 to 15% stupider than this year’s!

  • @Tom Harney – The need was to have a means of assigning results in the absence of any actual exam results. My own assessment is that this is an impossible task.

    A child and his/her parents will reluctantly accept a result achieved by the child in a conventional exam. We now know that there is no possibility of a computed or predicted result being accepted unless it exceeds expectations.

    I don’t blame the authorities for trying in these exceptional circumstances but next time they should find a way of holding actual exams, perhaps by taking longer to cover all the subjects in order to facilitate distancing.

  • Michael Bukola 21st Aug '20 - 8:32pm
  • David Evershed 22nd Aug '20 - 2:24am

    The issue is whether the rules set by the algorithm designers were more or less biased than the teachers’ predictions.

  • Sue Sutherland 22nd Aug '20 - 12:56pm

    I agree with you, Rob, that “this was a political choice reflecting political bias”. I’m not saying the Tories set out to deflate A-level results in schools in deprived areas, but that their prejudices led them to ignore the unfair way in which individuals were treated. I think they expected public schools to do better than state schools, and schools in middle-class areas to do better than those in areas that struggle, so when the results came out they matched their prejudices and they didn’t recognise there was a problem.
    There was a time when some Tories were concerned about the rights of the individual, but that disappeared some time ago. If you want to govern in the authoritarian, centralised way the Tories do, then you have to rely on algorithms to govern the masses.
    I am so impressed with the campaigns that the students themselves organised, and I find hope in the fact that, for them, the rights of the individual and fair treatment still matter.

  • I don’t think anyone would mind a constant upward trend in grades if it was an indication of increasing academic ability, but, from my experience, this is not the case. My current job role involves recruiting new staff to our service, and the decline in the quality of applications from newly qualified applicants has been clear for at least the last 10 years or so. Last year I attended a seminar at the University of York; imagine my surprise and subsequent disbelief to see adverts for remedial (their word, not mine) lessons for mathematics and English
    …a Russell Group university offering remedial maths and English to first-year undergraduates! There was a time when candidates lacking a minimum standard in either subject would not have progressed past their first semester.
    A few years ago I was supporting my son with his A-level chemistry (never an easy subject for me); he was working on the previous year’s exam questions, which were about the equivalent of my GCE ‘O’ level.
    I’m all for celebrating achievements and giving due credit, but we do our students a disservice by pretending there is still any equivalence with grades from even a few years ago.
