‘In this brief post, I want to demonstrate how ostensibly neutral and efficient algorithms can cause discrimination in education. Last year, the national advanced level qualifications (“A-levels”) exams in the UK that lead to places in university, further study, training, or work had to be cancelled because of school closures owing to the COVID-19 pandemic. In mitigation, the Office of Qualifications and Examinations Regulation (“Ofqual”) asked teachers to supply an estimated grade for each student, together with a ranking of each student relative to every other student at the school with the same estimated grade. This data went into an algorithm that also factored in the school’s performance in the subject over the previous three years. The animating purpose behind the algorithm was to avoid ‘grade inflation’ and to ensure consistency with previous years’ results. When the grades were announced, the outcome was devastating for many. In England, Wales and Northern Ireland, nearly 40% of results were lower than teachers’ assessments. The effects of “downgraded” results were disproportionately felt in comparatively poorly resourced state schools.’
Oxford Human Rights Hub, 15th March 2021
Source: ohrh.law.ox.ac.uk
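The mechanism described above can be illustrated with a deliberately simplified sketch. This is not Ofqual's actual model; the school name, grade distribution, and student identifiers below are hypothetical. It only shows how anchoring final grades to a school's historical distribution, and then handing them out in teacher-ranked order, can downgrade a strong student simply because their school rarely achieved top grades in the past.

```python
# Hypothetical illustration of distribution-based standardisation.
# NOT Ofqual's actual model: it shows only how tying grades to a school's
# historical results can override teachers' individual assessments.

def standardise(teacher_ranking, historical_distribution):
    """Assign grades (best-ranked students first) so that the share of each
    grade matches the school's historical distribution.

    teacher_ranking: student names ordered best to worst within the subject.
    historical_distribution: mapping grade -> fraction of past students at the
    school who achieved it, listed best grade first (fractions sum to 1).
    """
    n = len(teacher_ranking)
    grades = {}
    position = 0
    for grade, share in historical_distribution.items():
        quota = round(share * n)
        for student in teacher_ranking[position:position + quota]:
            grades[student] = grade
        position += quota
    # Any students left over after rounding receive the lowest grade.
    lowest = list(historical_distribution)[-1]
    for student in teacher_ranking[position:]:
        grades[student] = lowest
    return grades


if __name__ == "__main__":
    # Hypothetical school whose past cohorts rarely achieved top grades.
    history = {"A*": 0.0, "A": 0.1, "B": 0.3, "C": 0.4, "D": 0.2}
    ranking = ["student_%02d" % i for i in range(1, 11)]  # teacher's best-to-worst order
    print(standardise(ranking, history))
    # Even if the teacher predicted A* for student_01, this scheme caps them at A,
    # because no past student at this school achieved A*.
```

Under a scheme of this kind, a student's final grade depends as much on their school's track record as on their own work, which is why downgrades concentrated in comparatively poorly resourced state schools.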