Society, Law & Justice (L05)

Reading Practice

Long-form reading practice with exam-style tasks, glossary support and audio.

45 min · C1 · Reading · Tags: society-law-justice, AI, justice, algorithms, bias

Lesson objectives

  • Read a C1-level text with better control over detail, tone and argument.
  • Develop topic knowledge around society, law & justice while practising exam reading.
  • Use glossary support and audio to consolidate comprehension.

Unit 7: Society, Law & Justice

Reading text

The Algorithmic Verdict: Justice in the Age of Automation

As we move into 2026, the legal landscape is undergoing a transformation more profound than any seen since the introduction of the solicitor. The integration of Artificial Intelligence into judicial decision-making is no longer a speculative science-fiction trope; it is a daily reality in many Western jurisdictions. While proponents argue that algorithms offer an oasis of objectivity in a sea of human bias, critics warn that we are outsourcing our moral compass to black-box mathematics.

The central tension lies in the concept of 'predictive policing' and sentencing. In several metropolitan hubs, algorithms now process vast quantities of historical data to forecast crime hotspots or assess the likelihood of recidivism. On paper, this promises a streamlined, efficient legal system. If a machine can identify patterns that a human judge might overlook, surely justice is better served? However, this logic rests on a precarious assumption: that historical data is inherently neutral.

In reality, data is a mirror of our past societal failings. If historical policing patterns were disproportionately focused on specific marginalised communities, the algorithm will inevitably codify these prejudices under the guise of 'mathematical certainty'. This creates a feedback loop where the law does not just respond to crime, but actively shapes the socio-economic reality of certain populations. Instead of breaking the cycle of inequality, we risk automating it.

Furthermore, the 'black box' problem presents a significant hurdle to the principle of transparency. In a traditional courtroom, a judge provides a written justification for a sentence, allowing for scrutiny and appeal. When an AI generates a risk score, the logic behind that score is often obscured by proprietary code. How can a defendant challenge a verdict if the reasoning is hidden behind a wall of corporate intellectual property? The right to a fair trial becomes increasingly nebulous when the accuser is an opaque algorithm.

There is also the philosophical question of accountability. If an automated system makes a catastrophic error—a 'false positive' that results in the wrongful imprisonment of an innocent individual—who is held liable? Is it the software developer, the state, or the human judge who rubber-stamped the machine's suggestion? Our current legal frameworks are ill-equipped to handle the diffusion of responsibility that automation brings.

As we navigate this transition, the legal profession faces a crossroads. We must decide whether the goal of justice is merely efficiency and statistical accuracy, or if it requires the nuanced, empathetic, and often messy application of human judgment. To lose the human element in the pursuit of perfection may be the greatest injustice of all. The challenge for the next decade will be to ensure that while we use technology to assist the law, we do not allow it to replace the very essence of human equity.


Comprehension — multiple choice

  1. What is the writer's main purpose in the first paragraph?
     A. To celebrate the efficiency of modern legal technology.
     B. To introduce a debate regarding the use of AI in law.
     C. To argue that science fiction has predicted the future accurately.
     D. To suggest that human bias is no longer a problem in courts.

  2. What does the writer imply about the use of historical data in the second paragraph?
     A. It is the most reliable way to ensure judicial objectivity.
     B. It is too complex for human judges to understand.
     C. Its perceived neutrality is actually quite questionable.
     D. It is the primary reason for the rise in crime rates.

  3. According to the third paragraph, what is the danger of using biased data?
     A. It makes the legal system too expensive to maintain.
     B. It could lead to the permanent codification of social inequalities.
     C. It prevents the police from identifying new crime hotspots.
     D. It causes the software to malfunction frequently.

  4. What is the 'black box' problem mentioned in the text?
     A. The physical difficulty of storing large amounts of legal data.
     B. The difficulty of understanding how an algorithm reaches a conclusion.
     C. The high cost of proprietary software for the legal sector.
     D. The lack of digital literacy among current judges.

  5. How does the writer view the current legal framework regarding automation?
     A. It is perfectly designed to handle new technological challenges.
     B. It is too focused on corporate intellectual property.
     C. It is not sufficiently prepared for issues of accountability.
     D. It is being replaced too quickly by automated systems.

  6. What is the overall tone of the conclusion?
     A. Optimistic about the future of legal technology.
     B. Dismissive of the importance of human empathy.
     C. Cautious and reflective regarding the future of justice.
     D. Aggressive in its demand for a total ban on AI.


Gapped text — missing sentences

A. This creates a feedback loop where the law does not just respond to crime, but actively shapes the socio-economic reality of certain populations.

B. Instead of breaking the cycle of inequality, we risk automating it.

C. The right to a fair trial becomes increasingly nebulous when the accuser is an opaque algorithm.

D. To lose the human element in the pursuit of perfection may be the greatest injustice of all.

E. This ensures that the legal system remains profitable for the private sector.


Glossary (with Spanish equivalents)

  1. Proponent — a supporter or advocate of an idea (defensor/partidario)
  2. Recidivism — the tendency of a convicted person to reoffend (reincidencia)
  3. Precarious — insecure; resting on an uncertain foundation (precario/inseguro)
  4. Codify — to fix something into a formal, systematic form (codificar/consolidar)
  5. Obscured — hidden or made difficult to see (oscurecido/oculto)
  6. Nebulous — vague; lacking clear definition (nebuloso/vago)
  7. Liability — legal responsibility for harm or error (responsabilidad legal)
  8. Equity — fairness; justice applied with regard to individual circumstances (equidad)

Answers

Comprehension 1. B 2. C 3. B 4. B 5. C 6. C

Gapped Text

Note to teacher: the sentences in this exercise appear verbatim in the reading text above; in a real exam they would be removed and the gaps numbered in the text. Learners should match each sentence to the point in the text where it fits:

  • Sentence A belongs in the 3rd paragraph.
  • Sentence B belongs in the 3rd paragraph.
  • Sentence C belongs in the 4th paragraph.
  • Sentence D belongs in the 6th paragraph.
  • Sentence E is the distractor; it does not fit anywhere in the text.