Mass Surveillance & State Power

Global Exam

Checkpoint exam covering the previous ten units with integrated C1 practice.

120 min · C1 · Tags: global_exam, mass-surveillance-state-power, vigilancia, privacidad, ia, poder

Lesson objectives

  • Attempt a checkpoint exam covering the previous ten units.
  • Practise integrated C1 reading, listening, use of English, writing and speaking tasks.
  • Use the answer key and marking checklist to plan revision.

Global Exam — Units 71-80

This exam is designed to assess your knowledge after completing Units 71 to 80. It covers complex topics such as state surveillance, power, technology and ethics.

Instructions:
  1. Recommended time: 2 hours 30 minutes.
  2. Materials: paper and pen only. Do not use dictionaries.
  3. Self-assessment: the correct answers are at the end. For the Writing and Speaking tasks, use the marking notes to compare your performance against the required C1 level.


Reading and Use of English

Part A: Reading Comprehension

Read the text and choose the correct answer (A, B, C, or D).

The Panopticon of the Digital Age

In the 18th century, philosopher Jeremy Bentham conceptualised a prison design known as the Panopticon, where a single guard could observe all inmates without them knowing if they were being watched. This psychological pressure ensured total compliance. Today, as we navigate the complexities of the 21st century, many argue that we have inadvertently constructed a digital Panopticon. Through the ubiquitous presence of smartphones, social media, and smart devices, our movements, preferences, and even private thoughts are being harvested.

While proponents of mass data collection argue that it enhances national security and streamlines consumer experiences, critics suggest a more insidious reality. The erosion of privacy is not merely a loss of personal space; it is a fundamental shift in the power dynamic between the individual and the state. When every action is recorded, the concept of 'freedom to act' becomes hollow. One might behave differently if they suspect a permanent record of their existence is being curated by an invisible authority.

Furthermore, the automation of surveillance through AI has introduced a new layer of complexity. Algorithms can now predict behaviour with unsettling accuracy. This predictive policing and targeted marketing create a feedback loop that nudges human behaviour toward predictable patterns. We are no longer just being watched; we are being shaped. The question remains: can democracy survive in an environment where transparency is a one-way street?

  1. What is the writer's main point in the first paragraph? A) Bentham's prison design was highly effective in reducing crime. B) Modern technology has created a situation similar to Bentham's concept. C) People are becoming more compliant due to the presence of guards. D) The digital age has made privacy more important than ever before.

  2. The word 'insidious' in the second paragraph suggests that... A) The loss of privacy is happening in a gradual, harmful way. B) The loss of privacy is an obvious and immediate threat. C) The loss of privacy is a positive development for security. D) The loss of privacy is a myth created by critics.

  3. According to the second paragraph, how does surveillance affect the individual? A) It provides a sense of security in a chaotic world. B) It makes people feel more independent. C) It alters how people behave due to the feeling of being observed. D) It strengthens the bond between the state and the citizen.

  4. What does the phrase "the concept of 'freedom to act' becomes hollow" imply? A) People have more freedom than ever before. B) Freedom is meaningless if every action is monitored. C) Actions are becoming more significant in the digital age. D) Physical freedom is being replaced by digital freedom.

  5. In the third paragraph, the author suggests that AI... A) Is primarily used to catch criminals more efficiently. B) Makes human behaviour more unpredictable. C) Allows for the manipulation of human actions through prediction. D) Is a tool that can only be controlled by the state.

  6. What is the overall tone of the text? A) Optimistic and celebratory. B) Indifferent and academic. C) Critical and cautionary. D) Sarcastic and humorous.

Part B: Word Formation

Complete the sentences by changing the form of the word in CAPITALS.

  1. The _____ of personal data has become a major ethical concern. (COLLECT)
  2. The government's decision was met with widespread _____. (REJECT)
  3. It is _____ to assume that all surveillance is for security purposes. (LOGIC)
  4. The sudden _____ of the internet changed how we communicate. (EMERGE)
  5. We must ensure the _____ of our digital identities. (PROTECT)
  6. The _____ of the new law caused significant public outcry. (IMPLEMENT)
  7. Her _____ to privacy was seen as a sign of political activism. (REFUSE)
  8. The _____ of the algorithm was difficult to explain to the public. (COMPLEX)

Part C: Key-word Transformations

Complete the second sentence so that it has a similar meaning to the first sentence, using the word given. Do not change the word given. You must use between three and six words.

  1. It was only when the data was leaked that they realised the danger. UNTIL It was not ________ the danger of the situation.

  2. I regret not studying harder for the advanced English exam. WISH I ________ harder for the advanced English exam.

  3. "You shouldn't have told him the secret," said Sarah to Mark. MUST Sarah told Mark that ________ him the secret.

  4. Despite the heavy surveillance, the protesters continued their march. SPITE In ________ heavy surveillance, the protesters continued their march.

  5. The company is being investigated for spying on its employees. UNDERGOING The company ________ for spying on its employees.

  6. It is possible that the government intercepted the messages. MIGHT The government ________ the messages.


Listening Simulation

Read the following transcript of a lecture on 'The Ethics of Algorithmic Governance' and answer the questions.

Transcript: "Good morning, everyone. Today we are delving into a topic that sits at the intersection of technology and sociology: algorithmic governance. We often think of algorithms as neutral mathematical tools, but in reality, they are imbued with the biases of their creators. When these algorithms are used to make decisions about credit scores, job applications, or even criminal sentencing, we are no longer dealing with mere math; we are dealing with social engineering.

The danger lies in the 'black box' nature of these systems. Because the logic is often proprietary or too complex for human comprehension, it becomes impossible to challenge a decision. If an algorithm denies you a loan, how do you appeal a decision you cannot understand? This lack of transparency undermines the very foundation of due process.

Furthermore, we must consider the feedback loops. If an algorithm is trained on historical data that contains human prejudice, it will not only replicate that prejudice but amplify it. It creates a self-fulfilling prophecy. For instance, if a predictive policing tool sends more officers to a specific neighbourhood based on biased historical data, they will inevitably find more crime there, which then feeds back into the system as 'proof' that the neighbourhood is high-risk.

We are moving towards a society where 'nudging'—the subtle manipulation of choice through digital interfaces—becomes the primary mode of governance. While it may not feel like coercion, it is a soft form of control that bypasses conscious deliberation. As we move forward, our task is to demand accountability and transparency. We must ensure that technology serves humanity, rather than the other way around."

Questions:
  1. What is the speaker's primary argument regarding the neutrality of algorithms?
  2. What does the term 'black box' refer to in the context of the lecture?
  3. Why does the speaker believe the 'black box' nature threatens due process?
  4. How does the speaker explain the concept of a 'self-fulfilling prophecy' in policing?
  5. What is the relationship between historical data and prejudice according to the text?
  6. How does the speaker define 'nudging'?
  7. What is the speaker's attitude towards the current direction of technology?
  8. Which of the following best describes the purpose of the lecture? A) To promote the use of AI in government. B) To warn about the social implications of algorithmic decision-making. C) To explain the mathematical complexity of modern algorithms. D) To argue that privacy is no longer relevant.


Writing Paper

Part 1: Essay

Write an essay responding to the following prompt. You should write between 220 and 260 words.

Topic: Some people believe that increased government surveillance is a necessary price to pay for public safety. Others argue that it fundamentally destroys individual liberty.

Task: Discuss both views and give your own opinion.

Planning tips:
  • Introduction: introduce the topic in general terms. Do not give your opinion yet.
  • Body paragraph 1: analyse the arguments in favour of surveillance (security, crime prevention).
  • Body paragraph 2: analyse the arguments against (privacy, abuse of power, liberty).
  • Conclusion: summarise the main points and give your opinion clearly.

Part 2: Options

Choose ONE of the following two tasks. Write between 220 and 260 words.

Option 1: Report
Your local council is considering installing facial recognition cameras in all public parks. Write a report for the council outlining the potential benefits and drawbacks, and making a recommendation.

Option 2: Article
Write an article for a technology magazine titled: "The End of Privacy: Are we living in a digital prison?" Discuss whether the loss of privacy is an inevitable part of modern life.

Planning tips:
  • Report: use a formal tone and subheadings (e.g. Introduction, Findings, Recommendations).
  • Article: use a more engaging, journalistic tone. You can use rhetorical questions to involve the reader.


Speaking Paper

Part 1: Interview (2-3 minutes)

The examiner asks personal questions.
  1. Where are you from?
  2. How much time do you spend using social media every day?
  3. Do you think people share too much information about their lives online?
  4. Would you prefer to live in a city with high security or a quiet village with less surveillance?

Part 2: Long Turn (Collaborative Task) (4 minutes)

Candidate A and Candidate B discuss a topic based on visual prompts (simulated here with text).

Prompt: Discuss the different ways technology affects our sense of privacy.
  • Prompt 1: Social media profiles.
  • Prompt 2: Smart home devices (Alexa, etc.).
  • Prompt 3: CCTV and public cameras.
  • Prompt 4: Data tracking by websites.

Interaction language (English):
  • "That's an interesting point, but don't you think...?"
  • "I see what you mean, however..."
  • "Building on what you just said..."

Part 3: Discussion (5 minutes)

The examiner asks more abstract questions related to Part 2.
  1. To what extent should the government have access to private messages in the interest of national security?
  2. Is it possible to truly be 'anonymous' in the modern world?
  3. Do you think children should be taught about digital privacy in school?

Part 4: Follow-up (2 minutes)

The examiner asks a final, challenging question. "In a world where everything is recorded, will the concept of 'the private self' eventually disappear?"


Mediation Challenge

Source Text (Spanish): "El uso de inteligencia artificial en la selección de personal está creciendo. Aunque promete eliminar el sesgo humano, muchos expertos advierten que los algoritmos pueden perpetuar prejuicios si los datos de entrenamiento no son diversos. Es vital que las empresas sean transparentes sobre cómo funcionan sus procesos de selección automatizados."

Task (English): You are working for an international recruitment agency. You need to summarize the key points of this text for your English-speaking manager.

Task Requirements:
  • Write a short summary (80–100 words).
  • Use a professional, formal register.
  • Ensure the main concerns regarding AI and bias are clearly communicated.


Answer Key and Marking Notes

Reading and Use of English

Part A: 1. B 2. A 3. C 4. B 5. C 6. C

Part B: 1. collection 2. rejection 3. illogical 4. emergence 5. protection 6. implementation 7. refusal 8. complexity

Part C: 1. until the leak that they realised 2. wish I had studied 3. he must not have told 4. spite of the 5. is undergoing investigation 6. might have intercepted

Listening Simulation

  1. It is not neutral; it carries the biases of its creators.
  2. The complex, opaque, or proprietary nature of algorithms that makes them hard to understand.
  3. Because decisions cannot be understood or challenged, undermining due process.
  4. Biased data leads to more police presence in certain areas, which leads to more arrests, reinforcing the bias.
  5. They can replicate and amplify human prejudice.
  6. A subtle manipulation of choice through digital interfaces.
  7. Cautionary/Critical.
  8. B

Writing/Speaking Marking Notes (C1 Criteria)

Writing (C1 Advanced):
  • Content: Did the student answer all parts of the prompt?
  • Communicative Achievement: Is the tone appropriate (formal for the report and essay, semi-formal for the article)?
  • Organisation: Are there clear paragraphs and cohesive devices (Furthermore, Conversely, In conclusion)?
  • Language: Is there a range of advanced vocabulary and complex grammatical structures (inversions, conditionals, passives)?

Speaking:
  • Grammar & Vocabulary: Look for C1-level structures (e.g. "Had I known...", "Not only is it... but also...").
  • Discourse Management: Does the speaker flow naturally, or do they hesitate excessively?
  • Pronunciation: Is the speech clear and easy to understand?