Scope of Use
  • Thermal cameras are being used by governments and companies to detect and prevent the spread of COVID-19.
  • Europe has introduced more surveillance measures than any other region since the start of the pandemic.
  • Thermal cameras are currently being used in airports, public transit hubs, offices, retail businesses, health facilities, and on public streets.
  • Actual public or private use of these systems is difficult to determine.
  • Demand for these devices has increased dramatically as many European countries have entered the market since the start of the pandemic.
Technological robustness and efficacy
  • AI-based thermal cameras are susceptible to errors in temperature readings, particularly when cameras are used to scan multiple people in crowds.
  • The European Centre for Disease Prevention and Control has described it as a “high-cost, low-efficient measure”.
  • Thermal cameras can only measure surface body temperature, not internal body temperature.
  • Elevated surface body temperature is not a direct indication of COVID-19 infection, and the cameras miss asymptomatic people and those still in the incubation phase.
Impact on citizens and society
  • This type of AI-based surveillance negatively impacts citizens and society.
  • In any context, they are a highly intrusive form of surveillance that creates a chilling effect on the individual.
  • The use of thermal cameras sets an undesired precedent: it normalizes constant surveillance of citizens and risks justifying the unauthorised collection of personal data in the future.
  • There is a significant impact on the human rights to privacy and autonomy, and on democracy.
  • When used in public spaces, compliance with the GDPR is difficult if not impossible (e.g. people cannot consent).
Governance and Accountability
  • The use of thermal cameras in public spaces is not in compliance with the GDPR.
  • Responsibility for complying with policy and legislation falls upon the controller of the AI, yet there seems to be no clear and specific legislation in place for each of these AI-based surveillance devices.
  • It is unknown whether people’s data is used for other purposes, e.g. training the AI or statistical analyses.
  • It is unknown whether additional data is collected beyond what the devices are strictly designed for, or beyond what is strictly necessary.
Acceptable trade-offs in times of crisis

The use of thermal cameras has been justified by many European countries amid the pandemic. Yet scientists argue that they are no better at detecting internal body temperature than alternative, less invasive solutions such as infrared thermometers. Given the high cost of indirectly detecting people infected with COVID-19 (whilst negatively impacting privacy, autonomy, and democracy), we see no acceptable trade-off at this stage for the use of AI-based thermal cameras.

Thermal cameras at Spanish airports

The Spanish airport operator AENA has deployed Hikvision’s thermal cameras across Spanish airports. The cameras have drawn criticism because of the amount of sensitive (biometric) data they collect and their vulnerability to hacking. Hikvision, the company providing the cameras, has also been accused of supplying surveillance equipment to camps detaining Uighurs and other Muslim minorities in China, prompting the European Parliament to remove the company’s cameras from its own premises.

The impact of deploying thermal cameras, together with the company’s reputation regarding data safety and human rights, makes it clear that these devices call for an urgent ethical and legal evaluation of their application to the detection of COVID-19.

We have assessed the use of thermal cameras at Spanish airports against the 24 requirements of the “Framework for Responsible AI in times of Corona”, divided into three categories: (i) Society; (ii) Technology; and (iii) Governance.

Each requirement is given a score from ‘0’ to ‘3’:
  • 0 = unknown
  • 1 = no compliance
  • 2 = partial compliance
  • 3 = full compliance
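As a minimal sketch of how this rubric could be tallied per category (the category names come from the framework above, but the individual scores below are invented placeholders, not the actual assessment results):

```python
# Hypothetical tally of the 0-3 compliance scores described above.
# The scores listed here are invented placeholders for illustration.
SCALE = {0: "unknown", 1: "no compliance", 2: "partial compliance", 3: "full compliance"}

scores = {
    "Society":    [1, 0, 2],   # placeholder scores, not the real assessment
    "Technology": [2, 1, 0],
    "Governance": [1, 1, 2],
}

for category, values in scores.items():
    # '0' means "unknown", so it is excluded from the average rather
    # than treated as zero compliance.
    known = [v for v in values if v != 0]
    avg = sum(known) / len(known) if known else None
    unknown = values.count(0)
    print(f"{category}: average of known scores = {avg}, unknown = {unknown}")
```

Treating ‘0’ as missing data rather than as the worst score keeps the “unknown” answers from silently dragging down a category average.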



Society

  • Thermal cameras could set an undesired precedent: more (acceptance of) surveillance
  • Using them at airports can create an unavoidable chilling effect
  • They create an effect of social isolation towards the individuals being detected
  • Impact on the human right to privacy and autonomy
  • No democratic practices: neither governments nor airports consulted citizens during decision-making
  • No free access to correct information about the AI algorithm
  • Habituating to intrusive technologies will undermine the proper functioning of our democratic society
  • Their use in airports does not comply with Art. 21 of the GDPR
  • No sufficient information is given to travellers
  • Subjection to the AI system can only be avoided by not flying, thus limiting people’s freedom of movement
  • The behavioural ‘chilling-effect’ on people is not considered in the AI’s deployment
  • The collection of data is solely based on the trust and unwarranted faith in the AI
  • An explanation of the outcome is not given to passengers who are not flagged by the system
  • It is unknown whether an explanation is given to those who are flagged
  • Hikvision does not offer an explanation about the AI algorithm
  • The system does not account for differences in gender
  • It is possible the cameras include facial recognition, which has been associated with racial and gender bias
  • Biased towards symptomatic individuals with increased body temperature
  • The AI system is used for the temperature screening of all passengers


Technology

  • By identifying people with elevated body temperature, the cameras aim to prevent the spread of COVID-19
  • Infrared/manual thermometers are more effective at an individual level
  • These are also less intrusive
  • Screening of other body parts has not been explored
  • Only detects surface body temperature – unable to identify the cause of it
  • Sensitive to other sources of heat
  • Sensitive to people entering rooms from colder/warmer areas
  • Sensitive to covering of the forehead
  • Unknown whether potential adverse effects were identified
  • It is possible that passengers feel anxious during and after the process
  • Level of resilience to (cyber)attacks: low
  • Level of confidence that data is not being shared: low
  • Improper calibration will cause missed detections or false alarms
  • Underreported body temperature
  • Increased variability caused by: obscuring forehead, entering a room from colder/warmer areas
  • The percentage accuracy is not reported
  • Hikvision claims they can be deployed in office buildings, factories, train stations, and other public areas
  • But they are subject to limitations: a detection capacity of up to 30 people; they must be installed indoors; and individuals must remain indoors for 5 minutes before screening
  • One can assess the AI’s judgement by comparing it to the temperature reading of an infrared thermometer or a manual device.
  • However, this cannot be done at the same rate as the AI system
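The calibration concern above (missed detections vs. false alarms) can be illustrated with a toy comparison between camera readings and reference thermometer readings. All readings, the under-reporting offset, and the 38.0 °C alarm threshold are assumptions for illustration, not figures from the assessment:

```python
# Toy illustration of missed detections and false alarms under a
# calibration offset. All numbers here are hypothetical.
THRESHOLD_C = 38.0          # assumed alarm threshold (not from the report)
CAMERA_OFFSET_C = -0.4      # assumed systematic under-reporting by the camera

# Pairs of (reference thermometer reading, same person's camera reading), in Celsius.
readings = [
    (36.6, 36.6 + CAMERA_OFFSET_C),   # healthy, correctly passed
    (38.2, 38.2 + CAMERA_OFFSET_C),   # fever pushed below threshold: missed
    (37.1, 38.1),                     # nearby heat source inflates reading: false alarm
]

missed = sum(1 for ref, cam in readings if ref >= THRESHOLD_C and cam < THRESHOLD_C)
false_alarms = sum(1 for ref, cam in readings if ref < THRESHOLD_C and cam >= THRESHOLD_C)
print(f"missed detections: {missed}, false alarms: {false_alarms}")
```

Even a small systematic offset near the threshold flips the outcome for borderline cases, which is why the spot-checks with an infrared or manual thermometer described above matter.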


Governance

  • The Spanish DPA did not reach a solid conclusion regarding whether temperature screenings fall under the material scope of the GDPR
  • The Ministry of Health stated that personal data captured by thermal cameras shall not be stored
  • Given the low efficiency of thermal cameras, their use seems neither necessary nor proportionate to the high cost in personal data they capture
  • There is no clear policy stating the prohibition of additional uses or domains in which AI-based thermal cameras can be used
  • No clear, open and direct communication on their workings and use
  • No direct warnings to passengers
  • No explanation about the nature of the technology
  • People entering the country cannot refuse to be subjected to screening
  • There is no sunset process for the use of thermal cameras in airports: no clear end date or dismantling process
  • The Spanish DPA carried out an assessment of privacy risks
  • The Ministry of Health has been involved in their deployment at airports
  • AENA signed a contract with Hikvision to deploy these cameras in airports across the country
  • Citizens have not been involved in the decision-making process
  • Accountability falls on the entity using the technology
  • Hikvision’s engineering recommendations do not coincide with the marketing of their products, which could lead to misuse of their cameras.

Assessment prepared by Mónica Fernández-Peñalver