Data protection & privacy · 17 September 2019 · Alex Dittel

Facial recognition, from news iterations to court deliberations

The use of facial recognition technologies (FRT) is constantly making the headlines: from Gatwick Airport’s announcement this week confirming its plans to become the first British airport to use FRT permanently, to FRT-enabled cameras in London’s King’s Cross, police trials of FRT at sporting events, FRT ID kiosks emerging at US airports, and the criticism of Facebook’s “tag suggestions” feature (which is now subject to an opt-in), FRT seems to be here to stay. However, FRT does not come without serious privacy challenges, including a risk of gender and racial discrimination caused by algorithmic bias in FRT systems, which privacy groups are fighting to mitigate. But how does FRT work and what are the real privacy risks? In this piece we aim to shed some light on the basics of FRT.

How does FRT work?

CCTV cameras record video images of individuals. Those images are analysed by FRT algorithms, which scan for distinguishable facial features, such as the distances between a person’s eyes, between the eyes and the tip of the nose, or the width of the mouth, to create a unique biometric data record of that individual. The record is then matched against a database of biometric templates to determine whether the “object” is a person, is male or female, or has certain characteristics, or to identify the individual. In a nutshell, the system will know “you” are in the airport terminal (just as you would, for example, recognise someone who catches the same train as you every day), but this does not necessarily mean the FRT can identify you by name, unless it is connected to a reference database.
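Purely for illustration, the sketch below shows in Python how the matching step described above might work in principle. The `extract_embedding` function is a crude placeholder (a real FRT system derives the template with a trained neural network), and the reference database, identifiers and similarity threshold are invented for the example.

```python
import numpy as np

def extract_embedding(face_image: np.ndarray) -> np.ndarray:
    """Crude placeholder for the 'biometric template' step.
    A real FRT system would use a trained neural network to map a
    cropped grayscale face image to a numeric vector capturing features
    such as the distances between the eyes, nose and mouth."""
    pooled = face_image[::16, ::16].astype(float).ravel()
    return pooled / (np.linalg.norm(pooled) + 1e-9)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two templates (1.0 = identical direction)."""
    return float(np.dot(a, b) / ((np.linalg.norm(a) * np.linalg.norm(b)) + 1e-9))

def identify(face_image: np.ndarray,
             reference_db: dict[str, np.ndarray],
             threshold: float = 0.8) -> str | None:
    """Match a captured face against a reference database of templates.
    Returns the identifier of the closest match above the threshold, or
    None. Without a reference database the system can only tell that the
    same (unnamed) face has appeared before; it cannot put a name to it."""
    probe = extract_embedding(face_image)
    best_id, best_score = None, threshold
    for person_id, template in reference_db.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id
```

How well such matching works depends heavily on the data the underlying model was trained on, which is where the accuracy and bias concerns discussed below arise.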

What are the key concerns?

The key concerns that the public have around FRT are intrusiveness, profiling, lack of transparency and bias.

The fear that FRT could be deployed without people’s knowledge, concealed as a regular CCTV camera to which people have grown accustomed, is a common theme as FRT is implemented across different sectors. The indiscriminate use of FRT could result in the unlawful surveillance and profiling of individuals, in breach of their fundamental right to privacy.

Whilst we are fairly far away from a “Minority Report”1 scenario, there is a concern that the recording of people’s biometric data may force people to change their behaviour. For example, the deployment of FRT could create a sense among the general public of being watched, which could dissuade people from expressing their views or participating in peaceful protests.

Accuracy issues, resulting from data engineers’ failure to train their FRT models on sufficiently diverse data, have cast doubt on the ability of FRT to deliver fair outcomes. This is particularly true in the context of any automated decision-making which operates without human intervention.

What law governs FRT in the UK and in Europe?

At a European level, FRT is subject to the European Convention on Human Rights (ECHR) and the General Data Protection Regulation (“GDPR”) as well as, in the case of public authorities making use of FRT, the Law Enforcement Directive.2 The GDPR governs the capturing of images of people (unless the “domestic purposes” exemption applies). The biometric information collected by FRT will likely constitute a special category of personal data under the GDPR, the collection of which is restricted. The European Data Protection Board has also issued “Guidelines 3/2019 on processing of personal data through video devices”.

Some European countries, such as Sweden and the Netherlands, have camera surveillance laws that require prior permits for the deployment of FRT in areas accessible to the public.

While there is at present no UK legislation specifically regulating FRT, regulatory codes such as the Surveillance Camera Commissioner’s “Surveillance Camera Code of Practice” and the Information Commissioner’s Office’s (ICO) “CCTV Code of Practice” should be considered when implementing FRT.

Recent cases of interest

Two novel cases in relation to FRT have recently emerged in Europe.

Use of FRT in a trial to monitor pupils’ attendance at a Swedish school

The Swedish data protection authority, Datainspektionen, has issued a secondary school with a warning and a fine equating to around £16,700.

The school conducted a voluntary trial, lasting three weeks and involving 22 pupils. The pupils were filmed by a camera when they entered the classroom, and those images were then matched against photos of the participating pupils. Participation was voluntary and explicit consent was obtained from the pupils’ guardians. Those who did not take part were not monitored and their attendance was checked by the usual manual process. The FRT was deployed to improve the accuracy of absence record-keeping, which was an important public task.

Datainspektionen concluded as follows:

  • Legal basis of processing. Datainspektionen held that the pupils’ dependence on the school in terms of grades, funding and future study opportunities meant that consent could not be freely given. Interestingly, consent here was given by an adult guardian who is designated by law to make “adult” decisions on behalf of the child. Datainspektionen held that consent given in the school environment could still be valid in relation to taking photos to demonstrate the pupils’ participation in school activities. However, keeping absence records was so important that the pupil was in a “state of dependence that results in significant inequality”, and consent could therefore not be relied upon.
  • Alternative lawful bases. In the alternative, the school relied on the lawful bases of complying with a legal obligation and carrying out a public task. The school had a legal obligation to carry out effective “case handling” and to report any absences to the guardian and the central funding board. Under the relevant Swedish law, “case handling” was not defined, but commentary on the law suggested that, unlike the manual process, FRT record-keeping would not qualify as “case handling” because of its automated nature. Another nuance under Swedish law is that the ground under Article 9(2)(g) GDPR, which permits the processing of special category personal data in connection with a substantial public interest, applies only to data “provided to the authority”, and processing is only permitted where it does not result in “undue intrusion”, such as by creating “privacy sensitive compilations of sensitive personal data”.[1] On this basis, Datainspektionen held that, without any specific law mandating the processing of special categories of data for this purpose, the deployment of FRT was not warranted.
  • Proportionality and data minimisation. Datainspektionen held that the use of FRT was disproportionate because less intrusive means were available and the processing constituted “a great infringement of students’ integrity”. It followed that the data minimisation principle was also breached.
  • Data Protection Impact Assessment. In order to assess the risks involved, a data protection impact assessment should have been carried out. However, the school only carried out a limited assessment (or rather recorded a written statement) in relation to the lawful basis, public task and security, highlighting the benefits of FRT. The requisite assessment of proportionality was therefore missing.

Use of FRT by UK police

In the UK, a recent test case[2] backed by the privacy group Liberty challenged the use of FRT by the police. According to counsel, this was the first time anywhere in the world that the use of FRT had been considered by a court. The Article 8 and data protection challenge was dismissed in favour of the police.

The police trialled the FRT at sporting events such as football and rugby matches. Digital images of the faces of members of the public were taken from live CCTV feeds (at a rate of up to 50 faces per second) and processed in real time to extract facial biometric information. That information was then compared with the facial biometric information of individuals on a watchlist prepared for the specific deployment, based on general policing and custody records. If a match was found, the police officer operating the system would verify it; this human intervention was considered an important safeguard by the court. The intervening officers would also form their own judgment as to whether the individual identified was indeed a subject of interest. If no match was made, the data was immediately deleted from the system. Match data was retained for up to 24 hours, and the underlying CCTV feed was retained for 31 days.
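By way of a hypothetical sketch only (the function and field names below are invented for illustration, not taken from the judgment), the retention logic described above might be expressed along these lines:

```python
from datetime import datetime, timedelta

MATCH_RETENTION = timedelta(hours=24)  # match data kept for up to 24 hours
FEED_RETENTION = timedelta(days=31)    # underlying CCTV feed kept for 31 days

def screen_face(biometric_record, watchlist_templates, is_match, match_store, now: datetime):
    """Compare one extracted biometric record against the deployment
    watchlist (is_match is the caller-supplied comparison function).
    Possible matches are stored pending verification by the operating
    officer; non-matches are not retained at all."""
    for candidate in watchlist_templates:
        if is_match(biometric_record, candidate):
            match_store.append({"record": biometric_record,
                                "candidate": candidate,
                                "seen_at": now,
                                "verified_by_officer": False})
            return
    # No match: the biometric record is simply discarded, never stored.

def purge_expired(match_store, now: datetime):
    """Delete match records once the 24-hour retention period has passed."""
    match_store[:] = [m for m in match_store
                      if now - m["seen_at"] <= MATCH_RETENTION]
```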

The court observed as follows:

  • Interference with the right to privacy. The technology interfered with the Article 8 right to privacy, as the underlying biometric data is “intrinsically private” and allowed the police to single out individuals. The court rejected the argument that the police were not processing the data of members of the public (whose data was “manifestly made public”) but only that of people on the watchlist. The speedy deletion of unmatched data was of no consequence.
  • Adequate legal framework. The interference with the fundamental right was in accordance with the law. The claimant argued that “there is no sufficient legal framework for the use of [FRT] … such that its use lacks the necessary qualities of foreseeability, predictability, and hence of legality”. However, the court was satisfied that the police’s common law powers, the Data Protection Act 2018, regulatory codes and the police’s own policy documents created a sufficient legal framework for the purposes of the ECHR. The court concluded that the UK’s legal regime is adequate “to ensure the appropriate and non-arbitrary use of [FRT] … consistent with the requirements of the Human Rights Act, and the data protection legislation”.
  • Proportionality is key. The deployment of FRT was proportionate against the backdrop of transparency achieved through Facebook and Twitter as well as offline means, deployment for a legitimate and limited purpose (identifying individuals whose presence was of justifiable interest), human intervention, limited data retention and the lack of complaints. Proportionality was judged against the four-part test in Mellat.[3] The court agreed that the use of CCTV alone would not achieve the public safety and crime detection purposes pursued by the police, and that the processing was “strictly necessary”. The watchlists were sufficiently “targeted”. The court also acknowledged that FRT helps to save resources.
  • Equality Act claim. It could not be determined whether the system had the discriminatory impact alleged by the claimant, but given the police’s initial equality assessment and their ongoing review of the system, the claim under the Equality Act 2010 failed.
  • Appropriate policy document. The processing was “necessary for the performance of a task carried out for” law enforcement purposes under section 35 of the Data Protection Act 2018, and the police had an “appropriate policy document” in place. Although the court considered the document “brief and lacking in detail”, it left it to the police to follow any further guidance from the ICO.
  • Data Protection Impact Assessment. The privacy impact assessment carried out by the police addressed concerns about a breach of Article 8 of the ECHR.
  • Future use of FRT. On the claimant’s contention that any future use would be unlawful, the court considered that “there is no systemic or clear ‘proportionality deficit’ such that it can be said that future use of [FRT] … would be inevitably disproportionate” and that “Questions of proportionality are generally fact sensitive”.

This case highlights the wide range of issues that need to be considered before deploying FRT. Even though the case concerns law enforcement, its conclusions can serve as a point of reference for the commercial sector.

The above-mentioned cases highlight the need for planning prior to deploying FRT. Matters are further complicated by the harmonisation issues raised by the Swedish case, which turned largely on points of local law. Surely explicit consent given by a guardian could be relied on in relation to a voluntary trial? We can only hope that future cases will uniformly shape the concept of “proportionality” in the context of FRT, which will be essential in setting the boundaries between this “new technology” and the right to privacy. We await with interest the outcome of the recently announced investigation into the King’s Cross monitoring by the ICO, which has confirmed that “Facial recognition technology is a priority area”.[4]

Despite the media headlines in relation to the use of FRT, the existing legal regime in Europe is designed to put individuals first and protect their interests. It would be in breach of the law to use FRT for obscure purposes, without transparency and without putting adequate safeguards in place (and it is very unlikely that this would go unnoticed or unremedied). The dystopian use of FRT is, for the time being, not on the horizon in Europe. Of course, transparency will play a key role in gaining the general trust and acceptance that is essential to roll out FRT. It will ultimately come down to people’s trust in the law, regulatory supervision and their legal remedies, and the Liberty court case and the developments in San Francisco[5] certainly show people’s scepticism towards FRT. However, whilst there is scope for the unlawful deployment of FRT, it is reassuring that the current law is “adequate”.

Moreover, according to the Financial Times, the incoming president of the European Commission, Ursula von der Leyen, will be focusing on legislation to provide a “co-ordinated European approach on the human and ethical implications of artificial intelligence” to “foster public trust and acceptance” in facial recognition.6 However, one could question whether there is any need for further restrictions on the use of FRT. As the UK’s Biometrics Commissioner has stated, we must make a “strategic choice” about “the future world we want to live in”.7

 

1 A 2002 film directed by Steven Spielberg, based on the Philip K. Dick short story of the same name.

2 Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA.

[1] Datainspektionen, Supervision in accordance with the EU Data Protection Regulation 2016/679 – facial recognition for attendance control of students, Skellefteå Municipality, High School Board, 2019-08-20, page 3. Link.

[2] R (Bridges) v Chief Constable of South Wales police and Others [2019] EWHC 2341 (Admin). Link.

[3] Bank Mellat v Her Majesty’s Treasury (No 2) [2014] AC 700. Link.

[4] Statement: Live facial recognition technology in King’s Cross, Information Commissioner’s Office, 15 August 2019. Link.

[5] California could soon ban facial recognition technology on police body cameras, Los Angeles Times, 12 September 2019. Link.

Alex Dittel is a data protection & privacy senior associate.
