A guide to GDPR profiling and automated decision-making
Organisations need to prepare for the new provisions on profiling, such as informing individuals in their privacy notices. EU DPAs are now seeking comments on their profiling guidelines. By Nicola Fulford and Krysia Oastler of Kemp Little LLP.
The Article 29 Working Party issued on 3 October 2017 its draft “Guidelines on Automated individual decision-making and Profiling”1 under the GDPR (referred to as “the guidelines” in this article).
The guidelines provide some clarity around what is a puzzling aspect of the GDPR for many. Profiling involving personal data is already part of the day-to-day processing undertaken by many organisations, and existing profiling activities involving personal data are already subject to data protection law. The GDPR is an evolution (not a revolution) of the law, so why is there so much attention on its impact on profiling?
There are three reasons. Firstly, the GDPR includes a definition of profiling which is new and very broadly drawn, so guidance is needed on which activities are caught. Secondly, references to profiling throughout the text of the GDPR suggest that profiling is, in and of itself, a risky activity, and organisations need to determine what this means for their profiling activities. Thirdly, there are several specific rules on solely automated decision-making which reference profiling, and the text of the GDPR does not make the scope of these rules clear.
In this article, we explore what is meant by profiling, automated decision-making and solely automated decision-making under the GDPR and consider how to navigate the rules applying to these activities.
What is profiling?
Profiling consists of three aspects:
- Automated processing (processing using computers);
- of personal data2
- with the aim of evaluating personal aspects relating to a person or group of people (including analysis or prediction)3.
The guidelines make it clear that the definition is very broad and that the processing does not need to involve inference to be caught – “simply assessing or classifying individuals based on characteristics such as their age, sex, and height could be considered profiling, regardless of any predictive purpose”4.
The guidelines describe profiling as having three distinct stages, each of which falls within the GDPR definition of profiling: (1) data collection; (2) automated analysis to identify correlations; and (3) applying the correlation to an individual to identify characteristics of present or future behaviour5.
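As a concrete (and purely hypothetical) illustration, the three stages can be sketched as a toy pipeline in Python; the records, field names and the age-band correlation below are invented for illustration and do not come from the guidelines:

```python
# A toy illustration of the three profiling stages described in the
# guidelines. All data, field names and rules are hypothetical.

def collect(records):
    """Stage 1: data collection (here, pre-assembled records)."""
    return list(records)

def analyse(records):
    """Stage 2: automated analysis to identify a correlation.
    Here: average spend per ten-year age band (a made-up correlation)."""
    bands = {}
    for r in records:
        band = r["age"] // 10 * 10          # e.g. age 34 -> the 30s band
        bands.setdefault(band, []).append(r["spend"])
    return {band: sum(v) / len(v) for band, v in bands.items()}

def apply_profile(individual, model):
    """Stage 3: apply the correlation to an individual to infer
    a characteristic (here, expected spend)."""
    return model.get(individual["age"] // 10 * 10)

records = [
    {"age": 34, "spend": 120.0},
    {"age": 37, "spend": 80.0},
    {"age": 52, "spend": 200.0},
]
model = analyse(collect(records))
prediction = apply_profile({"age": 31}, model)  # inferred spend for a 31-year-old
```

Note that even this trivial classification by age band would fall within the broad definition, regardless of any predictive purpose.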
Examples of profiling include:
- Collection and analysis of data to gain insights into behaviours and characteristics (the guidelines include an example of a data broker collecting data from different public and private sources, compiling the data to develop profiles on the individuals, placing the individuals into segments and selling the output information to companies who wish to improve the targeting of their goods and services6);
- Keeping a record of traffic violations to monitor driving habits of individuals over time to identify repeat offenders (which may have an impact on the sanction)7; and
- Considering an individual’s credit score before granting a mortgage8.
What is meant by solely automated decision-making?
A decision based solely on automated processing is a decision with no human involvement in the decision process9. The guidelines warn that involving a human in the process merely to circumvent the rules on solely automated decision-making would not work: the human involvement must be meaningful, not just a token gesture, and the human reviewer must have the authority to change the decision, considering all the information available10.
Decisions that have a legal effect are those that impact on an individual’s legal rights (including in contract). Examples given in the guidelines include:
- entitlement or denial of a social benefit granted by law, such as child or housing benefit;
- increased surveillance by competent authorities; or
- being automatically disconnected from a mobile phone service because an individual forgot to pay his/her bill before going on holiday.
A decision that has a similarly significant effect “must have the potential to significantly influence the circumstances, behaviour or choices of the individuals concerned. At its most extreme, the decision may lead to the exclusion or discrimination of individuals.”11 The examples given in the GDPR are automatic refusal of an online credit application and e-recruiting practices without any human intervention. The guidelines explain that although online advertising will not generally meet the threshold of having a similarly significant effect, it may do so depending on the intrusiveness of the profiling, the expectations and wishes of the individuals, the way the advert is delivered and the vulnerabilities of the individuals concerned. An example given is an advert for risky financial products targeted at vulnerable individuals.
What should organisations be doing now?
Take stock of profiling activities and any automated decision-making: It will be impossible to comply with GDPR requirements without first identifying the profiling activities and automated decisions taken by the organisation. Organisations are likely to find it helpful to think about the three stages of profiling12 to help identify profiling activities.
Where automated decisions are identified, assess whether they are solely automated and, if so, if they may produce a legal or similarly significant effect on individuals. Organisations should document their analysis as part of GDPR accountability requirements.
Comply with the data protection principles: Identify an appropriate legal basis for each of your profiling activities and automated decisions. Ensure your activities comply with the data protection principles13.
Tell people about your profiling activities and automated decisions: Organisations need to provide information about profiling and automated decision-making in their privacy notices14. The rights to object and, where consent is the legal basis for processing, the right to withdraw consent must be explicitly brought to the attention of individuals and presented clearly and separately from other information. [See section below for specific requirements for Article 22 solely automated decisions that have a legal or similarly significant effect (“Article 22 decisions”).]
Have processes to deal with individuals’ rights in relation to profiling and automated decision-making: Organisations need to have processes in place to deal with requests from individuals exercising their rights. Consider the right of access and what information individuals will be entitled to receive a copy of.
Individuals have an absolute right to object to direct marketing, including profiling related to direct marketing. Organisations will need a clear view of any profiling related to direct marketing in order to fulfil this absolute right. Individuals also have a right to object to processing of personal data that is necessary for the purposes of the legitimate interests pursued by the controller; such objections will likely need to be considered by the controller on a case-by-case basis.
Special considerations for article 22 decisions: There is debate about whether Article 22 is a prohibition (meaning organisations cannot take Article 22 decisions unless one of the exemptions applies) or just a right for individuals not to be subject to Article 22 decisions (meaning individuals only have the right to object to such decisions).
The guidelines clearly state that the controller can only carry out the processing if one of the three exceptions covered in Article 22(2) applies15. Read as a prohibition, organisations are only permitted to take Article 22 decisions where:
- the decision is necessary for entering into, or performance of, a contract between the individual and the controller;
- the decision is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests; or
- the decision is based on the individual’s explicit consent.
In the first and third cases, the controller must also have implemented suitable measures to safeguard the individual’s rights and freedoms and legitimate interests (including, at a minimum, a means for the individual to obtain human intervention, express his or her point of view and contest the decision).
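One way to think about these rules is as a simple permissibility check. The sketch below (a Python illustration with invented parameter names, not a substitute for legal analysis) encodes the logic of the exceptions and the accompanying safeguards requirement as described above:

```python
# Hypothetical helper encoding the Article 22 logic described above.
# Parameter names are invented for illustration; this is not legal advice.

def article_22_decision_permitted(
    necessary_for_contract: bool,
    authorised_by_law: bool,       # the authorising law must itself lay down safeguards
    explicit_consent: bool,
    safeguards_in_place: bool,     # human intervention, right to contest, etc.
) -> bool:
    """Return True if a solely automated decision with legal or similarly
    significant effect may be taken, on the 'prohibition' reading."""
    if authorised_by_law:
        # Safeguards are provided by the authorising law itself.
        return True
    if necessary_for_contract or explicit_consent:
        # The controller must implement its own suitable safeguards.
        return safeguards_in_place
    return False  # read as a prohibition: no exception applies
```

For example, explicit consent without suitable safeguards in place would not be enough: `article_22_decision_permitted(False, False, True, False)` returns `False`.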
Note that Article 22 decisions must not be based on special categories of personal data unless the controller has the explicit consent of the individual or the automated decision-making is necessary for reasons of substantial public interest and suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests are in place.
Regardless of the distinction, when taking Article 22 decisions, organisations must implement documented processes to ensure that:
- the decisions are lawful;
- information about the profiling and the Article 22 decisions is easily accessible for individuals and brought to their attention (which includes the rationale behind or the criteria relied on in reaching the decision and the consequences for the individual with tangible examples);
- details of Article 22 decisions are provided in response to data subject access requests, including meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the individual;
- suitable measures to safeguard individuals’ rights, freedoms and legitimate interests (including as a minimum, a way for the individuals to obtain human intervention, express their point of view, obtain an explanation of the decision reached and/or contest the decision) are implemented.
Potential differences across EU member states
Member States have discretion to introduce legislation to restrict individuals’ rights and controllers’ obligations regarding Article 22 decisions. In the UK, Section 13 of the Data Protection Bill sets out safeguards in relation to the Member State derogation provisions on automated decision-making and introduces the concept of a “qualifying significant decision”. This is an automated decision that produces legal effects or significantly affects the data subject, is required or authorised by law, and is not exempt by virtue of being necessary for performance of a contract or made with the data subject’s explicit consent.
Where a qualifying significant decision exists, the automated decision-making will be exempt from the Article 22 prohibition, subject to the controller, as soon as reasonably practicable, notifying the data subject in writing that the automated decision has been made. The individual has 21 days to ask the controller to reconsider the decision or take a new decision that is not based solely on automated processing. If such a request is submitted, then the controller has a further 21 days to comply with the request and inform the data subject of the steps it has taken to comply along with the outcome.
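The two 21-day windows described above are simple date arithmetic; the sketch below (Python, assuming a calendar-day reading of “days”, which the Bill does not spell out here) illustrates the timeline with an arbitrary example date:

```python
# Illustrative timeline for the UK Data Protection Bill's two 21-day
# windows. The calendar-day interpretation is an assumption.
from datetime import date, timedelta

REQUEST_WINDOW = timedelta(days=21)   # data subject's window to request reconsideration
RESPONSE_WINDOW = timedelta(days=21)  # controller's window to comply and respond

def reconsideration_deadline(notification_date: date) -> date:
    """Last day for the data subject to ask the controller to reconsider
    (or retake) the automated decision."""
    return notification_date + REQUEST_WINDOW

def controller_response_deadline(request_date: date) -> date:
    """Last day for the controller to comply with the request and inform
    the data subject of the steps taken and the outcome."""
    return request_date + RESPONSE_WINDOW

notified = date(2018, 5, 25)                       # hypothetical notification date
ask_by = reconsideration_deadline(notified)        # -> 2018-06-15
respond_by = controller_response_deadline(ask_by)  # -> 2018-07-06
```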
It is unclear how an automated decision “authorised by law” will be interpreted in each country. It is also unclear whether including details of the automated decision in a privacy notice would satisfy the obligation to notify the individual or whether, more likely, notification should be interpreted as an additional requirement.
There is potential for the additional exemptions to create confusion for organisations that are seeking to implement a workable mechanism which can be consistently applied, as the requirements are different from the requirements for the other exemptions (performance of a contract or if explicit consent is obtained). This is an area for organisations to keep under review.
Looking to the future – DPIAs?
Data protection impact assessments (DPIAs) are mandatory in certain circumstances under the GDPR16. A DPIA is required in the case of Article 22 decisions. Organisations need a process to identify whether they are required to perform a DPIA on future profiling and automated decision-making. Even where a DPIA is not legally required, a DPIA should be considered as a good practice tool and a way of demonstrating that profiling/automated decision-making complies with the GDPR.
Something to say?
The Article 29 Working Party has requested comments on the profiling guidelines (and separately the data breach notification guidelines) be submitted by 28 November 2017, so time remains in which to submit your views17. The aim is for the guidelines to be finalised by the end of the year.
This article was first published in Privacy Laws & Business UK Report, November 2017, www.privacylaws.com.
1 Available on the Article 29 Working Party website: ec.europa.eu/newsroom/just/document.cfm?doc_id=47963
2 Defined in Article 4(1) of the GDPR as “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person”.
3 Profiling is defined in Article 4(4) of the GDPR as “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements”.
4 Page 7 of the guidelines
5 See reference above
6 See reference above
7 Page 8 of the guidelines
8 See reference above
9 The guidelines illustrate the difference between decision-making based on profiling and solely automated decisions using examples. An example of decision-making based on profiling is where a human decides whether to agree the loan based on a profile produced by purely automated means. An example of a solely automated decision (including profiling) is where an algorithm decides whether the loan is agreed and the decision is automatically delivered to the individual, without any meaningful human input.
10 Page 10 of the guidelines
11 Page 11 of the guidelines
12 Data collection, automated analysis to identify correlations and applying the correlation to an individual to identify characteristics of present or future behaviour
13 Set out in Article 5 of the GDPR
14 See Articles 13 and 14 of the GDPR
15 Page 15 of the guidelines
16 See Article 35 of the GDPR
17 Details on how to provide comments are available on the Article 29 Working Party website ec.europa.eu/newsroom/just/