On 29 January 2021, the Kemp Little team joined Deloitte Legal.

As of 30 January 2021, Kemp Little LLP ceased to operate as a firm of solicitors and practice law and ceased to be regulated and authorised by the Solicitors Regulation Authority.

Kemp Little LLP has been re-named KL Heritage LLP.

If you are looking to contact a specific individual to seek legal advice or in respect of any other business relationship, please contact Deloitte Legal.

If you are seeking to contact the old Kemp Little LLP in relation to a previous business relationship or matter, please get in touch with KL Heritage LLP.

For enquiries relating to Kemp Little technology products and training portal, please email deloittelegal@deloitte.co.uk
Commercial technology · Data protection & privacy · 2 May 2019 · Hayley Davis · Agatha Claridge

Health apps and your personal data – are they doing more harm than good?

Advances in digital health products, and increased accessibility to them, are changing the way consumers monitor their health and access medical services. Consumers can now use health apps to self-diagnose, access services, check symptoms and research medication. For example:

  • following FDA approval in the US, recent CE marking approval means that Apple Watch users across 19 countries in Europe can now use the device to take electrocardiogram (ECG) readings measuring their heart health;
  • apps such as Push Doctor and Babylon mean that patients can now access, book and attend GP appointments using an app;
  • chatbots can dispense medical information to thousands of people at once. For example, “Molly” assesses a patient’s symptoms using speech, text, images and video, and “Florence” acts as a personal nurse, tracking the user’s health and reminding them to take their medication.

The use of health apps is quick, convenient and usually free, and the global market for mobile health apps is forecast to be worth US$102.43 billion by 2022[1]. However, the major emerging theme of the health app revolution is the sharing of vast quantities of data, and a new study by researchers from Canada, the United States and Australia suggests that many companies are not transparent enough about what they do with the personal data collected when you use these apps.

The analysis

The research team selected 24 of the most popular health-related apps in the UK, US, Canada and Australia available on the Android platform via Google Play, including symptom-checker Ada, and reference guides Medscape and Drugs.Com.

The research team created imaginary users who interacted with the apps, and conducted traffic analysis. They then updated the information for one of the imaginary users and analysed the resulting traffic to identify privacy leaks.

Content and network analysis was then performed to map the data flows from the app, including identifying any third (and fourth) parties with whom personal data were shared, and categorising the relationships between the various parties.
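The kind of analysis described above can be sketched in a few lines: inspect each captured request, separate first-party from third-party destinations, and flag any third party that received personal-data fields. This is an illustrative sketch only; the app name, domains and field names are all hypothetical, and real studies use interception proxies and far richer classification.

```python
from urllib.parse import urlparse

# Hypothetical captured traffic: (request URL, payload field names),
# as might be exported from an interception proxy.
CAPTURED = [
    ("https://api.exampleapp.com/v1/profile", {"email", "dob"}),
    ("https://tracker.adnetwork.example/collect", {"device_id", "location"}),
    ("https://analytics.example/ping", {"device_name"}),
]

FIRST_PARTY = "exampleapp.com"
PERSONAL_FIELDS = {"email", "dob", "device_id", "device_name", "location"}

def find_leaks(captured, first_party):
    """Return third-party hosts mapped to the personal-data fields they received."""
    leaks = {}
    for url, fields in captured:
        host = urlparse(url).hostname
        if host is None or host.endswith(first_party):
            continue  # first-party traffic is expected, not a leak
        sent = fields & PERSONAL_FIELDS
        if sent:
            leaks.setdefault(host, set()).update(sent)
    return leaks

print(find_leaks(CAPTURED, FIRST_PARTY))
```

Mapping the fourth-party network is the same exercise repeated one hop further, against traffic leaving the third parties themselves.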

The staggering results

The study showed that 79% of the apps tested share your personal data with third parties, particularly data such as browsing behaviour, device name, email address and approximate or precise location. A total of 55 new third parties were identified as directly receiving or processing the data, which included app developers, parent or other group companies and other unconnected third parties. The fourth party network, a further layer of companies to whom data is sent by the third parties, totalled 237 entities. This fourth party network comprised entities even further removed from the health and medical sphere, such as credit reference agencies.

Many of the companies behind the apps reserved the right to collect "deidentified" and aggregated data from your use of their apps for their own commercial purposes, such as selling it on to pharmaceutical companies, health insurers and other health-related services.

The researchers found that it is nearly impossible to opt out of this data sharing.

Legal issues

Although a lack of transparency when processing any of your personal data is always a cause for concern, personal data relating to health is defined as a special category of personal data under the GDPR and therefore subject to more stringent requirements.

A company must have a lawful basis for processing any of your personal data, but some of the bases most commonly relied upon (such as legitimate interests) cannot be used for processing personal data relating to health. Processing personal data relating to an individual’s health is prohibited unless one of the specified justifications is met, for example having the “explicit consent” of the data subject. The requirements for “explicit consent” are relatively high, and the main problem is whether individuals are truly being armed with sufficient information to make an informed decision about how their personal data is used.

Do they really know what they are consenting to?

Anonymised or pseudonymised?

Privacy policies are the key to communicating what, how and why personal data are processed and shared. As mentioned above, the apps tested and the relevant third party entities often rely on “deidentifying” personal data in order to share it. The privacy policy for one of the tested apps frequently refers to “aggregated and/or anonymised” data, but what does this really mean?

  • “Anonymised data” is data that cannot possibly be traced back to any identifiable individual, and therefore is not caught by the requirements of the GDPR.
  • “Pseudonymised data” is different: although it may appear anonymised at first, if it can be reversed or otherwise matched to an individual using a separately held set of data or information, it is pseudonymised, not anonymised, and remains subject to the GDPR.

Therefore, true anonymisation can be difficult to achieve, and data which may in isolation seem irrelevant or low-risk can help to build an incredibly detailed profile of one individual when combined with other data sets held by third parties.
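The distinction can be made concrete with a small sketch. Replacing an identifier with a salted hash makes a record *look* anonymous, but anyone holding the salt (or a lookup table of hashes) can re-link it to the individual, which is exactly why such data is pseudonymised rather than anonymised. The identifiers and salt below are purely illustrative.

```python
import hashlib

def pseudonymise(user_id: str, secret_salt: str) -> str:
    """Replace an identifier with a salted hash. The output looks
    anonymous, but the transformation is repeatable by anyone who
    holds the salt -- pseudonymisation, not anonymisation."""
    return hashlib.sha256((secret_salt + user_id).encode()).hexdigest()

salt = "separately-held-secret"
record = {"user": pseudonymise("alice@example.com", salt), "heart_rate": 72}

# Re-identification: whoever holds the salt can rebuild the mapping
# from hashes back to individuals.
lookup = {pseudonymise(uid, salt): uid
          for uid in ["alice@example.com", "bob@example.com"]}
print(lookup[record["user"]])
```

The record on its own reveals nothing, but combined with the separately held salt it identifies the individual exactly, which is the combination-of-data-sets risk described above.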

GDPR principles

Companies must also apply the principles of purpose limitation and data minimisation to any processing of personal data. This means that personal data should only be collected for “specified, explicit and legitimate purposes”, and should be “adequate, relevant and limited to what is necessary”.
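One simple engineering expression of these principles is a per-purpose allow-list: fields are only retained if the stated purpose actually needs them. The purposes and field names below are hypothetical, but the pattern itself is a common way to enforce data minimisation at the point of collection.

```python
# Purpose-based allow-list: each purpose names the only fields it may use.
ALLOWED_FIELDS = {
    "symptom_check": {"age_range", "symptoms"},
    "appointment_booking": {"name", "email", "preferred_time"},
}

def minimise(record: dict, purpose: str) -> dict:
    """Keep only the fields necessary for the stated purpose."""
    allowed = ALLOWED_FIELDS[purpose]
    return {k: v for k, v in record.items() if k in allowed}

raw = {
    "age_range": "30-39",
    "symptoms": ["headache"],
    "location": "51.5,-0.1",   # not needed for a symptom check
    "device_id": "abc123",     # not needed for a symptom check
}
print(minimise(raw, "symptom_check"))
```

Discarding `location` and `device_id` here also shrinks the cache of personal data at risk if the provider is later breached.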

Collecting and processing personal data outside of these parameters exposes both data subjects and the companies that collect their data to risk: the bigger the cache of personal data that is held, the bigger the potential damage could be in the event of exposure to a cyber attack or data breach.

Health app data breaches

Data breaches are becoming much more frequent, and in a 2018 report[2] health-related organisations accounted for a quarter of the incidents analysed.

2018 saw the first known data breach of a major consumer fitness app, when Under Armour, the company that owns diet and fitness tracking app MyFitnessPal, suffered a significant breach in which personal data relating to 150 million separate accounts was compromised by hackers and sold on the dark web.

Poor data protection practices and weak security design can leave your personal data exposed. In 2018 PumpUp, a popular fitness app, left a backend server unprotected on Amazon’s cloud, which allowed anyone who had the IP address of the server to see who was signing on, and the contents of the private messages being sent between users in real time.

The journey so far – a Code of Conduct for mobile health data?

In 2014 the European Commission published a Green Paper consultation into mobile health, which revealed that consumers had typically been slower and more cautious in adopting mobile and tech developments in health than in other areas of their lives due to privacy concerns.

In April 2015, a group of industry stakeholders including Apple, Google and Microsoft began the process of creating a Privacy Code of Conduct for mobile health apps.

In December 2017, the first draft was formally submitted to the then Article 29 Working Party (now the European Data Protection Board). However, the Working Party decided that the imminence of the GDPR coming into force coupled with the fact that the draft did not adequately address all of the GDPR’s requirements meant that it could not approve the Code and so it remains in draft form.

What can you do about your data?

For the majority of consumers, the benefits afforded by mobile health apps far outweigh their privacy concerns. However, as consumers become more data savvy and the digital health sector continues to grow at pace, these concerns will need to be addressed.

You should be extra vigilant with your use of health apps and the data that you are inputting. It is usually possible to better protect your data, but this isn’t easy and often requires a lot of investigation and work.

What should app providers do?

With consumer awareness increasing and a wide variety of health apps available, providers that lead the way in protecting consumers’ data are likely to prove more popular. Health app providers should therefore follow these steps to show that they use personal data considerately and securely:

  1. The GDPR has led to a heightened awareness by consumers over their data, so you should continually analyse any data you collect and ensure it complies with the principles relating to the processing of personal data and your legal requirements. You must also identify any special categories of personal data being processed and ensure the additional legal requirements for these are met. You should ensure that all considerations and key decisions are well documented.
  2. With the rise in value of consumer data, cyber breaches are likely to continue to become more prevalent, so ensure security measures are appropriate for the types of data you process.
  3. Ensure data is anonymised where possible, and that any data treated as anonymised is truly anonymised.
  4. Following reports from studies into the use of data in health apps, consumers will value transparency, so ensure privacy policies are kept up to date and drafted in a way that allows consumers to really understand and control what is happening with their data.
  5. Keep an eye on developments in relation to the Privacy Code of Conduct for mHealth apps.

[1] Deep Knowledge Ventures, Health Tech Mobile Apps Analytical Report, http://analytics.dkv.global/data/pdf/Health-Tech-Mobile-Apps-Analytical-Report.pdf

[2] Verizon, Data Breach Investigations Report, released 10 April 2018.
