Health apps and your personal data – are they doing more harm than good?
Advances in digital health products, and increased accessibility to them, are changing the way consumers monitor their health and access medical services. Consumers can now use health apps to self-diagnose, access services, check symptoms and research medication. For example:
- following FDA approval in the US, recent CE marking approval means that Apple Watch users across 19 countries in Europe can now use the device to take electrocardiogram (ECG) readings measuring their heart health;
- apps such as Push Doctor and Babylon mean that patients can now access, book and attend GP appointments using an app;
- chatbots can dispense medical information to thousands of people at once. For example, “Molly” assesses a patient’s symptoms using speech, text, images and video, and “Florence” acts as a personal nurse, tracking the user’s health and reminding them to take their medication.
The use of health apps is quick, convenient and usually free, so the global market for mobile health apps is expected to grow rapidly, with forecasts predicting it will be worth US$102.43 billion by 2022. However, the major emerging theme of the health app revolution is the sharing of vast quantities of data, and a new study by researchers from Canada, the United States and Australia suggests that many companies are not transparent enough about what they do with the personal data collected when you use these apps.
The research team selected 24 of the most popular health-related apps in the UK, US, Canada and Australia available on the Android platform via Google Play, including symptom-checker Ada, and reference guides Medscape and Drugs.Com.
The research team created imaginary users, interacted with the apps on their behalf and analysed the resulting network traffic. They then updated the information for one of the imaginary users and analysed the new traffic to identify privacy leaks.
Content and network analysis was then performed to map the data flows from the app, including identifying any third (and fourth) parties with whom personal data were shared, and categorising the relationships between the various parties.
The staggering results
The study showed that 79% of the apps tested share your personal data with third parties, particularly data such as browsing behaviour, device name, email address and approximate or precise location. A total of 55 new third parties were identified as directly receiving or processing the data, which included app developers, parent or other group companies and other unconnected third parties. The fourth party network, a further layer of companies to whom data is sent by the third parties, totalled 237 entities. This fourth party network comprised entities even further removed from the health and medical sphere, such as credit reference agencies.
Many of the companies behind the apps reserved the right to collect “deidentified” and aggregated data from your use of their apps for their own commercialisation, such as selling on to pharmaceutical companies, health insurers and other health-related services.
The researchers found that it is nearly impossible to opt out of this data sharing.
Although a lack of transparency when processing any of your personal data is always a cause for concern, personal data relating to health is defined as a special category of personal data under the GDPR and therefore subject to more stringent requirements.
A company must have a lawful basis for processing any of your personal data, but some of the bases most commonly relied upon (such as legitimate interests) cannot be used for processing personal data relating to health. Processing personal data relating to an individual’s health is prohibited unless one of the specified justifications is met, for example having the “explicit consent” of the data subject. The bar for “explicit consent” is relatively high, and the central question is whether individuals are truly being given enough information to make an informed decision about how their personal data is used.
Do they really know what they are consenting to?
Anonymised or pseudonymised?
- “Anonymised data” is data that cannot be traced back to any identifiable individual by any means, and is therefore outside the scope of the GDPR.
- “Pseudonymised data” is different: data may appear anonymised at first, but if it can be reversed or otherwise matched to an individual using a separately held set of data or information, it is pseudonymised, not anonymised, and remains subject to the GDPR.
Therefore, true anonymisation can be difficult to achieve, and data which may in isolation seem irrelevant or low-risk can help to build an incredibly detailed profile of one individual when combined with other data sets held by third parties.
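The distinction can be sketched in code. The example below is a hypothetical illustration (not drawn from the study): it shows why simply hashing an identifier such as an email address only pseudonymises it, because anyone holding a list of candidate identifiers can rebuild the mapping and re-identify users.

```python
import hashlib

def pseudonymise(email: str) -> str:
    """Replace an email address with a SHA-256 digest.

    This is pseudonymisation, not anonymisation: the mapping is
    deterministic, so anyone who holds (or can guess) the original
    identifiers can reverse it.
    """
    return hashlib.sha256(email.lower().encode()).hexdigest()

# A data holder releases "anonymised" records keyed by the digest.
released = {pseudonymise("alice@example.com"): {"steps": 9200}}

# A third party with its own contact list can re-link the records
# by hashing every address it already knows.
contact_list = ["bob@example.com", "alice@example.com"]
lookup = {pseudonymise(e): e for e in contact_list}

for key, record in released.items():
    if key in lookup:
        print(lookup[key], record)  # the individual is re-identified
```

The same re-linking works with any deterministic transformation, which is why regulators treat reversibility, rather than surface appearance, as the test for anonymisation.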
Companies must also apply the principles of purpose limitation and data minimisation to any processing of personal data. This means that personal data should only be collected for “specified, explicit and legitimate purposes”, and should be “adequate, relevant and limited to what is necessary”.
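In practice, data minimisation can be enforced at the point of collection. The sketch below is a hypothetical example (the field names and the medication-reminder purpose are assumptions, not taken from any of the apps studied): it drops every submitted field that is not on an explicit allowlist tied to the declared purpose.

```python
# Fields genuinely needed for the declared purpose (medication reminders).
ALLOWED_FIELDS = {"user_id", "medication", "dose_time"}

def minimise(record: dict) -> dict:
    """Keep only the fields required for the declared purpose."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

submitted = {
    "user_id": "u-123",
    "medication": "metformin",
    "dose_time": "08:00",
    "precise_location": "51.5074,-0.1278",  # not needed: discarded
    "contacts": ["alice", "bob"],           # not needed: discarded
}

stored = minimise(submitted)
print(stored)  # only the three allowed fields survive
```

An allowlist of this kind also serves the documentation point above: the list itself is a record of which fields were judged necessary, and for what purpose.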
Collecting and processing personal data outside of these parameters exposes both data subjects and the companies that collect their data to risk: the bigger the cache of personal data that is held, the bigger the potential damage could be in the event of exposure to a cyber attack or data breach.
Health app data breaches
Data breaches are becoming a much more frequent occurrence, and in a 2018 report health-related organisations accounted for a quarter of the incidents analysed.
2018 saw the first known data breach of a major consumer fitness app, when Under Armour, the company behind diet and fitness tracking app MyFitnessPal, suffered a significant breach in which personal data relating to 150 million separate accounts was compromised by hackers and sold on the dark web.
Poor data protection practices and weak security design can leave your personal data exposed. In 2018 PumpUp, a popular fitness app, left a backend server unprotected on Amazon’s cloud, which allowed anyone who had the IP address of the server to see who was signing on, and the contents of the private messages being sent between users in real time.
The journey so far – a Code of Conduct for mobile health data?
In 2014 the European Commission published a Green Paper consultation into mobile health, which revealed that consumers had typically been slower and more cautious in adopting mobile and tech developments in health than in other areas of their lives due to privacy concerns.
In April 2015, a group of industry stakeholders including Apple, Google and Microsoft began the process of creating a Privacy Code of Conduct for mobile health apps.
In December 2017, the first draft was formally submitted to the then Article 29 Working Party (now the European Data Protection Board). However, the Working Party decided that the imminence of the GDPR coming into force coupled with the fact that the draft did not adequately address all of the GDPR’s requirements meant that it could not approve the Code and so it remains in draft form.
What can you do about your data?
For the majority of consumers, the benefits afforded by mobile health apps far outweigh their privacy concerns. However, as consumers become more data savvy and the digital health sector continues to grow at pace, these concerns will need to be addressed.
You should be extra vigilant with your use of health apps and the data that you are inputting. It is usually possible to better protect your data, but this isn’t easy and often requires a lot of investigation and work.
What should app providers do?
With consumer awareness increasing and a wide variety of health apps to choose from, providers that lead the way in protecting consumers’ data are likely to prove more popular. Health app providers should therefore follow these steps to show that they use personal data considerately and securely:
- The GDPR has led to a heightened awareness by consumers over their data, so you should continually analyse any data you collect and ensure it complies with the principles relating to the processing of personal data and your legal requirements. You must also identify any special categories of personal data being processed and ensure the additional legal requirements for those categories are met. Ensure that all considerations and key decisions are well documented.
- With the rise in value of consumer data, cyber breaches are likely to continue to become more prevalent, so ensure security measures are appropriate for the types of data you process.
- Ensure data is anonymised where possible, and that any data treated as anonymised is truly anonymised.
- Following reports from studies into the use of data in health apps, consumers will value transparency, so ensure privacy policies are kept up to date and drafted in a way that allows consumers to really understand and control what is happening with their data.
- Keep an eye on developments in relation to the Privacy Code of Conduct for mHealth apps.
Source: Verizon’s annual Data Breach Investigations Report, released on 10 April 2018.