Data Bytes 50 – Your UK and European Data Privacy update for August 2024
18 September 2024
Welcome back to a slightly delayed August edition of Data Bytes. This month's edition includes commentary on some large and unusual fines. In the UK, August saw the first notice of intention to fine a data processor, with a proposed fine of over £6 million heading towards Advanced Software Group following a 2022 ransomware attack on this supplier that affected critical NHS services. Meanwhile, in Europe, data transfers are back in the limelight as Uber has been fined an eye-watering €290 million for failing to have an appropriate data transfer mechanism in place over a two-year period when sending driver details to a group company in the US.
Before you conclude that it was rather careless of Uber not to have standard contractual clauses (SCCs) in place, it is worth reading the facts of the case. In Uber's defence, no appropriate safeguard was available to it at the time of the alleged breach. The European Commission, in its questions and answers on the EU SCCs, had stated that the existing EU SCCs cannot be used for transfers to a party outside the EU that is itself subject to the GDPR's scope of application, because the SCCs would then produce duplicates of, and deviations from, the obligations under the GDPR. The relevant Uber entities in this matter – Uber B.V. (in the Netherlands) and Uber Technologies Inc. (in the US) – qualify as joint controllers and are both within the scope of application of the GDPR. This lacuna appears to be being rectified, with news this month that the European Commission will shortly publish a version of the SCCs for use in this scenario, but no such clauses had been published at the time of the transfers, nor have they been to this day. Uber is appealing the fine.
August saw the ICO launch its fifth and final call for evidence, this time on the allocation of controllership across the generative AI (GenAI) supply chain.
The call for evidence is open until 18 September. We await news of when the ICO expects to publish its finalised guidance on generative AI; in the meantime, the five consultation documents contain some useful intelligence on the ICO's thinking in a number of areas.
The ICO issued its intention to fine Advanced Software Group Ltd (Advanced) £6.09 million following a ransomware attack in August 2022 which disrupted critical NHS services and affected over 80,000 individuals whose data, including medical records, was exfiltrated.
Advanced is a processor and software provider to the NHS and healthcare providers, and this is the first time we have seen the ICO look to fine a data processor following a data breach. Data processor breaches often affect multiple entities, and we have seen a growing trend in such supply chain incidents. The proposed fine is also of interest against the backdrop of the ICO's published approach of not imposing large fines on the public sector, favouring non-financial penalties to stop money circulating around the public purse. The ICO is able to have more impact by targeting the processor rather than the public bodies in this scenario. It remains to be seen whether this will be the first of a number of fines against processors.
It's rare that we get a Court of Appeal decision on issues of data protection, but the case of Farley v Paymaster (1936) Limited (trading as Equiniti) [2024] EWCA Civ 781 has given us data geeks a couple of reasons to read the judgment.
Equiniti administered the pension scheme for the Sussex Police Force. In August 2019, Equiniti posted the claimants' annual pension statements to their former addresses rather than their current addresses in error. The letters contained each data subject's name, date of birth, occupation, salary, pension details and NI number. 474 Sussex Police Force officers brought a claim against Equiniti. The majority of claims relied on an inference that the envelopes had been opened and read by a third party, and the claimants sought damages for "anxiety, alarm, distress and embarrassment".
The High Court struck out all but 14 claims – those where there was evidence that the letter had in fact been opened – and even for those 14 claims the judge commented that they "would appear to be very far from being serious cases". On appeal, however, the claimants argued that the extraction of the information from the database and its electronic transfer to the paper document constituted processing, without the need for anyone to have opened the envelope and read the contents. The Court of Appeal has granted permission to appeal.
This will be an interesting case to watch on two points: (i) whether low-value data breach claims are ever likely to succeed in UK courts; and (ii) the potentially wide scope of the definition of processing: whereas the High Court found there had not been any "real processing" in the absence of the letter being opened or read by a third party, the Court of Appeal suggests that putting a document in an envelope for posting, and/or sending that envelope in the post, could be considered processing irrespective of whether it is accessed.
The ICO has issued a warning to 11 social media and video sharing platforms regarding their inadequate privacy practices concerning children, calling on them to improve their child-focussed privacy practices. As the UK's fast-moving digital sector evolves, the ICO remains steadfast in its intention to maintain focus on children's privacy in this ever-changing landscape. Even non-social-media companies should therefore take heed of this warning and ensure privacy by design is adhered to when processing children's data.
This warning follows the ICO's ongoing review of social media and video sharing platforms as part of its Children's Code Strategy, which found varying levels of compliance, with 11 of the 34 platforms reviewed inadequately protecting children's privacy. The platforms identified are being asked to address issues relating to default privacy settings, geolocation and age assurance, and to demonstrate how their practices align with the Children's Code of Practice.
The ICO has introduced a new, user-friendly privacy notice generator, aimed particularly at SMEs and sole traders, to help them create bespoke privacy notices. The tool aims to simplify compliance with data protection laws by helping organisations communicate clearly how they handle personal information. Articles 13 and 14 of the UK GDPR oblige controllers to provide privacy information to data subjects. The tool features sector-specific sections for industries such as finance, education, health and retail, which help ensure that privacy notices are sufficiently tailored.
The tool can generate two types of privacy notice: one for customer and supplier information, and another for staff and volunteer information.
On 16 July, the EDPB published an FAQ for European data subjects ("FAQ") about the EU-US Data Privacy Framework ("DPF"). The DPF is a self-certification mechanism for U.S. companies that ensures an adequate level of protection for personal data transferred from the European Economic Area ("EEA") to the U.S. The European Commission adopted the adequacy decision for the DPF on 10 July 2023, allowing personal data to be transferred to self-certifying organisations without additional safeguards or authorisation. The DPF covers all types of personal data, including commercial, health and HR data, provided the U.S. company is certified to process such data. U.S. companies can certify on the U.S. Department of Commerce's website (Data Privacy Framework Certification List) ("DPF List").
The FAQs:
The EDPB has issued a statement on the role of Data Protection Authorities ("DPAs") under the new AI Act. The AI Act requires Member States to designate one or more Market Surveillance Authorities ("MSAs") at national level before 2 August 2025, for the purpose of supervising the application and implementation of the AI Act. The AI Act, which was published in the Official Journal on 12 July 2024, lays down harmonised rules for placing on the market, putting into service and using AI systems in the EU. The AI Act aims to promote human-centric and trustworthy AI as well as to ensure a high level of protection of the fundamental rights, including the rights to privacy and to the protection of personal data.
The EDPB recommends the appointment of member state DPAs as the MSAs for high-risk AI systems in the areas of biometrics, law enforcement, migration and administration of justice and democratic processes (as mentioned in Article 74(8) of the AI Act). These high-risk AI systems are likely to affect the rights and freedoms of individuals with regard to processing personal data. The EDPB argues that this would ensure a coherent and effective supervision of both the AI Act and the GDPR, as well as facilitate the cooperation among different regulatory bodies and provide legal certainty for all stakeholders by providing a single contact point.
On 22 July, Uber was fined €290 million by the Dutch Data Protection Authority for violating data transfer obligations under the GDPR. Uber has already issued a statement confirming that it will appeal this decision. This fine comes after French association La Ligue des droits de l'Homme, representing more than 170 Uber drivers, filed a collective complaint before the CNIL, the French Data Protection Authority. As Uber has its headquarters in the Netherlands, the Dutch Data Protection Authority was the competent authority to conduct the investigation.
Uber was found to be in breach of article 44 GDPR, which governs the transfer of personal data to third countries. According to the Dutch Data Protection Authority, for two years Uber had collected data about drivers (such as taxi licences, location data and photos) and sensitive information (such as criminal and medical records) and transferred it to its headquarters in the United States without an appropriate safeguard, such as the EU SCCs (which Uber stopped using in 2021).
This heavy fine represents almost 1% of Uber's global revenue in 2023. This is not the first time Uber has been fined by the Dutch DPA – it had already received a €600,000 fine in 2018 and a €10 million fine in 2023 for several breaches of driver information.
The Spanish Data Protection Agency (the AEPD) has fined UNIQLO EUROPE, Ltd (its branch in Spain) for infringements of: (i) article 5(1)(f) GDPR – failing to ensure the integrity and confidentiality of personal data; and (ii) article 32 GDPR – failing to apply appropriate technical and organisational measures to prevent and mitigate the risk of such incidents.
Former employees of UNIQLO reported that one of them had received a PDF file containing the payroll data of 447 of the company's workers, due to human error in the HR department. The payroll data disclosed included names, national identity numbers, Social Security numbers, bank account details and remuneration. When the breach occurred, the HR department reportedly did not inform the company of it.
The AEPD proposed to impose a fine of €300,000 for the infringement of article 5(1)(f) GDPR and a fine of €150,000 for the infringement of article 32 GDPR, as well as an order requiring UNIQLO to bring its processing into line with the GDPR within a specified period. UNIQLO paid the fine voluntarily within the deadline for submitting representations, which implied recognition of its responsibility and triggered a 40% reduction in the fine, resulting in a final amount of €270,000.
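For readers reconciling the figures, the final amount follows from applying the 40% reduction for voluntary payment to the two proposed fines combined. A minimal check of that arithmetic, using only the figures reported above:

```python
# Check of the reduced AEPD fine, using the figures reported above.
proposed = 300_000 + 150_000   # EUR: art. 5(1)(f) fine + art. 32 fine
reduction = 0.40               # reduction for voluntary payment
final = proposed * (1 - reduction)
print(f"€{final:,.0f}")        # €270,000
```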
A complaint was filed by a consumer against ID FINANCE SPAIN, S.A.U., a financial company, for refusing to delete their personal data from its credit information systems despite their requests and evidence of identity theft. The consumer had requested deletion of their data from ID FINANCE in December 2022, but ID FINANCE refused to delete it claiming that:
The AEPD:
Authors: Rhiannon Webster, Partner; Nicolas Quoy, Partner; Alexander Duisberg, Partner; Andreas Mauroschat, Partner; Cristina Grande, Counsel; Shehana Cameron-Perera, Senior Associate; Tom Brookes, Senior Associate; Antoine Boullet, Senior Associate; Julia Bell, Associate; Lisa Kopp, Associate; David Plischka, Associate; Carmen Gordillo, Associate; Latasha Kirimbai, Junior Associate; Nilesh Ray, Junior Associate; Hana Byrne, Junior Associate; María Baixauli, Junior Associate; Melvin Cheung, Trainee Solicitor; Anne Wecxsteen, Trainee Solicitor
The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
Readers should take legal advice before applying it to specific issues or transactions.