Data Bytes 57: Your UK and European Data Privacy update for April 2025
14 May 2025

Welcome back to our April edition of Data Bytes. It’s a cyber-heavy edition this month, with the UK facing a wave of cyber attacks on the retail sector. This is prompting companies across all sectors to dust off their incident response plans and ensure they are match fit in the event of a significant attack.
Keep scrolling to our spotlight section, where Tom Brookes, a senior associate in the UK data team, has taken a deep dive into the ICO’s much-awaited anonymisation guidance. When is data truly anonymous such that it can be used outside the confines of data protection law? The ICO has set out its stall in this area, and we now await equivalent guidelines from the EU.
Get your Data Bytes here.
The UK government has published a statement detailing its proposals for the Cyber Security and Resilience Bill (which will replace the NIS regulations in the UK) and key points to note are:
In addition, the statement sheds light on four new measures which are under consideration for the Bill:
Whilst we await a first draft of the text of the Bill to land in Parliament, it is clear from this government statement that the intention is for the Bill to align with the approach of the EU’s NIS 2 Directive. Once we have a published draft of the Bill, we will be casting our spotlight onto it for all Data Bytes readers.
The ICO has fined law firm DPP £60,000 (1.7% of its turnover) after security failures led to a brute force attack in 2022 (comprising some 400 network access attempts). During the attack, an administrator account which had been set up externally was accessed and used to reach a legacy case management system. This allowed the attackers to move laterally across DPP's network and exfiltrate over 32GB of data affecting 791 individuals.
Due to the nature of the firm’s specialisms in crime, military, family, fraud, sexual offences and actions against the police, the attack resulted in access to and publication of highly sensitive and confidential personal information on the dark web.
In its monetary penalty notice, the ICO homed in on:
The ICO’s publication of the errors that led to this cyber-attack is a warning to all organisations about, at the very minimum, the importance of multi-factor authentication (MFA) and regular audits of accounts and networks. We recommend sharing the monetary penalty notice with your CISO/Info Security teams to discuss.
If you’ve just read the above summary of DPP’s fine, then you’ll know the answer is yes. In similar circumstances, in October 2023, the British Library reported a ransomware attack to the ICO which arose out of the lack of MFA on an administrator account.
In its statement, the ICO reiterates the same messaging as in the DPP case and implores organisations to proactively mitigate the risk of cyber attacks, such as by implementing MFA, regularly scanning for vulnerabilities and keeping systems up to date with the latest security patches.
If you need another reason to discuss MFA with your CISO/Info Security teams, look no further than the British Library’s cyber incident review, which provides an overview of the cyber attack and key lessons learnt to help other organisations that may experience similar incidents.
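For teams taking that discussion forward, the sketch below shows roughly what a time-based one-time password (TOTP) second factor involves on the server side. It is a minimal illustration using only Python's standard library and RFC 6238 defaults (30-second steps, six digits, SHA-1); real deployments should of course use a vetted MFA product rather than hand-rolled code.

```python
# Minimal sketch of server-side TOTP verification (RFC 6238 defaults).
# Illustrative only; use a vetted MFA product in production.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, for_time: float, step: int = 30, digits: int = 6) -> str:
    """Compute a time-based one-time password from a base32 shared secret."""
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", int(for_time) // step)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def verify(secret_b32: str, submitted: str, window: int = 1) -> bool:
    """Accept codes from the current 30s step, +/- `window` steps for clock drift."""
    now = time.time()
    return any(
        hmac.compare_digest(totp(secret_b32, now + drift * 30), submitted)
        for drift in range(-window, window + 1)
    )

secret = base64.b32encode(b"shared-secret-demo!!").decode()  # set at enrolment
print(verify(secret, totp(secret, time.time())))  # True
```

The relevance to incidents like DPP's is that a stolen or guessed password is no longer sufficient on its own: a brute force attack of the kind described above would also need the per-account shared secret held by the user's authenticator.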
In the space of a few weeks, Marks & Spencer, Co-op and Harrods have all fallen victim to cyber-attacks. Here are a few of our thoughts on what other organisations can learn from these public events:
As part of its ICO25 strategic plan, the ICO reviewed the use of children’s personal data by over 40 organisations in the financial sector and published its findings. Key findings are:
Whilst this review focused on the use of children's data in the financial services sector, the findings will be relevant and directly applicable to any organisation collecting and processing children's data. Children's privacy is an ongoing priority for the ICO, and we expect to see more investigations and enforcement action in this area. If your organisation processes children's data, we therefore strongly advise completing a DPIA in the first instance to tease out the relevant issues, including those set out above.
6. ICO releases statement on the use of facial recognition by police
In a statement at the beginning of the month, the ICO acknowledged that facial recognition technology (FRT), which involves the processing of large amounts of sensitive personal data, offers significant benefits in crime prevention and detection. However, the ICO was clear that the use of FRT must be necessary, proportionate, and designed to meet standards of fairness and accuracy.
Whilst this statement focused on police use of FRT, the ICO noted that it plans to launch a more general AI and biometrics strategy "later in the spring". There are no specific details yet on what this will entail, but it is likely to build on the existing ICO biometric guidance and focus on the safeguards surrounding the use of FRT. We will be sure to do a deep dive on this strategy when it is published, so watch this space.
The EDPB has recently published guidance to controllers and processors on processing personal data in the context of blockchains; this is open for public consultation until 9 June 2025.
The EDPB emphasises the need for governance mechanisms that determine the roles and responsibilities under the GDPR, which can be centralised or distributed, registered on the blockchain, or agreed upon separately.
Storing personal data in a directly identifying form on a blockchain has several implications. In most cases, the data will stay on the blockchain with no practical possibility of deleting or modifying it. This conflicts with the data subject rights of erasure, objection and rectification under the GDPR. The EDPB presents, among other things, three key areas to consider:
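By way of illustration (this is our sketch, not a design prescribed by the EDPB), one pattern often discussed for reconciling immutability with erasure is to keep the personal data itself off-chain and record only a salted cryptographic commitment on-chain. Deleting the off-chain data and salt then leaves the on-chain value with no practical link to the individual. A minimal sketch using Python's standard library, with invented stand-ins for the storage layers:

```python
# Sketch: store only a salted commitment on-chain; keep personal data off-chain.
# Illustrative only; the two "stores" stand in for a real database and ledger.
import hashlib
import secrets

off_chain_store = {}   # deletable storage, holds salt + data
on_chain_ledger = []   # append-only storage, holds only digests

def commit(personal_data: bytes) -> str:
    """Record a salted SHA-256 commitment on-chain; keep data and salt off-chain."""
    salt = secrets.token_bytes(32)
    digest = hashlib.sha256(salt + personal_data).hexdigest()
    off_chain_store[digest] = (salt, personal_data)
    on_chain_ledger.append(digest)
    return digest

def erase(digest: str) -> None:
    """Honour an erasure request: the immutable on-chain hash remains, but
    without the salt and data it can no longer be linked back to anyone."""
    off_chain_store.pop(digest, None)

ref = commit(b"Jane Doe, 1 Example Street")
erase(ref)
print(ref in off_chain_store)  # False: only the unlinkable digest persists
```

Whether the residual on-chain hash is then truly anonymous still depends on the kind of identifiability analysis discussed in this month's spotlight section below.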
On 23 April 2025, the EU Commission imposed fines of EUR 500 million on Apple and EUR 200 million on Meta for breaching their respective obligations under the Digital Markets Act (DMA).
In the case of Apple, the EU Commission determined that the company violated the DMA’s anti-steering obligation by imposing restrictions that prevent app developers from informing consumers about offers available outside Apple’s App Store.
Further, the EU Commission fined Meta for its "consent or pay" model. The EU Commission found that Meta does not enable consumers to choose a service that uses less of their personal data but is otherwise equivalent to the "personalised ads" service. As we mentioned in our previous Data Bytes, now (more than ever) is the time to take a deep dive into your targeted advertising practices to ensure they comply with direct marketing rules.
At its March 2025 plenary, the EDPB issued a new statement on how EU countries should implement the Passenger Name Record (PNR) Directive, in light of the CJEU judgment in case C-817/19. This follows a previous statement from December 2022.
The EDPB provides additional guidance to Passenger Information Units (PIUs) on adjusting PNR data processing in line with the court’s ruling. PNR data includes personal details collected by airlines such as names, itineraries, contact info, and payment methods.
The statement includes practical recommendations for aligning national laws with the CJEU’s findings, addressing issues like flight data selection and data retention limits. The Board stresses that PNR data should not be stored for more than six months unless strictly necessary and proportionate to the PNR Directive’s objectives.
EDPB Chair Anu Talus emphasised the importance of a consistent, EU-wide approach to PNR data processing that balances security with privacy. While some Member States have begun updating their laws, the Board calls for urgent and widespread implementation across the EU.
The new German Federal Government has dedicated an important chapter of its Coalition Agreement to matters of data protection and digitalisation. It remains to be seen how the government will implement its goals in legislation.
The Federal Government intends to anchor the Data Protection Conference (DSK) in the Federal Data Protection Act (BDSG) in order to develop common standards. The Federal Government wants to use all the leeway provided by the GDPR to ensure consistency, uniform interpretation and simplification in data protection for small and medium-sized enterprises, employees and volunteers. At the European level, it aims to ensure that non-commercial activities (e.g. in associations), small and medium-sized enterprises and low-risk data processing (e.g. customer lists of tradespeople) are excluded from the scope of the GDPR. Further, with a view to a more efficient supervisory regime, it intends to consolidate the responsibilities and powers of the Federal Commissioner for Data Protection and Freedom of Information, with the new title "Federal Commissioner for Data Use, Data Protection and Freedom of Information".
In 2024, the French Data Protection Authority (CNIL) recorded 619 authorisation requests for health data processing, representing a 20% increase compared to 2023. Of these, 472 concerned research projects, while 147 related to non-research data processing activities. Thanks to a marked improvement in the quality of submitted applications, processing times were reduced: the average review period dropped to 65 days for research projects (down from 73 days in 2023) and to 49 days for non-research requests (down from 65 days).
In total, 397 authorisations were granted, 174 requests were closed without further action (often due to the absence of a required formality or incomplete documentation) and only 3 requests were denied, primarily due to insufficient data security measures.
This overall improvement can be attributed to enhanced support provided by the CNIL, which continued to publish practical guidance, organised webinars, and emphasised key areas such as pseudonymisation, data minimisation, and compliance with the SNDS framework.
To build on this momentum, a new authorisation request form will be made available by mid-2025. The CNIL is also currently updating its health data reference frameworks following a public consultation.
For more details, you can access the article here (French only).
A young woman filed a complaint with the Spanish Data Protection Authority (AEPD) following the online publication of an adult video in which she appeared, arguing that it violated personal data protection regulations. Although a contract had been signed with a producer, the AEPD found that the website had no legal basis to process her image. She was not a party to the contract cited between the publisher and the producer, had not given specific consent, and had not received clear information regarding the use of her data. As a result, the publisher was fined €10,000 and ordered to remove the complainant’s image.
This decision has been viewed as surprising, given that the complainant had, in the original contract, explicitly authorised the use of her image by third parties, thereby making her participation in subsequent agreements unnecessary. On this view, requiring GDPR consent for each transfer of rights would be illogical and could undermine legal certainty for distributors, who might be forced to cease distributing content solely on the basis of a withdrawal of consent.
For more details, you can access the article here (Spanish only).
The Consellería (the Valencian Regional Ministry of Universal Health and Public Health), as data controller, filed a complaint with the AEPD on 16 May 2023, alleging that MARINA SALUD (its data processor, which managed public health services on its behalf) failed to comply with multiple requests from the Consellería to provide copies of the contracts relating to the software and services used to handle sensitive personal data.
The AEPD initiated a sanctioning procedure against MARINA SALUD for a potential violation of Article 28(2) GDPR and consequently imposed a fine of 500,000 euros on MARINA SALUD for failing to provide the necessary information about its sub-processors, thereby preventing the Consellería from exercising control over the data processing activities.
The AEPD's decision:
It is a warning to processors that they are not free from regulatory scrutiny and enforcement action, and that they must inform controllers of any changes to their sub-processors even where they have a general authorisation – a requirement that is often disregarded in practice as impractical and onerous.
SERVICIOS ESPECIALES, S.A. (the "Company") was subject to a sanctioning procedure initiated by the AEPD following complaints from two individuals, A.A.A. and B.B.B., who claimed that their identities were disclosed during a workplace harassment investigation. The complainants alleged that the Company had published their names and surnames, along with the term "complainant", in relation to the harassment investigation.
The Company sent an email to the works council informing them that the harassment investigation had concluded, identifying each of the five complainants and the ten accused by name, surname and job position. This information was disseminated within the Company because: (i) the Company also sent the same information to a broader list of 15 individuals, which made the identities of the complainants and the accused widely known within the workplace; (ii) the works council then forwarded the email to the complainants themselves; and (iii) one of the accused individuals posted a message in a work-related WhatsApp group sarcastically thanking the complainants. This last message exposed the complainants' identities and caused additional emotional distress (an anxiety attack) for the complainants.
The AEPD imposed a fine of 200,000 euros on the Company for breaching Article 5(1)(f) GDPR and found that the Company had failed to protect the confidentiality of the complainants' personal data during the harassment investigation. In determining the sanction, the AEPD considered:
The fine was ultimately reduced to 120,000 euros following the Company's acknowledgment of responsibility and voluntary payment.
On 28 March, the ICO published guidance to help organisations understand effective anonymisation and pseudonymisation techniques and the related data protection law obligations. The guidance is extensive (98 pages including case studies) and, for many data protection practitioners, long overdue. The previous ICO guidance on anonymisation was released in 2012, when the Data Protection Act 1998 was still in force, and the ICO first launched a consultation on the latest incarnation of the guidance in September 2022.
In this article we look at some of the key points from the guidance and how this compares to the position in the EU.
Anonymisation, when implemented effectively, offers organisations the opportunity to use data outside the confines of data protection law. However, the UK GDPR's lack of specificity about the concept, coupled with the absence of up-to-date regulatory guidance, has made it challenging, and to a certain extent risky, for businesses to implement in practice.
The latest guidance from the ICO goes some way towards addressing these points and includes specific sections that are relevant for different types of stakeholder: sections 3 and 4 are intended for technical experts, whilst sections 1 and 4 are intended for decision makers.
A crucial question for data protection practitioners is the extent to which pseudonymised personal data can be considered "effectively" anonymised. The ICO's position is that pseudonymised personal data can, once a certain threshold is met, satisfy the legal test for anonymisation: for example, where the party using data pseudonymised through techniques such as hashing does not have access to the hash key and is unable to identify individuals using other "reasonably likely" means.
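To make the hashing example concrete, here is a minimal sketch of keyed-hash pseudonymisation using Python's standard library; the record fields are invented for illustration, and key management is out of scope. The discloser retains the secret key, so a recipient holding only the output cannot recompute or reverse the pseudonym.

```python
# Minimal sketch of keyed-hash (HMAC) pseudonymisation. Illustrative only;
# field names are invented and key management is not addressed here.
import hashlib
import hmac
import secrets

def pseudonymise(identifier: str, key: bytes) -> str:
    """Replace a direct identifier with an HMAC-SHA256 pseudonym.

    Without the key, recomputing or reversing the pseudonym is not
    feasible, so the hash alone is unlikely to offer a "reasonably
    likely" means of re-identification in the ICO's sense.
    """
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

key = secrets.token_bytes(32)  # retained by the discloser, never shared

record = {"patient_id": "ID-123456", "test_result": "negative"}
shared = {
    "patient_ref": pseudonymise(record["patient_id"], key),
    "test_result": record["test_result"],
}
print(shared)
```

Note that the key holder can still trivially re-identify individuals, and the recipient's other "reasonably likely" means (auxiliary datasets, the remaining attributes) must still be assessed: the keyed hash addresses only the direct identifier.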
This position ultimately turns on the concept of identifiability, which the ICO guidance examines in detail. In particular, the ICO does not view identifiability as absolute, but instead suggests a "spectrum of identifiability" approach based on "identifiability risk". The ICO also discusses established anonymisation concepts such as the "motivated intruder" test. This formed part of the ICO's past guidance but has been brought up to date with references to motivated intruders having access to AI tools such as generative AI chatbots as sources of information.
The ICO helpfully confirms its view that "distinguishing one record from others in a table is not sufficient by itself to make the person the record relates to identifiable". Instead, the ICO recommends that identifiability assessments consider whether additional sources of data are available to either "take action on a person specifically" or "discover someone's real-world identity". This is a pragmatic position, and we hope the ICO will provide further detail during its upcoming webinar on the topic on 22 May (you can register here).
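As a simplistic illustration of one input into such an assessment, the sketch below counts how many records share each combination of quasi-identifiers in a table (the dataset and fields are invented). A class of size 1 flags a record as unique, which on the ICO's view is a prompt to ask whether auxiliary data could link it to a real-world identity, not proof of identifiability by itself.

```python
# Simplistic illustration: measuring how distinguishable records are on
# quasi-identifiers. One input into an identifiability risk assessment,
# not a complete test; the dataset is invented for illustration.
from collections import Counter

records = [
    {"age_band": "30-39", "postcode_area": "SW1", "outcome": "A"},
    {"age_band": "30-39", "postcode_area": "SW1", "outcome": "B"},
    {"age_band": "40-49", "postcode_area": "EC2", "outcome": "A"},
]

quasi_identifiers = ("age_band", "postcode_area")

# Count how many records share each combination of quasi-identifiers.
class_sizes = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)

for combo, size in class_sizes.items():
    # A class of size 1 means the record is unique in this table. Per the
    # ICO, that alone does not make the person identifiable; the question
    # is whether additional data sources would allow someone to act on
    # that person specifically or discover their real-world identity.
    print(combo, "appears in", size, "record(s)")
```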
As in the UK, the existing anonymisation and pseudonymisation guidance in the EU is several years old and was released before the EU GDPR came into effect.
We are expecting guidance from the European Data Protection Board (EDPB) this year on both pseudonymisation and anonymisation. The EDPB released draft pseudonymisation guidance for consultation in January (see here) and we are awaiting the finalised guidance.
The EDPB takes a more restrictive view of anonymisation in its draft guidance than the ICO. Whilst the ICO accepts that pseudonymised personal data can be considered anonymous or "effectively anonymous" once a certain threshold is met and the risk of re-identification is sufficiently remote, the EDPB is clear that re-identification must be impossible. This is a much more challenging bar for organisations to meet and in practice means that pseudonymisation techniques are limited to acting as important security measures which assist with GDPR compliance.
An additional complication from an EU perspective is that we are awaiting the ruling of the CJEU in EDPS v Single Resolution Board (see here) – the Advocate General's opinion in this case appears to diverge from the draft EDPB guidelines. We will need to see whether the final CJEU decision is released before the EDPB guidance is finalised and whether this results in a softening of the EDPB's approach to anonymisation.
In summary, whilst the ICO's guidance brings greater clarity on how to achieve anonymisation in compliance with UK law, it is likely to be of only partial assistance to organisations seeking to anonymise personal data sets subject to both the UK and EU data protection regimes. These organisations will need to await the anticipated publication of the EDPB's guidelines and the CJEU's case law so that the EU and UK regulatory positions can be considered side by side.
The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
Readers should take legal advice before applying it to specific issues or transactions.