Legal development

Data Bytes 57: Your UK and European Data Privacy update for April 2025

    Welcome back to our April edition of Data Bytes. It’s a cyber-heavy edition this month, with the UK facing a wave of cyber-attacks on the retail sector. This is prompting companies across all sectors to dust off their incident response plans and ensure they are match fit in the event of a significant attack.

    Keep scrolling to our spotlight section, where Tom Brookes, a senior associate in the UK data team, has taken a deep dive into the ICO’s much-awaited anonymisation guidance. When is data truly anonymous such that it can be used outside the confines of data protection law? The ICO has set out its stall in this area and we now await equivalent guidelines from the EU.

    Get your Data Bytes here.

    Updates from the UK

    1. The latest on the Cyber Security and Resilience Bill 

    The UK government has published a statement detailing its proposals for the Cyber Security and Resilience Bill (which will replace the NIS regulations in the UK) and key points to note are:

    • It will be broad in its application, bringing more firms into scope, such as managed service providers who have “unprecedented access to clients’ IT systems, networks, infrastructure and data”;
    • It will introduce a power for regulators to identify and designate specific high-impact suppliers as ‘designated critical suppliers’;
    • It will update and expand the current incident reporting requirements for regulated entities; and
    • It will improve the ICO’s information gathering powers.

    In addition, the statement sheds light on four new measures which are under consideration for the Bill:

    • Bringing data centres into scope of the regulatory framework;
    • Publishing a statement of strategic priorities for regulators so that there is a clear and coherent framework for cyber security regulation across the 12 regulators and their sectors;
    • Proposal for the government to have executive powers to respond to cyber threats; and
    • Proposal to grant the Secretary of State a new power to direct a regulator to take action where necessary on national security grounds.

    Whilst we await a first draft of the text of the Bill to land in parliament, it is clear from this governmental statement that the intention is for the Bill to align with the approach of the EU’s NIS 2 Directive. Once we have a published draft of the Bill, we will be casting our spotlight onto it for all Data Bytes readers.

    2. Merseyside law firm fined £60,000 following cyber-attack

    The ICO has fined law firm DPP £60,000 (1.7% of its turnover) after it experienced a cyber-attack in 2022, in which security failures allowed a brute force attack (comprising 400 network access attempts) to succeed. During the attack, an externally set up administrator account was accessed and used to gain access to a legacy case management system. This allowed the attackers to move laterally across DPP's network and exfiltrate over 32GB of data affecting 791 individuals.

    Due to the nature of the firm’s specialisms in crime, military, family, fraud, sexual offences and actions against the police, the attack resulted in access to, and publication of, highly sensitive and confidential personal information on the dark web.

    In its monetary penalty notice, the ICO homed in on:

    • failure to adopt the principle of least privilege;
    • failure to regularly audit administrative accounts;
    • failure to implement multi-factor authentication (MFA) on its administrator account, which had full, unrestricted access across DPP’s network, and to change the password on the account;
    • failure to perform an asset management audit or similar audit which would have otherwise discovered the issues associated with the administrator account;
    • failure to carry out a risk assessment on the excessive privileges granted to the administrator account, which were not aligned with functional needs;
    • DPP’s complete reliance on third party IT contractors; and
    • DPP’s lack of Cyber Essentials accreditation.

    The ICO’s publication of the errors that led to this cyber-attack is a warning to all organisations about, at the very minimum, the importance of MFA and regular audits of accounts and networks. We recommend sharing the monetary penalty notice with your CISO/Info Security teams to discuss.

    3. The importance of multi-factor authentication

    If you’ve just read the above summary of DPP’s fine, you’ll already know how important multi-factor authentication (MFA) is. In similar circumstances, in October 2023 the British Library reported a ransomware attack to the ICO which arose out of the lack of MFA on an administrator account.

    In its statement, the ICO reiterates the same message as in the DPP case and implores organisations to proactively mitigate the risk of cyber-attacks, such as by implementing MFA, regularly scanning for vulnerabilities and keeping systems up to date with the latest security patches.

    If you need another reason to discuss MFA with your CISO/Info Security teams, look no further than the British Library’s cyber incident review, which provides an overview of the cyber-attack and key lessons learnt to help other organisations that may experience similar incidents.

    4. Retail industry hit by wave of cyber-attacks

    In the space of a few weeks, Marks & Spencer, Co-op and Harrods have all fallen victim to cyber-attacks. Here are a few of our thoughts on what other organisations can learn from these public events:

    • External communications can be key to maintaining consumer goodwill and limiting reputational damage. Whilst M&S has remained resolutely silent as to whether customer data has been affected (the implication being that it does not believe it has), Co-op downplayed the impact of its attack, only for the threat actors to publicly disprove this by presenting evidence to the BBC of the substantial amounts of affected customer data, prompting Co-op to publicly apologise. This highlights a common scenario in a cyber incident: although you may not have evidence that customer data has been affected, you often won’t have evidence that it hasn’t been. Communications in these situations can be challenging.
    • The M&S incident in particular highlights the disruption, reputational damage, loss of consumer goodwill and financial impact that a cyber incident can have on any business and its payment infrastructure – M&S was forced to suspend contactless payments and online orders.
    • It should be a wake-up call to all organisations that cyber planning, and the ability to operationalise such plans in the eye of an incident, are a whole-company issue – now is the time to ensure that your organisation is cyber ready and has undergone comprehensive planning and war gaming, most importantly at the most senior levels of your organisation. Take a look at DSIT’s Cyber Governance Code of Practice (Cyber Governance Code of Practice - GOV.UK), which formalises the UK Government’s expectations of directors for governing cyber risk in the same way as any other material or principal business risk.
    • Get in touch with the Ashurst team if you want more information about our legal and risk services on cyber readiness, maturity planning and uplift, cyber governance and cyber response and recovery.

    5. ICO publishes its review on the financial sector’s use of children’s data

    As part of its ICO25 strategic plan, the ICO reviewed the use of children’s personal data by over 40 organisations in the financial sector and published its findings. Key findings are:

    • Governance – whilst most organisations had appropriate policies governing the use of children's data and provided data protection training, there is a gap when it comes to specific training on processing children’s personal data.
    • Transparency – the ICO found that less than half of organisations had age-appropriate transparency information. To be ‘age-appropriate’, privacy information should be written in age-appropriate language, avoid jargon and technical terminology, and contain engaging descriptions or illustrations; organisations should carry out testing to check understandability; and they should consider updating privacy information provided to children as their age increases and their understanding develops. The ICO is clear that children have a right to be informed about what will happen with their information and that responsibility cannot be passed over to parents because it may be easier.
    • Consent – many organisations ask parents to provide consent on behalf of their children. As children get older and are able to provide (or withdraw) their own consent, consent should be refreshed and provided by the child. 
    • Age verification - Processes to verify the age of children were robust across all organisations.
    • Marketing – although many organisations have a policy of not marketing to children, there is often limited distinction between whether recipients are adults or children. Organisations should therefore ensure that internal procedures for determining who is a recipient of a marketing communication are robust. Where communications are, or are likely to be, sent to a child, a DPIA should be carried out and all information provided about the processing should be age-appropriate.
    • Data Subject Rights – organisations should ensure that internal procedures for handling data subject rights requests are not restricted based on arbitrary age considerations which could prevent children from exercising their rights.

    Whilst this review focused on the use of children's data in the financial services sector, the findings will be relevant and directly applicable to any organisation collecting and processing children's data. Children's privacy is an ongoing priority for the ICO and we expect to see more investigations and enforcement action in this area. Therefore, if your organisation is processing children's data, we strongly advise completing a DPIA in the first instance to tease out the relevant issues, including those set out above.

    6. ICO releases statement on the use of facial recognition by police

    In a statement at the beginning of the month, the ICO acknowledged that facial recognition technology (FRT), which involves the processing of large amounts of sensitive personal data, offers significant benefits in crime prevention and detection. However, the ICO was clear that the use of FRT must be necessary, proportionate, and designed to meet standards of fairness and accuracy. 

    Whilst this statement was focused on police use of FRT, the ICO noted that it is planning to launch a more general AI and biometrics strategy "later in the spring". There are no specific details yet on what this will entail, but it is likely to build on the existing ICO biometric guidance and focus on the safeguards surrounding the use of FRT. We will be sure to do a deep dive on this strategy when it is published, so watch this space.

    Updates from the EU

    1. EDPB adopts Guidelines on Processing Personal Data through Blockchains

    The EDPB has recently published guidance to controllers and processors on processing personal data in the context of blockchains; this is open for public consultation until 9 June 2025.

    The EDPB emphasises the need for governance mechanisms that determine the roles and responsibilities under the GDPR, which can be centralised or distributed, registered on the blockchain, or agreed upon separately.

    Storing personal data in a directly identifying form on a blockchain has several implications. In most cases, the data will stay on the blockchain with no practical possibility of deleting or modifying it. This conflicts with the data subject rights to erasure, objection and rectification under the GDPR. The EDPB presents, among other things, three key areas to consider:

    • Encryption of personal data: Personal data can be encrypted before being stored on a blockchain. This ensures that only individuals with the appropriate decryption key can access the data in clear text. If the decryption key is deleted, the encrypted data becomes unintelligible. However, this remains true only until the encryption algorithm is broken, decryption techniques advance sufficiently to decrypt the ciphertext, or the key is compromised or leaked.
    • Hashing of personal data: Another measure is to store only a salted or keyed hash of personal data on the blockchain (see the sketch following this list). The unhashed data and the secret key or long random salt are stored confidentially off-chain. Although this is called "off-chain" storage, the GDPR still applies to this processing activity, and the hash is considered personal data, as are any other identifiers. The advantage of this architecture is that the original data cannot easily be recovered from the hash. After deleting the secret key or salt, the hash should no longer be linkable to the original data, assuming the algorithm is secure and the keys or salt are not compromised or poorly chosen.
    • Cryptographic commitments: Instead of storing personal data directly, the controller and processor could store it as a cryptographic commitment. The confidentiality of the personal data can be protected by putting only the commitment on-chain, while storing the original data off-chain. Once the original data and its corresponding witness are deleted, the commitment stored on the blockchain becomes useless. As a result, it will be impossible to recover or recognise the original personal data.
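
    To make the hashing and commitment measures above more concrete, here is a minimal Python sketch of a keyed hash and a simple hash-based commitment. It is illustrative only: the sample identifier, the choice of HMAC-SHA256 and the key and witness handling are our own assumptions rather than anything prescribed by the EDPB.

        import hashlib
        import hmac
        import secrets

        # Keyed hash (the EDPB's second measure): only the hash is written
        # on-chain; the raw data and the secret key stay confidential off-chain.
        def keyed_hash(data: str, key: bytes) -> str:
            return hmac.new(key, data.encode("utf-8"), hashlib.sha256).hexdigest()

        # Simple hash-based commitment (the third measure): commit to the data
        # together with a random "witness" that is stored off-chain.
        def commit(data: str, witness: bytes) -> str:
            return hashlib.sha256(data.encode("utf-8") + witness).hexdigest()

        key = secrets.token_bytes(32)      # kept confidentially off-chain
        witness = secrets.token_bytes(32)  # kept confidentially off-chain

        identifier = "jane.doe@example.com"  # hypothetical personal data
        on_chain_hash = keyed_hash(identifier, key)
        on_chain_commitment = commit(identifier, witness)

        # While the key and witness exist, the controller can re-derive and
        # verify the on-chain values...
        assert on_chain_hash == keyed_hash(identifier, key)
        assert on_chain_commitment == commit(identifier, witness)

        # ...but once the key and witness are securely deleted, the immutable
        # on-chain values should no longer be linkable to the original data,
        # assuming the algorithms remain secure and the secrets never leaked.

    Deleting the off-chain key or witness is what gives these architectures a practical route to rendering immutable on-chain values unintelligible, which is why the EDPB places such weight on confidential off-chain storage.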

    2. EU Commission sanctions Apple and Meta for breaches of the Digital Markets Act (DMA)

    On 23 April 2025, the EU Commission imposed fines of EUR 500 million on Apple and EUR 200 million on Meta for breaching their respective obligations under the DMA.

    In the case of Apple, the EU Commission determined that the company violated the DMA’s anti-steering obligation by imposing restrictions that prevent app developers from informing consumers about offers available outside Apple’s App Store.

    Further, the EU Commission fined Meta over its ‘consent or pay’ model. The EU Commission found that Meta does not enable consumers to choose a service that uses less of their personal data but is otherwise equivalent to the “personalized ads” service. As we mentioned in our previous Data Bytes, now (more than ever) is the time to take a deep dive into your targeted advertising practices to ensure they are compliant with direct marketing rules.

    3. EDPB releases position on implementation of the PNR Directive 

    At its March 2025 plenary, the EDPB issued a new statement on how EU countries should implement the Passenger Name Record (PNR) Directive, in light of the CJEU judgment in case C-817/19. This follows a previous statement from December 2022.

    The EDPB provides additional guidance to Passenger Information Units (PIUs) on adjusting PNR data processing in line with the court’s ruling. PNR data includes personal details collected by airlines, such as names, itineraries, contact information and payment methods.

    The statement includes practical recommendations for aligning national laws with the CJEU’s findings, addressing issues like flight data selection and data retention limits. The Board stresses that PNR data should not be stored for more than six months unless strictly necessary and proportionate to the PNR Directive’s objectives.

    EDPB Chair Anu Talus emphasized the importance of a consistent, EU-wide approach to PNR data processing that balances security with privacy. While some Member States have begun updating their laws, the Board calls for urgent and widespread implementation across the EU.

    Updates from Germany

    New German Federal Government plans certain reforms on data protection law

    The new German Federal Government has dedicated an important chapter of its Coalition Agreement to matters of data protection and digitalisation. It remains to be seen how the government will implement its goals in legislation.

    The Federal Government intends to anchor the Data Protection Conference (DSK) in the Federal Data Protection Act (BDSG) in order to develop common standards. The Federal Government wants to use all the leeway provided by the GDPR to ensure consistency, uniform interpretation and simplification in data protection for small and medium-sized enterprises, employees and volunteers. At the European level, it aims to ensure that non-commercial activities (e.g. in associations), small and medium-sized enterprises and low-risk data processing (e.g. customer lists of tradespeople) are excluded from the scope of the GDPR. Further, and with a view to a more efficient supervisory regime, it intends to consolidate the responsibilities and powers of the Federal Commissioner for Data Protection and Freedom of Information under the new title "Federal Commissioner for Data Use, Data Protection and Freedom of Information".

    Updates from France

    Applications for authorisation in the health sector: CNIL's action in 2024 in review

    In 2024, the French Data Protection Authority (CNIL) recorded 619 authorisation requests for health data processing, representing a 20% increase compared to 2023. Of these, 472 concerned research projects, while 147 related to non-research data processing activities. Thanks to a marked improvement in the quality of submitted applications, processing times were reduced: the average review period dropped to 65 days for research projects (down from 73 days in 2023) and to 49 days for non-research requests (down from 65 days).

    In total, 397 authorisations were granted, 174 requests were closed without further action (often due to the absence of a required formality or incomplete documentation) and only 3 requests were denied, primarily due to insufficient data security measures.

    This overall improvement can be attributed to enhanced support provided by the CNIL, which continued to publish practical guidance, organised webinars and emphasised key areas such as pseudonymisation, data minimisation and compliance with the SNDS framework.

    To build on this momentum, a new authorisation request form will be made available by mid-2025. The CNIL is also currently updating its health data reference frameworks following a public consultation.

    For more details, you can access the article here (French only).

    Updates from Spain

    1. When GDPR obstructs the enforcement of a contract (AEPD, 27 January 2025)

    A young woman filed a complaint with the Spanish Data Protection Authority (AEPD) following the online publication of an adult video in which she appeared, arguing that it violated personal data protection regulations. Although a contract had been signed with a producer, the AEPD found that the website had no legal basis to process her image. She was not a party to the contract cited between the publisher and the producer, had not given specific consent, and had not received clear information regarding the use of her data. As a result, the publisher was fined €10,000 and ordered to remove the complainant’s image.

    This decision has been viewed as surprising, given that the complainant had, in the original contract, explicitly authorised the use of her image by third parties, thereby making her participation in subsequent agreements unnecessary. Requiring GDPR consent for each transfer of rights could be considered illogical and could undermine legal certainty for distributors, who might be forced to cease distribution of content solely on the basis of a withdrawal of consent.

    For more details, you can access the article here (Spanish only).

    2. Processor fined for not providing updated information on its sub-processors to the data controller

    The Consellería (the Valencian Regional Ministry of Universal Health and Public Health), as data controller, filed a complaint with the Spanish Data Protection Agency (AEPD) on 16 May 2023, alleging that MARINA SALUD (its data processor, which managed public health services on its behalf) had failed to comply with multiple requests from the Consellería to provide copies of the contracts relating to the software and services used for handling sensitive personal data.

    The AEPD initiated a sanctioning procedure against MARINA SALUD for a potential violation of Article 28(2) GDPR and consequently imposed a fine of 500,000 euros on MARINA SALUD for failing to provide the necessary information about its sub-processors, thereby preventing the Consellería from exercising control over the data processing activities.

    The AEPD's decision:

    • was based on the fact that Article 28(2) GDPR requires data processors to inform the data controller of any changes involving the addition or replacement of other processors;
    • was imposed despite the fact that MARINA SALUD had a general authorisation from the Consellería to engage sub-processors (a general authorisation which still required it to provide information on the specific sub-processor contracts), and despite its assertions that the GDPR did not explicitly require it to provide copies of the contracts with sub-processors to the data controller and that providing copies of the requested contracts did not align with the Consellería's functions; and
    • took into account: (i) the nature and duration of the infringement; (ii) the categories of personal data involved (which included sensitive health data); (iii) the fact that the data processing affected the public health services provided by MARINA SALUD; and (iv) the company's annual turnover, ensuring that the fine was proportionate but significant enough to reflect the gravity of the breach.

    This decision is a warning to processors that they are not free from regulatory scrutiny and enforcement action, and that they must inform controllers of any changes to their sub-processors even when they have a general authorisation – an obligation that is often disregarded as impractical and onerous!

    3. Company fined 200,000 euros for breaching confidentiality during a harassment procedure

    SERVICIOS ESPECIALES, S.A. (the "Company") was subject to a sanctioning procedure initiated by the Spanish Data Protection Agency (the "AEPD") following complaints from two individuals, A.A.A. and B.B.B., who claimed that their identities were disclosed during a workplace harassment investigation. The complainants alleged that the Company had published their names and surnames, along with the term "complainant", in relation to the harassment investigation.

    The Company sent an email to the works council informing it that the harassment investigation had concluded, identifying each of the five complainants and the ten defendants by name, surname and job position. This information was disseminated within the Company because: (i) the Company also sent the same information to a broader list of 15 individuals, which made the identities of the complainants and the accused widely known within the workplace; (ii) the works council then forwarded the email to the complainants themselves; and (iii) one of the accused individuals posted a message in a work-related WhatsApp group, sarcastically thanking the complainants. The latter exposed the complainants' identities and caused them additional emotional distress (an anxiety attack).

    The AEPD imposed a fine of 200,000 euros on the Company for breaching Article 5(1)(f) GDPR and found that the Company had failed to protect the confidentiality of the complainants' personal data during the harassment investigation. In determining the sanction, the AEPD considered:

    • the nature and gravity of the breach – the affected employees suffered anxiety attacks and required medical leave;
    • duration of the infringement - the data was disseminated in a manner that allowed multiple individuals within the workplace to access it;
    • the negligence of the company – it had failed to implement adequate measures to protect the confidentiality of personal data, despite the sensitive nature of the information; and
    • the impact on the 15 affected employees.

    Ultimately, the fine was reduced to 120,000 euros following the Company's acknowledgment of responsibility and voluntary payment.

    Spotlight

    ICO anonymisation guidance – pragmatic but of partial assistance

    On 28 March, the ICO published guidance to help organisations understand effective anonymisation and pseudonymisation techniques and the related data protection law obligations. The guidance is extensive (98 pages including case studies) and, for many data protection practitioners, long overdue. The previous ICO guidance on anonymisation was released in 2012, when the Data Protection Act 1998 was still in force, and the ICO first consulted on the latest incarnation of the guidance in September 2022.

    In this article we look at some of the key points from the guidance and how this compares to the position in the EU.

    Effective Anonymisation 

    Anonymisation, when implemented effectively, offers organisations the opportunity to use data outside the confines of data protection law. However, the lack of specificity in the UK GDPR about the concept, coupled with the absence of up-to-date regulatory guidance, has meant that anonymisation has been challenging, and to a certain extent risky, for businesses to implement in practice.

    The latest guidance from the ICO goes some way towards addressing these points and includes specific sections that are relevant for different types of stakeholders – sections 3 and 4 are intended for technical experts, whilst sections 1 and 4 are intended for decision makers.

    A crucial question for data protection practitioners is the extent to which pseudonymised personal data can be considered "effectively" anonymised under certain conditions. The ICO sets out its position that pseudonymised personal data can, once a certain threshold is met, be considered to meet the legal threshold for anonymisation – for example, where the user of personal data subject to pseudonymisation techniques such as hashing does not have access to the hash key and is unable to identify individuals using other "reasonably likely" means (see the sketch below).
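
    As an illustration, below is a minimal Python sketch of the kind of arrangement the ICO contemplates, on the assumption that a discloser retains a secret hash key and shares only the resulting pseudonyms; the parties, identifiers and use of HMAC-SHA256 are hypothetical rather than drawn from the guidance itself.

        import hashlib
        import hmac
        import secrets

        # Discloser side: pseudonymise identifiers with a secret key that is
        # never shared with the recipient of the data set.
        def pseudonymise(identifier: str, key: bytes) -> str:
            return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

        key = secrets.token_bytes(32)  # retained by the discloser only
        customers = ["alice@example.com", "bob@example.com"]  # hypothetical identifiers
        shared_dataset = [pseudonymise(c, key) for c in customers]

        # Recipient side: holds only the pseudonyms. Without the key, and absent
        # other "reasonably likely" means of identification, the recipient cannot
        # link a pseudonym back to an individual, which is the question the ICO's
        # identifiability analysis asks of the data in the recipient's hands.
        print(shared_dataset)

    Whether the shared data is "effectively anonymous" in the recipient's hands will still depend on the wider identifiability assessment discussed below.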

    This position ultimately turns on the concept of identifiability, which the ICO guidance examines in detail. In particular, the ICO does not view identifiability as absolute but instead suggests a “spectrum of identifiability” approach based on "identifiability risk". The ICO also discusses established anonymisation concepts such as the “motivated intruder” test. This formed part of the ICO's past guidance but has been brought up to date with references to motivated intruders having access to AI tools, such as generative AI chatbots, as sources of information.

    The ICO helpfully confirms its view that “distinguishing one record from others in a table is not sufficient by itself to make the person the record relates to identifiable”. Instead, the ICO recommends that identifiability assessments should consider whether additional sources of data are available to either “take action on a person specifically” or “discover someone’s real-world identity”. This is a pragmatic position, and we hope the ICO will provide further details during its upcoming webinar on the topic on 22 May (you can register here).

    EU vs UK Approach

    As in the UK, the existing anonymisation and pseudonymisation guidance in the EU is several years old and was released before the EU GDPR came into effect.

    We are expecting guidance from the European Data Protection Board (EDPB) this year on both pseudonymisation and anonymisation. The EDPB released draft pseudonymisation guidance for consultation in January (see here) and we are awaiting the finalised guidance.

    In its draft guidance, the EDPB takes a more restrictive view of anonymisation than the ICO. Whilst the ICO accepts that pseudonymised personal data can be considered anonymous, or "effectively anonymous", once a certain threshold is met and the risk of re-identification is sufficiently remote, the EDPB is clear that re-identification must be impossible. This is a much more challenging bar for organisations to meet and, in practice, means that pseudonymisation techniques are limited to acting as important security measures which assist with GDPR compliance.

    An additional complication from an EU perspective is that we are awaiting the ruling of the CJEU in EDPS v Single Resolution Board (see here) – the Advocate General's opinion in this case appears to diverge from the draft EDPB guidelines. We will need to see whether the final CJEU decision is released before the EDPB guidance is finalised and whether this results in a softening of the EDPB's approach to anonymisation.

    Partial Assistance 

    In summary, whilst the ICO’s guidance brings greater clarity on how to achieve anonymisation in compliance with UK laws, it is likely to be of only partial assistance to organisations seeking to anonymise personal data sets subject to both the UK and EU data protection regimes. These organisations will need to wait for the anticipated publication of the EDPB’s guidelines and the CJEU's case law so that the EU and UK regulatory positions can be considered side by side.

    The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
    Readers should take legal advice before applying it to specific issues or transactions.