Data Bytes 56: Your UK and European Data Privacy update for March 2025
22 April 2025

Welcome back to the March edition of Data Bytes. The data and cyber team have had a busy few weeks talking to clients about the Cyber Governance Code of Practice (Cyber Governance Code of Practice - GOV.UK). Cyber risk is a whole-company issue. The code sets out the Government's expectations regarding the governance of cyber security and the actions that directors, including non-executives, need to consider to meet their responsibilities in managing cyber risk. The code is voluntary, but the ICO has endorsed it: “With cyber incidents increasing across all sectors, it is crucial for organisations and businesses to take a proactive approach to cyber governance, including putting the appropriate security measures and training in place to protect people’s data while boosting innovation. We welcome the new Cyber Governance Code of Practice and would encourage organisations to prioritise the digital safety of their assets and, ultimately, their reputation.”
At our recent cyber readiness roundtable (pictured above), held in the beautiful setting of the Crypt of St Paul's Cathedral, we provided our insights and facilitated a discussion for female directors on:
We are working with boards to test readiness, run simulations and support ongoing governance, helping them to demonstrate their cyber readiness. Do get in touch with me if you’d like to discuss further, but for a sneak preview of our insights keep scrolling down to our spotlight section below.
Get your data bytes here.
The ICO published guidance on 28 March to help organisations understand effective anonymisation and pseudonymisation techniques and the related data protection law obligations. The guidance is extensive and the ICO has indicated that specific sections are relevant for different types of stakeholders – sections 3 and 4 are intended for technical experts whilst sections 1 and 4 are intended for decision makers.
The concept of identifiability is examined in detail with the ICO outlining its “spectrum of identifiability approach” as well as established anonymisation concepts such as the “motivated intruder” test.
The ICO helpfully confirms its view that “distinguishing one record from others in a table is not sufficient by itself to make the person the record relates to identifiable”. Instead, the ICO recommends that identifiability assessments should consider whether additional sources of data are available to either “take action on a person specifically” or “discover someone’s real-world identity”.
Whilst the ICO’s guidance brings greater clarity on how to achieve anonymisation in compliance with UK laws, it is likely to be of only partial assistance for organisations seeking to anonymise personal data sets subject to both the UK and EU data protection regimes. These organisations will need to wait for the anticipated publication of the European Data Protection Board’s anonymisation guidelines so that the EU and UK regulatory positions can be considered side by side.
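By way of illustration only (this sketch is our own and is not drawn from the ICO guidance – the key handling, field names and values are assumptions), keyed pseudonymisation of a direct identifier might look like this. Note that because the key can re-link the pseudonym to the individual, the output remains pseudonymised personal data, not anonymised data:

```python
import hmac
import hashlib

def pseudonymise(value: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed pseudonym (HMAC-SHA256).

    The same value and key always yield the same pseudonym, so records
    can still be linked; anyone holding the key can re-identify, which
    is why this is pseudonymisation rather than anonymisation.
    """
    return hmac.new(secret_key, value.encode("utf-8"), hashlib.sha256).hexdigest()

# Illustrative only – in practice the key would be generated securely
# and held by a separate team, away from the pseudonymised data set.
key = b"example-key-held-separately"
record = {"name": "Jane Doe", "postcode": "EC4M 8AD", "diagnosis": "asthma"}
pseudonymised = {**record, "name": pseudonymise(record["name"], key)}
```

Even with the name replaced, the remaining fields (postcode, diagnosis) may still allow a "motivated intruder" to single someone out – which is precisely the identifiability assessment the guidance asks organisations to carry out.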
The ICO announced on 27 March that it has fined Advanced Computer Software Group Limited £3.07 million. This is the first processor fine by the ICO and it relates to a failure to fully implement appropriate security measures such as multi factor authentication prior to a ransomware attack in 2022. The attack impacted the medical information of large numbers of individuals which the organisation stored as processor on behalf of the NHS and other health providers.
The original provisional fine amount was approximately £6 million but this was reduced by the ICO taking account of:
The organisation noted in submissions to the ICO that the remediation and response costs were in excess of £21 million.
Not only is this the first processor fine by the ICO, it is also the first significant ICO enforcement action in the past 12 months. Controllers and processors alike are now on notice that data security failings, particularly where special category data is involved, carry a higher risk of enforcement action. Organisations should factor this into their internal risk registers when considering data and cyber security risks, and bear it in mind when negotiating liability caps in data processing agreements.
Continuing the trend of ICO enforcement, genetic testing company 23andMe was issued on 24 March with a notice of intent from the ICO to impose a £4.59 million fine, together with a preliminary enforcement notice.
This comes nearly 18 months after 23andMe suffered a data breach and follows a joint investigation by the ICO and the Office of the Privacy Commissioner of Canada. Please see here for our previous update on the breach and investigation.
Until recently, the ICO had issued relatively few fines for breaches of the UK GDPR; in 2024, for example, its total fines amounted to only £1.1 million. That figure is now eclipsed by each of the fine against Advanced Computer Software and 23andMe’s notice of intent, both issued in the last month. These two cases show the ICO flexing its enforcement powers again, particularly where special category data is affected.
On 4 March 2025, the Artificial Intelligence (Regulation) Bill was reintroduced in the House of Lords in a renewed effort to establish binding legislation for the regulation of AI in the UK. The Bill was first introduced in 2023 and abandoned when Parliament was dissolved before the 2024 election. The Bill’s provisions have remained unchanged since first introduction in 2023; for more information about the Bill’s history since 2023, see here.
Given the Bill was introduced as a private member's bill, its path to becoming legislation is currently unclear without support from the Government. However, its reintroduction highlights the growing appetite for a clear legislative framework for AI governance in the UK, in spite of the Government’s pro-innovation approach to AI regulation. The Bill is currently at the second reading stage in the House of Lords, so watch this space to see what progress it makes this time.
The Data (Use and Access) Bill (the Bill), which was introduced to Parliament in October 2024, is due to have its report stage and third reading, and the government is still aiming for royal assent in spring!
As the UK progresses the Bill, the European Commission announced on 19 March that it has proposed to adopt an extension of the existing EU/UK adequacy decisions, which provide for the free flow of data between the EU and UK, until 27 December 2025. Once the Bill has become law, the European Commission will then assess whether the changes to the UK data protection regime allow continued adequacy to be granted to the UK. Whilst the Bill does include reforms to specific areas of data protection law, such as automated decision making (see our previous article here), the divergence from the EU GDPR is relatively limited and therefore the likelihood of the adequacy decisions being renewed currently remains high.
On 3 March 2025, the ICO published a blog post outlining its latest work to protect children’s data online and specifically its focus on how social media video sharing platforms collect and use children’s data. The ICO noted that it has begun an investigation into TikTok’s use of personal information of 13-17 year olds in its UK recommender systems which focuses on the amount of data being collected and transparency. Another priority area referenced was age assurance with the ICO noting it has opened investigations into use of age assurance measures by Reddit and Imgur. The ICO concludes by noting that it is continuing to work closely with Ofcom to ensure children in the UK have a better digital experience.
This blog serves as a reminder to organisations of the ICO’s continued focus on children’s data as part of its ICO2025 strategic plan. When published, the outcomes and key findings of these investigations are likely to serve as useful guidance for organisations operating in the online environment. We will be covering in the next edition of Data Bytes an update on further work by the ICO in connection with the use of children’s data in the financial services sector.
On 12 March 2025, John Edwards, the Information Commissioner delivered a speech at IAPP London and addressed the areas of focus for the ICO:
On 17 March, the ICO, following discussions between John Edwards and Chancellor of the Exchequer Rachel Reeves, unveiled a package of measures to support the government’s growth agenda. Data is viewed by the Government and the ICO as being the key to economic growth and investment, not least demonstrated by the Data (Use and Access) Bill. The ICO has therefore committed to, amongst other things:
Meta faced and recently settled a claim by Ms O’Carroll for its targeted advertising based on her online behaviour on Facebook. The ICO had been involved and intervened to assist the Court with the application of the claimant's right to object under the UK GDPR and in a recent statement, has made very clear that: “People have the right to object to their personal information being used for direct marketing, and we have been clear that online targeted advertising should be considered as direct marketing.”
Meta is considering introducing a ‘Consent or Pay’ model in the UK which the ICO has published guidance on here and if implemented, would prevent this scenario reoccurring.
If your organisation undertakes targeted advertising, this case has the potential to set a precedent for you and your organisation’s activities. We strongly advise taking note of the ICO’s very clear statement above and taking a deep dive into your targeted advertising practices to ensure they comply with direct marketing rules.
This is an update for our financial services clients: the FCA and ICO are committed to providing regulatory clarity and certainty to support responsible AI innovation in financial services, and have issued a joint letter addressed to trade association chairs and CEOs. The letter acknowledged the regulators’ history of collaboration and their alignment with the UK government's goal of fostering economic growth through regulatory cooperation.
A recent survey by the FCA and Bank of England highlighted data protection and Consumer Duty as major regulatory constraints to AI deployment, indicating a lack of confidence and potential uncertainty among firms regarding regulatory interactions.
To better understand and address these challenges, the FCA and ICO will host a roundtable with industry leaders on 9 May in London, focusing on areas of regulatory uncertainty, collaboration for greater regulatory support, and specific needs in data protection and financial regulation to enhance innovation. Keep a watching brief on updates following the outcome of this roundtable.
The concept of an “enterprise” under the GDPR fine regime aligns with its definition in Articles 101 and 102 of the Treaty on the Functioning of the European Union (“TFEU”). This means that when a data controller, whether acting alone or as part of a larger corporate entity, is fined for violating the GDPR under Article 83(4) to (6), the maximum fine is calculated as a percentage of the total worldwide annual turnover of the preceding financial year. This approach takes into account the entity’s actual economic capacity, ensuring penalties remain effective, proportionate and dissuasive.
However, the interpretation of an “enterprise” under competition law has no impact on whether an administrative fine can be imposed on a legal entity acting as a data controller. The power to sanction such an entity stems exclusively from Article 58(2) and Article 83(1) to (6) of the GDPR. The notion of an enterprise is relevant only for determining the fine’s amount, as confirmed by a Court of Justice of the European Union (CJEU) ruling of 5 December 2023 (Case C-807/21).
This interpretation is particularly important in the context of administrative fines under Article 83(4) to (6) of the GDPR. Recital 150 of the GDPR refers to the TFEU definition of an enterprise to establish that when the recipient of a fine is part of a larger corporate group, the maximum fine is based on the global turnover of the entire enterprise.
Nevertheless, distinguishing the maximum fine from the actual fine imposed is crucial. While the maximum fine depends on global turnover, the actual fine is determined by the supervisory authority based on the severity of the infringement and the behavior of the offender. Additionally, the concept of an enterprise remains relevant in assessing whether the imposed fine is economically viable, proportionate, and sufficiently deterrent in relation to the financial strength of the entity concerned.
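As a purely arithmetical illustration of the distinction above (the turnover figure is hypothetical, and the sketch covers only the Article 83(5) tier of up to EUR 20 million or 4% of total worldwide annual turnover, whichever is higher):

```python
def article_83_5_cap(worldwide_turnover_eur: float) -> float:
    """Maximum (not actual) fine under Article 83(5) GDPR: the higher of
    EUR 20 million and 4% of the total worldwide annual turnover of the
    preceding financial year, assessed at 'enterprise' (group) level."""
    return max(20_000_000.0, 0.04 * worldwide_turnover_eur)

# Hypothetical group turnover of EUR 2 billion gives a cap of EUR 80 million;
# the actual fine is then set below this cap by the supervisory authority,
# based on the severity of the infringement and the offender's behaviour.
cap = article_83_5_cap(2_000_000_000)
```

This is why attributing the enterprise-level (rather than entity-level) turnover matters: the cap scales with the group's economic capacity, while the fine actually imposed remains a separate, case-specific determination.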
For more details, you can access the article here.
The European Data Protection Board (EDPB) has launched its 2025 Coordinated Enforcement Framework (CEF). Following the 2024 coordinated enforcement action on the right of access, the 2025 CEF will focus on the implementation of another fundamental data protection right, namely the right to erasure or the "right to be forgotten," as enshrined in Article 17 of the General Data Protection Regulation (GDPR).
This topic was selected by the EDPB during its October 2024 plenary session, as the right to erasure is among the most frequently exercised rights under the GDPR and is the subject of numerous complaints submitted by individuals to data protection authorities (DPAs).
Next Steps:
For more details, you can access the article here.
On 13 March 2025, EU Commissioner Michael McGrath (DG Justice and Consumers) confirmed in an interview at the Center for Strategic & International Studies (CSIS) that the European Commission (EC) plans a future omnibus package, particularly around record-keeping for SMEs with fewer than 500 employees, to ease the burden on smaller organisations in relation to the retention of records while preserving the underlying core objectives of the GDPR regime.
Further, Axel Voss, a MEP who played a leading role in the legislative process of the GDPR, and Max Schrems, a data protection activist and privacy lawyer, have voiced their support for a new three-layered risk approach under the GDPR:
"GDPR mini" layer: This should apply to businesses that process data from fewer than 100,000 data subjects and that are not handling special categories of data. Controllers falling under the GDPR mini category would not need to appoint a data protection officer, have to follow simplified transparency rules, provide reduced documentation and would be subject to lower administrative fines (capped EUR at 500.000).
"GDPR normal" layer: The "GDPR normal" should address businesses that process sensitive data or operate data at a large scale, but still do not reach the scale of large tech companies. The GDPR shall remain as it is for these companies; the legislator should remove certain outdated provisions.
"GDPR plus" layer: This layer shall apply to very large online platforms (VLOP), online advertisers and data brokers – essentially all companies whose business model is built fundamentally on the processing of personal data. As a threshold, Voss and Schrems suggest processing data of 10M+ individuals or handling data of 50%+ of a country's population. Companies within the scope should have mandatory annual external audits similar to financial audits. Further, Voss and Schrems demand stronger transparency obligations and a reversed burden of proof, meaning that such companies must prove data protection compliance, not the regulators.
On 5 March 2025, the European Data Protection Board (EDPB) launched its 2025 Coordinated Enforcement Framework (CEF). This initiative strengthens cooperation among the 32 data protection authorities (DPAs) across Europe to enforce data protection rights. This year’s focus is the right to erasure (Article 17 GDPR), following last year’s emphasis on access rights.
DPAs will contact businesses across sectors, conducting fact-finding or initiating formal investigations to assess compliance. Data controllers will need to demonstrate robust erasure processes, as non-compliance may result in fines under GDPR. Smaller businesses, especially those with limited data protection resources, are well advised to proactively address this requirement.
On 5 March 2025, the EU published the European Health Data Space Regulation (EHDS) in the Official Journal, and it entered into force on 26 March 2025. The EHDS aims to enable natural persons to access and control their personal electronic health data, saved in their respective Electronic Health Records, regardless of where they seek healthcare services in the EU (such as when they are away from their regular healthcare practitioners). As a secondary purpose, the EHDS aims to improve the use of electronic data in the healthcare sector. For example, research and health institutions can use health data securely for specific purposes.
The EHDS places significant emphasis on anonymised or pseudonymised data, particularly for its secondary purpose, such as research and innovation. Right on time, on 16 January 2025, the European Data Protection Board ("EDPB") published its draft Guidelines on pseudonymisation (Guidelines 01/2025). These Guidelines place high demands on pseudonymisation and blur the lines between anonymisation and pseudonymisation. It remains to be seen to what extent the EDPB will revise the Guidelines to align them with the comments provided in the consultation process, and whether it will provide further guidelines on anonymisation.
On 5 March 2025, the EU Commission published an updated version of its model contractual clauses ("EU MCC") for high-risk AI systems. The EU MCC aim to help public organisations procure AI systems developed by external suppliers. The EU Commission provides the EU MCC in two versions – "AI clauses high-risk" and "AI clauses non-high-risk" – translated into all EU languages.
The update includes a full version for high-risk AI systems that is aligned with the EU AI Act, a light version for non-high-risk AI systems that is customisable to specific needs, and an explanatory note on how to use, customise and apply the clauses in practice.
The French Data Protection Authority (CNIL) has announced that the "Compliance Club" dedicated to connected vehicles and mobility will focus its 2025 efforts on issuing recommendations regarding the use of onboard cameras ("dashcams") in privately owned vehicles in various contexts. Given the growing adoption of these devices and the absence of a specific legal framework, CNIL aims to develop clear recommendations to address privacy concerns, data protection rules, and compliance obligations.
The CNIL acknowledges that onboard cameras raise privacy and legal concerns, as they capture individuals without a clear regulatory framework. It is necessary to define the legal basis for data collection, set retention limits, and ensure security measures to protect recorded footage while balancing user interests and privacy rights.
To address these issues, CNIL will hold thematic workshops from April to June 2025, bringing together industry stakeholders. Discussions will cover data processing rules, retention limits, user rights, security measures, and potential anonymization techniques to mitigate privacy risks.
CNIL’s regulatory approach relies on collaboration with public and private stakeholders. Recommendations will be developed through the Compliance Club and undergo public consultation before adoption. Regular updates, including workshop agendas and summaries, will be published to ensure transparency.
The 2025 focus excludes interior vehicle cameras used in professional contexts, as CNIL has already provided guidance on workplace surveillance. Once the onboard camera work is complete, CNIL will define new regulatory priorities for 2026, balancing privacy risks with the need for legal clarity in mobility innovations.
For more details, you can access the article here (French only).
The case involves a complaint filed on 13 November 2023 by a customer, D. A.A.A., against SOCIEDAD CONJUNTA PARA LA EMISIÓN Y GESTIÓN DE MEDIOS DE PAGO EFC SA ("Iberia Cards"). The customer had requested the cancellation of their credit card from Iberia Cards and the deletion and blocking of their personal data on 28 November 2022. Nearly a year later, on 31 October 2023, the customer applied for a new card and was informed by Iberia Cards on that same date, and again on 2 November 2023, that they could not benefit from new customer promotions because they had been a previous customer. The customer claimed that this response indicated that their data had not been properly deleted or blocked as previously confirmed, and provided copies of emails confirming the data suppression. Iberia Cards argued that it had complied with the data deletion request and that the customer's data was blocked. It stated that the system detected the previous customer status only when the new card request was made, which is why the new customer promotion was not applicable.
The Spanish Data Protection Agency (AEPD) concluded that Iberia Cards had unlawfully processed the customer's personal data after it had been blocked, because it had processed the blocked data to determine the customer's previous customer status, which is not permitted under data protection laws. The AEPD imposed a fine of 20,000 euros on Iberia Cards for violating Article 6.1 of the General Data Protection Regulation (GDPR), reduced to 16,000 euros after Iberia Cards opted for voluntary payment and renounced any administrative recourse against the sanction.
The AEPD's reasoning is that Iberia Cards processed the customer's personal data without a lawful basis, breaching Article 6.1 of the GDPR. Once data is blocked, it cannot be processed for any purpose other than those explicitly allowed by Article 32 of the Spanish Organic Law 3/2018 on Data Protection and Guarantee of Digital Rights, such as legal claims or obligations, and only within the applicable limitation periods. The data should therefore not have been used to determine the customer's eligibility for new customer promotions.
The Supreme Court ruling dated 19 February 2025 addresses a cassation appeal filed by GAMBOA AUTOMOCIÓN SA ("Gamboa") against a decision of the Provincial Court of Oviedo. The case involves a fraud incident in which a transfer was made under false pretences due to a security breach in Gamboa's email system. The court upheld the previous ruling holding Gamboa subsidiarily liable in civil terms for the damages incurred by COMERCIO Y ASISTENCIA S.A. ("CYASA").
On 10 April 2018, unidentified individuals impersonated an employee from Gamboa's commercial department through that employee's email account and sent an email to CYASA's administration department instructing a transfer of 32,594.75 euros for a Nissan vehicle to a fraudulent account. CYASA made the transfer on 11 April 2018, believing the email to be legitimate. The fraudulent account had been opened by Matías (who was not an employee of Gamboa) with the intent to receive the transferred funds without raising suspicion. Matías withdrew 3,000 euros on 12 April 2018 and another 10,000 euros on 13 April 2018 before the account was blocked, leaving 19,619.75 euros, which were returned to CYASA.
The Spanish Supreme Court analysed whether Gamboa should be held subsidiarily liable due to the security breach in its email system and its failure to inform other dealers of the breach. The Court declared Gamboa's subsidiary civil liability for the fraud committed because: (i) Gamboa knew of the security breach on 10 April 2018 and, despite this knowledge, failed to alert other dealers (including CYASA), which led to the fraudulent transfer; (ii) the breach of security in Gamboa's email system constituted an infringement of professional duty, making Gamboa liable under Article 120.3 of the Penal Code, as the term "establishment" is understood broadly to include digital systems used in business operations; (iii) there was a causal link between Gamboa's failure to notify other dealers of the breach and the fraudulent transfer, as the fraud would not have occurred had Gamboa promptly informed CYASA and other dealers about its security issue; and (iv) Gamboa's failure to act upon the security breach directly contributed to the financial loss suffered by CYASA.
The National Cyber Security Centre (NCSC) has warned of a "widening gap" between increasingly complex cyber threats (including those involving AI) and the UK's cyber defensive capabilities. The NCSC has also flagged that the severity of the cyber risk in the UK is "widely underestimated" and reports a need for businesses and other organisations to boost their cyber security.
In response, the UK Government is drafting, consulting on and publishing an assortment of obligations and standards to uplift organisational resilience. For example, the UK Government has issued a voluntary code of practice for the cyber security of AI (see our briefing here), the Home Office has opened an ongoing public consultation on proposals to address the threat and impact of ransomware, and the Cyber Security and Resilience Bill, which intends to expand the remit of existing cyber security legislation, is expected in Parliament imminently.
What you may have missed is a consultation by the Government on a Cyber Governance Code of Practice which seeks to formalise the UK Government's expectations of directors for governing cyber risk in the same way as any other material or principal business risk. The Code has now been finalised and is available here: Cyber Governance Code of Practice - GOV.UK. It consists of five principles for Boards and directors, each accompanied by several corresponding actions.
The Ashurst cyber team have been talking to clients about what this means in practice. "The Cyber Governance Code of Practice represents a significant step forward in addressing the widening gap between complex cyber threats and the UK's defensive capabilities. It underscores the critical need for businesses and organisations to elevate their cyber security measures to board-level priorities. By formalising expectations for directors to govern cyber risk with the same rigour as other principal business risks, the Code ensures that cyber resilience is integrated into the core of enterprise risk management. It emphasises the need for a comprehensive approach to cyber governance, integrating risk management, strategy, training, incident planning, and assurance.” – Rhiannon Webster, UK head of data protection and cybersecurity.
An overview of each principle follows, together with our view of what they mean in practice:
1. Risk management
This includes seeking assurance that risk assessments are conducted regularly, cyber risks are embedded as part of an organisation’s broader enterprise risk management framework, controls have been identified and mapped to the identified risks, and there are clear accountabilities for cyber risk that go beyond the Chief Information Security Officer.
"You need to know where your cyber risks are, assess those risks, understand what the impacts are if these risks materialise, and have robust controls and mitigations for these risks. Cyber risks evolve quickly, therefore a key part of effective cyber governance is revisiting your risk assessment and adjusting your risk appetite. We often see red flags such as businesses stating a zero tolerance for cyber risk, which just isn’t possible. This can mean you do have not adequately considered your cyber risk and appropriate controls. Your risk position also needs to be constantly re-synced internally with developments in both your organisation's capabilities, and externally with the capabilities of cyber threats." – Matt Worsfold, Risk Advisory Partner, London
2. Cyber strategy
This includes monitoring and reviewing a cyber resilience strategy and its delivery and ensuring resources and investments are appropriately allocated and used effectively.
"Resourcing and investment is one way to monitor effective cyber risk management. Red flags we see are IT or security teams requesting budget that is rejected by the Board or not requesting a budget at all. This could mean that something is out sync with your cyber resilience strategy, particularly where you do not have a team actively reinvigorating their cyber budget." – John Macpherson, Risk Advisory Partner, Australia.
3. People
This includes curating a cyber security culture which encourages positive behaviours and accountability, supported by clear policies. Boards should ensure an organisation has effective cyber security training, education and awareness programmes, and that metrics are in place to enable the board to measure their effectiveness.
"Creating an effective culture takes time and effort and needs to be assessed, monitored and embedded. Tone from the very top is crucial and, for that tone to be credible, all board members need a baseline of cyber literacy as their duties require of them. That’s where targeted training comes in." – Will Chalk, Partner, Corporate Governance, London
4. Incident planning and response
This includes ensuring a cyber incident plan is in place for business critical processes, technology and services and that there is at least annual testing of the plan. The Board should also support executives in critical decision making and external communications.
"One of the key takeaways is the importance of continuous reassessment of cyber risks and ensuring that the organisation's cyber resilience strategy is aligned with evolving threats. This proactive stance is crucial, especially as we see an increasing number of claimants seeking compensation for damages resulting from cyber incidents. Effective planning and involving legal teams in incident response can mitigate the risks of poor incident management and potential litigation." Amanda Ludlow, Digital Economy Practice Head.
"Claimants are trying to seek compensation for damages resulting from a cyber incident with an associated risk of litigation and regulatory enforcement action. Actions taken (or not taken) and documentation produced in the aftermath of a cyber incident can be critical. We offer training for legal teams on their role in responding to cyber incidents, including managing communications, protecting legal professional privilege, dealing with complaints and seeking compensation." – Jon Gale, Partner, Dispute Resolution, London
5. Assurance and oversight
This includes establishing clear roles, responsibilities and ownership of cyber resilience at both executive and non-executive director level, formal reporting on at least a quarterly basis, and regular monitoring of the organisation's cyber resilience.
"Many cyber incidents are caused by small lapses in security that are rooted in an organisation's governance. In nearly all the post incident reports we do, the root cause was already known to the organisation – it's an overdue action item in an audit report or IT issues register; a misconfigured control that wasn’t tested properly; or human error caused by the employee who hasn’t completed their mandatory cyber training. Designing your assurance activities requires ongoing engagement with the front line of the business and a deep understanding of the threat and risk environment” John Macpherson, Risk Advisory Partner, Australia
If you would like further details on any of the above, please get in touch.
The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
Readers should take legal advice before applying it to specific issues or transactions.