Data Bytes 58: Your UK and European Data Privacy update for May 2025
16 June 2025
I’m not sure we ever thought we’d see the light of day for a new post-Brexit data protection law. After many false starts, changes of government and some “late in the day” parliamentary ping pong, on 11 June Parliament passed the Data (Use and Access) Bill, which is now ready to be given royal assent, at which time it will be known as the Data (Use and Access) Act 2025.
The main provisions of the Act will be brought into effect through implementing regulations, with key changes from a data protection and privacy perspective relating to automated decision making, cookie compliance and enforcement and reform of the ICO.
The EU Commission can now also get to work re-assessing the UK's adequacy status for data transfers, which had been delayed pending the passage of the bill.
Watch this space for further updates from the Ashurst Digital Economy team over the coming weeks as we plan to deep-dive on specific aspects of the Act and what it all means in practice.
Moving to the East, keep scrolling to our spotlight section, where Raheel Butt, the newest member of our global data team, analyses Saudi Arabia’s data protection laws.
Get your data bytes here.
On 1 May, the ICO and Canada's Office of the Privacy Commissioner (OPC) jointly released a statement calling for stringent protections of 23andMe's sensitive customer data during the company's ongoing bankruptcy proceedings. In a letter addressed to the US Trustee overseeing the case, the regulators emphasised that any sale or transfer of 23andMe's assets must comply with UK GDPR and Canada's PIPEDA, warning of potential enforcement actions if data protection obligations are not met.
This development follows the ICO's March 2025 issuance of a provisional £4.59 million fine against 23andMe, stemming from a 2023 data breach that compromised the personal information of approximately 6.9 million users. The ICO and OPC have expressed concern over the protection of highly sensitive information, including genetic data, health reports, and self-reported health conditions of 23andMe's customers.
23andMe filed for Chapter 11 bankruptcy protection on the same day the ICO announced its intended fine. The statements from the regulators highlight the challenges of effectively safeguarding personal data when companies enter into insolvency or restructuring processes. Insolvency practitioners should note that the ICO has previously issued guidance on the sharing of personal data where an asset is being sold during insolvency proceedings.
On 10 May, the ICO released a statement calling for organisations to improve their cyber security and protect people's personal information. The ICO referred to its own "Learning from the mistakes of others" report which contains practical advice to help organisations understand common security failures and how they can prevent future data breaches before they happen.
ICO Deputy Commissioner Stephen Bonner stressed the importance of having foundational controls in place to protect people's personal information and mentioned that the ICO will take action against organisations that are not taking simple steps to secure their systems.
The report focuses on the five leading causes of cyber security breaches, methods of reducing the risks of each, and future developments in these areas.
By releasing this statement, the ICO is putting organisations on notice about the specific areas of their security programmes on which to focus resources, planning and testing.
On 13 May, the ICO released for consultation draft updates to its encryption guidance. The guidance explains how UK data protection law applies when organisations use encryption techniques in different contexts.
The draft guidance is aligned with the latest ICO anonymisation and pseudonymisation guidance, noting that encryption is technically a pseudonymisation technique and that encrypted data remains personal data “in your hands” to the extent you have the means available to re-identify individuals through decryption of the data set.
The “encryption scenarios” section of the draft guidance will be particularly relevant for data protection practitioners’ discussions with information security teams in their organisations.
You can respond to the consultation until 24 June via a survey available on Citizen Space.
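The ICO's point that encrypted data remains personal data "in your hands" can be illustrated with a minimal sketch. This is a toy XOR cipher standing in for real encryption (e.g. AES), purely to show the legal distinction: whoever holds the key can re-identify the individual, while a recipient without the key cannot.

```python
# Illustrative sketch only - NOT production cryptography. It shows why the
# draft guidance treats encryption as a pseudonymisation technique: the key
# holder can reverse the transformation, so the data stays personal data in
# their hands; a party without the key sees only unintelligible bytes.
import secrets

def xor_transform(data: bytes, key: bytes) -> bytes:
    """Toy one-time-pad-style XOR; applying it twice with the same key
    recovers the original input."""
    return bytes(d ^ k for d, k in zip(data, key))

record = b"Jane Doe, born 1980"          # hypothetical personal data
key = secrets.token_bytes(len(record))   # key held only by the controller

ciphertext = xor_transform(record, key)  # what a processor might hold
recovered = xor_transform(ciphertext, key)

assert recovered == record     # with the key: re-identification is possible
assert ciphertext != record    # without the key: no direct identification
```

The same logic underpins the guidance's "encryption scenarios": the compliance analysis turns on who holds the decryption key, not on the encryption itself.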
On 21 May the EU Commission proposed a comprehensive package of measures aimed at reducing administrative burdens by simplifying certain regulatory obligations to free up resources for growth and investment, thereby enhancing the competitiveness and innovative capacity of EU businesses. This is part of a broader strategy to streamline EU rules with further measures anticipated in June 2025 focusing on the defence sector.
As part of this effort, the EU Commission introduced a new category of "small mid-cap companies": businesses with fewer than 750 employees and up to EUR 150 million in turnover or up to EUR 129 million in total assets. Such small mid-cap companies that do not otherwise undertake high-risk processing would benefit from simplified record-keeping, as they would not need to maintain a record of processing activities (ROPA), i.e. the proposal would modify the GDPR ROPA exemption:
to not apply to high risk processing;
to delete the reference to "occasional processing" as a condition for the exemption; and
to potentially remove the reference to processing of special categories of data (although the recitals would clarify that processing such data for compliance with legal obligations in employment, social security, or social protection law would not trigger the record-keeping requirement).
This classification is designed to ease the compliance burdens for nearly 38,000 companies that are currently subject to significantly higher compliance requirements when they exceed the traditional SME GDPR ROPA threshold of 250 employees.
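The proposed thresholds can be sketched as a simple eligibility check. This is one reading of the criteria as described above (employee cap plus either a turnover or a total-assets ceiling); the helper name is hypothetical and this is no substitute for assessing the final enacted text.

```python
# Hypothetical helper reflecting one reading of the proposed "small mid-cap"
# thresholds in the EU Commission's simplification package (not legal advice).
def is_small_mid_cap(employees: int,
                     turnover_eur: float,
                     total_assets_eur: float) -> bool:
    """Fewer than 750 employees AND (turnover <= EUR 150m OR
    total assets <= EUR 129m)."""
    return employees < 750 and (
        turnover_eur <= 150_000_000 or total_assets_eur <= 129_000_000
    )

# A 500-employee business qualifies via the turnover ceiling...
assert is_small_mid_cap(500, 120_000_000, 200_000_000)
# ...but 800 employees exceeds the headcount cap regardless of financials.
assert not is_small_mid_cap(800, 100_000_000, 100_000_000)
```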
The EDPB and EDPS have issued a joint response to the European Commission’s proposal.
On 12 May the EU Commission published detailed FAQs on AI literacy (Article 4 EU AI Act), providing its interpretation of the provisions for the first time. The FAQs:
emphasise that AI literacy constitutes an “obligation to take measures” for providers and deployers;
clarify the target groups in scope: employees, contractors and service providers, clients and end-users who interact with AI systems;
are clear that providers and deployers of AI systems should tailor training sessions to participants’ individual knowledge levels and consider the specific context;
explain that when using generative AI systems (such as ChatGPT or MS Copilot), providers and deployers must identify specific risks, including the risks resulting from hallucinations or bias; and
state that, according to the EU Commission, the AI Office (the EU-level body instituted under the AI Act) does not intend to define sector-specific requirements nor to impose strict requirements or mandatory trainings.
The EU Commission states that it is currently preparing a dedicated webpage on AI literacy and skills and will publish guidelines on the requirements for High-Risk AI Systems and the responsibilities along the AI Value Chain which will also address issues of AI literacy.
The European Union and Singapore have signed a landmark Digital Trade Agreement (DTA), marking a significant step in the deepening of their digital and economic relations. The DTA:
is a modern, self-standing agreement – separate to the Free Trade and Investment Protection Agreement – that sets high standards for digital trade rules, aiming to enhance consumer protection, facilitate trusted cross-border data flows, and prevent protectionist practices by prohibiting unjustified data localisation requirements;
includes rules on privacy and personal data protection, electronic contracts, authentication and trust services, online consumer trust, and regulatory cooperation on digital trade;
stipulates general principles and objectives, requiring the establishment of a non-discriminatory legal framework for personal data protection, considering international standards; and
ultimately emphasises the desire for greater alignment and ensures full respect for the EU’s data protection framework and preserves the EU’s regulatory autonomy to pursue legitimate public policy objectives.
The DTA is currently undergoing ratification, with the EU requiring the consent of the European Parliament.
On 29 April, the General Court of the European Union dismissed the appeal brought by Meta Platforms Ireland Ltd, which sought the annulment of, and compensation for, Opinion 8/2024 of the EDPB concerning the validity of consent in “consent or pay” models under the GDPR.
Although Meta's challenge was dismissed, the General Court ruled that this was because the opinion is not legally binding on Meta; as the opinion could not be challenged, Meta's complaint was inadmissible.
The EDPB has issued several opinions recently on a range of topics, and organisations should note that these are intended to guide EU data protection authorities rather than to serve as binding guidance.
Meta has the option to appeal to the CJEU.
On 2 May, the DPC imposed a total fine of EUR 530 million on TikTok, comprising:
EUR 45 million for not fulfilling transparency obligations; TikTok had failed to specify that the processing included remote access to personal data stored in Singapore and the United States by personnel based in China and had not explained the nature of the processing operations that constituted the transfer; and
EUR 485 million for breaching GDPR data transfer rules (Article 46 GDPR).
According to the DPC, "TikTok’s transfers to China infringed Article 46 sentence 1 GDPR because it failed to verify, guarantee and demonstrate that the supplementary measures and the Standard Contractual Clauses (SCCs) were effective to ensure that the personal data of EEA users transferred via remote access were afforded a level of protection essentially equivalent to that guaranteed within the EU".
Whilst TikTok had entered into SCCs and had conducted a Transfer Impact Assessment (TIA), it had not sufficiently evaluated whether the implemented protective mechanisms were truly effective within the context of the Chinese legal environment. According to the DPC, Chinese regulation such as the Anti-Terrorism Law, the Counter-Espionage Law, the Cybersecurity Law, and the National Intelligence Law grant Chinese authorities extensive access to data, without transparent procedures or effective legal remedies.
Any companies transferring personal data to China or with Chinese group companies should take heed of these fines and ensure that their TIAs comprehensively account for and assess these specific Chinese regulations.
The validity of the consent or pay model is being tested across Europe. First we had the ICO’s consultation (see our previous Data Byte on the ICO’s consultation), in April we saw Meta’s appeal against the EDPB opinion on its model rejected (see our EU updates above), and now the Italian Data Protection Authority has initiated a public consultation to assess the lawfulness of user consent for profiling activities collected by multiple data controllers through the adoption of the so-called "pay or ok" model (also referred to as "pay or consent" or "consent paywall").
By way of reminder, under this model, users seeking access to online content or services are compelled to choose between subscribing to a paid service or consenting to the processing of their personal data – via cookies and tracking technologies – for commercial profiling purposes; in the absence of either option, access to the website is denied.
This initiative is part of ongoing investigations by the Italian Authority into several newspaper publishers adopting this business model, which is considered controversial under privacy regulations, specifically the GDPR and the e-Privacy Directive. The consultation is open to all stakeholders and seeks to gather input to identify technical and operational solutions – such as alternative models for content access – that ensure consent is freely given, specific, and informed. Contributions must be submitted to the Authority (preferably via email to: protocollo@gpdp.it, or via certified email to: protocollo@pec.gpdp.it), by 11 July 2025 with the subject line: Consultazione pubblica sul modello “pay or ok” ("Public consultation on the 'pay or ok' model").
Targeted advertising clearly continues to be an area of regulatory focus and scrutiny so continue to approach with caution and ensure that you are doing a thorough analysis of any such activities to ensure compliance with marketing rules and regulatory expectations.
A complaint was filed with the Spanish Data Protection Agency (AEPD) against the General Council of Notaries (CGN) regarding the requirement to upload images of both sides of a national ID to register and use the Notarial Citizen's Portal. The complainant argued that this requirement was excessive and unnecessary, as an electronic signature should suffice for identification. The investigation revealed that the uploading of ID images was only mandatory for certain notarial services, particularly those initiated online which did not require subsequent in-person appearance. The portal’s privacy policy stated that the images would be sent to the notary providing the requested service and could be deleted by the user after the service was completed.
The AEPD concluded:
that the CGN lacked a valid legal basis for the retention of ID images beyond what was necessary for the requested notarial service. While notaries are legally required to verify the identity of individuals and, in certain cases, retain copies of identification documents (especially under anti-money laundering laws), this obligation does not extend to the CGN itself, which acts as the portal administrator. The AEPD was clear that the processing and retention of ID images must be limited to what is strictly necessary and have a valid legal basis such as explicit consent or legal obligation; the CGN was found not to have obtained valid GDPR consent for the retention of ID images;
that the CGN failed to provide adequate information to users regarding its processing and retention of their data; in particular, it did not provide clear information about the purposes and duration of such retention, thus violating the principles of data minimisation and transparency; and
that the DPO must act independently and cannot simultaneously hold a position that determines the purposes and means of data processing.
The AEPD ordered the CGN to:
implement corrective measures within six months to ensure compliance with its GDPR obligations regarding legal basis and information obligations; and
appoint a new, independent DPO and to revise its data processing practices, and information provided to users to ensure compliance with the GDPR.
This is yet another reminder of the importance of: (i) ensuring that your DPOs, if they hold another role or responsibility within your organisation, are not involved in any activities that determine the purposes and means of processing; and (ii) maintaining a clear separation of roles and responsibilities to avoid conflicts of interest in data protection governance.
The French Data Protection Authority (CNIL) released its 2024 Annual Report (French only), offering a comprehensive overview of its regulatory activity in a year marked by the increasing complexity of digital ecosystems and the accelerating deployment of artificial intelligence. The report reaffirms CNIL’s central role in upholding the rights enshrined in the GDPR and French data protection law, through a combination of robust enforcement, anticipatory guidance, and institutional cooperation. Key points to note from the report are summarised below.
The 2024 report confirms CNIL’s dual function as both a regulator and a normative authority. As digital technologies evolve at an unprecedented pace, CNIL’s activity illustrates the critical balance between legal certainty, technological innovation, and the effective protection of fundamental rights.
Saudi Arabia's Vision 2030 strategy has sparked transformative shifts across multiple regulatory domains, with data protection emerging as a cornerstone of its digital governance agenda. Central to this development is the Personal Data Protection Law (PDPL), first promulgated under Royal Decree No. M/19 dated 16 September 2021. The PDPL was later amended by Royal Decree No. M/148 on 27 March 2023 and formally entered into force on 14 September 2023. It represents a foundational legal instrument intended to establish a robust framework for data protection, balancing the Kingdom’s aim of stimulating investment in digital transformation and technology with local legal and cultural imperatives.
We discuss below the key provisions of the PDPL including data transfer restrictions as well as the proposed amendments to the implementing regulations of the PDPL which are currently subject to public consultation.
The PDPL applies to all processing of personal data that occurs within the territory of Saudi Arabia, regardless of the data subject’s nationality or residency status. Furthermore, the PDPL incorporates extraterritorial application similar to the EU and UK GDPR by extending its reach to entities not established in the Kingdom that process the personal data of individuals located in Saudi Arabia.
Legal bases for processing under the PDPL include consent, contractual necessity, compliance with legal obligations, legitimate interest, and processing justified by public interest. The legitimate interest lawful basis is subject to narrow regulatory interpretation as it must be supported by a demonstrable balancing test that respects the rights and expectations of the data subject.
Pursuant to Article 29 of the PDPL, the transfer of personal data outside the Kingdom is permissible only under defined conditions: such transfers must not jeopardize national security or the vital interests of the Kingdom, must incorporate adequate safeguards to protect data confidentiality, must be limited to the minimum amount of personal data necessary for the intended purpose, and must receive prior approval from the competent authority, as prescribed in the executive regulations.
Authorised transfer mechanisms include those recognized by the Saudi Data and Artificial Intelligence Authority (SDAIA), such as binding corporate rules and standard contractual clauses. Importantly, all cross-border data transfer arrangements must be documented in Arabic and supported by formal risk assessments.
Enforcement of the PDPL is administered by SDAIA, supported by sector-specific regulators such as the Saudi Central Bank and the Communications, Space and Technology Commission. The law empowers regulators to impose administrative fines of up to SAR 5 million per violation and permits the imposition of criminal penalties for serious infractions, including the unauthorized disclosure of sensitive data. The framework also includes provisions for mandating third-party audits, suspending data processing operations, and issuing compliance orders. As the initial grace period expired in September 2024, SDAIA has signaled a more proactive enforcement stance moving forward.
In April 2025, SDAIA initiated a public consultation on proposed amendments to the Implementing Regulations of the PDPL. Should the proposed amendments be enacted in their current form, organisations will need to comply with a series of additional compliance measures.
The PDPL represents a significant inflection point in Saudi Arabia’s legal treatment of data privacy. While conceptually informed by international models such as the GDPR, the PDPL incorporates unique local elements, particularly in relation to data localisation, language requirements, and regulatory interfacing. The proposed 2025 amendments, if enacted, will introduce a further suite of compliance obligations for organisations operating in the Kingdom.
Organisations operating in or targeting the Saudi market must invest in jurisdiction-specific expertise and infrastructure to navigate the complex and evolving data protection landscape.
The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
Readers should take legal advice before applying it to specific issues or transactions.