Legal development

Data Bytes 60: Your UK and European Data Privacy update for July and August 2025


    Welcome back to a new term of Data Bytes. Having given the Data Bytes editorial team the summer off (err... apart from the small matter of recording and publishing a podcast series on the UK Data (Use and Access) Act! 4 episodes down, 2 to come – you can access the podcasts to date here), we have a bumper “back to school” edition for you, covering updates from July and August.

    We haven't forgotten the small matters of the Latombe (EU-US data privacy framework remains intact) and SRM (pseudonymous data is not always personal data) judgments; they will fall into September’s updates.

    Keep scrolling below to our spotlight section, which in this issue focusses on the smart data provisions contained within the Data (Use and Access) Act, with a comparison between the EU Data Act and the proposed UK framework.  

    Get your data bytes here (and please listen to our podcast too!) 

    UK Updates 

    First Data (Use and Access) Act provisions come into force 

    Following receipt of Royal Assent on 19 June 2025, the first provisions of the Data (Use and Access) Act 2025 (DUAA) entered into force on 19 and 20 August 2025. The DUAA provides a framework through which its provisions will be implemented over the next 12 to 18 months via secondary legislation. The provisions that commenced on 19 and 20 August 2025 include technical provisions, new statutory objectives for the ICO, and provisions requiring the government to prepare a progress update and a report on copyright works and artificial intelligence systems. The next tranche of provisions to enter into force (called Stage 2) is expected to commence in September or October 2025 and will introduce most of the provisions on digital verification services and certain other measures regarding retention of information. Listen to our podcast for further detail.

    UK’s adequacy finding by the EU 

    On 22 July 2025, and following Royal Assent of the DUAA, the European Commission issued a draft implementing decision in which it said it continues to assess UK data protection standards as being “essentially equivalent” to those in force in the EU and proposes to renew the UK adequacy decisions for a period of six years. This is a welcome relief for UK businesses receiving data from the EU, as the current adequacy decisions will expire on 27 December 2025.

    The Commission’s assessment of the UK’s data protection framework included a review of the DUAA and its decision concluded that despite the changes and some divergence between UK and EU regimes, the UK law continues to give adequate protection. Without the adequacy decisions being in place, organisations wishing to transfer personal data from the EU to the UK face much greater compliance hurdles. 

    The draft adequacy decisions still require formal approval by the EU member state governments and are subject to a non-binding opinion to be published by the European Data Protection Board. Nevertheless, the decisions are expected to be adopted by the 27 December 2025 deadline.

    ICO Consultations and Guidance 

    The ICO chose the summer months to publish a number of new pieces of guidance and to consult on existing guidance:

    • On 7 July 2025, it opened a consultation on its approach to enforcement of the regulation 6 consent requirements of the Privacy and Electronic Communications Regulations (“PECR”), which closed on 29 August 2025. The ICO is exploring how publishers could deliver advertising to users who have not given consent, where the risks to privacy are low.
    • On 7 July 2025, it also opened a consultation on a new chapter within its draft guidance on storage and access technologies (“Detailed Cookie Guidance”). The chapter reflects changes to PECR introduced by the DUAA, i.e. the ability to use certain non-essential cookies without consent for specific low-risk purposes. This consultation is still open and closes on 26 September 2025.
    • On 30 July 2025, the ICO published guidance on the use of profiling tools for online safety – on which see below. 
    • On 31 July 2025, the ICO published guidance on disclosing documents to the public and how to minimise the risk of accidental breaches of personal information. It contains practical steps to help you check documents for hidden personal information, guidance on choosing the format for disclosure, and videos and checklists to help you develop your own policies, procedures and training.
    • The ICO's call for views on its guidance on international transfers of personal data under the UK GDPR closed on 7 August 2025. The consultation was an opportunity for commercial stakeholders to give their views on improvements to the ICO's approach to guidance in general. Additionally, the ICO asked respondents what other tools or guidance they would like to see from the ICO to make their international data transfer obligations clearer and more accessible. It is expected that the ICO will review the responses and take on board feasible suggestions when drafting new and updated guidance on international transfers.

    Online Safety 

    On 30 July 2025, the ICO published guidance on the use of profiling tools for online safety. Although not updated for the Data (Use and Access) Act, there are a few key points for organisations to consider:

    • Do tools draw inferences from special category data? The guidance notes that in some cases, it may be possible to infer details about an individual which constitute special category data, even if it is unintentional. The ICO notes, however, that if the inference is not intentional, or the organisation does not intend to treat the user differently as a result of the inference, it is unlikely to constitute processing of special category data. 
    • Intersection with the Online Safety Act 2023 (OSA). The guidance states that "The OSA requires regulated services to set out in their terms of service if they are using ‘proactive technology’ to comply with their Online safety duties. Services are also required to explain the kind of proactive technology they use, when they use it, and how it works. Complying with this duty may help you provide the transparency to users that UK GDPR requires. However, you must also provide the necessary transparency for data protection law." The ICO is therefore placing the onus on organisations to ensure they are complying with both regimes.
    • Automated decision making. The guidance warns that certain profiling tools may constitute automated decision making which falls under Article 22 of the UK GDPR, e.g., "if it results in financial loss or discrimination" and suggests that organisations seek to understand the "full context" in which automated decisions take place.

    Our key takeaway from this new guidance is that there are now two regulators in the UK (the ICO and Ofcom) actively involved in regulating this area, producing guidance and with the potential to take enforcement action. Organisations should consider developing internal strategies and toolkits to manage any interactions with one, or both, regulators.

    On 24 July 2025, Ofcom published a reminder for tech firms on the requirement to introduce age checks to prevent children from accessing particular harmful content by 26 July 2025. 

    As part of this reminder, Ofcom also announced: 

    • A focus on age assurance for pornographic content. Ofcom noted it is "ready to enforce against any company that allows pornographic content and does not comply with age-check requirements" by 25 July 2025. Ofcom has since opened a further string of investigations in relation to compliance with the duty to prevent children from encountering pornographic content.
    • Wider enforcement action. Ofcom announced the launch of its new age assurance enforcement programme, which will specifically target sites dedicated to the dissemination of harmful content, including self-harm, suicide, eating disorder and extreme violence/gore content.
    • Extensive monitoring and impact programme. Ofcom announced the launch of an extensive monitoring and impact programme to hold sites and apps to account, focusing on the "biggest platforms where children spend the most time". Ofcom noted that this will include a comprehensive review of these platforms' efforts to assess risks to children, which must be submitted to Ofcom by 7 August, and scrutiny of these platforms' practical actions to keep children safe, details of which must be disclosed to Ofcom by 30 September.

    UK government unveils plans to prohibit ransom payments for CNI and public sector entities, alongside mandatory reporting for all private sector organisations

    Following a 12-week consultation, the UK government has announced it will progress proposals for a targeted ban on ransomware payments by public sector bodies and critical national infrastructure (“CNI”) owners and operators. In its response, the UK government announced plans to take three main proposals forward to "smash the cybercriminal business model", in the words of Home Office security minister Dan Jarvis:

    1. A targeted ban on ransomware payments for owners and operators of regulated critical national infrastructure and the public sector. 

    2. A ransomware payment prevention regime. 

    3. A mandatory incident reporting regime. 

    The government is now working to refine the scope of these proposals, including key questions around thresholds, liability and enforcement.

    EU Updates 

    EU Guidelines on Obligations for General-Purpose AI Models 

    On 18 July 2025, the European Commission published guidelines ("Guidelines") to direct providers of general-purpose AI models in meeting their obligations under the AI Act. The Guidelines entered into force on 2 August 2025. For models already on the market before this date, a transitional period applies until 2 August 2027. In the first year, the Artificial Intelligence Office will actively assist providers in meeting the regulatory requirements. From 2 August 2026, the Commission will begin enforcing these requirements.

    The Guidelines specify what constitutes a general-purpose AI model under the AI Act, clarify who qualifies as a provider, and set out the obligations in detail. They emphasise the variety of tasks performed by the models and of the AI systems into which they can be integrated, the computational resources used for training, and the manner in which a model is made available on the market. The Guidelines draw a clear line between standard general-purpose AI models and those with systemic risk, imposing stricter requirements on the latter. They also clarify the conditions under which providers can be exempted from certain obligations and how the AI Office and the European Commission enforce and supervise the obligations for general-purpose AI models. The European Commission aims to create legal certainty for everyone involved in the AI value chain and to supplement the Code of Practice for General-Purpose AI.

    Joint EDPB-EDPS Opinion on Proposed GDPR Simplification: Balancing Administrative Relief for SMEs and SMCs with Data Protection Standards 

    On 9 July 2025, the European Data Protection Board ("EDPB") and the European Data Protection Supervisor ("EDPS") issued a Joint Opinion on the European Commission’s Proposal to amend several regulations, including the GDPR, as part of the fourth simplification Omnibus. The proposal aims to streamline EU rules and reduce administrative burdens, with particular focus on benefiting small and medium-sized enterprises (SMEs) and small mid-cap enterprises (SMCs). 

    A key challenge identified by the EDPB and EDPS is maintaining the current standard of protection for natural persons' fundamental rights, in particular their right to the protection of personal data, while pursuing the simplification proposed by the Commission on 21 May 2025. The Joint Opinion addresses several aspects of the Proposal, including:

    • the extension of the Art. 30 para. 5 GDPR derogation on record-keeping for enterprises and organisations with fewer than 250 employees to those with fewer than 750; 

    • the introduction of definitions for SME and SMC; and 

    • broader application of Art. 40 para. 1 and Art. 42 para. 1 GDPR (Codes of conduct and Certifications) to SMCs. 

    The EDPB and EDPS call for further clarification on the extension of Art. 30 para. 5 GDPR, on the term 'organisations', and on the reference to SMEs and SMCs in that article, while welcoming the Commission's other amendments.

    Updates from France 

    AI: The CNIL finalises its recommendations on the development of artificial intelligence systems and announces its upcoming work  

    The CNIL has published its latest AI recommendations, clarifying GDPR applicability to models, security requirements, and conditions for annotating training data. Following a public consultation involving companies, researchers, academics, associations, and legal experts, these finalized recommendations address growing concerns about AI system development under the GDPR framework. 
     
    The opinion adopted by the European Data Protection Board (EDPB) in December 2024 recalls that the GDPR often applies to AI models trained on personal data due to their memorisation capabilities and the risk of disclosure of personal information. The CNIL's new guidance helps stakeholders determine whether their AI models fall within the GDPR's scope and proposes concrete solutions, such as clear documentation or implementing robust filters to prevent personal data processing.

    Two new practical fact sheets have been released: one on data annotation and another on ensuring secure AI system development. The new recommendations were developed with practical tools including summary sheets and checklists to help professionals verify compliance requirements.

    Looking ahead, the CNIL's 2025-2028 strategic plan outlines sector-specific work including recommendations for education, health, and workplace AI usage. In the second half of 2025, the CNIL will release new recommendations to clarify the responsibilities of actors in the AI system creation chain (model designers, reusers, integrators, etc.) under the GDPR. The authority is also developing technical tools through the PANAME project and conducting research on AI explainability to provide legal certainty while fostering innovation. 

    For more details, you can access the CNIL recommendations here (French and English versions available). 

    Right to silence in CNIL proceedings 

    The French Constitutional Council has ruled that the right to remain silent must be notified in CNIL sanction proceedings, including for legal entities (8 August 2025, n° 2025-1154 QPC, Sociétés Cosmospace et autres). It declared unconstitutional provisions of article 22 of the French Data Protection Act that failed to require informing individuals of their right to remain silent during CNIL sanction procedures. This decision, following a priority constitutional question, addresses due process rights in administrative sanctions. 

    The Constitutional Council based its decision on article 9 of the 1789 Declaration of the Rights of Man and of the Citizen, from which derives the principle that no one is required to incriminate themselves. When individuals present observations or appear before the CNIL's restricted formation, they could be led to acknowledge violations without having been informed of their right to remain silent.

    The abrogation of article 22 of the French Data Protection Act is postponed until 1 October 2026; in the meantime, persons subject to CNIL sanction procedures must be notified of their right to remain silent. This decision extends silence rights from criminal to administrative proceedings, with broader implications for regulatory enforcement.

    Employment and data access 

    The Cour de cassation has confirmed that employees may exercise their GDPR access rights over the content and metadata of their professional emails, including after termination of employment, subject to third-party rights and confidentiality restrictions (18 June 2025, n° 23-19.022). The Court ruled that professional emails sent or received by employees through their work email accounts constitute personal data under GDPR Article 4, and that employees have the right to access these emails, including their metadata (timestamps, recipients) and content, even after employment termination.

    The Court established that the right of access applies unless the requested elements would infringe third-party rights and freedoms. In that regard, employers can invoke exceptions such as business secrets, intellectual property, privacy rights, or correspondence confidentiality. Non-compliance can result in civil liability and damages.

    The CNIL maintains that access rights apply to personal data, not documents themselves, though employers may provide documents containing the data for practical purposes. The advocate general noted that access requests in employment disputes are often made to obtain evidence rather than for GDPR's intended purposes of data management, though the Court maintained its broad interpretation.  

    Update from Germany 

    NIS-2 Transposition Act: Current Legislative Status and Next Steps in Germany 

    On 25 July 2025, the German Federal Government (Bundesregierung) introduced a draft law to transpose EU Directive 2022/2555 of the European Parliament and of the Council of 14 December 2022 (the "NIS-2 Directive"). The NIS-2 Directive requires all EU Member States to adopt national cybersecurity strategies, designate central authorities and computer security incident response teams (CSIRTs), and introduce comprehensive obligations for risk management and reporting, boosting the cybersecurity of network and information systems to safeguard vital services for the EU's economy and society. It expands the range of companies in scope compared to its predecessor, the NIS-1 Directive.

    The NIS-2 Directive set a deadline of 17 October 2024 for transposing its rules. The former German Federal Government published its draft by the end of 2024. After the snap elections, the new government presented its own version in May 2025, which was submitted for approval to the Federal Council (Bundesrat) on 15 August 2025 and will still have to pass the Parliament (Bundestag).

    The law will enter into force on the day it is published in the Federal Law Gazette (Bundesgesetzblatt). 

    Spotlight on Smart Data: Comparing the UK Data Use and Access Act and the EU Data Act

    Introduction  

    Both the UK’s Data Use and Access Act and the EU Data Act represent significant legislative steps towards facilitating non-personal data sharing, with particular focus on the development of smart data schemes. While the two frameworks aim to enhance data access, competition, and innovation, they diverge in scope, approach, and regulatory detail, creating a complex and evolving legal landscape for businesses operating across the UK and EU.  

    Episode 2 of our Data Bytes podcast explores the UK’s Data Use and Access Act and its interplay with the EU Data Act, focusing on the evolving frameworks for non-personal data sharing. In the podcast (click here to listen or search for “Ashurst Legal Outlook” on Apple Podcasts, Spotify, or your favourite podcast player), Rhiannon Webster and Alexander Duisberg discuss these legislative developments. 

    Scope and Approach  

    The UK’s Data Use and Access Act introduces a sector-agnostic framework for smart data schemes, drawing inspiration from the open banking model to enable secure and efficient sharing of customer and business data. At present, the UK’s approach is high-level, with detailed obligations and requirements to be set out in future secondary legislation. In contrast, the EU Data Act, which will start to take effect this month, adopts a more prescriptive and horizontal approach, applying across all sectors but with an initial focus on connected products. The EU framework is expected to expand through additional sector-specific rules over time.  

    Data Access and Sharing Rights  

    Both the UK and EU regimes impose obligations on data holders to provide access to data. Under the EU Data Act, users of connected products are granted the right to access and share data held by manufacturers and service providers. Access for users is to be provided free of charge, while third-party sharing may be subject to cost-recovery fees. The EU Act also introduces detailed contractual and transparency requirements, with model clauses currently under development. By contrast, the UK’s framework leaves many specifics to be determined in subsequent regulations, creating a degree of uncertainty for businesses preparing for compliance. 

    Safeguards and Limitations  

    Both legislative regimes include safeguards to protect trade secrets and commercially sensitive information. The EU Data Act provides a narrow exemption for trade secrets, requiring data holders to demonstrate irreparable harm in order to withhold data. Additional restrictions are imposed on the use of shared data, including prohibitions on using such data to develop competing products and on sharing data with “gatekeepers” as defined under the EU Digital Markets Act.

    Final thoughts 

    The introduction of the UK Data Use and Access Act and the EU Data Act signals a shift towards a more open, yet carefully regulated, data sharing environment. Organisations should prepare for new compliance challenges and strategic opportunities, ensuring robust internal processes and innovative approaches to data management as these regimes take shape.

    The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
    Readers should take legal advice before applying it to specific issues or transactions.