
Data Bytes 62: Your UK and European Data Privacy update for October 2025


    Welcome back to the latest edition of Data Bytes, covering October 2025. The cyber attacks on Marks and Spencer and Jaguar Land Rover this year are the most damaging cyber attacks ever suffered by UK companies, and in October we saw the government and regulators react with some powerful messaging to UK companies. The ICO's £14 million penalty against Capita signals that security fundamentals (privilege management, lateral movement controls, and responsive alerting) are not optional hygiene but enforceable obligations under the UK GDPR. Internal security audits, least-privilege tiering, and robust penetration testing across group systems are now baseline expectations, not best practice badges.

    That message is echoed in Whitehall. A letter from Ministers to the CEOs of leading companies last month emphasises that cybersecurity is a board responsibility, anchoring governance in the Cyber Governance Code of Practice published earlier this year. The call to enrol in the NCSC's Early Warning service and to require Cyber Essentials across supply chains reinforces the fact that you are only as secure as your weakest third-party link. This is not gentle guidance; it is an action list for executive accountability.

    The Information Commissioner's opening address at the ICO's annual Data Protection Practitioners' Conference sharpened the point: resilience is dynamic and collaborative. DPOs should be shoulder to shoulder with information security teams, pressing for sustained investment in core controls and preparedness. With social engineering still the gateway to many incidents, regular phishing simulations and penetration testing must be routine.

    The call to action is clear. Build resilience now, and plan for the inevitable. Test incident response end to end, rehearse crisis communications, and ensure legal, technical, and operational teams can move at speed when, not if, the next alert lands. Cyber readiness is not just about preventing compromise; it is about recovering with purpose, compliance, and credibility.

    Moving away from cyber, we have now published all the episodes in our Data Bytes podcast mini-series on the Data (Use and Access) Act. You can listen to them all here. Keep scrolling down to our Spotlight section below, where we summarise episode 4 of that podcast, in which Rhiannon Webster, Nicolas Quoy and Shehana Cameron-Perera discuss the changes that the DUAA makes to PECR, the implications for businesses, and practical steps to prepare for the future.

    Get your Data Bytes here. 

    UK Updates

    ICO fines Capita £14 million for data security failings

    Following the pattern of increased regulatory scrutiny of data security that we have seen this year, on 15 October 2025 the ICO issued a monetary penalty of £14 million (collectively) to Capita plc and Capita Pensions Solutions Limited (CPSL). The cyber incident which led to this fine occurred in March 2023 and involved the exfiltration of sensitive data (such as special category data, criminal record information and children's data) of 6,656,037 individuals across the Capita Group.

    The monetary penalty was issued because the processing of pensions administration and human capital resourcing data was not protected by the technical and organisational measures and appropriate security processes expected under the UK GDPR. Specifically, from as early as May 2018 (for Capita plc) and September 2022 (for CPSL) until the incident, the Capita Group did not have in place measures to properly prevent privilege escalation and unauthorised lateral movement through its network, or to respond to security alerts.

    Following the incident, Capita plc implemented security improvements which were deemed to have addressed the previous deficiencies. Key takeaways for organisations are:

    • Undertake internal audits of security for all business units; 

    • Implement a tiering model for administrative accounts following the principle of ‘Least Privilege’; and 

    • Establish a penetration testing programme for all internal group systems.

    Note that vulnerability scans do not replace the need for penetration testing. 

    UK Ministers publish letter to leading CEOs about cybersecurity responsibilities

    On 13 October 2025, a ministerial letter titled "making cyber security a board responsibility" was published (the Letter). The Letter was sent to leading UK companies (including all FTSE 100 and FTSE 250 companies) in light of recent cyber incidents, and details three specific requests that ministers put to those organisations:

    • Make cyber risk a Board-level priority using the Cyber Governance Code of Practice: The Letter directs decision makers to the Cyber Governance Code of Practice, which was developed in conjunction with industry leaders. Organisations' cyber security practices should be guided by this code, and the ministers re-emphasised the availability of free training that supports the code.

    • Sign up to the NCSC's Early Warning service: The National Cyber Security Centre (NCSC) offers a free service (known as Early Warning) that informs organisations of potential cyber attacks on their networks. Organisations are encouraged to sign up to this service.

    • Require Cyber Essentials in your supply chain: Cyber Essentials is a government-backed certification scheme that assesses whether organisations have key cyber protections. Organisations with Cyber Essentials are 92% less likely to make a claim on their cyber insurance. The Letter reminds decision makers that supply chain-related cyber attacks can have a material impact on organisations, and that immediate suppliers should be required to possess a Cyber Essentials certification.

    The relevant organisations were encouraged to confirm receipt of the Letter, and provide details of relevant senior contacts that the government may engage with.  

    ICO consults on investigations and enforcement guidance

    On 31 October, the ICO issued a consultation on the processes it will follow when investigating breaches of the UK GDPR and the Data Protection Act 2018. The guidance is significantly more detailed than the previous guidance on this topic and explains how the ICO will apply the new investigative powers gained under the Data (Use and Access) Act 2025, including the commissioning of reports from approved persons.

    To find out more on these new ICO powers, listen to our latest Ashurst Data Bytes Podcast here.

    ICO publishes its own internal AI use policy  

    Earlier this month, the ICO announced it had published its internal AI use policy, to demonstrate the ICO's commitment to harnessing AI in an ethical, transparent way and to give businesses more confidence and regulatory certainty by showing how the ICO is using AI internally.

    The ICO internal AI use policy can also be a helpful steering document for businesses producing and reviewing their own AI use policies, and includes considerations on: 

    • accountability, decision-making and governance, particularly in the procurement context;  

    • proportionality; 

    • impact assessment, fairness and explainability;

    • safety, security and robustness;

    • AI performance monitoring; and

    • transparency and documentation. 

    Other takeaways that may be interesting to businesses are that the ICO internal AI use policy:

    • Acknowledges that ICO senior leadership should ensure all staff have access to high-quality general AI literacy training, and that all ICO staff who will use a planned AI deployment are given relevant training prior to deployment;
    • Incorporates practical guidance for ICO staff using AI, such as on being transparent about AI use as appropriate and proportionate, including being clear on and marking where work produced includes AI generated outputs; and
    • Includes a template AI screener and full AI use case specification for the relevant ICO stakeholders to populate, as part of the ICO’s processes on approving AI solutions for development, procurement or deployment. 

    Clearview AI: Upper Tribunal clarifies UK data protection’s extraterritorial reach

    On 7 October 2025, the UK Upper Tribunal handed down a judgement in the ICO's appeal against the First-tier Tribunal decision concerning Clearview AI's data scraping and facial recognition practices.

    Despite having no UK presence and serving only non-UK public sector clients, Clearview’s processing activities were found by the Upper Tribunal to fall under the material and territorial scope of the UK GDPR.  In reaching this decision, the Upper Tribunal adopted a broad view of “monitoring the behaviour” of UK data subjects. It noted that monitoring did not require “active watchfulness” and found that creating a persistent facial recognition database constitutes ongoing monitoring, meaning UK GDPR can apply even if a provider has no UK customers and processes data entirely outside the UK.

    The judgement represents a success for the ICO, which noted in a statement that the ruling gives greater confidence to people in the UK that the ICO can and will act on their behalf, regardless of where the company handling their personal information is based. The First-tier Tribunal must now determine Clearview's substantive appeal against the £7.5 million fine issued by the ICO in 2022.

    Key Takeaways from the Information Commissioner’s Opening Address at DPPC 2025

    On 14 October 2025, the UK Information Commissioner, John Edwards, delivered the opening address at the annual Data Protection Practitioners’ Conference. The key messages for organisations are outlined below:

    • Information Commission: From April 2026, the ICO will transition to become the Information Commission, signalling a shift in governance structure. However, organisations should not notice much difference in the services provided by the ICO during this transition.

    • DUAA: The ICO will issue updated guidance to reflect the changes introduced by the Data (Use and Access) Act 2025. The ICO is prioritising the publication of guidance on key topics, including international data transfers and automated decision-making.

    • Cyber resilience: Organisations should remain agile in responding to the increasing threat of cybercrime and focus on reducing the likelihood of successful attacks:  

      • Data Protection Officers should collaborate closely with their information security teams to assess whether their organisation has invested adequately in the fundamentals of cyber security, or if this needs to be addressed as a matter of urgency.

      • Many cybercrime incidents are caused by social engineering, where criminals deceive employees into revealing credentials. Organisations should conduct regular penetration testing and phishing simulations to identify and address vulnerabilities.  

    ICO Puts Imgur in the Frame

    On 30 September, the ICO issued an update on its ongoing investigation into MediaLab AI Inc (MediaLab), following concerns about the handling of personal data in the UK (especially children's personal data) by its social media platform Imgur.

    The ICO has confirmed that it has reached provisional findings in its investigation into MediaLab, and it issued a notice of intent to impose a monetary penalty on MediaLab on 10 September 2025. Following this, the ICO confirmed that MediaLab has restricted UK users' access to Imgur.

    In its statement, the ICO reiterates its commitment to robust enforcement where organisations fall short of their data protection responsibilities. While no final findings or enforcement action has been announced at this stage, the ICO's update serves as a timely reminder for all organisations, especially those processing children's personal data, of the importance of clear privacy notices, robust security measures, and demonstrable compliance with the UK GDPR.

    Organisations which do, or may, process children's personal data should monitor this ongoing investigation closely (along with other ICO investigations, such as that into Reddit), and should review internal compliance procedures to ensure that privacy notices are drafted to be sufficiently clear and transparent for children and that DPIAs have been completed for all processing involving children's personal data.

    EU Updates 

    EDPB endorses the UK adequacy decision until December 2031

    On 20 October, the EDPB adopted its opinion endorsing the European Commission's draft decision to extend the UK's adequacy decision until December 2031. This means that data can continue to flow from the EU to the UK without the need to put in place additional safeguards such as the EU SCCs.

    The EDPB acknowledged that the UK's data protection framework remains closely aligned with European standards, even in light of recent modifications to UK legislation in the form of the Data (Use and Access) Act 2025. According to the EDPB, the majority of the amendments made to the UK's data protection framework are intended to provide greater clarity and support adherence to the law.

    Notwithstanding this, the EDPB: (i) urged the Commission to monitor areas of potential divergence, such as the Secretary of State's new regulatory powers, the UK's rules for data transfers to third countries, and the use of technical capability notices that could undermine encryption; and (ii) encouraged the Commission to continuously monitor and evaluate any changes to the structure and powers of the ICO.

    EDPB selects compliance with transparency obligations as fifth co-ordinated enforcement action

    During its October plenary, the EDPB selected compliance with transparency and information obligations under Articles 12, 13, and 14 of the GDPR as the focus of its fifth coordinated enforcement action.

    • Article 12 (transparent communication and modalities) - The enforcement action will likely assess whether organisations are meeting these standards in their privacy notices and other communications with data subjects.

    • Article 13 (information to be provided where data are collected from the data subject) - The action will focus on whether organisations are providing all required information at the time of data collection and whether this information is sufficiently detailed and understandable.

    • Article 14 (information to be provided where data are not obtained from the data subject) - The enforcement action will review how organisations fulfil these obligations, particularly in cases involving indirect data collection, and whether data subjects are adequately informed about the origin and use of their data.

    The coordinated enforcement action will be carried out under the EDPB's Coordinated Enforcement Framework (CEF), which was established in October 2020 to harmonise supervisory activities across EU Member States. Under this framework, Data Protection Authorities from different countries are invited to participate voluntarily. Those that choose to join will prioritise national-level checks on how organisations provide information to data subjects regarding the processing of their personal data. The results of these national assessments will be aggregated and analysed by the EDPB to generate comprehensive insights into the state of transparency and information practices across the EU. The action is scheduled to launch in 2026, following a recruitment period for participating DPAs in the coming weeks.

    EU Commission announces stakeholder event on anonymisation and pseudonymisation

    The EU Commission is organising an event at the end of the year to collect stakeholders' input on anonymisation and pseudonymisation, following the clarification of the concept of personal data provided by the Court of Justice of the European Union ("CJEU") in its judgement in EDPS v Single Resolution Board (SRB) (see our article on this ruling here). The SRB ruling will likely prompt the EDPB to revamp its draft pseudonymisation Guidelines and will most likely also influence the drafting of the EDPB's long-awaited Guidelines on anonymisation.

    Ashurst's Digital Economy Team is supporting stakeholders at that event; please contact us if you are interested in learning more.

    LinkedIn's AI plans for use of publicly available data under regulatory scrutiny

    LinkedIn has announced that it will start using EEA user data to train its AI models from 3 November 2025 onwards. Such user data will include public profile information and user content but will exclude private messages and data from minors. This approach follows Meta’s use of public data from Instagram and Facebook users to train its AI.

    Authorities have responded swiftly:

    • The Irish DPC will lead the case under the one-stop-shop mechanism and assess LinkedIn’s approach in coordination with other EU regulators. 

    • German authorities have expressed concerns on whether this approach is compliant under the GDPR. 

    • Norway’s DPA confirmed the update and asked LinkedIn to justify its legal basis, clarify use of non-minor content, and explain opt-out options for business users. 

    • The Dutch DPA urged users to review settings and confirmed it has received complaints. 

    • The Belgian DPA has published instructions for objecting, warning that AI cannot “unlearn” data once used.  

    • Croatia's DPA (AZOP) reminded users of their right to object, which can be exercised via account settings until 3 November 2025.

    This is a reminder to organisations that using individuals' personal data to train AI models has significant data protection implications and will attract regulatory scrutiny. Ensure that a DPIA is completed for any similar use, paying close attention to lawful basis and individuals' right to object.

    New rules for AI incidents

    On 26 September, the EU Commission issued new draft guidance and a reporting template on serious AI incidents ("draft Guidance"). However, it does not cover AI systems in medical devices or certain regulated sectors, such as vehicles or aviation, as the AI Act exempts those sectors from the reporting rules. The obligation to report serious incidents under the AI Act applies from the date on which the relevant provisions of the AI Act become applicable (i.e. 2 August 2026, if the EU does not postpone its application).

    The draft Guidance clarifies that:

    • providers and deployers of high-risk AI systems must report "serious incidents" (including deaths, major health impacts, irreversible disruptions to critical infrastructure, or significant breaches of fundamental rights) immediately, and in any event within statutory periods ranging from 2 to 15 days depending on the severity of the incident; and

    • "Widespread infringements", meaning large-scale violations, must also be reported without delay.

    Companies providing or deploying high-risk AI systems should: 

    • review and update their reporting procedures and internal workflows; and 

    • note that incident reporting obligations under NIS2, DORA and the GDPR will also apply. Companies should therefore map their reporting obligations and adjust their reporting procedures accordingly.

    EU Turns Up the Heat: Major Platforms Face DSA Scrutiny Over Child Protection

    On 10 October, the EU Commission, following its Guidelines on the Protection of Minors under the Digital Services Act (DSA), launched investigations into Snapchat, YouTube, the Apple App Store and Google Play Store, focusing on child protection and demanding answers on age verification and content moderation.

    The Commission demanded: 

    • that Snapchat demonstrate how it blocks minors under the age of 13 from its platform;

    • that YouTube explain its recommender system, after reports became public about harmful content reaching minors; and

    • that Apple and Google disclose how they prevent illegal or harmful apps (including gambling apps or non-consensual sexual content tools) from reaching minors.

    The Commission also seeks detailed information on how app age ratings and age verification are applied. Platforms must review and strengthen their systems to better protect minors. The Commission is collaborating with national authorities to identify high-risk platforms.

    European Commission and EDPB publish joint draft Guidelines on DMA-GDPR interplay

    On 9 October, the European Commission and the EDPB published their joint (draft) guidelines on the interplay between the Digital Markets Act (DMA) and the GDPR. The draft guidelines are open for consultation until 4 December 2025, and the final guidelines should be adopted in 2026.  This collaborative effort aims to ensure a coherent and effective application of the two regimes and provide guidance to gatekeepers on how to interpret and comply with these two sets of rules. In particular, the Guidelines: 

    • build on the inherent complementarity and interplay between the DMA and the GDPR, which both seek to safeguard individual rights in digital environments: fairness and contestability under the DMA, and privacy and data protection under the GDPR. In particular, they clarify that the rights and remedies available to data subjects under the GDPR remain unaffected by DMA procedures, e.g. the action and complaint procedure;

    • clarify how gatekeepers can comply with both frameworks simultaneously, addressing the cross-use of personal data across core platform services, data portability, access requests, and the interoperability of messaging services;

    • explain that where intermediary service providers process personal data in the context of addressing illegal content, any processing of personal data under the DSA must still be both necessary and proportionate;

    • address the intersection of the DSA's provisions on deceptive design patterns and recommender systems with the GDPR's requirements for transparency and profiling. The EDPB underscores that certain content recommendations may qualify as automated decision-making under the GDPR, requiring controllers to implement and document additional safeguards;

    • provide practical guidance on how specific DMA provisions should be applied consistently with the GDPR. For example, they outline how gatekeepers must obtain valid consent before cross-using personal data, ensure effective anonymisation in search data sharing, and respect data minimisation and proportionality principles when implementing interoperability obligations; and

    • emphasise close coordination between the European Commission and data protection supervisory authorities to avoid conflicting enforcement and ensure a coherent interpretation of both regulatory frameworks. 

    Updates from France 

    Ruling that the right to rectification does not include subjective assessments

    On 30 September, the Conseil d'État ruled (French only) that the right to rectification cannot be applied to assessments and other subjective personal data, in particular professional assessments. The ruling was issued in a case regarding a rectification request concerning professional assessment forms used to determine whether or not to grant a disability compensation benefit.

    In this decision, the Conseil d'État recalled that, pursuant to Article 16 GDPR, the data subject has the right to have inaccurate personal data rectified, having regard to the purposes of the processing, where such rectification is not likely to affect those purposes. It also stated that the data subject can obtain the completion of incomplete personal data where the incompleteness is likely to compromise the purposes of the processing.

    The CNIL publishes its final version of its transfer impact assessment (TIA) guide

    Following a public consultation and the recommendations of the EDPB, the CNIL has published the final version of its TIA guide for controllers and processors (Guide). This version includes new analyses and contributions, in particular the most recent opinions of the EDPB. Whilst not mandatory, it serves as a methodology outlining the steps that should be taken when conducting a TIA.

    The CNIL recommends the following six-step process to carry out a TIA, and provides example tables to be completed at each step (a short hypothetical illustration follows the list below):

    • Step 1: Description of the transfer (e.g. name of the exporter, contact details, destination country, name of the importer, importer qualification in the context of data transfer, type and frequency of transfer, categories of data subjects, etc.).

    • Step 2: Identification of the transfer tool used, in accordance with Article 46 of the GDPR.

    • Step 3: Assessment of the legislation and practices in the third country where the data will be transferred and the effectiveness of the transfer tool (e.g. data protection framework applicable to the importer, scope of data protection framework, independence of the data protection authority, rights of the data subjects, remedies and sanctions, etc.). 

    • Step 4: Adoption of additional measures, where required: if Steps 1 to 3 reveal shortcomings, additional measures might be necessary and Step 4 should then be followed (e.g. description of additional technical, organisational or contractual measures).

    • Step 5: Implementation of additional measures, where required: after completing Step 4, an action plan might be necessary to implement additional measures. 

    • Step 6: Regular reassessment of the transfer tool: the transfer tool and, where required, the additional measures should be reassessed periodically. 
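
    By way of hypothetical illustration, a French controller using a US-based cloud provider would describe that transfer under Step 1, identify its Article 46 transfer tool (for example, the EU SCCs) under Step 2, assess US law and practice under Step 3, and, if shortcomings are revealed, adopt and implement supplementary measures such as encryption under Steps 4 and 5, before reassessing the arrangement periodically under Step 6.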

    Where your organisation makes any transfers from France, we strongly advise that such transfers and TIAs are documented in compliance with the CNIL Guide. 

    Updates from Spain

    DIGI Spain Telecom fined €200,000 for duplicating SIM cards without proper identity verification

    The Spanish Data Protection Agency ("AEPD") has fined DIGI Spain Telecom ("DIGI") €200,000 following a complaint filed by a DIGI customer. The complainant reported that, in October 2022, a third party using a manipulated ID fraudulently obtained duplicates of his SIM card at DIGI distributor establishments. This gave the fraudster access to his personal and banking data and enabled fraudulent transfers and operations which resulted in the theft of significant amounts of money.

    The AEPD concluded that DIGI infringed Article 6(1) GDPR when issuing SIM duplicates to a third party without proper identity verification and without a valid legal basis. The AEPD classified this infringement as very serious and argued that: 

    • issuing SIM duplicates involves the processing of personal data, as the SIM card and associated data (name, surname, ID, phone number) allow the identification of the holder; 

    • DIGI did not effectively apply its own security protocols, since the back office agents did not properly verify the authenticity of the presented ID, which enabled the impersonation;  

    • the existence of DIGI's internal protocols does not result in the exemption from responsibility if they are not followed or prove ineffective; and 

    • DIGI’s actions were deemed negligent, as the company failed to detect the anomaly or set up alerts to prevent repeated fraud. The AEPD also rejected DIGI’s arguments regarding lack of fault and absence of economic benefit, noting that corporate responsibility is based on the required diligence, not on the outcome. 

    The AEPD concluded that the repetition of this type of conduct (DIGI had been sanctioned on several occasions for substantially identical acts) reveals a lack of effective correction of internal procedures and an insufficient assumption of legal obligations, therefore requiring a more severe sanction.

    Spotlight on Data Protection and Privacy: Key Changes to PECR and What’s Next  

    Introduction 

    The UK’s Data (Use and Access) Act (DUAA) is not just a technical update to the country’s data protection landscape; it marks a significant shift in how marketing practices, cookie use, and enforcement under the Privacy and Electronic Communications Regulations (PECR) will be approached.

    In episode 4 of Ashurst’s Data Bytes podcast, Rhiannon Webster, Nicolas Quoy and Shehana Cameron-Perera discuss the changes that the DUAA makes to PECR, the implications for businesses, and practical steps to prepare for the future. 

    Cookie Use: A New Direction 

    The DUAA introduces a more risk-based and flexible approach to the use of cookies. Firstly, it sets out a list of what will be deemed strictly necessary cookies:

    • cookies that ensure the security of the terminal equipment; 

    • cookies that prevent or detect fraud, or detect technical faults; 

    • cookies that automatically authenticate the identity of the user; and 

    • cookies that record information or selections that the user has made on an online service.

    Secondly, it introduces additional exceptions to obtaining consent for other cookies. When in force, consent will no longer need to be obtained for cookies that are used:

    • for statistical purposes to make improvements to the service or website; 

    • to adapt the website to reflect and save preferences or to improve appearance and functionality; and 

    • to identify the location of a subscriber or user requiring emergency assistance. 

    However, for the analytics and appearance cookies, users will still have to be given information about those cookies and a means to object.
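
    For example, under the new regime a first-party analytics cookie used solely to gather statistics on how visitors use a website would, once the provisions are in force, no longer require prior consent, provided users are informed about the cookie and given a straightforward means to object.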

    PECR Fining Regime: Higher Stakes for Non-Compliance with Marketing and Cookie Laws 

    One of the most immediate and impactful changes brought by the DUAA is the increase in maximum fines for breaches of PECR, which governs marketing and cookie practices. The new regime aligns PECR penalties with those under the UK GDPR, meaning that organisations can now face fines of up to £17.5 million or 4% of global annual turnover, whichever is higher. This significant increase in potential liability underscores the importance of robust compliance programmes and regular reviews of marketing practices.
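
    By way of illustration, an organisation with a global annual turnover of £1 billion (a hypothetical figure) could face a maximum PECR fine of £40 million, since 4% of that turnover exceeds the £17.5 million fixed cap; for organisations whose 4% figure falls below £17.5 million, the fixed amount applies instead.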

    Looking Ahead: Preparing for Change 

    While the DUAA does not immediately rewrite the rules for marketing and cookies, it sets the stage for further regulatory evolution. Organisations should: (i) review existing marketing practices in light of the increased penalties and reconsider risk appetites; and (ii) in respect of use of cookies, prepare for a future in which cookie banners and consent mechanisms will become less burdensome.  

    The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
    Readers should take legal advice before applying it to specific issues or transactions.