Data Bytes 61: Your UK and European Data Privacy update for September 2025
08 October 2025
Welcome back to our September/October edition of Data Bytes. At the time of writing, the Data Bytes team has just launched the final episode of our podcast series on the UK’s new Data (Use and Access) Act (“DUAA”). This sixth episode concentrates on the new enforcement and information gathering powers of the Information Commission and the potential effect on businesses. You can listen to the full series of Ashurst Data Bytes here. Keep scrolling to our spotlight section, where we summarise Episode 3 of our series on AI, Data Protection, and IP.
There are still plenty of DUAA stories this month, with the ICO gearing up its new suite of guidance. You will see from our alerts below that there are many consultations you could respond to, helping shape the ICO’s view on the practical implications of the new law.
You’ll also see a flurry of enforcement action in relation to data subject rights requests (“DSRs”). Now could be the time for organisations to revisit their DSR practices, particularly in conjunction with the implementation of the new complaints handling processes required by the DUAA.
Moving over to Europe, you will also find our digests of the Latombe judgment (where the EU-US Data Privacy Framework survived legal challenge) and the SRB judgment (where the CJEU ruled that pseudonymised data in the hands of the controller can be anonymised data in the hands of the processor, provided the processor does not have the means reasonably likely to be used to identify the natural person).
Get your data bytes here (and please listen to our podcast too!)
In response to changes brought in by the Data (Use and Access) Act 2025 (DUAA), which will be in force by January 2026 or by June 2026 at the latest, the ICO launched two public consultations on 21 August to help shape its final guidance on ‘recognised legitimate interests’ and ‘data protection complaints’.
Recognised legitimate interests:
Data protection complaints:
Keep a watching brief on both consultations and the final guidance, as they will be key to understanding the ICO’s expectations on when you can rely on recognised legitimate interests and how organisations should handle data subject complaints.
We also recommend waiting for the final guidance before operationalising the complaints requirement and/or updating privacy notices and ROPAs to rely on recognised legitimate interests as the guidance may change as a result of the consultation process.
The ICO is also consulting on how it handles data protection complaints (the consultation launched on 22 August and is open until 31 October). The draft proposals in the consultation represent the most significant overhaul of the ICO’s complaints handling model since the UK GDPR came into force and are designed to address a sharp rise in case volumes while ensuring that regulatory effort is directed to the areas of greatest harm.
At the heart of the consultation is a draft framework that would allow the ICO to decide, at an early stage, the extent to which each complaint merits investigation; the ICO would assess each complaint at the outset and pursue only those suggesting serious, systemic or high-impact infringements. Matters involving minor harm, or where the controller has already engaged with the individual, could be logged for intelligence purposes and closed, freeing resources for cases that genuinely warrant intervention.
The consultation is designed to dovetail with the new DUAA complaints obligations, described above. For businesses, the message is twofold:
The ICO is warning the public to be on their guard against unlawful 'robo' calls, having recently fined Home Improvement Marketing Ltd (HIM) £300,000 and Green Spark Energy Ltd (GSE) £250,000 for making 9.5 million automated marketing calls. Both firms used ‘robo call’ technology in the form of avatar software to give call recipients the impression they were talking to a staff member in the UK, when in fact they were hearing scripted lines recorded by voice actors and played by call agents abroad. The tell-tale signs of ‘robo calls’ are: slight pauses before responses, limited flexibility due to a heavily crafted script, identical voice/tone across calls, and no background noise.
Robo and scam calls have become increasingly prevalent in recent years, and these fines demonstrate the ICO’s willingness to crack down on them. In the ICO’s blog about this enforcement action, it urges the public to “take note of our tips to spot these robo calls so they can tell us when they’ve received one. This will help us investigate and take enforcement action.” If your organisation makes automated calls or uses such software, take heed of this ICO warning and review your marketing practices and risk appetite, particularly as the Data (Use and Access) Act will amend PECR to align its fining regime with that under the UK GDPR. So, whilst marketing fines are currently capped at £500,000, once the relevant provisions come into force (expected January 2026, or June 2026 at the latest), the ICO will be able to impose significantly larger fines, and perhaps it will start flexing those powers on robo calls!
On 22 August, the Court of Appeal ("CoA") overturned the High Court's decision in Farley v Paymaster 1836 Ltd (t/a Equiniti). The CoA ruled that:
Organisations should take heed of the judgment, noting the following significant implications:
In September 2025, John Blake, the director of Bridlington Lodge Care Home, was fined £6,540 for refusing to respond to a Data Subject Access Request (DSAR) made on behalf of a resident. Unsurprisingly, Mr Blake's refusal to respond to, let alone fulfil, the DSAR resulted in a complaint to the ICO. Upon investigation, Mr Blake was unable to explain the lack of response, and a court found him in contravention of s. 173 of the Data Protection Act 2018 (DPA).
Pursuant to s. 173 DPA, it is a criminal offence to alter, deface, block, erase, destroy or conceal information with the intention of preventing its disclosure in response to a DSAR. Criminal liability can lie with the controller entity or with individual directors. This prosecution is a stark reminder to all directors and employees that data protection compliance is not just an obligation for the controller organisation; it can have serious repercussions for them personally too.
Carrying on with the theme of DSARs, on 24 September the ICO issued an enforcement notice to Bristol City Council (the Council) following its repeated failures to comply with DSARs within the statutory deadlines. The ICO had been engaging with the Council since February 2023 following multiple complaints it had received: 170 DSARs had not been responded to within the statutory deadline, and the oldest outstanding DSAR dated from 2022. Whilst the Council had taken steps to address the backlog, such as (i) engaging an external organisation for assistance; (ii) increasing staffing on its response teams; and (iii) creating an action plan focusing on training and resource deployment, these were deemed insufficient.
The enforcement notice imposed by the ICO requires the Council to:
contact all people with overdue DSARs to notify them of delays;
respond to outstanding DSARs by set deadlines (with the oldest cases, from 2022, to be resolved within 30 days);
provide weekly progress updates to the ICO until all overdue DSARs are resolved;
create a new action plan within 90 days to address the DSAR backlog, including clear responsibilities, prioritisation and timelines; and
make system and process changes within 12 months to ensure future DSARs are identified and completed on time.
Key takeaways for all organisations, which informed the ICO's decision, are:
DSAR Management – ensure you have a robust procedure for tracking DSARs against statutory deadlines (see the minimal sketch after this list).
Communication – communicating with data subjects is key. Regularly update data subjects on timelines and explain any delays in responding to DSARs.
Active management of backlogs – if you have any late DSARs, implement a plan to actively address the backlog within a reasonable timeframe.
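By way of illustration only, here is a minimal sketch of a DSAR deadline tracker. It is our own example, not ICO guidance: the statutory deadline is modelled simply as one calendar month from receipt, extendable by up to two further months for complex requests, and the class and field names are hypothetical.

```python
from dataclasses import dataclass
from datetime import date
from dateutil.relativedelta import relativedelta  # pip install python-dateutil

# Illustrative sketch: deadline modelled as one calendar month from receipt,
# extendable by up to two further months for complex requests (simplified).

@dataclass
class DSAR:
    reference: str
    received: date
    extended: bool = False  # whether a complexity extension has been applied

    @property
    def due(self) -> date:
        months = 3 if self.extended else 1
        return self.received + relativedelta(months=months)

    def is_overdue(self, today: date) -> bool:
        return today > self.due

requests = [
    DSAR("DSAR-001", date(2025, 8, 1)),
    DSAR("DSAR-002", date(2025, 9, 10), extended=True),
]

today = date(2025, 9, 30)
for r in requests:
    status = "OVERDUE" if r.is_overdue(today) else f"due {r.due}"
    print(f"{r.reference}: {status}")
```

Even a simple register like this makes overdue requests visible early, which is precisely the gap the Bristol City Council enforcement notice targets.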
In February, the ICO published draft guidance for organisations eyeing a ‘consent or pay’ cookie model, under which users must either agree to all cookies or pay a subscription to browse without non-essential cookies. See our previous data byte on this.
Meta has been at the centre of this debate, having recently been fined €200 million in the EU for breaching the Digital Markets Act (but not for breaching the EU GDPR or PECR). Now, after discussions with the ICO, Meta is set to roll out ‘consent or pay’ across its UK services. The ICO has released a statement, highlighting that:
it is pleased Meta is moving away from making consent to targeted ads a condition of using Facebook and Instagram, a practice previously flagged as non-compliant with PECR;
it recognises that online platforms need to be commercially viable for organisations (a key driver for advocates of ‘consent or pay’), but insists users must have “meaningful transparency and choice” about their data; and
it expects Meta to assess the impact of this new model, especially how users respond, to ensure ongoing compliance with UK law. The ICO will be watching Meta’s roll-out - and the wider market - closely.
Any organisation considering ‘consent or pay’ should keep a close eye on these developments. Those already using this model should consider undertaking the impact assessments the ICO expects of Meta. Additionally, organisations with global or UK/EU websites should watch for diverging approaches between the UK and EU, especially regarding any EU Digital Markets Act requirements they may trigger. The upcoming Data (Use and Access) Act, which will exempt some cookies from opt-in consent, could also shake up how ‘consent or pay’ works, potentially affecting what can be charged when fewer cookies need consent.
On 3 September 2025, the EU General Court ("GC") ruled on a case brought by Philippe Latombe, a French Member of Parliament (GC, T-553/23, Latombe v Commission). Latombe challenged the European Commission’s adequacy decision for the US Data Privacy Framework ("DPF"), arguing that the US legal system lacked judicial independence, effective redress and protection against government surveillance.
The GC dismissed the challenge. In particular, the GC held that the US Data Protection Review Court ("DPRC") is independent and provides adequate safeguards. Further, the GC held that US intelligence agencies are now subject to strict legal requirements and ex-post judicial review when collecting data.
The GC's ruling marks a shift from earlier European Court of Justice ("ECJ") rulings, including Schrems I (ECJ, C-362/14, 6 October 2015) and Schrems II (ECJ, C-311/18, 16 July 2020). It remains unclear whether Mr Latombe will appeal the decision to the ECJ, but for now, transfers to the US under the DPF can continue!
On 4 September, the Court of Justice of the European Union ("CJEU") ruled that pseudonymised data in the hands of the controller can be anonymised data in the hands of the processor, provided the processor does not have the means reasonably likely to be used to identify the natural person directly or indirectly, considering all relevant circumstances and technical safeguards.
The CJEU's ruling follows an action by the Single Resolution Board ("SRB"), the EU central authority responsible for bank failure management within the Banking Union, against the European Data Protection Supervisor ("EDPS"). By way of background, the Spanish bank Banco Popular Español had entered a resolution process under the control of the SRB, and the SRB transferred pseudonymised comments of former creditors, shareholders and other data subjects to Deloitte. The SRB did not inform the data subjects about the transfer of their comments to Deloitte, and the EDPS argued that the SRB had breached the GDPR's information obligations by not doing so.
This is a landmark case clarifying the distinction between pseudonymised and anonymised data; it should be considered when sharing data with third parties and read in conjunction with the EDPB’s draft guidelines on pseudonymisation.
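To illustrate the distinction in practice, below is a minimal, purely illustrative sketch of keyed pseudonymisation (our own example, not drawn from the judgment; the names and the HMAC approach are assumptions). The controller retains the secret key and mapping table, so the data remains personal data in its hands; the recipient sees only pseudonyms and, if it has no means reasonably likely to re-identify individuals, may effectively be handling anonymous data.

```python
import hashlib
import hmac
import secrets

# Hypothetical illustration: a controller pseudonymises comments before
# sharing them with a third party, retaining the re-identification key.

SECRET_KEY = secrets.token_bytes(32)  # held by the controller only

def pseudonymise(name: str) -> str:
    """Derive a stable pseudonym from a name via a keyed hash (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, name.encode(), hashlib.sha256).hexdigest()[:12]

comments = [
    ("Alice Example", "I object to the valuation."),
    ("Bob Example", "The process lacked transparency."),
]

# What the recipient receives: pseudonym plus comment, no direct identifiers.
shared = [(pseudonymise(name), text) for name, text in comments]
print(shared)

# Only the controller can rebuild this mapping, so in its hands the
# shared dataset remains pseudonymised personal data.
key_table = {pseudonymise(name): name for name, _ in comments}
```

Whether the recipient's copy is truly anonymous will still turn on the facts, for example whether the comments themselves identify their authors.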
The Austrian consumer protection association (Verbraucherschutzverein, "VSV") has initiated a class action before the Higher Regional Court of Hamburg (OLG Hamburg, 11 VKl 1/25). VSV claims that Meta tracks user activity across thousands of websites and apps using hidden programs linked to Instagram and Facebook, obtaining sensitive user data, including information about health, religion and sexuality.
VSV seeks EUR 5,000 in damages for each adult user and EUR 10,000 for each minor user. In addition to monetary compensation, the VSV demands that Meta deletes user data, ceases further data collection, and provides detailed information about its data practices.
Meta has publicly rejected all of VSV's claims as unfounded and has stated to the German Press Agency that it will contest the allegations vigorously.
This class action follows a recent decision of the Regional Court of Leipzig dated 4 July 2025, which found that Meta Business Tools, as implemented on Facebook and Instagram, can cause non-material damage to users (LG Leipzig, 05 O 2351/23). The court ruled that a user's feeling of constant observation constitutes compensable non-material damage under the GDPR and awarded EUR 5,000 in damages.
With the Data Act taking effect on 12 September 2025, the EU Commission published an updated version of the FAQ ("FAQ-Update") and new Guidance on Vehicle Data under the Data Act ("Guidance").
The FAQ-Update: Key Practical Changes
The Guidance on Vehicle Data: Clarifications for the Automotive Sector
Purpose of the Guidance: The European Commission's Guidance assists automotive stakeholders, including manufacturers (OEMs), suppliers, aftermarket service providers, and insurance providers, in understanding and applying the data access and sharing rules.
Scope: the Guidance only covers vehicles that qualify as connected products, and related services. Related services include remote vehicle control services that activate or perform vehicle functions, and non-regular repair and maintenance services that use a bi-directional data exchange between the vehicle and the service provider to change or add vehicle functions.
Limits on Data Sharing Obligations: Data holders must provide access only to 'raw' and 'pre-processed' vehicle data, plus the necessary metadata, and not to information inferred or derived from this data, such as insights generated through proprietary algorithms or complex processing. The Guidance provides the following examples in the automotive sector (see the illustrative sketch after these examples):
In scope data: sensor signals, sound waves captured by microphones and data directly resulting from manual commands.
Out of scope data: vehicle speed, flow rates, state or condition of a vehicle system or component obtained from the processing of raw data.
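To make the raw/derived distinction concrete, here is a short, hypothetical sketch (the sensor parameters and figures are our own assumptions, not taken from the Guidance) showing how an in-scope raw sensor signal becomes out-of-scope derived data once processed:

```python
import math

# Hypothetical example: raw wheel-speed sensor pulses (in scope under the
# Data Act) versus vehicle speed derived from them (out of scope).

WHEEL_DIAMETER_M = 0.65        # assumed wheel diameter in metres
PULSES_PER_REVOLUTION = 48     # assumed sensor resolution

raw_pulses_per_second = 120    # the 'raw' signal a data holder must share

# Processing the raw signal yields vehicle speed, which the Guidance
# treats as derived data falling outside the sharing obligation.
revolutions_per_second = raw_pulses_per_second / PULSES_PER_REVOLUTION
speed_m_s = revolutions_per_second * math.pi * WHEEL_DIAMETER_M

print(f"Raw signal: {raw_pulses_per_second} pulses/s (in scope)")
print(f"Derived speed: {speed_m_s * 3.6:.1f} km/h (out of scope)")
```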
User Access and Data Holder Responsibilities: The Commission reminds data holders in the automotive sector that user access to, and use of, product and service data – including sharing with third parties – is the core feature of Chapter II. Data holders must provide users with access to product and service data. If direct access is not possible, they must provide indirect access to readily available data. This includes data from a connected vehicle sent to an OEM’s backend server, as in the extended vehicle model. OEMs can choose how to provide access.
On 1 September, the CNIL imposed a 200 million euro fine on GOOGLE LLC and a 125 million euro fine on GOOGLE IRELAND LIMITED, totalling 325 million euros, for violating:
Both companies were ordered to comply with the decisions within six months or pay 100,000 euros per day of delay.
SHEIN
On 1 September, the CNIL imposed a fine of 150 million euros on INFINITE STYLES SERVICES CO.LIMITED, a subsidiary of the SHEIN group, for placing trackers on the website 'shein.com' without obtaining prior consent from internet users. In addition, users' choices to refuse cookies were not respected, and users were not properly informed about the use of trackers.
The CNIL considered that:
there is a ban on placing trackers for advertising purposes as soon as users arrive on the website, before they have interacted with the information banner to indicate their choice; and
information banners must contain information on the advertising purpose of trackers and on the identity of third parties that may place trackers on the website. The CNIL also reminded organisations that users must have the option to refuse the recording of trackers on their device.
During the proceedings, the company modified its website and therefore no compliance order was issued by the CNIL.
The enforcement action taken against Google and SHEIN highlights that violations of cookie rules are sanctioned heavily in France, so organisations with a presence in France should take this as a reminder to regularly review and audit their cookie practices.
While the UK’s Data (Use and Access) Act (the DUAA) has introduced changes to how AI is regulated from a data protection perspective, its passage through parliament also opened up a broader debate about the regulation of AI, including in the context of intellectual property rights.
In episode 3 of our Data Bytes podcast, Rhiannon Webster is joined by Aaron Cole, Will Barrow, and Tom Brookes to dissect the DUAA's implications, with a particular focus on the evolving frameworks for automated decision-making and the ongoing debate over copyright and data mining.
As you may recall, several last-minute proposals from the House of Lords relating to AI and copyright led to weeks of back and forth between the Lords and the Commons (and some high profile comments from stars such as Elton John and Dua Lipa), delaying the Act’s Royal Assent.
Ultimately, the DUAA leaves the rules on AI data mining and scraping largely untouched for now. Artists and creators pushed for tougher protections, while the tech industry lobbied for more freedom. Instead of picking a side, the government hit pause, ordering an economic impact report to weigh up four options: keep the current law, tighten licensing, allow broad data mining, or introduce a transparent opt-in/opt-out system. Until the report lands, uncertainty reigns, and ongoing lawsuits such as Getty Images v Stability AI highlight the risks for businesses.
From a data protection perspective, the DUAA introduces changes in connection with AI which are focused on decisions without any human involvement - these are known as automated decisions.
The rules in place in the UK before the DUAA passed were inherited from the EU GDPR regime. Under Article 22 of the EU GDPR, there is a prohibition on automated decisions which have a legal or similarly significant effect unless one of three lawful bases is relied on. The first is that you have the explicit consent of the individual concerned. The second is that there is a contract between the organisation making the automated decision and the impacted individual. And the third is that the decision is authorised by law.
Under the DUAA, these limitations now only apply to automated decisions with a significant effect involving the processing of special category personal data. This means organisations can rely on any lawful basis, including the more flexible “legitimate interests” ground, when the processing involves other types of personal data. This opens the door to broader use of AI in areas like HR and finance.
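As a rough, non-authoritative illustration of the divergence described above (our own simplification, not legal advice or part of the podcast; the regime labels and basis names are hypothetical), the two positions can be sketched as a decision rule:

```python
# Hypothetical simplification of the Article 22 EU GDPR position versus the
# DUAA position on solely automated decisions with significant effects.
# Illustrative sketch only, not a statement of the law.

RESTRICTED_BASES = {"explicit_consent", "contract", "authorised_by_law"}

def adm_permitted(regime: str, special_category: bool, lawful_basis: str) -> bool:
    """Return True if a solely automated decision with a legal or similarly
    significant effect could proceed on the given lawful basis."""
    if regime == "eu_gdpr":
        # EU GDPR: the Article 22 prohibition applies to all such decisions.
        return lawful_basis in RESTRICTED_BASES
    if regime == "uk_duaa":
        # DUAA: the restriction only bites where special category data is involved.
        if special_category:
            return lawful_basis in RESTRICTED_BASES
        return True  # any lawful basis, e.g. legitimate interests
    raise ValueError(f"unknown regime: {regime}")

# Example: legitimate interests for non-special-category data.
print(adm_permitted("eu_gdpr", False, "legitimate_interests"))   # False
print(adm_permitted("uk_duaa", False, "legitimate_interests"))   # True
```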
Even with these changes, the DUAA keeps key protections in place. Organisations must be transparent about automated decisions and respect individuals’ rights to challenge or seek human review. The divergence between the UK and EU rules adds complexity, especially as the EU AI Act brings in extra requirements for high-risk AI systems. For global businesses, this means compliance frameworks will need to cover both regimes, and major changes are unlikely in the short term.
The ICO is set to ramp up its focus on automated decision-making, with new guidance and statutory codes on the horizon. Organisations should get ahead by reviewing their data protection and legitimate interest assessments, and by making sure legal and tech teams are aligned on how AI decisions are made. With the government’s impact report and further ICO guidance on the way, businesses need to stay alert and ready to adapt as the rules continue to evolve.
The DUAA represents a step towards a more flexible and innovation-friendly regime for AI and data-driven technologies in the UK. However, with key questions around copyright and data mining still unresolved, and with regulatory divergence from the EU, organisations face a period of uncertainty. Staying abreast of forthcoming government reports and regulatory guidance will be essential for managing risk and ensuring ongoing compliance in this dynamic environment.
Listen to the full Ashurst Data Bytes podcast series on the Data (Use and Access) Act here.
The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
Readers should take legal advice before applying it to specific issues or transactions.