Data Bytes 39: Your UK and European Data Privacy update for August 2023 

    Welcome to our August edition of Data Bytes, where the Ashurst UK and European Data Privacy and Cyber Security Team look to summarise the key privacy legal and policy developments of the previous month.

    It's been another month of data issues hitting news headlines and consumer inboxes. On the cyber side, a number of high-profile data breaches and cyber incidents were announced this month, affecting the Police Service of Northern Ireland, the Electoral Commission, the NHS and a custom invite service provider that was forced to shut down following a cyber-attack. Meanwhile, the data geeks among us may have spotted a flurry of privacy policy update emails as many organisations, wary of the multi-million Euro fines levied against Meta earlier this year, cease to rely on "performance of a contract" as their legal basis for processing data unless such processing is necessary "to fulfil the clearly stated and understood objectives or 'core' of the contract". This has prompted organisations across Europe to assess whether each processing activity undertaken can be justified as going to the core of the contract and, if not, to switch to the legitimate interests legal basis (with the resulting necessary legitimate interests assessment and update to privacy policies). 

    We've also seen a very public U-turn from Zoom which, after updating its policy to allow customer data to be used for training AI and facing a very public backlash, quickly rowed back and confirmed that it will not use any "audio, video, chat, screen sharing, attachments or other communications-like customer content (such as poll results, whiteboard and reactions) to train Zoom or third-party artificial intelligence models". This story is a good reminder of the importance of good data governance in AI, and prompted by it we asked Matt Worsfold, Data & Analytics Practice head at Ashurst, to consider the topic of good AI governance in our "Have you thought about" section below.  

    Get your byte sized updates here.

    UK Developments

    1. UK Information Commissioner's Office (ICO) and Financial Conduct Authority (FCA) Joint Statement confirms regulatory communications can be sent to customers who have opted out of marketing

    With the Consumer Duty going live on 31 July 2023, firms have been considering what communications they can lawfully send to customers who have otherwise opted out of marketing communications but which meet the new requirements of the Consumer Duty. The joint statement by the ICO and FCA announces that the two regulators have written a letter to UK Finance and the Building Societies Association in response to queries from some firms about whether UK data protection laws prevented them from sending communications to customers about better savings rates. The letter and statement follow on from the ICO's March 2023 guidance on direct marketing and regulatory communications, which explains how to draft regulatory communications and contains several useful examples.

    The statement confirms that firms can send regulatory communications to their savings customers providing "neutral, factual information about the interest rate and terms of the savings product they hold, interest rates and terms of other available savings product, and what their options are for moving to another product". The FCA and ICO expressly clarify that the sending of such communications is not prohibited by the UK GDPR and/or PECR when requested or required by a statutory regulator (such as under the FCA's Consumer Duty), and that they can even be sent to customers who have opted out of receiving marketing communications. We would also recommend adhering to the joint letter's advice and the ICO guidance on direct marketing and regulatory communications: keep the communication to a neutral and factual tone, and consider stating that the message is being provided as a regulatory requirement. Alternatively, consider displaying regulatory communications on websites or customer portals. 

    2. ICO and CMA to investigate harmful web-design practices

    The ICO and the Competition and Markets Authority (CMA) released, on 9 August, a position paper concerning harmful design practices that undermine consumers' control over their data and lead to worse consumer and competition outcomes. The regulators warned that they will be conducting a review of the most frequently used websites in the UK and will take enforcement action where design is harming customers, in particular people at risk of vulnerability. This focus on harms to people at risk of vulnerability has parallels with the FCA's recently introduced Consumer Duty obligations. One area of harmful design highlighted in the position paper is cookie consent banners that bundle consent mechanisms or nudge users towards taking certain decisions. The ICO and CMA give the example of a user with a gambling addiction who consents to the use of their information for targeted advertising and is consequently shown adverts encouraging them to gamble. Organisations conducting targeted advertising campaigns should review their cookie consent journeys in light of the harmful practices described in the position paper and pay particular attention to any targeting practices that could cause financial loss or distress to vulnerable individuals. 

    3. ICO Annual Report highlights focus on children's privacy and targeted advertising

    The ICO released in July its Annual Report and Financial Statements for 2022-23, summarising its regulatory activity over the past 12 months, including 13 investigations relating to children's privacy. These investigations resulted from a targeted "sweep" assessment conducted by the ICO around the time the Children's Code was released. The ICO noted that it has concluded three of the investigations and is in the process of finalising a further three. Organisations that are unsure whether the Children's Code applies to their processing activities should consider reviewing the ICO's FAQs page. The ICO also provided details of its investigations into the use of targeted advertising in the gambling sector, explaining that these are focussed on how the misuse of people's personal information can contribute to problem gambling. Organisations processing children's data or operating in the gambling sector should expect further regulatory activity and publications from the ICO over the coming months.    

    4. ICO consults on Biometric Data Guidance

    The ICO launched on 18 August a consultation on draft guidance it has produced on biometric data and technologies. The overarching purpose of the guidance is to explain how data protection law applies to the use of biometric data in biometric recognition systems. The draft guidance follows warnings last year from the ICO about discrimination risks relating to "immature biometric technologies" which monitor subconscious behavioural or emotional responses. Key definitions, such as "biometric recognition" and "biometric data", are clarified in the draft guidance along with advice on compliance topics such as designation of controller/processor roles, choice of lawful basis and conducting data protection impact assessments. The consultation runs until 20 October 2023. 

    5. Online platforms under greater scrutiny to address unlawful data scraping

    On 24 August the ICO and eleven other data protection and privacy regulators across the world issued a joint statement that operators of platforms (especially social media platforms) and other publicly accessible sites have obligations to protect publicly available personal information from unlawful data scraping, and that data scraping incidents can constitute notifiable data breaches. The data protection regulators said that they are seeing increasing incidents involving data scraping, particularly from social media and other websites that host publicly accessible data.

    This is a significant shift in regulatory focus: away from clamping down on those that unlawfully collect personal information through data scraping, and towards greater scrutiny of the operators of online sites and platforms (particularly those with significant or sensitive datasets), who are expected to protect the publicly accessible personal information hosted on their websites or platforms from unlawful data scraping. 

    Operators of platforms and websites that host publicly accessible personal information should review their personal information collection and handling practices, and review their terms of use with users and subscribers to ensure they include appropriate terms reflecting how personal information is collected and handled. The statement is also a reminder to all other organisations that publicly available personal data remains subject to data protection laws in most jurisdictions, including the UK.

    Read more on the data scraping joint statement in our Ashurst article here.

    EU developments 

    1. Rights of access and portability under the GDPR and Data Act

    On 27 June, the European Parliament, the Council and the Commission concluded their trilogue with final agreement on the wording of the new EU Data Act. At this stage, formal publication in the Official Journal is pending, after which a 20-month transition period will run before the Data Act comes into effect. The Data Act focusses largely on data from connected products and related services (IoT data) and creates a harmonised regime for using, accessing and sharing such data, including rules on unfair contract terms for sharing data between enterprises. It also sets rules for business-to-government data sharing in exceptional situations, switching between data processing services (cloud-switching), the interoperability of data spaces and data processing services, and the use of smart contracts for data sharing agreements, all of which are intended to unlock the economic value of data and foster the growth of the data economy. The data subject rights of access and portability under the GDPR have played an important role in shaping the core rights of access and data sharing under the Data Act. However, unlike the GDPR, the Data Act allows the data user (which can be a business or an individual) to make a wider claim against the data holder when sharing data. Notably, the data user can require the data holder to provide the "relevant metadata that is necessary to interpret and use the data" and to share that data with a third party (rights not granted under the GDPR). The Data Act could therefore give the data user (as an individual) a wider right than they hold as a data subject under the GDPR. Data holders (who may also be data controllers under the GDPR) will need to prepare accordingly.

    2. CNIL strengthens parental control for connected devices 

    In July, the French Data Protection Authority (CNIL) issued an opinion on two decrees implementing the law reinforcing parental control online. From now on, internet-connected devices sold in France (smartphones, computers, video game consoles, etc.) must feature an easily accessible and understandable parental control system, the activation of which must be offered free of charge when the device is first put into operation. The parental control system must include two main functions: blocking downloads of applications made available in app stores where access is prohibited or restricted for minors in certain age categories (for example, certain social networking applications are blocked for children under 13); and blocking access to content installed on devices where access is forbidden to minors or restricted to a certain age category. With these changes, France exceeds the level of protection in most other EU member states and provides parents with a strong protection mechanism. So far, the European Data Protection Board (EDPB) has only dealt with the obligations of controllers in seeking the consent of minors; the CNIL complements those duties by enabling parents to exercise parental control. The EDPB is expected to issue guidelines on children's data and on the use of technologies for detecting and reporting online child sexual abuse, as announced in its 2023/2024 work programme.

    3. EDPB Statement on first review of the EU adequacy decision regarding Japan

    On 19 July, the EDPB published its first periodical review of the EU adequacy decision on Japan. The EDPB addresses various aspects of the Japanese adequacy decision, given that Japanese privacy laws have undergone significant changes. The EDPB explicitly welcomes the revised definition of "personal data the business holds", the extension of the right of objection and of breach notification duties towards the authority and data subjects, as well as amendments to third country data transfers, all of which further converge towards the stipulations of the GDPR. The EDPB also confirms its appreciation of Japan's rules on automated decision making, including profiling. However, the EDPB notes one significant deviation regarding the newly introduced category of pseudonymised data, for which Japanese law relieves data controllers from certain data protection obligations, such as data breach notifications.

    4. CNIL confirms the relevance of the "refuse all cookies" button

    On 13 July, the CNIL formally closed its proceedings against Google LLC and Google Ireland Limited, which started at the end of 2021. At the time, the CNIL imposed a EUR 150 million administrative fine and ordered Google LLC and Google Ireland Limited to modify the methods used on their websites to obtain the consent of users located in France, by providing a "refuse all cookies" button within three months. The CNIL has now closed the proceedings after Google provided the "refuse all cookies" button within the given timeframe. 

    5. EDPB Recommendations on Binding Corporate Rules (BCR)

    On 20 June, the EDPB adopted a new version of the application form for Controller BCR (Recommendations 1/2022), replacing the first version adopted on 14 November 2022. The new recommendations provide a standard form for applying for Controller BCR, with more clarity on the necessary content of the BCR-C itself and on what the applicant must present to the lead supervisory authority in the related materials, explanations and commentary notes.

    Have you thought about AI governance?

    Artificial intelligence has been around for some time, and the concept of AI governance is not a new one. However, the rise and proliferation of generative AI tools has thrust AI governance and risk management into the spotlight. This comes down to the way in which generative AI has broken down the barriers to accessing and using AI. It is no longer a self-contained ecosystem, limited to large, technically minded and data-literate data science teams, but is now at the fingertips of anyone with an internet connection. For this reason, the way AI risk management and governance is approached needs to evolve. Traditional governance concepts still apply to AI: define what it is, understand its use, set policies and standards, and develop and apply a risk and control framework. However, the way these are applied needs to change.

    The first challenge organisations now face, given the rise of generative AI, is understanding its use. Who in the business is using it? What is it being used for? What decisions are being made based on its use? What risks does this pose to the business (and these are not limited to data protection risks)? The other challenge is developing a fit-for-purpose risk and control framework. The risks, and therefore the controls required, will typically vary depending on how AI is being developed and deployed: whether models are proprietary and built in-house from scratch, built on existing large language models as a base for proprietary models, or sourced from third parties. This will influence where governance can be applied across the AI ecosystem. For example, if the AI model is proprietary, additional governance and controls can be applied at the infrastructure level (e.g. building security and privacy controls into the infrastructure). If a third-party large language model is being leveraged, governance and controls need to focus at the interface level, essentially in the form of end-user governance. This all points to the need for organisations to have an in-depth understanding of AI, its development lifecycle and its use across the business.

    We will be holding further events this year on AI, covering data protection and data governance considerations for your business. If you would like to register your interest please get in touch.

    Authors: Rhiannon Webster, Partner; Andreas Mauroschat, Partner; Alexander Duisberg, Partner; Matthew Worsfold, Partner; Shehana Cameron-Perera, Senior Associate; Tom Brookes, Associate; David Plischka, Associate; Emily Jones, Solicitor

    The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
    Readers should take legal advice before applying it to specific issues or transactions.
