Written by Loretta Maxfield and Cara Collins, Thorntons
Loretta Maxfield and Cara Collins review the major data protection changes in 2024, including new laws, ICO decisions, marketing, cookies, biometrics, and AI.
Throughout 2024, the UK saw several notable developments in data protection, from new laws and Information Commissioner’s Office (ICO) consultations to landmark enforcement decisions and a continued drive towards the responsible use of artificial intelligence. January is often a time of reflection, so this article looks back on the key data protection law highlights of 2024, with a particular focus on high-risk areas such as marketing, cookies, biometric data and artificial intelligence.
Marketing:
According to the ICO’s “2024 – a year in review”, the regulator received 44,400 reports relating to nuisance calls and 28,969 reports relating to spam emails in 2024. Consequently, the ICO issued 14 monetary penalties throughout the year for poor marketing practices, more than in any other risk area. Marketing remains one of the ICO’s primary areas of focus and, given the volume of complaints, it is evidently an area that organisations continue to get wrong, causing upset to data subjects.
The most noteworthy fine was issued to HelloFresh, which sells meal kits at scale, for sending 79 million marketing emails and 1 million marketing text messages over a period of seven months. The ICO fined the organisation £140,000, ruling that HelloFresh had acted contrary to Regulation 22 of the Privacy and Electronic Communications Regulations 2003 (“PECR”) because the opt-in consent statement for the messages did not meet the requirements of being “specific” and “informed”. Whilst the statement did refer to marketing via email, it failed to mention marketing via text message and was unclear and bundled within other statements. It also failed to highlight that customers would receive marketing messages for up to 24 months after cancelling their HelloFresh subscription, so all direct marketing messages sent by the organisation were deemed to have lacked valid consent.
In issuing this fine, the Information Commissioner has, yet again, demonstrated the importance of ensuring that consent-based marketing satisfies the consent requirements of the UK GDPR. Consent must be specific, informed and clear, and organisations should regularly review the lawful basis on which they market to ensure it is clearly identified and meets UK GDPR standards. Organisations should also act on complaints quickly to avoid penalties for non-compliance.
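By way of illustration only, a granular, per-channel record of consent along the lines of the hypothetical sketch below can help evidence that consent was specific and informed for each marketing channel. The names, fields and logic here are assumptions for illustration, not anything mandated by the ICO or taken from the HelloFresh decision.

```typescript
// Hypothetical sketch of a per-channel marketing consent record.
// Field names and structure are illustrative assumptions only.
type MarketingChannel = "email" | "sms";

interface ConsentRecord {
  customerId: string;
  channel: MarketingChannel;   // consent captured separately per channel
  statementShown: string;      // exact wording the customer was shown
  consentGivenAt: Date | null; // null until the customer actively opts in
  withdrawnAt: Date | null;    // withdrawal must be honoured promptly
}

// Only send to a channel for which the customer has actively and currently consented.
function canSendMarketing(records: ConsentRecord[], channel: MarketingChannel): boolean {
  return records.some(
    (r) => r.channel === channel && r.consentGivenAt !== null && r.withdrawnAt === null
  );
}
```

Keeping the exact statement shown alongside the timestamped opt-in and withdrawal makes it easier to demonstrate, channel by channel, that consent met the “specific” and “informed” standard.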
Cookies and Consent:
Consent issues were also relevant to the use of cookies. The beginning of 2024 saw a sharpened focus on consent and cookie compliance. It has long been established that organisations using cookies must be transparent about the kinds of cookies they use and their purposes, and must obtain consent for non-essential cookies prior to deployment. Non-essential cookies are, in effect, cookies that are helpful or convenient to the website user but are not essential for using or operating the website, such as tracking and advertising cookies. When obtaining consent for non-essential cookies, users must be able to accept or reject them with equal ease. Users must also be able to manage their choices and withdraw consent previously given as easily as they provided it. This is typically managed via a cookie consent mechanism that appears when users first visit a website and remains accessible on each subsequent visit.
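As a purely illustrative sketch of this principle (the names and structure below are assumptions, not taken from ICO guidance), a website might gate non-essential cookies behind an explicit, unticked-by-default choice along the following lines:

```typescript
// Illustrative only: gate non-essential (e.g. analytics/advertising) cookies
// behind an explicit user choice. Names and approach are assumptions.
interface CookieChoices {
  analytics: boolean;
  advertising: boolean;
}

// Defaults are false: nothing non-essential runs until the user opts in,
// and a pre-ticked or assumed choice is never recorded as consent.
const defaultChoices: CookieChoices = { analytics: false, advertising: false };

function loadNonEssentialCookies(choices: CookieChoices): void {
  if (choices.analytics) {
    // e.g. inject the analytics script only after consent has been given
    console.log("Analytics cookies enabled");
  }
  if (choices.advertising) {
    console.log("Advertising cookies enabled");
  }
}

// Withdrawing consent should be as easy as giving it, e.g. via a persistent
// "cookie settings" control that resets choices and removes existing cookies.
function withdrawConsent(): CookieChoices {
  return { ...defaultChoices };
}
```

The design point is simply that non-essential cookies are off by default, are only set after an affirmative choice, and can be switched off again just as easily.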
The ICO has expressed concerns over compliance in this area. To monitor it, the ICO conducted a sweep of the top 100 UK websites in 2024 and threatened enforcement action against any found to be non-compliant. Fifty-three organisations were found to be non-compliant, and the ICO wrote to each of them ordering the implementation of a compliant solution. Following this contact, the ICO announced that 80% of the organisations had changed their cookie banners. The ICO has also advised that it has already begun a further sweep of another 100 UK websites, indicating that regulation in this area is likely to continue to dominate its agenda in 2025. This action has acted as a catalyst for other organisations to review their own websites and check for compliance.
Beyond the UK, cookie compliance has also been a regulatory focus for other Data Protection Authorities (“DPAs”) in the EU throughout 2024. Several EU DPAs have taken stringent regulatory action by intervening to order compliance and, in some instances, have issued significant fines for cookie consent infringements. In particular, the Dutch DPA, Autoriteit Persoonsgegevens, fined the parent company of a popular Dutch drugstore chain, Kruidvat, for unlawfully placing tracking cookies before obtaining consent. Following its investigation, the Dutch authority also found that merely providing users with a pre-ticked box for accepting tracking cookies does not constitute sufficient, freely given and informed consent. This issue, which is clearly not a UK-only area of focus, is likely to remain prominent throughout 2025 and beyond, so it would be prudent for organisations to check the use of cookies on their websites and ensure any consent obtained for non-essential cookies meets UK GDPR standards.
Biometric Data:
Throughout 2024, the ICO also increasingly took enforcement action against organisations for the improper use of biometric technologies, particularly in the workplace.
In February 2024, the ICO issued an enforcement notice against the public service provider Serco Leisure to stop it using facial recognition and fingerprint technology to monitor staff attendance and determine payment for their time. The ICO ruled that Serco had failed to establish an appropriate lawful basis and special category condition for processing the biometric data. Although the organisation had carried out a Data Protection Impact Assessment (“DPIA”) and Legitimate Interest Assessment (“LIA”), the ICO found that Serco’s reliance on legitimate interests as its lawful basis, and its reasoning for doing so, was flawed. Serco had not sufficiently demonstrated that it was necessary and proportionate to use a biometric system, which carried an increased risk of privacy intrusion compared with alternative methods of recording employee attendance. Following this finding, the ICO issued new guidance on the lawful use of biometric data, detailing how organisations can make effective use of biometric technologies whilst implementing them safely.
Additionally, in July 2024, the ICO issued a reprimand to a school in Essex, Chelmer Valley High School, for unlawfully using facial recognition technology to take cashless canteen payments from students. Despite organisations being legally required to complete a DPIA before implementing biometric technologies such as facial recognition, Chelmer Valley High School failed to produce one, meaning that no prior assessment was made of the risks to the children’s information. The school had also failed to obtain clear consent from the students, who were not given the opportunity to opt out. The ICO made clear that when relying on consent to use biometric technologies, explicit, opt-in consent is essential; assumed consent is not sufficient.
These cases demonstrate the high bar that the ICO has set for the use of biometric technologies in the workplace. To process such data lawfully, organisations must be able to demonstrate (a) necessity and proportionality; (b) that they have selected the appropriate lawful bases; and (c) that they have completed a comprehensive assessment of the associated risks and put mitigations in place to manage them. Biometric processing can be particularly intrusive in terms of privacy, and organisations are urged to exercise caution.
Artificial Intelligence:
Another prominent area of focus for the ICO in 2024 was artificial intelligence (“AI”). Unlike the other areas covered thus far, the regulatory focus for AI has largely been on consultation and guidance rather than enforcement.
At the beginning of the year, the ICO launched a consultation series on generative AI and data protection, outlining its views on the technology. The consultation covered topics including the lawful basis for web scraping to train generative AI models, purpose limitation in the generative AI lifecycle, the accuracy of training data and model outputs, allocating controllership across the generative AI supply chain, and engineering individual rights into generative AI models. The ICO has since published its response to the consultation, warning developers that they are required to tell people how their information is being used.
Later in 2024, the ICO also published its decision on Snapchat’s “My AI” tool and whether its DPIA was compliant with the requirements of Article 35 of the UK GDPR. The ICO’s preliminary enforcement notice and Snapchat’s subsequent revision of the DPIA highlight the importance of undertaking a thorough DPIA to properly assess the risks posed before bringing AI products to market. Whilst the ICO is keen to support organisations that wish to use this innovative technology, its conclusion of the Snapchat investigation serves as a reminder that the use of AI can be high risk and that organisations must innovate responsibly; otherwise, they risk regulatory intervention.
Lastly, it is also worth noting that the new EU AI Act was adopted in May 2024. The Act puts in place different regulatory frameworks depending on an AI system’s level of risk, with some rules for general-purpose AI, others for high-risk AI and a lighter regime for lower-risk AI systems. Certain categories of AI are prohibited outright, such as any AI system used for “social scoring” that results in detrimental treatment. Whilst the UK is no longer domestically bound by EU legislation, UK-based businesses that operate AI systems within the EU market will need to ensure that their systems comply with the new Act’s requirements. The majority of the Act’s provisions will come into force in 2026, but certain provisions, such as those covering prohibited AI practices, will come into force within the next six months, so companies dealing with AI in the EU should begin reviewing their practices to ensure compliance with the Act.
Whether UK or EU based, it seems that, for now, the regulation of AI will remain one of the main regulatory focuses in 2025, particularly in light of Sir Keir Starmer’s recent announcement that the UK Government has adopted an AI Opportunities Action Plan to position the UK as a future global leader in AI. The Government has also announced that it will seek to introduce AI-specific legislation in the not-too-distant future. Organisations wishing to develop or procure AI tools should recognise that while AI may bring significant efficiencies, it is an area developing at pace and potentially high risk from a data protection perspective. Ultimately, close attention should continue to be paid to the guidance issued by the ICO to support the adoption of AI technologies with minimal risk to data subjects, and a watchful eye kept on Government policy and legislation in the coming months and years.
By Loretta Maxfield, partner, and Cara Collins, trainee solicitor, Thorntons