A company that collected billions of images of people's faces from social media and used them to provide data-matching services has been fined more than £7.5m by the UK's Information Commissioner.
Clearview AI Inc collected more than 20 billion images of people’s faces and associated data from all over the world, using information available on the internet and social media platforms. People were not informed that their images were being collected or used in this way. Clearview used this "scraped" data to build an online database through which customers, including the police, could upload a person's image to the company’s app and check it for a match against all the images in the database.
Given the high number of UK internet and social media users, the ICO said the database was likely to include a substantial amount of data from UK residents, gathered without their knowledge. Although Clearview no longer offers its services to UK organisations, the company has customers in other countries and is likely still using the personal data of UK residents.
In addition to imposing a fine of £7,552,800, the ICO has issued an enforcement notice ordering Clearview to stop obtaining and using the personal data of UK residents that is publicly available on the internet, and to delete the data of UK residents from its systems.
Its action follows a joint investigation with the Office of the Australian Information Commissioner (OAIC), conducted in accordance with the UK Data Protection Act 2018 and the Australian Privacy Act, alongside international co-operation agreements on enforcing data privacy.
In relation to the UK legislation, Clearview was found to have breached UK data protection laws by:
- failing to use the information of people in the UK in a fair and transparent way, since individuals were not made aware, and would not reasonably expect, that their personal data would be used in this way;
- failing to have a lawful reason for collecting people’s information;
- failing to have a process in place to stop the data being retained indefinitely;
- failing to meet the higher data protection standards required for biometric data (classed as "special category data" under the GDPR and UK GDPR);
- asking for additional personal information, including photos, when asked by members of the public whether they were on its database, which may have acted as a disincentive to individuals who wished to object to their data being collected and used.
John Edwards, UK Information Commissioner, commented: "The company not only enables identification of those people [whose data it collected], but effectively monitors their behaviour and offers it as a commercial service. That is unacceptable. That is why we have acted to protect people in the UK by both fining the company and issuing an enforcement notice.
"People expect that their personal information will be respected, regardless of where in the world their data is being used. That is why global companies need international enforcement. Working with colleagues around the world helped us take this action and protect people from such intrusive activity.
"This international cooperation is essential to protect people’s privacy rights in 2022. That means working with regulators in other countries, as we did in this case with our Australian colleagues. And it means working with regulators in Europe, which is why I am meeting them in Brussels this week so we can collaborate to tackle global privacy harms."