The Information Commissioner’s Office (ICO) has fined Clearview AI Inc £7,552,800 for using images of people collected from websites and social media to create a global online database for facial recognition.
The ICO has also issued an enforcement notice, requiring Clearview AI to stop obtaining and using the personal data of UK residents that is publicly available on the internet, and to delete their data from its systems.
Clearview provides a service that allows customers, including the police, to upload an image of a person to the company’s app, which then checks it for a match against every image in the database. The app returns a list of images with characteristics similar to the uploaded photo, together with links to the websites those images came from.
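The ICO does not describe Clearview’s internal architecture, but a service of this kind typically works by converting each face image into a numeric embedding vector and comparing a query image against the stored embeddings. The sketch below is a rough, hypothetical illustration of that general pattern only; the database contents, URLs, and use of random vectors in place of a real face-encoder model are all assumptions made for a self-contained example.

```python
import numpy as np

# Hypothetical illustration of a face-matching lookup, not Clearview's actual system.
# Real services derive each embedding from a face-encoder neural network;
# random vectors stand in for those embeddings here.

EMBEDDING_DIM = 128
rng = np.random.default_rng(seed=0)

# Simulated database: embedding vectors keyed by the source URL of each image.
database = {
    f"https://example.com/photo_{i}": rng.normal(size=EMBEDDING_DIM)
    for i in range(1000)
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_matches(query_embedding: np.ndarray, top_k: int = 5) -> list[tuple[str, float]]:
    """Return the top_k most similar stored images with their source URLs."""
    scored = [
        (url, cosine_similarity(query_embedding, emb))
        for url, emb in database.items()
    ]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]

# A real query image would first be encoded to an embedding; simulated here.
query = rng.normal(size=EMBEDDING_DIM)
for url, score in find_matches(query):
    print(f"{score:.3f}  {url}")
```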
The ICO enforcement action follows a joint investigation with the Office of the Australian Information Commissioner, which focused on Clearview AI Inc’s use of people’s images, its scraping of data from the internet, and its use of biometric data for facial recognition.
According to the investigation, Clearview AI collected more than 20 billion images of people’s faces, together with associated data, from publicly available information on the internet and social media platforms around the world to create its online database. People were not informed that their images were being collected or used in this way.
Given the high number of internet and social media users in the UK, the ICO says that Clearview AI Inc’s database is likely to include a substantial amount of data on UK residents, gathered without their knowledge. Although Clearview AI Inc no longer offers its services to UK organisations, it has customers in other countries, so it is still using the personal data of UK residents.
The ICO found that Clearview AI Inc breached UK data protection laws by:
- failing to use the information of people in the UK in a way that is fair and transparent, given that individuals were not made aware, and would not reasonably expect, that their personal data would be used in this way;
- failing to have a lawful reason for collecting people’s information;
- failing to have a process in place to stop the data being retained indefinitely;
- failing to meet the higher data protection standards required for biometric data (classed as ‘special category data’ under the GDPR and UK GDPR); and
- asking for additional personal information, including photos, when members of the public asked whether they were on the company’s database. This may have acted as a disincentive to individuals who wished to object to their data being collected and used.