Clearview AI had no ‘lawful reason’ to collect Brits’ images


The UK’s data protection body today made good on its threat to fine controversial facial recognition company Clearview AI, ordering it to stop scraping the personal data of residents from the internet, delete what it already has, and pay a £7.5 million ($9.43 million) fine.

The company, which is headquartered in New York, claims to have over 20 billion facial images on its databases, mostly culled from YouTube, Facebook, and Twitter. Clearview AI has developed a facial recognition tool – which it is attempting to patent – that is trained on these images. The tool attempts to match faces fed into its machine learning software with results from its enormous image database, which it claims is the largest of its kind “in the world” and which it sells (to law enforcement bodies, among other clientele) across the globe.

The move from the Information Commissioner’s Office (ICO) comes after an investigation launched in 2020 in conjunction with the Australian Information Commissioner to see if Clearview had breached the Australian Privacy Act or the UK Data Protection Act 2018.

Handing down a fine that is £10 million less than originally envisaged, the ICO also said it was not impressed that the company had no “process in place to stop the data being retained indefinitely.”

In addition to the fine, the selfie-scraper was also slapped with an enforcement notice ordering it to stop collating the data and delete all information of British residents from its systems.

In defense of its business model, the CEO has previously said that the images, mostly uploaded by the data subjects themselves, were publicly available, and that it didn’t see why it couldn’t collate and search them in the same way Google does. CEO Hoan Ton-That remarked at the time: “If it’s public and it’s out there and could be inside Google’s search engine, it can be inside ours as well.”

Announcing the action, UK Information Commissioner John Edwards said the ICO had found that Clearview breached the UK GDPR by "failing to meet the higher data protection standards required for biometric data" (classed as "special category data" under the GDPR and UK GDPR); failing to use the information in a way that is "fair and transparent"; failing to have a lawful reason for collecting it; and failing to have a process in place to stop the data being kept "indefinitely."

Finally, the ICO said the company had illegally requested "additional personal information" (including photos) when members of the public approached it to ask if they were on its books – presumably to check against images it already held. "This may have acted as a disincentive to individuals who wish to object to their data being collected and used," noted the regulator.

We have asked the company to comment and will update when it responds. ®
