Clearview AI’s scraping of facial recognition data has created trouble for the company around the world, both with the social media platforms it takes material from and with governments that have found it in violation of privacy laws. The latter has happened yet again in the United Kingdom, where lead privacy regulator the Information Commissioner’s Office (ICO) has issued a decision after a lengthy investigation conducted in partnership with Australian authorities.
Clearview AI will be fined £7,552,800, and has been ordered to stop collecting facial recognition data in the country and to delete all of the data it has previously collected. The company was found to be in violation of the terms of the UK Data Protection Act 2018.
Clearview AI’s facial recognition data hoarding continues to rack up penalties
Clearview AI maintains, as it always has, that its mass scraping of facial recognition data from public profiles is justified because it only takes material that is made freely available to anyone. This is in spite of being told by Facebook, Google, Twitter, LinkedIn, Venmo and others that the practice violates their terms of service and must stop (and being served cease and desist letters by some of these companies).
As part of a recent settlement over the company’s violation of Illinois biometric privacy laws, it is essentially restricted from doing business with anyone but federal agencies and local police departments in the United States. Other countries have also fined the company and/or banned it outright: Canada, Australia, France and Italy among others.
The ICO decided that the UK Data Protection Act 2018 had been breached in several ways: failure to be “fair and transparent” in the use of personal data, failure to provide a lawful reason for collecting the data, not having a documented process in place to ensure the data is not stored indefinitely, failing to meet the special standards that apply to the collection of facial recognition data, and requiring data subjects to provide even more personal information when requesting deletion of their data.
The ICO enforcement notice essentially means that Clearview AI has been run out of another country, unable to legally collect or use the facial recognition data of UK citizens under its usual business model. The company is estimated to have collected about 20 billion images around the world without the knowledge or consent of the subjects, harvesting them from social media profiles viewable by the general public. The images of UK citizens will now have to be deleted, assuming Clearview AI intends to comply with UK law.
The firm was similarly ordered to stop collecting information about Australians in November 2021 and to delete any facial recognition data it had already collected, but was not hit with a fine in that country.
Limited use of Clearview AI in UK did not justify mass harvesting of facial recognition data
In spite of its legal challenges and its battles with the tech platforms it scrapes facial recognition data from, Clearview AI still has many customers in the US (including select federal agencies, such as the FBI and ATF). Not so in the UK, where it offered a “free trial” to several law enforcement agencies but did not enter into any known long-term agreements with them. The Metropolitan Police and the National Crime Agency are among those known to have sampled Clearview AI’s services for some period of time.
Clearview AI chief executive Hoan Ton-That expressed disappointment in the ICO’s decision, and said that the agency had “misinterpreted his intentions.” The company told investors in February that it is expanding its contracts with US federal government agencies, and has entered into a contract with the US Air Force to develop augmented reality glasses for use in securing bases. It will have to proceed carefully, however, without including the facial recognition data of Illinois residents, who receive ongoing protection from the state’s unique biometric privacy laws.
A statement from Clearview AI’s legal team indicates that it may not be planning to pay much attention to the ICO’s enforcement order, saying that the company does not operate in the UK and is therefore not subject to the ICO’s jurisdiction. The legal team also called the decision to impose a fine “incorrect.” The fine amount is actually a substantial reduction from what was initially proposed; in November 2021 the ICO declared an intent to fine the firm at least £17 million. Clearview AI has 28 days to appeal the UK ruling and will not be required to be in full compliance until six months from that date.
Despite the reduced fine amount (and the possible difficulty of forcing Clearview AI to comply), Chris Olson, CEO of The Media Trust, believes that this systematic blocking of the company from one country after another demonstrates that privacy legislation can ultimately be effective:
“The U.K’s action against Clearview AI demonstrates that emerging data privacy legislation has teeth, and businesses around the world need to take it seriously. Today, a majority of organizations with online domains or digital apps are in violation of the guidelines set forth in GDPR, whether they realize it or not. Developers frequently collect and handle user data in irresponsible ways, and without proper disclosure – moreover, they share that data with unknown third parties, who may then share it with fourth and fifth parties. For businesses that don’t commit to digital safety and trust, it’s