Controversial facial recognition company Clearview AI says it will stop selling tech to private companies after furious backlash and class-action lawsuit
- Court documents suggest Clearview will stop selling software to private entities
- It will continue its contract with governments and law enforcement
- The move has done little to appease public advocates like the ACLU
The controversial facial recognition company Clearview AI says it will stop providing private entities with its technology.
According to legal documents first reported by BuzzFeed, the company is ending non-government contracts in response to class-action lawsuits and scrutiny from regulators.
The court documents suggest that Clearview is voluntarily avoiding ‘transacting with non-governmental customers anywhere.’
Clearview has amassed a database of over 3 billion photos from sites like Facebook, YouTube, Twitter, and even Venmo, which its proprietary AI scans to try to match people in photos uploaded by its clients.
‘Clearview is cancelling the accounts of every customer who was not either associated with law enforcement or some other federal, state, or local government department, office, or agency,’ the company said in a filing.
BuzzFeed reports that the lawsuit from which the documents stem relates to the company’s use of biometric data and is being heard in an Illinois federal court.
The documents also show that Clearview will cease its contracts with all entities in Illinois as part of the lawsuit.
Clearview’s decision to wind down contracts with private companies has done little to appease public advocates like the American Civil Liberties Union, however.
‘These promises do little to address concerns about Clearview’s reckless and dangerous business model,’ said Nathan Freed Wessler, a staff attorney with the American Civil Liberties Union, in a statement.
‘There is no guarantee these steps will actually protect Illinois residents. And, even if there were, making promises about one state does nothing to end Clearview’s abusive exploitation of people’s faceprints across the country.’
Clearview AI software allows its customers to identify people by uploading photos to the company’s servers, where they’re compared against a database of more than 3 billion photos pulled from Facebook, YouTube, Twitter, and even Venmo.
Clearview AI has come under scrutiny for its practice of scraping pictures from social media as well as for its partnerships with law enforcement.
The service was reportedly used by at least 600 different law enforcement agencies in the last year, including the Chicago Police Department, the Department of Homeland Security and the FBI.
While Clearview has signaled that it will cease private contracts, a recent report from BuzzFeed shows evidence that the company is continuing to develop other commercial hardware.
According to documents obtained by BuzzFeed News, Clearview AI is exploring the possibility of making surveillance cameras that use computer vision software to identify subjects by cross-referencing a database.
Clearview’s cameras are reportedly being tested by two companies and are being developed under a division of Clearview called Insight Camera.
The company has yet to publicly link itself to Insight, but a BuzzFeed investigation was able to connect the two by examining code on both companies’ websites, each of which contained similar references to Clearview’s servers.