Facial recognition firm Clearview AI says it will soon have 100 BILLION photos in its database to ensure 'almost everyone in the world will be identifiable' and wants to expand beyond law enforcement

 A controversial AI company has announced it aims to put an image of nearly every human face in its facial recognition database, making it possible for 'almost everyone in the world [to] be identifiable.' 

In its latest report to investors in December, facial recognition firm Clearview AI said it is working to amass 100 billion photos of human faces for the unprecedented campaign, all of which would be stored in its dedicated database.

The collection of images - approximately 14 photos for each of the 7 billion people on the planet, scraped from social media and other sources - would vastly bolster the company's already extensive surveillance system, the most elaborate of its kind.


The American company headquartered in Manhattan further told investors that its 'index of faces' has grown from 3 billion images to more than 10 billion since the start of 2020. 

The firm's technology has already been used by myriad law enforcement and government agencies around the world, helping police make thousands of arrests by aiding in various criminal investigations.

Clearview fills its database by scouring sources like Facebook, YouTube, Venmo and millions of other sites, according to the company.

The company, founded in 2016 by Australian CEO Hoan Ton-That, 34 - and currently valued at more than $100 million - is seeking to expand its facial recognition empire beyond law enforcement.


Some now predict Clearview — a tiny set-up which has also licensed its software to a string of private companies for supposed security purposes — could end up destroying privacy as we know it by exploiting the vast size of, and easy access to, social media [File photo]



In the presentation to investors last year, obtained by The Washington Post, Clearview brass pleaded for $50 million in additional funding to boost the new undertaking.   

The infusion of funds would allow the company to reach its goal of 100 billion photos, while also building new products, expanding its international sales team, and increasing spending on lobbying government policymakers to 'develop favorable regulation,' The Post reported. 

At the time of the presentation, its data collection system was ingesting 1.5 billion images a month, the company said. 

Clearview added that the improved database would help organizations using its tech better monitor 'gig economy' workers, and that it is currently researching a number of new technologies that could identify someone based on how they walk, detect their location from a photo, or even scan subjects' fingerprints from afar. 

Clearview's technology is being used by private companies, including Macy's, Walmart, Best Buy and the NBA, and even a sovereign wealth fund in the United Arab Emirates, according to a report from BuzzFeed last January. 

At least seven states and nearly two dozen cities have limited government use of Clearview's technology amid fears over civil rights violations, racial bias and invasion of privacy. 

The New Jersey attorney general has banned state law enforcement from using Clearview's system. 

For many worried about privacy, news of the stockpile raised concerns that unauthorized surveillance used in authoritarian countries like China could happen in the US and other democracies. 

European nations - including the United Kingdom, France, Italy, Greece and Austria - have all expressed disapproval of Clearview's method of extracting information from public websites, saying it violates European privacy laws. 

Canadian provinces from Quebec to British Columbia have requested the company take down the images obtained without subjects' permission. 

In December, Tech Times reported that Clearview had been called out by multiple privacy watchdogs in countries across the globe for alleged privacy violations.

Clearview AI CEO Hoan Ton-That has said his company collects only publicly available photos from the open internet that are accessible 'from any computer anywhere in the world.' He said its database cannot be used for surveillance


Clearview AI, founded in 2016 as a facial recognition firm, is currently collecting 1.5 billion images of people a month, the company said in the December report


In the presentation to investors last year, Clearview brass pleaded for funding for the undertaking, to the tune of $50 million. The company was sued by the American Civil Liberties Union in March 2020, contending it illegally stockpiled images of 3 billion people scraped from internet sites without their knowledge or permission


Various law enforcement agencies have also expressed concern about Clearview's collection of personal information, with the NYPD turning down a partnership with Clearview in April after a 90-day free trial of its facial recognition software.

The NYPD decided against using the app, citing potential security risks and potential for abuse, sources said.

Social media sites including Facebook and Twitter have urged the company to delete the photos that it has collected. 

Ton-That refused and pointed out the company gathers only publicly available photos from the open internet that are accessible 'from any computer anywhere in the world'. 

He asserted that its database cannot be used for surveillance. 

In March 2020, Clearview was sued by the American Civil Liberties Union, which contended the company illegally stockpiled images of three billion people scraped from internet sites without their knowledge or permission. The case was filed in Illinois with the backing of a consortium of Chicago-based rights groups.

Police officers say that Clearview offers several advantages over other facial recognition tools. For one, its database of faces is so much larger. Also, its algorithm doesn't require people to be looking straight at the camera; it can even identify a partial view of a face — under a hat or behind large sunglasses [File photo]



After the suit was filed, authorities said Clearview had halted sales of its facial recognition technology to US-based private firms. 

The Vermont attorney general also sued in 2020.

Illinois was the first state in the U.S. to regulate the collection of biometric data, with the introduction in 2008 of the Biometric Privacy Act (BIPA).

BIPA requires companies that collect, capture or obtain an Illinois resident's biometric identifier — such as a fingerprint, faceprint, or iris scan — to first notify that individual and obtain their written consent. 

The ACLU said that its lawsuit was 'the first to force any face recognition surveillance company to answer directly to groups representing survivors of domestic violence and sexual assault, undocumented immigrants, and other vulnerable communities uniquely harmed by face recognition surveillance.' 

In the court documents, filed in Cook County, Illinois, the ACLU team claims that the facial recognition technology provided by Clearview puts vulnerable people at risk. 

'Given the immutability of our biometric information and the difficulty of completely hiding our faces in public, face recognition poses severe risks to our security and privacy,' they claim. 

'The capture and storage of faceprints leaves people vulnerable to data breaches and identity theft. 

'It can also lead to unwanted tracking and invasive surveillance by making it possible to instantaneously identify everyone at a protest or political rally, a house of worship, a domestic violence shelter, an Alcoholics Anonymous meeting, and more. 

'And, because the common link is an individual's face, a faceprint can also be used to aggregate countless additional facts about them, gathered from social media and professional profiles, photos posted by others, and government IDs.'

Nathan Freed Wessler, senior staff attorney with the ACLU's Speech, Privacy, and Technology Project, described Clearview's technology as 'menacing'.

He said it could be used to track people at political rallies, protests, and religious gatherings, among other uses. 

The coalition is asking a judge to order Clearview to delete the images, and to notify 'all persons' in writing and obtain their written consent before capturing their biometric identifiers. 

Tor Ekeland, an attorney for the company, described the lawsuit as 'absurd' and a violation of the First Amendment, which protects freedom of speech, religion, assembly and protest.

'Clearview AI is a search engine that uses only publicly available images accessible on the internet,' he said.  

 'It is absurd that the ACLU wants to censor which search engines people can use to access public information on the internet. The First Amendment forbids this.' 


Clearview AI was founded in 2016 by Hoan Ton-That, an Australian tech entrepreneur and one-time model. 

Ton-That co-founded the company with Richard Schwartz, an aide to Rudy Giuliani when he was mayor of New York.

It is backed financially by Peter Thiel, a venture capitalist who co-founded PayPal and was an early investor in Facebook.  

Ton-That describes his company as 'creating the next generation of image search technology', and in January 2020 the New York Times reported that Clearview AI had assembled a database of three billion images of Americans, culled from social media sites.

The paper published an exposé of the company in which Ton-That described how he had come up with a 'state-of-the-art neural net' to convert all the images into mathematical formulas, or vectors, based on facial geometry - taking measurements such as how far apart a person's eyes are.

Clearview created a directory of the images, so that when a user uploads a photo of a face into Clearview's system, it converts the face into a vector.

The app then shows all the scraped photos stored in that vector's 'neighborhood', along with the links to the sites from which those images came. 
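
The matching step described above is, in essence, a nearest-neighbor search over face vectors. The Python sketch below illustrates that general idea only: the embed_face placeholder, the 128-dimension vector size, the FaceIndex class and the example URLs are illustrative assumptions, not details Clearview has disclosed about its actual system.

```python
import zlib
import numpy as np

EMBEDDING_DIM = 128  # assumed vector size; the real dimensionality is not public


def embed_face(image_pixels: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for the face-embedding neural net: maps an image
    to a unit-length vector. A real system would use a trained deep network;
    here the vector is just derived from a checksum of the pixel bytes."""
    seed = zlib.crc32(image_pixels.tobytes())
    vec = np.random.default_rng(seed).standard_normal(EMBEDDING_DIM)
    return vec / np.linalg.norm(vec)


class FaceIndex:
    """Toy in-memory 'directory' of face vectors paired with their source links."""

    def __init__(self) -> None:
        self.vectors: list[np.ndarray] = []
        self.urls: list[str] = []

    def add(self, image_pixels: np.ndarray, source_url: str) -> None:
        """Store the embedding of a scraped photo along with where it came from."""
        self.vectors.append(embed_face(image_pixels))
        self.urls.append(source_url)

    def neighbors(self, query_pixels: np.ndarray, k: int = 5):
        """Return the k stored photos whose vectors sit closest to the query
        (cosine similarity, which equals the dot product for unit vectors)."""
        query = embed_face(query_pixels)
        matrix = np.stack(self.vectors)   # shape: (n_images, EMBEDDING_DIM)
        scores = matrix @ query           # one similarity score per stored photo
        best = np.argsort(scores)[::-1][:k]
        return [(self.urls[i], float(scores[i])) for i in best]


# Usage: index a few fake 'scraped' images, then look up a probe photo.
index = FaceIndex()
rng = np.random.default_rng(0)
for i in range(10):
    fake_photo = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    index.add(fake_photo, f"https://example.com/photo/{i}")

probe = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
for url, score in index.neighbors(probe, k=3):
    print(url, round(score, 3))
```

At the scale the company describes, a system of this kind would presumably replace the brute-force dot product above with an approximate nearest-neighbor index so that billions of vectors can be searched quickly.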

Amid the backlash from the Times article, Clearview insisted that it had created a valuable policing tool, which it said was not available to the public.

'Clearview exists to help law enforcement agencies solve the toughest cases, and our technology comes with strict guidelines and safeguards to ensure investigators use it for its intended purpose only,' the company said.

Clearview insisted the app had 'built-in safeguards to ensure these trained professionals only use it for its intended purpose'. 
