Underground Facial Recognition Tool Unmasks Camgirls

An underground site uses facial recognition to reveal which platforms a camgirl streams on, potentially letting someone take a woman’s photo from social media and use the site to out her sex work.

The site presents a serious privacy risk to sex workers, some of whom may not want stalkers, harassers, or employers to discover their profiles. The site’s creator claimed to 404 Media that millions of searches are performed on the site each month.

“The site was created to help users find the models they like. For example, if they saw a random video or image on the internet without attribution,” the creator, who did not provide their name, said in an email. “Or just to see on which other platforms a model is active.”

Camgirlfinder has been running for several years, with most adult streaming platforms added in 2021, according to the site. It claims to have a database of 2,187,453,798 faces from 7,050,272 people. The site says that database covers a wide variety of adult streaming platforms, including Chaturbate, MyFreeCams, and LiveJasmin. Of course, sex workers often have multiple accounts on multiple sites.

💡
Do you know anything else about this site or others like it? I would love to hear from you. Using a non-work device, you can message me securely on Signal at joseph.404 or send me an email at joseph@404media.co.

404 Media tested the service by uploading a photo of a camgirl who streams publicly. The site successfully found her profiles on other streaming platforms.

The results page shows similar faces the site detected. Each result includes the model’s username on the streaming platform, the probability of the face match, and the last time the account was online. “Additionally you can see the most similar persons for each individual person of this model account. This is a great way to find all other accounts of a model,” the site says.

Users can also search the database of models by username or a similar term. The database appears to include sex workers who may not have streamed for years, creating the risk that someone could use the site to find them even after they stopped streaming. The site also sells all the images it has of a particular person for $1 per model.

Asked about how this site impacts camgirls’ privacy, and how someone could take a photo from social media then unmask a person’s channels, the creator said, “If that is a problem for you then the sad reality is this job is not for you. If you publicly stream your face for everyone to see to the internet, people will obviously see it.”

“One consequence of this job is you can not publish images of yourself on your private social media accounts, if you want to keep them private (just for friends and family). This is similar to actors, politicians, youtubers or other public figures. If you stream content to the public internet you become a public figure yourself,” they said.

The site says models can opt out of appearing in results by filling out a form. The creator claimed to 404 Media that around 25,000 accounts have opted out, with most models having multiple accounts across different platforms. “Yes, their images get deleted,” they claim.

The creator told 404 Media the site uses AdaFace, an open source face matching algorithm.
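
In broad strokes, face-matching systems of this kind convert each face image into a numerical embedding and then rank a database of stored faces by similarity to a query embedding. The sketch below illustrates that general idea only; the `embed_face` function is a hypothetical stand-in for a model such as AdaFace, and nothing here reflects the site’s actual code.

```python
# Minimal sketch of embedding-based face matching (illustrative only).
# `embed_face` is a hypothetical placeholder for a face recognition model
# such as AdaFace, which maps an aligned face image to a feature vector.

import numpy as np


def embed_face(image: np.ndarray) -> np.ndarray:
    """Hypothetical: run a face recognition model and return its embedding."""
    raise NotImplementedError("stand-in for a real model, e.g. AdaFace")


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two embeddings, from -1 (opposite) to 1 (identical)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def rank_matches(query: np.ndarray, database: dict[str, np.ndarray], top_k: int = 5):
    """Return the top_k database identities most similar to the query embedding."""
    scored = [(name, cosine_similarity(query, emb)) for name, emb in database.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]
```

A real system at the scale the site claims would use an approximate nearest-neighbor index rather than a linear scan, but the ranking principle is the same.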

Over the last several years, facial recognition technology has morphed from a government surveillance tool into one that members of the public regularly use against one another. In 2023, we covered a TikTok account that was using off-the-shelf facial recognition tech to dox random people on the internet for the amusement of millions of viewers. The following year, we reported that two students had paired facial recognition software with Meta’s Ray-Ban smart glasses, letting them dox people in seconds.

While government agencies, including ICE, continue to use facial recognition too, some people have used that technology to monitor those agencies instead. Last year, artist Kyle McDonald launched FuckLAPD.com, a site that uses public records and facial recognition technology to allow anyone to identify police officers.
