How Facial Recognition Technology Could Bring a Slut-Shaming Nightmare

Though its existence isn't confirmed, advocates worry a database of sex workers could put many women at risk.

A Chinese social media user is claiming to have developed such a system — but using tech to track and control women is sadly not new

Facial recognition technology is everywhere these days, and people aren’t having it. Cities like San Francisco have officially instituted bans preventing police officers from using it, and none other than Alexandria Ocasio-Cortez issued an ominous warning that the rise of facial recognition technology is “tied to the political reality that there is a global rise in authoritarianism and fascism.” Now, it’s allegedly being used to ID members of the sex industry, to potentially chilling effect.

According to a post on the Chinese social network Weibo, a Chinese programmer based in Germany has created an elaborate facial recognition system to identify performers in adult films. In the post, he claims he has developed the system to the point that it can now successfully recognize the faces of nearly 100,000 adult performers. The system reportedly cross-checks porn performers’ images culled from platforms like xvideos and Pornhub with those of women on popular social media platforms like Facebook and TikTok. While this is ostensibly horrifying enough, the goal of the technology is even more frightening: according to a now-viral Twitter thread by Stanford political science PhD student Yiqin Fu, which translates the original posts, the system was developed specifically so male users could identify whether their female partners were performing in these films. (Fu’s translation was independently verified by Rolling Stone.)

Of course, it’s unclear how legitimate these claims are, or whether this facial recognition system actually exists: the Weibo user has produced no proof verifying the efficacy of such technology, and in a direct message to Rolling Stone, the original author of the viral thread, Yiqin Fu, said that she did not attempt to verify the original poster’s claims before tweeting about the post. But facial recognition technology has apparently been used to similar ends before, specifically for the purposes of tracking and identifying sex workers. In 2017, for instance, the streaming behemoth Pornhub announced it would be utilizing a similar system to make it easier for viewers to find their favorite stars, which Vice’s Samantha Cole correctly pointed out could have serious implications for revenge porn survivors or others whose images had ended up on the website without their consent.

Additionally, the idea of using a “database” to track and surveil sex workers and their clients is not without precedent. Earlier this year, for instance, a bill that would have created a database of those suspected of soliciting prostitution reached the Florida state legislature; though the bill was ostensibly intended to protect sex trafficking victims, many sex workers protested that it would have served to shame and expose those doing consensual sex work. (The bill no longer proposes a database for those suspected of selling sex, instead mandating a database for those suspected of buying it.)

Compiling sex workers’ personal information in any form, whether it’s a traditional database or a more sophisticated theoretical facial recognition system, could put sex workers in tremendous danger should such information fall into the wrong hands, says Kristen DiAngelo, the executive director of the Sex Workers’ Outreach Project (SWOP) in Sacramento, California. Due to the unregulated and illegal nature of the sex industry (sex work is only legal in a few select counties in Nevada), sex workers are at increased risk of violence; due to the stigma associated with sex work, they are also at increased risk of being discriminated against by potential employers outside the industry.

“Sex workers are not all career sex workers,” DiAngelo tells Rolling Stone. Many of them are “college students, or they have another job where they don’t make enough money. They have lives. If any of this got out, it would destroy the life they know.” A database of sex workers in any form “allows people to target us, stigmatize us, and harm us,” she says.

Carrie Goldberg, a Brooklyn-based lawyer at CA Goldberg, PLLC who specializes in sexual privacy violations, has worked with many women who have been victims of nonconsensual porn. Through her work, she has come across a substantial number of men who, “as if for sport, hunt the internet to dox women [who appear in porn] — share her name, school, job, teenage sister’s social media, websites where her parents work.” As an example, Goldberg cites a client who appeared in adult content as a teenager and has spent the past eight years trying to dodge trolls’ doxxing efforts: “She and her parents have had to twice change their names to escape the men who harass them,” she says. On occasion, trolls will act in concert to expose sex workers, as evidenced by last year’s #thotaudit, an attempt spearheaded by incels (self-identified “involuntarily celibate” men) to create a massive database of sex workers to get them audited by the IRS.

Unfortunately, DiAngelo says, it has become much easier for parties searching for sex workers’ personal information to find it. Surveillance efforts targeting sex workers have been ramping up in the wake of SESTA/FOSTA, a 2018 anti-sex-trafficking bill that sex-workers’ rights advocates argue made it more difficult for sex workers to vet clients and ensure their safety. “Everyone’s hustling, trying to work again. They’re trying to figure out how to survive. Things have gotten a lot worse,” she says. In an effort to crack down on sex trafficking, for instance, large hotel chains like Marriott are training employees to keep a close eye on suspected sex workers, a category that appears to include women who are simply traveling alone; additionally, DiAngelo says she has heard reports from friends in Las Vegas who have been barred from entering casinos at the door, purportedly because security used facial recognition technology to identify them. (While this is unconfirmed, it is not unheard of for large casinos to use such technology to target banned gamblers.)

Whether it’s under the guise of “protecting” sex workers, as the proposed Florida database would ostensibly have done, or a naked attempt to shame them, there have been many efforts to track and shame women in the sex industry. But such surveillance efforts don’t just put those who do sex work at risk. Consider, for instance, the very idea that the alleged facial recognition system was expressly intended to out wives and girlfriends who were doing sex work surreptitiously; outside the context of a culture that aggressively works to control female bodies and target women who do not adhere to conventional sexual norms, such technology simply would not have any need to exist (nor, for that matter, would Twitter posts about it go viral). Regardless of whether this specific Weibo user’s claims are legit, the idea of tracking and punishing women who do not conform to gendered expectations of behavior is all too real, and there’s no telling how such invasive and arguably creepy forms of technology will be used to those ends in the future.

“Technology can be used for evil,” says Goldberg. “This [would be] one prime example.”
