Internet gatekeepers block sex ed content because algorithms think they’re porn

The internet has changed how kids learn about sex, but sex ed in the classroom still sucks. In Sex Ed 2.0, Mashable explores the state of sex ed and imagines a future where digital innovations are used to teach consent, sex positivity, respect, and responsibility.

The algorithms that drive products like YouTube, Facebook, and Apple's iOS software share a common challenge: They can't seem to consistently distinguish between pornography and sexual and reproductive health content.

That’s because the code engineered to prevent “adult” material from popping up in your timeline or search results can also easily block educational content meant to offer internet users candid, factual information about sex, sexuality, and health. 

Critics say the algorithmic confusion may reflect lazy engineering and tech’s infamous diversity problem. When the engineers who write code meant to push nudity or porn to the web’s margins don’t understand or care about the importance of accessing sexual and reproductive health content, especially for LGBTQ youth and other users who’ve been historically marginalized online, of course algorithms will block the widest possible swath of content. Critics also believe a straightforward solution to this problem exists, but say tech companies aren’t interested in addressing their concerns. 

The online sexual health company O.school reported in October that the iPhone's new software, with the parental control setting enabled, blocked not just its website but numerous entertainment sites and health resources for teens and adolescents. While the filter restricted sites like Teen Vogue and Scarleteen, it didn't deny users access to websites like the neo-Nazi Daily Stormer or the anti-gay Westboro Baptist Church.

That shocking contrast convinced O.school founder Andrea Barrica that Apple's algorithm might simply be blocking certain terms, like "teen," wholesale in order to prevent any clicks that might possibly send a user to prohibited content (i.e. "teen porn"). Yet Barrica couldn't confirm or dispel her suspicions, nor learn anything else about how Apple's algorithm works.
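Apple hasn't said how the filter works, so any reconstruction is guesswork. But a naive substring blocklist, the kind of shortcut Barrica suspects, would produce exactly the pattern O.school observed. Here's a minimal Python sketch of that hypothetical approach; the blocked terms and example sites are illustrative, not Apple's actual list:

```python
# Hypothetical illustration only: Apple has not disclosed how its
# content filter works. This sketches the naive substring blocklist
# Barrica suspects, where any match anywhere blocks the page.

BLOCKED_TERMS = {"teen", "sex", "porn"}  # assumed terms, not Apple's list

def is_blocked(url_or_title: str) -> bool:
    """Block if any flagged term appears anywhere in the text."""
    text = url_or_title.lower()
    return any(term in text for term in BLOCKED_TERMS)

# A context-blind filter can't tell education apart from porn:
print(is_blocked("teenvogue.com"))           # True  -- contains "teen"
print(is_blocked("scarleteen.com"))          # True  -- contains "teen"
print(is_blocked("o.school sex education"))  # True  -- contains "sex"
print(is_blocked("dailystormer.name"))       # False -- no flagged term
```

A check like this is context-blind: a health site and a porn site that share a keyword look identical to it, while a hate site that avoids the flagged vocabulary passes straight through.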

Barrica used her own network to reach Apple employees in hopes of discussing the situation, but was met with silence. Then she published a blog post titled “Censorship and Sex Ed” with pointed questions for Apple: Who designed the filter? Were parents consulted? Conservative and religious groups? Doctors and sex educators? What non-porn sites are being blocked?

She never heard from Apple. The company did not comment to Mashable about its algorithms or Barrica’s post.

“They’re writing the policies in the most conservative way to avoid the problem.”

“They’re not targeting sex ed; they’re writing the policies in the most conservative way to avoid the problem,” claims Barrica. “Apple’s conservative views on sexuality have so many far-reaching effects.”

Barrica isn’t the only one who’s written an open letter to the platform controlling whether her company’s content is seen online. In May 2017, a writer for the menstrual tracking app Clue detailed in a blog post how Facebook blocked the company’s ads boosting its sexual and reproductive health content. Educational illustrations that featured vulvas, breasts, and penises were blocked. Ads to promote posts about underwear, birth stories, and puberty advice were also rejected. 

A representative for Clue declined to comment further on the issue but said the company stood by its 2017 post. A spokesperson for Facebook said Clue's ads ran afoul of its advertising policy's restrictions on “adult content,” which, among other things, forbid ads that include nudity or images focused on individual body parts. Facebook's advertising policies are applied globally and are stricter than its community standards. Clue continues to publish content and advertise on Facebook.

AMAZE, a sex ed video series for adolescents and teens, has faced a similar problem on YouTube. Since its channel launched nearly three years ago, several of its 84 videos have been rejected for advertising because they were deemed “adult” content. Those include videos about female and male anatomy.

“YouTube advertising is critical to our work at AMAZE because it allows us to reach young people all over the world who are searching for guidance around sex, mental health, and more,” Lincoln Moody, a spokesperson for AMAZE, said in an email. 

Though AMAZE is not considered adult content, its videos do include accurate depictions of genitalia and frank discussions of sexual health. That forthrightness, which is sometimes graphic, could be perceived by an algorithm or a human reviewer as violating the platform's policy against advertising adult content that's “non-family safe.” YouTube declined to comment on the policies and practices that inform its algorithms.

Tech companies might argue that their algorithms are actually working as designed by flagging content that violates their policies. Yet the fact that, for example, a benign illustration of a breast in an educational context is deemed objectionable gets at a bigger issue.

Part of the challenge facing engineers and tech companies is the reality that sexual health material produced for the internet today is often free of the stigma and shame traditionally associated with talking about sex. Instead of staid explainers that use vague terms and descriptions, this new generation of content asks and answers potentially embarrassing questions, sensitively addresses the diverse concerns of marginalized readers, and is unafraid to use accurate depictions of genitalia, making what once were awkward conversations sound pretty fun along the way.  

So engineers who aren’t paying attention to this trend, or don’t even realize it’s happening, are likely to write code that treats most explicit words or images on the internet as a gateway to porn.

“One of the dynamics is they’re not thinking about this as a case at all,” says Jon Pincus, a software engineer and entrepreneur who is an adviser to O.school. “Whether it’s lazy or overly simplified, my guess is they’re not actually trying to measure if they’re letting legitimate [sexual and reproductive health] stuff in while keeping other stuff out.”

Pincus says designing algorithms that perform substantially better than they do today wouldn’t be hard. Engineers and the companies that employ them could embrace fairness, accountability, and transparency as guiding principles, particularly because the availability of accurate sex ed information online is a public health issue. 

Ideally, companies using machine learning algorithms would train them with words, images, and descriptions of valid sexual and reproductive health information they want to accept, as well as the adult content or pornography they want to reject.
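No company has published such a pipeline, so the following is only a rough sketch of the idea, using scikit-learn and invented snippets: train a classifier on labeled examples of both the health content to allow and the adult content to reject, so that no single word is a verdict on its own.

```python
# A rough, hypothetical sketch of training a content classifier on
# labeled examples of both classes, rather than matching keywords.
# The snippets and labels below are invented for illustration; a real
# system would train on far larger datasets, and on images as well as text.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "illustrated guide to puberty and reproductive anatomy",   # allow
    "how to talk to your doctor about birth control options",  # allow
    "consent education resources for teens and parents",       # allow
    "explicit adult videos, subscribers only",                 # reject
    "hardcore porn clips updated daily",                       # reject
    "live adult webcam shows",                                 # reject
]
labels = [1, 1, 1, 0, 0, 0]  # 1 = allow (sex ed), 0 = reject (adult content)

# TF-IDF features plus logistic regression: words like "teen" or "anatomy"
# only matter in combination with the rest of the text, not in isolation.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["an anatomy lesson for a sex ed class"]))
# A toy dataset this small is only illustrative; the point is that the
# decision comes from labeled examples of both classes, not a term list.
```

The design choice matters: the filter's behavior is then governed by the training data, which is exactly where Pincus argues sexual health experts should have input.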

Beyond that philosophical shift, Pincus says tech companies could invite sexual and reproductive health experts to provide feedback on how algorithms are designed, or even hire them as consultants, which he says is common practice in the industry when there are no subject matter experts on staff.

Moody, of AMAZE, agrees with such an approach. 

“To us, the only solution involves an intentional partnership between tech giants and sexual health experts when they’re creating algorithms and content blockers,” he said. “Tech giants aren’t sexual health experts and shouldn’t make such consequential decisions on what is and isn’t ‘age appropriate’ when it comes to online information.”

“Tech giants aren’t sexual health experts.”

Those companies, however, are reluctant to surrender that power and give outsiders influence over their products. When Tumblr announced last week that it would ban adult content, a spokesperson declined to explain the criteria by which its algorithms and human reviewers would distinguish sex ed from nudity or porn, noting only that “health-related situations” would still be allowed on the platform.

That resistance to transparency makes sense given the ruthless competition in Silicon Valley, and Barrica believes tech companies have no incentive to endanger major advertising revenue, or a spot in Apple's App Store, by writing more nuanced algorithms that would maximize access to sexual and reproductive health information but might let pornographic content slip through.

“It’s really fear-based,” she says. “It goes back to lack of inclusion and diversity, and back to stigma.” 

“There’s so much power to control what people do and don’t see.”
