
Content Moderation At Scale Is Impossible: Facebook Still Can't Figure Out How To Deal With Naked Breasts

Like a teenage heterosexual boy, Facebook appears to have no clue how to deal with naked female breasts. Going back over a decade, the quintessential example used to show the impossibility of coming up with clear, reasonable rules for content moderation at scale has been Facebook and breasts. In the early days, as Facebook realized it needed to do some content moderation and had to establish a clear set of rules that could be applied consistently by a larger team, it started with a simple “no nudity” policy — and then, after that raised questions, narrowed it down to define female nipples as forbidden. As a wonderful episode of Radiolab detailed last year, questions kept getting raised about just how specific the rules needed to be (each paragraph here is a different speaker, but since Radiolab doesn’t supply transcripts, I’m not entirely sure who’s speaking):

So, for example, by then, nudity was already not allowed on the site. But they had no definition for nudity. They just said “no nudity.” And so the site integrity team — those 12 people at the time — they realized that they had to start spelling out exactly what they meant.

Precisely. All of these people at Facebook were in charge of trying to define nudity.

The first cut at it was “visible male and female genitalia.” And then “visible female breasts.” And then the question is “well, okay, how much of a breast needs to be showing before it’s nude?” And the thing that we landed on was, if you could see essentially the nipple and areola, then that’s nudity. And would have to be taken down.

This might have seemed like a straightforward rule… until mothers posting breastfeeding photos started complaining — as they did after a bunch of their photos got blocked. Stories about this go back at least to 2008, when the Guardian reported on the issue after a bunch of mothers started protesting the company, leading Facebook to put out this incredibly awkward statement defending the practice:

“Photos containing a fully exposed breast, as defined by showing the nipple or areola, do violate those terms (on obscene, pornographic or sexually explicit material) and may be removed,” he said in a statement. “The photos we act upon are almost exclusively brought to our attention by other users who complain.”

More public pressure, and more public protests, resulted in Facebook adjusting its policy to allow breastfeeding, but photos still kept getting taken down, leading the company to have to keep changing and clarifying its policy, such as in this statement from 2012.

When it comes to uploaded photos on Facebook, the vast majority of breastfeeding photos comply with our Statement of Rights and Responsibilities, which closely mirrors the policy that governs broadcast television, and which places limitations on nudity due to the presence of minors on our site. On some occasions, breastfeeding photos contain nudity – for example an exposed breast that is not being used for feeding – and therefore violate our terms. When such photos are reported to us and are found to violate our policies, the person who posted the photo is contacted, and the photos are removed. Our policies strive to fit the needs of a diverse community while respecting everyone's interest in sharing content that is important to them, including experiences related to breastfeeding.

In the Radiolab episode they pointed out that photos of babies sleeping after having breastfed were getting taken down because the baby’s head was no longer blocking the nipple.

In 2014, Facebook clarified its policies on nipples again:

“Our goal has always been to strike an appropriate balance between the interests of people who want to express themselves with the interests of others who may not want to see certain kinds of content,” a Facebook spokesperson told the Daily Dot. “It is very hard to consistently make the right call on every photo that may or may not contain nudity that is reported to us, particularly when there are billions of photos and pieces of content being shared on Facebook every day, and that has sometimes resulted in content being removed mistakenly.

“What we have done is modified the way we review reports of nudity to help us better examine the context of the photo or image,” the spokesperson continued. “As a result of this, photos that show a nursing mother’s other breast will be allowed even if it is fully exposed, as will mastectomy photos showing a fully exposed other breast.”

Right. And then, just a few months later, people started protesting again, as more breastfeeding photos were taken down.

Again in the Radiolab program, they then discuss how this gets even more confusing, as some people started posting “breastfeeding porn” photos that appeared to show people who were not infants being breastfed. So Facebook modified the rule to say the individual being breastfed had to be an infant. But how does Facebook determine who is and who is not an infant? We’re right back to the definitional problem. The initial test Facebook put in place was “does the kid look old enough to walk?”, which raises other problems, since many kids breastfeed long after they can walk. Facebook had to keep amending and changing the rule. It eventually allowed one (just one) nipple/areola to show if it appeared related to breastfeeding… and then, after some time, a second one could be shown as well.

But as Radiolab documented, every time you set a definition, a new exception comes up. In the midst of the breastfeeding mess, this happens:

Literally every time this team at Facebook would come up with a rule that they thought was airtight–ka-plop–something would show up that they weren’t prepared for, that the rule hadn’t accounted for.

As soon as you think, “yeah, this is good,” like the next day something shows up to show you: yeah, you didn’t think about this.

For example, sometime around 2011, this content moderator is going through a queue of things–accept, reject, accept, escalate, accept–and she comes upon this image: the photo itself was a teenage girl, African by dress and skin, breastfeeding a goat — a baby goat. And the moderator throws her hands up and says, “what the fuck is this?”

And we Googled breastfeeding goats and found that this was a thing. It turns out it’s a survival practice: according to what they found, this is a tradition in Kenya that goes back centuries. In a drought, a known way to help your herd get through it is, if you have a woman who’s lactating, to have her nurse the kid, the baby goat, along with her human kid. And so there’s nothing sexual about it.

… And theoretically, if we go point by point through this list: it’s an infant–it sort of could walk, so maybe there’s an issue there–but there’s physical contact between the mouth and the nipple. But (obviously) breastfeeding as we intended it meant human infants. And so, in that moment, what they decided to do was remove the photo. And there was an amendment, an asterisk, added under the rule stating “animals are not babies.” So in any future cases people would know what to do.

This then raised new problems and so on and so on.
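To see why each amendment breeds the next exception, it helps to think of the policy as a checklist that moderators walk through point by point, with every incident bolting on another condition. Here is a toy sketch of that dynamic in Python; the rule structure, field names, and conditions are my own illustrative assumptions, not Facebook's actual system:

```python
# Toy sketch of an ever-amended moderation checklist.
# Every field and rule here is an illustrative assumption,
# not Facebook's real policy engine.
from dataclasses import dataclass


@dataclass
class Photo:
    nipple_visible: bool
    breastfeeding: bool
    recipient_is_human: bool
    recipient_looks_like_infant: bool


def allowed(photo: Photo) -> bool:
    if not photo.nipple_visible:
        return True               # base case: no nudity shown at all
    if not photo.breastfeeding:
        return False              # original "no visible nipple/areola" rule
    if not photo.recipient_looks_like_infant:
        return False              # amendment: exception applies only to infants
    if not photo.recipient_is_human:
        return False              # asterisk after the goat photo: "animals are not babies"
    return True


# The Kenyan goat photo: breastfeeding, infant-sized, but not human.
goat_photo = Photo(nipple_visible=True, breastfeeding=True,
                   recipient_is_human=False, recipient_looks_like_infant=True)
print(allowed(goat_photo))  # prints False under the amended rule
```

The point of the sketch is that each `if` was added reactively, and nothing in the structure prevents the next unanticipated case from requiring yet another clause, which is exactly the treadmill the article describes.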

And so consider me not at all surprised that Facebook is still facing this very same issue. Late last week there were reports in Australia of some (reasonably) outraged people who were angry that Facebook was taking down a series of ads featuring breast cancer survivors.

Facebook has come under fire from outraged breast cancer awareness groups after it banned online advertisements that featured topless survivors, claiming they violated the platform’s nudity policy.

The Breast Cancer Network of Australia (BCNA), in partnership with Bakers Delight, launched its annual Pink Bun Campaign yesterday to raise awareness and money for charity.

As the article notes, the ads showed “10 topless breast cancer survivors holding cupcakes to their chests”. In another article Facebook gives its reasoning, which again reflects much of the history discussed above:

Facebook said it rejected the ads because they did not contain any education about the disease or teach women how to examine their breasts.

It said since the ads were selling a product, they were held to a higher standard than other images because people could not block ads the way they could block content from pages they followed.

So, clearly, over time the rule has evolved so that there’s some sort of amendment saying that there needs to be an educational component if you’re showing breasts related to breast cancer (remember, above, years back, Facebook had already declared that mastectomy photos are okay, and at least some of these ads do show post-mastectomy photos).

The charity in question is furious about this and calls the whole thing “nonsensical,” but it’s actually the opposite of that. It’s totally “sensical” once you understand the history, and the fact that Facebook keeps having to change and adapt these rules, often multiple times a month, to deal with the “new” cases that keep showing up that don’t quite match. And you could (and many do) argue that it’s “obvious” why these ads should be allowed, but that forgets that the company can’t just rely on something being “obvious.” It employs over 10,000 people who are in charge of making these decisions, and what’s obvious to one of them may not be obvious to another. And thus it needs clearly spelled-out rules.

And those rules will never encompass every possible situation, and we’ll continue to see stories like this basically forever. We keep saying that content moderation at scale is impossible to do well, and part of that is because of stories like this. You can’t create rules that work in every case, and there are more edge cases than you can possibly imagine.
