There is no detail at all about what caused these accounts to get suspended. The Guardian just wants us to blindly trust this leftwing propaganda group when they say it was due to political censorship rather than a consistent application of Facebook policies.
Feels a lot more like the reporter already had a problem with Meta and chose the examples most favorable to their anti-Meta slant to report in the article. Of course on HN we're all just too happy to eat it up, as it aligns neatly with our little bubble. Here are some still publicly available posts from Sex Talk Arabic, whom they quote directly in the article complaining about these shadow bans. It makes it a lot harder to trust the reporting here when these examples were so easy to find.
And if you read further than the very first line...
>A message from Meta to the group dated 13 November said its page “does not follow our Community Standards on prescription drugs”, adding: “We know this is disappointing, but we want to keep Facebook safe and welcoming for everyone.”
>“The disabled accounts were correctly removed for violating a variety of our policies including our Human Exploitation policy,” it added.
... which is much more in line with the idea that the actual reason is ideological. And if you scroll all the way to the bottom of the article you'll see that the "nudity" that was banned was not nudity at all. So non-nude, in fact, that the Guardian included the drawing in the article itself.
> The offending post was an artistic depiction of a naked couple, obscured by hearts.
Well, in these wonderful times we cannot exclude the possibility of entire flows being run as just prompts, especially moderation, with an AI boo-boo then having to be rolled back by a human. I do believe that's (much) cheaper than human moderation anyway, so it'll grow (even more).
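For illustration only, here's a minimal sketch of what "moderation as just a prompt" could look like. The prompt wording, the policy labels, and the call_llm stub are all hypothetical; this is not anything Meta has described, just the general shape of such a flow:

    # Hypothetical sketch of a prompt-driven moderation pass.
    # call_llm is a stand-in for whatever model endpoint a real flow would use.
    MODERATION_PROMPT = """You are a content moderator.
    Classify the post below against these policies:
    - adult_nudity
    - prescription_drugs
    - human_exploitation
    Return exactly one label, or 'allow' if none apply.

    Post: {post_text}
    Label:"""

    def call_llm(prompt: str) -> str:
        # Placeholder: hardcoded so the sketch runs standalone.
        return "allow"

    def moderate(post_text: str) -> str:
        # One cheap model call per post; a human only shows up later,
        # on appeal, to roll back whatever the model got wrong.
        label = call_llm(MODERATION_PROMPT.format(post_text=post_text)).strip()
        known = {"adult_nudity", "prescription_drugs", "human_exploitation", "allow"}
        return label if label in known else "needs_review"

    print(moderate("An artistic drawing of a naked couple, obscured by hearts."))

The point of the sketch is the economics: the whole decision is one model call and a string comparison, which is why this kind of flow tends to replace human review rather than supplement it.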
I know it's against HN rules to ask if people have read the article, but you clearly didn't read the article.
The "non-sexual nudity" example is at the bottom of the article. It's a stylized cartoon drawing of a nude man and woman with arms around each others' waists viewed from the back as they walk along a path. There is a heart strategically placed around waist level so you can't even see their whole butts.
It's about the tamest artistic depiction of nudity you can imagine, certainly something that is totally fine anywhere else on Facebook. Very clear that this is a bullshit excuse being used by Meta.
> Fatma Ibrahim, the director of the Sex Talk Arabic, a UK-based platform which offers Arabic-language content on sexual and reproductive health, said that the organisation had received a message almost every week from Meta over the past year saying that its page “didn’t follow the rules” and would not be suggested to other people, based on posts related to sexuality and sexual health.
If you're getting a warning every week for a year, I would like to see the other 51 non-cherry-picked examples that they didn't give to the Guardian. Based on a quick look at some of their posts that are still publicly available, I think Meta is completely justified in restricting the visibility of some of these posts.
Indeed, how do we know they are nude if we can't see any of their parts? I mean, living in SF I've seen people walking around in public like that, wearing the most minimal covering possible.
That's a pretty heavily worked little phrase. What is "non-explicit" nudity? That sounds to me like starting at the violation and then working backward to ensure that the people they want to be violators turn out to be violators.
Covered in the article (I realise, of course, that it is most improper to read those on this website). Stylised drawing of two humans with the naughty bits obscured based on how the picture was framed. However, that seems to have only been one account in any case and is probably not the thing to focus on.
I'm aware of the picture. I was trying to bring focus to the fact that no one has an intuitive definition of "non-explicit nudity". These policies are kept intentionally vague so that anyone can be in violation at any time, for any reason, and the selectiveness with which they're enforced means they can always be used to accomplish authoritarian goals with a fig leaf of non-authoritarian reasons for banning people.
In particular when every American blockbuster and TV show needs more girls with generous breasts and more love scenes than actual plot and acting, yet plainly clothed girls holding hands online is "nudity".
Somewhere along the way we decided that kids can't see boobs until they're 21, but should be fine watching people get murdered.
I don't have the words for it, but it seems like everyone is fine with MASSIVE violence in every piece of media. I feel like I've lost the plot somewhere.
It went off the rails with Game of Thrones. Before that hyperviolence was found mostly in horror movies, and that's fine imo, it's a specific genre. But nowadays it's in so many shows and movies.
Weren't all the American action movies we watched in the 80s and 90s, with Rambo and Schwarzenegger etc., about violence? Most American movies seem to work guns into them somehow too.
American TV has always been violent. It may have been less gory in the past, but there have always been gunfights, fistfights, and violence of many kinds.
US cultural imperialism. Only US culture is right; every other is wrong. This has been going on for 40 years, first with TV and now on the internet with social media.
Ok. Then people can use social media sites that match their ideologies. I'm not saying anything against one ideology or the other. I'm saying that people are allowed to have different ideologies, and they shouldn't be shamed or bullied into changing them because, apparently, Europeans are sexually loose. Meta Platforms Inc. is an American company. Don't like American ideologies? Don't subject yourself to them. Use something else. Decentralized social media is pretty popular these days, and it fits better with the EU's digital privacy laws anyway.
Also, Facebook doesn't need to become a porn site when Porn Hub already has comments.
However, the very first line reveals what the actual reason probably was: "posts showing non-explicit nudity triggering warnings".