this post was submitted on 26 Feb 2025
672 points (98.8% liked)

 

Update: After this article was published, Bluesky restored Kabas' post and told 404 Media the following: "This was a case of our moderators applying the policy for non-consensual AI content strictly. After re-evaluating the newsworthy context, the moderation team is reinstating those posts."

Bluesky deleted a viral, AI-generated protest video in which Donald Trump sucks on Elon Musk's toes because its moderators said it was "non-consensual explicit material." The video was broadcast on televisions inside the offices of the Department of Housing and Urban Development earlier this week, and quickly went viral on Bluesky and Twitter.

Independent journalist Marisa Kabas obtained the video from a government employee and posted it on Bluesky, where it went viral. On Tuesday night, Bluesky moderators deleted the video, saying it was "non-consensual explicit material."

Other Bluesky users said that versions of the video they uploaded were also deleted, though it is still possible to find the video on the platform.

Technically speaking, the AI video of Trump sucking Musk's toes, which had the words "LONG LIVE THE REAL KING" overlaid on it, is a non-consensual AI-generated video, because Trump and Musk did not agree to it. But social media platforms' content moderation policies have always had carve-outs that allow for criticism of powerful people, especially the world's richest man and the literal president of the United States.

For example, we once obtained Facebook's internal rules about sexual content for content moderators, which included broad carve-outs to allow sexual content that criticized public figures and politicians. The First Amendment, which does not apply to social media companies but is relevant considering that Bluesky told Kabas she could not use the platform to "break the law," offers essentially unlimited protection for criticizing public figures in the way this video does.

Content moderation has been one of Bluesky's growing pains over the last few months. The platform has millions of users but only a few dozen employees, meaning that perfect content moderation is impossible and a lot of it necessarily has to be automated. That is going to lead to mistakes. But the video Kabas posted was one of the most popular posts on the platform earlier this week and sparked a national conversation about the protest. Deleting it, whether accidentally or because Bluesky's moderation rules are so strict that they do not allow for this type of reporting on a protest against the President of the United States, is a problem.

top 50 comments
[–] lenz@lemmy.ml 2 points 12 minutes ago (1 children)

I seem to be in the minority here, but I am extremely uncomfortable with the idea of non-consensual AI porn of anyone. Even people I despise. It's so unethical that it just disgusts me. I understand why there are exceptions for those in positions of power, but I'd be more than happy to live in a world where there weren't.

[–] otp@sh.itjust.works 1 points 4 minutes ago

Where do you draw the line for the rich fucks of the world? Realistic CGI? Realistic drawings? Edited photos?

[–] disconnectikacio@lemmy.world 4 points 3 hours ago (1 children)

Bluesky will become just the same as Elon's X...

[–] astral_avocado@lemm.ee 0 points 23 minutes ago

It already is

[–] lmmarsano@lemmynsfw.com 19 points 8 hours ago* (last edited 8 hours ago) (8 children)

Ah, the rewards of moderation: the best move is not to play. Fuck it is & has always been a better answer. Anarchy of the early internet was better than letting some paternalistic authority decide the right images & words to allow us to see, and decentralization isn't a bad idea.

Yet the forward-thinking people of today know better and insist that with their brave, new moderation they'll paternalize better without stopping to acknowledge how horribly broken, arbitrary, & fallible that entire approach is. Instead of learning what we already knew, social media keeps repeating the same dumb mistakes, and people flock to the newest iteration of it.

[–] Natanox@discuss.tchncs.de 2 points 1 hour ago

You clearly never were the victim back in those days. Nor do you realize that this approach doesn't work on the modern web in the slightest, unless you want the foundations of the Enlightenment, and with them science and democracy, to crumble even faster.

Anarchism is never an answer; it's usually willful ignorance of the problems that exist.

I think there's a huge difference between fighting bullying or hate speech against minorities and making fun of very specific, very public people.

[–] fossilesque@mander.xyz 3 points 2 hours ago

Elon acts like a new Reddit mod drunk on power. He's the guy screaming in the comments that he knows how to run a forum better, who then seized the chance, and now he cannot fathom why people hate him.

[–] Clbull@lemmy.world 4 points 3 hours ago (1 children)

I miss the early days of the internet when it was still a wild west.

Something like I hate you myg0t 2 or Pico's School would have gotten the creators cancelled if released in 2025.

[–] dustyData@lemmy.world 2 points 1 hour ago

Note on the term canceling. Independent creators cannot, by definition, get canceled. Unless you are literally under a production or publishing contract that actually gets canceled because of something you said or did, you were not canceled. Being unpopular is not getting canceled, and neither is receiving public outrage for being bad or unpopular. Even in a figurative sense, the mere fact that the videos were published to YouTube and can still be viewed means they were not canceled. They just fell out of the zeitgeist and aren't popular anymore; that happens to 99% of entertainment content.

[–] noli@lemm.ee 8 points 4 hours ago* (last edited 4 hours ago)

You need some kind of moderation for user generated content, even if it’s only to comply with takedowns related to law (and I’m not talking about DMCA).

[–] cley_faye@lemmy.world 7 points 4 hours ago

Fuck it is & has always been a better answer

Sure. Unless you live in a place that has laws and law enforcement. In that case, it's "fuck it and get burnt down".

[–] rottingleaf@lemmy.world 6 points 4 hours ago (1 children)

You do remember the snuff and goatse and CSAM of the early internet, I hope.

Even with that, of course, it was better, because that stuff still floats around anyway, and small groups of enjoyers easily find ways to share it over mainstream platforms.

I'm not even talking about big groups of enjoyers: ISIS (rebranded sometimes), Turkey, Azerbaijan, Israel, Myanmar's regime, cartels, and everyone else share whatever snuff they want, and it stays up long enough.

In text communication their points of view are also less likely to be banned or suppressed than mine.

So yes.

Yet the forward-thinking people of today know better and insist that with their brave, new moderation they’ll paternalize better

They don't really think so; they just use the opportunity to do this in areas where immunity to it is not yet established.

There are very few stupid people in positions of power; competition is a bitch.

[–] CarbonBasedNPU@lemm.ee 3 points 1 hour ago* (last edited 1 hour ago) (1 children)

I'm weirded out when people say they want zero moderation. I really don't want to see any more beheadings or CSAM, and moderation can prevent that.

[–] rottingleaf@lemmy.world -1 points 52 minutes ago (1 children)

Moderation should be optional.

Say a message may carry any number of "moderating authority" verdicts, and a user sets up how those verdicts are interpreted: see only messages vetted by authority A, only by authority B, only by A or B, or all messages not blacklisted by authority A, and plenty of other variants, say, trust authority C unless authority F thinks otherwise, because we trust authority F to know things C is trying to reduce in visibility.
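As a rough sketch of what I mean (every name and verdict value here is made up for illustration, not any real platform's API):

```python
# Hypothetical sketch: messages carry verdicts from independent moderation
# authorities, and each user composes their own rule for interpreting them.
from dataclasses import dataclass, field


@dataclass
class Message:
    text: str
    verdicts: dict[str, str] = field(default_factory=dict)  # authority -> "ok" / "hide"


def vetted_by(m: Message, authority: str) -> bool:
    return m.verdicts.get(authority) == "ok"


def blacklisted_by(m: Message, authority: str) -> bool:
    return m.verdicts.get(authority) == "hide"


# Example user-chosen rules; the service only attaches verdicts, it never deletes.
def rule_a_or_b(m: Message) -> bool:
    # show messages vetted by authority A or authority B
    return vetted_by(m, "A") or vetted_by(m, "B")


def rule_trust_c_unless_f(m: Message) -> bool:
    # "trust authority C unless authority F thinks otherwise"
    return vetted_by(m, "C") and not blacklisted_by(m, "F")


feed = [
    Message("hello", {"A": "ok"}),
    Message("spam", {"A": "hide", "B": "ok"}),
    Message("news", {"C": "ok", "F": "hide"}),
]
print([m.text for m in feed if rule_a_or_b(m)])            # ['hello', 'spam']
print([m.text for m in feed if rule_trust_c_unless_f(m)])  # []
```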

Filtering and censorship are two different tasks. We don't need censorship to avoid seeing CSAM. Filtering is enough.

This fallacy is very easy to encounter: people justify censoring something for everyone by their own unwillingness to encounter it, as if that were not solvable. They also refuse to see that it is technically solvable. Such a "verdict" from a moderation authority, by the way, is no harder to implement than an upvote or a downvote.

For a human, or even a group of humans, it's hard to pre-moderate every post in a given period of time, but that's solvable too: put, yes, an AI classifier in front of the humans and have them check only the uncertain cases (or cases someone complained about, or cases another trusted moderation authority flagged the other way, you get the idea).
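Roughly this kind of triage, sketched with a stubbed-in classifier and made-up thresholds (not a real moderation system):

```python
# Hypothetical triage: an automated classifier handles the clear-cut cases,
# and only uncertain posts go to a human moderator queue.

def classify(text: str) -> float:
    """Placeholder for a trained model; returns P(post violates policy)."""
    return 0.5  # stub value so the example runs


def triage(text: str, low: float = 0.1, high: float = 0.9) -> str:
    score = classify(text)
    if score >= high:
        return "auto-remove"    # confident violation
    if score <= low:
        return "auto-approve"   # confident non-violation
    return "human-review"       # uncertain: queue for a human moderator


print(triage("some post"))  # 'human-review' with the stub score of 0.5
```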

I like this subject; I think it's very important for the Web to have a good future.

[–] CarbonBasedNPU@lemm.ee 3 points 40 minutes ago (1 children)

people justify censoring something for everyone by their own unwillingness to encounter it...

I can't engage in good faith with someone who says this about CSAM.

Filtering and censorship are two different tasks. We don’t need censorship to avoid seeing CSAM. Filtering is enough.

No it is not. People are not tagging their shit properly when it is illegal.

[–] rottingleaf@lemmy.world 0 points 30 minutes ago

I can't engage in good faith

Right, you can't.

If someone posts CSAM, police should get their butts to that someone's place.

No it is not. People are not tagging their shit properly when it is illegal.

What I described doesn't have anything to do with people tagging what they post. It's about users choosing the logic of interpreting moderation decisions. But I've described it very clearly in the previous comment, so please read it or leave the thread.

[–] andros_rex@lemmy.world 15 points 7 hours ago (1 children)

I had to hack an ex’s account once to get the revenge porn they posted of me taken down.

There’s a balance at the end of the day.

[–] Jumpingspiderman@lemmy.world 9 points 7 hours ago (1 children)

Bluesky had better take care not to act like other cowardly tech media.

[–] FauxLiving@lemmy.world 11 points 6 hours ago* (last edited 6 hours ago)

If they don't, it's only because they are waiting to obtain a higher share of the social media market.

Jumping ship from one corporate owned social media to another corporate owned social media isn't a smart move. There is nothing about Bluesky that will prevent it from becoming X in the future. People joining now are only adding to the network effect that will make leaving more difficult in a decade or two.

The problem of social media won't be solved by choosing which dictator's rule you want to live under. You don't have the freedom to speak and express yourself if you give someone veto power over what you write.

[–] thisphuckinguy@lemmy.world 12 points 10 hours ago

Bluesky is BS

[–] b3an@lemmy.world 37 points 13 hours ago

Put it on Facebook! Ol' Zuck decided all the guardrails pretty much needed to go, so post and do whatever. Plus, the people who should see it most are those still hanging around on Facebook 🤣

[–] fluffykittycat@slrpnk.net 10 points 10 hours ago (1 children)

Their moderation has been garbage lately. They're wrongly banning people for things they didn't do. It's just pre-Musk Twitter at this point. The real fediverse is a better bet medium- and long-term.

[–] Ashelyn@lemmy.blahaj.zone 4 points 4 hours ago* (last edited 4 hours ago)

It's just pre-Musk Twitter at this point.

I mean, given that Jack Dorsey founded it as basically the "not Twitter Twitter" after Musk bought the main one, I don't think it's surprising to see it face basically the same moderation issues in the name of being "even-handed."

[–] MolecularCactus1324@lemmy.world 258 points 18 hours ago* (last edited 18 hours ago) (26 children)

I guess I get it. They wouldn't want to set a precedent of allowing non-consensual AI-generated porn on the platform. Seems reasonable. That said, fuck Donny. The video is hilarious. It's fine if Bluesky doesn't host it, though.

[–] GeneralEmergency@lemmy.world 12 points 5 hours ago

Holy shit. A reasonable take from someone who clearly leaves the house.

[–] Imgonnatrythis@sh.itjust.works 36 points 14 hours ago (5 children)

Well, looks like they put it back up. I think I agree with you, though. It might be better for them to restrict this. Frankly, Republican incels excel at generating this kind of content, and this sets the precedent that Bluesky will welcome such AI garbage. I'm not arguing that this stuff shouldn't be made in good spirit, but for a serious platform not to moderate it out invites chaos, I think.
