this post was submitted on 03 Mar 2024
200 points (86.5% liked)


A.I. Is Making the Sexual Exploitation of Girls Even Worse::Parents, schools and our laws need to catch up to technology, fast.

top 50 comments
[–] RainfallSonata@lemmy.world 127 points 8 months ago (2 children)

But they're banning printed books in libraries instead...

[–] Passerby6497@lemmy.world 66 points 8 months ago (1 children)

I've always taken the "protect our children" argument to have an implied "...from the knowledge that will actually protect them from abusers" based on the things the argument is trotted out for.

[–] umbrella@lemmy.ml 13 points 8 months ago

always be wary when they say either "protect the children" or " for your safety"

[–] RamblingPanda@lemmynsfw.com 31 points 8 months ago

Can't exploit kids as easily when they know what exploitation is.

[–] theneverfox@pawb.social 89 points 8 months ago (5 children)

I called this an unpopular opinion before, but maybe it's just an uncomfortable one

This isn't going away. It's in the wild; there's no putting it back in the bottle. Maybe let's take this chance to stop devaluing women because their nudes exist. Men can post nudes with zero consequences - what's the logic here? IDGAF if they're a teacher with an OnlyFans; if everyone can be rendered nude, no one can be.

Let's live in a post nudes world. Next time a woman is about to get fired over nudes, let's say "it's probably ai generated, you're disgusting for suggesting such a thing". Let them do it behind closed doors, or we shame them relentlessly. Anyone sharing nudes without consent should be the target here, who cares if they're generated, shared with trusted partners, or shared publicly for their own reasons.

The people bringing them into an inappropriate setting are the ones doing something wrong. No one should be shamed or feel fear because their nudes are being passed around - they should only feel disgust.

[–] rottingleaf@lemmy.zip 8 points 8 months ago (2 children)

Men can post nudes with zero consequences - what’s the logic here?

I'm not really sure if that'd be the case if I did that

[–] GenderNeutralBro@lemmy.sdf.org 81 points 8 months ago (2 children)

This is a human problem, not an AI problem.

Maybe if we hadn't neglected it for the past century.....

[–] Municipal0379@lemmy.world 27 points 8 months ago

It’s both.

[–] VaultBoyNewVegas@lemmy.world 24 points 8 months ago (4 children)

AI is definitely making things worse. When I was at school there was no tool for creating a deepfake of girls; now boys sneak a pic and use an app to undress them. That then gets shared, girls find out and obviously become distressed. Without AI, boys would have to either sneak into toilets/changing rooms or physically remove girls' clothes. Neither of which was happening to such a large degree in a school before, as it would create a shit show.

Also most jurisdictions don't actually have strict AI laws yet which is making it harder for authorities to deal with. If you genuinely believe that AI isn't at fault here then you're ignorant of what's happening around the world.

https://www.theguardian.com/technology/2024/feb/29/clothoff-deepfake-ai-pornography-app-names-linked-revealed That's an article about one company that provides an app for deepfakes. It's a shell corp, so it's not easy to shut down through the law and arrest people; hundreds of teenage girls have been affected by others creating non-consensual nudes of them.

[–] tsonfeir@lemm.ee 36 points 8 months ago

When I was a kid I used to draw dirty pictures and beat off to them. AI image creation is a paint brush.

I very much disagree with using it to make convincing deepfakes of real people, but I struggle with laws restricting its use otherwise. Are images of ALL crimes illegal, or just the ones people dislike? Murder? I’d call that the worst crime, but we sure do love murder images.

[–] DarkThoughts@fedia.io 20 points 8 months ago (1 children)

AI is definitely making things worse. When I was at school there was no tool for creating a deepfake of girls; now boys sneak a pic and use an app to undress them. That then gets shared, girls find out and obviously become distressed. Without AI, boys would have to either sneak into toilets/changing rooms or physically remove girls' clothes.

I'm sorry but this is bullshit. You could "photoshop" someone's face / head onto someone else's body already before "AI" was a thing. Here's a tutorial that allows you to do this within minutes, seconds if you know what you're doing: https://www.photopea.com/tuts/swap-faces-online/

That's an article about one company that provides an app for deepfakes. It's a shell corp, so it's not easy to shut down through the law and arrest people; hundreds of teenage girls have been affected by others creating non-consensual nudes of them.

Also very ignorant take. You can download Stable Diffusion for free and add a face swapper to that too. Generating decent looking bodies actually might take you longer than just taking a nude photo of someone and using my previous editing method though.

[–] ominouslemon@lemm.ee 8 points 8 months ago (4 children)

You could do everything before, that's true, but you needed knowledge/time/effort, so the phenomenon was very limited. Now that it's easy, the number of victims (if we can call them that) is huge. And that changes things. It's always been wrong. Now it's also a problem

[–] Bleach7297@lemmy.ca 7 points 8 months ago

This is right. To do it before you had to be a bit smart and motivated. That's a smaller cross section of people. Now any nasty fuck with an app on their phone can bully and harass their classmates.

[–] xePBMg9@lemmynsfw.com 8 points 8 months ago* (last edited 8 months ago) (2 children)

Photoshop has existed for quite some time. Take photo, google naked body, paste face on body. The AI-powered bit just makes it slightly easier. I don't want a future where my device is locked down and surveilled to the point I can't install what I want on it. Neither should the common man be excluded from taking advantage of these tools. This is a people problem. Maybe culture needs to change. Limit phone use in schools. Technical solutions will likely only bring worse problems. There are probably no lazy solutions here. This is not one of those problems you can just hand over to some company and tell them to figure it out.

Though I could get behind making it illegal to upload and store someone's likeness unless explicit consent was given. That is long overdue. Though some big companies would not get behind that. So it would be a hard sell. In fact, I would like all personal data be illegal to store, trade and sell.

[–] TheBat@lemmy.world 6 points 8 months ago

Photoshop has existed for quite some time. Take photo, google naked body, paste face on body. The AI-powered bit just makes it slightly easier.

Slightly easier? That's one hell of an understatement. Have you ever used Stable Diffusion?

[–] Mango@lemmy.world 39 points 8 months ago (30 children)

Paywall. Nope. Exploit those girls with your AI then. The messages against it can't be heard without money.

[–] pineapplelover@lemm.ee 30 points 8 months ago (4 children)

As much as I would like to blame AI, Photoshop has been doing this for a long time already. I think the public AI sites already do CSAM scanning and stuff for this, but somebody could still run it locally.

[–] alphacyberranger@lemmy.world 29 points 8 months ago

It's not exactly the fault of AI. It's a human problem. People have been doing this for years now, but AI does make it easier.

[–] lud@lemm.ee 23 points 8 months ago
[–] uriel238@lemmy.blahaj.zone 13 points 8 months ago

Usually for grade schoolers it's "haha you're naked," not "haha you're a slut now."

Some day we'll develop better attitudes regarding nudity and sexuality, but not today, and not in the US.

[–] boatsnhos931@lemmy.world 10 points 8 months ago

Forbidden computer bobs and vagenes

[–] autotldr@lemmings.world 9 points 8 months ago

This is the best summary I could come up with:


But the idea of such young children being dehumanized by their classmates, humiliated and sexualized in one of the places they’re supposed to feel safe, and knowing those images could be indelible and worldwide, turned my stomach.

And while I still think the subject is complicated, and that the research doesn’t always conclude that there are unfavorable mental health effects of social media use on all groups of young people, the increasing reach of artificial intelligence adds a new wrinkle that has the potential to cause all sorts of damage.

So I called Devorah Heitner, the author of “Growing Up in Public: Coming of Age in a Digital World,” to help me step back a bit from my punitive fury.

In the Beverly Hills case, according to NBC News, not only were middle schoolers sexualizing their peers without consent by creating the fakes, they shared the images, which can only compound the pain.

(It should be noted that in the Beverly Hills case, according to NBC News, the superintendent of schools said that the students responsible could face suspension to expulsion, depending on how involved they were in creating and sharing the images.)

I regularly hear from people who say they’re perplexed that young women still feel so disempowered, given the fact that they’re earning the majority of college degrees and doing better than their male counterparts by several metrics.


The original article contains 1,135 words, the summary contains 230 words. Saved 80%. I'm a bot and I'm open source!

[–] jaschen@lemm.ee 5 points 8 months ago (2 children)

This will not be popular, but I welcome this. Once we normalize AI video porn, videos will no longer be trusted, just like we don't trust photoshopped images.

Even if the video is real, anyone can just claim it's AI. Nobody will even need to create their own home videos anymore, because the kink is sorta gone now that anyone can claim it's fake.

[–] excitingburp@lemmy.world 21 points 8 months ago (1 children)

I used to agree with this, but hearing interviews with actual victims changed my mind. This only works in theory.

[–] Grimy@lemmy.world 4 points 8 months ago (1 children)

A very ugly side to the technology, I absolutely think this should be considered on the same level as revenge porn and child pornography.

I also fear these kinds of stories will be used to manipulate the public into thinking banning the tools is in their best interest, instead of punishing the bad actors.

[–] curiousaur@reddthat.com 8 points 8 months ago

Your second point here is exactly what we should fear: that it might become legal only for companies and governments to use AI, but not the masses.

That's why I'm of the opinion that we should maybe just get over it. It's going to continue to become easier and easier to use it for horny reasons. The guy wearing the smart glasses might be seeing every woman around him undressed in real time. We're just a few years away from that, and there is no acceptable way to prevent it.

[–] Jolteon@lemmy.zip 3 points 8 months ago

People are always complaining about how AI art is putting real artists out of business, so I would have thought that the same thing applied here (hopefully resulting in less exploitation). Really sad to see that's not the case.

[–] D_Air1@lemmy.ml 3 points 8 months ago

Don't we just send people caught with such materials to jail then, regardless of whether it was AI generated or not? If it is made to look like them, then it clearly shouldn't be in your possession without their consent.
