this post was submitted on 19 Sep 2023
635 points (98.0% liked)

The police investigation remains open. The photo of one of the minors included a fly, the logo of Clothoff, the application presumably being used to create the images. The app promotes its services with the slogan: β€œUndress anybody with our free service!”

[–] JoBo@feddit.uk 30 points 1 year ago (2 children)

Indeed, once the AI gets good enough, the value of pictures and videos will plummet to zero.

This just isn't true. They will still be used to sexualise people, mostly girls and women, against their consent. It's no different from AI-generated child pornography. It does harm even if no 'real' people appear in the images.

Fucking horrible world we're forced to live in. Where's the fucking exit?

[–] GreatGrapeApe@reddthat.com 13 points 1 year ago (1 children)

It is different from AI-generated CSAM because real people are actually being harmed by these deepfake images.

[–] JoBo@feddit.uk 10 points 1 year ago* (last edited 1 year ago) (2 children)

I was replying to someone who was claiming they aren't harmful as long as everyone knows they're fake. Maybe nitpick them, not me?

Real kids are harmed by AI CSAM, which normalises a problem its users should be seeking help for, not getting off on.

[–] ToyDork@lemmy.ca 2 points 1 year ago (1 children)

How about no. Look up on Wikipedia which countries have ruled that fictional CP is legal while CSAM involving actual children remains just as illegal as it is here. Then check their level of technological development, financial development and political stability. Not seeing a pattern? Then check the number of sexual assaults in those countries compared to countries at similar levels of development. Now check the number of sexual assaults across ALL ages and genders in countries where all pornography is illegal.

Now, I'm not going to make any absolute claims. All I know is that actual psychologists have spoken out about the line between harmless pedophiles (as in, people who would never consider harming a child) and people who would be rapists even if they weren't sexualizing kids, and that 15-year-olds have spent time in prisons meant for adult sexual offenders because some asshole pedo creep with hacking skills used the teenage boy's computer as remote storage for CSAM and the teen was arrested for someone else's crime.

I get why this is a controversial topic. I hate people who harm kids too. Just realize that laws meant to "protect kids" can end up harming kids (not to mention adults who are not guilty of any crime), or can fail to recognize that people hurting kids goes way beyond pedophile sickos, and that non-sexual violence against kids has itself become a serious problem yet goes unacknowledged.

The laws as they are need to start taking a broader view of this. When pastors can get away with this disgusting behavior, when judges can send a completely innocent 10-year-old to juvie for a $500 kickback and not even flinch when the boy commits suicide, or even sentence an autistic boy who killed someone to death instead of accepting the insanity plea, what does that say?

It says we (adults in general) don't really care about children, only about looking like we do.

[–] JoBo@feddit.uk 0 points 1 year ago (1 children)

Not getting beyond your first sentence here. I am not interested in what fucked up laws have been passed. Nor in engaging with someone who wants to argue that any form of child porn is somehow OK.

[–] ToyDork@lemmy.ca -2 points 1 year ago (1 children)

Then go fucking die. People like you are what's wrong with this fucking witch hunt.

[–] ToyDork@lemmy.ca 1 points 1 year ago

You know what? Fuck everyone who downvoted me, and guess what? Tell lemmy.ca's admins to change their fucking password, because I know where you live and I'm going to murder both of you. You are the ones putting actual kids in danger, not me.

[–] GreatGrapeApe@reddthat.com -3 points 1 year ago (2 children)

I'm addressing you because you made the claim that they are equivalent when they clearly are not.

[–] JoBo@feddit.uk 7 points 1 year ago* (last edited 1 year ago) (1 children)

No I didn't. Go nitpick someone else.

Or better still, explain why you think AI-generated CSAM isn't harmful. FFS

[–] SharkEatingBreakfast@sopuli.xyz 8 points 1 year ago* (last edited 1 year ago)

Let's be real here:

Sure, it's not illegal. But if I find "those kinds" of AI-generated images on someone's phone or computer, the fact that it's AI-generated will not improve my view of that person in any possible way.

Even if it's technically "legal".

They tellin' on themselves.

[–] Ataraxia@sh.itjust.works 5 points 1 year ago

People who consume any kind of CP are dangerous, and encouraging that behavior is just as criminal. I'm glad that shit is illegal in most civilized countries.

[–] Seudo@lemmy.world 0 points 1 year ago (1 children)

Sauce that allowing computer generated cp causes more harm?

[–] JoBo@feddit.uk 0 points 1 year ago* (last edited 1 year ago) (1 children)

How is this place infested with so many fucking nonces?

I made no claims about "more harm" so what imaginary claim are you referring to in your attempt to justify CSAM?

[–] Seudo@lemmy.world -4 points 1 year ago

Oh, so you want more harm. Curious.