this post was submitted on 26 Jan 2024
85 points (100.0% liked)

chapotraphouse

[–] KittyBobo@hexbear.net 29 points 9 months ago (4 children)

I mean, without AI they'd just be using bad Photoshop. Heck, if you got someone who was good at Photoshop and could make realistic propaganda, that'd be worse than AI images that can easily be picked apart.

[–] regul@hexbear.net 29 points 9 months ago

Honestly this picture looks more like bad photoshop than AI.

[–] buckykat@hexbear.net 19 points 9 months ago

It looks like the word PRESS is still badly photoshopped in here, probably because the AI image generators still suck at generating text

[–] viva_la_juche@hexbear.net 10 points 9 months ago (1 children)

So what you’re saying is we need to shut down computers until we can figure out what’s going on

[–] JohnBrownNote@hexbear.net 8 points 9 months ago

no you can doctor physical photographs too. we need to smash cameras for stealing our souls and ban all visual arts

[–] gramathy@lemmy.ml 1 points 9 months ago

I wonder if the forensic techniques used to identify photoshopped images/altered audio work on AI-generated media?

I know you can timestamp audio to a specific point in time by matching the frequency of the background electrical hum, if it's available. So if the hum should be there but isn't, that could indicate a video or audio clip is fake. And differences in image grain/quality can identify patchwork images. But do those kinds of artifacts also show up in AI-generated images?
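For the hum-matching idea (usually called electrical network frequency, or ENF, analysis), a minimal sketch of the extraction step could look like the following. This is only an illustration of the general approach, assuming a mono WAV clip, a 50 Hz grid, and scipy/numpy; the file name `clip.wav` and the window sizes are made up for the example, not taken from any real forensic tool.

```python
# Minimal ENF-extraction sketch: isolate the mains hum around a nominal
# frequency and track its strongest spectral bin over time.
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, sosfiltfilt, stft

MAINS_HZ = 50.0                       # assumption: 50 Hz grid; use 60.0 in North America
BAND = (MAINS_HZ - 1.0, MAINS_HZ + 1.0)

def enf_track(path: str) -> np.ndarray:
    """Return an estimated mains-hum frequency for each analysis window."""
    rate, audio = wavfile.read(path)
    if audio.ndim > 1:                # mix stereo down to mono
        audio = audio.mean(axis=1)
    audio = audio.astype(np.float64)

    # Narrow band-pass around the nominal mains frequency to isolate the hum.
    sos = butter(4, BAND, btype="bandpass", fs=rate, output="sos")
    hum = sosfiltfilt(sos, audio)

    # Long STFT windows (8 s here, an arbitrary choice) give fine frequency resolution.
    freqs, _, spec = stft(hum, fs=rate, nperseg=rate * 8, noverlap=rate * 4)
    mag = np.abs(spec)

    # Within the pass band, take the strongest bin in each window as the ENF estimate.
    mask = (freqs >= BAND[0]) & (freqs <= BAND[1])
    band_freqs = freqs[mask]
    return band_freqs[np.argmax(mag[mask, :], axis=0)]

if __name__ == "__main__":
    track = enf_track("clip.wav")     # hypothetical file name
    print(track[:10])                 # would be compared against a grid ENF log
```

In practice the extracted track gets compared against logged grid-frequency data for the claimed recording time; a clip with no recoverable hum when one should be present, or a hum that doesn't match any grid log, is exactly the suspicious case described above. Whether a purely AI-generated clip would contain a plausible hum at all is the open question in the comment.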