this post was submitted on 26 Dec 2023
111 points (81.4% liked)

Study shows AI image-generators being trained on explicit photos of children::Hidden inside the foundation of popular artificial intelligence image-generators are thousands of images of child sexual abuse, according to a new report that urges companies to take action to address a harmful flaw in the technology they built

[–] Fungah@lemmy.world 8 points 10 months ago* (last edited 10 months ago)

Yeah. You got it.

And sure, you can hop on Civitai and download a model, a LoRA, a LyCORIS, a ComfyUI workflow, or whatever. But we're only at the beginning stages of AI.

Like, face swapping is mainly done with the inswapper ONNX model, which was pulled by InsightFace after it started making the rounds in the face-swapping application Roop. It's all well and good for hobbyist face-swapping image gen, but it only works at 128×128 resolution, which is low res. It kind of makes a blurry mess on larger images. InsightFace has higher-resolution models available, but they're not public, and to my knowledge there aren't any viable alternatives to this model that can match the same speed and accuracy. So everyone is out here playing with sticks and rocks while those who can pay have shiny new things.
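A toy, pure-Python sketch (not the actual inswapper pipeline) of why a model locked to a small fixed resolution blurs larger outputs: any detail finer than the model's internal grid is averaged away on the way in and can't be recovered on the way out. Here a 512-sample "row of face pixels" is round-tripped through a 128-sample bottleneck:

```python
def downsample(signal, target_len):
    """Average adjacent samples down to target_len (box filter)."""
    factor = len(signal) // target_len
    return [
        sum(signal[i * factor:(i + 1) * factor]) / factor
        for i in range(target_len)
    ]

def upsample(signal, target_len):
    """Nearest-neighbour stretch back up to target_len."""
    factor = target_len // len(signal)
    return [s for s in signal for _ in range(factor)]

# A 512-sample row with fine alternating detail (0, 1, 0, 1, ...).
row = [float(i % 2) for i in range(512)]

# Round-trip through a 128-sample bottleneck, as a fixed 128-res model forces.
restored = upsample(downsample(row, 128), 512)

# The alternating detail is gone: every restored sample is the flat 0.5 average.
loss = sum(abs(a - b) for a, b in zip(row, restored)) / len(row)
print(f"mean per-sample error: {loss:.2f}")
```

The same logic applies in 2-D: pasting a 128×128 result back into a 1024-pixel-wide face leaves a visibly soft patch, which is the "blurry mess" described above.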

There are very valid concerns about the harmful potential of deepfakes, and I can understand how the model's creator didn't want to take responsibility for enabling that. But if, say, Google wanted to use that or a similar closed-source in-house model to deepfake CSAM for propaganda purposes, or to do the same to political leaders or celebrities, not only would the public not have access to those models to understand how it's being done and identify artifacts of that process, they would lack the ability to "fight back" in any meaningful way.

To be clear, I don't think the above is happening, or is inevitably going to happen, but it highlights the asymmetric nature of AI that big tech wants. It doesn't even have to be such high stakes. If you wanted to, say, swap your son's face onto Luke Skywalker in a Star Wars movie as a Christmas present or something, that would be challenging to do locally and convincingly without the right model. Without access to that model, you could instead be forced to pay an absurdly high price to a private company, or be denied entirely due to fear of copyright infringement, even though I'm relatively certain that doing that and not releasing it publicly falls purely in the realm of fair use.

And then there's text, speech, and audio generation. What happens if the tech gets good enough that someone can spend a few hours setting up parameters for some pop songs with vocals, hit go, and generate music as consistently appealing as what we hear on the radio? And when no one else can access that tech? They're able to pay artists nothing and basically produce for free the content we'd have to pay for. If the public had access to that same tech, then artists would still have a role in making popular music, even if the landscape had shifted totally. Either way the music business as we know it dies, but there's one option where creative people can still make money independently without getting on big tech's dick to do so.

It's a complicated issue and the ethics of it are fraught no matter where you look, but take one look at how cynically terrible all of Google's products are getting and I think it's painfully obvious we can't trust them and their ilk with sole access to this kind of tech.