this post was submitted on 01 Aug 2023
528 points (82.5% liked)

Technology


An Asian MIT student asked AI to turn an image of her into a professional headshot. It made her white with lighter skin and blue eyes.

Rona Wang, a 24-year-old MIT student, was experimenting with the AI image creator Playground AI to create a professional LinkedIn photo.

(page 2) 50 comments
[–] Mereo@lemmy.ca 6 points 1 year ago

It reminds me of Google back in the day (probably early 2010s). If you searched for White Women, it returned professional and respectable images. But if you searched for Black Women, it returned explicit images.

Machine learning algorithms are like sponges and learn from existing social biases.
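That sponge effect can be sketched in a few lines. Everything below is hypothetical toy data, invented just to show the mechanism: a "model" that does nothing more than predict the most frequent label in its training set will faithfully reproduce whatever imbalance that set contains.

```python
from collections import Counter

# Hypothetical, deliberately skewed training pairs standing in for a
# web-scraped dataset: (caption, demographic tag).
corpus = [
    ("professional headshot", "white"),
    ("professional headshot", "white"),
    ("professional headshot", "white"),
    ("professional headshot", "asian"),
]

# The simplest possible "model": predict the most frequent tag seen
# with a given caption. It absorbs the skew in the data verbatim.
counts = Counter(tag for caption, tag in corpus if caption == "professional headshot")
prediction = counts.most_common(1)[0][0]
print(prediction)  # the majority tag from the skewed corpus
```

Nothing in the code is "biased"; the prediction simply inherits whatever distribution the data had.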

[–] EmotionalMango22@lemmy.world 5 points 1 year ago

So? There are white people in the world. Ten bucks says she tuned it to make her look white for the clicks. I've seen this in person several times at my local college. People die for attention, and shit like this is an easy-in.

[–] LEDZeppelin@lemmy.world 5 points 1 year ago* (last edited 1 year ago)

That's funny!

[–] mo_lave@reddthat.com 3 points 1 year ago* (last edited 1 year ago)

Like some have already said here: it's a commentary on what Anglo-centric societies view as "professional" at the time the model is trained. Why Anglo-centric? Because the US is the center of internet activity.

[–] Vlhacs@reddthat.com 3 points 1 year ago (1 children)

I wouldn't say "closer to Caucasian". She straight-up turned white.

[–] camillaSinensis@reddthat.com 3 points 1 year ago

Disappointing but not surprising. The world is full of racial bias, and people don't do a good job at all addressing this in their training data. If bias is what you're showing the model, that's exactly what it'll learn, too.

[–] funkajunk@lemm.ee 2 points 1 year ago (2 children)

Sigh...

It's not racial bias; it's working from a limited dataset and from what it understands a "professional headshot" even to be.

Seems like some ragebait to me.

[–] mean_bean279@lemmy.world 5 points 1 year ago

While I agree with the dataset point, I don’t believe this to be rage bait. It’s just pointing out exactly what something did. That said, AI isn’t meant for taking a picture and asking it to do something more with it, at least for now. And when a model gets a tag like “professional headshot” as English input in America, it will most likely pull from data built around Hollywood types, which will be a completely lopsided amount of blondes with blue eyes. It’s really important to me, though, that people read articles like this one and understand how we end up with some “news” outlets saying things like “someone said computers are racist” without understanding context such as this. Outputs are only as good as the inputs.
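A minimal sketch of that last line, "outputs are only as good as the inputs" (the feature labels and the 90/10 split here are invented for illustration): a generator sampling from a lopsided pool mirrors the skew, while the same generator over a rebalanced pool does not.

```python
import random

random.seed(0)  # reproducible toy run

# Hypothetical training pool for "professional headshot": 90% of the
# examples carry one look, 10% the other -- a lopsided input.
pool = ["blonde_blue_eyes"] * 90 + ["dark_hair_dark_eyes"] * 10

# A "generator" that samples from the raw pool mirrors the imbalance.
raw = [random.choice(pool) for _ in range(1000)]
raw_share = raw.count("blonde_blue_eyes") / len(raw)

# The same generator over a rebalanced pool produces balanced outputs.
balanced = ["blonde_blue_eyes"] * 50 + ["dark_hair_dark_eyes"] * 50
bal = [random.choice(balanced) for _ in range(1000)]
bal_share = bal.count("blonde_blue_eyes") / len(bal)

print(raw_share, bal_share)  # roughly 0.9 vs roughly 0.5
```

The sampler itself is identical in both runs; only the curation of the pool changes the output distribution.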
