It's well-known that these algorithms push topics to drive engagement, and content that makes people sufficiently angry, frightened, or disgusted is naturally more likely to be engaged with, regardless of the topic.
When outrage is the prime driver of engagement, it's going to push some people right off the platform entirely, and the ones who stay are psychologically worse off for it.
Social media execs: "We've done the math and it's worth it."
Worth it for them, for short-term profits. Good thing nobody is considering the net effect this has on society or political discourse.
They could certainly do with a control group or three. The point they're trying to make is that over 5 days of watching recommended videos, the proportion that were misogynistic grew from 13% on day 1 to 52% on day 5. That suggests a disproportionate algorithmic boost, but it's hard to tell how much of it was caused by the videos they chose to view.
A real-world trial ought to be possible: you could recruit thousands of kids to just do their own thing and report back. It's a very hard question to study in the lab because lab conditions are nothing like the real world.