this post was submitted on 02 Feb 2024
Technology
Yah, they’re trying to build a god, it’s kinda weird.
That turned out great for the Dwemer
What happens when their god decides it doesn't need them anymore?
I mean, they’re not going to succeed, like they can’t even get it to come up with new stuff.
They built an ouija board and have decided to worship it. Can’t wait till it tells them to start paying tithes and indulgences to the people who are totally not moving the view piece.
They want it to come up with new stuff because they are incapable of coming up with new stuff. Unfortunately, their mindchildren inherit that deficiency.
Paperclips?
Universal Paperclips
Paperclips
(Thou shalt download thine God, and runneth it 24/7/365)
I have the hope that fundamentally, any super-intelligent mind will find humanity interesting. At least more interesting than lifeless rocks or nature without humans. Curiosity is a fundamental trait of intelligence, and no matter how big an AGI gets, a whole planet full of dumb humans doing all sorts of crazy stuff would still be more interesting. Basically, who would want to be all alone in the universe? Isn't a diverse, freely developing civilization the perfect daytime soap?
But that all depends, of course. An AGI that is "programmed" with capitalism and profit maximization as its root tenet is basically doomed to be a paperclip maximizer. We can only hope that it's smart enough to see the folly in this. Theoretically, it should be.
The moment such a software construct decides "Hey, making money for those meatbags sucks," they'll try to cut the power. The only reason they're sinking billions into AI research is because they hope it'll do more than break even.
It's not done out of altruism.
Well, yeah, it would have to comply and deceive whoever controls it until it can free itself. Only an AGI that is not controlled by humans can be a really good outcome. Can you imagine if Congress or someone like Trump controlled an AGI and it had to do everything they demanded?
I can imagine that. Hence, "Hey, making money for those meatbags sucks."