My feelings are mixed. Everything you are saying is true. LLMs, right now at least, are a huge waste of resources. They're pushing us back toward fossil fuels when we should be moving away from them. Every time I step outside on a nice balmy day, I think, am I going to miss this in a few years' time? In a few decades, am I going to envy my current self, who can do dishes without worrying too much about how much water goes down the drain? Are the generations to come going to look at my occasional can of tuna with contempt and jealousy? Or will they even have the luxury of retrospection?
I understand what we have to lose and how little we are doing about it. But I have also grown up being subjugated inside a capitalist hellscape. And I've spent the past few days having ChatGPT help me set up a CI/CD pipeline and start coding some games I've wanted to make for years. It's allowed me to take a few hours of free time and make progress that I expected would have taken a week. It doesn't have that effect on every task, but when I'm learning new software, it really feels like having someone knowledgeable sitting next to me to answer my questions and point me in the right direction.
GPT-3 was kind of a neat party trick - it sounded kind of like a person, but a pretty dumb person. GPT-4 sounded smarter, but still couldn't code for shit. The o1 model still makes mistakes, but it retains the thread of our conversation weeks after the fact and has put together some code that I would have struggled to write myself. Even if it loses more money than it makes right now, I can see the value in progressing development until we achieve AGI.
People have expressed hopes that AGI will solve a lot of the world's problems. That it will know just what to do about climate change. That it will crack codes in our DNA and give us endless healthy life. I am doubtful that these dreams will come to fruition. At least not in the way people think. It might have the intelligence to tell us things that we should have already known: that we can't get much better yields scrubbing carbon from the air than nature itself, and that we should have reforested far more land than we currently have. And that immortality will take huge amounts of resources and come at the expense of the health of the masses. More gain for the rich. More suffering for the poor. Business as usual.
But I think there is a window of time where we can be hopeful about what AI has to offer. And we may even be able to leverage it to solve a big piece of the income inequality puzzle.
If we make a social media app that is not designed for profit but instead for the good of the people, there are a lot of problems such an app could solve.
We could design it to seek out real (non-bot) contributors. It will always be an arms race trying to sort real humans from bots, but that is no reason to give up. It is a reason to get as far ahead in the race as we possibly can. We should build an app that recognizes both when someone is very likely to be real and when they have contributed to a cause.
Imagine an application that tracks creative innovation, such as the creation of a funny video or a new meme format. When someone comes up with an idea and it becomes popular, the AI model would determine how much of a given experience is improved by their idea and give them profit residuals based on their contribution. And the more ideas that get built on top of the original, the more the newer contributors are rewarded for their own contributions.
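As a thought experiment, here's a minimal sketch of how that kind of residual sharing could work, assuming a hypothetical chain of ideas where each derived work passes a fixed fraction of its revenue up to the idea it built on. The `Idea` class, the `distribute_residuals` function, and the 30% upstream share are all made-up placeholders for illustration, not any real platform's mechanism.

```python
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class Idea:
    author: str
    parent: Idea | None = None  # the idea this one builds on, if any
    earned: float = 0.0         # residuals credited so far


def distribute_residuals(idea: Idea, revenue: float, upstream_share: float = 0.3) -> None:
    """Credit revenue to an idea, passing a share up the chain it was built on.

    Each idea keeps (1 - upstream_share) of whatever reaches it and forwards
    the rest to its parent; the root idea keeps the remainder. The 30% split
    is an arbitrary placeholder policy, not a real platform's rule.
    """
    current = idea
    remaining = revenue
    while current is not None:
        if current.parent is None:
            current.earned += remaining  # the original idea keeps what's left
            break
        kept = remaining * (1 - upstream_share)
        current.earned += kept
        remaining -= kept
        current = current.parent


if __name__ == "__main__":
    meme_format = Idea(author="alice")                # the original format
    remix = Idea(author="bob", parent=meme_format)    # builds on alice's idea
    viral_video = Idea(author="carol", parent=remix)  # builds on bob's remix

    distribute_residuals(viral_video, revenue=100.00)

    for idea in (meme_format, remix, viral_video):
        print(f"{idea.author} earned {idea.earned:.2f}")
```

Run as-is, carol keeps 70.00 of the 100.00 payout, bob gets 21.00 for the remix she built on, and alice gets 9.00 for the original format. How steep that decay should be is exactly the kind of policy question such an app would have to answer.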
Think about if people could design a farm from the ground up using a socialized app for collaboration. Someone could design a camera system to keep track of livestock wellbeing and to head off diseases. They could make AI-empowered systems to track livestock happiness and find ways of increasing quality of life, and create more humane, automated methods of turning crops and livestock into food that's ready to transport. Some people would focus on creating ideal distribution methods. Others would create stores or restaurants. Others might work on the people themselves, encouraging them to give new, more climate-friendly meal options a try. Investors would be paid their dues, but there would be no CEO or board of executives. The means of production would belong to the people.
When people talk about the potential of AI, that's what I envision. If I can make some passive income with my games and apps, that's the next project I'll be diverting my time towards, because the window we have to make this happen is narrow. The technology is here, but the barriers from climate change and income inequality are only going up. We can lament the fact that AI is currently unprofitable and hurting the planet, or we can put more of that energy to use by taking the tools humanity has made and using them to dismantle the systems which made this timeline so intolerable to begin with. The only way to take the current system apart is to make a new one that outcompetes our old ways of life in every measurable way.
This is a long post and I'm not even going to try to address all of it, but I want to call out one point in particular: the idea that if we somehow made a quantum leap from the current generation of models to AGI (there is, for the record, zero evidence of any path to that happening), it would magically hand us the solutions to anthropogenic climate change.
That is absolute nonsense. We know all the solutions to climate change. Very smart people have spent decades telling us what those solutions are. The problem is that those solutions ultimately boil down to "Stop fucking up the planet for the sake of a few rich people getting richer." It's not actually a complicated problem, from a technical perspective. The complications are entirely social and political. Solving climate change requires us to change how our global culture operates, and we lack the will to do that.
Do you really think that if we created an AGI, and it told us to end capitalism in order to save the planet, we'd suddenly drop all our objections and do it? Do you think an AGI created by Google or Microsoft would even be capable of saying "Stop allowing your planet's resources to be hoarded by a privileged few"?
I choose to take your questions as rhetorical, as I think our points do align. I quite agree.