Guy who buys programmers and sells AI thinks he can sell more AI and stop buying programmers.
This is up there with Uber pretending self driving cars will make them rich.
I mean... self driving cars probably will. Just not as soon as they think. My guess, at least another decade.
Not until a self-driving car can safely handle all manner of edge cases thrown at it, and I don't see that happening any time soon. The cars would need to be able to recognize situations that may not be explicitly programmed into them and figure out a safe way to deal with them.
As someone said on this thread: as soon as they can convince legislators, even if they are murder machines, capital will go for it.
Borrowing from my favorite movie: "it's just a glitch".
There will be a massive building in, like, India with many thousands of atrociously paid workers donning VR goggles, who spend their long hours constantly Quantum Leaping into traumatizing last-second emergency situations that the AI gives up on. Instantly they slam on the brakes as hard as they can. They drink tea. There's suicide netting everywhere. They were the lowest bidder this quarter.
I hope this helps people understand that you don't get to be CEO by being smart or working hard. It's all influence and gossip all the way up.
In fact, being stupid is probably a benefit.
Yep, if I had that kind of money and was surrounded by like-minded people, I'd agree. Unfortunately I'm cursed with a rational mind 🙃🙃🙃
"Coding" was never the source of value, and people shouldn’t get overly attached to it. Problem solving is the core skill. The discipline and precision demanded by traditional programming will remain valuable transferable attributes, but they won’t be a barrier to entry. - John Carmack
This right here.
The problem is not coding. Anybody can learn that with a couple of well-focused courses.
I'd love to see an AI find the cause of a catastrophic crash of a machine that isn't caused by a software bug.
Catching up on what Carmack's been up to for the last decade has revived the fan in me. I love that 2 years after leaving Oculus to focus on AGI, this is all the hype he's willing to put out there.
"Guy who was fed a pay-to-win degree at a nepotism practicing school with a silver spoon shares fantasy, to his fan base that own large publications, about replacing hard working and intelligent employees with machines he is unable to comprehend the most basic features of"
They've been saying this kind of bullshit since the early 90s. Employers hate programmers because they are expensive employees with ideas of their own. The half-dozen elite lizard people running the world really don't like that kind of thing.
Unfortunately, I don't think any job is truly safe forever. For myriad reasons. Of course there will always be a need for programmers, engineers, designers, testers, and many other human-performed jobs. However, that will be a rapidly changing landscape and the number of positions will be reduced as much as the owning class can get away with. We currently have large teams of people creating digital content, websites, apps, etc. Those teams will get smaller and smaller as AI can do more and more of the tedious / repetitive / well-solved stuff.
It's worth noting that the new CEO is one of the few people at Amazon to have worked their way up from PM and sales to CEO.
With that in mind, while it's a hilariously stupid comment to make, he's in the business of selling AWS and its role in AI. Take it with the same level of credibility as that crypto scammer you know telling you that Bitcoin is the future of banking.
PM and sales, eh?
So you're saying his lack of respect for programmers isn't new, but has spanned his whole career?
Lol sure, and AI made human staff at grocery stores a thing of the....oops, oh yeah....y'all tried that for a while and it failed horribly....
So tired of the bullshit "AI" hype train. I can't wait for the market to crash hard once everybody realizes it's a bubble and AI won't magically make programmers obsolete.
Remember when everything was using machine learning and blockchain technology? Pepperidge Farm remembers...
Spoken like someone who manages programmers instead of working as one.
Just the other day, the Mixtral chatbot insisted that PostgreSQL v16 doesn't exist.
A few weeks ago, ChatGPT gave me a DAX measure for an Excel pivot table that used several DAX functions in ways they could not be used.
The funny thing was, it knew and could explain why those functions couldn't be used when I corrected it. But it wasn't able to correlate and use that information to generate a proper function. In fact, I had to correct it for the same mistakes multiple times and it never did get it quite right.
Generative AI is very good at confidently spitting out inaccurate information in ways that make it sound like it knows what it's talking about to the average person.
Basically, AI is currently functioning at the same level as the average tech CEO.
The job of CEO seems far easier to replace with AI. A fairly basic algorithm with weighted goals and parameters (chosen by the board) + an LLM + a character avatar would probably perform better than most CEOs. Leave out the LLM if you want it to spout nonsense like this Amazon Cloud CEO.
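Purely for fun, here's a minimal sketch of the "weighted goals and parameters" decision-maker described above, minus the LLM and the avatar. Every goal name, weight, and option here is invented for illustration; a real board would presumably pick its own.

```python
# Tongue-in-cheek sketch of a "weighted goals" CEO bot.
# Goal names, weights, and candidate decisions are all made up for illustration.
from dataclasses import dataclass

@dataclass
class Option:
    name: str
    scores: dict[str, float]  # goal name -> how well this option serves it (0..1)

# Weights chosen by the board.
GOAL_WEIGHTS = {
    "quarterly_revenue": 0.6,
    "headcount_reduction": 0.3,
    "long_term_health": 0.1,
}

def ceo_decide(options: list[Option]) -> Option:
    """Pick whichever option maximizes the board's weighted objective."""
    return max(
        options,
        key=lambda o: sum(w * o.scores.get(goal, 0.0) for goal, w in GOAL_WEIGHTS.items()),
    )

if __name__ == "__main__":
    choice = ceo_decide([
        Option("replace programmers with AI",
               {"quarterly_revenue": 0.9, "headcount_reduction": 1.0, "long_term_health": 0.1}),
        Option("keep investing in engineering",
               {"quarterly_revenue": 0.4, "headcount_reduction": 0.0, "long_term_health": 0.9}),
    ])
    print(f"Decision: {choice.name}")
```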
I just want to remind everyone that capital won't wait until AI is "as good as" humans, only until it's minimally viable.
They didn't wait for self-checkout to be as good as a cashier; they didn't wait for chatbots to be as good as human support; and they won't wait for AI to be as good as programmers.
And then we should all charge outrageous hourly rates to fix the AI generated code.
They won't, and they'll suffer because of it and want to immediately hire back programmers (who can actually do problem solving for difficult issues). We've already seen this happen with customer service reps - some companies have resumed hiring customer service reps because they realized AI isn't able to do their jobs.
It's really funny how AI "will perform X job in the near future," but you barely, if ever, see articles saying that AI will replace CEOs in the near future.
'Soon' is a questionable claim from a CEO who sells AI services and GPU instances. A single faulty update caused worldwide downtime recently. Now imagine all infrastructure being written with today's LLMs - which sometimes hallucinate so badly that they claim the 'C' in CRC-32C stands for 'Cool'.
I wish we could also add a "Do not hallucinate" prompt to some CEOs.
Meanwhile, LLMs are less useful at helping me write code than IntelliJ was a decade ago.
Extremely misleading title. He didn't say programmers would be a thing of the past, he said they'll be doing higher level design and not writing code.
Even so, he's wrong. This is the kind of stupid thing someone without any first-hand programming experience would say.
We are now X+14 months away from AI replacing your job in X months.
AI is terrible at solving real problems through programming. As soon as the problem is not technical in nature and needs a decision made based on experience, it falls flat on its face.
Can AI do proper debugging and troubleshooting? That's when I'll start to get worried
Well, that would be the 3rd or 4th thing during my career that was supposed to make my job a thing of the past or at least severely reduce the need for it.
(If I remember correctly, OO design was supposed to reduce the need for programmers, as were various languages; then there was outsourcing, visual programming, and, on the server side, I vaguely remember various frameworks being hailed as reducing the need for programmers because people would just be able to wire modules together with config or some shit like that. Additionally, many libraries and frameworks out there aim to reduce the need for coding.)
All of them, even outsourcing, have made my skills even more in demand. Even when they did reduce the amount of programming needed without actually increasing it elsewhere (a requirement where most already failed), the market for software responded by expecting the software to do more things, in fancier ways, and with data from more places, effectively wiping out the coding time savings and then some.
Granted, junior developers sometimes did suffer because of those things, but anything more complicated than monkey-coder tasks has never been successfully replaced, fully outsourced or the need for it removed, at least not without either the needs popping up somewhere else or the expected feature set of software increasing to take up the slack.
In fact, I expect that in a decade or so AI, like outsourcing before it, will have really screwed the market for senior software engineers from the point of view of employers (but made it a golden age for employees with those skills) by removing the first part of the career path that leads to that level of experience, and this time around they won't even be able to import the guys and gals in India who got to learn the job because the junior positions were outsourced there.
I'd believe AI will replace human programmers when I can tell it, in a single prompt, to produce the code for an entire video game that stands up to the likes of New Vegas, has zero bugs, and offers hundreds of hours of content on a first playthrough thanks to vast exploration.
In other words, I doubt we'll see human programmers going anywhere any time soon.
Edit:
Reading other replies made me remember how I once, for fun, tried using a jailbroken Copilot to do Python stuff slightly above my already basic coding skill, and it gave me code that tried importing something that absolutely doesn't exist. I don't remember what it was called since I deleted the file while cleaning up my laptop the other day, but I sure as hell looked it up before deleting it and found nothing.
I taught myself Python in part by using ChatGPT. Which is to say, I coaxed it through the process of building my first app, while studying from various resources, and using the process of correcting its many mistakes as a way of guiding my studies. And I was only able to do this because I already had a decent grasp of many of the basics of coding. It was honestly an interesting learning approach; looking at bad code and figuring out why it's bad really helps you to get those little "Aha" moments that make programming fun. But at the end of the day it only serves as a learning tool because it's an engine for generating incompetent results.
ChatGPT, as a tool for creating software, absolutely sucks. It produces garbage code, and when it fails to produce something usable you need a strong understanding of what it's doing to figure out where it went wrong. An experienced Python dev could have built in a day what took me and ChatGPT a couple of weeks. My excuse is that I was learning Python from scratch, and had never used an object oriented language before. It has no excuse.
Uh huh.
But you have to describe what you want. If only we had universal languages to do that... Oh yeah, it's code.
I've seen what Amazon produces internally for software, I think the LLMs could probably do a better job.
That guy has never seen AI code before. It regularly gets even simple stuff wrong. What's especially good is when it gives you made-up crap: it tells you about a method or function you can use but doesn't tell you where it got it. And then you're like "oh wow, I didn't realize that was available," and then you try it and realize it's not part of the standard library, and you ask it "where did you get that?" and it's like "oh yeah, sorry about that, I don't know."
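A minimal sketch (in Python, just as an illustration) of the sanity check that saves you from that conversation: before trusting a function an assistant recommends, verify the module and attribute actually exist. `weighted_mode` below is a made-up example of a hallucinated name.

```python
import importlib

def suggestion_exists(module_name: str, attr_name: str) -> bool:
    """Return True only if `attr_name` is really exposed by `module_name`."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        return False  # the module itself was made up
    return hasattr(module, attr_name)

# "statistics.median" is real; "statistics.weighted_mode" is a hypothetical
# hallucination of the kind described above.
print(suggestion_exists("statistics", "median"))         # True
print(suggestion_exists("statistics", "weighted_mode"))  # False
```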
Current AI is good at compressing knowledge.
Best job role: information assistant or virtual secretary.
20 years ago at a trade show, a new module-based visual coding tool was introduced in my field, which claimed "You'll never need another programmer."
Oddly enough, I still have a job.
The tools have gotten better, but I still write code every day because procedural programming is still the best way to do things.
It is just now reaching the point that we can do some small to medium scale projects with plug and play systems, but only with very specific equipment and configurations.
The first thing AI is gonna replace is the CEO. Dumb-ass job; a McDonald's employee requires more expertise.
For, like, a couple years, sure. Then there will be a huge push to fix all the weird shit generated by AI.