this post was submitted on 15 Sep 2024
894 points (98.1% liked)

Technology

60123 readers
3727 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; to ask if your bot can be added, please contact us.
  9. Check for duplicates before posting; duplicates may be removed.

Approved Bots


founded 2 years ago
[–] Telorand@reddthat.com 274 points 3 months ago (11 children)

Wow, the text generator that doesn't actually understand what it's "writing" is making mistakes? Who could have seen that coming?

I once asked one to write a basic 50-line Python program (just to flesh things out), and it made numerous basic errors that any first-year CS student could catch. Nobody should trust LLMs with anything related to security, FFS.
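For illustration, here's a hypothetical (not from the thread) example of the kind of beginner-level Python mistake these models routinely emit: the mutable default argument, which silently shares state across calls.

```python
# Hypothetical example of a classic LLM-style bug: a mutable default argument.

def append_item_buggy(item, items=[]):  # bug: the list is created once, at definition time
    items.append(item)
    return items

def append_item_fixed(item, items=None):  # idiomatic fix: default to None
    if items is None:
        items = []
    items.append(item)
    return items

# The buggy version leaks state between unrelated calls:
print(append_item_buggy(1))  # [1]
print(append_item_buggy(2))  # [1, 2] -- surprise!
print(append_item_fixed(1))  # [1]
print(append_item_fixed(2))  # [2]
```

The buggy version often looks fine in a quick read and even passes a single-call test, which is exactly why this class of error slips through.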

[–] skillissuer@discuss.tchncs.de 112 points 3 months ago* (last edited 3 months ago) (4 children)

Nobody should trust LLMs with anything

ftfy

also any inputs are probably scrapped and used for training, and none of these people get GDPR

[–] mox@lemmy.sdf.org 18 points 3 months ago* (last edited 3 months ago)

also any inputs are probably scraped

ftfy

Let's hope it's the bad outputs that are scrapped. <3

[–] SketchySeaBeast@lemmy.ca 92 points 3 months ago (14 children)

I wish we could say the students will figure it out, but I've had interns ask for help and then I've watched them try to solve problems by repeatedly asking ChatGPT. It's the scariest thing - "Ok, let's try to think about this problem for a moment before we - ok, you're asking ChatGPT to think for a moment. FFS."

[–] USSEthernet@startrek.website 27 points 3 months ago (2 children)

Critical thinking is not being taught anymore.

[–] djsaskdja@reddthat.com 20 points 3 months ago (4 children)

Has critical thinking ever been taught? Feel like it’s just something you have or you don’t.

[–] Sauerkraut@discuss.tchncs.de 20 points 3 months ago (2 children)

Critical thinking is essentially learning to ask good questions and also caring enough to follow the threads you find.

For example, if mental health is to blame for school shootings then what is causing the mental health crisis and are we ensuring that everyone has affordable access to mental healthcare? Okay, we have a list of factors that adversely impact mental health, what can we do to address each one? Etc.

Critical thinking isn't hard, it just takes time and effort.

[–] blackjam_alex@lemmy.world 59 points 3 months ago (4 children)

My experience with ChatGPT goes like this:

  • Me: Write me a block of code that does x thing
  • ChatGPT: Certainly, here's your code
  • Me: This is wrong.
  • ChatGPT: You're right, this is the correct version
  • Me: This is wrong again.
  • ChatGPT: You're right, this is the correct version
  • Me: Wrong again, you piece of junk.
  • ChatGPT: I'm sorry, this is the correct version.
  • (even more useless code) ... and so on.
[–] saltesc@lemmy.world 36 points 3 months ago* (last edited 3 months ago)

All the while it gets further and further from the requirements. So you open five more conversations, give them the same prompt, and try to pick which one is least wrong.

All the while realising you did this to save time but at this point coding from scratch would have been faster.

[–] sugar_in_your_tea@sh.itjust.works 31 points 3 months ago* (last edited 3 months ago) (2 children)

I interviewed someone who used AI (CoPilot, I think), and while it somewhat worked, it gave the wrong implementation of a basic algorithm. We pointed out the mistake, the developer fixed it (we had to provide the basic algorithm, which was fine), and then they refactored, the AI spat out the same mistake, and the developer again didn't notice.
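The comment doesn't name the algorithm, but a hypothetical illustration of this failure mode is the off-by-one bug in binary search that assistants will happily reproduce twice in a row:

```python
# Hypothetical illustration (not the interview code): an off-by-one in binary search.

def binary_search_buggy(xs, target):
    lo, hi = 0, len(xs) - 1
    while lo < hi:              # bug: should be lo <= hi; skips one-element ranges
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid
        elif xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def binary_search_fixed(xs, target):
    lo, hi = 0, len(xs) - 1
    while lo <= hi:             # correct loop condition
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid
        elif xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search_buggy([1, 2, 3], 3))  # -1: misses the last element
print(binary_search_fixed([1, 2, 3], 3))  # 2
```

The buggy version passes most casual spot-checks and only fails on narrowed ranges, which is exactly the kind of mistake a reviewer misses when they "didn't write it".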

AI is fine if you know what you're doing and can correct the mistakes it makes (i.e. use it as fancy code completion), but you really do need to know what you're doing. I recommend new developers avoid AI like the plague until they can use it to cut out the mundane stuff instead of filling in their knowledge gaps. It'll do a decent job at certain prompts (i.e. generate me a function/class that...), but you're going to need to go through line-by-line and make sure it's actually doing the right thing. I find writing code to be much faster than reading and correcting code so I don't bother w/ AI, but YMMV.

An area where it's probably ideal is finding stuff in documentation. Some projects are huge and their search sucks, so being able to say, "find the docs for a function in library X that does..." I know what I want, I just may not remember the name or the module, and I certainly don't remember the argument order.
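For Python at least, the "I forget the argument order" half of that problem doesn't even need a chatbot; the stdlib's `inspect` module recovers a callable's signature directly. A minimal sketch:

```python
# Recover parameter names, order, and defaults without leaving the REPL.
import inspect

sig = inspect.signature(sorted)
print(sig)                    # e.g. (iterable, /, *, key=None, reverse=False)
print(list(sig.parameters))   # parameter names in declared order
```

This covers the argument-order case; finding the right function by description across a huge library is the part where a search tool (AI or otherwise) earns its keep.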

[–] 9488fcea02a9@sh.itjust.works 19 points 3 months ago (1 children)

AI is fine if you know what you're doing and can correct the mistakes it makes (i.e. use it as fancy code completion)

I'm not a developer and I haven't touched code in over 10 years, but when I heard about my company pushing AI tools on the devs, I thought exactly what you said. It should be a tool for experienced devs who already know what they're doing....

Lo and behold they did the opposite... They fired all the senior people and pushed AI on the interns and new grads.... and then expected AI to suddenly make the jr devs work like the expensive Sr devs they just fired...

Wtf

[–] ulkesh@lemmy.world 141 points 3 months ago (1 children)

Oh geez…who could have seen this coming?

Oh wait, every single senior developer who is currently railing against their moron AI-bandwagoning CEOs.

[–] Aceticon@lemmy.world 18 points 3 months ago

Middle and upper management are like little children - they'll only learn that fire hurts by putting their hand in it.

[–] SaharaMaleikuhm@feddit.org 117 points 3 months ago (1 children)

But are the shareholders pleased?

[–] Treczoks@lemmy.world 101 points 3 months ago

Good. This is digital Darwinism at its finest. Weeds out the companies that thought they could save money by relying on a digital monkey instead of actual professionals.

[–] yrmp@lemmy.world 82 points 3 months ago (5 children)

Lmao my job announced layoffs a few months back. They continue to parade their corporate restructuring plan in front of us like we give a fuck if shareholders make money. My output has dropped significantly as I search for another role. Whatever code I do write now is always just copy pasted from AI (which is getting harder to use...fuck you Copilot). I give zero fucks about this place anymore. Maybe if people had some small semblance of investment in their company's success (i.e.: not milked by shareholders and beaten to dust by shitty profit driven metrics that take away from the core business), the employees might give enough fucks to not copy paste shitty third party code.

Additionally, this is a training issue. Don't offload the training of your people onto the universities (which then trap the students into an insurmountable debt load leading them to take jobs they otherwise wouldn't want to take just to eat and have a roof over their heads). The modern corporate landscape has created a perfect shitstorm of disincentives for genuine effort and diligence. Then you expect us to give a shit about your company even though the days of 40 years and a pension are now gone. We're stuck with 401k plans and social security and the luck of the draw as to whether we can retire or not. Work your whole life for what? Fuck you. I'm gonna generate that AI code and enjoy my 30s and 40s.

A workforce trapped by debt, forced to prioritize job security and paycheck size over passion or purpose. People end up in roles they don't care about, working for companies they have no investment in, simply to keep up with loan payments and the ever increasing cost of living.

"Why is my organization falling apart!?" Fucking look up from the stupid fucking metrics that don't actually tell you anything you dumb fucks. Make an actual human decision and fix the wealth inequality. It's literally always wealth inequality.

[–] AdolfSchmitler@lemmy.world 40 points 3 months ago (2 children)

"People work in roles they don't care about, for companies they have no investment in, to pay loans they shouldn't have."

That sounds like a fight club quote lol. I know you didn't say "loans they shouldn't have" but the cost of college is just stupidly high. It doesn't have to be free but come on.

[–] ozoned@lemmy.world 20 points 3 months ago (2 children)

15 years ago I got a job where I wasn't allowed to do anything. I hated it. I wanted to learn and be valuable and be valued. I left that job.

I worked for a bank and then Red Hat and I loved what I did and burned myself out trying to make them happy. Only to find out they still didn't value me.

I switched jobs two years ago and increased my pay 30% overnight and back to a job doing nothing. And I'm totally fine with it now. I have a family and I focus on them and during work, if they don't have anything for me to do I make my own happiness.

Fuck corporations. I'll take your money, I'll never again kill myself as I'll never be valued anyway. Jobs aren't worth it. People are.

[–] WalnutLum@lemmy.ml 82 points 3 months ago (1 children)

“When asked about buggy AI, a common refrain is ‘it is not my code,’ meaning they feel less accountable because they didn’t write it.”

That's... That's so fucking cool...

[–] simplejack@lemmy.world 81 points 3 months ago (5 children)

Me and my team take our site down the old fashioned way. Code copied from some rando on the internet.

[–] echodot@feddit.uk 30 points 3 months ago (3 children)

Reminds me of the time I took down the corporate website by translating the entire site into German. I'd been asked to do this, but I hadn't realized the auto-translation plug-in actually rewrote code into German; I thought it was just going to alter the HTML with JavaScript at runtime, but nope. It actually edited the files.

It also translated the password into German which was fun because it was just random characters so I have no idea what it translated into.

[–] Aceticon@lemmy.world 20 points 3 months ago

It's pretty much what AIs do - copy and paste random code from Stack Overflow - but they do it automatically.

[–] Snapz@lemmy.world 69 points 3 months ago (1 children)

And none of the forced tech support "AI" replacements work. And the companies don't give a shit.

[–] echodot@feddit.uk 49 points 3 months ago (1 children)

I've had this argument with them a few times at work. They are definitely going to replace this all with AI, probably within the next year, and no amount of us pointing out that it won't work, and that they'll end up having to bring us back at 3x the rate, seems to have any effect on them.

I'm probably going to have to listen to a lot of arguments about this strawberry thing tomorrow.

Anyway whatever, severance is severance.

[–] stringere@sh.itjust.works 19 points 3 months ago

I was once in a similar position: company merger and they decided to move support offshore. We got 6 months lead notice and generous severance paid out as long as we stayed to the end. Fast forward a year and they took 85% customer approval to 13%. We got hired back at 1.5x our old pay rate, so not quite the 3x you mentioned. Hoping this works out similar for you in the end.

[–] reka@lemmy.world 66 points 3 months ago (8 children)

As stated in the article, this has less to do with using AI, more to do with sloppy code reviews and code quality enforcement. Bad code from AI is just the latest version of mindlessly pasting from Stack Overflow.

I encourage jrs to use tools such as Phind for solving problems but I also expect them to understand what they’re submitting and be ready to defend it no differently to any other PR. If they’re submitting code they don’t understand that’s incredibly unprofessional and I would come down very hard on them. They don’t do this though because we don’t hire dickheads.

[–] Wrench@lemmy.world 21 points 3 months ago* (last edited 3 months ago)

Shift-left eliminated the QA role.

Now we have AI generated shit code, with devs that don't understand the low level details of both the language, and the specifics of the generated code.

So we basically have content entry (AI inputs) and extremely shitty QA bundled into the "developer" role.

As a 20 year veteran of the industry, people keep asking me if I think AI will make developers obsolete. I keep telling them "maybe some day, but today's LLMs are not it. The AI bubble is going to burst, and a few legit use cases will make it through"

[–] ShittyBeatlesFCPres@lemmy.world 52 points 3 months ago (12 children)

If I was still in a senior dev position, I’d ban AI code assistants for anyone with less than around 10 years experience. It’s a time saver if you can read code almost as fluently as you can read your own native language but even besides the A.I. code introducing bugs, it’s often not the most efficient way. It’s only useful if you can tell that at a glance and reject its suggestions as much as you accept them.

Which, honestly, is how I was when I was first starting out as a developer. I thought I was hot shit and contributing and I was taking half a day to do tasks an experienced developer could do in minutes. Generative AI is a new developer: irrationally confident, not actually saving time, and rarely doing things the best way.

[–] GetOffMyLan@programming.dev 24 points 3 months ago* (last edited 3 months ago) (3 children)

I've found they're great as a learning tool where decent docs are available. Or as interactive docs you can ask follow-up questions of.

We mostly use C# and it's amazing at digging into the MS docs to pull out useful things from the BCL or common patterns.

Our new juniors got up to speed so fast by asking it to explain stuff in the existing codebases. Which in turn takes pressure off more senior staff.

I got productive in vuejs in a large codebase in a couple days that way.

Using it to generate actual code is insanely shit haha. It's very similar to just copy-pasting code and hacking it in without understanding it.

[–] ShittyBeatlesFCPres@lemmy.world 17 points 3 months ago

You make a good point about using it for documentation and learning. That’s a pretty good use case. I just wouldn’t want young developers to use it for code completion any more than I’d want college sophomores to use it for writing essays. Professors don’t have you write essays because they like reading essays. Sometimes, doing a task manually is the point of the assignment.

[–] Windex007@lemmy.world 19 points 3 months ago

Even worse than it being wrong is that, by the nature of the tool, it looks right.

[–] Tylerdurdon@lemmy.world 49 points 3 months ago (2 children)

See? AI creates jobs! Granted, it's specialized mop-up situations, but jobs!

It'll be even more interesting in the future! Every now and then a T1000 will lose all hydraulic fluids right out its prosthetic anus and they'll need someone there with a mop and bucket! Our economy lives on...

[–] SuperFola@programming.dev 32 points 3 months ago (2 children)

How come the hallucinating ghost in the machine is generating code so bad the production servers hallucinate even harder and crash?

[–] henfredemars@infosec.pub 21 points 3 months ago (11 children)

I’m not sure how AI is supposed to understand code. Most of the code out there is garbage. Even most of the working code out there in the world today is garbage.

[–] Telorand@reddthat.com 16 points 3 months ago (3 children)

You have to be hallucinating to understand.

[–] _sideffect@lemmy.world 28 points 3 months ago

"AI" is just good for simple code snippets (which it stole from GitHub repos).

This whole AI bs needs to die already, and the people who lie about it should be held accountable.

[–] prex@aussie.zone 25 points 3 months ago* (last edited 3 months ago)

Sounds like the Sirius Cybernetics Corporation:

The fundamental design flaws are obscured by the superficial design flaws.

[–] henfredemars@infosec.pub 24 points 3 months ago* (last edited 3 months ago)

AI can be a useful tool, but it’s not a substitute for actual expertise. More reviews might patch over the problem, but at the end of the day, you need a competent software developer who understands the business case, risk profile, and concrete needs to take responsibility for the code if that code is actually important.

AI is not particularly good at coding, and it’s not particularly good at the human side of engineering either. AI is cheap. It’s the outsourcing problem all over again and with extra steps of having an algorithm hide the indirection between the expertise you need and the product you’re selling.

[–] fluxion@lemmy.world 20 points 3 months ago* (last edited 3 months ago)

Debugging and maintenance were always the hardest aspects of large code bases... writing the code is the easy part. Offloading that part to AI only makes the hard stuff harder.

[–] dinckelman@lemmy.world 20 points 3 months ago

I have a lot of empathy for a lot of people. Even ones, who really don't deserve it. But when it comes to people like these, I have absolutely none. If you make a chatbot do your corporate security, it deserves to burn to the ground

[–] werefreeatlast@lemmy.world 17 points 3 months ago (7 children)

Also it is pure junk. ChatGPT code may come out fast on the screen, but it's garbage. I tried Python and C++, both just pure garbage. Sure, I got it to do what I wanted, but only after a day of hair-pulling repetitive madness. Simple task: open an image and invert it. Then, well, it opened the image but didn't invert it. Or maybe it's upside down. Can you open the image right side up and invert it? ...fuck, fuck, why is the window full screen? Did I ask for full screen? Shit, heavens no! Anyway, it's a fuckin idiot just rambling code at me.
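For what it's worth, the task described really is tiny. Here's a stdlib-only sketch of the core "invert" operation on an 8-bit pixel buffer; in real code you would more likely just call Pillow's `ImageOps.invert` (assuming Pillow is installed), which is not shown here.

```python
# Core of image inversion for 8-bit pixels: each value v becomes 255 - v.

def invert_pixels(data: bytes) -> bytes:
    """Invert 8-bit pixel values: 0 becomes 255, 255 becomes 0."""
    return bytes(255 - b for b in data)

row = bytes([0, 64, 128, 255])
print(list(invert_pixels(row)))  # [255, 191, 127, 0]
```

Note that inversion is its own inverse: applying `invert_pixels` twice returns the original buffer, which makes it an easy operation to sanity-check.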
