this post was submitted on 05 Jun 2024
410 points (96.6% liked)

Technology

[–] flop_leash_973@lemmy.world 32 points 5 months ago (2 children)

The real game now is how long it will last before the hype dies down, the floor falls out of "AI", and a good chunk of their stock gains goes with it.

[–] bamboo@lemm.ee 22 points 5 months ago (1 children)

I don’t think generative AI is going anywhere anytime soon. The hype will eventually die down, but it’s already proved its usefulness in many tasks.

[–] neshura@bookwormstory.social 16 points 5 months ago (2 children)

Is AI useful? Maybe. But is it profitable? AI will go the same way the dot-com bubble did: there will be a massive crash, and at the end of it you'll see who actually had their pants on.

[–] Nighed@sffa.community 4 points 5 months ago (2 children)

Nvidia IS making a profit on it though. It's the whole "in a gold rush, sell shovels" thing.

[–] neshura@bookwormstory.social 5 points 5 months ago

My point is more that their revenue stream will temporarily take a giant hit during that crash: while everyone is busy going bankrupt, the few AI companies that do make a profit will have better things to do than buy new accelerators right that instant.

[–] Telodzrum@lemmy.world 1 points 5 months ago

Nvidia is selling shovels and picks during the AI gold rush.

[–] bamboo@lemm.ee -1 points 5 months ago (2 children)

It can be quite profitable. A ChatGPT subscription is $20/month right now, or $240/year. A software engineer in the US costs between $200k and $1m per year with all benefits and support costs considered. If that $200k engineer can use ChatGPT to save 2.5 hours in a year, it pays for itself.
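A quick sketch of that break-even arithmetic (the 2080-hour work year and the flat hourly rate are my assumptions, not figures from the comment):

```python
# Break-even sketch for the numbers above.
ANNUAL_SUBSCRIPTION = 20 * 12       # $20/month ChatGPT subscription -> $240/year
ENGINEER_FULLY_LOADED = 200_000     # low end of the fully loaded cost cited above
WORK_HOURS_PER_YEAR = 2080          # assumed 40 h/week * 52 weeks

hourly_rate = ENGINEER_FULLY_LOADED / WORK_HOURS_PER_YEAR
breakeven_hours = ANNUAL_SUBSCRIPTION / hourly_rate

print(f"hourly rate: ${hourly_rate:.2f}")        # ~$96.15/hour
print(f"break-even: {breakeven_hours:.2f} hours saved per year")  # ~2.50
```

That's where the "2.5 hours in a year" figure comes from: $240 of subscription divided by roughly $96/hour of engineer time.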

[–] neshura@bookwormstory.social 5 points 5 months ago (1 children)

It's quite funny that you think ChatGPT is making a profit on that $20 subscription if you replace a software dev with it.

The bust won't be because it's not profitable to use AI but because the companies selling the service cannot do so at rates which are both profitable and actually marketable. Case in point: OpenAI has not made a single cent of profit so far (or at least not reported a profit). The way AI is currently shoved in everywhere is not sustainable because the cost of running an AI model cannot be recuperated by most of these new platforms.

[–] bamboo@lemm.ee -1 points 5 months ago (1 children)

OpenAI is a non-profit. Further, US tech companies usually take many years to become profitable. It's called reinvesting revenue; more companies should be doing that instead of stock buybacks.

Let's suppose hosted LLMs like ChatGPT aren't financially sustainable and go bust, though. As a user, you can also just run them locally, and as smaller models improve, this is becoming more and more popular. It's likely how Apple will be integrating LLMs into their devices, at least in part, and Microsoft is going that route with "Copilot+ PCs" that start shipping next week.

Integration aside, you can run 70B models on an overpriced $5k MacBook Pro today that are maybe half as useful as ChatGPT. The up-front cost exceeds a ChatGPT subscription, but to reuse my numbers from before, a $5k MacBook Pro running Llama 3 70B would only have to save an engineer one hour per week to pay for itself in the first year. In subsequent years only the electricity costs would matter, which for a current-gen MacBook Pro would be about equivalent to the ChatGPT subscription in expensive energy markets like Europe, or half that or less in the US.

In short, you can buy overpriced Apple hardware to run your LLMs, do so with high energy prices, and it’s still super cheap compared to a single engineer such that saving 1 hour per week would still pay for itself in the first year.
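The one-hour-per-week payback claim checks out under the same assumed numbers (a $200k fully loaded engineer over an assumed 2080-hour work year):

```python
# Payback sketch for the local-LLM scenario above.
# The 2080-hour work year is an assumption, not a figure from the thread.
MACBOOK_COST = 5_000            # $5k MacBook Pro
hourly_rate = 200_000 / 2080    # ~$96/hour for the $200k engineer
hours_saved_per_year = 1 * 52   # one hour saved per week

value_per_year = hours_saved_per_year * hourly_rate
payback_years = MACBOOK_COST / value_per_year

print(f"value recovered per year: ${value_per_year:,.0f}")  # ~$5,000
print(f"payback period: {payback_years:.2f} years")         # ~1.00
```

52 hours at roughly $96/hour recovers almost exactly the $5k hardware cost, so the machine pays for itself in about a year.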

[–] neshura@bookwormstory.social 3 points 5 months ago

Yeah, I don't know why you keep going on about people using AI when my point was entirely that most of the companies offering AI services don't have a sustainable business model. If anything, being able to do that work locally strengthens my point.

[–] frezik@midwest.social 4 points 5 months ago* (last edited 5 months ago) (1 children)

I've seen pull requests filled with ChatGPT code. I consider my dev job pretty safe.

[–] bamboo@lemm.ee 0 points 5 months ago

ChatGPT isn't gonna replace software engineers anytime soon. It can increase productivity, though, and that's the value LLMs provide. If someone made a shitty pull request filled with obvious ChatGPT output, that's on them and not the technology. Blaming ChatGPT for a programmer's bad code is like blaming the autocomplete in their editor: just because the editor suggests something doesn't mean you have to accept it, or should, if it's wrong.

[–] Damage@feddit.it 3 points 5 months ago

Well, they also make good silicon that's apparently useful for other things, so that may not change... If it's good for the next fad as well, they'll just stay on top.