this post was submitted on 13 Nov 2024
651 points (95.1% liked)

Technology

[–] OsrsNeedsF2P@lemmy.ml 5 points 24 minutes ago

I work with people who work in this field. Everyone knows this, but there's also an increased effort in improvements all across the stack, not just the final LLM. I personally suspect the current generation of LLMs is at its peak, but with each breakthrough the technology will climb again.

Put differently, I still suspect LLMs will be at least twice as good in 10 years.

[–] jpablo68@infosec.pub 9 points 2 hours ago (1 children)

I just want a portable self hosted LLM for specific tasks like programming or language learning.

[–] plixel@programming.dev 7 points 2 hours ago

You can install Ollama in a docker container and use that to install models to run locally. Some are really small and still pretty effective, like Llama 3.2 is only 3B and some are as little as 1B. It can be accessed through the terminal or you can use something like OpenWeb UI to have a more "ChatGPT" like interface.
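For anyone who wants to try that setup, here's a minimal sketch based on Ollama's documented Docker usage (image name, port 11434, and model tags like `llama3.2` are the publicly documented defaults; adjust to taste):

```shell
# Start the Ollama server in a container, persisting models in a named volume
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Pull and chat with the small Llama 3.2 3B model from the terminal
docker exec -it ollama ollama run llama3.2

# Or use the even smaller 1B variant for low-end hardware
docker exec -it ollama ollama run llama3.2:1b
```

Once the server is up, a frontend like Open WebUI can point at `http://localhost:11434` for the ChatGPT-style interface mentioned above.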

[–] dejected_warp_core@lemmy.world 11 points 6 hours ago (1 children)

Welcome to the top of the sigmoid curve.

If you were wondering what 1999 felt like WRT the internet, well, here we are. The Matrix was still fresh in everyone's mind and a lot of online tech innovation kinda plateaued, followed by some "market adjustments."

[–] Hackworth@lemmy.world 6 points 2 hours ago* (last edited 2 hours ago)

I think it's more likely a compound sigmoid (don't Google that). LLMs are composed of distinct technologies working together. As we've reached the inflection point of the scaling for one, we've pivoted implementations to get back on track. Notably, context windows are no longer an issue. But the most recent pivot came just this week, allowing for a huge jump in performance. There are more promising stepping stones coming into view. Is the exponential curve just a series of sigmoids stacked too close together? In any case, the article's correct - just adding more compute to the same exact implementation hasn't enabled scaling exponentially.

[–] TankovayaDiviziya@lemmy.world 7 points 8 hours ago (1 children)

Short the AI stocks before they crash!

[–] sugar_in_your_tea@sh.itjust.works 11 points 1 hour ago* (last edited 1 hour ago)

The market can remain irrational longer than you can remain solvent.

A. Gary Shilling

[–] masquenox@lemmy.world 39 points 19 hours ago (1 children)
[–] UnderpantsWeevil@lemmy.world 15 points 18 hours ago (1 children)

I've been hearing about the imminent crash for the last two years. New money keeps getting injected into the system. The bubble can't deflate while both the public and private sector have an unlimited lung capacity to keep puffing into it. FFS, bitcoin is on a tear right now, just because Trump won the election.

This bullshit isn't going away. It's only going to get forced down our throats harder and harder, until we swallow or choke on it.

[–] thatKamGuy@sh.itjust.works 4 points 8 hours ago (1 children)

With the right level of Government support, bubbles can seemingly go on for literal decades. Case in point, Australian housing since the late 90s has been on an uninterrupted tear (yes, even in ‘08 and ‘20).

But eventually, bubbles either deflate or pop, because eventually governments and investors will get tired of propping it up. It might take decades, but I think it's inevitable.

[–] recapitated@lemmy.world 21 points 20 hours ago

I think I've heard about enough of experts predicting the future lately.

[–] Blackmist@feddit.uk 42 points 23 hours ago (10 children)

Thank fuck. Can we have cheaper graphics cards again please?

I'm sure a RTX 4090 is very impressive, but it's not £1800 impressive.

[–] bountygiver@lemmy.ml 3 points 1 hour ago

Nope. If normal gamers are already willing to pay that price, there's no reason for Nvidia to reduce it.

There are more 4090s on Steam than any AMD dedicated GPU; there's no competition.

[–] lorty@lemmy.ml 7 points 9 hours ago (1 children)

Just wait for the 5090 prices...

[–] Blackmist@feddit.uk 2 points 9 hours ago (3 children)

I just don't get why they're so desperate to cripple the low end cards.

Like I'm sure the low RAM and speed is fine at 1080p, but my brother in Christ it is 2024. 4K displays have been standard for a decade. I'm not sure when PC gamers went from "behold thine might from thou potato boxes" to "I guess I'll play at 1080p with upscaling if I can have a nice reflection".

[–] _cryptagion@lemmy.dbzer0.com 1 points 5 minutes ago

Before you claim 4k is the standard, you might wanna take a peek at the Steam hardware survey.

I don’t know anyone I game with who uses a 4k monitor. 1440p at your monitor’s max refresh rate is the favorite.

[–] Tywele@lemmy.dbzer0.com 3 points 5 hours ago* (last edited 5 hours ago) (1 children)

4k displays are not at all standard and certainly not for a decade. 1440p is. And it hasn't been that long since the market share of 1440p overtook that of 1080p according to the Steam Hardware survey IIRC.

[–] Blackmist@feddit.uk 0 points 3 hours ago (2 children)

Maybe not monitors, but certainly they are standard for TVs (which are now just monitors with Android TV and a tuner built in).

[–] _cryptagion@lemmy.dbzer0.com 1 points 4 minutes ago

Well, people aren’t sticking 4090s in their Samsung smart TVs, so idk that matters.

[–] Tywele@lemmy.dbzer0.com 5 points 2 hours ago (2 children)

That doesn't really matter if people on PC don't game on it, does it?

These are the primary display resolutions from the Steam Hardware Survey.

[–] Blackmist@feddit.uk 1 points 54 minutes ago

I do wonder how much higher that would be if GPUs targeting 4K were £299 rather than £999.

Although some of it is down to monitors being on desks right in front of you, where 4K isn't really needed. It would also be interesting for Valve to weight the results by hours spent gaming that month (and the amount actually spent on games), rather than just counting hardware numbers.

[–] NikkiDimes@lemmy.world 0 points 1 hour ago (1 children)

You're so close to the answer. Now, why are PC gamers the ones still on 1080 and 1440 when everyone else has moved on?

[–] Tywele@lemmy.dbzer0.com 1 points 40 minutes ago

Have I said anything in favor of crippling lower end cards or that these high prices of the high end cards are good? My only argument was that 4K displays in the PC space being the standard was simply delusional because the stats say something wholly different.

[–] lorty@lemmy.ml 3 points 8 hours ago (1 children)

I think it's just an upselling strategy, although I agree I don't think it makes much sense. Budget gamers really should look to AMD these days, but unfortunately Nvidia's brand power is ridiculous.

[–] Blackmist@feddit.uk 3 points 6 hours ago

And the issue for PC gamers is that Nvidia has spent the last few years convincing devs to shovel DLSS into everything, rather than a generic upscaling solution that other vendors could just drop their own algorithms into, meaning there's a ton of games that won't upscale nicely on anything else.

[–] explodicle@sh.itjust.works 7 points 17 hours ago

Sorry, crypto is back in season.
