[–] narc0tic_bird@lemm.ee 26 points 4 weeks ago (3 children)

Crazy how quickly NVIDIA went up. I wonder if they'll crash back down just as fast should the AI hype die off or shift to other manufacturers (Intel, AMD, etc.) or in-house solutions (e.g. Apple Intelligence).

[–] Zorsith@lemmy.blahaj.zone 21 points 4 weeks ago (3 children)

I just want a graphics card that costs less than the rest of the rig combined... shit's ridiculous, and AMD doesn't even seem to be trying to compete anymore.

[–] Dudewitbow@lemmy.zip 15 points 4 weeks ago* (last edited 4 weeks ago) (1 children)

They do compete; it's just that users weigh DLSS and ray tracing far more than they should, and undervalue VRAM for the long term.

For example, a 7900 GRE costs about the same as a 4070, but more people will buy the 4070 regardless.

[–] Zorsith@lemmy.blahaj.zone 7 points 4 weeks ago* (last edited 4 weeks ago)

I definitely do like raytracing, sadly. I'm more interested in graphics and immersion in a setting/story in a game than competitiveness or ultra-high FPS. Water reflections and mirrors just look absolutely gorgeous to me.

I'm definitely strongly considering AMD regardless for my next build, as I'd like to switch to Linux fully at some point.

[–] sugar_in_your_tea@sh.itjust.works 4 points 4 weeks ago* (last edited 4 weeks ago)

Eh, I got an AMD GPU somewhat recently and it meets all my expectations. I'm not too interested in RTX or compute, and they offer really good value on raster performance.

[–] narc0tic_bird@lemm.ee 3 points 4 weeks ago

I'm coping for RDNA4.

[–] FMT99@lemmy.world 2 points 4 weeks ago

They said the same when the crypto hype came along. If AI dies off there will be other trends in computing that require cutting edge silicon. AI may or may not continue surging but hardware will be needed no matter what. NVIDIA is selling shovels, not panning for gold.

[–] daddy32@lemmy.world 1 points 4 weeks ago* (last edited 4 weeks ago) (1 children)

Apple is not there yet; its models were trained on Google hardware. Though I am surprised it wasn't NVIDIA hardware.

[–] narc0tic_bird@lemm.ee 1 points 4 weeks ago (1 children)

What's "Google hardware"? Likely just NVIDIA hardware running in Google's cloud?

[–] daddy32@lemmy.world 7 points 4 weeks ago (1 children)

No no, Google does actually have its own custom proprietary AI hardware - https://en.wikipedia.org/wiki/Tensor_processing_unit
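
For illustration, a minimal sketch (assuming a Cloud TPU VM with a TPU-enabled JAX install, not something from the thread) of how you'd confirm you're actually on Google's TPU silicon rather than NVIDIA GPUs:

```python
# Minimal sketch: list the accelerators JAX sees on this machine.
# Assumes a Cloud TPU VM with the TPU-enabled jax package installed.
import jax

for device in jax.devices():
    # On a TPU VM this reports platform "tpu" with a device kind like "TPU v4";
    # on an NVIDIA box it would report CUDA/"gpu" devices instead.
    print(device.platform, device.device_kind)
```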

[–] narc0tic_bird@lemm.ee 3 points 4 weeks ago