this post was submitted on 25 Jan 2025
91 points (92.5% liked)

PC Master Race


"Jensen sir, 50 series is too hot"

"Easy fix with my massive unparalleled intellect. Just turn off the sensor"

If you needed any more proof that Nvidia is continuing to enshittify their monopoly and milk consumers, here it is. Hey, let's remove one of the critical things that lets you diagnose a bad card and catch bad situations that might result in GPU death! Don't need that shit, just buy new ones every 2 years, you poors!

If you buy an Nvidia GPU, you are part of the problem here.

[–] ravshaul@discuss.tchncs.de 5 points 1 day ago (1 children)

Can AIBs add extra sensors for the OS to read, or will the Nvidia driver not provide that level of information?

[–] empireOfLove2@lemmy.dbzer0.com 8 points 1 day ago* (last edited 1 day ago) (1 children)

Unlikely, as the hotspot sensors/detection logic is baked into the chip silicon and its microcode. AIBs can only change the PCB around the die. I'd almost guarantee the thermal sensors are still present to avoid fires, but if Nvidia has turned off external reporting outside the chip itself (beyond telling the driver that the thermal limit has been reached), I doubt AIBs are going to be able to crack it either.

Also, the way Nvidia operates, if an AIB deviates from Nvidia's mandatory process, they'll get blackballed and put out of business. So they won't. Daddy Jensen knows best!

[–] ravshaul@discuss.tchncs.de 2 points 1 day ago (1 children)

We'll find out how the 5080 is on Thursday, but I expect that the 5070 Ti should have cool temperatures.

[–] empireOfLove2@lemmy.dbzer0.com 2 points 1 day ago (1 children)

Oh I'm sure the lower cards will run cool and fine for average die temps. The 5090 is very much a halo product with that ridiculous 600w TBP. But as with any physical product, things do decay over time, or are assembled incorrectly, and that's what hotspot temp reporting helps with diagnosing.
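The diagnostic value of hotspot reporting is easy to sketch: a healthy card shows a modest gap between the average die (edge) temperature and the hotspot, while dried-out paste or a badly mounted cooler shows up as a widening delta long before the card throttles or dies. A minimal, hypothetical check (the `thermal_health` function, its threshold, and the sample readings are all made up for illustration; real readings come from monitoring tools talking to the driver):

```python
# Hypothetical sketch: flag a degrading thermal interface by watching the
# edge-to-hotspot delta. The 20 C threshold is illustrative, not an Nvidia spec.

def thermal_health(edge_c: float, hotspot_c: float, max_delta_c: float = 20.0) -> str:
    """Classify one reading. A large hotspot-to-edge gap usually means poor
    die contact (pump-out, dried paste, uneven cooler mount)."""
    delta = hotspot_c - edge_c
    if delta < 0:
        return "sensor error"        # hotspot should never read below edge
    if delta <= max_delta_c:
        return "ok"
    return "check cooler/paste"      # big gap even though edge temp looks fine

# Two cards with the same "cool and fine" 65 C edge temp -- only the
# hotspot number reveals that the second one has a problem.
print(thermal_health(65.0, 78.0))
print(thermal_health(65.0, 95.0))
```

Without the hotspot number, both cards in the example report the same 65 °C and look identical to the user, which is exactly the signal the 50 series no longer exposes.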

[–] ravshaul@discuss.tchncs.de 3 points 1 day ago (1 children)

Isn't a GPU that pulls 600 watts in whackjob territory?

The engineers need to get the 6090 to use 400 watts. That would be a very big PR win that does not need any marketing spin to sell.

[–] alphabethunter@lemmy.world 3 points 1 day ago

It's not a node shrink, just a more AI-focused architecture on the same node as the 4090. To get more performance they need more powah. I've seen reviews stating a ~25% increase in raw performance at the cost of ~20% more powah than the 4090.