this post was submitted on 25 Jan 2025
91 points (92.5% liked)

PC Master Race

"Jensen sir, 50 series is too hot"

"Easy fix with my massive unparalleled intellect. Just turn off the sensor"

If you needed any more proof that Nvidia is continuing to enshittify its monopoly and milk consumers, here it is. Hey, let's remove one of the critical things that lets you diagnose a bad card and catch situations that might lead to GPU death! Don't need that; just buy new ones every 2 years, you poors!

If you buy a Nvidia GPU, you are part of the problem here.

[–] ravshaul@discuss.tchncs.de 3 points 1 day ago (1 children)

Isn't a GPU that pulls 600 watts in whackjob territory?

The engineers need to get the 6090 down to 400 watts. That would be a huge PR win, one that wouldn't need any marketing spin to sell.

[–] alphabethunter@lemmy.world 3 points 1 day ago

It's not a node shrink, just a more AI-focused architecture on the same node as the 4090. To get more performance they need more powah. I've seen reviews reporting a ~25% increase in raw performance at the cost of ~20% more powah than the 4090.
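Taking those review figures at face value (they're rough numbers from the comment above, not official specs), the perf-per-watt math works out to only a small efficiency gain, which is consistent with no node shrink:

```python
# Back-of-envelope perf-per-watt check using the approximate figures
# quoted above (~25% more raw performance for ~20% more power vs. the
# 4090). These are illustrative community numbers, not Nvidia specs.
perf_gain = 1.25    # ~25% faster than the 4090
power_gain = 1.20   # ~20% more power draw than the 4090

# Relative change in performance per watt
efficiency_gain = perf_gain / power_gain - 1
print(f"perf-per-watt change: {efficiency_gain:+.1%}")  # roughly +4%
```

In other words, almost all of the extra performance is bought with extra power, which is what you'd expect from a same-node generation.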