this post was submitted on 25 Jan 2025
91 points (92.5% liked)

PC Master Race


"Jensen sir, 50 series is too hot"

"Easy fix with my massive unparalleled intellect. Just turn off the sensor"

If you needed any more proof that Nvidia is continuing to enshittify its monopoly and milk consumers, here it is. Hey, let's remove one of the critical readings that lets you diagnose a bad card and catch situations that might put a GPU at death's door! Don't need that shit, just buy new ones every 2 years, you poors!

If you buy a Nvidia GPU, you are part of the problem here.

top 24 comments
[–] Vinstaal0@lemmy.world 5 points 20 hours ago (1 children)

Yeah, NVIDIA is a bullshit company and has been for a while. AMD and Intel need to get their raytracing game up so they become real competitors to NVIDIA, especially now that more games require raytracing.

[–] potustheplant@feddit.nl -1 points 16 hours ago (2 children)

No games "require" raytracing, and the ones that support it may look better when you turn it on, but it's not worth the performance cost.

[–] MeaanBeaan@lemmy.world 9 points 15 hours ago* (last edited 15 hours ago) (1 children)

This is incorrect. The new Indiana Jones game requires raytracing, as does the upcoming Doom game. As much as you may or may not like it, traditional rasterized graphics are starting to be phased out, at least across the AAA gaming space. The theoretical workload benefits for developers make it pretty much an inevitability at this point, once workflows and optimizations are figured out. Though I doubt rasterized graphics will completely go away, much like how pixel art games are very much still a thing decades after becoming obsolete.

[–] potustheplant@feddit.nl 3 points 13 hours ago* (last edited 3 hours ago) (1 children)

Sorry for not knowing that a game that released 1 month ago makes ray tracing mandatory.

Considering that most gamers don't even have ray-tracing-capable hardware (at least not with decent performance), this seems like a pretty shitty and lazy decision.

My point still stands, though. In most games it's not that much better, and in the games where it is better, it carries a disproportionate performance cost. Personally, I prefer playing at 100+ fps.

[–] MeaanBeaan@lemmy.world 3 points 10 hours ago (1 children)

To be clear, the only required RT that Indiana Jones uses is raytraced global illumination, which needs hardware ray tracing support to work. But as long as you have a 20 series card or later, you should be able to get playable performance if you manage your settings correctly. It only becomes super heavy when you enable RT reflections, RT sun shadows, or full path tracing. The latter is VERY expensive and is what I'd assume most people picture when they think of ray tracing.

It does look really, really good, though. Personally, I'd rather play that game at 60 fps (or lower, let's be real) with full path tracing than with just the RTGI at a much higher framerate. I'd at least recommend turning on the RT sun shadows if you can, because shadows without them are very shimmery and aliased, especially foliage.

In games like Indiana Jones that have been designed from the ground up with raytracing in mind, it makes a gigantic difference in how grounded the world feels. The level of detail they baked into every asset is insane, and path tracing elevates the whole experience a huge amount compared to the default RTGI, because every nook and cranny on every object casts accurate shadows and bounce lighting on itself and the environment.

I assume Doom is going to be the same way.

[–] Codilingus@sh.itjust.works 1 points 35 minutes ago* (last edited 28 minutes ago)

Just chiming in that I played Indiana Jones with zero problems and great performance on my 6800 XT. And that was without any FSR, which I'm not sure is even available yet.

[–] MrPoopbutt@lemmy.world 4 points 15 hours ago (2 children)

Not true anymore.

The Indiana Jones game that just came out does require ray tracing. There are a few others coming out that do as well.

[–] potustheplant@feddit.nl 1 points 3 hours ago

Which other currently released games require it? I haven't found any.

Hopefully this doesn't become a trend because it'd be pretty dumb and anti-consumer.

[–] sinceasdf@lemmy.world 1 points 9 hours ago

So anyone with an older or non-Nvidia card can just get fucked?

This sounds like an Nvidia monopolistic backroom deal or something. Bet you'll see an Nvidia splash screen on those games' startup.

[–] RealFknNito@lemmy.world 6 points 1 day ago (1 children)

I've never bought Nvidia, but they become more like Apple every day. Why be consumer-friendly for niche PC builders? The average gamer already associates Nvidia with performance, so it's time to rely on good ol' brand loyalty!

[–] Noobnarski@lemmy.world 4 points 17 hours ago (1 children)

The problem is, it's not just an association. NVIDIA cards are the fastest cards, hands down. I wish Intel and AMD would provide competition on the high end, but they just don't.

Even worse, the best next-gen AMD GPU won't even beat AMD's best last-gen GPU; they say so themselves.

[–] RealFknNito@lemmy.world 2 points 7 hours ago

To me, buying Nvidia for performance is like buying an APC as a daily driver for work because of its safety rating. The long-term cost does not seem worth it at all.

[–] Viri4thus@feddit.org 9 points 1 day ago

The drop in clocks in certain situations, which a lot of outlets are "conveniently" attributing to CPU limitations, has all the hallmarks of throttling... It's hard to criticise the incumbent monopoly holder when they have a history of blacklisting outlets that espouse consumer advocacy.

[–] edgemaster72@lemmy.world 31 points 2 days ago* (last edited 2 days ago) (1 children)

Surely if the card is damaged due to overheating, the customer won't be blamed since they can't keep track of the hottest part of the card, right? Right?

Haaaaahahahahahahaahahakxjvjfhorbgkfbdjdv

Funniest shit I've read all week

[–] ravshaul@discuss.tchncs.de 5 points 1 day ago (1 children)

Can AIBs add extra sensors for the OS to read, or will the Nvidia driver not provide that level of information?

[–] empireOfLove2@lemmy.dbzer0.com 8 points 1 day ago* (last edited 1 day ago) (1 children)

Unlikely, as the hotspot sensors and detection logic are baked into the chip silicon and its microcode; AIBs can only change the PCB around the die. I'd almost guarantee the thermal sensors are still present to avoid fires, but if Nvidia has turned off external reporting outside the chip itself (beyond telling the driver that the thermal limit has been reached), I doubt AIBs are going to be able to crack it either.

Also, the way Nvidia operates, if an AIB deviates from Nvidia's mandatory process, they'll get blackballed and put out of business. So they won't. Daddy Jensen knows best!

[–] ravshaul@discuss.tchncs.de 2 points 1 day ago (1 children)

We'll find out how the 5080 is on Thursday, but I expect the 5070 Ti to run cool.

[–] empireOfLove2@lemmy.dbzer0.com 2 points 1 day ago (1 children)

Oh, I'm sure the lower cards will run cool and fine for average die temps. The 5090 is very much a halo product with that ridiculous 600 W TBP. But as with any physical product, things decay over time or get assembled incorrectly, and that's what hotspot temp reporting helps diagnose.

[–] ravshaul@discuss.tchncs.de 3 points 1 day ago (1 children)

Isn't a GPU that pulls 600 watts in whackjob territory?

The engineers need to get the 6090 down to 400 watts. That would be a very big PR win that wouldn't need any marketing spin to sell.

[–] alphabethunter@lemmy.world 3 points 1 day ago

It's not a node shrink, just a more AI-focused architecture on the same node as the 4090. To get more performance they need more powah. I've seen reviews stating a ~25% increase in raw performance at the cost of ~20% more powah than the 4090.
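
Taking those rough figures at face value, the efficiency gain is tiny. A quick back-of-the-envelope check (the numbers are the ones quoted above, not measurements):

```python
# Rough perf-per-watt math using the figures quoted above (~+25% perf, ~+20% power).
perf_gain = 1.25   # relative raw performance vs the 4090
power_gain = 1.20  # relative board power vs the 4090

perf_per_watt = perf_gain / power_gain
print(f"perf/W vs 4090: {perf_per_watt:.2f}x (~{(perf_per_watt - 1) * 100:.0f}% better)")
# ~1.04x, i.e. only about 4% more efficient on the same node
```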

[–] FeelzGoodMan420@eviltoast.org 7 points 2 days ago (2 children)

A little dramatic but okay.

[–] empireOfLove2@lemmy.dbzer0.com 28 points 2 days ago* (last edited 2 days ago)

Is it though?

The hotspot temp sensors are one of the most critical diagnostic readings an end user can have. When the thermal interface material begins to degrade (or leak out of the rubber gasket, in the case of the 5090's liquid metal), your package temp may only go up a few degrees C, but your hotspot may increase by 10-20 C or more. That indicates problems and is almost certainly one of the leading causes of dead and crashing GPUs; it's also the easiest one to detect and fix.

Removing this quite literally has zero engineering reason beyond

  • Hiding from reviewers the fact that the 5090 pulls too much power and runs too hot for a healthy lifespan, even with liquid metal and the special cooler
  • Fucking over consumers so they can no longer diagnose their own hardware
  • Ensuring more 5090s die rapidly, via the lack of critical monitoring, so that Nvidia's funny number can keep going up as people re-buy GPUs that cost more than some used cars every 2 years.

The sensors are still definitely there. They have to be for thermal management or else these things will turn into fireworks. They're just being hidden from the user at a hardware level.

This isn't even counting the fact that hotspot usually also includes sensors inside the VRMs and memory chips, which are even more sensitive to a bad TIM application and to running excessively warm for long periods of time.
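
For what it's worth, here's a rough sketch of the package-vs-hotspot delta check described above, using the pynvml bindings. Hotspot is not exposed through the official NVML API (which is the whole complaint here), so read_hotspot() below is a hypothetical placeholder for whatever source your tooling provides, and the 20 C threshold is just an assumed rule of thumb.

```python
# Sketch only: compares the package temp (real NVML call) against a hotspot reading
# (placeholder) and warns when the gap suggests degraded thermal interface material.
import pynvml

DELTA_WARN_C = 20  # assumed rule-of-thumb gap that hints at TIM degradation


def read_hotspot(handle):
    """Hypothetical hotspot source; NVML does not expose it officially."""
    raise NotImplementedError("substitute your own hotspot reading here")


def check_tim_health(index: int = 0) -> None:
    pynvml.nvmlInit()
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(index)
        package = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        hotspot = read_hotspot(handle)
        delta = hotspot - package
        print(f"package {package} C | hotspot {hotspot} C | delta {delta} C")
        if delta >= DELTA_WARN_C:
            print("large hotspot delta: TIM may be degrading, consider a repaste/RMA")
    finally:
        pynvml.nvmlShutdown()
```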

[–] brucethemoose@lemmy.world 7 points 2 days ago* (last edited 2 days ago)

It looks bad with the insane TDP they run at now. They could cut 33% of it off and probably lose like 5-10% perf depending on the SKU. Maybe even less.

It also looks a lot like planned obsolescence.
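
If you want to test that kind of TDP cut on your own card, you can cap the power limit yourself and re-run your benchmarks. A minimal sketch using pynvml's power-management calls (the same knob `nvidia-smi -pl` turns); it assumes a card and driver that allow the change, needs admin/root, and the 0.67 fraction is just the "cut 33% off" from the comment above used as an example.

```python
# Sketch: cap board power at a fraction of the default limit, then benchmark to see
# how little performance is actually lost. Requires admin/root and a card that
# permits power-limit changes; the target is clamped to the driver-reported range.
import pynvml


def cap_power(index: int = 0, fraction: float = 0.67) -> None:
    pynvml.nvmlInit()
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(index)
        default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
        min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
        target_mw = int(default_mw * fraction)
        target_mw = max(min_mw, min(max_mw, target_mw))  # stay inside the allowed range
        pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
        print(f"power limit: {target_mw / 1000:.0f} W (default {default_mw / 1000:.0f} W)")
    finally:
        pynvml.nvmlShutdown()


if __name__ == "__main__":
    cap_power()
```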