[–] earphone843@sh.itjust.works 20 points 15 hours ago (2 children)

It depends on the computer, but the power usage could easily be 250W+. While not a ton of power, it adds up quickly.

But that's only if you don't have your computer set to sleep/hibernate.
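For a rough sense of how quickly it adds up (the $0.14/kWh rate here is just an assumed example, not anyone's actual bill):

```python
# Annual cost of a PC drawing a constant 250 W, left on 24/7.
watts = 250
hours_per_year = 24 * 365                      # 8760 h
kwh_per_year = watts * hours_per_year / 1000   # 2190 kWh
print(kwh_per_year * 0.14)                     # ~$306/year at an assumed $0.14/kWh
```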

[–] cmnybo@discuss.tchncs.de 14 points 15 hours ago (2 children)

Idle power is not usually that high unless you are talking about a multi-socket server.
A gaming PC usually idles at less than 100W, and an office PC at less than 25W.

[–] Eheran@lemmy.world 6 points 13 hours ago

Why waste 25 W when entering and leaving sleep mode is a matter of 5 keystrokes and 3 seconds?

[–] earphone843@sh.itjust.works 1 points 10 hours ago* (last edited 2 hours ago) (2 children)

~~25W still adds up. A general rule of thumb is to add a zero to the wattage to get the dollar cost of running it for a year. I don't want to spend $250 a year letting my computer idle.~~

I definitely misremembered things

[–] TaTTe@lemmy.world 3 points 2 hours ago (1 children)

That's some hella expensive electricity you're buying there. I'm getting mine at 14 cents/kWh, which is roughly 1.2€/W per year. This isn't even close to the cheapest option available.
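A quick sanity check of that conversion (the prices are the ones mentioned in this thread, not universal figures):

```python
# Convert an electricity price per kWh into the cost of one watt
# of continuous draw over a year (1 W * 8760 h = 8.76 kWh).
def cost_per_watt_year(price_per_kwh: float) -> float:
    return price_per_kwh * 8760 / 1000

print(cost_per_watt_year(0.14))  # ~1.23, matching the ~1.2 €/W/year above
print(cost_per_watt_year(1.00))  # ~8.76 €/W/year at 2022 crisis prices
```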

[–] earphone843@sh.itjust.works 3 points 2 hours ago* (last edited 2 hours ago) (1 children)

You know what, you're right. Idk what the fuck I was thinking. I must have misremembered the math from the last time I did it.

I swear I did the math like a year ago and it added up, but that's clearly a false memory. It's closer to $1 per watt per year. I downvoted my own comment.

[–] TaTTe@lemmy.world 1 points 2 hours ago (1 children)

It could've been closer to the truth in 2022. At least in Europe, when energy prices skyrocketed, I think I paid closer to 1€/kWh.

[–] earphone843@sh.itjust.works 1 points 2 hours ago

Maybe it was 2022. Working from home has fucked my perception of time.

[–] rumschlumpel@feddit.org 1 points 5 hours ago (1 children)

That calculation only makes sense if you never shut down your computer, not if you only leave it on when you accidentally hit "restart" and need to go right away.

[–] earphone843@sh.itjust.works 1 points 3 hours ago (1 children)

Lots of people leave their computers running 24/7, though. The top-level comment said the power draw would be small, so I just wanted to point out that what might look like a negligible amount of power can add up to more than you'd expect.

[–] rumschlumpel@feddit.org 1 points 1 hour ago

That's not really what's being discussed here, though. There's a big difference between doing it all the time and only doing it once in a blue moon.

[–] tal@lemmy.today 4 points 14 hours ago* (last edited 14 hours ago)

> but the power usage could easily be 250W+

I mean, a beefy GPU could be ~400W, and a beefy CPU another ~200W. But that's peak draw from those components, and they're designed to drastically reduce power consumption when they aren't actually under load. You don't have to power them down via sleep/hibernation to achieve that -- they can already reduce runtime power themselves. You shouldn't normally have software significantly loading them (especially right after a reboot). And if you do have something crunching in idle time to that degree, like, I don't know, SETI@Home or something, then you probably don't want it shut off anyway.

The reason fans can "spin up" on the CPU and the GPU when they're under load is because they're dissipating much more heat, which is because they're drawing much more power.
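If you want to watch this happen rather than take my word for it, on Linux the kernel's RAPL powercap interface exposes a cumulative energy counter for the CPU package. A minimal sketch (the sysfs path varies by machine, reading it may require root, and it only works on Intel and recent AMD CPUs):

```python
# Average CPU package power over a 10 s window, read from RAPL.
import time

PATH = "/sys/class/powercap/intel-rapl:0/energy_uj"  # cumulative energy in microjoules

def read_uj() -> int:
    with open(PATH) as f:
        return int(f.read())

start = read_uj()
time.sleep(10)
joules = (read_uj() - start) / 1e6  # note: the counter wraps eventually; ignored in this sketch
print(f"average package power: {joules / 10:.1f} W")
```

Run it once at idle and once during a game or a benchmark and you'll see the gap for yourself.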