this post was submitted on 20 Dec 2023
280 points (97.6% liked)
Linux
I would install this if I had made the objectively wrong decision to buy an apple computer.
they are downvoting you, but you're absolutely right.
They can hardly be repaired, and it's impossible to upgrade them at all: even something as basic as swapping the SSD requires desoldering. They're still sold with 8 GB of RAM as the base configuration, and it can't be upgraded.
It isn't worth it at all.
Just don't buy an 8 GB model; easy fix. But seriously, once you have a laptop that lets you work 8 hours straight on battery and still leaves 30% capacity at the end of the day, there's no chance you'd go back to an Intel system and plug it in every 2 hours.
I have a 1½-year-old laptop with an AMD Ryzen 6860Z processor and regularly get 9 hours running NixOS doing programming/browsing/chat. That's not quite 8 hours with 30% to spare, but good enough that I don't worry about carrying my charger (though, being a lightweight GaN one, it normally stays in my bag just in case). Apple folks have this tendency to think all their hardware is massively better, but even if it is 'better', it's often by a small margin that doesn't make a big difference, especially when you factor in cost.
I did some actual measurements just to confirm it. Here is Minecraft in its default configuration running at 100 fps; the CPU+GPU consumption is around 6 W in total. If you add about 5 W for the display backlight and other components, that works out to 9-10 hours of play time on my 100 Wh battery.
https://imgur.com/a/C5QuC9v
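The 9-10 hour figure is just battery capacity divided by total draw; a minimal sketch using the numbers quoted above (the function name is my own, not from any tool):

```python
def runtime_hours(battery_wh: float, draw_w: float) -> float:
    """Estimated battery runtime: capacity (Wh) divided by average draw (W)."""
    return battery_wh / draw_w

# 6 W CPU+GPU plus ~5 W backlight/other components, 100 Wh battery:
print(f"{runtime_hours(100, 6 + 5):.1f} h")  # prints "9.1 h", matching the 9-10 h estimate
```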
Can you please take the same measurements on your system? Maybe a Ryzen system is better than Intel; I've never had one.
I don’t own Minecraft (nope to Microsoft-owned software) nor would I have a reason to do 3D gaming on a battery… are you gaming at a café, the library, or something?
For example, when watching a 1080p YouTube video in Safari, the power consumption is only 0.1 W because it uses the hardware decoders (not including the display backlight, which I can't measure). But when I play the same video in Firefox, which uses software decoding, the consumption is around 0.7 W. Not as good as the hardware decoders, but still less than a watt.
No, it's just an easy sustained load that can be measured accurately. If you have some other application that provides a sustained load but doesn't spin all the cores to 100%, please suggest it and I'll try it.
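Not an application as such, but a tunable sustained load that doesn't pin a core at 100% can be generated with a duty-cycled busy loop; a minimal sketch (the function name and parameters are my own):

```python
import time

def duty_cycle_load(duty: float, period_s: float = 0.1, duration_s: float = 5.0) -> None:
    """Busy-spin for `duty` fraction of each period, sleep for the rest.

    Produces a steady, adjustable single-core load (e.g. duty=0.5 -> ~50%)
    rather than saturating the core, so average power draw stays comparable
    between runs.
    """
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        busy_until = time.monotonic() + duty * period_s
        while time.monotonic() < busy_until:
            pass                                  # burn CPU for the busy phase
        time.sleep((1.0 - duty) * period_s)       # idle for the remainder

duty_cycle_load(duty=0.5, duration_s=2.0)  # ~50% load on one core for 2 s
```

Run one copy per core you want loaded; the actual percentage will wobble a bit with scheduler granularity, but it's steady enough to read average wattage off a power monitor.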
Don't be unrealistic. Most laptops in the MacBook price range will get 8 hours of use in low-power mode, or around 5-6 if you need more performance.
And at that price point they come with at least 32 GB of RAM that can be upgraded, swappable SSDs with more capacity than the MacBook's, a far better keyboard, and more ports.
Do the MacBooks get some extra performance per watt of battery? Yeah, I guess. But after 2 years, when that battery life is gone, you'll probably be buying the newer model, or wishing you had bought a laptop with a replaceable battery.
The thermals and battery life of my Apple silicon MacBooks are unlike any other laptop I’ve owned. When I first got one, I started thinking of recharging it not in hours, but in days. 3-4 days between charges was normal for typical use. Mind you that was not full workdays, but the standby time was so good that I didn’t have to consider that the battery would decrease overnight or in my bag. I’ve used multiple Dell, Thinkpad, and Intel Mac laptops over the past decade as well and none of them come within spitting distance on battery life and thermals. I really hope that Qualcomm can do for other manufacturers what Apple silicon has done for MacBooks.
While I completely agree on the repairability front, which is really quite unfortunate and frankly a shame (at least iPhones have been getting more repairable; silver lining, I guess? damned need for never-ending profits), it's just… not unrealistic.
That being said, unified memory kind of sucks but it’s still understandable due to the advantages it brings, and fixed-in-place main storage that also stores the OS is just plain shitty. It’ll render all these devices unusable once that SSD gives out.
Anyhow, off the tangent again: I have Stats installed for general system monitoring, as well as AlDente to limit charging to 80% of maximum battery capacity. All that to say: after around 1.5 years of owning the M2 MacBook Air (which I'd been waiting to buy since late 2019, btw), I know pretty well which wattages to expect and can gauge its power usage accurately.
I’ll try to give a generalized rundown:
Given the spec sheet's 52 Wh battery, you can draw your own conclusions about the actual runtime of this thing by simple division. I leave it mostly plugged in to preserve the battery for when it becomes a couch laptop in around 5-8 years, so I can't actually testify to that yet; I just know the numbers.
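That "simple division" spelled out, against the 52 Wh spec-sheet figure. The draw values here are illustrative assumptions of mine, not measurements from the comment:

```python
def runtime_h(battery_wh: float, avg_draw_w: float) -> float:
    """Runtime estimate: capacity (Wh) over average draw (W)."""
    return battery_wh / avg_draw_w

BATTERY_WH = 52  # M2 MacBook Air, per the spec sheet

# Hypothetical average-draw scenarios (my assumptions, not measured values):
for label, draw_w in [("light browsing", 4), ("mixed work", 7), ("sustained load", 15)]:
    print(f"{label:>14}: {runtime_h(BATTERY_WH, draw_w):.1f} h")
# light browsing: 13.0 h, mixed work: 7.4 h, sustained load: 3.5 h
```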
I didn’t mean for this to come off as fanboi-y as it did now. I also really want to support Framework, but recommending it universally from my great-aunt to my colleagues is not as easy as it is with the MacBook. Given they’re a company probably 1,000 times smaller than Apple, what they’re doing is still tremendously impressive, but in all honesty, I don’t see myself leaving ARM architecture anytime soon. It’s just too damn efficient.
*At least for my typical usage, which is a browser with far too many tabs and windows open + a few shell sessions + a text editor (which may or may not run in the shell), sometimes a full-fledged IDE, but mostly just text editors with plugins.
I did some actual measurements just to confirm it. Here is Minecraft in its default configuration running at 100 fps; the CPU+GPU consumption is around 6 W in total. If you add about 5 W for the display backlight and other components, that works out to 9-10 hours of play time on my 100 Wh battery.
https://imgur.com/a/C5QuC9v
Can you please take the same measurements on your system? I'd like to see how good the alternative is.
My system is 7 years old, so it wouldn't be a fair comparison. Maybe others can help.