this post was submitted on 13 Aug 2023
973 points (99.0% liked)
Technology
But that's, like, the one place other than games where consumers are looking for performance. What's left, web browsing and MS Office?
"whew* my horrible bubble sort implementation is safe from performance impacts
May I suggest using miracle sort instead? It has the most stable performance of all.
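For anyone who hasn't met it, here's a minimal Python sketch of miracle sort (a joke algorithm, not a real library; the naming here is mine):

```python
import time

def is_sorted(xs):
    return all(a <= b for a, b in zip(xs, xs[1:]))

def miracle_sort(xs):
    # Check whether the list is sorted; if not, wait for a miracle
    # (e.g. a cosmic-ray bit flip) and check again. Termination is
    # not guaranteed, but per-check performance is extremely stable.
    while not is_sorted(xs):
        time.sleep(1)
    return xs
```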
I just skimmed through the article and it seems like this vulnerability is only really meaningful on multi-user systems. It allows one user to access memory dedicated to other users, letting them read stuff they shouldn't. I would expect that most consumer gaming computers are single-user machines, or only have accounts for trusted family members and whatnot, so if this mitigation causes too much of a performance hit, turning it off shouldn't be a big risk for those particular computers.
Would it mean that a malicious application being run in non-admin mode by one user could see data/memory in use by an admin user?
It would indeed imply that, which is why this vulnerability is also serious for single-user contexts.
This
All these kinds of CPU-level vulnerabilities are the same: they are only really "risky" if there is malicious software running on the computer in the first place.
The real problem is that these CPU-level vulnerabilities all break one of the core concepts of computers, which is process separation and virtual memory. If process separation is broken then all other levels of security become pointless.
While for desktops this isn't a huge problem (except that some of these vulnerabilities can even be exploited through browsers), it is a huge problem for servers, where the modern cloud usually puts multiple users' virtual machines on a single server, and a malicious user could steal information across virtual machines.
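To illustrate what process separation normally guarantees, here's a minimal sketch (Linux-only; PID 1 is just a convenient example of a root-owned process):

```python
# Sketch: under normal process separation, an unprivileged process
# cannot read another user's process memory through the usual interfaces.
try:
    with open("/proc/1/mem", "rb") as f:
        f.read(16)
    print("read succeeded; process separation failed us")
except OSError as e:
    # Expected: PermissionError for unprivileged users. CPU-level
    # vulnerabilities leak data below this OS enforcement layer,
    # so the denial here no longer means much.
    print("blocked by the OS:", e)
```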
Your first paragraph isn't quite right.
Modern hacks/cracks aren't a "do this and suddenly you are in" type deal.
It's a cascading chain of failures in non-malicious software.
Saying "don't have a virus" is absolutely correct, however that's not the concern here.
The concern is about the broadening of the attack surface.
A hacker gets minor access to a system, leverages some CVE to get a bit more access, and keeps poking around and trying CVEs (known or unknown) until they have enough access to exploit this one.
And then they can escape the VM onto the host or other VMs on the same system, which might then give them access to a VM on another host, and they can escape that VM to get access to another VM, and on and on.
Very quickly, there is a fleet of VMs that are compromised. And the only sign of someone poking around is on the first VM the hacker broke into.
All other VMs would be accessed using trusted credentials.
ETA:
In fact, it doesn't even need to be a hacker.
It could be someone uploading a CI/CD task under their own account that extracts all the API keys, usernames, and passwords it can find.
Suddenly, you have access to a whole bunch of repositories and APIs.
Then you can sneak some malicious code into the git repo, and suddenly your malicious code is being shipped inside legit software that gets properly signed and everything.
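To make that concrete: CI runners typically hand credentials to jobs as environment variables, so any task the runner executes can read them. A hypothetical sketch (the name patterns are illustrative, not exhaustive):

```python
import os

# Substrings that commonly appear in the names of injected credentials.
SUSPICIOUS = ("TOKEN", "KEY", "SECRET", "PASSWORD")

for name, value in sorted(os.environ.items()):
    if any(s in name.upper() for s in SUSPICIOUS):
        # A malicious task would exfiltrate these over the network;
        # here we only show that they are trivially readable.
        print(f"{name} = {value[:4]}...")
```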
It allows memory access across virtual machines as well, meaning all cloud VMs are vulnerable.
The machines that are running cloud VMs should obviously be patched. I wasn't talking about those.
Processes that run on the same system can run as different users (including kernel) which is used for privilege separation. This can still allow a program in userland to peer into otherwise restricted system processes or the kernel. Every system is a "multi-user" system, even if there is only a single human user.
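You can see this on any "single-user" Linux box by counting the distinct users that own running processes. A quick sketch (Linux-only):

```python
import os
import pwd

def name_of(uid):
    try:
        return pwd.getpwuid(uid).pw_name
    except KeyError:
        return str(uid)  # UID with no passwd entry (e.g. in a container)

# Collect the owner of every entry in /proc that looks like a PID.
uids = set()
for pid in filter(str.isdigit, os.listdir("/proc")):
    try:
        uids.add(os.stat(f"/proc/{pid}").st_uid)
    except FileNotFoundError:
        pass  # the process exited while we were scanning

names = sorted(name_of(uid) for uid in uids)
print(f"{len(names)} distinct users own running processes: {names}")
```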
Yes, but all the data that I care about is in my single human user's account already. If I install malicious software then I'm already hooped regardless.
Look, I'm not saying this is no biggie. There are plenty of systems out there that will have to install this patch. Single-user computers probably should too. The situation I'm addressing is the case where a gaming computer has its performance as a gaming machine measurably harmed by the patch's overhead, which is reportedly significant in some cases. In those cases it's reasonable to weigh the merits and decide that this vulnerability isn't all that big a problem.
Disagree. For non-security-conscious users who install that helper tool or plugin for their game, it can now read bank credentials from the browser.
If you're a non-security-conscious user installing malicious software on your computer then I don't think there's much that could help you.
But these are the people we (the security community) should be helping. If we don't help those who lack the skills to help themselves, scammers have a large target pool and keep on scamming. We are not the target.
Granted, this post isn't necessarily about that, but they'll be the ones targeted regardless. Sometimes the best way to reduce the attack surface is to focus on people, not software.
Well, that says it all. CPU manufacturers have no incentive at all to secure the computations of multiple users on a single CPU (or cores on the same die)... why would they? They make more cash if everyone has to buy their own complete unit, and they can outsource security issues to 'the network' or 'the cloud'...
Years ago, when I was in university, this would have been a deathblow to the entire product line, as multi-user systems were the norm. Students logged into the same machines to do their assignments, and employees logged into the same company servers for daily tasks.
I guess that isn't such a thing any more. But wow, what a sh*tshow modern CPU architecture has become, if concern for performance has completely overridden proper process isolation and security. We can't even trust that a few different users on the same machine can be separated properly due to the design of the CPU itself?
I'm not happy with what's happening and I know that corporations are money making evil machines.
But to say that chip makers have no incentive at all to secure their hardware is quite the hyperbole.
Fair enough, probably was hyperbole :) But performance does seem to be a higher priority than security; they can always spin PR after the next exploit. After all, users already have the CPU in their system and the company has made its money; what are users really gonna do if an issue comes up after they've bought their box?
What they will do is not buy from that company again.
Yeah, but we live in a CPU monopoly. Intel and AMD both put backdoors and all sorts of shit in their CPUs.
We don't live in a CPU monopoly. ARM and SoCs are also in the game.
I'm out of the loop with those. Are ARM and SoCs a viable alternative for home computing?
Last time I checked I could not build a PC with ARM. The post above is right: Intel and AMD dominate the home user market.
I have a MacBook Air M1 and this ARM chip is IMO just amazing. No fan, no issues, fast as fuck. I'd like to build a PC with ARM. Maybe when Linux and Windows show more support for ARM64?
Oh, for desktops? I don't know. I was referring to macbooks and mac minis.
Linux supports ARM64 very well. Windows has also had ARM support for quite a while. The main obstacles are 3rd-party binary software (particularly on Windows) and the lack of available hardware.
Are you aware that the majority of CPUs sold today go to cloud computing? Believe it or not, that is an application space with multiple users on the same machine.
Even on a single-user machine, multiple users are very much a thing. Even Apple has left behind the DOS-like architecture where everything runs with the same rights. All current systems run with multiple concurrent users, notably root (or the Windows equivalent) and the keyboard operator (as well as dedicated ones for the various services, although that's maybe more a thing in Unix/Linux than in Windows).
Good point. But I think performance is still a greater priority than basic security for those who make purchasing decisions, and that's the problem.
Not at the enterprise level.
Security means compliance, which means getting/keeping contracts and not getting sued.
And they care more about performance-per-watt and density.
Processor manufacturers target their devices and sales towards cloud computing so they have a huge incentive to avoid having issues like these. It’s ridiculous to suggest otherwise.
I see the reasoning, fair enough. Just grumpy this evening I suppose :p.
You're reading an awful lot into what I said that wasn't put in there.
There's nothing wrong with multi-user systems existing, there's plenty of use for such things. This bug is really bad for those sorts of things. I was explicitly and specifically talking about consumer gaming computers, which are generally single-user machines. Concern for performance is a very real and normal thing on a gaming computer, it's not some kind of weird plot. An actual multi-user system would obviously need to be patched.
I am so incredibly happy that those terrible multi-user systems are a thing of the past. Multiple seconds of wait time for every mouse click is no fun.
Hey! I'll have you know that a 68000-based server was good enough for about 60 users running X11 desktops back in the day!
Kids today with their Voodoo cards and whatnot.
When I was in university, they were probably running the same server, but with Ubuntu and 500 sessions at the same time. That crap was totally unusable.
It's not that they aren't impacted, it's just that you "don't see the impact" as noticeably.
As a programmer: compile times