this post was submitted on 07 Jan 2025
577 points (96.6% liked)

    The NSA, the original primary developer of SELinux, released the first version to the open source development community under the GNU GPL on December 22, 2000.[6] The software was merged into the mainline Linux kernel 2.6.0-test3, released on 8 August 2003. Other significant contributors include Red Hat, Network Associates, Secure Computing Corporation, Tresys Technology, and Trusted Computer Solutions.

    https://en.wikipedia.org/wiki/Security-Enhanced_Linux

    [–] LodeMike@lemmy.today 13 points 15 hours ago (1 children)

    Same thing with SHA-1 and SHA-2, and I'm not sure about SHA-3.

    [–] heavydust@sh.itjust.works 3 points 2 hours ago

    Same for the improvements they made to DES, and for open-sourcing Ghidra. Sometimes they are the good guys.

    [–] possiblylinux127@lemmy.zip 72 points 1 day ago* (last edited 23 hours ago) (2 children)

    I'm not sure why that's a problem. The NSA needed strong security, so they created a project to serve that need. They are no longer in charge of SELinux, but I wouldn't be surprised if they still work on it occasionally.

    There are a lot of reasons to not like the NSA but SELinux is not one of them.

    [–] Honytawk@lemmy.zip 2 points 1 hour ago (1 children)

    So, how many backdoors do you think they implemented into the kernel?

    [–] possiblylinux127@lemmy.zip 1 points 10 minutes ago

    None.

    There are always exploits to be used. Besides, there isn't a lot of use in kernel-specific exploits.

    [–] frezik@midwest.social 38 points 23 hours ago (1 children)

    That's the trouble with the NSA. They want to spy on people, but they also need to protect American companies from foreign spies. When you use their stuff, it's hard to be sure which part of the NSA was involved, or whether both were in some way.

    [–] possiblylinux127@lemmy.zip 17 points 23 hours ago (2 children)

    The NSA has a fairly specific pattern of behavior. They work in the shadows, not in the open. They target things with low visibility, so it is hard to trace. Backdooring SELinux would be uncharacteristic and silly. They target things like hardware supply chains and ISPs. Their operations aren't even that covert, as they work with companies.

    [–] The_Decryptor@aussie.zone 6 points 12 hours ago

    They were a bit too public with "Dual_EC_DRBG", to the point where everybody just assumed it had a backdoor and avoided it; the NSA ended up having to pay people to use it.

    [–] frezik@midwest.social 15 points 22 hours ago (1 children)

    The specific example I'm thinking of is DES. They messed with the S-boxes, and nobody at the time knew why. The assumption was that they weakened them.

    However, some years later, cryptographers working in public developed differential cryptanalysis to break ciphers. Turns out, those changed S-boxes made it difficult to apply differential cryptanalysis. So it appears they actually made it stronger.

    But then there's this other wrinkle. They limited the key size to 56 bits, which even at the time was known to be too small. Computers would eventually catch up to that. Nation states would be able to break it, and eventually well-funded corporations would too. That time came in the 90s.

    It appears they went both directions with that one. They gave themselves a window where they would be able to break it when few others could, including anything they had stored away over the decades.
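
    To put rough numbers on that 56-bit window (the search rate below is an assumption, in the ballpark of EFF's 1998 "Deep Crack" machine, just to show the order of magnitude):

    ```c
    /* Back-of-the-envelope: why a 56-bit key stops being "secure enough".
     * The search rate is an illustrative assumption, not a historical
     * benchmark. */
    #include <stdio.h>

    int main(void)
    {
        double keys = 72057594037927936.0; /* 2^56 possible DES keys */
        double rate = 9.0e10;              /* assumed ~90 billion keys/second */
        double secs = keys / rate / 2.0;   /* expected cost: half the keyspace */

        printf("expected search time: %.1f days\n", secs / 86400.0);
        return 0;
    }
    /* prints roughly 4.6 days: broken for anyone who can build such a machine */
    ```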

    [–] possiblylinux127@lemmy.zip 4 points 22 hours ago

    Honestly I think it ultimately comes down to the size of the organization. Chances are the right hand doesn't know what the left hand is doing.

    I do like the direction the US is heading in. Some top brass have finally caught on that you can't limit access to back doors.

    [–] mariusafa@lemmy.sdf.org 20 points 1 day ago (2 children)

    If they afterwards released it under a Free (Libre) Software licence, then it's fine. The licence itself prohibits any obfuscation, or combining obfuscated code with libre code. If you have the entire code, not just some part (which is what most companies give you when they go Open Source rather than free software), then you don't have to worry about unknown behavior, because everything is in the source.

    [–] BorgDrone@lemmy.one 10 points 15 hours ago (1 children)

    If you have the entire code, not just some part (which is what most companies give you when they go Open Source rather than free software), then you don't have to worry about unknown behavior, because everything is in the source.

    Hahaha, good joke

    [–] mariusafa@lemmy.sdf.org 5 points 12 hours ago

    I mean, if you have the entire source, then you have everything needed to reproduce the program. Finding a malicious part doesn't depend only on the source but also on the inspector, that is true.

    But anyway, having the entire code, and not just the part a company feels like sharing, is better. Even if it's literally malware.

    Users of free software depend on the community to detect malicious code. But at least with the source code there's a way of doing so.

    If I tell you that this building has a structural deformation, having access to the architect's blueprints and list of materials is better than just being able to go inside the building and search for it, no?

    [–] possiblylinux127@lemmy.zip 14 points 23 hours ago

    Also, it is no longer under the NSA, and the original NSA branding was removed due to concerns from the community.

    [–] ricecake@sh.itjust.works 109 points 1 day ago (4 children)

    While they created a set of patches that would implement the security features SELinux provides, what was actually merged was the result of several years of open collaboration and development towards implementing those features.

    There's general agreement that the idea the NSA proposed is good and an improvement, but there was, and still is, disagreement about the specific implementation approaches.
    To avoid issues, the approach taken was to create a more generic system that SELinux would then take advantage of. That's why SELinux, AppArmor, and others can live side by side without it being a constant maintenance and security nightmare. Each one lives in its own little self-contained, auditable box, and the kernel just makes the "check authorization" function call, which flows into the right module by configuration. Roughly the shape sketched below.
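
    A toy userspace sketch of that dispatch pattern (the struct and function names here are invented for illustration; the real LSM framework registers whole tables of hooks inside the kernel):

    ```c
    /* The kernel calls one generic "check authorization" entry point, and
     * whichever security module is active (SELinux, AppArmor, ...) supplies
     * the decision. Plain C for illustration -- not the real LSM interface. */
    #include <stdio.h>
    #include <string.h>

    struct security_hooks {
        const char *name;
        int (*file_open)(const char *path); /* 0 = allow, -1 = deny */
    };

    /* A toy "SELinux-like" module that denies opening one sensitive path. */
    static int toy_file_open(const char *path)
    {
        return strcmp(path, "/etc/shadow") == 0 ? -1 : 0;
    }

    static struct security_hooks toy_module = { "toy-lsm", toy_file_open };
    static struct security_hooks *active = &toy_module; /* set by "configuration" */

    /* The generic kernel-side call: it has no idea which module is loaded. */
    static int security_file_open(const char *path)
    {
        if (active && active->file_open)
            return active->file_open(path);
        return 0; /* no module loaded: default allow */
    }

    int main(void)
    {
        printf("/etc/passwd -> %d\n", security_file_open("/etc/passwd")); /* 0 (allow) */
        printf("/etc/shadow -> %d\n", security_file_open("/etc/shadow")); /* -1 (deny) */
        return 0;
    }
    ```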

    The Linux community was pretty paranoid about the NSA in 2000, so the code definitely got a lot more scrutiny than the typical proposal.

    A much easier way to introduce a backdoor would be to start a tiny company that produces some arbitrary piece of hardware which you then add kernel support for.

    https://github.com/torvalds/linux/tree/master/drivers/input/keyboard - that's just the keyboard drivers.

    Now you're adding code to the kernel, and with the right driver and development ability you can plausibly make changes that have non-obvious impacts. As a bonus, if someone notices, you can just say "oops!" and not be "the god-damned NSA" who everyone expects to be up to something, but instead four humble keyboard enthusiasts with an esoteric set of lighting and input opinions, of the sort that are a dime a dozen on Kickstarter.

    [–] the_crotch@sh.itjust.works 8 points 1 day ago (4 children)

    The Linux community was pretty paranoid about the NSA in 2000, so the code definitely got a lot more scrutiny than the typical proposal.

    It's not paranoia if it's true. Snowden showed us that they really are spying on all of us, all the time.

    [–] ricecake@sh.itjust.works 4 points 18 hours ago

    Paranoia in the sense of being concerned with the ill intent of others, not the sense of an irrational worry about persecution. Much like how the intelligence community itself is said to have institutional paranoia.

    [–] possiblylinux127@lemmy.zip 4 points 21 hours ago

    It is also important to note that it is pretty easy to do surveillance these days. People carry around cell phones, and there are massive camera systems that can track someone in high detail.

    [–] winterayars@sh.itjust.works 40 points 1 day ago

    We saw a very sophisticated attack on Linux last year with the xz backdoor. That stuff is terrifying and the sort of thing people should be worried about. SELinux is tame by comparison.

    [–] technohacker@programming.dev 127 points 1 day ago (1 children)

    I mean, leaving aside their surveillance tasks, it's still their job to ensure national security. It's in their best interest to keep at least themselves and their nation safe, and considering how prevalent Linux is on servers, they likely saw a net benefit this way. They even open-sourced their reverse engineering toolkit Ghidra in a similar vein.

    [–] carpelbridgesyndrome@sh.itjust.works 24 points 1 day ago (1 children)

    Ghidra was about hiring and cost savings. It's easier to hire when people already know your tools. People are also more willing to use your tools rather than expensive ones if they can still use them when they leave (go into contracting). And interoperability with contractors may improve.

    [–] technohacker@programming.dev 17 points 1 day ago

    And we're all the better for it! It needs polish and development, of course, but it's a decent alternative already.

    [–] kekmacska@lemmy.zip 45 points 1 day ago (2 children)

    Only because they use Linux too and had to make it public as Linux is a public, open-source kernel

    [–] axx@slrpnk.net 2 points 2 hours ago

    Not really. Do you know how many proprietary, company-specific extensions and modules there are of the Linux kernel out there?

    Loads of companies choose not to contribute their stuff back upstream. I don't know why the NSA did in the case of SELinux, but I would guess it had to do with transparency, national defense, and not carrying the burden of a module/fork solo. They were also not the only contributors, even early on, according to the Wikipedia page.

    Also, if I recall correctly, there was no other option for MAC back then (no AppArmor or Tomoyo).

    [–] lengau@midwest.social 60 points 1 day ago (1 children)

    GPLv2 only says that people with access to the binary need access to the source code too. If they only used it internally, they'd never have to make it public.

    [–] SnotFlickerman@lemmy.blahaj.zone 148 points 1 day ago* (last edited 1 day ago) (10 children)

    I mean, it's still Open Source, right? So it would be pretty hard for them to hide a backdoor or something??

    I guess I don't know what's so sus when it's easily auditable by the community and has been for two decades now.

    If it's just because it's memes and you're not being that serious, then disregard please.

    [–] byrtzr@lemmy.world 118 points 1 day ago (3 children)

    I mean, it’s still Open Source, right? So it would be pretty hard for them to hide a backdoor or something??

    Right, but maybe it's what helps them with some exploit when combined with other tools they have. Like they figured out an exploit but needed SELinux as a piece of the puzzle. It's open source and we can all read the code, but we can't see the other pieces of the puzzle.

    Come on, put your conspiracy hat on! ;)

    [–] ocean@lemmy.selfhostcat.com 105 points 1 day ago (1 children)

    This is why I use TempleOS

    [–] Natanox@discuss.tchncs.de 74 points 1 day ago (1 children)

    Dude, that one got a spy built-in! Direct telemetry to god.

    [–] ocean@lemmy.selfhostcat.com 33 points 1 day ago

    That explains why it doesn’t need internet

    [–] Shareni@programming.dev 5 points 1 day ago

    I mean, they almost certainly have built in backdoors like IME. When you can force hardware manufacturers to add shit, you don't have to think up convoluted solutions like that.

    [–] spacecadet@lemm.ee 41 points 1 day ago (1 children)

    I maintain open source software on a much smaller codebase that is less security-critical. We have dozens of maintainers on a project with about 3k stars on GitHub. Stuff that is potentially a security vulnerability gets by, and we don't know until upstream sources tell us there is a vulnerability.

    [–] onlinepersona@programming.dev 16 points 1 day ago (2 children)

    Wasn't Signal messenger also funded by the NSA+DARPA? And Tor too?

    Anti Commercial-AI license

    [–] MajorHavoc@programming.dev 12 points 23 hours ago* (last edited 23 hours ago) (2 children)

    Signal is weird about actually allowing others to reproduce the APK builds.

    Specifically, they are the kind of weird about it that one would expect if the Signal client app had an NSA back-door injected at build time.

    This doesn't prove anything. It just stands next to anything and waggles its eyebrows meaningfully.

    [–] FooBarrington@lemmy.world 7 points 20 hours ago (1 children)

    Do you have more recent information from Signal on the topic? The GitHub issue you linked is actually concerned with publicly hosting APKs. They also seem to have been offering reproducible builds for a good while, though it's currently broken according to a recent issue.

    [–] MajorHavoc@programming.dev 5 points 20 hours ago* (last edited 19 hours ago)

    I had a hard time choosing a link. Searching GitHub for "F-Droid" reveals a long convoluted back-and-forth about meeting F-Droid's requirements for reproducible builds. Signal is not, as of earlier today, listed on F-Droid.

    F-Droid's reproducibility rules are meant to cut out the kind of shenanigans that would be necessary to hide a back door in the binaries.

    Again, this isn't proof. But it's beyond fishy for an open source security tool.

    Edit: And Signal's official statements on the topic are always reasonable - but kind of bullshit.

    Reasonable in that I would absolutely accept that answer, if it were the first time that Signal rejected a contribution to add it to F-Droid.

    Bullshit in that it's been a long time, lots of folks have volunteered to help, and Signal still isn't available on F-Droid.
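
    For what it's worth, the core of a reproducibility check is simple: rebuild from source, then compare your artifact with the published one byte for byte. A toy sketch of that comparison step (real verifiers, e.g. Signal's apkdiff script, first strip the signing metadata, which legitimately differs; this sketch does not):

    ```c
    /* Toy byte-for-byte comparison of a locally built artifact against
     * the published one. */
    #include <stdio.h>

    /* Returns -1 if identical, else the offset of the first differing byte. */
    static long first_difference(FILE *a, FILE *b)
    {
        long off = 0;
        int ca, cb;
        for (;;) {
            ca = fgetc(a);
            cb = fgetc(b);
            if (ca != cb)
                return off;      /* also catches one file ending early */
            if (ca == EOF)
                return -1;       /* both ended together: identical */
            off++;
        }
    }

    int main(int argc, char **argv)
    {
        FILE *a, *b;
        long d;

        if (argc != 3) {
            fprintf(stderr, "usage: %s local.apk published.apk\n", argv[0]);
            return 2;
        }
        a = fopen(argv[1], "rb");
        b = fopen(argv[2], "rb");
        if (!a || !b) {
            perror("fopen");
            return 2;
        }
        d = first_difference(a, b);
        if (d < 0)
            puts("identical: the published binary matches your build");
        else
            printf("differs at byte offset %ld\n", d);
        fclose(a);
        fclose(b);
        return d < 0 ? 0 : 1;
    }
    ```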

    [–] kadup@lemmy.world 6 points 23 hours ago* (last edited 23 hours ago) (1 children)

    There was a "ultra private" messaging app that was actually created by a US state agency to catch the shady people who would desire to use an app promising absolute privacy. Operation "Trojan Shield".

    The FBI created a company called ANOM and sold a "de-Googled ultra private smartphone" and a messaging app that "encrypts everything" when actually the device and the app logged the absolute shit out of the users, catching all sorts of criminal activity.

    I have no proof, but I do have a small list of companies I actually suspect of pulling a similar stunt... perhaps not necessarily attached to the FBI or any other agency, but something about their marketing and business model screams "fishing for people who have something to hide"

    [–] SnotFlickerman@lemmy.blahaj.zone 4 points 23 hours ago

    The fact that it is a paid product should have been their first clue it was a honeypot.

    [–] xor@lemmy.dbzer0.com 5 points 22 hours ago

    no. and tor was originally funded by the navy…
    ….

    [–] el_bhm@lemm.ee 3 points 21 hours ago

    For people interested in the subject: read This Is How They Tell Me the World Ends: The Cyberweapons Arms Race by Nicole Perlroth.

    TL;DR: modern software is built on codebases with hundreds of thousands of lines of code. An early NSA hacker put forward the idea that a program of even 100k LoC will not be free of an exploitable hole.

    To be the target of a 0-day, you would have to piss off state-level actors.
