this post was submitted on 10 Mar 2024
430 points (96.7% liked)

linuxmemes

21311 readers
381 users here now

Hint: :q!


Community rules

1. Follow the site-wide rules

2. Be civil
  • Understand the difference between a joke and an insult.
  • Do not harass or attack members of the community for any reason.
  • Leave remarks of "peasantry" to the PCMR community. If you dislike an OS/service/application, attack the thing you dislike, not the individuals who use it. Some people may not have a choice.
  • Bigotry will not be tolerated.
  • These rules are somewhat loosened when the subject is a public figure. Still, do not attack their person or incite harassment.
3. Post Linux-related content
  • Including Unix and BSD.
  • Non-Linux content is acceptable as long as it makes a reference to Linux. For example, the poorly made mockery of sudo in Windows.
  • No porn. Even if you watch it on a Linux machine.

4. No recent reposts
  • Everybody uses Arch btw, can't quit Vim, and wants to interject for a moment. You can stop now.

    Please report posts and comments that break these rules!


    Important: never execute code or follow advice that you don't understand or can't verify, especially here. The word of the day is credibility. This is a meme community -- even the most helpful comments might just be shitposts that can damage your system. Be aware, be smart, don't fork-bomb your computer.

    founded 1 year ago
     
    top 50 comments
    [–] Shady_Shiroe@lemmy.world 66 points 8 months ago (2 children)

I started using AMD cuz it was the "more bang for your buck" option, and thanks to my cheapness I have always had a great experience with Linux, excluding Wi-Fi breaking every few months.

    [–] Rustmilian@lemmy.world 23 points 8 months ago (1 children)
    [–] Virgo@lemmy.world 63 points 8 months ago

    Leave my wife’s icard out of your goddamn mouth

    [–] MeanEYE@lemmy.world 4 points 8 months ago (1 children)

I went with AMD because I got fed up with nVidia, similarly to OP, or at least the guy in the screenshot. Never looked back. Sure, AMD requires a binary blob to initialize the card, but it just works, and zero issues since then. Upgrade hardware, just transfer the drive to a new machine and voila, you are ready to go.

    [–] bigmclargehuge@lemmy.world 2 points 8 months ago (1 children)

I was on a GTX 1080 for a long time. Nothing absolutely dealbreaking, but lots of small niggling issues that took lots of annoying troubleshooting to fix. Plus, abysmal DX12 performance (which is a limitation of the card's Pascal architecture as far as I know; not everyone experiences it but it's common enough).

    Switched to an RX 7600XT and wow. Night and day. Zero configuration, zero weird issues, games perform fantastic at high settings (CP2077 at 1440p/High settings across the board is a pretty stable 80+ FPS, compared to 50fps at low and medium and 1080p with the old card, even on Windows). Complete gamechanger.

    [–] pH3ra@lemmy.ml 45 points 8 months ago
    [–] otacon239@feddit.de 45 points 8 months ago (11 children)

    I feel like I’m from an alien planet. I’ve been using nVidia cards exclusively since around 2014 and while I’ve certainly not had a perfect track record, 90% of the time, I’ve been pretty plug-and-play. Maybe I’ve been lucky or maybe it’s because I stick to the popular distros.

    In either case, from the perspective of openness, I do agree with the community that drivers shouldn’t be shrouded in mystery.

    [–] AProfessional@lemmy.world 36 points 8 months ago* (last edited 8 months ago) (2 children)

You just don’t notice what doesn’t work, like video decoding in your browser. You probably didn’t use a laptop with hybrid graphics. And you might not use GNOME, which has defaulted to Wayland, which was broken for many years. And you might use an outdated kernel, so it never broke. And you don’t use software that uses modern Linux features like dmabuf.

It’s fair not to have hit this situation, but it’s an easy one to run into.

    [–] someacnt_@lemmy.world 19 points 8 months ago (1 children)

Welp. My GNOME defaults to X11, and I am using a laptop. That said, it does not use hybrid graphics, but honestly using only the dedicated card works well enough. Still, fk nvidia. Their greed is overwhelming.

    [–] AProfessional@lemmy.world 5 points 8 months ago

Yes, it did fall back to X11, but it was truly a fallback: no developer used X11, features were ignored there, and it was just a worse experience.

    [–] nekusoul@lemmy.nekusoul.de 4 points 8 months ago (6 children)

    Or you might want to use G-Sync or other forms of VRR on a multimonitor setup, which you can't do under X11 and is broken on Wayland.

    [–] Cossty@lemmy.world 6 points 8 months ago (1 children)

I have a 1060; I bought it when it came out. Three years after that I completely switched to Linux. There were some problems with it on rolling distributions. And I still can't figure out hardware acceleration in Firefox. It either doesn't work or it is barely noticeable compared to software decoding. I still have a lot of skipped frames.
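
(Side note, not from the thread: one commonly suggested setup for NVIDIA hardware video decoding in Firefox goes through the third-party nvidia-vaapi-driver shim, plus flipping media.ffmpeg.vaapi.enabled in about:config. Below is a minimal sketch of the environment involved, assuming that shim is installed; per the sidebar, verify before running anything.)

```python
import os
import subprocess

# Environment variables commonly suggested for routing Firefox video decode
# through the nvidia-vaapi-driver shim (assumption: the shim is installed and
# media.ffmpeg.vaapi.enabled is set to true in about:config).
env = dict(os.environ,
           LIBVA_DRIVER_NAME="nvidia",   # point libva at the NVIDIA shim
           MOZ_DISABLE_RDD_SANDBOX="1",  # the RDD sandbox blocks the shim on some setups
           NVD_BACKEND="direct")         # backend hint used by newer shim versions

# vainfo (from libva-utils) confirms whether the VA-API driver loads at all.
subprocess.run(["vainfo"], env=env, check=False)
subprocess.run(["firefox"], env=env, check=False)
```

Whether decoding actually sticks can then be checked in about:support under the Media section.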

    [–] kbal@fedia.io 6 points 8 months ago (1 children)

    90% of the time, I’ve been pretty plug-and-play.

    If it only works 90% of the time that's not so good really.

    [–] dinckelman@lemmy.world 5 points 8 months ago

With a 1080 Ti, I've had my fair share of issues, but compared to how it used to be, it's a night and day difference. If you're still an X11 purist, everything works perfectly, and on Wayland, everything works even better than that, assuming you can launch your software in native Wayland mode.

    [–] neidu2@feddit.nl 3 points 8 months ago* (last edited 8 months ago)

    Exclusively nvidia card (or at least nvidia based card) since I got my GeForce 256 in 2001 after ditching my Voodoo2. No major issues beyond the ones I've caused myself.

    [–] elxeno@lemm.ee 41 points 8 months ago (1 children)
    [–] possiblylinux127@lemmy.zip 3 points 8 months ago (1 children)

For how long? When you fly near the sun, you will be burned.

    [–] lemmeee@sh.itjust.works 34 points 8 months ago (3 children)

    It's important to point out that AMD isn't perfect either (I don't know about Intel), since it requires you to install proprietary firmware. But it's obviously a huge improvement, since it doesn't require proprietary drivers. If we forced Nvidia to do what AMD does, we would be in a much better position. So if you care about freedom, Nvidia is the last company you should choose.

    [–] possiblylinux127@lemmy.zip 21 points 8 months ago (2 children)

If you are that concerned about firmware, modern hardware is not your friend. Everything from your CPU to your WiFi to integrated graphics requires proprietary software.

    [–] nexussapphire@lemm.ee 4 points 8 months ago (1 children)

    Things are changing fast. Nvidia has their own "open source drivers" that are almost identical to the proprietary ones and the NVK project has open source drivers that might outperform the proprietary drivers in most games.

Now the only reason you might install the proprietary drivers by the end of this year is CUDA and potentially OpenCL. I think they're protective of their drivers because about the only thing separating their RTX cards from their Quadro cards is their drivers and software-locked features. Quadros probably get put in more Linux systems than any other type of system.

    [–] baseless_discourse@mander.xyz 13 points 8 months ago (8 children)

The open source driver is maintained by the community, which spends a long time just getting over the man-made barriers put up by Nvidia.

    If you think locking down hardware is a practice against the spirit of open source, don't throw money at them.

    [–] HappyFrog@lemmy.blahaj.zone 26 points 8 months ago (1 children)

    I'd react the same if I was forced to install arch

    [–] nexussapphire@lemm.ee 8 points 8 months ago

You say that like it's a bad thing. 😄 Whatever he learns on Arch he can bring with him to any other distro. Heck, he could have tried it on the other distros to get his system working.

    I'm not trying to be mean but this sounds like someone who didn't understand his system at all and he's about to learn a lot.

    [–] queue@lemmy.blahaj.zone 25 points 8 months ago (8 children)

    I never understand why in 2024 you'd buy nvidia, unless you like paying more for less, or buying from scalpers for even more money. I guess some people really just go "More money spent on it, more better" no matter what.

    [–] finkrat@lemmy.world 28 points 8 months ago (3 children)

    People just think "gaming?? OH NO I NEED MY NVIDIA!!!!" while AMD is sitting there like "hey. Hey I have a card that'll work. Hey. Card. Right here. Works better in Linux. Less headaches. Hello. Hey person. Card. Hi."

    [–] mr_right@lemmy.dbzer0.com 6 points 8 months ago (1 children)

AMD is only the obvious way to go if you consider ray tracing to be a gimmick (which it is).

In reality it's because people bought their laptops and desktops before switching and want to use their already existing graphics cards.

    [–] finkrat@lemmy.world 3 points 8 months ago

    This is a very good point, I forgot gaming laptops are almost exclusively nvidia

    [–] Bye@lemmy.world 23 points 8 months ago (4 children)
    [–] angel@iusearchlinux.fyi 3 points 8 months ago

    Haven’t tried it, but might be worth looking into: https://github.com/vosen/ZLUDA

    [–] bjoern_tantau@swg-empire.de 13 points 8 months ago

    A 1070 is hardly a card anyone would buy in 2024. Maybe they were running Windows before that and didn't care that much.

Also, hard to believe, but for a long while nVidia actually gave you the better experience on Linux: before AMD bought ATI, and probably for a good while after the sale. The ATI drivers sucked ass.

    [–] eager_eagle@lemmy.world 5 points 8 months ago* (last edited 8 months ago) (3 children)

    I'd like to buy AMD, but I have all these use cases

    • HDMI 2.1 (4K @ 120Hz) - relevant after recent news, if planning to use open source drivers
    • CUDA + Machine Learning applications
    • DLSS still visually better than FSR
    • Ray Tracing still better on GeForce cards
    [–] twinnie@feddit.uk 5 points 8 months ago

    Because the features are better. That’s why most FPS comparisons of AMD and Nvidia always turn off the ray tracing.

    [–] Aurenkin@sh.itjust.works 4 points 8 months ago

    NVIDIA still have the best performing cards if you care about ray tracing. I honestly think that's the only reason to consider buying NVIDIA but you pay a heck of a premium for that.

    [–] rtxn@lemmy.world 2 points 8 months ago (1 children)

Blender Cycles on Linux does not work with an AMD GPU. Updating either the kernel or ROCm has a 50-50 chance of completely breaking Cycles. By comparison, I had zero issues with Cycles, either CUDA or OptiX, on my 2060. OptiX is also a better denoiser and runs on the GPU, while AMD only has OpenImageDenoise, which runs on the CPU (GPU support is questionable at this point).

    [–] lemmeee@sh.itjust.works 2 points 8 months ago

    Updating either the kernel or ROCm has a 50-50 chance of completely breaking Cycles

    That sucks, but that doesn't mean that it doesn't work. I'm on Debian stable and it works fine for me, except for weird crashes from time to time.
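
(Side note for anyone who hasn't poked at Cycles device selection: switching between the CUDA/OptiX and HIP backends discussed above looks roughly like the sketch below through Blender's Python API. This is an illustration only; property names can shift between Blender versions.)

```python
import bpy

# Pick the Cycles compute backend and enable every non-CPU device it finds.
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPTIX"      # or "CUDA"; AMD cards use "HIP"
prefs.get_devices()                      # refresh the detected device list

for device in prefs.devices:
    device.use = (device.type != "CPU")  # enable GPUs, leave the CPU disabled

bpy.context.scene.cycles.device = "GPU"  # render the current scene on the GPU
```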

    [–] Aurenkin@sh.itjust.works 24 points 8 months ago

    I haven't had any serious problems on PopOS but I've still experienced the shittiness that is NVIDIA on Linux. Starfield was broken for months due to a graphics driver bug, then when it was finally fixed, that driver version broke Cyberpunk... Fucking hell NVIDIA.

    [–] octoblade@lemmynsfw.com 9 points 8 months ago

    I switched from a GTX 1080 to an Arc A770 for this exact reason. I was sick of putting up with the bullshit NVIDIA drivers. I am much happier with the Intel card, with the exception of it not having VR support.

    [–] dinckelman@lemmy.world 8 points 8 months ago (2 children)

There is zero doubt that Nvidia has held back a lot of progress in the Linux desktop scene; however, half of what this guy is describing is pure misunderstanding and lack of knowledge.

    [–] Cossty@lemmy.world 3 points 8 months ago

I mean... If a whole day of troubleshooting and googling didn't help fix the problem on 2 distributions, then I don't think the problem is with them. I had my fair share of nvidia shenanigans over the years. When I couldn't fix it or didn't want to deal with it, I just switched distros until it worked.

    [–] Cringe2793@lemmy.world 8 points 8 months ago

    That cursor in the center of the screenshot is really annoying

    [–] apt_install_coffee@lemmy.ml 8 points 8 months ago* (last edited 8 months ago) (1 children)

    I recently bought a 7800 XT for the same reason, NVIDIA drivers giving me trouble in games and generally making it harder to maintain my system. Unfortunately I ran headfirst into the 6.6 reset bug that made general usage an absolute nightmare.

    Open source drivers are still miles ahead of NVIDIA's binary blob if only because I could shift to 6.7 when it released to fix it, but I guess GPU drivers are always going to be GPU drivers.

    [–] palordrolap@kbin.social 7 points 8 months ago (1 children)

    There was a period, however brief, about, oh, 13 or so years ago where the recommendation was to avoid AMD entirely and go Intel and NVIDIA. Guess when I bought the parts for my PC?

    My system before that was entirely AMD / ATI, but then, that was never a Linux machine. Nonetheless, the fashion when I built that was to avoid Intel and NVIDIA.

    Literally the only real problem I've had on Linux with my ancient setup is the fact that one time two or three years back, a kernel and the legacy NVIDIA driver didn't play nice and I had to stick with an older kernel for a while.

    Now my problem is that my NVIDIA card is so old that Debian stable doesn't support it any more and so neither do any distros descended from it. The OEM driver from NVIDIA themselves is a pain to install by comparison to the old .deb method, but compared to what I hear about other NVIDIA users, I'm a living miracle.

    It might also help that I haven't played anything more modern than Minecraft, but I have no trouble with YouTube and streaming sites that I've noticed, nor with any of the old games.

You can guarantee that by the time I get it together enough to buy a new system with an AMD processor and graphics, something will happen that causes everyone to swing back the other way again, at least for graphics.

    [–] Pantherina@feddit.de 7 points 8 months ago

    This dude should totally use a ublue-nvidia image before the next Arch update kills their system again.

    [–] hoanbridgetroll@midwest.social 4 points 8 months ago (1 children)

    I tried doing a TimeShift restore last week for another software issue, and nvidia drivers crapped the bed (again). Decided it was time to bite the bullet and swapped out my 1080 Ti with a 7700 XT. Did a clean install of Manjaro, and it was eerie how simple it was to get everything including Wayland to work. Should’ve done it two years ago.

    [–] Cossty@lemmy.world 2 points 8 months ago* (last edited 8 months ago)

I have an nvidia gpu, but more and more I am playing on my Steam Deck. Even though games have lower avg fps, they feel a lot smoother and the frametime graph is a clean line. On my nvidia gpu, the frametime graph is like a noise visualizer for a rock song. When the next AMD series comes out I will probably buy an 8800, or maybe I will even wait for the 9k series. I am in no rush.

    [–] Alfons@feddit.de 4 points 8 months ago* (last edited 8 months ago)

Needed to switch from Debian to Manjaro because of some gcc version conflicts regarding the Linux kernel and the nvidia driver kernel module. The only fix was to install a newer or older Linux kernel, which is a pain in the ass with Debian but easy with Manjaro :)

Also, switching between the newest „gaming" drivers and CUDA always broke my system and drove me crazy. So many hours lost because of nvidia.

    I also have to work with some nvidia edge devices. No fresh install without new issues, i can assure you.

Edit: FYI, although I am somewhat tech-savvy, I just recently switched completely to Linux. Hence, there might be a good way to handle CUDA drivers and „gaming" drivers.
