I currently have an RX 6700 XT and I’m quite happy with it for gaming and regular desktop usage, but I was recently doing some local ML stuff and was just made aware of the huge gap NVIDIA has over AMD in that space.

But yeah, going back to NVIDIA (I used to run a 1080) after going AMD… feels kinda dirty to me ;-; I was very happy to move to AMD and finally be free from the walled garden.

I thought at first I’d just buy a second GPU, keep using my 6700 XT for gaming and use the NVIDIA card for ML, but unfortunately my motherboard doesn’t have two PCIe slots I could use for GPUs, so I need to choose. I’d be able to buy a used RTX 3090 for a fair price; I don’t want to go for the current gen because of the current pricing.

So my question is: how is NVIDIA nowadays? I specifically mean Wayland compatibility, since I just recently switched and it would suck to go back to Xorg. Other than that, are there any hurdles, issues, or annoyances, or is it smooth and seamless nowadays? Would you upgrade in my case?

EDIT: Forgot to mention, I’m currently using GNOME on Arch(btw), since that might be relevant

  • Inui [comrade/them]@lemmy.ml · 5 months ago

    Fan control is either impossible or a pain in the ass, since tools like GreenWithEnvy use something only available on X11. For me, that means my GPU fans spin up and down repeatedly at idle, because the minimum fan speed is something like 33% and I can’t pin it there without a program to do so, let alone set reasonable fan curves.
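
    A rough sketch of the X11-only approach the above refers to (assuming the stock nvidia-settings tool; the attributes come from NVIDIA’s NV-CONTROL X extension, which is exactly the X11-only interface in question):

    ```shell
    # Talks to the X server via NV-CONTROL, so this does NOT work on Wayland.
    # Requires the "Coolbits" option in xorg.conf with the fan-control bit set.
    nvidia-settings -a "[gpu:0]/GPUFanControlState=1"   # take manual control
    nvidia-settings -a "[fan:0]/GPUTargetFanSpeed=40"   # pin the fan at 40%
    ```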

  • null@slrpnk.net · 5 months ago

    On my 1660 Ti: tons of graphical glitches, and lots of extreme stuttering in Steam games.

  • StrawberryPigtails@lemmy.sdf.org · 5 months ago

    Mostly good, though I’ve got a bug on my desktop. It’s a two-monitor setup, and if I run a game like Minecraft full screen on the second display and then close out of the game, Plasma crashes to the login screen. It works fine if I disable the second display. That system is running Plasma 5 Wayland on NixOS 23.11.

    Otherwise, I occasionally run into an app that just doesn’t work, but that’s about all. Sometimes it’s a Plasma-on-Wayland thing (like with Element), sometimes not.

  • d3Xt3r@lemmy.nzM · 5 months ago

    What sort of ML tasks exactly, and is it personal or professional?

    If it’s for LLMs then you can just use Petals.

    If it’s for SD / image generation, there are three ways you can go about it. One is to rent a GPU from a cloud service like vast.ai, runpod.io, vagon.io etc, then run SD on the PC/GPU you’re renting. It’s relatively cheap: generate as much as you want for the duration you’ve rented. Last I checked, the prices were something like ~0.33 USD per hour, which is far cheaper than buying a top-end nVidia card for casual workloads.

    Another option is using a website/service where SD’s UI is presented to you and you generate images through a credit system: buy X amount of credits and you can generate X amount of images, etc. E.g. sites like RunDiffusion, dreamlike.art, seek.art, Lexica etc.

    Finally, there are plenty of free Google Colabs for SD. And there’s also Stable Horde, which uses distributed computing for SD, with an easy web UI called ArtBot.

    So yeah, there are plenty of options these days depending on what you want to do; you no longer need to actually own an nVidia card.

  • Gabriel Martini@lemmy.world · 5 months ago

    It depends. GNOME on Wayland + NVIDIA runs great. But if you go the tiling window manager route, you’ll run into several issues in Sway and Hyprland: things like having to use a software-rendered cursor because (insert NVIDIA excuse here), and high CPU usage just from moving the mouse.

    Well… I don’t know. I’d recommend GNOME on Wayland, or maybe KDE (I haven’t tried the latest Plasma 6 release), but outside of that, avoid it.

    • Brewchin@lemmy.world · 5 months ago

      “high cpu usage by just moving the mouse.”

      This sounds like co-operative multitasking on a single CPU. I remember this with Windows 3.1x around 30 years ago, where the faster you moved your mouse, the more impact it had on anything else you were running. Text scrolling too fast? Wiggle the mouse to slow it down (etc, etc).

      I thought we’d permanently moved on with pre-emptive multi-tasking, multi-threading and multiple cores… 🤦🏼‍♂️

  • foreverunsure@pawb.social · 5 months ago

    Native Wayland apps run great. Can’t say the same about those using XWayland, as most of them suffer from graphical glitches and flickering (especially Steam and Minecraft). Secure Boot works with some manual configuration.
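
    For reference, the “manual configuration” for Secure Boot is usually enrolling a Machine Owner Key so the DKMS-built module is trusted; a sketch (the key path is an assumption and varies by distro):

    ```shell
    # Queue the signing key for enrollment, then reboot and confirm in MokManager.
    sudo mokutil --import /var/lib/shim-signed/mok/MOK.der
    sudo mokutil --sb-state   # afterwards, verify Secure Boot is still enabled
    ```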

  • RedWeasel@lemmy.world · 5 months ago

    Plasma 6 (Arch) is pretty excellent. There is the bug mentioned in other comments with XWayland that won’t be (fully) fixed until the explicit sync Wayland protocol is finalized and implemented, but that should apply to any Wayland compositor.

    As for Wayland vs X11: if you want to game or do anything else that is X11-only, use X11; otherwise, most everything else can use Wayland.

  • arcidalex@lemmy.world · 5 months ago

    Better, but still shit. The main holdup right now, as I see it, is wayland-protocols and the WMs adding explicit sync support, since the proprietary driver does not support implicit sync. It’s part of a larger move of the graphics stack to explicit sync:

    https://gitlab.freedesktop.org/wayland/wayland-protocols/-/merge_requests/90

    Once this is in, the flickering issues will be solved and NVIDIA on Wayland should be daily-drivable in most situations.

    • GlowHuddy@lemmy.world (OP) · 5 months ago

      Yeah, I was just reading about it and it kind of sucks, since one of the main reasons I wanted to go Wayland was multi-monitor VRR, and I can see that’s also an issue without explicit sync :/

      • arcidalex@lemmy.world · 5 months ago

        Yeah. I have a multi-monitor VRR setup as well, and I happen to have a 3090, and not being able to take advantage of Wayland really sucks. And it’s not like Xorg is any good in that department either, so you’re just stuck between a rock and a hard place until explicit sync is in.

        Let’s see what happens first: me getting a 7900 XTX, or this protocol being merged.

        • Russ@bitforged.space · 5 months ago

          I haven’t kept up with the explicit sync support, since I eventually did migrate over to AMD in October, after the 545 Nvidia driver came out and didn’t impress me at all. However, I did hear in passing that you can already get the explicit sync patch in some ways: a quick search reveals that Arch has it in the AUR as xorg-xwayland-explicit-sync-git, and that Nobara might already ship it (I can’t find official confirmation on this).
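
          On Arch, that would look something like this (a sketch, assuming the yay AUR helper; any helper works):

          ```shell
          # Build and install the patched XWayland from the AUR
          yay -S xorg-xwayland-explicit-sync-git
          ```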

          I noticed there was also some debate as to whether you’d need a patched version of the compositor as well, but someone claims that just the XWayland patch worked for them (links to Reddit, as a heads up).

          So your mileage may vary and it might require a varying level of work depending on what distro you run, however it might be worth looking into a bit more.

          • arcidalex@lemmy.world · 5 months ago

            The patches exist for Wayland/XWayland, but the compositor itself has to be patched as well for it to work completely. KDE does not have its explicit sync fix merged yet, so it’s not patched as of Plasma 6.0.1. The XWayland patch did make things slightly better, but all it did was make the flickering less frequent; it’s still there.

        • GlowHuddy@lemmy.world (OP) · 5 months ago

          Now I’m actually considering that one as well. Or I’ll wait a generation, I guess; maybe by then Radeon will at least be comparable to NVIDIA in terms of compute/ML.

          Damn you NVIDIA

          • filister@lemmy.world · 5 months ago

            Yes, I’m pretty much in the same boat: running Linux as a daily driver, currently on an ancient AMD GPU, and thinking of buying NVIDIA exactly for ML. But I really don’t want to give them my money, as I dislike the company and their management, yet AMD is subpar in that department, so there’s not much of a choice.

  • Kanedias@lemmy.ml · 5 months ago

    Same here, but it turned out a lot of frameworks like TensorFlow and PyTorch do support AMD’s ROCm. I managed to run most models just by installing the ROCm version of these dependencies instead of the default ones.
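
    As a sketch of that swap (the index URL and ROCm version here are assumptions; check the PyTorch install matrix for the current ones):

    ```shell
    # Install the ROCm build of PyTorch instead of the default CUDA one
    pip install torch --index-url https://download.pytorch.org/whl/rocm6.0
    # ROCm builds reuse torch's CUDA API surface, so existing code runs unchanged
    python -c "import torch; print(torch.cuda.is_available(), torch.version.hip)"
    ```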

    • GlowHuddy@lemmy.world (OP) · 5 months ago

      Yeah, I’m currently using that one, and I would happily stick with it, but it seems AMD hardware just isn’t up to par with NVIDIA when it comes to ML.

      Just take a look at the benchmarks for stable diffusion:

      • AnUnusualRelic@lemmy.world · 5 months ago

        Aren’t those things written specifically for nVidia hardware? (I used to be a developer, but this is not at all my area of expertise.)

  • Skull giver@popplesburger.hilciferous.nl · 5 months ago

    I run a 1080 on Ubuntu and it works fine. You need to disable Secure Boot, install the proprietary drivers, and maybe change a setting or two, but it works. I have no idea how well it’d work on one of the “you go figure it out yourself” distros like Arch or its derivatives.

    On Wayland, my setup had all kinds of hardware acceleration issues, mostly in Firefox. Games seem to work fine, but I couldn’t be bothered to fix the Firefox issue. One of my HDMI cables is going bad and I’m too lazy to buy new ones, which causes weird issues sometimes: those issues cause weird freezes on X11 but crash the Wayland session entirely. So until I stop being lazy, I’ll stick with X11 for a while.

    My laptop has a Pascal GPU and it mostly works. There are some weird display issues, but they’re also present when the Nvidia GPU is disabled entirely. I can’t get more than 30 fps over HDMI when I disable the laptop display, though, so something about that setup is breaking my ability to hook it up to a TV and close the lid. Maybe it’s the mux chip? Maybe it’s the GPU? I can’t tell. It’s running Manjaro, so driver installation wasn’t as bad as I would’ve expected. I did need to edit a text file in /etc to get the Nvidia driver to load properly, though, or Wayland wouldn’t detect any external screens.
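
    That /etc tweak is most likely enabling kernel modesetting, which Wayland needs to drive external displays (this is an assumption about which file is meant; the mkinitcpio step is Arch/Manjaro-specific):

    ```shell
    # Enable DRM KMS for the proprietary driver, rebuild the initramfs, then reboot
    echo "options nvidia-drm modeset=1" | sudo tee /etc/modprobe.d/nvidia-drm.conf
    sudo mkinitcpio -P
    ```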

    From what I’ve read, the modern cards are in kind of a weird situation. On the one hand, you now have open source drivers; on the other hand, there still seem to be many issues. Nvidia has tons of tooling for machine learning and such, but AMD cards have much more VRAM for the price. If I were buying a GPU now, I’d probably go AMD and fight ROCm to make the ML stuff work.

  • astrsk@piefed.social · 5 months ago

    Discord and Steam flicker / render weirdly, and I get massive input lag for seemingly no reason just trying to use almost any app. I stick with X11 and have little to no issue now that the Firefox offset-cursor regression has been fixed. I’m running a 3090 on EndeavourOS.

  • FluffyPotato@lemm.ee · 5 months ago

    Depends on the moon phase, your horoscope, and your palm reading.

    I have tried it on several setups and I get different results every time.

  • warmaster@lemmy.world · 5 months ago

    I have an Nvidia 3080 Ti and an AMD RX 8900 XTX.

    The AMD runs great on any distro, I love it. The Nvidia is so much of a huge pain that I installed Windows on that PC.

    • mmus@lemmy.ml · 5 months ago

      Are you from the future? Please tell us more about the RDNA4 architecture if so :)