0nekoneko7@lemmy.world to Technology@lemmy.world · English · 1 year ago

Jensen Huang says even free AI chips from his competitors can't beat Nvidia's GPUs
(www.tomshardware.com)

69 points · 24 comments

Jensen says free 'isn't cheap enough' to stand a chance against the green team in data center AI.

  • Nighed@sffa.community · +11 / -2 · 1 year ago

    And AMD or Intel are better? Everyone complains about the drivers.

    • snaggen@programming.dev · +41 / -2 · 1 year ago

      For Linux it is a huge difference. AMD and Intel have great open-source drivers, while Nvidia has binary drivers with a lot of issues.
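
      One way to check which of these drivers a machine is actually using is to resolve the driver symlink the Linux kernel exposes under sysfs. A minimal sketch (card numbering and exact driver names vary per machine):

      ```python
      # Sketch: report which kernel driver (e.g. amdgpu, i915, nouveau, nvidia)
      # is bound to each GPU by resolving the standard sysfs symlink.
      import glob
      import os

      for card in sorted(glob.glob("/sys/class/drm/card[0-9]")):
          driver_link = os.path.join(card, "device", "driver")
          if os.path.islink(driver_link):
              driver = os.path.basename(os.readlink(driver_link))
              print(f"{os.path.basename(card)}: {driver}")
      ```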

      • pearsaltchocolatebar@discuss.online · +4 · 1 year ago

        I find this strange, because I had nothing but trouble getting my R9 390 working with any Linux distro, but my RTX 3060 hasn’t given me a single issue on like 6 different distros.

        • GarlicToast@programming.dev · +2 · edited 3 months ago

          deleted by creator

          • pearsaltchocolatebar@discuss.online · +1 / -1 · 1 year ago

            Desktop, both Intel and AMD builds. And no, I just had multiple SSDs that I played with distros on.

            • GarlicToast@programming.dev · +3 · edited 3 months ago

              deleted by creator

              • pearsaltchocolatebar@discuss.online · +1 · 1 year ago

                I guess I’ve been lucky with my laptop, although it is fairly old at this point.

                • GarlicToast@programming.dev · +1 · edited 3 months ago

                  deleted by creator

      • PersonalDevKit@aussie.zone · +4 / -1 · 1 year ago

        And for AI at home? Since this is a story about AI data centers.

        I want to get an AMD card, but Nvidia GPUs have much better integration for ML/AI work, so if I want to mess with running AI at home I only have one choice.

        I hope AMD releases something that competes on that front and can still play games on the weekend, but currently he is right: there is no competition.

        • Fisch@lemmy.ml · +2 · 1 year ago

          I run AI stuff just fine on my AMD GPU using HIP. At least LLMs and Stable Diffusion work perfectly fine, but those are the only things I've tested.
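
          A minimal sketch of what that looks like in practice, assuming a ROCm build of PyTorch: the HIP backend is exposed through the same `torch.cuda` API, so the identical code path covers both an AMD and an Nvidia card.

          ```python
          # Sketch: run a toy forward pass on whatever GPU is available.
          # On a ROCm build of PyTorch the "cuda" device actually targets HIP/AMD.
          import torch

          device = "cuda" if torch.cuda.is_available() else "cpu"
          backend = "HIP/ROCm" if getattr(torch.version, "hip", None) else "CUDA"
          print(f"device={device}, backend={backend}")

          # Toy stand-in for an LLM or Stable Diffusion forward pass.
          model = torch.nn.Sequential(torch.nn.Linear(4096, 4096), torch.nn.GELU()).to(device)
          x = torch.randn(8, 4096, device=device)
          with torch.no_grad():
              y = model(x)
          print(y.shape)
          ```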

          • PersonalDevKit@aussie.zone · +2 / -1 · 1 year ago

            All the benchmarks put the equivalent Nvidia cards at almost 2x the Stable Diffusion performance: https://www.tomshardware.com/pc-components/gpus/stable-diffusion-benchmarks

            I hope that is an old benchmark and times have changed, but I can't find anything like it that is more recent.

            For a bit of my soul I get a lot more AI power.

            AI works on AMD, but the speed doesn't seem to be anywhere near Nvidia's.
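
            For anyone wanting to reproduce that kind of number at home, a rough sketch of the measurement, assuming the Hugging Face diffusers library and a Stable Diffusion 1.5 checkpoint (the model id below is only an example). The same script runs on a CUDA or ROCm build of PyTorch, so it gives a like-for-like images-per-minute figure on either vendor's card:

            ```python
            # Rough throughput sketch (assumptions: diffusers is installed, a GPU
            # build of PyTorch is present, and the example checkpoint is available).
            import time
            import torch
            from diffusers import StableDiffusionPipeline

            pipe = StableDiffusionPipeline.from_pretrained(
                "runwayml/stable-diffusion-v1-5",  # example model id
                torch_dtype=torch.float16,
            ).to("cuda")  # "cuda" also targets AMD cards on ROCm builds

            prompt = "a photo of an astronaut riding a horse"
            pipe(prompt, num_inference_steps=10)  # warm-up, not timed

            runs = 3
            start = time.perf_counter()
            for _ in range(runs):
                pipe(prompt, num_inference_steps=50, height=512, width=512)
            elapsed = time.perf_counter() - start
            print(f"{runs / (elapsed / 60):.2f} images per minute")
            ```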

Technology@lemmy.world

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; this includes using AI responses and summaries. To ask if your bot can be added, please contact a mod.
  9. Check for duplicates before posting; duplicates may be removed.
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


  • @L4s@lemmy.world
  • @autotldr@lemmings.world
  • @PipedLinkBot@feddit.rocks
  • @wikibot@lemmy.world