• darcmage@lemmy.dbzer0.com · 7 points · 3 hours ago

    "Intel will continue to have GPU product offerings,”

    The headline is misleading: “GPU product offerings” could refer to Intel’s integrated GPUs rather than to discrete cards.

  • Pycorax@sh.itjust.works · 32 points · 10 hours ago

    Considering their trend of cutting everything other than the bonuses for their executives, I find that hard to believe.

  • CriticalMiss@lemmy.world · 48 points · 12 hours ago

    I think they’ll scrap it after two generations, just to keep the antitrust investigations at bay.

  • addie@feddit.uk · 6 points · 9 hours ago

    Reads like Intel will be using Nvidia’s stuff for integrated systems, and doesn’t say anything at all about discrete graphics cards.

    If you’re integrating a GPU, then it’s going to be either for a laptop, in which case performance-per-watt and total die size are very important, or it’s for a generic business PC, in which case ‘as cheap as they can get away with’ takes over. A B580 might be the best mid-range graphics card, but those aren’t the areas where it shines. Using someone else’s tech makes sense.

  • Wispy2891@lemmy.world · 53 points · 14 hours ago

    Yes, sure, it’s the same Intel that is prioritizing immediate returns over long-term gains, right?

    Nvidia definitely didn’t make this investment to kill a new competitor. They just happened to find a few billion dollars too many in their bank account, didn’t know what to do with them, and thought, “you know what? Let’s invest in a competitor to spice up the market.”

    • fmstrat@lemmy.nowsci.com · 2 points · 7 hours ago

      Actual headline: “Intel says what it was told to say to avoid antitrust scrutiny, after the US government pushes its AI agenda and takes an ownership stake in Intel.”

    • just_another_person@lemmy.world · 15 points · 13 hours ago

      It’s to try and stave off AMD, which has cornered the other half of the data center. Nvidia had its come-up on GPUs in DCs, but lost a decade trying to compete with AMD there. Now AMD has the most sought-after high-density chips and FPGA platform, with no competition. Nvidia thinks this will slow AMD’s sales there, I’m sure.

  • ominous ocelot@leminal.space · 37 points · edited 13 hours ago

    What? Ah, here it is:

    Nvidia CEO Jensen Huang confirmed today that Nvidia will contribute “GPU chiplets” that Intel can place alongside its x86 CPU cores instead of the Arc integrated graphics it develops in-house today.

    They could buy AMD Radeon chiplets too; that would be something. :)

    • Thorry@feddit.org · 22 points · edited 12 hours ago

      Intel actually bought AMD Radeon GPUs for their Hades Canyon (Kaby Lake-G) platform. It was a NUC mainboard with a full Intel platform combined with an AMD Radeon GPU. The Intel CPU and the GPU (including the GPU’s HBM2 memory) were all on one package soldered to the mainboard.

      I think they did a couple of follow-ups on that as well, because it worked very well.

        • sorghum@sh.itjust.works · 3 points · edited 8 hours ago

          There was a time when you could have a platform running components from all three: an Intel CPU, an AMD GPU, and Nvidia’s nForce chipset on the mobo.

        • AnyOldName3@lemmy.world · 2 points · 8 hours ago

          AMD’s GPUs were much faster than Intel’s, and making GPUs for this kind of application was something AMD already did. Nvidia didn’t, so it would have had to design a whole chip from scratch, and it didn’t really have a power-efficiency advantage. In recent generations where AMD’s desktop cards have run hot, it’s because they’ve been clocked high to keep up with Nvidia’s cards, but the same architecture runs cool when clocked lower for mobile applications. Vega, for example, was notoriously inefficient on the desktop because it was delayed two years and ended up competing with a different generation than it was designed against, but it was great in laptop APUs. Intel would also have gained experience with chiplets and with packaging a fast GPU alongside a CPU. It let everyone involved make more money than doing it any other way.
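
          To put rough numbers on the clocking point: dynamic CMOS power scales roughly as C·V²·f, and voltage generally has to rise with clock speed, so power climbs much faster than performance. Here’s a back-of-the-envelope sketch; the voltage and frequency figures are invented for illustration, not measured from any real part.

          ```python
          # Dynamic CMOS power is roughly P = C * V^2 * f: it scales linearly
          # with clock but quadratically with the voltage needed to sustain
          # that clock. All numbers below are purely illustrative.

          def dynamic_power(capacitance: float, voltage: float, frequency: float) -> float:
              """Approximate dynamic power of a CMOS chip: P ~ C * V^2 * f."""
              return capacitance * voltage**2 * frequency

          C = 1.0  # arbitrary effective switched capacitance (relative units)

          # Same architecture, two hypothetical operating points:
          # pushed hard for desktop, backed off for mobile.
          desktop = dynamic_power(C, voltage=1.2, frequency=2.4)
          laptop = dynamic_power(C, voltage=0.9, frequency=1.8)

          print(f"performance ratio: {1.8 / 2.4:.2f}x")         # ~0.75x throughput
          print(f"power ratio:       {laptop / desktop:.2f}x")  # ~0.42x power
          # -> roughly 75% of the performance for 42% of the power,
          #    i.e. perf/watt nearly doubles just by clocking lower.
          ```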

    • CallMeAnAI@lemmy.world · 3 points · edited 11 hours ago

      And it fixes nothing. Who is going to buy an already less powerful chip that’s now half powered by Intel? Not that this was going to fix anything anyway.

  • floofloof@lemmy.ca · 19 points · 13 hours ago

    “We’re not discussing specific roadmaps at this time, but the collaboration is complementary to Intel’s roadmap and Intel will continue to have GPU product offerings,” Intel told PCWorld, reiterating the commitment that Intel’s Michelle Johnston Holthaus made before she abruptly left the company.

    I don’t see any commitment in that statement. Indeed, it seems carefully worded to avoid making one.

    • Wispy2891@lemmy.world · 22 points · 13 hours ago

      If they fire the whole Arc GPU department, stop all development, and exclusively sell graphics made with Nvidia chiplets, the statement “Intel will continue to have GPU product offerings” would still be true.

  • Squizzy@lemmy.world · 4 points · 12 hours ago

    Could anyone recommend some reading on this industry and these companies? I am out of the loop on how the different chips and makers interact and where their niches are. I only recently realised these companies white-label their GPUs.