Still, the audacity of saying “we’re going to invest $1 trillion” is Dr. Evil-level humour.

OpenAI is signing about $1 trillion (€940 billion) in deals this year for computing power to keep its artificial intelligence dreams humming.

On Monday the company inked a deal with AMD, which follows earlier tie-ups with Nvidia, Oracle and CoreWeave, as Sam Altman’s outfit scrambles to secure enough silicon to keep ChatGPT online and the hype machine alive.

The latest commitments would give OpenAI access to more than 20 gigawatts of computing capacity over the next decade, roughly the output of 20 nuclear reactors. At about $50 billion per gigawatt, according to OpenAI’s estimates, the total tab hits that $1 trillion figure.
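As a sanity check on the arithmetic (the figures are the article’s; the script is purely illustrative):

```python
# Figures as reported: ~$50B per gigawatt of capacity, ~20 GW total.
cost_per_gw = 50e9       # dollars per gigawatt, per OpenAI's own estimate
capacity_gw = 20         # gigawatts across the announced deals
total = cost_per_gw * capacity_gw
print(f"${total:,.0f}")  # → $1,000,000,000,000 — the $1 trillion headline
```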

Analysts are not convinced this financial engineering makes any sense. DA Davidson analyst Gil Luria said: “OpenAI is in no position to make any of these commitments,” adding that it could lose about $10 billion this year.

  • chromodynamic@piefed.social · 6 days ago

    It’s strange that the concept of efficiency seems to have been abandoned. Is consumption of vast computing resources no longer seen as an indication of a design flaw?

    • tangentism@beehaw.org · 4 days ago

      Read elsewhere that OpenAI are trying to buy themselves far enough ahead to outlast the bubble popping.

      Seems that everything else is sacrificed to that end.

      Will be hilarious if they are one of the first to go!

        • ryannathans@aussie.zone · 6 days ago

          Do you reckon OpenAI could get cheaper power or GPUs? Or something else? Could Nvidia get lower production costs for these?

          • chromodynamic@piefed.social · 6 days ago

            I’m talking about the software side of things. Generative “AI” seems to be a “brute force” approach to artificial intelligence - just throwing hardware at the problem instead of finding a better approach. Given the limitations of GenAI, it just feels crazy to keep going this way. Like a sunk-cost fallacy. These are just my thoughts though, not a real scientific analysis.

            • ryannathans@aussie.zone · 5 days ago

              Recent advancements using “dynamic sparsity” or “selective activation” approaches increase efficiency beyond “brute force”. This is how China began to compete without anywhere near the number of GPUs or power.
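A toy sketch of what “selective activation” means in practice (the function names, shapes, and routing scheme here are illustrative, not any lab’s actual architecture): score all n experts cheaply, but run the expensive expert matmuls for only the top-k of them, so per-token compute scales with k rather than n.

```python
import numpy as np

def route_top_k(gate_scores, k=2):
    """Indices of the k highest-scoring experts; the rest stay idle."""
    return np.argsort(gate_scores)[-k:]

def sparse_forward(x, experts, gate, k=2):
    """Mixture-of-experts style layer: cheap scores for all experts,
    expensive matmuls for only the k selected ones."""
    scores = x @ gate                    # (n,) — one cheap score per expert
    active = route_top_k(scores, k)      # only these experts execute
    weights = np.exp(scores[active])
    weights /= weights.sum()             # softmax over the active subset
    # A dense layer would sum over all n experts; compute here is ~k/n of that.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, active))
```

With n=8 experts and k=2, the per-token expert compute drops roughly 4x, which is the general shape of the trick sparse models use to stretch a limited GPU budget.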

  • lichtmetzger@discuss.tchncs.de · 6 days ago

    Ed Zitron’s gonna have a field day with this. OpenAI’s motto seems to be scaling “to infinity and beyond”. But what can you expect from a techbro CEO who takes Dyson spheres seriously.

  • ryper@lemmy.ca · 6 days ago

    These numbers don’t make any sense to me, as the hed is about buying lots of chips, and the body is about power use. No matter how you slice it, $8.76/kWh is a terrible fucking investment … if that’s chip-inclusive, that’s another story.

    Data center scale is usually given in terms of power consumption, not computing power. The trillion dollars is meant to buy enough hardware to suck up 20GW of power, and probably none of the money will go towards power generation.

  • jarfil@beehaw.org · 4 days ago

    $8.76/kWh is a terrible fucking investment

    Not kWh; it says watts, not watt-hours.

    Still a silly way to refer to computing power, though.
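To put numbers on the unit mix-up: at the deal’s own figures, the consistent unit is dollars per watt of capacity, and a $/kWh figure only exists once you assume a runtime. The one-year horizon below is purely an illustrative assumption, not anything from the article:

```python
total_dollars = 1e12                 # the headline commitment
capacity_w = 20e9                    # 20 GW of capacity, in watts
print(total_dollars / capacity_w)    # → 50.0 — i.e. $50 per watt of capacity

# A per-kWh cost needs an assumed runtime. Over one year (8,760 hours),
# for example, that capacity would deliver:
kwh = capacity_w * 8760 / 1000       # watt-hours converted to kilowatt-hours
print(round(total_dollars / kwh, 2)) # → 5.71 dollars per kWh in year one
```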

  • Hirom@beehaw.org · 5 days ago

    Meanwhile, Nvidia has promised to pump $100 billion into OpenAI over the next decade, a move that will conveniently help OpenAI pay for Nvidia’s own chips.

    OpenAI’s and Nvidia’s futures are getting tied together even more than they already were.