Why do people host LLMs at home, when processing the same amount of data from the internet to train their own LLM will never come anywhere close to the efficiency of sending a paid prompt to some high-quality official model?

inb4 privacy concerns or a proof of concept

This is not up for discussion; I want someone to prove their LLM can be as insightful and accurate as a paid one. I don’t care about anything other than the quality of the generated answers.

  • Brylant@discuss.online (OP)
    6 days ago

    I’m not saying in advance that it’s pointless; that’s just your far-fetched preconception. I want to know whether there are any further arguments for hosting LLMs besides the two I ruled out in advance for being too obvious. Furthermore, since you moved so quickly to mockery, I think there are no reasons other than privacy and tinkering.

      • themurphy@lemmy.ml
        6 days ago

        To be fair, he is just asking for more reasons than privacy. There’s no reason not to give them to him, and it’s also fine to say we don’t think there are any.

        • 3abas@lemm.ee
          5 days ago

          No, they said they “ruled out” privacy for “obvious reasons”.

          Obviously mockable statement.

    • papertowels@mander.xyz
      4 days ago

      The self-hosted community is probably the worst place you could ask about something while ruling out “privacy and tinkering”. Those words might as well be the motto here.