You can take “justifiable” to mean whatever you feel it means in this context — e.g., morally, artistically, environmentally, etc.

  • fork@feddit.online · 2 days ago

    It’s never justifiable because it can and will output incorrect information. It’s made my job worse because confidently incorrect people bug me when it’s wrong and I have to explain why it’s wrong.

    • MerryJaneDoe@lemmy.world · 2 days ago

      Human beings have been outputting incorrect information for years. Get a high school textbook in literally any subject (except possibly math) from the 1970s. You’ll be amazed at how much of it is oversimplified or politicized or just plain wrong.

      I do agree that AI has compounded the problem. There’s a limit to how much inaccuracy/incompetence a given system can tolerate. An organization that relies on AI for critical processes had better have a way to monitor and intervene.

      • fork@feddit.online · 2 days ago

        I mean, in my specific case, it’s a matter of the person asking an LLM to read a PDF versus them using their stupid fucking eyeballs. Just lazy shits.

    • thinkercharmercoderfarmer@slrpnk.net · 2 days ago

      That’s not really new, or unique to AI. The whole “field” of eugenics was created to give racism the mantle of scientific legitimacy. People will pick through a haystack of data to find a needle that supports (however tenuously) whatever they want to be true. LLMs are just a more convenient way to find — or invent — those needles.

      • Corngood@lemmy.ml · 2 days ago

        The difference now is that the machine can churn out far more data (e.g., pull requests) than a human can ever deal with.