• Snot Flickerman@lemmy.blahaj.zone

      Seriously. Even if an AI gets “nuclear codes,” how the fuck is it going to jump into an air-gapped system with about a 60th of the CPU cycles and RAM it needs to operate? This doomerism is fucking unhinged. It literally requires pretending that AI can somehow jump into any old computer, and that we totally don’t have massive server farms just to make these things function at all, with high-speed CPUs and massive amounts of RAM and storage supporting them. They don’t have a fucking body, for one thing, and for two, no current iterations do anything at all like thinking; they all have to be prompted. They don’t start conversations on their own.

      It’s so fucking stupid, it’s the fantasies of childish fucking nerds.

      • Jason2357@lemmy.ca

        I’ve come to equate AI doomerism with cloaked AI boosterism. It’s just another way to convince investors that AI is as powerful as claimed. It’s not.

      • kiagam@lemmy.world

        I am not worried about the launch codes. I’m worried some incompetent middle manager will replace key staff at a water treatment plant or something (because the CEO told them to put AI somewhere or be fired). Then the fucking AI will have a meltdown after it reads the same data 10 times in a row and will poison the water supply of millions. Nobody will catch it, because quality control analysis was also replaced by the same AI, and it decided to delete the alert system for some reason (it had to do something; that is how it works).

        I’m not worried about AI being good. I’m worried about the people who think it is good (which already proves they are stupid af) giving it a shot at anything critical.