Do you have any ideas or thoughts about this?

  • shalafi@lemmy.world · 1 day ago

    Not if you use it correctly. You don’t write code with AI, you get inspiration to get over sticking points. You pick out the relevant bits, make certain you understand how they work, save hours of banging your head.

    • corsicanguppy@lemmy.ca · 12 hours ago

      Not if you use it correctly.

      Ah! “Git gud” elitism to paper over the risk.

      The issue still stands: the few seniors you still have at the shop who can tell people WHY something is a bad idea are now distracted by juniors submitting absolute shit code for review and needing to be taught why that structure is a bad idea.

      “Well, everyone else is doing it” was a bad rebuttal when you wanted to go to Chuck’s party and Mom said no. Laundering “this is what everyone else writes” through an AI concentrator, when two generations of coders are self-taught and unmentored after the great post-Y2K purge of mentors and writers, isn’t a better situation.

      • FreedomAdvocate@lemmy.net.au · 7 hours ago

        Well, yeah. Saying you should be competent at your job isn’t “elitism”. It’s like someone who writes code by copy-pasting from Stack Exchange without knowing how or why it works, and it might not be well written. That’s no different from using AI without knowing how or why its output works, where it might not be well written either.

        AI is a tool. The AI haters like yourself need to understand this, instead of thinking it’s a person.

    • Siru@discuss.tchncs.de · 13 hours ago

      AI for the win in figuring out how to use code libraries with minimal to non-existent documentation scattered across the entire web.

    • expr@programming.dev · 1 day ago

      Ah yes, “just use it correctly”. All these programmers convinced that they are one of the chosen few that “get it” and can somehow magically make it not a damaging, colossal waste of time.

      “Inspiration”, yeah, in the same way we can draw “inspiration” from a monkey throwing shit at a wall.

      • shalafi@lemmy.world · edited · 1 day ago

        Not in IT, huh? Because you missed my entire point. This isn’t like making a lame email that screams fake.

        I got stuck on a Google Calendar/Sheets integration. Almost no documentation or examples out there. After banging my head for hours it occurred to me to try this new AI thing.

        ChatGPT spit out some code that didn’t work, of course, but I saw a new path I hadn’t considered, one I never knew existed! Picked out the bits I needed and got the script stood up within an hour, after wasting hours trying to do it from scratch.

        People like you were criticizing the use of fire back in the day. “Oog burned hut with new fire thing!” “Oog antelope shit head, no use fire good.” “Fire bad! FIRE BAD!”

        • expr@programming.dev · 1 day ago

          Cute. I’m a senior software engineer who has trained many different models (NLP, image classification, computer vision, LIDAR analysis) before this stupid fucking LLM craze. I know precisely how they work (or rather, I know how much people don’t know how they work, because of the black-box approach to training). From the outset, I knew people believed it was much more capable than it actually is, because that was incredibly obvious to someone who’s actually built the damn things before (albeit with much less data/power).

          Every developer I see who loves LLMs is pretty fucking clueless about them and thinks of them as some magical device that has actual intelligence (just like everybody does, I guess, but I expect better of developers). It has no semantic understanding whatsoever. It’s stochastic generation of sequences of tokens to loosely resemble natural language. It’s old technology recently revitalized because large corporations plundered humanity in order to brute-force their way into models with astronomically high numbers of parameters, so they are now “pretty good” at resembling natural language, compared to before. But that’s all it fucking is. Imitation. No understanding, no knowledge, no insight. So calling it “inspiration” is a fucking joke, and treating it as anything other than a destructive amusement (due to the mass ecological and sociological catastrophe it is) is sheer stupidity.

          I’m pissed off about it for many reasons, but especially because my peers at work are consistently wasting my fucking time with LLM slop and it’s fucking exhausting to deal with. I have to guard against way more garbage now to make sure our codebase doesn’t turn into utter shit. The other day, an engineer submitted an MR for me to review that contained dozens of completely useless/redundant LLM-generated tests that would have increased our CI time a shitload and bloated our codebase for no fucking reason. And all of it is for trivial, dumb shit that’s not hard to figure out or do at all. I’m so fucking sick of all of it. No one cares about their craft anymore. No one cares about being a good fucking engineer and reading the goddamn documentation and just figuring shit out on their own, with their own fucking brain.

          By the way, no actual evidence exists of this supposed productivity boost people claim, whereas we have a number of studies demonstrating the problems with LLMs, like MIT’s study on its effects on human cognition, or this study from the ACM showing how LLMs are a force multiplier for misinformation and deception. In fact, not only do we not have any real evidence that it boosts productivity, we have evidence of the opposite: this recent METR study found that AI usage increased completion time by 19% for experienced engineers working on large, mature, open-source codebases.

        • Phoenixz@lemmy.ca · 24 hours ago

          I am in IT. I’m a CTO, yet also still doing development.

          Anyone who delivered a pure AI project I would reject immediately and have them first look at what the hell it is.

          That is the biggest issue with AI: people only use it for ready-to-go solutions. Nobody checks what comes out of it.

          I use AI in my IDE exactly like you mentioned; it gives me a wrong answer (because of course), and even though the answer is wrong, it might give me a new idea. That’s fine.

          The problem is the ready-to-go idiots who will just blindly trust AI, i.e., 90% of the humans in this world.

          • FreedomAdvocate@lemmy.net.au · edited · 7 hours ago

            The other problem is idiots (who tend to pretend they’re experts and the only ones who understand what AI is) who don’t understand that AI is a tool to be used, and don’t realise that people who are actually good at their job can figure out how to use it properly to benefit them. They think that because an idiot uses it incorrectly, that’s the only way to use it.