• Jankatarch@lemmy.world · +19 · 3 days ago (edited)

    When I walk around my uni, people openly talk about using ChatGPT to pass their classes. When I ask for help in a lecture group chat, the first four answers are “I just used ChatGPT.”

    They gave me a whole speech at the beginning about how seriously they take academic dishonesty, but honestly I’m just disappointed now. Even merely using solution manuals gets you considered a “good student.”

    • Ledericas@lemm.ee · +3 · 3 days ago

      It has gotten that bad? No wonder the review sites for my state uni are full of complaints about the school’s lack of direction.

  • fubarx@lemmy.world · +52/−2 · 4 days ago

    Once these AI companies go belly-up, those people with critical thinking and research skills will be able to name their price.

    Those abilities have been in high demand for millennia. Focus on the basics.

    • chaosCruiser@futurology.today · +24/−1 · 4 days ago (edited)

      Probably not going to go belly-up for a while, but the enshittification cycle still applies. At the moment, investors are pouring billions into the AI business, and as a result, companies can offer services for free while only gently nudging users towards the paid tiers.

      When interest rates rise during the next recession, investors won’t have access to cheap money anymore. Then the previously constant stream of money dries up, AI companies start cutting back the free tier, and people start complaining about enshittification. During that period, the paid tiers also get restructured to squeeze more money out of the paying customers. That hasn’t happened yet, but eventually it will. Just keep an eye on those interest rates.

      • taladar@sh.itjust.works · +8/−2 · 4 days ago

        Probably not going to go belly-up for a while

        Don’t be so sure about that; the numbers look incredibly bad for them in terms of money burned per dollar of actual revenue, never mind profit. They can’t even cover inference alone (never mind training, staff, rent, …) from the subscriptions.

        • chaosCruiser@futurology.today · +6 · 4 days ago (edited)

          As long as they can convince investors of potential future revenue, they will be just fine. In the growth stage, companies don’t have to be profitable, because the investors cover the expenses. Being profitable becomes a high priority only when you run out of series F money and the next investors can’t borrow another 700 million. It’s a combination of low interest rates and convincing arguments.

          BTW I don’t think this is a good way to run a company, but many founders and investors clearly disagree with me.

          • taladar@sh.itjust.works · +4 · 4 days ago

            The difference between AI companies and most other tech companies is that AI companies have significant expenses that scale with the number of customers.
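
            A quick back-of-the-envelope sketch of why that matters (every number below is invented for illustration, not a real figure): if each user, free or paid, burns real compute, then losses grow with adoption instead of shrinking.

            ```python
            # Toy unit-economics sketch. All numbers are hypothetical.
            def monthly_balance(users, paying_fraction, sub_price, cost_per_user):
                revenue = users * paying_fraction * sub_price
                cost = users * cost_per_user  # free users burn compute too
                return revenue - cost

            for users in (1_000_000, 10_000_000, 100_000_000):
                balance = monthly_balance(
                    users,
                    paying_fraction=0.05,  # assume 5% of users convert to paid
                    sub_price=20.0,        # assume a $20/month tier
                    cost_per_user=1.50,    # assume average monthly inference cost
                )
                print(f"{users:>11,} users -> ${balance:>14,.0f}/month")
            ```

            With those made-up inputs, every tenfold increase in users multiplies the monthly loss tenfold as well, which is the opposite of classic zero-marginal-cost software.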

            • chaosCruiser@futurology.today · +3 · 4 days ago

              That’s a very good point. Video hosting services suffer from a similar problem, and that’s one of the main reasons it’s so hard to compete with YouTube. Given how many LLM services are out there at the moment, there must be a completely ridiculous amount of investor money floating around. Doesn’t sound like a sustainable situation to me.

              Apparently, the companies are hoping that everyone gets so hooked on LLMs that they have no choice but to pay up when the inevitable tsunami of enshittification hits us.

                • chaosCruiser@futurology.today · +1 · 3 days ago

                  Wow, those are some pretty big numbers! About 10x bigger than what I was thinking. I knew these things could get pretty weird, but this is just absolutely wild. When expectations fly that high, the crash can be all the more spectacular.

                  When you notice that your free account can’t do much, that’s a sign that OpenAI is beginning to run out of money. When that happens, the competitors will be ready to welcome all the users who didn’t feel like paying OpenAI.

  • LupusBlackfur@lemmy.world · +45/−2 · 4 days ago

    Thank fuck I graduated college decades ago…

    Actual education/teaching is under assault in the US from all sides these days… Not certain today’s students have any chance. 🙄 🤦‍♀️ 🖕 💩

    • Ledericas@lemm.ee · +4 · 3 days ago (edited)

      I graduated just before the fuckery, and they were already at the forefront of using software to screen applicants out of jobs. When I was in HS, students had already given up on doing homework, and people with failing/D grades were being passed through to graduation. Everyone who graduated during the pandemic, or is taking classes now, says my old college is pretty bad these days, because most classes moved online. There are also other long-standing issues that were never solved back when I was still in college. Them using ChatGPT is probably a step up from just copy-pasting content from various sites that had the exact same question or essay.

        • 14th_cylon@lemm.ee · +11 · 4 days ago

          No, it means they will have to somehow apply the knowledge they acquired to a real-life problem: creating a project instead of writing a text.

      • Sidyctism II.@discuss.tchncs.de · +1 · 4 days ago

        Well, how would you verify whether a thesis was written by AI? Mind that accusations are a serious matter, so “I guess it sorta looks like AI,” or a percentage spat out by some unreliable LLM-detection AI, isn’t going to cut it.

        • 14th_cylon@lemm.ee · +4 · 3 days ago

          Well, not sure if it works the same everywhere in the world, but here you first write the graduation thesis and then you have to publicly defend it.

          If the defense committee (or is it an attack committee, since it is the student who is on the defense? :D) can’t ask questions in a way that reveals whether the student actually wrote the paper and understands the topic, then what fucking pseudo-scientific field is that? (And the answer indeed is: economics 😂)

          • danzania@infosec.pub · +2 · 3 days ago

            Then the student could just ask the AI to simulate a thesis defense and learn answers to the most likely questions.

            The funny thing is, they would actually learn the material this way, through a kind of osmosis. I remember writing cheat sheets in college and finding I didn’t need them by the end.

            So there are potential use cases, but not if the university doesn’t acknowledge it and continues asking for work that can simply be automated.

            • 14th_cylon@lemm.ee · +1 · 1 day ago

              Then the student could just ask the AI to simulate a thesis defense and learn answers to the most likely questions.

              While my opinion of economics as a field is not very high, I still have a high enough opinion of any teacher to believe they outperform a shitty AI…

        • Ledericas@lemm.ee · +1 · 3 days ago

          From what I’ve seen in some Reddit posts, they use AI to accuse students of writing with AI.

        • danzania@infosec.pub · +1 · 3 days ago

          Students are now prompting the AI to make it sound like a student wrote it, or putting it through an AI detector and changing the parts that are detected as being written by AI (adding typos or weird grammar, say). Even kids who write their own papers have to do the latter sometimes.

        • mineralfellow@lemmy.world · +1 · 3 days ago

          Perfect grammar and slightly unusual words in a paragraph. Could be a weird formulation from a student’s mind, could be AI. No way to really know.

          • Lodespawn@aussie.zone · +1 · 3 days ago

            If the peer reviewers are unable to differentiate between student output and AI output, then they are either incompetent or inundated with absolute garbage. The latter also suggests the former is true.

            • mineralfellow@lemmy.world · +2 · 3 days ago

              I just finished marking student reports. There are some sections clearly written without AI, some that clearly are written by AI, and then some sections where the ideas are correct, the grammar is perfect, and it is on topic, but it doesn’t seem like it is written in the student’s voice. Could be AI, could be a friend editing, could be plagiarism, could be written long before or after the surrounding paragraphs. It is not always obvious, and the edge cases are the problem.

          • Lodespawn@aussie.zone · +1 · 3 days ago (edited)

            Yeah, given the quality of AI outputs, they could just read the papers to spot it… you know… do their jobs? I mean, there are a few layers of thesis review here: the supervisor, the professor, the other peer reviewers. They are all supposed to review the paper and at least some of the data that led to its production.

    • chunes@lemmy.world · +1 · 3 days ago

      There are people who want to do their own thinking and those who don’t. The ratio hasn’t changed much over time. Only the possibilities.

  • SocialMediaRefugee@lemmy.world · +6 · 3 days ago

    “Hi, what version of ChatGPT did you use in your surgical training? Great!”

    “You say the engineering team that designed this suspended walkway just used ChatGPT during their training? Sounds good!”

  • Mwa@thelemmy.club · +2/−1 · 4 days ago (edited)

    I kinda don’t like ChatGPT because it trains on everything you feed into it (the same goes for any hosted LLM), and hopefully that won’t be forced on us. Right??? (Probably.)

  • scarilog@lemmy.world · +1/−3 · 3 days ago (edited)

    I’m not supporting higher education becoming reliant on for-profit companies like this, but AI tutors and the like, if properly implemented, would be kinda awesome. For example, it’s usually not feasible to have real-life staff on hand to answer student questions at all hours of the day. Especially in the earlier years of university, where the content is simpler, AI is more than capable of meeting needs like this.

    I don’t fully agree with most of the people on this thread. I also hate AI slop being forced into what feels like all aspects of our life right now, but LLMs do have some genuine uses.

    • qbus@lemmy.world · +4 · 3 days ago

      Yeah man, for-profit companies should be banned in higher ed. Also, unrelated: did you renew the license for your textbook?

    • MadMadBunny@lemmy.ca · +2/−4 · 4 days ago

      Just ask ChatGPT what this means; it will explain it to you like you’re five.