• SparroHawc@lemmy.zip · 4 days ago

      Students that embrace AI might outperform traditional students in situations where AI performs well. However, they will perform worse in all other metrics, and significantly worse when they don’t have access to AI.

      This isn’t theory. Teachers are already seeing it.

      The point of school isn’t to do work. The point is to learn, and typing prompts into genAI isn’t learning.

      • lemmyartistforhire@lemmy.world · edited · 4 days ago

        The point is to learn, and typing prompts into genAI isn’t learning.

        I don’t agree 100%, and I’m still trying to find out why this line in particular stood out to me so much.

        Will you allow me to twist your words a little to play devil’s advocate?

        Pressing buttons in Photoshop isn’t learning.

        Of course I agree with you that schools are for learning, and by prompting genAI to generate an image, you don’t learn the basics of photography. But AI can be used as a tool, and the use of tools can be taught. So you do still learn something when using an AI: how to use it, when to use it, and when it doesn’t help at all.

        And I think that is what krisevol was trying to say. Students that know how/when/if to use AI for something, have an advantage over students that don’t have that knowledge.

        • SparroHawc@lemmy.zip · 3 days ago

          Students with access to AI are already over-relying on it. It is a dangerous tool for children to have unfettered access to, just like social media. It’s sold, misleadingly, as a panacea for having to do research yourself, having to write things yourself, having to think for yourself.

          Sure, learning how to use AI and when to use AI is useful - but its actual usefulness is far, far below the level that people who say things like “Students that embrace AI will outperform traditional students” believe it to be. Everything an AI spits out HAS to be double-checked. AI corps don’t want you to think about that aspect. Unless this is hammered home in every class that AI is used in, then it shouldn’t be used in school.

          Let me present an example from OP’s image. Julia complains that AI can write essays better than the teachers who grade the essays students write. Leaving aside the extraordinary skepticism I feel about this statement, Julia is implying that writing essays in school is a waste of time - but learning how to write a functional, persuasive essay, with proper research, sources, etc. is very, very important, because it teaches you to extract the facts you need from sources and cite them to prove your point in a way that can’t be refuted offhand. It teaches you to examine your premise, find what makes it tick, and take it apart to find the constituent parts so you can shore them up with critical details that prove it. English courses teach you about grammar, spelling, tone, context, tense… I could go on.

          Prompting an LLM to write the essay for you teaches you none of this, and claiming that the two are equivalent is the height of folly. In fact, if someone ‘writes’ an essay with an LLM, I am much more likely to simply reject it out of hand, because it means they didn’t want to take the time to make their own argument. I would rather see the prompt they used than the essay itself, because there will be, in essence, zero additional information in the essay, and asking me to read LLM output as if it provided value of some sort is an insult. I can ask ChatGPT to spew out details on a topic just as well as you can. At least if someone wrote the essay themselves, even if it is poorly written, there’s the possibility of them having a novel approach to the topic.

          • lemmyartistforhire@lemmy.world · 1 day ago

            I don’t disagree with you, and you don’t seem to disagree with me, as we’re both saying the same thing: You are not learning the thing you are trying to cut short with AI.

            I’m wondering if the downvotes I’m getting really are for saying that AI can have its uses, and that exposure to AI can reveal those.

        • slacktoid@lemmy.ml · 4 days ago

          I agree with you partway, but that’s why I’ll add: below a certain age, you shouldn’t be allowed to. Just as you shouldn’t use a calculator until a certain age, you shouldn’t use AI until a certain age.

    • balsoft@lemmy.ml · 3 days ago

      In school exams, when they have access to AI? Sure. In the actual real world afterwards? I doubt it.

      If all you do during school is ask LLMs to do most of the work for you, all you’ll know by the end is how to prompt LLMs. Which is not actually a difficult skill to learn, by design, so if you focus on it instead of everything else you’ll lose out.

      Hopefully the education system adapts and invents ways to meaningfully integrate AI into classwork while forcing students to learn to think for themselves still. Otherwise the next generations will be even more cooked than mine.

      • krisevol@lemmus.org · 3 days ago

        That’s the same “you won’t have a calculator everywhere you go” argument. A lot of companies let you use AI on the job, and everyone has AI in their phone today.

        • balsoft@lemmy.ml · edited · 3 days ago

          Well, yes, it is. Kids who used calculators to cheat with basic arithmetic will often struggle with learning more advanced concepts later on because they didn’t get a “feel” for numbers, and I strongly suspect the same will happen with kids who start using LLMs before they know how research works.

          It is totally appropriate to use calculators once you already have an intuition for small numbers, and in just the same way students should learn to use LLMs, but only once they already know how to write, think, and research. Curricula need to adapt to this quickly; otherwise we will end up with a generation that outsources all its thinking to techbros.