• CanadaPlus@lemmy.sdf.org · 7 months ago

    Does anyone really know what a guitar is, completely? Like, I don’t know how they’re made, in detail, or what makes them sound good. I know saws and wide-bandwidth harmonics are respectively involved, but ChatGPT does too.

    When it comes to AI, bold philosophical claims about knowledge stated as fact are kind of a pet peeve of mine.

    • CasualPenguin@reddthat.com · 7 months ago

      It sounds like you could do with reading up on LLMs in order to know the difference between what it does and what you’re discussing.

    • Zron@lemmy.world · 7 months ago

      You’re the one who made this philosophical.

      I don’t need to know the details of engine timing, displacement, and mechanical linkages to look at a Honda Civic and say “that’s a car; people use them to get from one place to another. They can be expensive to maintain and fuel, but in my country they’re basically required due to poor urban planning and no public transportation.”

      ChatGPT doesn’t know any of that about the car. All it “knows” is that when humans talked about cars, they brought up things like wheels, motors or engines, and transporting people. So when it generates its reply, those words are picked because they strongly associate with the word car in its training data.

      All ChatGPT is, is really fancy predictive text. You feed it an input and it generates an output that sounds like something a human would write based on the prompt. It has no awareness of the topics it’s talking about. It has no capacity to think or ponder the questions you ask it. It’s a fancy lightbulb: instead of light, it outputs words. You flick the switch, words come out, you walk away, and it just sits there waiting for the next person to flick the switch.
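      The word-association claim above can be sketched as a toy bigram model. This is a drastic simplification of a real LLM, and the corpus here is made up for illustration, but it shows the core idea: words get picked because they strongly follow other words in the training data, with no understanding involved.

```python
from collections import defaultdict

# Tiny made-up corpus standing in for "what humans said about cars".
corpus = (
    "cars have wheels . cars have engines . "
    "cars transport people . engines need fuel ."
).split()

# Count which word follows which (bigram statistics).
counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_word(word):
    """Pick the word most strongly associated with the previous one."""
    followers = counts[word]
    return max(followers, key=followers.get) if followers else "."

# Generate text one word at a time, like very crude predictive text.
word, out = "cars", ["cars"]
for _ in range(4):
    word = next_word(word)
    out.append(word)

print(" ".join(out))  # → cars have wheels . cars
```

      The output is fluent-looking for the same reason ChatGPT’s is: each word is statistically likely given what came before. Nothing in the model knows what a car is.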

      • CanadaPlus@lemmy.sdf.org · 7 months ago

        No man, what you’re saying is fundamentally philosophical. You didn’t say anything about the Chinese room or epistemology, but those are the things you’re implicitly talking about.

        You might as well say humans are fancy predictive muscle movement. Sight, sound and touch come in, movement comes out, tuned by natural selection. You’d have about as much of a scientific leg to stand on. I mean, it’s not wrong, but it is one opinion on the nature of knowledge and consciousness among many.

        • Zron@lemmy.world · 7 months ago

          I didn’t bring up Chinese rooms because it doesn’t matter.

          We know how chatGPT works on the inside. It’s not a Chinese room. Attributing intent or understanding is anthropomorphizing a machine.

          You can make a basic robot that turns on its wheels when a light sensor detects a certain amount of light. The robot will look like it flees when you shine a light at it. But it does not have any capacity to know what light is or why it should flee light. It will have behavior nearly identical to a cockroach, but have no reason for acting like a cockroach.

          A cockroach can adapt its behavior based on its environment, the hypothetical robot can not.

          ChatGPT is much like this robot: it has no capacity to adapt in real time or to learn.
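          The robot described above fits in a few lines. The threshold and names are invented for illustration; the point is that the stimulus-response mapping is hard-wired, so no amount of experience ever changes it.

```python
LIGHT_THRESHOLD = 50  # arbitrary sensor cutoff, chosen for illustration

def robot_step(light_level):
    """Hard-wired stimulus-response: flee bright light, else idle.
    There is no internal state and no learning; this mapping never changes."""
    return "flee" if light_level > LIGHT_THRESHOLD else "idle"

print(robot_step(80))  # flee — looks like a cockroach avoiding light
print(robot_step(10))  # idle
# Run it a million times in any environment: the rule stays identical,
# which is the difference from the cockroach, which can adapt.
```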

    • FruitLips@lemmy.ml · 7 months ago

      Feels reminiscent of stealing an Aboriginal, dressing them in formal attire then laughing derisively when the ‘savage’ can’t gracefully handle a fork. What is a brain, if not a computer?

      • CanadaPlus@lemmy.sdf.org · 7 months ago

        Yeah, that’s spicier wording than I’d prefer, but there is a sense in which they’d never apply such a high bar for understanding to any other biological creature.

        I wouldn’t mind considering the viewpoint on its own, but they put it like it’s an empirical fact rather than a (very controversial) interpretation.