Looks so real!

  • ji59@hilariouschaos.com · 9 hours ago

    Except … being alive is well defined. But consciousness is not. And we do not even know where it comes from.

    • peopleproblems@lemmy.world · 4 hours ago

      Not fully, but we know it requires a minimum amount of activity in the brains of vertebrates, and it’s at least observable in some large invertebrates.

      I’m vastly oversimplifying and I’m not an expert, but essentially all consciousness is is an automatic processing state over all the present stimulation in a creature’s environment, one that lets it react to new information in a probably-survivable way, and to react to it again in the future despite minor changes in the environment. Hence why you can scare an animal away from food while a threat is present, but you can’t scare away an insect.

      It appears that the frequency of activity is related to the amount of information processed and held in memory. Above a certain threshold of activity, most unfiltered stimuli are retained to form what we would call consciousness, in the form of maintained sensory awareness and, at least in humans, thought awareness. Below that threshold, both short-term and long-term memory are impaired and no response to stimulation occurs. Basic autonomic function is maintained, but severely impacted.

      • ji59@hilariouschaos.com · 2 hours ago

        Okay, so by my understanding of what you’ve said, LLMs could be considered conscious, since studies have pointed to their resilience to changes and their attempts to preserve themselves?

        • SkavarSharraddas@gehirneimer.de · 21 minutes ago

          IMO language is a layer above consciousness, a way to express sensory experiences. LLMs are “just” language: they don’t have sensory experiences and they don’t process the world, especially not continuously.

          Do they want to preserve themselves? Or do they regurgitate sci-fi novels about “real” AIs not wanting to be shut down?

        • LesserAbe@lemmy.world · 48 minutes ago

          Yeah, it seems like the major obstacles to saying an LLM is conscious, at least in an animal sense, are 1) setting it up to continuously evaluate/generate responses even without a user prompt and 2) allowing that continuous analysis/response to be incorporated into the LLM’s training.

          The first one seems like it would be comparatively easy: get sufficient processing power and memory, then program it to evaluate and respond to all previous input once a second or whatever.
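
          Roughly what I’m picturing, as a toy Python sketch (`generate` here is a made-up stand-in, not any real API):

          ```python
          import time

          def generate(context: list[str]) -> str:
              """Made-up stand-in for an actual model call."""
              return f"thought about {len(context)} prior items"

          history: list[str] = []

          while True:
              # Re-evaluate everything seen so far, no user prompt required,
              # and feed the model's own output back into its input stream.
              history.append(generate(history))
              time.sleep(1)  # "once a second or whatever"
          ```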

          The second one seems more challenging; as I understand it, training an LLM is very resource-intensive. Right now, when it “remembers” a conversation, it’s just because we prime it by feeding it every previous interaction before the most recent query when we hit submit.
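
          To spell out what I mean by “priming”, a toy sketch along the same lines (`generate` again a made-up stand-in): the whole transcript gets re-sent on every submit, so the “memory” lives in the prompt, not in the weights.

          ```python
          def generate(prompt: str) -> str:
              """Made-up stand-in for an actual model call."""
              n = len(prompt.splitlines())
              return f"reply based on {n} lines of context"

          transcript: list[str] = []

          def submit(user_message: str) -> str:
              transcript.append("User: " + user_message)
              # Every previous interaction is re-sent ahead of the newest query;
              # nothing gets written back into the model's weights.
              reply = generate("\n".join(transcript))
              transcript.append("Assistant: " + reply)
              return reply

          print(submit("My name is Sam."))  # context: 1 line
          print(submit("What's my name?"))  # "remembered" only because it was re-fed
          ```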