• Sansa_Culotte_@alien.topB

    Why is it okay to own furniture, but not people?

    By the way:

    it’s not OK for a machine to do so

    There are no machines that read and learn; “machine learning” is a technical term that has nothing to do with actual learning.

    • afwsf3@alien.topB

      I fail to see how training an LLM on the material I choose is any different from studying that material myself. Artists are just mad I can make awesome pictures on my graphics card.

    • bikeacc@alien.topB

      What? We as humans literally learn through pattern recognition. How is that different from what a machine is doing? Of course it is not exactly the same process our brains use, but it is by no means a “metaphor”.

    • ApexAphex5@alien.topB

      I guess you think “neural networks” work nothing like a brain, right?

      Of course machines can read and learn, how can you even say otherwise?

      I could give an LLM an original essay, and it will happily read it and give me new insights based on its analysis. That’s not a conceptual metaphor; that’s bona fide artificial intelligence.

      • FuckToiy@alien.topB

        I think anyone who thinks neural nets work exactly like a brain at this point in time is being pretty simplistic. Then again, you said “like a brain”, so you’re already in metaphor territory, and I don’t know what you’re disagreeing with.

        Learning as a human and learning as an LLM are just different philosophical categories. We have consciousness; we don’t know if LLMs do. That’s why we use the word “like”. Kind of like “head-throbbed heart-like”.

        We don’t just use probability. We can’t parse 10,000,000-parameter spaces. Most people don’t use linear algebra.

        A simulation of something is not equal to that something in general.
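
        For what it’s worth, here is a rough sketch of what “using probability and linear algebra” means for a model: the next-token step is essentially a dot product per vocabulary word followed by a softmax. The words and numbers below are made up purely for illustration; this isn’t any real model’s code:

        ```python
        import math

        # Toy "next-token" step. Each vocabulary word gets a logit from a dot
        # product (linear algebra); softmax turns the logits into probabilities.
        # All values here are invented for illustration only.
        vocab = ["cat", "dog", "piano"]
        hidden_state = [0.2, -1.0, 0.5]      # toy summary of the context so far
        output_weights = [                   # one weight vector per vocab word
            [1.5, -0.3, 0.8],
            [1.2, -0.1, 0.9],
            [-2.0, 0.4, -0.5],
        ]

        def dot(a, b):
            return sum(x * y for x, y in zip(a, b))

        def softmax(xs):
            m = max(xs)
            exps = [math.exp(x - m) for x in xs]
            total = sum(exps)
            return [e / total for e in exps]

        logits = [dot(hidden_state, w) for w in output_weights]
        probs = softmax(logits)
        for word, p in zip(vocab, probs):
            print(f"P(next token = {word!r}) = {p:.3f}")
        ```

        A person writing a sentence is not consciously doing anything like that.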

    • pilows@alien.topB

      What’s the connection between owning slaves and using computer tools? I don’t really follow this jump in logic.

        • pilows@alien.topB

          I think they were talking about human slaves, not computer networks. The person above them asked why humans can learn from copyrighted material but machines aren’t allowed to. The next person asked why we can own furniture but not people. To me this seems like they are saying we don’t own slaves for the same reason computer programs shouldn’t be allowed to learn from copyrighted material. I’d say we don’t own slaves because as a society we value and believe in individuality, personal choice, and bodily autonomy, and I don’t see how those relate to dictating what content you train computer models on.

    • BiasedEstimators@alien.topB

      Neural networks aren’t literally bundles of biological neurons but that doesn’t mean they’re not learning.
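
      “Learning” here just means adjusting parameters from examples until the errors shrink. As a toy illustration (a made-up example, not anyone’s production code), here is a single artificial neuron learning the OR function by gradient descent:

      ```python
      import math
      import random

      # Training data for OR: (inputs, expected output).
      data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

      random.seed(0)
      w = [random.uniform(-1, 1), random.uniform(-1, 1)]  # weights
      b = random.uniform(-1, 1)                           # bias
      lr = 0.5                                            # learning rate

      def predict(x):
          z = w[0] * x[0] + w[1] * x[1] + b
          return 1 / (1 + math.exp(-z))   # sigmoid "activation"

      for epoch in range(2000):
          for x, target in data:
              err = predict(x) - target
              # Gradient descent: nudge each parameter to reduce the error.
              w[0] -= lr * err * x[0]
              w[1] -= lr * err * x[1]
              b -= lr * err

      for x, target in data:
          print(x, "expected", target, "predicted", round(predict(x), 3))
      ```

      No biology involved, but the behavior those parameters encode was genuinely learned from the examples rather than hand-coded.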

    • platoprime@alien.topB

      There are no machines that read and learn.

      That’s exactly what large language models do.

      • Sansa_Culotte_@alien.topB

        That’s exactly what large language models do.

        I can see how you would come to that conclusion, given that you clearly are incapable of either.