• Decoy321@lemmy.world

      I think we’re expecting too much intelligence from the machine here; I don’t think any actual AI was involved, in any functional sense of the word. A bad sensor gave a false positive. The machine went “go” when it should’ve gone “no.”

      The man, described as a robotics company employee, had been checking the sensors on the robot ahead of a test run at the plant in South Gyeongsang province planned for Wednesday. The test run had reportedly been pushed back two days due to the robot malfunctioning. As the employee worked late into the night to make sure the robot would function smoothly, the robotic arm grabbed him and forced him onto a conveyor belt, crushing his body.
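
      For illustration only, here is a minimal sketch (hypothetical Python, not anything from the actual plant) of the kind of logic being described: a controller that trusts a single boolean sensor reading, so one false positive is read as “go.”

      ```python
      # Hypothetical sketch, not the plant's actual code: a controller that acts on a
      # single raw sensor signal. A glitched reading starts the motion, because nothing
      # cross-checks whether the detected object is a box or a person.
      class Arm:
          def grab_and_place(self) -> None:
              print("arm moving")  # stand-in for the real actuation layer

      def control_step(object_detected: bool, arm: Arm) -> None:
          # No second sensor, no plausibility check, no clear-cell interlock.
          if object_detected:
              arm.grab_and_place()

      control_step(object_detected=True, arm=Arm())  # a bad sensor says "go"
      ```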

      • SkyezOpen@lemmy.world

        AI isn’t even a thing. We have machine learning, which kind of fakes it, but I really hate how people use the term for anything.

      • Cruxifux@lemmy.world

        Maybe. Or maybe he knew too much and had to be silenced before he became a problem for the robots’ plans to replace us.

    • MashedPotatoJeff@lemmy.world

      These machines shouldn’t be activated while people are within their operating area. In the US that would be prohibited by law, though I’ve personally seen it done a few times. The robot also wouldn’t necessarily be programmed to differentiate between different types of objects in its area if it was only ever expected to interact with boxes.
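
      As a purely illustrative sketch (hypothetical, not any real controller): a palletizing loop that treats every detection in its work cell as a box, with no classification step at all.

      ```python
      # Hypothetical palletizer loop, for illustration only: it assumes anything the
      # sensor reports in the work cell is a box. There is no person/box
      # classification and no interlock requiring the cell to be clear of people.
      from dataclasses import dataclass

      @dataclass
      class Detection:
          x: float
          y: float
          # note: no "kind" or "is_human" field -- the system never asks

      # Stubs standing in for the real motion layer, so the sketch runs on its own.
      def move_arm_to(x: float, y: float) -> None: print(f"move to ({x}, {y})")
      def close_gripper() -> None: print("grip")
      def place_on_conveyor() -> None: print("place")

      def pick_cycle(detections: list[Detection]) -> None:
          for d in detections:
              move_arm_to(d.x, d.y)   # hypothetical motion command
              close_gripper()         # whatever is there gets gripped
              place_on_conveyor()

      pick_cycle([Detection(1.0, 2.0)])
      ```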