Elon Musk’s FSD v12 demo includes a near miss at a red light and doxxing Mark Zuckerberg

Elon Musk posted a 45-minute live demonstration of v12 of Tesla’s Full Self-Driving feature. During the video, Musk has to take control of the vehicle after it nearly runs a red light. He also doxxes Mark Zuckerberg.

  • SatanicNotMessianic@lemmy.ml · 1 year ago

    I do this kind of thing for a living, and have done so for going on 30 years. I study complex systems and how they use learning and adaptation.

    Musk’s approach to these systems is idiotic and shows no understanding of or appreciation for how complex systems - animals, in particular - actually work. He wanted to avoid giving his vehicles lidar, for instance, because animals can navigate the world without it. Yet he didn’t give them either the perceptual or cognitive capabilities that animals have, nor did he take into account that the problems of animal locomotion solved by evolution are very different from the problems solved by people driving vehicles. It, of course, didn’t work, and now Tesla is trailing the pack on self-driving capabilities, with the big three German carmakers and others prepping Level 3 vehicles for shipping.

    If he is trying to chatgpt his way out of the corner he’s painted himself into, he’s just going to make it worse - and, amusingly, for the same reasons. Vision is just one dimension of sensation, and cars are not people, or antelopes, or fish, or whatever his current analogy is.

    This is just Elon Eloning again. No one predicts that a car coming towards them is going to do a California stop at a stop sign. If I’m pulling into an intersection and I see someone rolling through a stop sign, I’m hitting the brakes, because obviously a) they didn’t see me and b) they don’t know the rules of the road. Elon’s cars have a problem with cross traffic and emergency vehicles anyway; making the logic fuzzier is not going to improve the situation. If he thinks throwing video and telemetry data at a large model is going to overcome his under-engineered autonomous system, I suspect he’s going to be in for a rude discovery.

    If there’s anything kids today can learn from Elon (or from Trump for that matter), it’s how to be so confidently wrong that people throw money at you. The problem is that if you’re not already born into wealth and privilege, you’re likely to merely become the owner of the most successful line of car dealerships in a suburban county in Pennsylvania, or else in prison for fraud.

    • Thorny_Thicket@sopuli.xyz · 1 year ago

      If FSD is trained from billions of hours of video data then it by definition drives like an average driver and thus is highly predictable.

      • SatanicNotMessianic@lemmy.ml · 1 year ago

        That’s not how it works, unfortunately. That’s how people want it to work, but it’s not how it works.

        This is just more of Elon’s pie in the sky.
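        To see why training on average behavior doesn’t yield predictable behavior, here’s a toy sketch (all numbers hypothetical, not real FSD data): if training data contains two distinct behaviors at a stop sign, a model that minimizes squared error learns their mean, which matches neither behavior.

        ```python
        # Hypothetical stop-sign approach speeds (mph) in a training set.
        # Assume two common modes: a full stop (0 mph) and a rolling
        # "California stop" (~5 mph).
        observed = [0.0] * 700 + [5.0] * 300  # 70% stop fully, 30% roll through

        # A regression-style imitation model minimizing squared error
        # converges to the mean of the data...
        mean_speed = sum(observed) / len(observed)
        print(f"learned speed: {mean_speed:.1f} mph")

        # ...but the mean of a multimodal distribution is an action that
        # no driver in the data actually takes, so it is the opposite of
        # "highly predictable."
        matches = sum(1 for s in observed if abs(s - mean_speed) < 0.5)
        print(f"training examples within 0.5 mph of the learned action: {matches}")
        ```

        This is the usual objection to naive behavior cloning: averaging over a mixture of behaviors produces an output that lies between the modes rather than reproducing any one of them.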

        • Thorny_Thicket@sopuli.xyz · 1 year ago

          If you’ve done this kind of stuff for a living for the past 30 years, then I’m sure you can give me a better explanation than “that’s not how it works.”