Per corporate policy in each of these murderbot companies, every vehicle has a responsible party at all times. Every car in so-called “self”-driving mode has some human meat-sack in a call center who is nominally in charge of that vehicle, and if something goes wrong, they are responsible. They are contractually in control of that remote machine. If the paint gets nicked, they’re the one getting written up by their manager. They might not be physically inside it, and they might be in charge of more than one, but they are the drone operators. These people are the “1.5 operators per vehicle, intervening every 2.5 miles” that we have heard about.
Tonight I was chatting with an old acquaintance who now works for one of these murderbot companies (the conversation was… fraught) and they told me that it was commonly known amongst their staff that [San Francisco Police Department] is well aware that every vehicle has a responsible party, but they pretend not to know, because it sounds complicated and annoying to deal with.
Coincidentally, Amazon has this service called Mechanical Turk.
Yeah. I’ve only ever heard that it’s absolute shit work.
A lot of “AI” projects have hidden labor behind the curtains. “Driverless” cars always have a driver.