Monty Python Meets Tesla Self-Driving
The following represents such an extraordinary lapse in legal mental acuity that it seems more like a scene from Monty Python than a newspaper article. That it happens concurrent with Boris Johnson’s term as British Prime Minister is hugely and comically appropriate. The metaphor of self-driving-gone-wrong and Johnson’s occupation of Number 10 is just the rib-tickler we need to assuage the disaster brewing in Ukraine.
Bring it on, John Cleese. Give it a go.
Unfortunately, Monty is a character rather than a person, and John was busy elsewhere. At least I suppose he was; he didn’t return my calls. So, you and I will have to brave it out ourselves. Here’s the article:
(Guardian UK) Self-driving car users should have immunity from offences – report: Law commissions recommend that vehicle users should not face regulatory sanctions if something goes wrong.
I am not making this up for a cheap gag, so help me god.
Users of self-driving cars should have immunity from a wide range of motoring offences, including dangerous driving, speeding and jumping red lights, Britain’s law commissions have jointly recommended.
Should they now? So, I can jump in my Tesla (if only I owned one), slip it into Ludicrous mode, tap self-driving on the screen, dial in a destination and crawl into the back seat for a nap, in order to keep myself fresh for the accident. Ludicrous, by the way, is defined as “broadly or extravagantly humorous; resembling farce.”
You gotta hand it to Elon, the man has a sense of humor. If only the same could be said of the various British Law Commissions.
The Law Commission for England and Wales and the Scottish Law Commission propose creation of an Automated Vehicles Act to reflect the “profound legal consequences” of self-driving cars. The person in the driving seat would no longer be responsible for how the car drives; instead, the company or body that obtained authorisation for the self-driving vehicle would face regulatory sanctions if anything went wrong.
Profound legal consequences, if anything went wrong?
Perhaps that would include decapitation from driving under a semi-trailer turning across the lane, because the self-driving system read the space under the truck as open road.

Teslas with Autopilot engaged have also hit a highway barrier and tractor-trailers that were crossing roads.
The National Highway Traffic Safety Administration (NHTSA) has sent investigative teams to 26 crashes involving Autopilot since 2016, accounting for at least 11 deaths.
From a 2020 article from VOX on a fatal accident in Mountain View, California:
Here’s the background: Two years ago, a 2017 Model X that had its Autopilot feature engaged was driving along a highway in Mountain View, California, when it struck a concrete barrier at a speed over 70 miles an hour. The crash was ultimately fatal for the driver, who died of injuries related to blunt force trauma.
After a months-long investigation, the agency identified seven safety issues related to the crash, including limitations to Tesla’s crash avoidance system and driver distraction. Among them, it appears that the driver was playing a game on an iPhone provided by his employer, Apple, and that he didn’t notice when the Autopilot steered the electric vehicle off-course.
Distraction, huh? That’s just Darwin, sorting out the wheat from the chaff in the human species.
“The Tesla Autopilot system did not provide an effective means of monitoring the driver’s level of engagement with the driving task, and the timing of alerts and warnings was insufficient to elicit the driver’s response to prevent the crash or mitigate its severity,” reads the report.
Don’t you love the language? Absolutely bloodless. Guess they didn’t want to say “the asshole was playing a game when he should have been watching his steering wheel do the self-driving-polka.”
Now don’t get me wrong, I’m a huge fan of Tesla.
But I’m not a fan of some Tesla features that I don’t think should be trusted to a largely uninformed and unqualified public. Tesla is constantly featured in videos outperforming the fastest cars on Earth, and that’s okay by me; it’s interesting stuff. I just don’t think such modes as Ludicrous and Insane should be available to the public. “Hey, Dad, can I borrow the car?” now has serious implications.
The first thing I did after getting my driver’s license (at age fifteen) was to get an ignition key made. The second thing I did was to take off in Dad’s huge ’49 Packard convertible whenever he and Mom were off elsewhere in her car. I could write a book about the consequences of those opportunities, but refer to my rights against self-incrimination, protected by the 5th Amendment of the U.S. Constitution.
So, that’s my gripe against the British Law Commissions.
Essentially, technology has left Darwin in the rear-view mirror. Silicon Valley has given us tools far beyond our slow-moving genetic capacity to use them safely. To quote Martin Luther King, Jr., “we have guided missiles and misguided men.” And he said that a hell of a long time before Tesla, Ludicrous or Insane. We sure as hell are not capable of safely controlling an automobile that goes from 0 to 100 kilometers per hour in 2.5 seconds. I’ve driven well over two million miles in my lifetime and think I’m a pretty capable driver, but if I ever do score a Tesla, you can keep the bells and whistles, thank you very much.
Yet the British Law Commissions recommend that vehicle users should not face regulatory sanctions if something goes wrong. In America we’ve seen what removing regulatory sanctions from the ownership and use of firearms has brought us. This is not a Monty Python sketch; it’s serious business, and it opens a crack in the door that tends to un-regulate human responsibility.
Self-driving automobiles are coming, there’s no doubt of that. And it will be a good thing when they do. But the development and refinement of those systems are the business of the automakers. Personally, I’m against having these unperfected systems available to the public today in ‘beta’ development. We are not Tesla’s proving grounds.
We have not yet seen a Tesla at 100 mph run a traffic light and cut a school bus in half with 35 children on board. Personal immunity from a wide range of motoring offences, including dangerous driving, speeding and jumping red lights, is not going to prevent that from happening.
There is, as always, a huge difference between law and justice. What is the justification for this flight of fancy on the part of the Law Commissions?
Like its close cousin "justice," justification is derived from the Latin justificare, which means "to make right." When you offer a justification, you're trying to make something right—or, perhaps, even just.
The British Law Commissions have missed that target by a mile.