Man accused of dangerous driving for sleeping in self-driving, speeding Tesla
globalnews.ca

> “They are not self-driving systems,” said Supt. Gary Graham of Alberta RCMP Traffic Services.
> “These technologies assist drivers, but do not replace them,” said the statement from the federal institution responsible for transportation policies and programs ...
> Tesla’s website states, “All new Tesla cars come standard with advanced hardware capable of providing Autopilot features today, and full self-driving capabilities in the future.”
... full self-driving capabilities in the FUTURE ...
So it is a DISHONEST marketing issue that creates a wrong perception in the consumer, and that's what the government should address.
Create regulations that do not allow such vehicles to be marketed as "self-driving". Create some kind of certification to earn the right to advertise a vehicle as "self-driving". And define and standardize better labels for the current technologies in use, and publicize them well.
> So it is a dishonest marketing issue that creates a wrong perception in the consumer
German courts share this opinion and Tesla is prohibited from naming their product Autopilot or referencing any FSD.
So people keep claiming, but I've yet to see a single example of an actual Tesla owner who was confused about this.
The person in this story wasn't confused and didn't even attempt a defense that Tesla had misled them.
So he and his passenger went to sleep fully knowing that Tesla's driving assist could fail at any moment? You mean, they were both suicidal? Much more plausible is that he is scared to implicate Tesla into this.
Tesla marks all these features as beta on the menu where you activate them, and pops up a long warning text explicitly stating the limitations. I don’t think Tesla could do much more than they already do.
They’ve oversold the tech and its capabilities in multiple public statements. Musk said they would have a fully autonomous cross-country trip by 2017, and said there would be 500,000 robo-taxis on the road by the end of this year. These claims make the general public assume the tech is almost ready and that the disclaimers are just legal CYA. Why else would people be trusting their lives to something that could fail at any moment?
The only thing that gives me pause is that only around 5 people have been killed by a Tesla on Autopilot or FSD, which frankly is shocking to me given the multiple videos of abuse of these systems floating around the internet.
Describing driving assist as "beta FSD" is already grossly misleading, and still my point stands: either they were misled or they were suicidal.
I was highly skeptical of these systems too, but for normal freeway driving, Tesla’s systems are much better than I expected them to be. As a programmer, there’s no way I would be comfortable enough to sleep in my car, but I could imagine someone getting overconfident. This has always been a major risk with semi-automated systems: I believe airplanes dialed back the functionality of autopilots to prevent pilot complacency.
OK, that is a good point. So you are saying maybe the marketing wasn't what misled the driver, but the abilities of the system. A system that fails 1% of the time will give the user overconfidence up until the first time it fails.
Yeah, this is a good way to put it.
>airplanes dialed back the functionality of autopilots to prevent pilot complacency.
Or maybe because the potential for disaster is too great, like the 737 max attempting to nosedive into the ocean because of a single sensor failure.
This is a well-known issue: when automation gets good enough that the human in the loop starts to get bored, it can be more dangerous than either “all human” or “all computer”.
Automated flying reminded me of this incident in India - some pilots used to cover the windows of their cockpit with newspapers after take off, presumably to have a short nap!
Source: https://www.nytimes.com/2011/05/26/business/global/26airindi...
Disengage all driver assist tech when the seats are reclined beyond a certain point.
If you feel like having some ‘fun’ (in a Dwarf Fortress sense) read about Leran Cai’s driving record and some of the arguments they have previously used in court. I’m nowhere near qualified to know what’s up with this dude but holy crap, someone make popcorn...:)
You're ignoring the range of risk between safe and suicidal. Someone who drives down the highway at 100 MPH may be reckless but not suicidal. It's the same sort of thing. A stupid risk but far from a certainty of death.
You think it's a regulation issue. I say it's overtrust in an almost-working system.
If something gets good enough you will rely on it more and more. It will work, until it doesn't.
And the systems fail in ways that a human wouldn’t which makes it more dangerous. Teslas often “shadow brake” when driving under overpasses or overhead signs because the radar system gets confused.
This is what gets me. A big part of driving safety is defensive, i.e. monitoring the other drivers around you. It's relatively easy to pick out distracted, erratic or risk-taking drivers from a handful of simple cues. Very much not the case with software-driven autopilot routines from an increasing number of vendors. So accidents may go down where humans are at fault, but I find it more unsettling overall, as I have less control to take acceptable safety precautions.
Yes, defensive driving is the norm because we humans are fallible. If all the cars on the road were self-driven, then it would be a different story.
That assumes self-driving cars will be infallible.
>And the systems fail in ways that a human wouldn’t which makes it more dangerous
Not necessarily, because humans fail in ways that Tesla autopilot wouldn't.
1.5 million people are killed in road accidents every year. 54 million people are injured.
Even with certification, the person behind the wheel will still be responsible (as long as he/she has control over the vehicle). Once vehicles ship without steering wheels/pedals, that will no longer be the case.
As someone else suggested, the manufacturer should be made liable in case of an accident involving a "self-driving" vehicle.
I find it a little concerning that the car is allowed to drive in autopilot with the seats reclined. I thought there were warnings for taking your hands off the steering wheel? I suppose they can be ignored. But full recline is a bit OTT.
A car is not your mother. It has no obligation to prevent you from speeding or driving the wrong way down a one way street or otherwise being an idiot. And it shouldn't try, because sometimes the car has an inaccurate map or doesn't know that something unusual is happening or the sensor is broken etc.
When I'm driving I get to decide how loud the radio is or whether all the windows are down, not my passengers. If the car is doing the driving I have no problem with the car stipulating its own requirements to ensure a safe trip.
The whole point is that the car is NOT doing the driving, the same way cruise control is not doing the driving either. It is not a fully self-driving car, yet. It can automate some portion of the driving task, just as cruise control can, but the limits of operation, as stated by the manufacturer, do not allow it to control the car unattended.
No. The car isn't a person. It has no awareness of external context. You can't change its mind using evidence or reasoning. The person in the vehicle should always be able to override the car.
> A car is not your mother.
Rhetoric like this works until it won't anymore. And that rhetoric breaks when innocent bystanders start getting killed in numbers. Line up enough grieving families in front of congress and congress will act, and then the new rhetoric will be "The car should be your mother."
> Rhetoric like this works until it won't anymore.
Let it continue to work forever then.
> And that rhetoric breaks when innocent bystanders start getting killed in numbers. Line up enough grieving families in front of congress and congress will act, and then the new rhetoric will be "The car should be your mother."
Do I get to line up mine? Cars doing what idiots tell them to is remedied by punishing those idiots. Cars not doing what people need them to results in fatalities.
Bob is a big dude. He weighs 400 pounds and can only fit in the car by putting the seat all the way back. There are a million Bobs and you just prohibited any of them from ever using driver assist, which would have improved safety and saved lives.
Your car won't let you drive more than 10MPH over the speed limit. But then it thinks you're on a service road with a 30MPH speed limit when you're really on the highway with a 70MPH speed limit, forces you to decelerate to a speed 30MPH slower than the traffic flow, and you get rear-ended and cause a twelve car pile up with nine fatalities.
You're on a family vacation driving through a deserted area in the middle of nowhere and stop to stretch your legs. When you get back in the car, some sensor has failed and the vehicle's computer won't let you put the vehicle back into motion again, but there is no wireless coverage or anyone else around, so you and your family die in the wilderness.
You need to drive your car the wrong way down a one way street because you need to get to a hospital and the other road is blocked, or you're currently impeding a fire truck answering an emergency call, or you need to get away from a forest fire. The computer refuses.
"I can't let you do that, Dave." -> People die.
While you make a bunch of good points about the unintended consequences of regulations, it's clear to me that cars work as is without an auto-pilot -- and have since the original Model T.
That said, Ralph Nader's "Unsafe At Any Speed" is a great example of why regulation of automobiles is needed. There's also a reason why cars have safety glass on the windshield.
I'd like to see your entire post again, but related only to self-driving cars.
Then IMO it doesn't seem too invasive to have a breathalyzer test attached to all vehicles, which will certainly save a lot of lives. The data can be all local and regularly wiped to mitigate the privacy concern.
Are the numbers behind alcohol-related vehicular injuries and deaths not high enough to be a political or public priority?
Looks like there's plenty of laws and support surrounding drunk driving.
There are more accidental deaths from firearms than Teslas. Clearly we need better regulation of firearm handling and ownership.
If so, why aren't cars locked to the maximum speed limit in a country?
I mean, there are interlocks between the shift mechanism and the brakes, at least in part because it's safer.
There are two things to separate out: the lack of a feature on a vehicle doesn't particularly free the driver from responsibility for things they do, and then there's the question of what sort of idiotic vehicles we are willing, as a society, to share the road with.
"A car is not your mother" doesn't really answer the second part.
Very much appreciated.
People need to understand THEY are responsible for what they are doing, not the companies for not preventing them from doing stupid things.
"A car is not your mother"
I am definitely going to use it:)
"Witnesses claim the man behind the wheel and his passenger were asleep with their seats fully reclined, as their Tesla travelled up to 150 km/h on the freeway near Ponoka..."
Given this and the man's other past incidents, the Tesla was only a minor contributing factor in the guy's global stupidity and danger that he inflicts on the world daily?
Edit: Btw, what would be the end accident scenario here? Presuming that the car wouldn't hit any other car on the road, would it lose control on a curve that was too tight for the speed, or fail at the end of the freeway or something? Would it alarm and then come to a gradual stop? Not a Tesla owner here.
"Presuming that the car wouldn't hit any other car on the road" - it absolutely would though. The tech is good enough to avoid hitting another car that slows down in front of you, but if someone was broken down(and fully stopped) in your lane, the car will hit them at full speed. It might start breaking just before impact, but not fast enough to slow you down from 150km/h. No self driving or adaptive cruise tech can save you from this situation, because they are all trained to ignore fully stationary objects. I've also seen situations where road markings disappear on the road for whatever reason and Tesla goes from being 100% confident and in control to "take over the wheel NOW" in about 1 second flat. Best case scenario here is that the car would eventually realize that you aren't in control and come to a gradual stop - the problem there is that now he is fully stopped on an active highway and chances of getting plowed from the rear are astronomical.
I think the biggest problem with all of these self driving systems is that in cars they can require you to be fully 100% aware of your surroundings and take over in a split second. People compare it to plane autopilot, but I don't think that's quite right - in a plane, you will have at least a minute or two before you hit the ground, even if the plane takes a straight nose dive. In a car, you go from being fine to being headed for a head-on collision in no time at all.
This [1] was the famous 2016 crash where a Tesla broad-sided a basically stationary semi-truck. Are you suggesting Tesla hasn't solved this type of problem yet?
[1] https://www.reuters.com/article/us-tesla-crash/tesla-driver-...
Yes, of course Tesla hasn't solved it yet, because no one has, and in fact it probably isn't possible to solve without some magic tech that doesn't exist yet - it might be possible to fix with LIDAR, but obviously Tesla doesn't have it.
All of these systems have to ignore stationary objects ahead of them, or the car would be emergency braking in too many daily situations to be usable.
Simple case: at 150km/h you are travelling at 41m/s. The stopping distance at that speed is roughly given as 130m[0]. So the car would need to see and recognize objects at least ~150m ahead to stop autonomously from that speed. That's just physically not possible; the cameras on the Tesla don't have enough resolution to do that kind of recognition. Instead, a radar-based distance measure is used - but again, even if the car detects that you are rapidly approaching "something" 150m ahead of you, that information is nearly useless. At that distance, you cannot differentiate between a stopped car 150m ahead of you, an overhead sign, or a large rock next to your lane which poses absolutely no danger whatsoever - it all gives the same signature. LIDAR doesn't have that range either. And then of course even if the car could reasonably detect that there is something ahead of you that you are absolutely 100% definitely heading towards, it has no idea whether the road curves in such a way that you would avoid it. There's the famous case of adaptive cruise systems freaking out at bends, because according to the radar/image recognition you are CLEARLY heading for that telephone pole standing next to the road - but of course the road curves, so you aren't actually going to hit it. Problems like that.
[0] https://www.random-science-tools.com/physics/stopping-distan...
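If you want to sanity-check those numbers, here is a rough back-of-the-envelope sketch (my own ballpark assumptions, not from the article or Tesla: roughly 0.7 g of braking deceleration on dry pavement and about one second of detection/decision latency before the brakes come on):

```python
# Back-of-the-envelope check of the stopping-distance figures above.
# Assumptions (mine, purely illustrative): ~0.7 g braking deceleration
# on dry pavement, ~1 s of detection/decision latency before braking.
G = 9.81  # m/s^2

def detection_distance_needed(speed_kmh, decel_g=0.7, latency_s=1.0):
    v = speed_kmh / 3.6                   # ~41.7 m/s at 150 km/h
    braking = v ** 2 / (2 * decel_g * G)  # distance covered while braking, ~126 m
    reaction = v * latency_s              # distance covered before braking starts, ~42 m
    return braking + reaction

print(round(detection_distance_needed(150)))  # ~168 m, in line with the 130-170 m figures in this thread
```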
Sorry, coming in late, but gotta object here...
My Subaru has no problem detecting stationary objects. It uses stereo cameras, for which object detection and locating the object is a thoroughly solved problem, whether the object is moving or stationary. No magic needed. It just works. For example, there is a sharp turn near my home with prominent turn signs along the curve, which puts them directly in front of you as you approach the turn. Those stationary signs quite reliably set off my collision warning if I'm approaching them too fast.
Tesla has the problem because it uses radar for object detection. The radar can tell them the presence of an object, but cannot tell them the object's location. An overhead sign or overpass or whatever looks the same as an object in the road. So to avoid countless false positives, they need to ignore signals from stationary objects because they chose to use radar. It's a self-inflicted problem that other vehicles do not have.
Aren't there fully self-driving vehicles undergoing testing on normal city streets with stoplights? We could say that's a different mode than high-speed highway driving, but there are a number of locations where those modes blend into each other. Are the experimenters extra careful to avoid those locations? How long before the tech exists to address this issue?
Sure there are, and yes, those are different sets of problems though. Have a look at the British Tesla Driver Youtube channel; some of his videos are eye-opening. Basically the car is in full autopilot mode, approaches an intersection, correctly slows down, waits for its turn, starts moving... and in the middle of the turn goes BEEP BEEP BEEP and disengages entirely because the road markings aren't there and it wasn't entirely sure where to go. And now of course you're in a moving vehicle that's heading for a collision with someone else and requires IMMEDIATE attention to continue. One could (and I'm sure will) argue that the system "shouldn't be used this way". But that's a moot point: if the system is there and lets you do this, then people will use it this way.
"How long before the tech exists to address this issue?" I'm not sure if that's a problem with tech as such. We have fantastic cameras, yet famously Google's best image recognition algorithm just couple years ago would reply, with 100% confidence, that a sofa in a zebra print is in fact a Zebra, after all the stripes are there, it has 4 legs.....it must be a zebra.
So in my(personal) opinion, self driving will face the same challanges image recognition has faced - we will rapidly get 90% of it right, then the last 10% will be a massive pain to get right for decades if ever.
It's an active research field. E.g. from October 2020: Calibrating Deep Neural Networks using Focal Loss: https://arxiv.org/abs/2002.09437
> Miscalibration - a mismatch between a model's confidence and its correctness - of Deep Neural Networks (DNNs) makes their predictions hard to rely on. Ideally, we want networks to be accurate, calibrated and confident. We show that, as opposed to the standard cross-entropy loss, focal loss [Lin et. al., 2017] allows us to learn models that are already very well calibrated. When combined with temperature scaling, whilst preserving accuracy, it yields state-of-the-art calibrated models. We provide a thorough analysis of the factors causing miscalibration, and use the insights we glean from this to justify the empirically excellent performance of focal loss. To facilitate the use of focal loss in practice, we also provide a principled approach to automatically select the hyperparameter involved in the loss function. We perform extensive experiments on a variety of computer vision and NLP datasets, and with a wide variety of network architectures, and show that our approach achieves state-of-the-art calibration without compromising on accuracy in almost all cases.
Calibration will be practically solved in a couple of years. Then a bit longer for addressing adversarial robustness.
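For reference, the focal loss that paper builds on is a one-line tweak to cross-entropy; here is a minimal numpy sketch (the gamma value and the example probabilities are purely illustrative, not from the paper):

```python
import numpy as np

def focal_loss(p_true, gamma=2.0):
    # Focal loss on the probability assigned to the true class:
    # FL(p) = -(1 - p)^gamma * log(p)   [Lin et al., 2017]
    # gamma = 0 recovers ordinary cross-entropy; larger gamma down-weights
    # easy, already-confident predictions, which the paper above argues
    # also yields better-calibrated confidences.
    p_true = np.clip(p_true, 1e-12, 1.0)
    return -((1.0 - p_true) ** gamma) * np.log(p_true)

# A confident correct prediction (0.95) contributes almost nothing to the loss,
# while an uncertain one (0.4) still carries a sizeable penalty.
print(focal_loss(np.array([0.95, 0.4])))
```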
There have been multiple reports of Teslas on autopilot hitting stationary roadside emergency vehicles.
My Tesla has regularly come to a full stop from freeway speeds in response to traffic ahead. This may not be 100% reliable, but it’s not true that “the system is trained to ignore stationary objects”.
It also beeps when it thinks I’m going to hit a stopped car and other things that demonstrate that this claim is false.
The question was specifically about this situation as described in the article - the car moving at 150km/h. No, your Tesla wouldn't stop in time for a fully stopped car in that situation.
It does slow down for traffic, but I think many people don't realize something - it works with moving traffic, because then it can definitely recognize that you are approaching a car (or at least something that moves in the same direction you do), so it knows it has to slow down for it.
>>It also beeps when it thinks I’m going to hit a stopped car and other things that demonstrate that this claim is false.
Read up on it, I'm sure the upper limit for this function is when the delta speed is <50km/h. It won't work with a delta of 150km/h because it's physically not possible.
>> but it’s not true that “the system is trained to ignore stationary objects”
These are not my words, that's exactly what Tesla said after the "trailer across a highway" accident, saying that of course they have to ignore stopped objects otherwise the car would emergency brake for overhead signs since they reflect radar the same way a stationary car does - at large enough distance there is no difference.
>> These are not my words, that's exactly what Tesla said after the "trailer across a highway" accident
>> Read up on it, I'm sure the upper limit for this function is when the delta speed is <50km/h. It won't work with a delta of 150km/h because it's physically not possible.
I think both of these have the same explanation: if you want to release a feature like this specced at a 50kph delta, you design for a safety factor of 2-3 (100-150) so that you can be confident that it’s safe at 50kph. The claim that “it’s physically impossible” doesn’t make sense to me: humans drive safely with such deltas using only “a video feed” and sound.
Anyways, I’m fairly certain I’ve come to a stop on autopilot from at least 60mph (100kph).
>>The claim that “it’s physically impossible” doesn’t make sense to me: humans drive safely with such deltas using only “a video feed” and sound.
My logic is as follows: at 150km/h, you are covering 41m per second, and the approximate stopping distance from that speed is about 130m. Human eyes are much better at recognizing objects from a distance than computer-based vision is, and Tesla is in fact relying on cameras for its forward object detection, plus a rudimentary distance-based radar. There is no chance (that's why I said "physically impossible") that whatever camera is mounted in the Tesla can reliably recognize an object (and tell that it's stopped!) at 130m. Of course the system needs to do the processing, make a decision, send a signal to the brake actuators and actually engage them. Let's be generous and add a full second to this - so to stop from that speed the Tesla would need to recognize a car, identify it as a hazard, and make a critical "all brakes at maximum strength" decision from 170m away. There's no chance.
>>Anyways, I’m fairly certain I’ve come to a stop on autopilot from at least 60mph (100kph).
Ok, but there will be an upper limit to this, and I'd love to know what it is. I know that Daimler's solution only guarantees full avoidance at deltas up to 50km/h, and "reduced" impact at higher deltas - it just doesn't see far enough. Tesla's technology is fundamentally very similar, so I'd love to know what they consider a reasonable distance for a full autonomous stop.
Teslas have crashed and obliterated their owners, even right here in California, on major highways on the way to their Silicon Valley office parks.
Alarmed engineers then took the exact same route and successfully reproduced the bug.
> Provincial motor vehicle acts don’t address self-driving vehicles
Right now, in Canada, we have to meet certain criteria and pass multiple tests to get our drivers license without which we cannot legally drive on public roads.
Until there is some kind of similar certification for self driving systems I don't understand why there would be any legal difference between doing this in a Tesla Model S and doing it in a '94 Miata.
The last time this hit HN the article was a bit more informative.
Two people in the car with both front seats reclined, “appearing to be asleep.”
The asleep part is almost certainly sensational drivel. This is a driver with a history of reckless driving who was screwing around with his friend.
But because he was screwing around in a Tesla it is therefore newsworthy.
The seats were reclined, but there is no evidence that the driver and passenger were both asleep. They “appeared asleep” because the seats were reclined.
This was undoubtedly a stupid prank. I’ll be glad for the day when kids can’t pull stupid pranks in their cars. The trade-off is that we’re unlikely to truly “own” our own cars at that point; they will operate essentially as private taxis.
Tesla sells an upgrade literally named “Full Self-Driving Capability”. The owner/driver is clearly liable but a strong argument could be made that Tesla should not be allowed to name things the way they do.
x (40?) years from now:
12-car accident triggered by manual mode
defense attorneys stated a poor decision due to fatigue, while prosecutors provided evidence that the driver had taken a 25-minute power nap, woke up and performed well in a class she was taking, and then proceeded to override the autonomous vehicle system in a manner which showed specific and malicious intent to cause the accident.
I have told my kids many times that I expect piloted cars to be a rarity in the future. Once reliable self-driving cars are common, the difficulty in obtaining a driver license will likely increase.
It’s entirely possible that self-driving cars could be realistic about the time we all get flying cars.
The reason we want self driving cars is the same reason we don't have flying cars.
Aren't flying cars and self-driving ground cars different problems, and thus likely to be solved at different times?
Self-driving is a sensing and intelligence problem, while flying seems like a mechanical, aero, and policy one.
One solution is to put a camera inside to monitor driver behaviour. Mazda is already selling cars with this technology.
Tesla makes you apply force to the steering wheel occasionally, but there are hacks that try to circumvent it.
Yes it's too easy to circumvent it. The camera solution is a lot safer.
I’m a huge fan of Elon Musk so take this with a grain of salt, but the company has totally misled and overhyped the self driving feature. It’s almost like a wink and a nod, giving people a false perception, and that’s wrong.
Under a new proposal from the UK‘s Law Commission, the “user in charge” of the vehicle would not be prosecuted for careless or dangerous driving, speeding, or breaking red lights when the car was in self-driving mode, the Telegraph reports.
The Law Commission instead suggests that responsibility should fall on the developer or manufacturer of the hardware that enables self-driving functions on the vehicle.
see https://thenextweb.com/shift/2020/12/18/autonomous-vehicle-m...
That's a pretty decent idea - if a manufacturer claims its vehicle is "self-driving" then they should be the ones accountable for any accidents in case the vehicle malfunctions. (Of course, this would make the vehicle costlier to own - manufacturers may demand regular checks and servicing of the vehicle, driving up the total cost of ownership.)
Here's a link to the documents. This is a consultation document.
https://s3-eu-west-2.amazonaws.com/lawcom-prod-storage-11jsx...
Does this mean we can get number plates which are issued to the manufacturer, and the same for every vehicle from that manufacturer, so that they're not able to be used for mass surveillance of individual vehicles anymore?
Tesla doesn't have self-driving though..
If you go to Tesla's UK page and try to place an order for a brand new Model 3, it uses both "Self-Driving" and "Autopilot" on the same page. How this company hasn't been fined to oblivion yet is beyond me:
The word 'autopilot' never meant reacting to anything happening around the vehicle, much less intelligently. Not a single autopilot can do more than keep the vehicle going in the direction you set (using a knob). Autonomous features are not called autopilot anywhere, ever.
'Full self driving capability' is a hardware capability, not a feature that you can turn on - it is there so you don't need to replace the hardware when the autopilot software is eventually replaced with fully autonomous driving software.
Edit: Downvoters, please explain. It is not okay to redefine words for your purposes.
Imagine I am a John Smith that doesn't know anything about Tesla and how the tech works. I want to buy their car because maybe I like the idea of an electric car and they seem to be well reviewed. I go through the configurator, then tick the box next to the option called "Full Self-Driving capability".
In your opinion, in front of the court of law, would you consider that John Smith has purchased a car that has something called "Full Self-Driving Capability", or not? In front of the same court of law - would you consider that Tesla advertises such a thing when presenting their cars?
>>Autonomous features are not called autopilot anywhere, ever.
You are absolutely correct, other car companies call this what it is - (adaptive) cruise control. Not autopilot.
>>'Full self driving capability' is a capability, not a feature that you can turn on
Well, but the website itself says that at the moment, as-is (not in the future!), the car will be able to autonomously drive from an on-ramp to an off-ramp on the motorway - so... that sounds like a feature to me? It even has a button for it! What do you mean it's not something you can turn on?
Adaptive cruise control is one component of autopilot. Mercedes has "drive pilot" (and ACC is one of the features included in that package) - why is that okay? To me that sounds way more autonomous than autopilot; it sounds like I don't need to drive (I still need to drive with autopilot, both in a car and in an aircraft).
https://en.wikipedia.org/wiki/Autopilot#Autopilot_for_ILS_la... Specifically in the case of fail-operational autopilots "[...] the approach, flare and landing can still be completed automatically." There is a whole organization responsible for defining these features of autopilots!
You can argue wikipedia isn't a valid source, but IMO that's not in good faith. So, 'autopilot' has a shifting meaning depending on aircraft capability.
Well yeah, that's in case of failure - the Tesla autopilot can also save you in case of failure.
Full Self Driving Capability suggests that the car has Full Capability to drive itself.
That it does so as safely and legally as a human driver is implied by the word "full".
No, you should read that website, and perhaps try to buy a Tesla and see the materials. It says clearly that it's the hardware and that the software is not there yet.
That is a ludicrous distinction, irrelevant to the end customer. I can't find that distinction on the website - mind sharing a link or screenshot?
Besides, how would you even establish that the hardware is self driving capable if the software isn't? You might need to change the hardware once you realise the software needs changes too?
Maybe you should read it again, because that's clearly not what it says. It lists several features which are available right now under the "full self-driving" moniker, and two which will arrive in the future. As it is right now, the system has more functions available than it is planned to have.
The screenshot I replied to says it exactly, near the bottom. It's not about number of bullet points, it's about what the bullet points say.
There is no "full self driving" button in a Tesla, and the Autopilot button will first produce a notice about the capabilities and it says you absolutely can not stop driving, you have to confirm it before it engages.
Correct. But distracted driving is happening today without allowing the driver to physically disconnect from the driving system. The manufacturers, eager to sell the new Advanced Driver Assistance Systems (ADAS), are saying that it would be safe for drivers to actually disconnect from the driving systems (hands off the steering wheel and feet away from the pedals), which is simply insane.
ADAS are designed as an additional safety level for ACTIVE driving, when the driver is 100% driving the car, and not for a potentially much more dangerous situation, when the driver (as encouraged by the manufacturer) is completely disconnected but supposedly "alert" at all times.
Meanwhile, stock go brrr and world go round.
Empirically, you’re absolutely right, except that it’s not wrong.
I’ve come to terms with it. How long are we going to bleat about morals while the world ignores it?
I’m asking genuinely, for what it’s worth. It’s one of the central questions I’ve faced. What are your morals worth? Why cling to them? It feels good to call out Tesla as immoral, but both legally and practically this seems to be mistaken.
Curious: Why would the Tesla be speeding, since it was in “autopilot” mode? Wouldn’t it know about the speed limits? Or did the driver set the cruise control first?
You can happily set the autopilot speed to above the speed limit. There are a few gotchas and it won’t always let you, but in general, it is fine. I normally set it to 10% higher than the posted speed limit as an example.
You intentionally wish to drive 10% higher than the speed limit... So, you desire to be in a car crash? And one at faster speeds, and therefore exponentially more deadly?
Everyone has to use public roads, not just you. You are endangering all other diligent road users with your behavior, not just yourself.
While I don't condone speeding (and acknowledge the relatively greater damage when accidents happen), I think you need to distinguish between that and unsafe driving which are two very different things.
You can be driving below speed limits very unsafely (and above them safely), and the constant focus on speeding, especially via automated mechanisms like cameras at the expense of traffic officers, means we risk a much lower standard of driving and much less safe roads.
Note, though, that my concern is not the crackdown on those who flout the law and/or break speed limits, but the misunderstanding of what unsafe driving is (as implied by your comment), and headline anti-speeding measures that are used to reduce costs while increasing risk.
I wonder, do you drive on highways? Because I suspect you don't often. I can't think of a time, sans traffic, that I was cruising under 72 on our local highways (speed limit 65). And I'm usually being passed.
Here in Seattle, in the absence of traffic forcing slower speeds, most people are going ~10mph over the speed limit. On the freeway, although the speed limit is 60mph most people will be going about 70mph. If you use Tesla's autopilot it will match the speed of the car in front of you, up to the maximum specified by the "speed limit". So I usually set the speed limit to something like 75mph so that the car follows the flow of the traffic around it, instead of forcing other drivers to swerve around me because I am going 10mph slower than everyone else.
Curious where you live that people don't speed. Here in Ohio you're expected to go 5 over at all times; people will tailgate and honk if you don't. 10% is lower than that at all but highway speeds.
Australia (as an example) has hard speed limits. If you go over the limit by even 1 km/h, you can get a ticket, and they put speed cameras in abandoned prop cars on roads to catch speeders. I learned this from a coworker who moved here (Chicago) from Melbourne, as he was remarking about driving on the highways here.
Most of the US drives faster than the speed limit. It is unsafe to drive the speed limit if everyone is going 10-15mph over the posted limit. Assuming you know the context you clearly don’t makes you look like a jerk. Just because some places (like Australia) strictly enforce speed limits doesn’t mean everyone everywhere does.
Teslas still cannot reliably read signs. It works in good weather conditions, but dirty signs or rainy conditions are still a problem for Tesla's allegedly just-around-the-corner full self-driving ability.