Tesla Autopilot ‘easily’ tricked into working without anyone in the driver seat
I own this exact tractor
https://commons.wikimedia.org/wiki/File:New_Holland_TC30_HST...
and even it has a sensor in the driver's seat that will stop it if there is less than 100 lbs in the seat. (When my son weighed 80 lbs he couldn't drive it unless I put two 10 lb weightlifting plates in a backpack with him.)
The purpose there is so that the tractor doesn't drag or crush you if you fall out of the seat, but that kind of sensor is also commonly used in the passenger seat of cars to scale the force of the passenger side airbag to the size of the occupant.
AFAIK, Teslas do have weight sensors in the driver's seat. So the question is: why isn't that used in this scenario? Is it a bug, or is there some actual reason not to continuously check driver weight (i.e., presence)?
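For illustration, a continuous presence interlock like the ones described above could look something like this. This is a hypothetical sketch; the threshold, signal names, and function are my own assumptions, not Tesla's actual logic:

```python
# Hypothetical driver-presence interlock, loosely modeled on the tractor's
# 100 lb seat switch described above. All names and thresholds are
# illustrative assumptions.

DRIVER_SEAT_MIN_LBS = 100  # presence threshold, as on the tractor

def autopilot_may_stay_engaged(seat_weight_lbs: float,
                               seat_belt_buckled: bool,
                               torque_on_wheel: bool) -> bool:
    """Stay engaged only if every presence signal agrees a driver is seated."""
    return (seat_weight_lbs >= DRIVER_SEAT_MIN_LBS
            and seat_belt_buckled
            and torque_on_wheel)

# A weighted chain plus a buckled belt defeats the wheel-torque and belt
# checks, but an empty seat would still fail the weight check:
empty_seat = autopilot_may_stay_engaged(0, True, True)      # False
seated_driver = autopilot_may_stay_engaged(160, True, True)  # True
```

The point of AND-ing several independent signals is that defeating any one of them (the chain-on-the-wheel trick) isn't enough; an attacker has to defeat all of them at once.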
I actually find these nanny features can be annoying. My car will beep continuously if I put something on the front passenger seat but don't buckle the seat belt, because it thinks a passenger is riding unbelted. I would rather it say something once and then get out of the way, letting me use my car how I want.
For my car, you can turn on the ignition and buckle/unbuckle the driver's belt 3 times in a row. Doing that turns off the constant alarm; it only chimes once when you first start and shows the light on the panel. Maybe your car has something similar.
I am torn by this. On the one hand a weighted chain and buckled seat belt is a simple hack. On the other hand it is clear that they were intentionally attempting to get around Tesla's safeguards. I fail to see how this is the company's fault.
What could Tesla possibly do to provide more safeguards? Measure resistance on the steering wheel to prove it is flesh on there? Driver-facing camera? Weight sensor in the seat? Each of these has their own issues including defeatability and privacy invasions.
Anything can be defeated eventually. However, any attempt to defeat a safeguard should automatically shift liability onto the person making that attempt. Example: it is not Samsung's fault if I intentionally disable safeguards and overcharge my battery, causing a fire or explosion.
> I fail to see how this is the company's fault.
Because other companies do it better. Tesla likes to publicly position its driver assistance as being so good that they refer to it as "Full Self Driving", and yet the system doesn't include a number of basic safety measures. That's part of the reason why they tell regulators that "Full Self Driving" is only a level 2 system:
https://www.thedrive.com/tech/39647/tesla-admits-current-ful...
> Measure resistance on the steering wheel to prove it is flesh on there? Driver-facing camera? Weight sensor in the seat?
Yep.
> Each of these has their own issues including defeatability and privacy invasions.
Tesla cars in general are privacy invasions. For example, a recent car crash resulted in Musk tweeting about what Tesla's logs supposedly reveal about the car before it crashed. Tesla is quite happy to rummage through your data and use it to protect themselves. In this case it earned Tesla a search warrant:
https://www.thedrive.com/tech/40250/elon-musk-denies-autopil...
> Tesla likes to publicly position its driver assistance as being so good that they refer to it as "Full Self Driving"
No, Tesla very clearly says that full self driving is a future feature and not one that is available today (https://www.tesla.com/autopilot). All the page says is that the hardware needed to enable it is present. There's also a disclaimer:
> Current Autopilot features require active driver supervision and do not make the vehicle autonomous.
They spell this out even more clearly elsewhere in their documentation. The FSD Beta release notes explicitly tell users that it can do the wrong thing at the worst time, that they need to pay extra attention, and that they shouldn’t get complacent (https://www.newsweek.com/tesla-full-self-driving-beta-releas...).
I do agree Tesla’s practices seem like they’re normalizing massive privacy invasions. But I really don’t think the company is at fault if irresponsible drivers use their vehicle in an incorrect manner, which they’ve been abundantly warned against.
> No, Tesla very clearly says that full self driving is a future feature and not one that is available today
But this is the sort of thing Tesla says with a megaphone:
https://www.theverge.com/2019/4/22/18510828/tesla-elon-musk-...
Where are all those robotaxis?
The fine print doesn't excuse their grubby marketing, and it's little wonder people believe the cars to be capable of more than they are. Just using the phrase "full self driving" for a system not capable of it is bad enough.
If I wanted to trick GM supercruise, I could just tape a picture of a face onto the driver's headrest and put a sandbag on the seat, no?
I am fairly certain I could "trick" my 2006 Toyota Camry into driving without anyone in the driver seat.
Musk said the logs show that the car wasn't even in Autopilot mode. The driver may have been trying to escape the fire, which would explain why he wasn't found in the driver's seat. But this is just as much speculation as assuming Autopilot was tricked, so we should wait and see what the investigation shows.
Govt has witnesses saying the dead passengers were showing off the Autopilot.
I see no reason to believe Musk, a serial liar with a massive incentive to lie here.
He even hedged by saying "data recovered so far" doesn't show Autopilot engaged.
Did you read my post? I didn't say I believe Musk. I said both positions are equally useless speculation and we should simply wait for the investigation's result.
Either way, one thing is rather certain: stupid humans were involved and are most likely the sole reason for the accident, regardless of which story turns out to be true.
No one thinks Tesla murdered these men. They're worried that Tesla 1) lied to them about the capabilities of their car, and 2) failed to do very easy things to keep their car from endangering other people on the road.
Maybe it was on beforehand and shut off, causing the accident because there was no one to take back control?
Both this article and the Ars Technica article (https://arstechnica.com/cars/2021/04/consumer-reports-shows-...) are prime examples of misinformation and fake news. Unfortunately they won’t be flagged as such by Twitter or Facebook, even though they very regularly flag other articles for having misleading headlines or missing context.
The Consumer Reports people went out of their way to bypass Tesla’s existing safety features, by placing a fake weight on the wheel (a heavy chain) and by buckling the seat belt without anyone sitting there. So what - it’s a contrived experiment.
Lots of people on social media seem to hate Elon Musk and are using this as some sort of indictment of him or Tesla. I don't see anything particularly interesting or outrageous about this. People can misuse cars already, and they will come up with clever workarounds in the future as well. It's not exclusive to Tesla. It's not even exclusive to cars. You could misuse a kitchen knife as a toothbrush, but that doesn't mean Cutco needs to be regulated.
> Both this article and the Ars Technica article are prime examples of misinformation and fake news. Unfortunately they won’t be flagged as such
The articles won't be flagged as such because they are neither misinformation nor fake news.
The Ars headline is: “Consumer Reports shows Tesla Autopilot works with no one in the driver’s seat”
That is misinformation. Facebook and Twitter have flagged articles for much less, because they felt the headlines were misleading or incomplete.
So Tesla's Autopilot does not work with no one in the driver's seat?
It isn't, and they haven't. Reading past the headline is a basic skill of news reading. Disliking a headline doesn't make the article misinformation.
You know full well many people will share this article, never read past the headline, and repeat it. The sentiment held in the headline will be the one that persists and the damage to Musk and Tesla’s reputation will be done, even though the truth is that you would need to artificially rig up the car to defeat its safety measures.
> You know full well many people will share this article, never read past the headline, and repeat it
People not reading an article does not make the article misinformation or fake news.
> the damage to Musk and Tesla’s reputation will be done
I wouldn't worry about that. Musk does enough reputational damage all by himself.