I suspect a lot of commenters will miss this part, so it's probably good to emphasize it.
> In the footwell of the car, the officers found what is known as a steering wheel weight, which is used to trick the security systems. It attaches to the steering wheel and gives the car the illusion that your hands are on the wheel.
This seems pretty clear-cut, and Autopilot's capabilities and other safeguards (or lack thereof) are irrelevant. The driver was circumventing safety features. It'd be pretty much the same story with adaptive cruise control and lane-keep assist.
At the end of the day, there's only so much you can do to stop someone from shooting themselves in the foot. Once you have to deal with drivers defeating the driver awareness detection, it's probably a lost cause.
Considering that human lives are at stake, shrugging at the issue is not a good solution. I think two solutions are 1) criminal liability for drivers who disable safety features in ways that endanger others (e.g. during traffic stops) and 2) adversarially hardening the systems against safety workarounds.
I'm sure GP is talking about this in the context of what is really Tesla's fault here, since the article is clearly trying to bait a reaction on that point. Not sure about criminal liability, but at the very least there should be a fine for using it.
I also wouldn't want the AI to decide I'm not touching the wheel because my hands are cold that day, or whatever other corner case said AI's training might end up mishandling.
Well, yeah. It should be a crime. In fact, I'm pretty sure it IS a crime in some places. That's the point though. You can do anything you want to try to bring the number as close to zero as possible, but there's going to be a non-zero amount of people who make unbelievably irresponsible, potentially deadly decisions. I'm pretty sure that's a fact of life. It's not exactly shrugging it off.
I seriously doubt the person in the vehicle was not aware that they could lose their license for this...
Tesla currently have visual driver monitoring in beta. Right now they are relying on steering torque sensing. People have worked out how to trick the torque sensor with weights. People will work out how to trick the visual monitoring too.
In pretty much every car on sale today, the driver's seat seatbelt sensors can be trivially defeated. Should Ford be blamed if people realise that you can latch the belt before sitting down and the car will operate as though your belt is on? Surely Ford should be responsible if they failed to correctly interlock the occupancy (weight) sensor and seatbelt sensor?
Is Ford selling cars with the tagline "the driver is doing nothing, they don't even need a seatbelt" and implementing seatbelt sensors such that a wink and nudge is all it takes for car owners to receive the seatbelt-less features they were sold in marketing?
Because right now, Tesla vehicles are being sold with the promise that the "driver is there only for legal reasons, the driver is doing nothing", with a fig leaf of driver monitoring that is incredibly easy to bypass, giving owners access to the driverless features they were sold via Tesla's marketing.
If you know your driver monitoring is faulty and lets drivers easily operate vehicles while asleep, then yes, you have a duty to address that as a vehicle manufacturer.
> People will work out how to trick the visual monitoring too.
I believe you are right. The DMS (Driver Monitoring System) needs to work for so many outliers: a damaged face, a prosthetic (porcelain) eye, a niqab, and various health-related conditions. It is more important that the system lets these people drive the car than it is to stop abuse.
Many car manufacturers test DMSs a lot for how easily they can be tricked when selecting a supplier, but so far they don't want to pay for better protection (to my knowledge). (Well, they don't want to pay for anything, to be honest, but they still want all the features.)
I believe there will be a cat-and-mouse situation for a while, until it gets too expensive to catch them, or too inconvenient/costly/stupid-looking for the driver to trick the system.
But I am sure it is harder to trick a DMS than Tesla's system today. And harder means fewer people will do it.
disclaimer: one of many engineers working on this.
(It is btw one of the most interesting problems I have encountered. Just think for a second of all the engineering expertise you need: biology, optics, electronics, medicine, high performance computing in C and assembly (you need to get it to run on a "toaster"), AI (a lot), etc. etc. Just collecting ground truth on drowsiness is a tricky topic by itself... if someone here has a boring job, I highly recommend thinking about this field)
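To give a flavour of what that ground-truth problem boils down to, here is a minimal sketch of one classic drowsiness signal, PERCLOS (the fraction of recent frames in which the eyes are mostly closed). This is purely illustrative and not any vendor's actual implementation: in a real DMS the per-frame eye aspect ratio would come from a face-landmark model, and all thresholds here are made up.

```python
import random
from collections import deque

EAR_CLOSED = 0.20        # below this, treat the eye as "closed" (illustrative)
WINDOW_FRAMES = 30 * 60  # ~60 s of history at 30 fps
PERCLOS_ALARM = 0.15     # flag drowsiness if eyes are closed >15% of the window

class DrowsinessEstimator:
    def __init__(self):
        self.closed = deque(maxlen=WINDOW_FRAMES)

    def update(self, ear: float) -> bool:
        """Feed one frame's eye aspect ratio; return True once the window looks drowsy."""
        self.closed.append(ear < EAR_CLOSED)
        if len(self.closed) < self.closed.maxlen:
            return False                      # still warming up
        perclos = sum(self.closed) / len(self.closed)
        return perclos > PERCLOS_ALARM

if __name__ == "__main__":
    est = DrowsinessEstimator()
    # Simulated EAR stream: eyes closed roughly 20% of the time (a drowsy driver).
    for frame in range(WINDOW_FRAMES * 2):
        ear = 0.10 if random.random() < 0.20 else 0.30
        if est.update(ear):
            print(f"drowsiness flagged at frame {frame}")
            break
```

Even this toy version hints at why the real problem is hard: the thresholds depend on optics, lighting, eye shape and the health conditions mentioned above, and the whole thing has to run in real time on very modest hardware.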
If a known exploit exists against the safety features, the car should not pass inspection and the feature should be grounded for however long it takes the company to fix it. This beta bullshit, with actual living beings as part of the test environment, should be stopped.
I can see how, from your viewpoint, we could end up in a truly suffocating biosecurity state, with the government tasked with preventing every possible danger.
People can trivially disable the seatbelt safety features and the world doesn't come crashing down, so it would seem, by your logic, that they do not actually need to do more of this...
I think that as soon as it's possible, "autopilot" that relies on the driver should be banned outright, and we should stick to a binary: full self-driving vs. manual driving. The in-betweens do not work well; humans adapt to ignoring the road too quickly to be a useful safety override for a faulty system.
There are countless illegal things people can do while sitting in the driver’s seat of a moving vehicle. Many of them are dangerous for others. This isn’t any different.
So what about Nissan ProPilot, Honda Sensing, Mercedes-Benz Active Steering Assist and other systems? Your super simple line would include half the cars on sale today.
> Autopilot's capabilities and other safeguards (or lack thereof)
The car had an incapacitated driver and no one got hurt! Surely that particular capacity is very relevant here. If all incapacitated drivers had autopilots that drove safely, the world would be a safer place, right?
I really don't understand the resistance to this feature.
False premise. The driver might not have been on the road if he couldn't rely on autopilot.
It's clear from the steering wheel weight what his intentions were.
You cannot compare it just with the worst case scenario. The driver, knowing his condition as demonstrated by the weight, could have taken a different form of transport, or none at all.
So you need to compare the pretty high chance of an unsupervised Tesla crashing into a highway divider or parked lorry with the safety of the other modes of travel.
Instead you're picking an anecdote and running wild with it.
While I think this is a solid point in general, if someone has installed weights to trick autopilot into thinking they're awake and holding the wheel, they probably know that they're supposed to be awake and holding the wheel.
Tesla didn't invent DUI. Consider it this way: if your friend were to stumble out of the pub drunk and drive home, would you rather they did it in a Civic or a Model 3?
Not the point I’m making. If you misunderstand the capabilities of a product named “Autopilot” you might choose that over a taxi in a way you wouldn’t a totally manual drive. So the scenario would be “your friend stumbles out drunk and drives Autopilot, drives manually or takes a taxi” I’d rather they took a taxi. But if I were similarly mistaken about what “Autopilot” does I might not stop him driving home when I might otherwise.
I question the claim that there are people who are misled. Anyone who has used Autopilot for more than a few minutes will be left in no doubt that it is nothing more than adaptive cruise control with lane centering. Its behaviour describes itself.
The problem in this instance wasn't the driver misunderstanding the capabilities of Autopilot; it was the driver being irresponsible with drugs, fatigue and the installation of a safety defeat device.
Only solution is to regulate the companies. Musk is using his army of cultish followers to do QA with other people's lives. In my opinion this should be highly illegal, and prosecuted.
Imagine your kid gets killed by a Tesla influencer trying to prove that autopilot is everything Elon promised them.
[disclaimer] I worked for a while at Cruise, and that company cares deeply about security and works hard with the government to move regulation forward as we move the technology forward.
> Only solution is to regulate the companies. Musk is using his army of cultish followers to do QA with other people's lives. In my opinion this should be highly illegal, and prosecuted.
In this case, relevant to this news story, what do you think should be regulated? I agree driver assistance systems, especially ones like Autopilot that are clearly not near human levels of driving proficiency, should prevent operation from drivers that are not aware. It seems such measures were taken, but they were defeated intentionally.
> Imagine your kid gets killed by a Tesla influencer trying to prove that autopilot is everything Elon promised them.
I suppose that would make me rather upset, but I'm struggling to make the connection to this news story. It doesn't seem to talk about the driver in question or their intent. I'd have guessed it was less about being a Tesla fan and more about being able to sleep while driving.
It is exactly the story that the headline makes out. The driver fell asleep. The autopilot didn't respond to signals from the police to pull over, because it doesn't have that feature. But since it's on the road and advertised as self-driving, it really should have this feature, because drivers will try to get around whatever protections manufacturers put in place and fall asleep, whether because they're on drugs, have a medical condition or are just arseholes.
Why would Autopilot, the normal driver assist that all of the cars have, respond to police? Nowhere is that advertised as a standard feature. Now, the car should have eventually stopped after multiple failed driver inputs. But the part about ignoring police is a bit silly to hit on.
Edit: Newer software versions do detect wheel weights as quite a few people have reported on the various Tesla forums.
Edit 2: They didn't crash. That's the real story here.
Shouldn't FSD understand how to drive properly when emergency vehicles are approaching from behind? Otherwise an ambulance could get stuck behind a Tesla blocking the road. Or was this just adaptive cruise control?
This doesn't say what mode the driver was using. FSD arguably ought to understand this, but if he was just using "Autopilot" (aka lane keeping + adaptive cruise control) with a steering wheel weight to disable the "are you there?" checks, I wouldn't expect that to need or understand anything about emergency vehicles.
FSD isn't Autopilot. Autopilot does detect emergency lights (in the US at least) but it only slows down.
I've wondered why the car doesn't have detection for approaching cars behind to get out of the way but people say the rear camera angles are not good enough - I think that's a terrible excuse.
In most jurisdictions, you have a legal obligation to pull over when emergency vehicles are operating their sirens. Perhaps they need to pull you over, or perhaps perhaps they just need you out of the way so they can get to some other emergency. If you are selling a vehicle with autopilot/self-driving, then it needs to respond to common driving situations like emergency vehicles.
I am well aware the driver was on drugs and deliberately defeated the safety mechanism. That is not relevant. If s/he had a heart attack or stroke and was slumped over the wheel, the vehicle should still be able to stop in response to emergency vehicles. Sirens and emergency lights are easy to detect, by design.
> If s/he had a heart attack or stroke and was slumped over the wheel, the vehicle should still be able to stop in response to emergency vehicles
It should? Does your car do this? Do any cars do this? Cars don't do this.
Traditionally, cars with passed-out drivers crash and kill people. Teslas don't. Are you saying you'd prefer they crash and kill people if they can't detect emergency commands from other vehicles?
I think you understand full well that this was offered in the context of self-driving cars, and that you did not in fact need it spelled out for you again.
It should be quite easy to stop a Tesla on Autopilot. You just drive in front of it and slow down, and the Tesla behind you slows down automatically, just like in a traffic jam. Autopilot does not switch lanes on its own.
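For intuition on why that works, here is a toy constant-time-gap cruise controller in Python. It is not Tesla's algorithm; the gains, the 1.5 s headway and the braking limits are all made-up illustrative numbers. It just shows that a follower regulating gap and relative speed will come to a stop behind a lead car that brakes gently.

```python
# Toy constant-time-gap follower, illustrating why a car under adaptive
# cruise control slows to a stop when the car in front of it does.
# All gains and limits below are illustrative, not any real vehicle's values.

DT = 0.1          # s, simulation step
TIME_GAP = 1.5    # s, desired headway
STANDSTILL = 5.0  # m, gap to keep when stopped
K_GAP = 0.3       # gain on spacing error
K_SPEED = 0.8     # gain on relative speed

def acc_accel(gap: float, own_v: float, lead_v: float) -> float:
    """Acceleration command (m/s^2) for the following car."""
    desired_gap = STANDSTILL + TIME_GAP * own_v
    accel = K_GAP * (gap - desired_gap) + K_SPEED * (lead_v - own_v)
    return max(min(accel, 2.0), -5.0)   # comfort / braking limits

# Lead car (say, a police car that pulled in front) brakes gently to a stop.
lead_v, own_v, gap = 25.0, 25.0, 50.0   # m/s, m/s, m
for _ in range(1200):                   # 120 s of simulation
    lead_v = max(lead_v - 1.0 * DT, 0.0)            # lead decelerates at 1 m/s^2
    own_v = max(own_v + acc_accel(gap, own_v, lead_v) * DT, 0.0)
    gap += (lead_v - own_v) * DT

print(f"follower speed: {own_v:.2f} m/s, gap: {gap:.1f} m")  # both end stopped, ~5 m apart
```

Run it and the follower ends up stationary a few metres behind the lead car, which is exactly the traffic-jam behaviour described above.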
2. Promises are best-effort intentions made by autonomous agents, and made known to an audience
3. Assessment of how agents keep promises is local to the observer
4. Only autonomous agents can make promises. If an agent is not capable of making promises, it is not autonomous.
5. Non-autonomous machines and systems are proxies for humans making and keeping promises with each other
6. Tesla’s driver assist technology is a proxy for promise-making humans
7. Tesla, the company, composed of humans, makes certain promises on the safety properties of their cars and driver assist
8. Human drivers are making promises to other drivers, as well as the governing body, in regards to operating the vehicle (with penalties defined by obligations defined by the governing body)
9. Since the driver assist tech is not fully autonomous, it is a proxy for the human driver’s promises, as well as Tesla
10. Therefore, a human driver falling asleep is breaking his promises regarding the operation of the vehicle
11. But Tesla has also made promises on the _autonomy_ of the vehicle.
12. It isn’t so much that the promise (the intention) is false, but that Tesla has not been keeping its promises regarding safety
13. Contrast this with Waymo, who spent a lot of effort to make and keep promises about safety, including documenting a framework for working with first responders.
Promise Theory[0] is an interesting way to analyze how autonomous agents make and keep promises, and it's worth thinking about how it applies to Tesla's driver assist technology. While the driver assist is not fully autonomous, it is still a proxy for the human driver's promises, as well as Tesla's. For example, when a human driver falls asleep, they are breaking the promises they have made regarding the operation of the vehicle. Contrast this with Waymo, who have taken extra steps to make and keep promises about safety, including having a framework for working with first responders.
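To make the proxy idea (points 5-9 above) a bit more concrete, here is a toy sketch in Python. The class names and structure are my own illustration, not anything from Burgess's formalism.

```python
# Toy sketch of the proxy idea: a non-autonomous machine makes no promises
# of its own, it only carries the promises of the humans behind it.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Promise:
    promiser: str   # the autonomous agent making the promise
    promisee: str   # the audience the promise is made to
    body: str       # the best-effort intention being promised

@dataclass
class Agent:
    name: str
    promises: List[Promise] = field(default_factory=list)

    def promise(self, to: str, body: str) -> Promise:
        p = Promise(self.name, to, body)
        self.promises.append(p)
        return p

@dataclass
class Proxy:
    """Non-autonomous: it cannot promise anything itself."""
    name: str
    principals: List[Agent]

    def carried_promises(self) -> List[Promise]:
        return [p for a in self.principals for p in a.promises]

tesla = Agent("Tesla")
driver = Agent("driver")
tesla.promise("the public", "driver assist keeps the car safe when supervised")
driver.promise("other road users", "I stay awake and supervise the system")

autopilot = Proxy("Autopilot", principals=[tesla, driver])

# When the driver falls asleep, one of the carried promises is broken; the
# proxy itself promised nothing, so the assessment falls on the humans behind it.
for p in autopilot.carried_promises():
    print(f"{p.promiser} -> {p.promisee}: {p.body}")
```

The point of the sketch is simply that every promise attached to the car traces back to either Tesla or the driver, which is where the assessment of a broken promise has to land.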
I mean, police, horns and that guy is still sleeping? Maybe some health issue or what? How can someone have such a deep sleep in a car?
Anyways, from the news it appears that the car didn't crash. So this is actually positive news. What would have happened if that guy had fallen asleep (I mean, really deep sleep) in a basic car....
> When the police wanted to stop and check the man, his Tesla just kept driving. "He didn't respond to stop signals or repeated horns from the officers," police said. The driver was leaning back in the seat with his eyes closed and his hands were not on the steering wheel. "After about 15 minutes, the man finally woke up and followed the instructions of the police. During the check, he showed typical signs of drug impairment," the statement continues.
Well, I imagine passing out and driving usually gets people in a more dangerous situation such as a car accident.
Wow, a drug user fell asleep at the wheel and the car saved his life and probably other road users by keeping the vehicle operating safely until he woke up? Modern technology is remarkable!
My 2018 Subaru Forester would emit a beep-beep-beep, switch back to manual driving and drive off the road, probably crashing into other cars as it does so. If I fell asleep in a drug haze, I would likely just die, even in a modern Subaru.
By the way, I am not sure exactly what happened (I do not speak German), but if the police are worried that a sleeping driver is using Autopilot, they can pull a police car in front of it and gently slow down to a stop. The Autopilot should slow down as well.
So this guy may have been high, but I've heard of other drivers falling asleep and relying on self driving features.
It is a long, long time before my body would even physically allow me to fall asleep when I'm in the driver's seat. Maybe it's just me, I just wouldn't be able to even if I tried.
Happened to me once, after a really long day, on a German autobahn of all places. Luckily, it was at night without any traffic. I drifted all the way from the left lane into the barrier beside the emergency lane on the right side, skipping three lanes in total. Scary as fuck; if it wasn't for the barrier I would have hit the trees. Even luckier, there wasn't anything else there at that moment, as I would have hit anything and everything. Ever since, I am super conscious when I feel tired and sit behind the wheel.
Wow, you're lucky! It happened to my mother as well, driving home from a long day when she was directing a performing arts center here in NY -- so I understand it isn't always a choice, and this was before Uber.
She drifted and took out the guardrail on the old Tappan Zee bridge (around 11PM on a weekday, little traffic as well). Got a bill for it from the state -- I'm just glad she didn't make it all the way through, would have been quite a plunge (and I wouldn't exist).
“Even luckier, there wasn't anything else there at that moment, as I would have hit anything and everything. Ever since, I am super conscious when I feel tired and sit behind the wheel.”
Payments: requires KYC
Retail: requires the delivery of a product and some truth to the product
There are performance metrics put out for vehicles, and they are generally agreed upon after accidents have already occurred, meaning people were already injured or killed. If this were a Toyota or a different make of car, the statement would be the same.
"The driver had apparently tricked the car's safety systems. … In the footwell of the car, the officers found what is known as a steering wheel weight … A preliminary investigation into the criminal offense of endangering road traffic was initiated against the Tesla driver."
Sounds more like the driver defeated the safety systems which stop the vehicle if the driver is incapacitated.