From what I can tell, the car behind the Tesla didn't crash into the Tesla but stopped before it. And so did the car behind that one. It is the cars after that that crashed into each other.
While automatic cars doing random things is certainly problematic, clearly the cause of the crash here isn't the tesla, it is other cars not respecting minimum safety distances and being unable to stop when there is a traffic jam ahead.
In my opinion 90% of the reason we don’t have more accidents is because people are so predictable. Watch dangerous drivers weave through traffic - it’s only possible because everyone follows spoken and unspoken rules. Randomly stopping is not some “you need to be prepared, it’s your fault” moment.
A car switching into an empty lane and then stopping hard, even more so.
If a human driver did it deliberately, as the Tesla did (since it was designed to pull over and stop), then I would consider it a criminal-level traffic violation on the part of the stopping car.
I had an accident exactly like that. The car in front of me fully stopped all of a sudden as I was accelerating. This was right past the lights. No traffic or anything in front of him. I don't know why he stopped. My insurance told me he was 100% at fault.
Here in the UK, that happened to me on a roundabout. Elderly driver in front pulled out and then, for no visible reason, slammed on the brakes. I ran into the back of her.
It was deemed to be 100% my fault. Here, if you run into the back of the car in front, it is always your fault. No exceptions.
It makes sense: you should always leave enough room to stop, so I couldn't really complain. But, if everybody did that on that particular busy roundabout, it would result in gridlock.
I've heard a story from a friend who was in an accident like that. The police officer who did the paperwork smiled and said "It's the most common type of accident - people just stop paying attention to the car in front that started moving, and develop a sort of blind spot to it".
I've been driving for several years, and can say my brain relies heavily on this sort of predictability: "that car has a free intersection in front of it, it will move forward".
I crashed at an intersection into the driver in front of me. We started moving, and I was looking to see whether there was traffic coming from the left when the driver in front of me stopped for no reason. Still my fault, but in some situations, with multiple dangers, you rely heavily on people behaving predictably.
In a country north of the UK the law was once the same. People with worn-down cars would find the perfect opportunity to slam on their brakes, intentionally get hit from behind, and claim insurance.
The other day I was driving behind a car on a country road when the driver did an emergency stop to avoid a rabbit in the road. I avoided hitting him, stopping about 2m behind, because I was careful to leave enough space. If I had hit him, I'm sure the insurance companies would have placed the blame on me and it would have been my insurance that paid out. But ultimately it would have been his "fault" for doing something utterly unpredictable and dangerously stupid.
If it had been a large deer, of course he should have stopped, at that point it's the safety of the people inside the car.
In law, for insurance purposes, it needs to be clear cut: the person behind is almost always at fault. But that doesn't mean they are the cause of the accident in all cases. There is nuance to these things, and part of that is that braking for a rabbit, or using Level 2 automation, increases the chance of an accident happening on the road.
If the way you drive increases the risk of someone driving into the back of you, even if they haven't left enough space, you are at fault in my mind.
This reasoning makes very little sense to me. Why drive assuming that it will be a rabbit most of the time and not a child or a deer or whatever you deem worthy of stopping for?
If the possibility exists that a driver in front of you may stop suddenly why not always leave a minimum amount of space for a safe stop considering there is virtually no downside to doing so?
The reasoning doesn't make sense because it is applied retroactively. The parent poster doesn't want to take responsibility for their unsafe driving practice and tries to place the blame on the other driver.
Cars suddenly stopping in front of you isn't some unpredictable rare event. You are told this as part of the licensing procedure, and this is the reason the minimum safety distance exists. The solution is obvious and known to any licensed driver. As such, it is unreasonable and unacceptable not to follow the minimum safety distance.
That's not my point. You should, and I do, always leave enough space to stop, but not everyone does.
Being a safe driver means being aware that other people, including the people behind you, may not be leaving enough space. Suddenly stopping for a rabbit, from a speed of 50mph, is f-ing stupid and massively increases the risk of a serious accident on the road.
Using Level 2 automation that can randomly stop for no reason is not safe driving; you are increasing the chances that someone who is too close drives into the back of you.
Just because someone else is driving unsafely doesn't absolve you of the responsibility of being aware of that and driving safely yourself.
You should be able to stop at any time, for any reason, and not worry about the car behind crashing into you. They should leave enough space. A rabbit is a very good reason. Maybe it's not a rabbit; maybe it's a rabbit-shaped rock that would cause serious damage to your vehicle, causing you to leave the road and die.
As long as there are valid reasons to do an emergency stop, the actual reason should be irrelevant for the person driving behind. You should always assume it is a child chasing their ball or something and it will be your car pushing the car that managed to properly stop into that child if you rear-end it. There just isn't a good reason to put yourself in a situation like that and it is perfectly in your power to avoid it.
I cycle a lot in an area that is quite touristy. There is a lot of shared infrastructure with pedestrians, rollerbladers, scooters and skateboarders. You get very good at spotting people who don't know what they're doing and are potentially dangerous, usually because they're oblivious to their surroundings.
For example, pedestrians crossing a bike path. Because a lot of people clearly don’t walk often they will just walk out without looking. People aware of their surroundings will look both ways. As soon as you see someone do that you can pass close to them without spooking them because you know they’re aware.
My point here is a ton of this comes down to acting predictably. Even a simple act like looking at someone will alleviate a ton of uncertainty.
The barriers to fully autonomous self-driving are huge and not necessarily technical: acting predictably, being able to explain actions, drivers driving differently because another vehicle is automated, and cultural differences.
Suddenly stopping at high speed is generally considered (at least in the UK) dangerous driving and will get you prosecuted and probably mean you lose your license at the very least. If it results in a death then you can get a life sentence in prison for it.
Intentionally stopping inside a tunnel is a pretty clear cut case of dangerous driving over here.
However there can be many reasons why a car has to stop suddenly: for example, a child might suddenly cross the road, or the driver might feel he or she is having a heart attack.
My driving school teacher used to say: always remember that the car ahead of you could suddenly stop at any time for a reason that you might not know.
This happened on a highway. The Tesla switched lanes into the path of traffic moving faster than itself, then immediately started braking for no reason. On top of this, it all happened at a tunnel entrance.
Yes, you should remember that vehicles can suddenly stop. This means keeping a safe distance from the vehicle in front of you, and looking beyond that vehicle to anticipate what is going to happen. However, a vehicle illegally throwing itself in front of you and doing basically an emergency stop because it just feels like it is not something a reasonable driver is expected to predict.
If you look at the video, you can see that the first vehicle is almost at a full stop when it hits the Tesla - its driver performed as expected. The second vehicle comes to a full stop before hitting the first one, so that's also going quite well. The third vehicle (a rather large truck) doesn't brake but instead swerves into the other lane, almost hitting another car. This dooms vehicle number four, which suddenly finds itself mere feet away from a stopped vehicle. Five through seven are tailgating and never see it coming for the same reason. Number eight is able to stop in time.
The speed limit here is 50mph. Assuming they were driving the speed limit, the typical stopping distance would be 174 feet. I can assure you, nobody keeps a distance of 174 feet between them and the car in front.
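As a rough sanity check on that 174-foot figure, here is a back-of-the-envelope calculation. The 0.68 s reaction time and 0.67 g deceleration are my own assumptions, chosen to approximate the UK Highway Code's "typical stopping distances" table, not values from the article:

```python
def stopping_distance_ft(speed_mph, reaction_s=0.68, decel_g=0.67):
    """Approximate total stopping distance: thinking distance + braking distance."""
    FT_PER_S_PER_MPH = 1.467              # 1 mph is about 1.467 ft/s
    G = 32.17                             # gravitational acceleration, ft/s^2
    v = speed_mph * FT_PER_S_PER_MPH      # speed in ft/s
    thinking = v * reaction_s             # distance covered before braking starts
    braking = v ** 2 / (2 * decel_g * G)  # constant-deceleration kinematics: v^2 / (2a)
    return thinking + braking

print(round(stopping_distance_ft(50)))  # ~175 ft, close to the 174 ft quoted above
```

Note that the braking term grows with the square of speed, which is why small speed differences matter so much in a 70+ mph tunnel.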
In this case the Tesla stopped because of what you might charitably call a breakdown. This would probably get you out of the dangerous driving conviction but if it happened multiple times would probably result in a recall on the vehicles to fix or disable the faulty part.
The longest delay on a motorway I've ever had (get out of the car, walk the dog, hang around for a couple of hours in 32°C heat) was a case of the driver of a small car having a heart attack, slamming on the brakes, and getting rear-ended.
From what I gather from the article the car didn't suddenly stop.
I still suspect it would be classed as dangerous driving, as there wasn't a need to stop, nor did they find a safe place to stop.
If it did suddenly brake though and it was the car, that seems like something Tesla should be liable for. The time taken for any driver, when a car starts automatically braking, to assess the situation and override isn't going to be enough to avoid the dangerous situation.
Not only does it stop suddenly: it applies the brakes, signals into a new lane, changes into that lane, and then brakes hard. Actually, if you watch the video frame by frame, it starts to initiate the left-hand movement for the lane change before it even signals. If a driver did this to me, sure, I'd hit them.
I think they're referring to the offence "Death by dangerous driving", and it now can carry a life sentence[1], but you would really have to be doing something exceptionally bad to get a life sentence: it would have to be deliberate, likely a repeat offence, and have other aggravating factors. Real sentencing in the UK is very lenient: famously, an American woman killed a teenage motorcyclist when she was driving on the wrong side of the road, claimed diplomatic immunity and fled the country; she eventually returned to the UK to be sentenced to... 8 months in prison, suspended.
That's not the way the law works in the US. People are legally responsible for hitting something in front of them, but not for getting rear-ended after slamming on the brakes. This is partly why the Tesla system is so trigger-happy about braking. Brad Templeton goes into the design/legality issues around the accident in some detail here:
Do you have a citation for that? I have always understood the rules to be that you should always remember the car in front of you can make an emergency stop at any time for any reason, and it is always your fault if you hit them (unless they just moved into your lane).
Conversely if you need to stop suddenly, e.g. something has crossed the road in front of you (or you think it has), you don't worry about the vehicles behind, you just stop.
If you crash into the car in front of you in your lane, the responsibility is yours both morally and legally. The only car which has a valid excuse is the car immediately following the Tesla, since the Tesla inserted itself into its lane and then immediately stopped, not allowing it to build up a safety distance.
The driver's account, as quoted here, doesn't even make sense. In the first sentence it says he was driving along in lane 1, the car slowed, and he "felt an impact."
The next part is contradictory, citing a different lane (2), a different cruising speed, and then a lane change... and no impact. WTH?
> clearly the cause of the crash here isn't the tesla, it is other cars not respecting minimum safety distances
The cause of the crash is the Tesla. You are not allowed to stop on bridges, in tunnels, and in several other places. That the crashes start with the Nth car and not the 1st is normal: the reaction time of the first car eats into the reaction time of the second, and so on, until there's no more time to stop. Understand that cars further back do not see the car in front of them applying the brakes and slowing down; they see a car moving at normal speed instantly crash. The minimum safety distance is not as big as reaction time plus stopping to zero - that would be huge.
Not sure I understand your reasoning. The further you are from the first car, the more warning you get about what's going on, not the other way round. I typically watch not just the car in front of me but 1 or 2 cars ahead. If I see those cars braking, I start braking. It's the car immediately following the tesla that got the least warning.
It is not always possible to see 1 or 2 cars ahead, that can not be the standard for who is at fault.
edit: to make it clear, we are debating whether the Tesla is at fault, not whether the other could have avoided it. Creating a dangerous situation still puts you at fault, you can not be allowed to do so at any time on the grounds that everybody else should avoid you.
The drivers who crashed were obviously wrong, but stepping on the brakes in a 70+mph tunnel is incredibly dangerous. The Tesla created this situation, and holds most of the blame.
I was thinking the same. One earlier article about this had a rather inflammatory photo of some twisted-up cars next to an empty baby buggy. But if you watch the video, that’s clearly where some human drivers ploughed straight into the back of the pile-up.
The Tesla autopilot failure is really bad, for sure, but those human drivers should be banned for life. There’s no excuse for ramming into a traffic jam because you weren’t paying attention.
Edit: occurs to me that possibly I’m being overly harsh here. Is there something about the dynamics of traffic that puts the cars three or four slots back at greater risk when somebody unexpectedly stops? I would assume that the immediately following car is at the greatest risk, but after that everybody is at successively less risk as they should all be slowing and should all see each others’ brake lights.
The first driver needs some reaction time, so he'll start braking after the Tesla starts braking. Which means, assuming they were at the same speed initially, that the 1st car will have to brake a bit harder than the Tesla.
Then the second car will be in the same situation: it will have to brake a bit harder than the first. 5 cars later, you are at the hardest possible braking power, as it's ultimately limited by tyre grip.
So, if there's not a larger gap somewhere in the string of cars to allow for braking less hard than the car in front, it's more or less inevitable.
To mitigate it, each driver needs to look 2 or 3 cars ahead, not just at the car in front. Everybody should do that but it's understandable if they don't (and it might be hard to see clearly, especially in a dark tunnel).
One car driving significantly slower than the rest of traffic can itself be a safety problem:
"To keep a smooth traffic flow, some highways also have minimum speed limits. If you drive slower than the minimum speed you can halt the traffic flow and create a dangerous condition. Even if there is no minimum speed limit, those driving too slow can be as dangerous as those who drive too fast."