Firstly, human error caused at least 7 of the 8 crashes. FSD didn't take control away from the human; it encountered something outside of its limits and returned full control to the human.
Secondly, that's exactly how planes work. Much of modern aviation safety comes from very automated, very complicated systems. In some failure modes, these systems leave the pilot in a more dangerous plane than one with no electronics at all.
Why do we tolerate this? Because it doesn't matter how poorly a system handles failure; what dominates is how rarely the failure occurs in the first place.
If you ban these extra-automated driver aids and people drive Teslas less, more people die. The same holds if a ban delays the development of real FSD.
I would rather live next door to an RBMK-1000 than drive a car 10 years older than my current one.
I caution you against changing automation policy based on an accident caused by human error.