Humans get into accidents all the time. Is that not unacceptable for you?
I feel like people apply standards to self-driving cars that they don’t apply to human-driven ones. It’s unreasonable to expect a self-driving system never to fail. It’s unreasonable to imagine you can just let it practice in simulation until it’s perfect. This is what happens when you narrowly focus on just one aspect of self-driving cars (individual accidents): you miss the big picture.
I feel like people apply standards to self-driving cars that they don’t apply to human-driven ones.
Human drivers need to pass a driving test; self-driving cars do not. Human drivers also have a baseline of common sense that self-driving cars lack, so the cars really would need more testing than humans, not less.
It’s unreasonable to expect a self-driving system never to fail.
I don’t expect them to never fail; I just want to know when they fail and how badly.
It’s unreasonable to imagine you can just let it practice in simulation until it’s perfect.
What’s unreasonable about that?
individual accidents
They are only “individual” because there aren’t very many self-driving cars on the road yet and because not every failure ends up being deadly.
A Tesla on FSD could easily pass the driving test that’s required for humans. That’s a nonsensical standard: most people with a fresh license are horribly incompetent drivers.
I don’t expect them to never fail; I just want to know when they fail and how badly.
“Over 6.1 million miles (21 months of driving) in Arizona, Waymo’s vehicles were involved in 47 collisions and near-misses, none of which resulted in injuries”
How many human drivers have done millions of miles of driving before they were allowed to drive unsupervised? Your assertion that these systems are untested is just wrong.
“These crashes included rear-enders, vehicle swipes, and even one incident when a Waymo vehicle was T-boned at an intersection by another car at nearly 40 mph. The company said that no one was seriously injured and “nearly all” of the collisions were the fault of the other driver.”
According to insurance companies, human-driven cars have 1.24 injuries per million miles travelled. So if Waymo were “as good as a typical human driver,” there would have been several injuries over those miles. They had zero serious injuries.
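For context, the comparison works out like this (a back-of-the-envelope sketch using only the two figures quoted in this thread, nothing else):

```python
# Back-of-the-envelope check using the figures quoted above.
human_injury_rate = 1.24  # injuries per million miles (insurance figure cited above)
waymo_miles = 6.1         # Waymo's reported Arizona mileage, in millions of miles

# If Waymo's fleet caused injuries at the typical human rate, this many
# injuries would have been expected over those miles:
expected_injuries = human_injury_rate * waymo_miles
print(f"Expected injuries at the human rate: {expected_injuries:.1f}")  # prints 7.6
print("Injuries Waymo reported: 0")
```

So at typical human rates you’d expect roughly seven or eight injuries over 6.1 million miles, against the zero reported.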
The data (at least from reputable companies like Waymo) is absolutely available and in excruciating detail. Go look it up.
A Tesla on FSD could easily pass the driving test that’s required for humans. That’s a nonsensical standard: most people with a fresh license are horribly incompetent drivers.
So why don’t we check it? Right now we are blindly trusting the claims of companies.
What are these claims we’re blindly trusting, exactly? Do you have any direct quotes?
Have you used it? It’s not very good. It tries to run red lights, makes random swerves and inputs, and generally drives like someone on sedatives.
They’ve had to inject a ton of map data to try to make up for the horrendously low-resolution cameras, but “HD MaPs ArE a CrUtCh,” right?
No radar or lidar means the sun can blind it easily, and there’s a blind spot in front of the car where cameras cannot see.
Is what they’ve made impressive? Sure, but it’s nowhere near safe enough to be on public roads in customers’ cars. At all.