Cruise recalls all self-driving cars after grisly accident and California ban | All 950 of the General Motors subsidiary’s autonomous cars will be taken off roads for a software update
Apparently GM thinks killing a pedestrian every 10 million miles is acceptable?
GM saved like $2 on an ignition switch and killed 13 people. They knew about the issue for years. So yeah, GM doesn’t care if a few people have to die in order to turn a profit
https://www.nytimes.com/2014/05/27/business/13-deaths-untold-heartache-from-gm-defect.html
What’s acceptable?
Every 50 million? 100 million?
It will never be perfect, and there will never be zero deaths, so if there is no acceptable limit you may as well ban self-driving car research right now.
The rate of pedestrians killed in 2021 was approximately 1 per 25,000,000 miles driven manually (8,000 deaths across 203 billion miles travelled collectively). Should that be the minimum target?
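The arithmetic behind that figure works out as follows (a back-of-the-envelope check using the numbers quoted in the comment, not an official statistic):

```python
# Sanity-check the quoted 2021 figures: pedestrian deaths per mile driven.
deaths = 8_000               # pedestrian deaths (as quoted above)
miles = 203_000_000_000      # total miles driven collectively (as quoted above)

miles_per_death = miles / deaths
print(f"roughly 1 pedestrian death per {miles_per_death:,.0f} miles")
# → roughly 1 pedestrian death per 25,375,000 miles
```

So the quoted figures give about one pedestrian death per 25 million miles, matching the "1 in every 25,000,000" claim.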
The acceptable number is zero. For any type of vehicle.
Then we best ban all vehicles
Might as well ban people outright
Ok. If you really think that is the only way to improve the situation, I’m on board.
It really could be zero for pedestrians if we spent the money to ensure no human and vehicle would ever share the same space. Sadly, it is less about deaths per mile driven and more about deaths versus the cost of avoiding them.
So, no human driven cars, buses, trains, planes or anything else, then?
And literally no people leaving the house at all.
There have been unfortunate incidents of pedestrians accidentally striking other pedestrians, which then result in heart attacks or suchlike, which then result in death.
We aren’t talking about your IQ here, so try to follow along yeah?
It would be interesting to see what the actual stats are for pedestrian deaths vs miles driven for non autonomous cars. I’m willing to bet autonomous cars will ultimately be safer, but it will take time to get to that point.
Edit: Apparently, according to the transportation safety in the US article on Wikipedia, the average is 1.25 pedestrians killed per 100 million miles driven.
That page doesn’t exclude commercial road vehicles or interstates, so the apples-to-apples comparison may be much closer to the autonomous rate. A 700-mile/day truck cruising I-40 through the desert is going to skew the data as safer, while I bet a casual city driver will be an order of magnitude more dangerous. Maybe the best comparison would be stacking it against taxi and other ride-hail drivers.
Edit: Cruise didn’t even cause the incident. A human-driven car hit the pedestrian into the Cruise. This sky-is-falling reaction was started by a human doing worse.
GM was just getting it out of the way, that’s all. Nothing to see here. They operate better under pressure, see. Right? Sure they do.
But they are recalling the vehicles, so clearly not.
Unless you’re suggesting that the software update is to make the cars more efficient at killing pedestrians?
Read what they said: that they’re doing the recall even though it’s only 1 per 10 million miles. That implies they think that is an acceptable rate for serious injuries.
Ford thought 180 dead per year was acceptable when it shipped the Pinto. GM looks like a saint by comparison, fuck.
I think they mean they’re not legally required to recall them. I guess the government has a limit on what it thinks is acceptable, and this is below that limit, probably because it’s less than what human drivers achieve, so it’s an improvement in safety.
No, they don’t, which is why they suspended all vehicles pending a software update.
Also, how does this compare to human drivers?
The best thing about this is that now the problem has been identified the software can be fixed and this particular problem won’t happen again. If a human makes this mistake you can’t push an update to fix all human drivers.
What’s that rate for human drivers?
Around 1 per 100 million miles.
The irony here is that the accident occurred because a human driver hit this pedestrian first. So it ain’t like us humans have a clean conscience here…
It’s a trolley problem of sorts. Currently it seems that we have higher standards for AI than for humans. I bet that even if AI were twice as good a driver, we’d still hate to hear about it causing accidents. I’m not sure why that is. I wonder if it has something to do with the fact that there’s really no one to blame, and that doesn’t fit with our morals.
Because corporations running AI means the first time actual human thought enters the picture is when the dividend check gets deposited.
And shareholder profits, sacred in law and the market, will push safety standards based on cost, not fewest deaths.
In all weather conditions. Autonomous vehicles only drive in optimal conditions, humans have to suffer whatever nature throws at us.
I want to believe you, but source please?
Just the sort of thing I was looking for. Thanks, internet stranger!
According to these numbers, that’s 1 death per 73 million miles, which is much better than I thought.
Which includes trucks hauling through unpopulated areas
What’s the acceptable vehicular homicide rate? GM seems to think it’s more than zero.
It is more than zero. Anything that beats humans is a win. Getting to zero is unrealistic. Nothing has a zero risk of death.
Correct, that’s exactly what I’m saying. Zero is the acceptable number, so anything that gets us closer to that is good.
You’re shifting goal posts.
What’s the acceptable vehicular homicide rate? GM seems to think it’s more than zero.
Correct, that’s exactly what I’m saying. Zero is the ideal number, so anything that gets us closer to that is good.
Acceptable is different than ideal.
Only if you want it to be.
That’s true. But then you run into the issue of “The perfect being the enemy of the good.”
Ok ya pedantic fuck. I edited my comment just for you. I know English is hard to understand.
But now you’re misusing “acceptable”.
We would need to get to the other side of acceptable for widespread use of autonomous (self-driving) vehicles. It’s not an unachievable goal you always try to get closer to; the word for that is your previously used “ideal”, which it seems now is what you meant in your original comment, instead of the “acceptable” you actually used.
It’s not just pedantic. I’m not the only one who thought you said something you apparently now didn’t mean, because you used words you apparently don’t understand. The words you use are vital to your being understood.
You could just humbly admit your original mistake in language, and nobody would give you a hard time.
That’s equally ridiculous to say. Self driving cars just need to be better than people to be worth it, they just currently are not better than people.
It’s ridiculous to think that cars shouldn’t be killing people? Well smack my ass and call me an extremist.
Yes, it’s ridiculous to say that if self driving cars kill fewer people than human driven cars but still more than zero that we should not use them. That’s like saying “why use seatbelts, they’re not 100% effective.”
That’s not what I said though.
Are you trying to be this much of an idiot?
That’s the implication of the logic you’re using.
Are you calling for a ban on human driven cars? They killed more than zero people yesterday! If you aren’t, you’ve accepted a human-driven vehicular homicide rate above zero.
How did you arrive at that conclusion?
In a statement on Wednesday, the GM unit said that it did the recall even though it determined that a similar crash with a risk of serious injury could happen again every 10m to 100m miles without the update.
Emphasis goes on “even though”.
As in “At GM we’re so benevolent that we’re doing a software update even though we think this will only kill someone every 10m miles (which we consider an acceptable murder rate for our cars)”.
How frequently this type of incident occurs is outside the control of GM.
In the crash, another vehicle with a person behind the wheel struck a pedestrian, sending the person into the path of a Cruise autonomous vehicle. The Cruise initially stopped but still hit the person.
You missed the part where this was specifically about their car dragging the person for 20ft after the crash and pinning them under the wheel?
I didn’t address it because you didn’t say anything about it.
No one was killed in the accident they are stating the rate of.
Yeah, but a car running over a woman, dragging her twenty feet and parking on top of her, could easily have killed her.
Yeah but equally you could argue that if all cars were self-driving this accident wouldn’t have happened. It involved a human making a mistake first.
I kind of feel like we’re getting the wrong takeaway from self driving cars.
What kind of mistake can a pedestrian make to cause a self-driving car run over them, and how does making more cars self-driving prevent that mistake?
Not just GM. If you tried to question the safety of these cars, even on Lemmy, before these revelations came out, you would get brigaded by people claiming they were safer than humans statistically, and that’s all they needed to be in order to be acceptable.
This incident started with a human driving their car into a pedestrian. It’s not exactly a smoking gun
This is the best summary I could come up with:
General Motors’ Cruise autonomous vehicle unit is recalling all 950 of its cars to update software after one of them dragged a pedestrian to the side of a San Francisco street in early October and a subsequent ban by California regulators.
The company said in documents posted by US safety regulators on Wednesday that with the updated software, Cruise vehicles will remain stationary should a similar incident occur in the future.
The 2 October crash prompted Cruise to suspend driverless operations nationwide after California regulators found that its cars posed a danger to public safety.
The state’s department of motor vehicles revoked the license for Cruise, which was transporting passengers without human drivers throughout San Francisco.
In a statement on Wednesday, the GM unit said that it did the recall even though it determined that a similar crash with a risk of serious injury could happen again every 10m to 100m miles without the update.
“As our software continues to improve, it is likely we will file additional recalls to inform both NHTSA and the public of updates to enhance safety across our fleet.”
The original article contains 712 words, the summary contains 184 words. Saved 74%. I’m a bot and I’m open source!
Instead of asking “what number of deaths is acceptable?”, ask “who is responsible?”
When a human driver in control of a car hits a pedestrian, the human is responsible, not the car.
Who is responsible when a computer driven car hits a pedestrian? Also, whose insurance pays the bill?