How long before self driving dies?
-
How long before the self-driving car claims stop? I am talking about the claim of replacing all cars, not about autonomous vehicles driving around a warehouse. Consider this scenario: in 2021, 42,000+ people died in automobile accidents in the US, about 1,000 of them children. (Injuries run far higher.) So let's say self-driving cars worked and deaths dropped by two orders of magnitude: 420 people, 10 children. Now, in the vast majority of modern accidents a driver is found to be at fault: drunk, texting, distracted, reckless, etc. But with a self-driving car no person can be at fault, because no person was driving. In some of those cases, especially the ones involving children, someone is going to blame the car. Not the specific car, but the manufacturer of the car. And then they will sue them for 10 million. Or 100 million. Consider that just in the past week a door (sort of) blew off an airplane and all planes of that type were grounded. Is the government going to ground a couple million cars? It might even be possible with self-driving: just send a signal. One of the self-driving car companies is likely going out of business because their car drove to the side of the road with a pedestrian underneath. If a person had been driving, the driver presumably would have been at fault - if anyone could have determined the correct behavior in that bizarre case. Slamming on the brakes in the middle of a highway might not be the best action. So what is right? Who gets to decide that? And even if the action was exactly right, is a lawsuit against the company still going to happen?
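The back-of-the-envelope arithmetic above can be sketched in a few lines (the two-orders-of-magnitude reduction is the post's hypothetical, not a real projection):

```python
# Hypothetical scenario from the post: 2021 US fatality figures,
# with an assumed 100x reduction if self-driving "worked".
US_DEATHS_2021 = 42_000
CHILD_DEATHS_2021 = 1_000
REDUCTION_FACTOR = 100  # "two orders of magnitude"

remaining_deaths = US_DEATHS_2021 // REDUCTION_FACTOR
remaining_child_deaths = CHILD_DEATHS_2021 // REDUCTION_FACTOR

print(remaining_deaths)        # 420
print(remaining_child_deaths)  # 10
```

Even in that optimistic scenario, hundreds of fatalities per year remain with no human driver to hold at fault.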
Just a small note on the self-driving car that dragged the person when it attempted to pull off to the side of the road. The person was first hit by a car driven by a human driver, and their body was thrown in front of the self-driving car, which could not stop (physics). The human driver fled the scene and is still being sought. Unfortunately, the person landed in a spot outside the range of the car's sensors. The car proceeded to try to pull over to wait for help with the accident and made things worse by running them over. Yes, a human being would get out of the car and try to aid the injured person before moving their car (unless they panicked and simply drove away). Yes, the self-driving car needs its software upgraded to cover the case where a body is thrown in front of it, it collides with said body, and it cannot locate the body after hitting it; in that case it needs to simply stop, call 911, and wait for human assistance. Yes, there are many unique things a car can encounter. Will software ever be up to the challenge? I honestly don't know. But I do know that the current carnage on our highways will continue, with or without automated driving help. Software can be upgraded; people, not so much.
-
jschell wrote:
How long before self driving cars claims stop?
I don't think it will - instead, I think public opinion will shift to revulsion at the whole idea of manually driving a car. Think about existing legislation: seat belts, ABS, speed limiters, the recent whole-of-Wales reduction of the default speed limit from 30mph to 20 - it's all about increasingly small reductions in death and serious injury. Self-driving offers the possibility of a large reduction (which will be touted as total prevention), and there isn't a politician who dares fight that! Car companies being sued as a result of their products failing? It happens already, and they probably have a budget for it because it's cheaper to be sued than to do the job properly ... :sigh: And as the number of self-driving cars increases, and the communication between them (to improve safety and economy) rises as well, the accident rate will plummet. When humans realise they can do what they want (legally) while the car does the work, they will leap at the chance: social media, messages, phone calls, alcohol, drugs, TV, pr0n, ... Stuff they do at the moment anyway while they are supposed to be in control! I don't commute any more, but my regular commute was an hour each way with the lemmings on a motorcycle, and the things I've seen while traffic is moving at 70mph were horrific: phones, texts, newspapers, even one guy with his lappie propped open on the dashboard, typing away and steering with his elbows! Self-driving cars will (eventually) be safer: and they are - probably - the future, whether we like it or not.
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony "Common sense is so rare these days, it should be classified as a super power" - Random T-shirt AntiTwitter: @DalekDave is now a follower!
-
jschell wrote:
How long before self driving cars claims stop?
Slightly distracting from your main point, maybe, but what I don't understand about self-driving cars is that everybody is doing his own thing. Why not make this a collaborative effort? So when one unanticipated scenario comes up, someone writes a fix once, the community at large tests it (like bug fixes in open source - in theory) and every manufacturer gets to benefit from it. It seems to me things would evolve a lot more quickly than having everyone roll his own version, no? Is this a matter of patents? Or each car manufacturer using different types of sensors, so there isn't one common/re-usable source of data that can be acted upon?
-
MarkTJohnson wrote:
most programmers would never get in a self driving car.
20 CEOs of avionics companies were all on a plane, just before taxiing away from the terminal. The purser came down the aisle, and whispered to each CEO that their company's avionics are controlling the plane. 19 out of 20 CEOs got off the plane immediately, while the 20th stayed in place. The purser said that he/she/it must be very confident in the company's avionics, to which the answer was "with the programmers we employ, the plane won't even take off!" :)
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows. -- 6079 Smith W.
I was expecting a punchline that takes a shot at CEOs. I wasn't expecting a dig at programmers.
The difficult we do right away... ...the impossible takes slightly longer.
-
jschell wrote:
How long before self driving cars claims stop?
The very thought that in the coming two generations people are likely to be unaware of (a) manual transmissions, (b) actually driving while sitting in the driver's seat, and (c) that there is something called a driving licence ... is somewhat unsettling.
-
jschell wrote:
How long before self driving cars claims stop?
They won't. But self-driving cars only make sense in a closed system where all of the cars talk to each other and nothing squishable gets on the roadway. As long as they don't interoperate with each other -- between makes as well -- and as long as there are non-affiliated (human-driven) cars on the road and people and animals can cross the roadway -- the system just can't work. The current research is fine for ironing out the bugs in preparation for making an eventual future city with a closed road system. I need to watch Logan's Run again.
-
jschell wrote:
How long before self driving cars claims stop?
Greetings and kind regards. I assume science and technology march ever forward, so sooner or later self-driving cars will be more or less perfect. I am rather surprised their legalization happened so quickly; I have always wondered why, as I assume a self-driving car would not know what to do in response to this not-unusual situation: Dancing policeman: America's most entertaining traffic cop - YouTube[^]. Then of course there is the matter of software attacks, which I find frightening.
-
Self-driving private cars will never happen. (famous last words) There are many car recalls for defects every year for different safety problems. [Check for Recalls: Vehicle, Car Seat, Tire, Equipment | NHTSA](https://www.nhtsa.gov/recalls)
CI/CD = Continuous Impediment/Continuous Despair
These safety problems are addressed in autonomous driving (AD) cars. I work in the automotive industry, and I can assure you that I would buy an (electric) AD vehicle from any European or Japanese car manufacturer without any discussion. I still have my doubts regarding the quality of EVs from China. And do not get me started on the Musk company - nice toy, best car to die in.
-
Aren't all auto mobiles self driving?
Religious freedom is the freedom to say that two plus two make five.
Looks like this passed way over everybody's head. Good one, though :-)
-
dandy72 wrote:
Why not make this a collaborative effort?
There have been attempts to make suppliers and OEMs work together, and there are government-funded projects, but everyone thinks they can do it better than the others, AND the first one to come up with an affordable and reliable solution will kill all the others. Trust me, I have been working in this industry for 20 years, and the answer to your "why" is that it is run by human beings with emotions.
-
jschell wrote:
How long before self driving cars claims stop?
You are both right and wrong -> the timescale plays an important role here. There will be a very long phase where cars are at AD level 2.5 to 3, i.e. autonomous only on some specific roads, with the driver required to take over within seconds. AD will be standard on highways in the coming 10 years, but that's it. The step to level 4 needs an established level 3, where almost all vehicles are connected AD vehicles. Then level 4 can be rolled out, and only after that will we jump to level 5. AD in all situations is extremely complex and requires lots of software (there are already about 100 million lines of code in an average recent vehicle - 10 times what is required to fly a plane, and that is WITHOUT AD). During this time, people will accept that AD will not prevent all crashes, and that they are using a machine that can fail. You sign terms and conditions when you drive with the AD function; it is your decision and therefore remains your responsibility, and it will clearly be put in disclaimers. If AD dies, it will only be because no driver wants to accept responsibility for the system, not because people will sue the OEMs - at least, not more than today.
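The levels mentioned above are the SAE J3016 automation levels. A rough sketch (descriptions paraphrased, not the official wording; the helper function is illustrative):

```python
# SAE J3016 driving-automation levels, paraphrased.
SAE_LEVELS = {
    0: "No automation - warnings and momentary assistance only",
    1: "Driver assistance - steering OR speed support",
    2: "Partial automation - steering AND speed, driver supervises constantly",
    3: "Conditional automation - driver must take over when requested",
    4: "High automation - no takeover needed within a defined domain",
    5: "Full automation - all roads, all conditions",
}

def driver_must_supervise(level: int) -> bool:
    """Levels 0-2 require constant supervision; at level 3 the driver
    is only a fallback; at levels 4-5 no driver attention is needed."""
    return level <= 2
```

This is why the jump from "level 2.5 to 3" to level 4 is so large: it is the point where the human stops being the fallback.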
-
OriginalGriff wrote:
It happens already and they probably have a budget for it because it's cheaper to be sued than to do the job properly ... :sigh:
Didn't this famously happen in the '70s with the Pinto, which had a design flaw that made it liable to burst into flames in an accident, but it would have been more costly to retool than to pay out when one caught fire?
-
Yes, it would have cost $11 per car to fix, but that was too much for them to want to pay: Ford Pinto's Fuel System Redesign and Ethics - 1357 Words | Case Study Example[^] I don't drive Ford cars ... :-D
-
jschell wrote:
How long before self driving cars claims stop?
EVs are a given. The fact that they're all electronic and "connected" means that over time the driver will be dumbed down to the point where we do have "self driving" and you can sleep in your car (the actual goal). At some point, you won't be able to go above a certain speed. No crossing double lines. The traffic system will slow you down before the light changes. If you go "off the grid" without a permit, your car won't. Etc. We're talking years, but it will come.
"Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I
-
Speed limiters have been a legal requirement in all new cars sold in Europe since July 2022 - and it's only a matter of time before it is illegal to disable them (they default to on when the car is started, but can be disabled by the driver). I don't doubt that they will become a part of annual compulsory vehicle testing (MOT in the UK) once the first such vehicles become old enough to need testing.
-
OriginalGriff wrote:
Speed limiters have been a legal requirement in all new cars sold in Europe since July 2022
No, it isn't. In July 2022 the 2019/2144 regulation was signed, and it goes into effect in July this year. And there is no limiter (well, not yet anyway). There will be a requirement for all new cars to warn you if you're speeding, though. And the warning/notification should be designed to be easily dismissed or ignored: according to the law, the audible and haptic warnings must be "as short as possible in duration to avoid potential annoyance of the driver." I read some preparatory work for the regulation a few years ago, and a limiter was considered out of the question. The reason: assume you're overtaking a slower vehicle and that vehicle suddenly accelerates to the speed limit while you're overtaking - you would end up driving side by side, with you in the wrong lane. In short, speed limiters would be quite dangerous used wrongly, which they would be. Well, let's see how long they hold that opinion; speed limiters have already been in effect for lorries for many years...
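The warning-only behaviour described above (warn when over the limit, never intervene, let the driver dismiss it) can be sketched as follows; the function and parameter names are illustrative, not from the regulation text:

```python
def isa_warning(speed_kmh: float, limit_kmh: float, dismissed: bool) -> bool:
    """True if the (short, easily ignored) intelligent speed assistance
    warning should be active. Note there is no speed limiting here -
    the system only warns; the driver keeps full control."""
    return speed_kmh > limit_kmh and not dismissed

# Overtaking example from the post: because the car never forces you
# under the limit, you can complete the manoeuvre even while warned.
print(isa_warning(115, 100, dismissed=False))  # True  - warn the driver
print(isa_warning(115, 100, dismissed=True))   # False - warning dismissed
print(isa_warning(95, 100, dismissed=False))   # False - under the limit
```

A hard limiter would replace the first line's warning with a throttle cut, which is exactly the overtaking hazard the preparatory work flagged.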
Wrong is evil and must be defeated. - Jeff Ello
-
jschell wrote:
How long before self driving cars claims stop?
It won't; "full self driving" will come next year, just as it has been coming next year since 2017. Not very off topic: [Tesla Has The Highest Accident Rate Of Any Auto Brand - Forbes](https://www.forbes.com/sites/stevebanker/2023/12/18/tesla-has-the-highest-accident-rate-of-any-auto-brand/) [SAE Levels of Driving Automation™ Refined for Clarity and International Audience](https://www.sae.org/blog/sae-j3016-update) Interestingly, the only car maker I know of with any production cars offering Level 3 autonomous driving is Mercedes-Benz. I don't really expect Level 5 to be reached in my lifetime.
Wrong is evil and must be defeated. - Jeff Ello
-
These safety problems are addressed in autonomous driving cars. I work in the automotive industry, and I can assure you that I would buy an (electric) AD vehicle from any European or Japanese car manufacturer without hesitation. I still have my doubts about the quality of EVs from China. And do not get me started on the Musk company: nice toy, best car to die in.
Rage wrote:
These safety problems are addressed in autonomous driving cars.
I'm not sure I understand your statement. For example, self-driving cars are not going to prevent heated seats from catching fire, which is one of the recalls. And Tesla has a recall in effect to reduce the ability of its cars to self-drive, so that one is very specific to self-driving.
-
I don't think it will stop; instead, I think public opinion will shift to revulsion at the whole idea of manually driving a car. Think about existing legislation: seat belts, ABS, speed limiters, the recent whole-of-Wales reduction of the default speed limit from 30mph to 20mph. It's all about increasingly small reductions in death and serious injury, while self-driving offers the possibility of a large reduction (which will be touted as total prevention), and there isn't a politician who dares fight that!

Car companies being sued because their products fail? It happens already, and they probably have a budget for it, because it's cheaper to be sued than to do the job properly ... :sigh:

And as the number of self-driving cars increases, and the communication between them (to improve safety and economy) rises as well, the accident rate will plummet. When humans realise they can do what they want (legally) while the car does the work, they will leap at the chance: social media, messages, phone calls, alcohol, drugs, TV, pr0n, ... stuff they do at the moment anyway, while they are supposed to be in control! I don't commute any more, but my regular commute was an hour each way among the lemmings on a motorcycle, and the things I saw while traffic was moving at 70mph were horrific: phones, texts, newspapers, even one guy with his lappie propped open on the dashboard, typing away and steering with his elbows!

Self-driving cars will (eventually) be safer, and they are, probably, the future whether we like it or not.
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony "Common sense is so rare these days, it should be classified as a super power" - Random T-shirt AntiTwitter: @DalekDave is now a follower!
I've had almost the same thoughts about autonomous vehicles. There's a whole range of accidents that occur because meat sacks are in control. Have you ever arrived at your destination and realized you have no clear memory of the journey? There are other things our brain does to edit reality. In a well-known experiment, a group of people were asked to watch a video of players passing a basketball and count the passes, then answer questions afterwards. The first question: "Did you notice the person in the gorilla suit?" Most people miss it, because the brain edits it out as "not important" to the counting task. The same applies to driving, or really any activity.

My thought is that several things are going to happen. First, insurance companies are going to look at the numbers and raise the rates on non-autonomous vehicles to the point where the average Joe is motivated to move to an AV. Then, as non-AVs move into the minority and communication between AVs becomes standardized, NAVs will be required to carry transponders that alert AVs to their presence. Eventually, NAVs will be banned except in tightly controlled situations (parades, etc.).

I expect that as the technology matures there will be some terrible incidents. But as in the airline industry, investigations and recommendations will continue to make AVs safer over time.
"A little song, a little dance, a little seltzer down your pants" Chuckles the clown