Code Project

Uber self driving car kills woman

The Lounge
90 Posts 30 Posters
  • M milo xml

    I saw the video of what happened last night. While I don't think most normal people would have been able to react in time, I was dismayed that the car didn't identify the person sooner and prevent it. Police In Arizona Release Dashcam Video Of Fatal Crash Involving Self-Driving Car : The Two-Way : NPR[^]

    CodeWraith
    wrote on last edited by
    #76

    Exactly what I mean. It can only react to a situation, but possesses no foresight.

    I have lived with several Zen masters - all of them were cats. His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.

    • H HobbyProggy

      Saw it, and that clarified for me that the car was malfunctioning and the "safety driver" failed totally. The darkness is just because the dashcam doesn't work as well as a human eye, so the light level shown may be way off from what the driver could actually see, and it should not affect the radar components at all!

      Rules for the FOSW ![^]

      if (!string.IsNullOrWhiteSpace(_signature))
      {
          MessageBox.Show("This is my signature: " + Environment.NewLine + _signature);
      }
      else
      {
          MessageBox.Show("404-Signature not found");
      }

      Stefan_Lang
      wrote on last edited by
      #77

      I doubt that the driver could have reacted in time even if she had paid close attention: the woman was crossing midway between two overhead lamps, in the darkest area of the street, not wearing reflective clothing, and no active lights on the bike. Even when considering that the driver's eyes should have adapted somewhat to the darkness, it was near impossible to spot the pedestrian wearing dark clothes in the darkest possible area outside the range of the headlights.

      GOTOs are a bit like wire coat hangers: they tend to breed in the darkness, such that where there once were few, eventually there are many, and the program's architecture collapses beneath them. (Fran Poretto)

      • S Stefan_Lang


        HobbyProggy
        wrote on last edited by
        #78

        And that's exactly where the car's systems should have kicked in. By the way, did you see the driver looking to the left? It felt like she spotted the pedestrian, but that is just an assumption. Still, the safety driver is there to pay attention and react, and she failed at that job. It really is questionable whether the accident could have been avoided, but since the car made no braking effort, I guess they are mostly responsible for the accident. EDIT: An HDR picture of the scene at darkness: Where it happened - Album on Imgur[^]


        • H HobbyProggy


          Stefan_Lang
          wrote on last edited by
          #79

          I agree that the car should have reacted. Even with just the video as input, there was at least a second to hit the brakes. There is no good reason why it didn't.


          • S Stefan_Lang


            HobbyProggy
            wrote on last edited by
            #80

            Stefan_Lang wrote:

            Even with just the video as input

            And the car has radar installed.


            • OriginalGriffO OriginalGriff

              Self-driving Uber kills Arizona woman in first fatal crash involving pedestrian | Technology | The Guardian[^] And we know who the passenger was, don't we: God Mode ON | CommitStrip[^]

              Bad command or file name. Bad, bad command! Sit! Stay! Staaaay... AntiTwitter: @DalekDave is now a follower!

              Lost User
              wrote on last edited by
              #81

              It's not the "developer" that "pushes" to have things put into "production". There are some narrow-minded "executives" there that have exceeded their level of competence. Reminds me of shooting chimps into space. Or, can't make an omelette without breaking some eggs.

              "(I) am amazed to see myself here rather than there ... now rather than then". ― Blaise Pascal

              • F Fueled By Decaff

                Thank you for that, it was an interesting read. From that report it seems that self-driving cars have fewer minor accidents. It is a lot closer for the more significant accidents, but self-driving cars still have fewer (although, by the report's own admission, there is too little data to draw firm conclusions). I personally think they over-estimate the number of unreported serious accidents, although I might be wrong there. One thing they omit is the number of incidents averted by the driver intervening; I believe all of the data was gathered with an actual driver present. What we are seeing more of now is driverless cars. BTW, in case you have not guessed, I am against driverless cars, as I do not think they are ready yet, but I am not against self-driving cars.

                Tomaz Stih 0
                wrote on last edited by
                #82

                After seeing the accident I now think there is a serious flaw in Uber's software. The car didn't even try to apply the brakes: it went at 40 mph directly into a human. Besides that, there was a time frame of ca. 2 seconds (enough to at least try to brake), and the sensors must have detected the obstacle on the road (Uber has multiple lidars and radars!) long before that. It probably had at least 6 seconds to react, because it can see in the dark. For a self-driving car this was an avoidable accident.
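                The timing claim above can be sanity-checked with a quick back-of-the-envelope calculation. The figures below (a ~7 m/s² braking deceleration on dry asphalt and the two reaction times) are illustrative assumptions, not values from the accident report:

                ```python
                # Rough stopping-distance check for the braking claim above.
                # Assumed numbers, not from the accident report:
                # ~7 m/s^2 deceleration (dry asphalt), 40 mph initial speed.
                MPH_TO_MS = 0.44704  # metres per second in one mile per hour

                def stopping_distance(speed_mph, reaction_s, decel_ms2=7.0):
                    """Distance covered during the reaction time plus braking to a stop (m)."""
                    v = speed_mph * MPH_TO_MS
                    return reaction_s * v + v * v / (2 * decel_ms2)

                v40 = 40 * MPH_TO_MS
                print(f"40 mph = {v40:.1f} m/s")
                print(f"stop from 40 mph, 0.2 s reaction: {stopping_distance(40, 0.2):.1f} m")
                print(f"stop from 40 mph, 1.5 s reaction: {stopping_distance(40, 1.5):.1f} m")
                print(f"distance covered in 2 s at 40 mph: {2 * v40:.1f} m")
                ```

                With these assumptions the car covers about 36 m in those 2 seconds, while a machine-speed reaction would need only about 26 m to stop, so even a late braking attempt should have shed most of the speed.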

                • C CodeWraith

                  Now I want to see how they show why their little monster did what it did, how it will react in other situations, and how to 'cure' it of its delusions. The AI fans always forget that even the dumbest human driver has a few million years of evolution behind him. How can they expect to play in the same league with x hours of training and 'testing'?


                  jschell
                  wrote on last edited by
                  #83

                  CodeWraith wrote:

                  that even the dumbest human driver has a few million years of evolution behind him. How can they think to play better in the same league with x hours of training and 'testing'?

                  Those same ones that managed 19,000 deaths and 2.3 million injuries in just the US in the first 6 months of 2015? I am guessing there is quite a bit of wiggle room between 1 and 2.3 million for it to play with. Not to mention, of course, that all of that evolution has led to people attempting to text, make phone calls, yell at the other people in the car, get high (in the car), eat, put on makeup, and even sometimes put on their clothes (someone told me they used to change while driving down a relatively busy, high-speed street all the time). Pretty sure a computer will not be doing most of that. U.S. Traffic Deaths, Injuries and Related Costs Up in 2015[^]

                  • OriginalGriffO OriginalGriff

                    They haven't released the video footage, but the reports say it was her fault - she walked out in front of it so close that nothing could have prevented the collision, human or robotic driver: Tempe police chief: Uber 'likely' not at fault in fatal self-driving car crash - Business Insider[^] And you can be sure that there is more telemetry and recorded info in this accident than in any previous death-by-driving case, with the possible exception of Ayrton Senna...


                    jschell
                    wrote on last edited by
                    #84

                    And, as I understood it, it was at night, in the middle of the street (the crosswalk was down the street), and at least on a bend in the road.

                    • C CodeWraith

                      That may be. There are certainly hopeless situations. Still, no telemetry in the world is going to tell us why the AI did or did not do something. Would you like to have to make any guarantees for the behavior of your contraption? They don't have to become Terminators to be dangerous.


                      jschell
                      wrote on last edited by
                      #85

                      CodeWraith wrote:

                      Would you like to have to make any guarantees for the behavior of your contraption?

                      Do you know what the "emergency brake" used to be for? Do you know why it is now the "parking brake" instead? Do you know what anti-lock brakes are for? Do you know why they are safer, for most people, than the alternative was in the past? What about when cars will not stop? This apparently happens more often than I thought, because I found the following while looking for the other example that I know exists: Driver was unable to stop or slow down his car[^] So perhaps you don't drive at all, but everyone else already relies on the behavior of their "contraption".

                      • L Lost User

                        OriginalGriff wrote:

                        nothing could have prevented the collision, human or robotic driver

                        Yeah, well, I would dispute that. We've all been in that situation, driving along where nobody is in front of you but they are near enough that you keep your eyes open - people walking close to the edge of the road, kids playing football in front of their house, dog walkers with the dog jumping about... If this woman "walked out in front of it so close that nothing could have prevented the collision", it seems likely she was already close to the edge of the road. Most humans would (1) gently nudge the car away from that lane/road edge before reaching her (I'm sure in AZ the lanes are wide enough), and (2) pay extra attention to watch for a change of direction. There's more to driving than reacting to what does happen; it's being ready for what else can happen - yes, some things are completely unexpected, but where you can anticipate these possibilities you can and should be prepared. You see a drunk on the road: do you pass within inches, or wait till a nice big gap appears?

                        jschell
                        wrote on last edited by
                        #86

                        lopati: roaming wrote:

                        If this woman "walked out in front of it so close than nothing could have prevented the collision" seems likely she was already close to the edge of the road, most humans would (1) gently nudge the car away from that lane/road edge before reaching (I'm sure in Az the lanes are wide enough), and (2) pay extra attention to watch for change of direction.

                        Most? Exactly what percentage is "most"? I know for a fact that where I live there is a law, a specific law, that says people must move into another lane when approaching emergency vehicles on the side of the road. They even recently made a special effort to look for and ticket people who did not (so, effectively, like speed traps). So "most", whatever percentage that is, is evidently not enough even in the situation that is most apparent - the one with the big blinding police lights. That suggests to me that even fewer than "most" are going to do that when there is some obstacle on the side. I will note that I do slow down and give extra room. Whereupon sometimes other people pass me, sometimes illegally, going at a speed that exceeds the speed limit, when I do so.

                        lopati: roaming wrote:

                        You see a drunk on the road do you pass within inches or wait till a nice big gap appears...

                        You think that is an argument for not having self driving cars?

                        • C CodeWraith

                          KennethKennedy wrote:

                          By the time this technology makes it to the mainstream, all the bugs will be sorted

                          Really? How will they do that? How do you unit test the AI? How do you prove that your AI can deal with any circumstances a very complex world throws at it? Look at how miserably we fail at testing normal code made up of simple, limited functions. From where do you take the optimism that this will miraculously work for something as complex as an AI?


                          jschell
                          wrote on last edited by
                          #87

                          CodeWraith wrote:

                          Really? How will they do that? How do you unit test the AI? How do you prove that your AI can deal with any circumstances a very complex world throws at it?

                          You must drive somewhere other than the country I live in. Nothing like watching, every single time there is a major storm, the videos of the many cars crashed off the side of the road because people failed at driving in it. Not to mention multiple-car pile-ups where people were going too fast for conditions. Then there are the accidents where someone hits the wrong pedal and ends up inside a building. Or actual clubs whose sole purpose is to race, actually race, down normal streets late at night. Hundreds of people show up at these meet-ups. Not to mention drunks, the high, the medicated (prescribed, by the way), drivers falling asleep, and a huge variety of other distractions. I once was on the highway and looked over to see a car with no driver. Turned out the driver was completely prone, reaching for something in the passenger seat.

                          CodeWraith wrote:

                          Look at how miserably we fail at testing normal code made up of simple, limited functions.

                          However, that proves the very point. You are claiming that human programmers are fallible. But so are human drivers. And the code IS tested. Are you claiming that every human driver is tested as extensively? Especially on an ongoing basis?

                          • S Stefan_Lang

                            I did also say at 38 mph. Typically a human moving at 38 mph through pretty much all of evolution was only seeing one thing, and that is the ground he was about to hit - not the kind of stuff that goes into the genes, except into the genes of the onlookers. If evolution taught us anything, it is that moving at 38 mph is fatal. Now, of course, if your forefathers were running through the jungle they certainly did learn to react to a creature moving into their path. But, depending on the number of claws and teeth (or raised clubs) of that creature, stopping might not have been the preferred type of reaction. I'm not saying that this is not an important bit of information when deciding that you need to slow down when something moves into your path, but it's also so much different from the evolutionary training that the lesson learned can be pretty much reduced to this: if something moves into your path, slow down. And that is trivial to learn for any autonomous system, no matter how small. In the case of the accident, this raises the question why the car's sensors did not detect the woman, or did not identify her as an actual obstacle. Apparently the driver didn't either, or at least not in time, and millions of years of evolution didn't help in any way there. But the car's systems should have been able both to detect the woman (using the LiDAR sensors) and to react to her (thanks to super-human reaction times). The investigation should focus on these questions.

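                            The "if something moves into your path, slow down" rule really is small. Here is a toy sketch of it as a time-to-collision threshold (purely illustrative: the threshold and speeds are my own assumptions, and a real AV stack fuses LiDAR/radar/camera tracks and does far more):

                            ```python
                            # Toy emergency-braking rule: brake when time-to-collision (TTC)
                            # falls below a threshold. Illustrative only; the 2 s threshold
                            # is an assumption, not any vendor's actual parameter.
                            def should_brake(obstacle_distance_m, closing_speed_ms, ttc_threshold_s=2.0):
                                """Return True when the obstacle would be hit within the TTC threshold."""
                                if closing_speed_ms <= 0:
                                    return False  # obstacle is not getting closer
                                time_to_collision = obstacle_distance_m / closing_speed_ms
                                return time_to_collision < ttc_threshold_s

                            # A pedestrian 25 m ahead, car closing at ~17 m/s (about 38 mph):
                            print(should_brake(25.0, 17.0))   # True  (~1.5 s to impact)
                            print(should_brake(80.0, 17.0))   # False (~4.7 s, still time)
                            ```

                            The point is not that this code is adequate, but that the decision "obstacle ahead, closing fast, hit the brakes" takes a handful of lines once the sensors deliver a track, which is why the absence of any braking attempt is so hard to explain.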

                            jschell
                            wrote on last edited by
                            #88

                            Stefan_Lang wrote:

                            I did also say at 38 mph. Typically a human moving at 38 mph through pretty much all of evolution was only seeing one thing, and that is the ground he was about to hit - not the kind of stuff going into the genes except into the genes of the onlookers. If evolution taught us anything it is that moving at 38 mph is fatal. Now, of course, if your forefathers were running through the jungle they certainly did learn to react to a creature moving into their path. But, depending on the number of claws and teeth (or raised clubs) of that creature, stopping might not have been the preferred type of reaction.

                            I would say your well-thought-out logic is interfering with an ill-thought-out rant.

                            • J jschell


                              CodeWraith
                              wrote on last edited by
                              #89

                              No. All I am saying is that you are making a deal with the devil. The good part is that the devil likes to honor agreements to the letter, but usually in a way you are not going to like at all. I have played with AI enough to tell you that exactly this is going to happen. It already happens in simple scenarios, and complex real-world scenarios just beg for this behavior. It's the very nature of any AI to explore the possibilities within the frame you have set with your directives. I wish you good luck when someone wants to hold you accountable for the actions of your product and you have to explain everything to a judge.


                              • C CodeWraith


                                jschell
                                wrote on last edited by
                                #90

                                CodeWraith wrote:

                                No. All I am saying is that you are making a deal with the devil. The good part is that the devil likes to honor agreements to the letter, but usually in a way you are not going to like at all.

                                I doubt that. For example, I would expect that a self-driving car would always stop at a red light. Now, I always attempt to stop at red lights. Always. Very occasionally that is a bad decision, because I end up sliding through the intersection on ice. And that is something I am very ill-equipped to deal with. I suspect a self-driving car would handle it better.

                                CodeWraith wrote:

                                I have played enough with AI to tell you that exactly this is going to happen

                                You mean versus my last three cars that were totaled by the illegal actions of other drivers? So the AI is not going to be obeying the traffic laws and would not be better capable of detecting and avoiding collisions?

                                CodeWraith wrote:

                                and you have to explain everything to a judge.

                                Versus the multiple drivers whose cars have already unexpectedly accelerated or refused to stop? Versus the drivers who are still driving with multiple DUI convictions? Versus the drivers whose licenses are suspended immediately by a judge and then who leave the court and get into their car and drive away?
