
The ethical dilemma of self-driving cars – Patrick Lin

View full lesson: http://ed.ted.com/lessons/the-ethical-dilemma-of-self-driving-cars-patrick-lin

Self-driving cars are already cruising the streets today. And while these cars will ultimately be safer and cleaner than their manual counterparts, they can’t avoid accidents altogether. How should a car be programmed when it encounters an unavoidable accident? Patrick Lin navigates the murky ethics of self-driving cars.

Lesson by Patrick Lin, animation by Yukai Du.


50 Comments

  1. The problem with this concept is that you do not consider that the self-driving car was not keeping a safe distance from the truck in front of it. Even as a human driver, you are expected to keep a safe distance, and extra distance should be added because it is a truck carrying an unsecured load. You can at least see that it is a high-risk load, since it consists of stacked items.
    Also, the example clearly shows unsafe driving by all participants: for one, you are not supposed to pass on the right side, and you should stay with the flow of traffic in one lane. So the human factor of bad driving is huge here.
    Now, you also forget that over time more cars will be autonomous, so more and more cars will keep a safe driving position, and this problem will eventually become irrelevant. You could also argue that the decision made by the "programmer" for the car is no different from the random choice you would make in the moment.
    Then there is the fact that an autonomous car making decisions would also assess new situations within fractions of a second. If it swerves and then detects another collision, it can quickly react to that too. In theory the car might actually swerve so quickly and accurately that it would not crash into or hit anything.

    I believe this thought experiment is just blatantly ignoring obvious factors and how differently humans and machines react, especially when it comes to reaction time; that alone makes the concept pointless (see the stopping-distance sketch below). In the end, you still have the human error: the fault lies with the person who did not secure the load.
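
A minimal sketch of the reaction-time arithmetic this comment leans on, assuming constant-deceleration braking; the speed, deceleration, and reaction times are illustrative assumptions, not measured values:

```python
# Stopping distance = distance covered during the reaction delay
# plus braking distance: d = v * t_react + v^2 / (2 * a).
def stopping_distance(speed_ms: float, reaction_s: float,
                      decel_ms2: float = 7.0) -> float:
    return speed_ms * reaction_s + speed_ms ** 2 / (2 * decel_ms2)

highway_speed = 30.0  # m/s, roughly 108 km/h

# A human driver needs on the order of 1.5 s to react; an autonomous
# system might respond in tens of milliseconds (assumed figure).
print(f"human:   {stopping_distance(highway_speed, 1.5):.1f} m")   # ~109.3 m
print(f"machine: {stopping_distance(highway_speed, 0.05):.1f} m")  # ~65.8 m
```

On these assumptions the machine needs roughly 40 m less road to stop, which is the commenter’s point: much of the dilemma evaporates if the car’s headway is sized to its own reaction time.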

  2. Accidents are just that: accidents. No matter the programming and the scenarios input into the car’s system, I find it hard to believe that it would be prepared to react to an accident.

    I honestly disagree with the whole self-driving car phenomenon. If you can’t drive, just get off the road.

  3. Your smart car should be a better driver than that and shouldn’t have been driving that close to the truck, thus never being in that situation. DUH

  4. This is a bad example. This situation should not exist: the self-driving car should not drive so close to the truck. If the car had been driving farther away from the truck, it would have had time to stop and no one would have been injured.

  5. If all cars are self-driving and communicate with each other, it wouldn’t matter which way you swerve, as the other car you were about to swerve into would detect this and compensate by slowing down or speeding up (see the sketch below).
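
A toy sketch of that cooperative-swerve idea. Every field and function here is invented for illustration; no real V2V standard (e.g. DSRC or C-V2X) is being modelled:

```python
from dataclasses import dataclass

@dataclass
class ManeuverIntent:
    sender_id: str     # car broadcasting the emergency maneuver
    action: str        # e.g. "swerve_left"
    target_lane: int   # lane the sender is about to enter

def respond(my_lane: int, intent: ManeuverIntent) -> str:
    """Neighbour's reaction: make room if the sender is heading into our lane."""
    if my_lane == intent.target_lane:
        return "decelerate_to_open_gap"
    return "hold_course"

# The car already in lane 1 yields; everyone else holds course.
intent = ManeuverIntent("car_42", "swerve_left", target_lane=1)
print(respond(my_lane=1, intent=intent))  # -> decelerate_to_open_gap
print(respond(my_lane=2, intent=intent))  # -> hold_course
```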

  6. I don’t understand the assumption that you are allowed to careen into another vehicle to avoid a sudden oncoming collision. As the operator of a vehicle, you are responsible for everything that you hit, whether out of a sudden reaction or not. If I can’t hit another vehicle to prevent a collision, then neither can my car. Autonomous cars should just brake, and any attempt to swerve should be at the discretion of the operator, who is legally required to be aware of the situation even if the car is autonomous. At least for now.

  7. Program enough space behind the truck. Done. This is why moral dilemmas don’t make sense if you’re a programmer.

  8. Even if all vehicles were controlled by computers (which may or may not be controlled by 1984’s Big Brother), don’t always count on a network of autonomous vehicles communicating together to react to an impending accident. There are always kids running after a ball in the middle of the road. There is always a pedestrian jaywalking. There is always some drunk douchebag racing in his car, believing its central computer is an excuse for reckless behaviour (and before you say the car should not allow his behaviour, think of this: a driver should always be capable of overriding the computer’s protocols and taking full control, at any time, anyway).

  9. If the option to stop is not available in self-driving cars, then I’m scared. I think that, although plausible, these ethical issues are overwhelmingly extrapolated, and it just seems like they’re trying to stop self-driving cars from being produced, not to contribute to the advancement of technology.

  10. go backwards

  11. Why are we trying to save lives? We all gotta go some day, right?

  12. I am so surprised by how many people in the comments section here are trying to "outwit" this dilemma. "Use the help of the other self-driving cars," or "why would it happen if the car itself is able to measure out a safe distance?" Your thinking is too specific; think in broad terms.
    This type of accident can happen at any time and in any place, despite the most advanced technology. What if a random person riding his skateboard rolled off onto the road because of an icy trail? Humans are not without error, and to get rid of error entirely you would have to get rid of humans. So the smartest thing is to tackle this problem through ethical questioning, not by pinpointing flaws in a specific scenario.

  13. Frankly, I see these so-called scenarios as a joke. We are placing moral judgments of right and wrong, good and bad, on the machine as the responsible party. HELLO, HELLO, has anyone ever heard of human drivers behind the wheel of a car? Some of us don’t need hazardous conditions for a life-and-death situation. Has anyone heard of ROAD RAGE? Are there humans who just love to tailgate other humans? Are there humans who, if you tick them off in traffic, will violate all kinds of laws chasing someone down the roadway? Has anyone ever heard of human drivers drunk behind the wheel of a car? It amazes me that humans are SUCH A HUGE RISK when it comes to driving, and yet we’ve got the nerve to say, "I wonder if self-driving cars will prove to be better drivers." Well, compared to many of us, they sure as heck can’t be any worse!

  14. What if we programmed the self-driving car to act like a human during an accident? A "panicked" reaction, chosen at random for each accident? (A literal sketch of this follows below.)
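
Taking the proposal literally: when every remaining option is judged unavoidable, pick one uniformly at random, so no target is favoured by design. A minimal sketch; the maneuver names are made up:

```python
import random

def panicked_reaction(options: list[str]) -> str:
    """Pick one emergency maneuver uniformly at random, like a startled human."""
    return random.choice(options)

print(panicked_reaction(["brake_straight", "swerve_left", "swerve_right"]))
```

Whether a deliberately random policy is more defensible than a deterministic, targeted one is exactly the kind of question the lesson raises.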

  15. If we had the technology to build self-driving cars, don’t you think they would be able to communicate? That may sound absurd, but many devices already do it, thereby eradicating this problem.

  16. Wow, I never considered this. Pretty deep… although I believe autonomous vehicles are programmed to keep a certain following distance.

  17. The ethical thing to do would be to allow your vehicle to be hit. Why does someone else have to suffer for your bad luck? This situation happened to you.

  18. Or program the car so that it avoids getting boxed in, which is exactly how everyone should be driving.

  19. What about self-driving trucks, to prevent this problem in the first place?

  20. Great points, but it misses an enormous one: the programmers designing the software will be no better than the ones we have today, which means it will be sloppy, inadequately tested, and loaded with bugs. Chances are this scenario would not happen, because the happy couple would have entered the vehicle only to find that the car had updated itself over wifi and now the engine won’t start. Remember the 90s, when everyone said it was OK for Windows to be as buggy as it was because Microsoft didn’t control the hardware (and then years later they DID control the hardware behind the Xbox 360, one of the most unreliable pieces of tech ever released)? Well, unless Google and Apple become the world’s only automakers, software companies will be designing software for Dodge, Chevy, Ford, Lexus, Toyota, Hyundai, Honda, Kia, etc. This means we’re going to have a world where the autonomous-car software companies are not in charge of the hardware, and the industry has already said this situation makes it acceptable for their products to be loaded with bugs! I’m sorry, but the talent just is not there in this industry to expect these cars to be safe. Case in point: autocorrect. Autocorrect tends to make more errors than it fixes, yet these same developers are to be trusted with the far more complicated task of safe driving? We’ve become numb to restarting software and fixing the problems caused by bad code, but you can’t redo your deadly ride to work.

  21. Why not just let the car stop?

  22. Don’t hit anyone; instead, stop pressing the gas pedal so hard, freeway person.

  23. Interesting questions, but they all just boil down to utilitarian ethics. And since self-driving cars can potentially save tens of thousands of lives every year, these cases might be seen as unnecessary handwringing. I don’t know. It’s just pretty obvious to me that the response mechanisms should favor utilitarian solutions over self-preserving ones, and that that’s the only way to make the algorithms ethical (a sketch of such a rule follows below). But this is a principle that people who drive for themselves often don’t respond to on any level, let alone on the level we are talking about in these cases.
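
One way to make the utilitarian rule in this comment concrete: score each available maneuver by total expected harm to everyone involved, with the car’s own occupants given no extra weight, and pick the minimum. The harm numbers below are invented placeholders, not real risk estimates:

```python
def least_total_harm(options: dict[str, dict[str, float]]) -> str:
    """Return the maneuver whose summed expected harm is smallest."""
    return min(options, key=lambda name: sum(options[name].values()))

# Expected-harm scores per maneuver (made-up values for illustration).
options = {
    "brake_straight": {"occupants": 0.6, "others": 0.0},
    "swerve_left":    {"occupants": 0.2, "others": 0.5},
    "swerve_right":   {"occupants": 0.1, "others": 0.8},
}

print(least_total_harm(options))  # -> brake_straight (total harm 0.6)
```

Note that a self-preserving policy would instead pick "swerve_right" (only 0.1 harm to the occupants); that divergence is precisely what the comment is pointing at.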

  24. I think that in the future, as more and more vehicles become "self-driving," they will be connected together, so a vehicle avoiding a deadly accident (a kid on the road, for example) would tell the cars next to it as well, and they would maneuver together so that no one gets harmed, or the damage and danger is minimal.

  25. So THAT’s why cartoon vehicles have ejector seats.

  26. Fascinating thought experiment… and a bit scary at the same time.

  27. Just take the hit and don’t swerve, but let’s also work hard to make sure that cars can take these hard hits and still save the passengers’ lives. It’s a win/win.

  28. One problem is the transition period in between: at the start, self-driving cars will be more dangerous, until most people get one and it’s considered "dangerous" not to own one :/

  29. Suggestion: a rake-like spike would shoot down from the back, driven by highly pressurised nitrogen, penetrating hardened concrete to stop the car.

  30. What will happen is an increase in media coverage of tragic accidents caused by "distracted driving," cars allegedly being used as weapons, and so on. This will allow for legislation approving only autonomous vehicles on the road. Those who disagree, or who cannot afford such new vehicles, will use mass transit or newly created bike lanes (which have been well funded for exactly this result).

  31. How about crash into someone who is black? Even if there is no need to hahahaha!

  32. This video is very interesting; however, a self-driving vehicle shouldn’t find itself in the situation mentioned in the video, because in that case it wouldn’t have enough space to stop, meaning it didn’t keep the right distance from the vehicle ahead.

  33. I wonder: what if there were other self-driving cars around the initial one? In that case a distress signal could be sent, and the other self-driving car(s) could provide an escape or a more optimal route than the ones described.

  34. Motorcycling is dangerous; it’s their fault for riding one.

  35. I think it should come as part of the "deal" when buying a self-driving car that the buyer assumes responsibility and risk, even if that comes at the cost of one’s own life. If mandated by law, this could fundamentally shift the moral paradigm of our current society, which currently favors one’s self over others. Sure, it would discourage buyers, but it could also serve to drive demand, and therefore prices, down at first. Eventually, once self-driving cars become ubiquitous, this would practically become a non-issue, as accidents become less and less common.

  36. Strongly related: http://moralmachine.mit.edu/

  37. The government should mandate that self-driving cars cause the least harm to others. If you choose to drive a self-driving car, you have to pay for the potential consequences.

  38. I have the perfect solution!! Let’s carefully secure our heavy objects!!!

  39. There could be an underground programming ring that takes your self-driving car and adjusts it to prioritize you.

  40. The situations you talk about here are pretty rare. As Veritasium said, the millions of deaths per year caused by car accidents could be significantly reduced. The real moral dilemma is not the rarest cases stated here but how soon we should get the cars out there.

  41. Simple: make it a random function.

  42. Self-driving cars are clearly the future now. The only question is how manufacturers and governments will make them safer and better able to avoid accident situations.

  43. Just wasted precious time of my life.

  44. Your hypothetical situation is easily avoided by actual safe driving, which the autonomous car will do. You are, by law, supposed to remain a safe distance behind any vehicle in front of you. If you are in a self-driving car that is too close to stop, then you are in a self-driving car that is not following the law. One has to question why we should consider such foolish hypothetical situations that necessarily require the car to do what the car won’t do.
    Furthermore, you are not accounting for reaction time in this safe distance. Why is it that people always feel compelled to force dichotomies (or trichotomies, in this case) on situations? You don’t think that a self-driving car that can react in a thousandth of the time of a human operator would be able to hit the brakes, begin to slow down (given that it must, by law, remain a safe distance behind the truck in front), and then turn into another lane, weaving behind or around the SUV or motorcycle?

    The problem with hypothetical problems is that we can continue to make hypothetical problems about a fraction of a percent of the outcomes… not to mention that we retain the ability to deal with the ethical complications down the road. Here, it is illegal to operate a motorcycle without a helmet, and the laws will keep tightening in order to promote safety.

    The more we sit around and ponder pointless moral dilemmas about whether a computer taking an action that results in collateral damage is somehow semantically different from a human doing the same thing, the more people die on the streets from preventable accidents that self-driving cars would have eliminated. Rational thought is only rational when we don’t let our fear of change raise pointless moral dilemmas that produce the more irrational outcome: letting people die at higher rates because we are threatened by change and innovation, and justifying it by arguing the semantics of whether it is a decision or a reaction on the computer’s part.

    The real moral dilemma with the future of self-driving cars is this: if the safest overall outcome the car chooses involves subjecting the passengers to injury instead of avoiding it and causing harm to someone else, will that affect sales compared with normal cars? Will people buy cars that may put their lives in danger for the overall safety of humanity as a whole, reducing accidental deaths on the road? Or will people opt to keep driving themselves if it means not having to take that personal risk? It’s one thing to apply reason and logic to yourself and say, yes, I am willing to increase my risk factor ever so slightly in order to protect society as a whole, but am I willing to do so for my children? Fundamentally, that goes against our most basic biology. And if this is the case, will automakers produce cars that deflect the threat of injury from the driver in order to improve sales, and what will the legality be around that?
    Still, this is a hypothetical question, and one that needn’t hold back launching these vehicles on the roads en masse, since, like all innovations and entrepreneurial ventures, taking action first is the best course for success in the industry and for capital growth. The issues raised, and the changes needed, can be addressed on the journey to implementing these cars. None of it is an excuse to let our irrational fear of change hold us back from taking action.

  45. Just keep a safe distance and speed in the first place.

  46. Just have the car brake. It doesn’t need to hit anyone. Drive at your own risk. Makes sense to me.

  47. The answer is quite simple: program the bikes to self-drive and to detect upcoming collisions. Duh. And, as others have already mentioned, connect them to a network.

  48. How about hitting the brakes?

  49. This makes a very big assumption that swerving is safer. There are so many unpredictable variables. What if the SUV is forced into dirt or a ditch and it flips, killing the family? What if either of the cars swerves back into traffic and causes a pileup with fatalities? What if the motorcycle flips into traffic, causing multiple accidents with possible fatalities? The safest option is to brake and hope that the part of the car made more effective at taking impacts than any other part does its job and keeps you alive.
    There isn’t even a question here.

  50. That’s why self-driving cars need collision alerts and manual override.
