
The Real Moral Dilemma of Self-Driving Cars

We talk about all the potentially challenging situations autonomous cars could get into, but rarely about how bad human drivers are. Tens of thousands of people die on the roads every year in collisions, most of which could be prevented by autonomous vehicles.
Sponsored by BMW

I had wanted to make a video about autonomous cars for some time but hadn't had the opportunity. Self-driving technology is already at a state where it could save lives if only it were more widely implemented.

Links to original clips:
TED-Ed: https://www.youtube.com/watch?v=ixIoDYVfKA0
BBC Newsnight: https://www.youtube.com/watch?v=FypPSJfCRFk&t=172s

Music from http://www.epidemicsound.com “Ambient Electronic Groove,” “Pet Animals 2,” “The Long Ride.”
Filmed by Raquel Nuno
Edited by Trevor Carlee


50 Comments

  1. The answer to the moral "dilemma" is to not swerve and endanger any vehicle other than your own.

  2. Featuring a tiny glimpse of who films these vids for Derek (2:43).

  3. The moral dilemmas are not the problem so much as the fact that if you hard-code solutions into car software, people can then exploit those scenarios reliably to murder people. E.g., if I want to kill the driver of car [A], I just have to wait at a point I know they routinely pass and then roll an empty pram into the path of oncoming car [B] such that it swerves into the path of car [A] rather than kill a baby that does not have the protection of a car around it. Do it on a bridge or near a steep drop-off and the destruction is potentially even greater. It is the predictability of the autonomous cars that will be the problem: a very practical exploit, not a question of philosophy.

  4. I can't see myself ever being comfortable with this self-driving car crap. I don't think it'll solve anything in the long run. Are these vehicles going to be impervious to malfunctions? Just what we need, cars going haywire with no way to stop them! Or worse… people hacking into these cars and driving them from their couch. People love to have the latest and greatest and are also lazy, so these things will sell no matter what is said… and that's what it all comes down to: money.

  5. I personally don't think the tech is ready for self-driving cars. I think we should focus on improving current designs and maybe computer-assisted driving.

  6. The correct answer is always SLAM ON THE BRAKES!!!! That's why all the accidents involving self-driving cars have the self-driving car being rear-ended.

  7. 0:43 huh… they should really make a smartphone with haptic feedback… kind of like Nintendo's HD Rumble… so that smartphones can have good gamepads… and then we should make a new store for apps because the Google Play Store is nothing but shovelware…

  8. I’m skeptical that self-driving cars will be intrinsically safer than people-driven cars. I personally wouldn’t put my life in the hands of a self-driving car. Even if they are safer, they will take the fun out of driving. 🙁 But of course, if they really are safer, it’s better to go with the boring option if it saves thousands of lives.

  9. The biggest reason I want a self-driving car…
    So I can get wasted but take my car home XD

  10. Great point about the elevators. I wonder how manufacturers can make self-driving cars appealing to human psychology at first, too.

  11. I wonder how self-driving cars would road-rage after they somehow crashed into one another.

    Would they just shoot web slang at each other, like STFU or GKY, while us humans watched?

    Imagine two cars fighting, like a Prius vs. a Tesla Model X. It'd be highway gold lol.

  12. Wow. I just noticed Veritasium has an atomic mass of 42.0.

    420?

  13. Google's is the only viable technology, despite not having millions of blue money to put into it. The answer is: if it's not Google, kiss the ground when you get out. Letting that BMW drive is equivalent to trusting a blind man with a great memory.

  14. I mean, you need to make sure the cars are safe before you put them on the road, moral dilemmas aside.

  15. I am disappointed in this video. Very superficial. At least you can read in the description that this is pure advertisement for BMW and autonomous cars. Sad.

  16. I will trust my own brain over a machine, thank you.

  17. I always thought the moral dilemma was the CIA’s ability to control these kinds of cars.

  18. Why even own a car? Cars should be a lot smaller on average and owned by private companies or the government or whatever, and you could just order one with an app. It would pick you up, drop you off, and go meet the next person's needs.
    Like a self-driving Uber, basically. Is that a thing yet?
    Also, drones could replace delivery in many forms with the right infrastructure, which would take more vehicles off the roads.

  19. If everyone had a self-driving car, then the dilemma would not exist. No programmed car will make a mistake.

  20. Hey, it's cruising along in Vegas!

  21. So, one day, we may not need to worry about the problems of auto driving. If everyone had an auto car, we could put them on the same network, and then they could tell each other "OBSTACLE! CAR ON LEFT? YES, CAR ON LEFT, BRAKE." Or something similar… Just an interesting idea…
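
    (A minimal Python sketch of that shared-network idea; the Message fields, Car class, and broadcast loop are all invented purely for illustration, not how any real car network works:)

      from dataclasses import dataclass

      @dataclass
      class Message:
          sender_id: int
          obstacle_lane: str  # e.g. "left", "right" -- invented fields

      class Car:
          def __init__(self, car_id, lane):
              self.car_id = car_id
              self.lane = lane
              self.braking = False

          def broadcast_obstacle(self, network, obstacle_lane):
              # Tell every other car on the network what this car just saw.
              for car in network:
                  if car is not self:
                      car.receive(Message(self.car_id, obstacle_lane))

          def receive(self, msg):
              # Brake only if the reported obstacle is in this car's own lane.
              if msg.obstacle_lane == self.lane:
                  self.braking = True

      network = [Car(1, "left"), Car(2, "right")]
      network[0].broadcast_obstacle(network, "right")  # "OBSTACLE! ... BRAKE."
      print(network[1].braking)  # True: the right-lane car brakes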

  22. The real moral dilemma is whether the car should save you or the pedestrians outside. Like: the car either swerves into the crowd to save you, or it sacrifices you rather than run over the pedestrians.

  23. Remember the BlackBerry days when your device froze and required a battery pull? What happens when the car's computer freezes?

  24. The problem is fairness. I completely agree that fewer crashes are a good thing, and I also agree that automation is the way to do that. It simply "FEELS" unfair if I end up in an accident because of a machine’s decision. If I crash, it’s my fault, and I can deal with that. If something crashes me, it seems somehow unfair.

  25. I guess the main thing is there are too many variables. The examples you gave, Veritasium, of AI in planes and elevators are much like cruise control in cars. It controls a limited number of variables and can't really go wrong (for the cruise-control example, that's why the driver is told to keep paying attention to the road). However, with complete AI control, there are way too many variables, possibilities, and unknown circumstances to be completely covered by an AI in a way that ensures human safety.

  26. I like how half of these comments are science-based, with open disagreement and discussion of both pros and cons of the idea, even though I'm totally against the idea of cars with no human control.

  27. Here’s a moral dilemma for you: how can you procreate knowing full well that your offspring is going to die and had no choice of whether or not he/she wanted to exist in the first place?

  28. A quick question: did you rewrite the script at all, or just go with what BMW gave you? Watching your other videos, there is a noticeable difference. And now I feel sad.

  29. 2:11 "We’re using our phones"

    What do you mean, "we"? That super cool douchebag there is using his phone.

  30. To me, I think if all the cars were self-driving, with the driving AI under the same code, there wouldn't be much of a problem. If you think about it, if all the cars were following one rule, that would be safer.

  31. The correct choice is that the car should make whichever choice is most likely to protect its occupants. I would never buy a car that is programmed to kill me. Nor would I hire a bodyguard who doesn't put me as his priority. Most people agree with this sentiment.

    If you can't convince the majority of people to buy autonomous cars, there *won't be* autonomous cars.
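
    (A tiny Python sketch of that occupant-first rule; the maneuvers and survival numbers are made up just to show the shape of the policy:)

      # Invented options and probabilities -- purely illustrative.
      maneuvers = {
          "brake_straight": {"occupant_survival": 0.90, "others_survival": 0.95},
          "swerve_left":    {"occupant_survival": 0.60, "others_survival": 0.99},
          "swerve_right":   {"occupant_survival": 0.75, "others_survival": 0.85},
      }

      def choose_maneuver(options):
          # Occupant-first policy: pick whatever is most likely to keep
          # the people inside the car alive, ignoring everyone outside.
          return max(options, key=lambda m: options[m]["occupant_survival"])

      print(choose_maneuver(maneuvers))  # -> brake_straight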

  32. To add my two cents to the discussion here, I see a different scary problem than ethics or assigning guilt in cases of accidents… I see a problem with hacking.
    I presume the cars won't be completely separated from the global net. And if the software's architecture isn't 100% safe from breach, this would be the first time ever that someone could "hack" into a controllable lethal weapon on a large scale. I mean, pilotless war drones are probably not being remotely controlled by criminals, but there are very few of those; they are probably really well made and are controlled by the military. This stuff will one day be made in China for a thousand dollars… what level of security will those cars have?

  33. Soo… electric cars may be "almost" on the road by the time global warming is bad? Because Arkansas will have a beach in the middle of the state…

  34. This video explains the problem in way too simplistic a manner. Treating the moral dilemma purely with "quantifiable" arguments (number of deaths, accidents, etc.) does not bring out the real moral dilemma: the fact that we lose our ability to choose. Imagine if, instead of your mentioned scenarios, the driver had to die to save an entire school bus. Should the driver do so? What if it was just one family car? We must not let algorithms take such decisions, no matter the death toll. We may reduce the number of people dying on the streets, but in doing so we are actually diminishing our right to self-determination and what constitutes our humanity.

  35. Should have also talked about the Horsey Horseless. Seems like a good automotive example of people not being comfortable with change.

  36. Hurry the heck up and mainstream self-driving cars so I can sleep during the 20-minute drive to work. I want tinted windows as well so I can play with mys… I mean, pick boogers without anyone noticing.

  37. Why is braking never an option in these moral dilemma problems?

  38. Veritasium had a good segment on probability; my observation should have been on his mind when he made the comment that planes on autopilot are safer. This is a poor example. Pilots put planes on autopilot when the probability of things going wrong is lowest. Planes crash on takeoff and landing. Rarely do they crash in between, when pilots have them on autopilot.

  39. Why don’t we all use bikes?

  40. Or… we could just bring back trains…

  41. How many fatal airplane crashes killing hundreds of people at once have occurred because people were relying on a bad autopilot?

  42. The real answer to the posed dilemma is that the car will attempt to swerve towards either the SUV or the motorcycle and will communicate with whichever vehicle so that vehicle also moves, and so on if there is another vehicle beside that one, etc. There is no reason to assume the cars will not be aware of each other and talking to each other. Essentially all accidents will be avoided because the cars will respond to each other in such a way as to avoid collisions altogether.
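
    (A toy Python sketch of that chain of move-over requests; the vehicles, lanes, and the assumption of open road past the last vehicle are all invented:)

      def request_shift(cars, index):
          # Ask the car at `index` to move over one lane. It first asks the
          # vehicle beside it to do the same, so the request ripples outward.
          if index >= len(cars):
              return True  # assumed open road beyond the last vehicle
          if request_shift(cars, index + 1):
              cars[index]["lane"] += 1
              return True
          return False

      row = [{"id": "SUV", "lane": 2}, {"id": "motorcycle", "lane": 3}]
      request_shift(row, 0)
      print(row)  # both vehicles shifted one lane over to make room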

  43. +Veritasium there was a case of a death caused by a self-driving car. The driver was killed. No thanks from me.

  44. People don’t care about being safe, people care about feeling safe.

    Humans are morons by nature

  45. In a different video, you talk about thinking and Drew’s inability to hold too many pieces of information. As a middle school teacher, I’ve had students ask if they can use a calculator, and I always say no, because we know that young children need to learn calculations so well that they become automatic. Every year, to prove my point, I would challenge a student to come up with a multiple-digit multiplication problem. Another student and I would race; he/she would use a calculator and I would do it in my head. I almost always won. But I also practice. I know; I’m a geek. My point here is that if we allow drivers, especially young ones but really any age, to hand their responsibility over to the vehicle, then those people will lose the ability to analyze road conditions and make smart choices. I predict that we’ll see this first with lane departure warnings. People will get used to hearing a beep, until it becomes so much a part of the background that they don’t hear it anymore. So yes, for a while you’ll see fewer accidents, but I think in the long run it will backfire. Someone needs to put cell phone jammers in cars so that as long as the tires are moving, the cell phone won’t work. I think that will do a lot more to reduce accidents than an autonomous car.

  46. "Filling the roads with autonomous cars will eliminate jobs" – False. Who will program new algorithms,software and protocols for these cars? Programmers. Who will repair these cars? Mechanics, electrical engineers, IT. Who will harvest the materials for these vehicles? Miners, heavy machine/vehicle operators, managers etc. Who will build the factories to produce these vehicles and the robots which assemble the parts that humans don’t assemble? Construction workers. The list goes on and on. People said the same thing about computers and calculators when they came out. "What will all the people with jobs in mathematics and engineering do with all these computers eliminating their jobs?" They find other things to do. There is always something. I have the theory of the conservation of jobs, jobs can’t be created or destroyed, they can only change form or be moved. 😛

  47. The real moral dilhema is:
    We can accept that humans make mistakes because we know we make mistakes, so we would accept a person chosing either the car to the left or bike to the right & the decision time prevents rationale to be considered in a split sec.
    But we would not see this in a machine that decided the same course each time resulting in bike riders having a phobia about passing or riding next to an automated car etc.

    The blame will be placed on the technology itself (not the driver) & then the software writers, & they will be morally judged differently to a human driver forced to make a split decision because the software writers chose the bike rider to die.
    This will feel like pre-meditated homocide.

  48. So the solution is to build things for idiots and morons.

  49. The way I see it (this is how *I* see it, so, y’know, opinions and such) is that the problem with driverless cars is that their "brains" are too… weak. Much slower processing, much worse memory, much less intuition, much less adaptability than a human brain. But the problem with drivers is that their brains are too strong. I tried to fit this in four lines, but sadly…

    The human brain is super powerful, so even something as complex as driving is so simple the mind can wander and do something else and have a fair chance of not crashing, and if the driver is concentrating completely on driving, even in bad conditions, you (the driver) can trust yourself. So the mind wanders because driving is too easy, and so people get distracted and crash.

    I guess what I’m trying to say is that each system has positives and negatives. Computers are laughably, astoundingly worse than a human mind and might not be capable of handling sudden icy roads or potholes or whiteouts in bad weather, and might not be able to make the "right call" if something horrible happens; but on the other side, driving computers can’t get drunk, distracted, angry, or sleepy.

    If you give a car a perfectly flat road with no cars close to it on a sunny day, the car is going to be better than the person. But if something goes wrong and some quick thinking or intuition is required, I’d rather be the one driving. So weigh the odds: one situation happens more often, and the other is catastrophic if you get it wrong.

  50. Yeah, like who would program a car specifically to be racist? It doesn’t have time to calculate that stuff. In this low-chance scenario it doesn’t really matter in the long run, because it’s not going to happen very often, and if it does, why can’t the driver take control?
