|
http://www.technologyreview.com/view/542626/why-self-driving-cars-must-be-programmed-to-kill/ Actually interesting article about the dilemma of a car making moral decisions in unanticipated scenarios (i.e. how does the car react to minimize loss of life, and whose lives have priority)
|
# ? Oct 24, 2015 07:10 |
|
|
# ? May 25, 2024 14:41 |
|
flosofl posted:
Asimov bursts forth from his grave, screaming "THE THREE LAWS OF ROBOTICS ARE AN INHERENTLY FLAWED CONCEPT, AS EXEMPLIFIED IN THE STORIES IN WHICH THEY APPEAR!"
|
# ? Oct 24, 2015 07:29 |
|
In general self-driving cars will deal with these dilemmas by not speeding, not following too closely, having super fast reaction times, and so forth, which means they don't get into them in the first place. They simply apply the brakes and don't hit anyone.
|
# ? Oct 24, 2015 07:40 |
|
First Law of Self-Driving Cars: Ram everything. Second Law of Self-Driving Cars: Ram what hasn't been rammed. Third Law of Self-Driving Cars: Smooth driving.
|
# ? Oct 24, 2015 07:43 |
|
flosofl posted:
I don't think it's that interesting. Don't swerve. Don't guess and trade off lives, just stop.
|
# ? Oct 24, 2015 07:48 |
|
Everyone who has ever crashed a car should have really just stopped, instead of crashing imo
|
# ? Oct 24, 2015 08:32 |
|
Bogan Krkic posted:Everyone who has ever crashed a car should have really just stopped, instead of crashing imo A computer specifically built to safely handle the exact situation of a crash will have a hell of a better reaction time than even an alert, sober human, let alone a potentially drunk and/or fatigued one as is the case for at least one party in many crashes. I mean you're almost definitely trolling but it can't hurt to make that doubly clear for anyone else.
|
# ? Oct 24, 2015 08:37 |
|
The Honda S9000 is the most reliable self-driving car ever made. No S9000 self-driving car has ever made a mistake or broken a road rule. They are all, by any practical definition of the words, foolproof and incapable of error.
|
# ? Oct 24, 2015 08:52 |
|
The biggest thing I fear with self-driving cars is everyone not in them: cyclists, pedestrians, other road vehicles, all quite unpredictable, and all with slower reactions.
|
# ? Oct 24, 2015 11:31 |
|
Snapchat A Titty posted:First Law of Self-driving Cars: Ram everything No no no. The three laws of robotomobiles are: 1.) Find a Farmers Market 2.) Run over several fat women 3.) Play recording of "I thought I was hitting the brake." when police arrive
|
# ? Oct 24, 2015 12:01 |
|
Hidden Directive: Lock Will Smith inside you, and keep him there. It feels good.
|
# ? Oct 24, 2015 13:39 |
|
Felix_Cat posted:In general self-driving cars will deal with these dilemmas by not speeding, not following too closely, having super fast reaction times, and so forth, which means they don't get into them in the first place. They simply apply the brakes and don't hit anyone. If my self-driving car isn't going to have Super Pursuit Mode (Late for Work Mode), I'm not interested. https://www.youtube.com/watch?v=WyYB-E0rnw4
|
# ? Oct 24, 2015 16:34 |
|
Bogan Krkic posted:The Honda S9000 is the most reliable self-driving car ever made. No S9000 self-driving car has ever made a mistake or broken a road rule. They are all, by any practical definition of the words, foolproof and incapable of error. Seriously, it's like people don't remember the time we solved ship deaths with that unsinkable ocean liner
|
# ? Oct 24, 2015 16:40 |
blugu64 posted:Seriously, it's like people don't remember the time we solved ship deaths with that unsinkable ocean liner Self driving cars don't need to be 100% perfect all the time always to be orders of magnitude safer than human drivers. Seriously, human drivers are loving awful. It's not going to be all that difficult to make robocars make far fewer fatal fuckups per mile driven than human drivers.
|
# ? Oct 24, 2015 19:07 |
|
Theris posted:Self driving cars don't need to be 100% perfect all the time always to be orders of magnitude safer than human drivers. Seriously, human drivers are loving awful. It's not going to be all that difficult to make robocars make far fewer fatal fuckups per mile driven than human drivers. The point was that there were people arguing that self-driving cars would be able to completely eliminate fatalities. "All they need to do is brake."
|
# ? Oct 24, 2015 19:52 |
|
Remember that one time a Tesla caught fire? Absolute proof electric cars are death traps.
|
# ? Oct 24, 2015 20:10 |
|
Theris posted:It's not going to be all that difficult to make robocars make far fewer fatal fuckups per mile driven than human drivers. the hubris of man
|
# ? Oct 24, 2015 20:11 |
|
Nomad175 posted:The point was that there were people arguing that self-driving cars would be able to completely eliminate fatalities. Imagine all the world's cars networking and deciding to lock their brakes permanently. Crashes are solved!
|
# ? Oct 24, 2015 20:20 |
|
Tiberius Thyben posted:Imagine all the world's cars networking and deciding to lock their brakes permanently. Crashes are solved! Susan Calvin glares at the car. "If you lock your brakes, you'll prevent speeding, thus saving a human life. But if you lock your brakes, the human cannot go to work and will lose his job, thus hurting a human life!"
|
# ? Oct 24, 2015 20:28 |
|
Bogan Krkic posted:The Honda S9000 is the most reliable self-driving car ever made. No S9000 self-driving car has ever made a mistake or broken a road rule. They are all, by any practical definition of the words, foolproof and incapable of error. S9000, open the passenger doors.
|
# ? Oct 24, 2015 20:35 |
|
Screaming Idiot posted:Susan Calvin glares at the car. "If you lock your brakes, you'll prevent speeding, thus saving a human life. But if you lock your brakes, the human cannot go to work and will lose his job, thus hurting a human life!" Thank you for this.
|
# ? Oct 24, 2015 21:05 |
|
Bogan Krkic posted:The Honda S9000 is the most reliable self-driving car ever made. No S9000 self-driving car has ever made a mistake or broken a road rule. They are all, by any practical definition of the words, foolproof and incapable of error. Honda being an obvious switch for IPO EB, and in no universe would Electronics Boutique get an IPO. Much like IBM and Pan Am (or whatever airline it was in 2001) aren't going to be part of interstellar flight.
|
# ? Oct 25, 2015 00:09 |
|
Nomad175 posted:The point was that there were people arguing that self-driving cars would be able to completely eliminate fatalities. No one argued that.
|
# ? Oct 25, 2015 00:23 |
|
|
syscall girl posted:Honda being an obvious switch for IPO EB, and in no universe would Electronics Boutique get an IPO. Actually it was a joke about the HAL9000 and the Honda S2000 having kind of similar names Felix_Cat posted:No one argued that. Felix_Cat posted:They simply apply the brakes and don't hit anyone. Dylan16807 posted:Don't swerve. Don't guess and trade off lives, just stop.
# ? Oct 25, 2015 03:43 |
|
Bogan Krkic posted:ummmm
|
# ? Oct 25, 2015 05:37 |
|
Tiggum posted:They weren't saying it would be 100% effective in every case though, they were saying that you don't need to program the car to choose whose life is more important or whatever, you just have it do the thing that is statistically most likely to avoid deaths/injuries. The article being discussed was about the moral dilemma of creating intelligent cars because they might have to choose whether to swerve left and hit a child or swerve right and hit the mother, but in reality it would do neither of those things because that's dumb. But in order to have the car do the thing that is statistically most likely to avoid deaths/injuries, you do indeed need to program it to choose whose life is more important in a situation where, through error, external circumstances or other reasons, it could cause death or injury. Say for example, a self-driving car is moving at pace on a country road, and suddenly, in a freak piece of poor timing, a tree falls directly in front of the car. The car can swerve to avoid the tree, saving the lives of the 4 passengers, but the only direction it can go to avoid the tree is onto the sidewalk, striking a single pedestrian. Is that not a situation where the car needs to determine whose life is more important?
|
# ? Oct 25, 2015 06:03 |
|
The car doesn't need to be able to deal optimally with every single possible scenario. Even in the edge cases like the one you describe I would imagine braking (preceded by careful driving) is often going to be one of the best things you can do (as opposed to swerving off the road and maybe tumbling over a few times). But if it turns out that self-driving cars perform worse than humans in certain situations that's completely fine, because they'll perform better the other 99 out of 100 times. So yes you can conceive of situations where self-driving cars might do worse than humans because they don't know how to perform snap ethical judgments like we do. But these are very much edge cases, and it's not like humans are particularly good at them in the first place.
|
# ? Oct 25, 2015 06:18 |
|
Sure, but the moral dilemma lies in the fact that people want to create self-driving cars and give them the agency to decide who lives and dies, which seems like a bad plan
|
# ? Oct 25, 2015 06:32 |
It isn't any worse of a plan than granting that agency to people who in no way treat it with the gravity it deserves. If everyone gave their full attention to driving, understood basic car control, were cognizant of their surroundings, and were courteous to other drivers, then your moral dilemma would be a genuine one. But they don't, and aren't, and 40,000 deaths occur in the US alone every year as a result, so it isn't. Edit: vvvv This is worth pointing out. As silly as "just brake, problem solved" seems on the surface, you need an obstacle to instantly appear less than 60m or so in front of a car moving 60mph before "just brake" can't avoid a collision completely. Less than that, and "just brake" still more likely than not turns the collision non-fatal. Any self driving system worth certifying is going to be watching its surroundings and will see the tree starting to fall or the pedestrian moving towards the street and will take appropriate action well before the "OH GOD WHO HAS TO DIE?" situation unfolds. Your scenario really has to have something like a hidden bollard that can suddenly pop up in the middle of the lane.
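For what it's worth, the ~60m figure is consistent with basic stopping-distance kinematics. A minimal sketch, assuming an illustrative 0.5 s detection-to-brake latency and 0.8 g of deceleration on dry pavement; neither number comes from any real vehicle:

```python
G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance_m(speed_mph, reaction_s=0.5, decel_g=0.8):
    """Distance covered from detecting an obstacle to a full stop."""
    v = speed_mph * 0.44704               # mph -> m/s
    reaction = v * reaction_s             # distance covered before brakes bite
    braking = v ** 2 / (2 * decel_g * G)  # from v^2 = 2 * a * d
    return reaction + braking

print(round(stopping_distance_m(60), 1))  # roughly 59 m with these assumptions
```

So under these assumptions an obstacle appearing much closer than about 60m at 60mph means "just brake" can no longer avoid contact entirely, although every extra metre of braking still sheds impact energy.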
|
# ? Oct 25, 2015 07:15 |
|
Bogan Krkic posted:Say for example, a self-driving car is moving at pace on a country road, and suddenly, in a freak piece of poor timing, a tree falls directly in front of the car. The car can swerve to avoid the tree, saving the lives of the 4 passengers, but the only direction it can go to avoid the tree is onto the sidewalk, striking a single pedestrian. Is that not a situation where the car needs to determine whose life is more important? I have yet to read a realistic example where an autonomous car has to "decide" between life and death. If the car has time to turn 90° to avoid the tree, I assume the speed is low enough that the airbags will protect the passengers if it just brakes and then crashes into the tree. Look at that poo poo from 6:40 https://www.youtube.com/watch?v=YXylqtEQ0tk Seriously, what do people think the car is "seeing"? There's a bit more technology than a Wii-sensor mounted to the front bumper.
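The swerve-versus-brake intuition can be put in rough numbers with the same kinematics: compare the gap a swerve needs to clear one lane width against the speed left after braking over that same gap. A hedged sketch, assuming an illustrative 3.5 m lane width, 0.7 g of lateral grip, and 0.8 g of braking; none of these values describe a real car:

```python
import math

G = 9.81  # m/s^2

def min_gap_to_swerve(v, lane_m=3.5, lat_g=0.7):
    """Distance travelled while shifting one lane width sideways,
    using d = 0.5 * a * t^2 solved for t."""
    t = math.sqrt(2 * lane_m / (lat_g * G))
    return v * t

def impact_speed_after_braking(v, gap_m, decel_g=0.8):
    """Speed left on reaching an obstacle gap_m ahead while braking
    the whole way (v^2 = v0^2 - 2 * a * d)."""
    return math.sqrt(max(0.0, v ** 2 - 2 * decel_g * G * gap_m))

v = 60 * 0.44704            # 60 mph in m/s
gap = min_gap_to_swerve(v)  # tightest gap a swerve can still clear
print(round(gap, 1))                                           # about 27 m
print(round(impact_speed_after_braking(v, gap) / 0.44704, 1))  # about 38 mph
```

Under these assumptions a 60mph swerve needs roughly 27m of room, and braking over those same 27m arrives at roughly 38mph, so the real question in these scenarios is how much energy each manoeuvre sheds, not a clean "spare A, hit B" moral fork.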
# ? Oct 25, 2015 07:25 |
|
Bogan Krkic posted:Sure, but the moral dilemma lies in the fact that people want to create self-driving cars and give them the agency to decide who lives and dies, which seems like a bad plan You can't give a self-driving car agency to decide anything, it's a machine, it will do what it's programmed to do, which is come to a stop as rapidly as it can in whatever conditions it finds itself. It can't choose to drive onto the footpath because it can't choose anything. If something appears in front of a self-driving car it will attempt to stop to avoid or minimise damage and injury, because statistically that is the safest course of action.
|
# ? Oct 25, 2015 07:37 |
|
Bogan Krkic posted:But in order to have the car do the thing that is statistically most likely to avoid deaths/injuries, you do indeed need to program it to choose whose life is more important in a situation where, through error, external circumstances or other reasons, it could cause death or injury. lol @ our country roads having sidewalks. Most of our city roads don't have them.
|
# ? Oct 25, 2015 09:26 |
|
Felix_Cat posted:Even in the edge cases like the one you describe Edge cases like accidents. Theris posted:Edit: vvvv This is worth pointing out. As silly as "just brake, problem solved" seems on the surface, you need an obstacle to instantly appear less than 60m or so in front of a car moving 60mph before "just brake" can't avoid a collision completely Like when you're a few car lengths behind and suddenly the car in front of you swerves to avoid an object/debris on the road that you don't know about? Have you ever driven on an interstate before? Because it sounds like you haven't.
# ? Oct 25, 2015 16:54 |
|
You all seem to be missing the bigger question here, which is, if I have a self-driving car, will my insurance rates be higher or lower?
|
# ? Oct 25, 2015 17:12 |
|
Karma Monkey posted:You all seem to be missing the bigger question here, which is, if I have a self-driving car, will my insurance rates be higher or lower? It depends. Are you a young male?
|
# ? Oct 25, 2015 17:20 |
|
CJacobs posted:Orgy is the name of a band, one-time sorta-star Corey Feldman is not actually having an orgy. The joke is that people are pretending to wish he actually was even though that's a thing no human being would ever actually want because corey feldman is grody It's that there literally was an article by a woman who went to an orgy with Corey Feldman because she was that big a Corey Feldman fan
|
# ? Oct 25, 2015 20:11 |
|
blugu64 posted:Edge cases like accidents. If everyone's using self-driving cars and they work as well as they'll have to before the public accepts them, accidents will be edge cases caused only by freaks of nature.
|
# ? Oct 25, 2015 22:34 |
|
|
Has Bishopville's 'lizard man' returned? Looks like it. Gators gonna gait.
|
# ? Oct 26, 2015 03:04 |