|
Sagebrush posted:This fatal accident is the first one in 130 million miles driven in Teslas with the autopilot on. Very few of the people throwing around the fatalities per 100 million miles thing seem to take the rest of what you mentioned into consideration. I wonder what the excuse would be if it had been a family of 4 in the car and they were now looking at being over 3 times the average instead?
|
# ? Jul 4, 2016 02:12 |
|
Mercury Ballistic posted:Not a stats guy but don't we need lots more fatalities before the million miles/fatality metric becomes valid? Yes, of course. I'm using it as a point of discussion though because it's all we have so far. I personally imagine that fatalities/incidents per mile will actually increase over time, as people get more comfortable with using the "autopilot" in more difficult situations and let their attention wander more. Like, if this guy died because he was watching Harry Potter on autopilot, there have to be dozens more people out there watching Harry Potter as well who just haven't gotten into a bad situation yet. It'll be interesting to see how Tesla discusses the software upgrades they'll inevitably have to make to improve safety. "Autopilot 3.1, now the least likely to kill you yet!" fknlo posted:Very few of the people throwing around the fatalities per 100 million miles thing seem to take the rest of what you mentioned into consideration. I wonder what the excuse would be if it had been a family of 4 in the car and they were now looking at being over 3 times the average instead? Ok, let's go that way. The average vehicle on American roads contains 1.55 people. By that measure, Teslas on autopilot kill 1 person every 83 million miles, significantly less safe than the national average. Sagebrush fucked around with this message at 02:31 on Jul 4, 2016 |
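The occupancy adjustment above is a single division; here is a minimal sketch of the arithmetic, taking the post's own figures (130 million Autopilot miles per fatality, 1.55 average occupants per vehicle) as given rather than independently verified:

```python
# Back-of-the-envelope check of the occupancy adjustment in the post above.
# Both inputs are the post's claimed figures, not verified statistics.
ap_miles_per_fatality = 130e6  # claimed Autopilot miles driven per fatality
avg_occupancy = 1.55           # claimed average occupants per vehicle

# Normalizing vehicle-miles by average occupancy, as the post does,
# converts "1 fatal crash per N miles" into "1 person killed per N miles":
adjusted_miles = ap_miles_per_fatality / avg_occupancy
print(f"~{adjusted_miles / 1e6:.1f} million miles per person killed")
# → ~83.9 million miles per person killed
```

The post rounds this down to "1 person every 83 million miles".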
# ? Jul 4, 2016 02:22 |
|
Mercury Ballistic posted:Not a stats guy but don't we need lots more fatalities See now the teslarati are clamoring for more people to die at the altar of the self-driving car
|
# ? Jul 4, 2016 02:33 |
|
Ola posted:It's comical really, talking about lidar and radar and whatevdar as possible solutions to this particular case, as opposed to "the driver has to pay attention". this is because as long as the car drives itself people will never pay attention
|
# ? Jul 4, 2016 02:58 |
|
Mercury Ballistic posted:Not a stats guy but don't we need lots more fatalities before the million miles/fatality metric becomes valid?
|
# ? Jul 4, 2016 03:37 |
|
Throatwarbler posted:You the human driver are also at best "one weird accident away from total failure". No, that's only at the individual level. If I gently caress up, they don't suddenly say "well poo poo, we have to retrain all humans now". But if the computer fucks up, you have to fix all the computers. And "fix" is an optimistic concept implying that you completely understand what you're doing, which is hubris, which is one of Silicon Valley's biggest flaws.
|
# ? Jul 4, 2016 04:39 |
|
This is all just an elaborate--and unusually non-theoretical--trolley problem. There will be deaths on the way to fully autonomous driving. Is it worth it to have software (which learns) kill some people after it's better than humans at the same task, but still not perfect? I mean, I would personally say yes. I still plan on getting that Model 3 with the autopilot option. If I died in it because I was watching movies and going above the speed limit and some poo poo happened, I wouldn't really want my case to be used as a rallying cry to shut it all down, especially because I was, you know, loving speeding and watching harry potter or whatever. I kind of doubt a dude that had millions of youtube views about his use of the technology would feel terribly different, either. Of course there are criticisms to be had, and I agree Tesla could maybe do a better job of explaining what an Autopilot actually is and making the limitations a bit better known, but they're not really lying or covering anything up, either.
|
# ? Jul 4, 2016 08:20 |
|
Boten Anna posted:This is all just an elaborate--and unusually non-theoretical--trolley problem. There will be deaths on the way to fully autonomous driving. Is it worth it to have software (which learns) kill some people after it's better than humans at the same task, but still not perfect? just quoting this before you can edit out that you'd be fine with letting machines kill people in order to attempt to figure out a way not to kill them
|
# ? Jul 4, 2016 10:01 |
|
atomicthumbs posted:just quoting this before you can edit out that you'd be fine with letting machines kill people in order to attempt to figure out a way not to kill them It's worse than that. That sentence could be interpreted as the software learning to kill people better than humans kill people. Boten Anna supports skynet
|
# ? Jul 4, 2016 10:38 |
|
FWIW my 3 will absolutely have the AP option. Plenty of humans die on the road because of stupid humans doing stupid things. L2/AP may not be perfect, but I'll take a system that's 99% reliable in everyday situations (never blinks, never spills coffee in its lap, never talks on its cell while driving, and handles normal commute traffic nearly perfectly) and 50% in niche ones (doesn't notice a semi trailer turning left over the highway) over a system that's 95% in everyday situations and 90% in unusual ones. You can always turn the drat thing off. Of course the system needs more miles. More data is always good. But until then it's already pretty damned good. In the real world people die due to mundane failures all the time. Removing those will be a huge net positive to traffic death reports. ilkhan fucked around with this message at 14:01 on Jul 4, 2016 |
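Whether the 99%/50% system beats the 95%/90% one depends entirely on how much driving falls into the niche bucket. A toy weighting makes that explicit; the 98%/2% split between everyday and niche situations below is an assumption invented purely for illustration, while the per-situation reliabilities come from the post:

```python
# Toy model of the reliability trade-off described above. The 98%/2% mix of
# everyday vs. niche driving is a made-up illustrative assumption; only the
# per-situation reliability figures come from the post itself.
EVERYDAY_SHARE = 0.98
NICHE_SHARE = 0.02

def blended(everyday_reliability: float, niche_reliability: float) -> float:
    """Overall reliability, weighted by how much driving is in each bucket."""
    return everyday_reliability * EVERYDAY_SHARE + niche_reliability * NICHE_SHARE

autopilot = blended(0.99, 0.50)  # great at the everyday, bad in the niche
human = blended(0.95, 0.90)      # worse everyday, far better in the niche
print(f"autopilot: {autopilot:.4f}  human: {human:.4f}")
```

Under that made-up mix the always-attentive system comes out ahead; shift enough driving into the niche bucket and the ranking flips, which is exactly the disagreement running through this thread.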
# ? Jul 4, 2016 13:58 |
|
Powershift posted:Boten Anna supports skynet The name "Boten Anna" refers to an IRC bot, so it's plausible that Boten Anna is skynet.
|
# ? Jul 4, 2016 14:11 |
|
ilkhan posted:FWIW my 3 will absolutely have the AP option. Mine would too, had I ordered one. It's perfectly fine as it is, used by attentive drivers in the same way you use adaptive cruise control. You let it do its thing but you're ready to take over any second. The problem is all the people that use it without paying attention and the company which keeps the hype going by talking about cross country summon etc.
|
# ? Jul 4, 2016 15:16 |
|
Tesla still not meeting their own delivery estimates. I'm sure they'll have that fixed in time to be delivering a few hundred thousand 3's within 2 years. Really.
|
# ? Jul 4, 2016 16:08 |
|
fknlo posted:Tesla still not meeting their own delivery estimates. I'm sure they'll have that fixed in time to be delivering a few hundred thousand 3's within 2 years. Really. Their ramped up goals were quite high, and they just hired a production engineer from Audi. Really don't understand what you gain by making GBS threads on some small car company going up against the big guys.
|
# ? Jul 4, 2016 16:29 |
|
The Sicilian posted:Their ramped up goals were quite high, and they just hired a production engineer from Audi. Really don't understand what you gain by making GBS threads on some small car company going up against the big guys. It's an easy, no-lose position to take. EVs are a gimmick and a fad and Tesla is a fraud company run by charlatans. If all that turns out to be true, you get to say I told you so. If it's false, and everyone is driving solar powered robot cars made by Tesla who now outproduce and outcompete those we consider today to be some of the big guys, you can be a crotchety old oval office telling everyone how much better it was back in my day, with the respiratory illness and diesel soot so thick you would have had no idea those buildings were once white.
|
# ? Jul 4, 2016 16:42 |
|
I like Volvo's approach more than Tesla's. Only allow autonomous driving on predetermined roads, instead of letting people try the autopilot on roads it's not designed for.
|
# ? Jul 4, 2016 17:06 |
|
Linedance posted:It's an easy, no-lose position to take. EVs are a gimmick and a fad and Tesla is a fraud company run by charlatans. If all that turns out to be true, you get to say I told you so. If it's false, and everyone is driving solar powered robot cars made by Tesla who now outproduce and outcompete those we consider today to be some of the big guys, you can be a crotchety old oval office telling everyone how much better it was back in my day, with the respiratory illness and diesel soot so thick you would have had no idea those buildings were once white. Yes, skepticism of Tesla == advocating for the return of unfiltered diesels. MrOnBicycle posted:I like Volvo's approach more than Tesla's. Only allow autonomous driving on predetermined roads, instead of letting people try the autopilot on roads it's not designed for. The Tesla specifically does not use GPS map information as an input in its autonomous driving. I forget the reason.
|
# ? Jul 4, 2016 17:30 |
|
ilkhan posted:FWIW my 3 will absolutely have the AP option. It Cannot See White Trucks Throatwarbler posted:The Tesla specifically does not use GPS map information as an input in its autonomous driving. I forget the reason. probably the exact same reason it doesn't use LIDAR or a decent scanning radar or more than a single lovely camera atomicthumbs fucked around with this message at 18:12 on Jul 4, 2016 |
# ? Jul 4, 2016 18:10 |
|
atomicthumbs posted:just quoting this before you can edit out that you'd be fine with letting machines kill people in order to attempt to figure out a way not to kill them Humans have spent tens of thousands of years killing people in order to attempt to figure out better ways to kill them, so if I have to choose a direction for machines, yeah.
|
# ? Jul 4, 2016 18:14 |
|
here are a couple neat papers from journals of the Human Factors and Ergonomics Society, who should really be the people designing these things The Out-of-the-Loop Performance Problem and Level of Control in Automation quote:The out-of-the-loop performance problem, a major potential consequence of automation, leaves operators of automated systems handicapped in their ability to take over manual operations in the event of automation failure. This is attributed to a possible loss of skills and of situation awareness (SA) arising from vigilance and complacency problems, a shift from active to passive information processing, and change in feedback provided to the operator. We studied the automation of a navigation task using an expert system and demonstrated that low SA corresponded with out-of-the-loop performance decrements in decision time following a failure of the expert system. Level of operator control in interacting with automation is a major factor in moderating this loss of SA. Results indicated that the shift from active to passive processing was most likely responsible for decreased SA under automated conditions. “Take over!” How long does it take to get the driver back into the loop? quote:Raising the automation level in cars is an imaginable scenario for the future in order to improve traffic safety. However, as long as there are situations that cannot be handled by the automation, the driver has to be enabled to take over the driving task in a safe manner. The focus of the current study is to understand at which point in time a driver’s attention must be directed back to the driving task. To investigate this issue, an experiment was conducted in a dynamic driving simulator and two take-over times were examined and compared to manual driving. The conditions of the experiment were designed to examine the take-over process of inattentive drivers engaged in an interaction with a tablet computer. 
The results show distinct automation effects in both take-over conditions. With shorter take-over time, decision making and reactions are faster but generally worse in quality. Giving drivers a system that allows them to take themselves out of the vehicle's control loop but which requires them to return to full alertness and integration with it at a moment's notice is actively dangerous and is going to kill more people. I don't have the extension installed right now but you can probably get the full text on Sci-Hub.
|
# ? Jul 4, 2016 18:16 |
|
The Sicilian posted:Their ramped up goals were quite high, and they just hired a production engineer from Audi. Really don't understand what you gain by making GBS threads on some small car company going up against the big guys. View it more as mocking Elon and his setting of stretch goals that should have been kept more nebulous in public statements. I know I'd be ecstatic if Tesla came close to their Model 3 plans, but from the outside they look crazy optimistic.
|
# ? Jul 4, 2016 18:17 |
|
MrOnBicycle posted:I like Volvo's approach more than Tesla's. Only allow autonomous driving on predetermined roads, instead of letting people try the autopilot on roads it's not designed for. My understanding is that the second-gen Pilot Assist works on any roads with clear lane markings. But I just read a short review about S90/V90 that complained that the system drifts around too much in the lane and sometimes it crosses the lines before making a pronounced correction.
|
# ? Jul 4, 2016 18:25 |
|
Reinstalled Sci-Hub and got the papers: Endsley 1995 Gold 2013 The Sicilian posted:Really don't understand what you gain by making GBS threads on some small car company going up against the big guys.
|
# ? Jul 4, 2016 18:31 |
|
pun pundit posted:The name "Boten Anna" refers to an IRC bot, so it's plausible that Boten Anna is skynet. I will neither confirm nor deny, and refer to my previous official statement, "Jag är ingen bot / Jag är en väldigt, väldigt vacker tjej" ("I'm no bot / I'm a very, very pretty girl")
|
# ? Jul 4, 2016 18:37 |
|
Linedance posted:It's an easy, no-lose position to take. EVs are a gimmick and a fad no Linedance posted:and Tesla is a fraud company maybe a little, if you get technical about it Linedance posted:run by charlatans. yes. Throatwarbler posted:The Tesla specifically does not use GPS map information as an input in its autonomous driving. I forget the reason. GPS data is garbage for that kind of job. It relies on a database that may be out of date, incomplete, or just totally wrong, and the data can be quickly and randomly degraded to an unusable level by environmental factors. GPS will be used for general routing, i.e. telling the car where to turn when it gets to that point, but not for lane keeping. fake edit: I wonder how a Tesla with next-level assists would handle being the big rig in the situation we're all discussing? Do sensors even exist that could scan a whole highway accurately enough and to a great enough distance that the car could choose when to pull out?
|
# ? Jul 4, 2016 18:41 |
|
atomicthumbs posted:
How is this not common sense though? How many times have we all been goofing off in school and then called upon to participate? We have no idea what is happening and have an awkward few seconds to get back on track. Google's approach where you are trusted with an off button is, in my mind, a much safer approach. You are either 100% driving or 100% not; the laws need to be written to support this.
|
# ? Jul 4, 2016 18:46 |
|
Sagebrush posted:fake edit: I wonder how a Tesla with next-level assists would handle being the big rig in the situation we're all discussing? Do sensors even exist that could scan a whole highway accurately enough and to a great enough distance that the car could choose when to pull out? lidar at least has a chance of detecting a white semi truck that hasn't been chopped and slammed to lower it into the radar's field of view Mercury Ballistic posted:How is this not common sense though? Exactly. Ban all self-driving cars.
|
# ? Jul 4, 2016 18:47 |
|
Ola posted:Developing a sensor and software package that do the same sensing and cautious braking that I did AND not caution braking unnecessarily every 2 minutes, is a ridiculous project. You have to account for so many millions of combinations of things which sane, slightly experienced humans react to unthinkingly. A nation state level exercise in hubris which would forever be one weird accident away from total failure. You basically have to develop AI, which ironically Elon Musk fears, in something I suspect is a bit of a "look how sci-fi I am" put-on. I wonder if what Elon Musk fears the most is a future where things aren't all that different from today, only computers are a bit better and a few more people have them. Subjunctive posted:Yes, Teslas should not be driving unsupervised. They tell you that every time you engage Autopilot, and they test that the driver has hands on wheel periodically. They are not autonomous vehicles. They have sophisticated driver assistance systems, straight L2 stuff. This isn't the endgame any more than the first mail-sorting or check-reading or face-recognizing systems were. What's the interval like? Even in Audi's latest it's 15 seconds before it deactivates (in the middle of a Q7 road trip so checked that duration on the drive up).
|
# ? Jul 4, 2016 19:25 |
|
drgitlin posted:What's the interval like? Even in Audi's latest it's 15 seconds before it deactivates (in the middle of a Q7 road trip so checked that duration on the drive up). three loving minutes and you can just sorta let your legs touch the wheel
|
# ? Jul 4, 2016 20:38 |
|
Saukkis posted:My understanding is that the second-gen Pilot Assist works on any roads with clear lane markings. But I just read a short review about S90/V90 that complained that the system drifts around too much in the lane and sometimes it crosses the lines before making a pronounced correction. Oh I didn't know the S90/V90 had driver assist like that. I was referring to this concept: https://www.youtube.com/watch?v=xYqtu39d3CU
|
# ? Jul 4, 2016 20:51 |
|
Mercury Ballistic posted:Tesla Autopilot Enthusiast Killed In First Self-Driving Car Death Autopilot is a terrible name; it should have been named lane keeping, because drivers are idiots and feel they can just sleep through driving. wargames fucked around with this message at 21:01 on Jul 4, 2016 |
# ? Jul 4, 2016 20:52 |
|
Sagebrush posted:Do sensors even exist that could scan a whole highway accurately enough and to a great enough distance that the car could choose when to pull out? Yes, eyeballs
|
# ? Jul 4, 2016 21:03 |
|
atomicthumbs posted:just quoting this before you can edit out that you'd be fine with letting machines kill people in order to attempt to figure out a way not to kill them Isn't that how every transportation revolution has been? Boats. Airplanes. Cars. It's never going to be perfect off the bat.
|
# ? Jul 4, 2016 21:08 |
|
The google car sees and tracks vehicles off on side streets and stuff. https://www.youtube.com/watch?v=MqUbdd7ae54 https://www.youtube.com/watch?v=tiwVMrTLUWg&t=469s Then again, in its at-fault accident, the google car made the same mistake the trucker presumably made in this situation: it assumed the other vehicle wasn't just going to carry on at full speed into a collision course with it.
|
# ? Jul 4, 2016 21:16 |
|
Perhaps if all autopilotesque driver assists were essentially invisible to the operator and always on, such as traction control and the like, the transition might be less apt to kill folks?
|
# ? Jul 4, 2016 21:17 |
|
Mercury Ballistic posted:Perhaps if all autopilotesque driver assists were essentially invisible to the operator and always on, such as traction control and the like, the transition might be less apt to kill folks? Well, they are, for the most part. Auto-brake, emergency brake, and collision avoidance are always on in vehicles that have them. Radar cruise control is as simple to set as regular cruise control, and lane assist is just one more thing to set. Tesla combined them and called it autopilot. Volvos and Hyundais have very similar systems, but you don't see people making videos of them loving around with those systems set, because they're billed as driver assists, not the first step towards driver replacement. It still seems like a failure of marketing rather than a failure of technology. https://www.youtube.com/watch?v=UgNhYGAgmZo https://www.youtube.com/watch?v=NqnJRo4FQNo There seem to be a lot fewer of these videos now, perhaps the results related to the accident have flooded them out, perhaps people have pulled them down so the video doesn't end up on the news when their head is found in a corn field, who knows. In reality, these videos should be pursued to the same degree as someone filming themselves doing 150mph on public roads. Powershift fucked around with this message at 21:33 on Jul 4, 2016 |
# ? Jul 4, 2016 21:30 |
|
kimbo305 posted:Isn't that how every transportation revolution has been? Boats. Airplanes. Cars. It's never going to be perfect off the bat. Really now, the choice here is "let machines kill people in order to attempt to figure out a way not to kill them" versus "let human drivers kill way more people because we're too chicken to trust new technology".
|
# ? Jul 4, 2016 22:00 |
|
wargames posted:Autopilot is a terrible name, it should have been named lane keeping because driver are idiots and feel they can just sleep through driving. Yeah, they should call it cruise control, no one has ever confused what that meant and caused accidents.
|
# ? Jul 4, 2016 23:05 |
|
Cockmaster posted:Really now, the choice here is "let machines kill people in order to attempt to figure out a way not to kill them" versus "let human drivers kill way more people because we're too chicken to trust new technology". fknlo posted:Tesla still not meeting their own delivery estimates. I'm sure they'll have that fixed in time to be delivering a few hundred thousand 3's within 2 years. Really. And comparing S/X to 3 production is still stupid. The S/X are ridiculously time-intensive to build compared to what they are saying about the highly optimized 3. ilkhan fucked around with this message at 18:52 on Jul 5, 2016 |
# ? Jul 4, 2016 23:07 |
|
kimbo305 posted:Isn't that how every transportation revolution has been? Boats. Airplanes. Cars. It's never going to be perfect off the bat. how exactly are you comparing "new kinds of vehicle" with "vehicle that drives itself under trucks"
|
# ? Jul 5, 2016 00:46 |