fknlo
Jul 6, 2009


Fun Shoe

Sagebrush posted:

This fatal accident is the first one in 130 million miles driven in Teslas with the autopilot on.

The US highway average is roughly 1 fatality per 100 million miles. That figure includes not just average cars that aren't equipped with automatic braking and adaptive cruise like a Tesla, but also old pickup trucks with zero-star crash ratings, rusted-out Cavaliers on decade-old tires, motorcycles (including riders who aren't wearing helmets), etc.

Now granted, we only have a single data point so far...but 30% better safety than the nationwide average is not a super great figure for a brand new $100,000 luxury car. I bet if you could somehow pick out the statistics just for cars equipped with auto-braking and the other assists, you'd find that the Tesla under autopilot does worse than a human.

Very few of the people throwing around the fatalities per 100 million miles thing seem to take the rest of what you mentioned into consideration. I wonder what the excuse would be if it had been a family of 4 in the car and they were now looking at being over 3 times the average instead?

Sagebrush
Feb 26, 2012

ERM... Actually I have stellar scores on the surveys, and every year students tell me that my classes are the best ones they’ve ever taken.

Mercury Ballistic posted:

Not a stats guy but don't we need lots more fatalities before the million miles/fatality metric becomes valid?

Yes, of course. I'm using it as a point of discussion though because it's all we have so far. I personally imagine that fatalities/incidents per mile will actually increase over time, as people get more comfortable with using the "autopilot" in more difficult situations and let their attention wander more.

Like, if this guy died because he was watching Harry Potter on autopilot, there have to be dozens more people out there watching Harry Potter as well who just haven't gotten into a bad situation yet.

It'll be interesting to see how Tesla discusses the software upgrades they'll inevitably have to make to improve safety. "Autopilot 3.1, now the least likely to kill you yet!"

fknlo posted:

Very few of the people throwing around the fatalities per 100 million miles thing seem to take the rest of what you mentioned into consideration. I wonder what the excuse would be if it had been a family of 4 in the car and they were now looking at being over 3 times the average instead?

Ok, let's go that way. The average vehicle on American roads contains 1.55 people. By that measure, Teslas on autopilot kill 1 person every 83 million miles, which is significantly less safe than the national average. :pseudo:
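
For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope sketch using only the figures already quoted in this thread (130 million autopilot miles, 1 fatality per 100 million miles nationally, 1.55 occupants per vehicle); none of it is independently verified:

code:

# Quick sanity check of the numbers being thrown around in this thread.
# Inputs are the figures quoted above, not independently verified.

autopilot_miles = 130_000_000        # Autopilot miles before the first fatality
us_miles_per_fatality = 100_000_000  # rough US average: 1 fatality per 100M vehicle-miles
avg_occupants = 1.55                 # claimed average occupants per US vehicle

# The "30% better than the nationwide average" comparison from earlier:
print(autopilot_miles / us_miles_per_fatality)   # 1.3, i.e. 30% more miles per fatality

# The tongue-in-cheek occupancy adjustment:
print(autopilot_miles / avg_occupants / 1e6)     # ~83.9 million miles "per person"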

Sagebrush fucked around with this message at 02:31 on Jul 4, 2016

blugu64
Jul 17, 2006

Do you realize that fluoridation is the most monstrously conceived and dangerous communist plot we have ever had to face?

Mercury Ballistic posted:

Not a stats guy but don't we need lots more fatalities

See now the teslarati are clamoring for more people to die at the altar of the self-driving car :words:

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.

Ola posted:

It's comical really, talking about lidar and radar and whatevdar as possible solutions to this particular case, as opposed to "the driver has to pay attention".

this is because as long as the car drives itself people will never pay attention

roomforthetuna
Mar 22, 2005

I don't need to know anything about virii! My CUSTOM PROGRAM keeps me protected! It's not like they'll try to come in through the Internet or something!

Mercury Ballistic posted:

Not a stats guy but don't we need lots more fatalities before the million miles/fatality metric becomes valid?

I said that earlier, too, but now that I've thought about it some more, I think we actually don't. If we instead look at it as "chance of a fatality per mile", then we have a sample size of 130,000,000 miles, which in aggregate suggests the chance of a fatality per mile is about 1/130,000,000. (I believe it actually suggests a significantly smaller chance of fatality than that, because if those were the odds, our chances of racking up that many miles without a single fatality would be well below 50:50, and we should really be estimating at the balance-of-probability level.) Point being, it's not as silly as the apparent sample size of 1 would make it appear. Pretty sure looking at it this way has to be valid, because it would clearly be ridiculous if, hypothetically, we had a billion road-miles with zero fatalities, to say "that's not enough data to make a safety judgement, we need fatalities first."
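
If you want to see where that "well below 50:50" comes from, here's a rough sketch treating each mile as an independent coin flip (obviously a simplification), using only the 130-million-mile figure from this thread:

code:

# Treat each mile as an independent trial with fatality probability p
# (a crude model, but it's the one implied by the per-mile framing above).

miles = 130_000_000          # Autopilot miles driven before the first fatality
p_naive = 1 / miles          # naive point estimate: 1 fatality per 130M miles

# If p really were 1/130M, the chance of covering 130M miles with zero
# fatalities is (1 - p)**miles, roughly e**-1, i.e. about 37% -- below 50:50.
print((1 - p_naive) ** miles)        # ~0.368

# Balance-of-probability estimate: the per-mile rate at which surviving
# 130M fatality-free miles would be exactly a coin flip.
p_median = 1 - 0.5 ** (1 / miles)
print(1 / p_median / 1e6)            # ~187.6 (million miles per fatality)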

Ola
Jul 19, 2004

Throatwarbler posted:

You the human driver are also at best "one weird accident away from total failure".

No, that's only at the individual level. If I gently caress up, they don't suddenly say "well poo poo, we have to retrain all humans now". But if the computer fucks up, you have to fix all the computers.

And "fix" is an optimistic concept implying that you completely understand what you're doing, which is hubris, which is one of Silicon Valley's biggest flaws.

Boten Anna
Feb 22, 2010

This is all just an elaborate--and unusually non-theoretical--trolley problem. There will be deaths on the way to fully autonomous driving. Is it worth it to have software (which learns) kill some people after it's better than humans at the same task, but still not perfect?

I mean, I would personally say yes. I still plan on getting that Model 3 with the autopilot option. If I died in it because I was watching movies and going above the speed limit and some poo poo happened, I wouldn't really want my case to be used as a rallying cry to shut it all down, especially because I was, you know, loving speeding and watching harry potter or whatever. I kind of doubt a dude that had millions of youtube views about his use of the technology would feel terribly different, either.

Of course there are criticisms to be had, and I agree Tesla could maybe do a better job of explaining what an Autopilot actually is and making the limitations a bit better known, but they're not really lying or covering anything up, either.

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.

Boten Anna posted:

This is all just an elaborate--and unusually non-theoretical--trolley problem. There will be deaths on the way to fully autonomous driving. Is it worth it to have software (which learns) kill some people after it's better than humans at the same task, but still not perfect?

I mean, I would personally say yes.

just quoting this before you can edit out that you'd be fine with letting machines kill people in order to attempt to figure out a way not to kill them

Powershift
Nov 23, 2009


atomicthumbs posted:

just quoting this before you can edit out that you'd be fine with letting machines kill people in order to attempt to figure out a way not to kill them

It's worse than that. That sentence could be interpreted as the software learning to kill people better than humans kill people.

Boten Anna supports skynet

ilkhan
Oct 7, 2004

You'll be sorry you made fun of me when Daddy Donald jails all my posting enemies!
FWIW my 3 will absolutely have the AP option.

Plenty of humans die on the road because of stupid humans doing stupid things. L2/AP may not be perfect, but I'll take a system that's 99% reliable in everyday situations (never blinks, never spills coffee in its lap, never talks on its cell while driving, and handles normal commute traffic nearly perfectly) and 50% in niche ones (doesn't notice a semi trailer turning left over the highway) over a system that's 95% in everyday situations and 90% in unusual ones. You can always turn the drat thing off.

Of course the system needs more miles. More data is always good. But until then it's already pretty damned good. In the real world people die due to mundane failures all the time. Removing those will be a huge net positive to traffic death reports.
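
To put rough numbers on that 99%/50% vs. 95%/90% trade-off: assume, purely for illustration, that 2% of driving consists of "niche" situations (a made-up share, not a real statistic). The weighted failure rates then look like this:

code:

# Back-of-the-envelope comparison of the two hypothetical systems above.
# The 2% "niche" share is a made-up number, purely for illustration.

niche_share = 0.02
everyday_share = 1 - niche_share

def failure_rate(everyday_reliability, niche_reliability):
    """Weighted chance of mishandling a randomly chosen situation."""
    return (everyday_share * (1 - everyday_reliability)
            + niche_share * (1 - niche_reliability))

autopilot = failure_rate(0.99, 0.50)   # 99% everyday, 50% niche
human = failure_rate(0.95, 0.90)       # 95% everyday, 90% unusual

print(f"autopilot: {autopilot:.3f}")   # ~0.020
print(f"human:     {human:.3f}")       # ~0.051

# With a small niche share the 99%/50% system comes out ahead; push the
# niche share past roughly 9% and the comparison flips.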

ilkhan fucked around with this message at 14:01 on Jul 4, 2016

pun pundit
Nov 11, 2008

I feel the same way about the company bearing the same name.

Powershift posted:

Boten Anna supports skynet

The name "Boten Anna" refers to an IRC bot, so it's plausible that Boten Anna is skynet.

Ola
Jul 19, 2004

ilkhan posted:

FWIW my 3 will absolutely have the AP option.


Mine would too, had I ordered one. It's perfectly fine as it is, used by attentive drivers in the same way you use adaptive cruise control. You let it do its thing but you're ready to take over any second.

The problem is all the people that use it without paying attention and the company which keeps the hype going by talking about cross country summon etc.

fknlo
Jul 6, 2009


Fun Shoe
Tesla still not meeting their own delivery estimates. I'm sure they'll have that fixed in time to be delivering a few hundred thousand 3's within 2 years. Really.

The Sicilian
Sep 3, 2006

by Smythe

fknlo posted:

Tesla still not meeting their own delivery estimates. I'm sure they'll have that fixed in time to be delivering a few hundred thousand 3's within 2 years. Really.

Their ramped-up goals were quite high, and they just hired a production engineer from Audi. Really don't understand what you gain by making GBS threads on some small car company going up against the big guys.

Finger Prince
Jan 5, 2007


The Sicilian posted:

Their ramped-up goals were quite high, and they just hired a production engineer from Audi. Really don't understand what you gain by making GBS threads on some small car company going up against the big guys.

It's an easy, no-lose position to take. EVs are a gimmick and a fad and Tesla is a fraud company run by charlatans. If all that turns out to be true, you get to say I told you so. If it's false, and everyone is driving solar powered robot cars made by Tesla who now outproduce and outcompete those we consider today to be some of the big guys, you can be a crotchety old oval office telling everyone how much better it was back in my day, with the respiratory illness and diesel soot so thick you would have had no idea those buildings were once white.

MrOnBicycle
Jan 18, 2008
Wait wat?
I like Volvo's approach more than Tesla's. Only allow autonomous driving on predetermined roads, instead of letting people try the autopilot on roads it's not designed for.

Throatwarbler
Nov 17, 2008

by vyelkin

Linedance posted:

It's an easy, no-lose position to take. EVs are a gimmick and a fad and Tesla is a fraud company run by charlatans. If all that turns out to be true, you get to say I told you so. If it's false, and everyone is driving solar powered robot cars made by Tesla who now outproduce and outcompete those we consider today to be some of the big guys, you can be a crotchety old oval office telling everyone how much better it was back in my day, with the respiratory illness and diesel soot so thick you would have had no idea those buildings were once white.

Yes, skepticism of Tesla == advocating for the return of unfiltered diesels.


MrOnBicycle posted:

I like Volvo's approach more than Tesla's. Only allow autonomous driving on predetermined roads, instead of letting people try the autopilot on roads it's not designed for.

The Tesla specifically does not use GPS map information as an input in its autonomous driving. I forget the reason.

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.

ilkhan posted:

FWIW my 3 will absolutely have the AP option.

Plenty of humans die on the road because of stupid humans doing stupid things. L2/AP may not be perfect, but I'll take a system that's 99% reliable in everyday situations (never blinks, never spills coffee in its lap, never talks on its cell while driving, and handles normal commute traffic nearly perfectly) and 50% in niche ones (doesn't notice a semi trailer turning left over the highway) over a system that's 95% in everyday situations and 90% in unusual ones. You can always turn the drat thing off.

Of course the system needs more miles. More data is always good. But until then it's already pretty damned good. In the real world people die due to mundane failures all the time. Removing those will be a huge net positive to traffic death reports.

It Cannot See White Trucks

Throatwarbler posted:

The Tesla specifically does not use GPS map information as an input in its autonomous driving. I forget the reason.

probably the exact same reason it doesn't use LIDAR or a decent scanning radar or more than a single lovely camera

atomicthumbs fucked around with this message at 18:12 on Jul 4, 2016

Godholio
Aug 28, 2002

Does a bear split in the woods near Zheleznogorsk?

atomicthumbs posted:

just quoting this before you can edit out that you'd be fine with letting machines kill people in order to attempt to figure out a way not to kill them

Humans have spent tens of thousands of years killing people in order to attempt to figure out better ways to kill them, so if I have to choose a direction for machines, yeah.

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.
here are a couple neat papers from journals of the Human Factors and Ergonomics Society, who should really be the people designing these things

The Out-of-the-Loop Performance Problem and Level of Control in Automation

quote:

The out-of-the-loop performance problem, a major potential consequence of automation, leaves operators of automated systems handicapped in their ability to take over manual operations in the event of automation failure. This is attributed to a possible loss of skills and of situation awareness (SA) arising from vigilance and complacency problems, a shift from active to passive information processing, and change in feedback provided to the operator. We studied the automation of a navigation task using an expert system and demonstrated that low SA corresponded with out-of-the-loop performance decrements in decision time following a failure of the expert system. Level of operator control in interacting with automation is a major factor in moderating this loss of SA. Results indicated that the shift from active to passive processing was most likely responsible for decreased SA under automated conditions.

“Take over!” How long does it take to get the driver back into the loop?

quote:

Raising the automation level in cars is an imaginable scenario for the future in order to improve traffic safety. However, as long as there are situations that cannot be handled by the automation, the driver has to be enabled to take over the driving task in a safe manner. The focus of the current study is to understand at which point in time a driver’s attention must be directed back to the driving task. To investigate this issue, an experiment was conducted in a dynamic driving simulator and two take-over times were examined and compared to manual driving. The conditions of the experiment were designed to examine the take-over process of inattentive drivers engaged in an interaction with a tablet computer. The results show distinct automation effects in both take-over conditions. With shorter take-over time, decision making and reactions are faster but generally worse in quality.

Giving drivers a system that allows them to take themselves out of the vehicle's control loop but which requires them to return to full alertness and integration with it at a moment's notice is actively dangerous and is going to kill more people.

I don't have the extension installed right now but you can probably get the full text on Sci-Hub.

fordan
Mar 9, 2009

Clue: Zero

The Sicilian posted:

Their ramped-up goals were quite high, and they just hired a production engineer from Audi. Really don't understand what you gain by making GBS threads on some small car company going up against the big guys.

View it more as mocking Elon and his setting of stretch goals that should have been kept more nebulous in public statements. I know I'd be ecstatic if Tesla came close to their Model 3 plans, but from the outside they look crazy optimistic.

Saukkis
May 16, 2003

Unless I'm on the inside curve pointing straight at oncoming traffic the high beams stay on and I laugh at your puny protest flashes.
I am Most Important Man. Most Important Man in the World.

MrOnBicycle posted:

I like Volvo's approach more than Tesla's. Only allow autonomous driving on predetermined roads, instead of letting people try the autopilot on roads it's not designed for.

My understanding is that the second-gen Pilot Assist works on any road with clear lane markings, but I just read a short review of the S90/V90 that complained that the system drifts around too much in the lane and sometimes crosses the lines before making a pronounced correction.

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.
Reinstalled Sci-Hub and got the papers:
Endsley 1995
Gold 2013

The Sicilian posted:

Really don't understand what you gain by making GBS threads on some small car company going up against the big guys.

Boten Anna
Feb 22, 2010

pun pundit posted:

The name "Boten Anna" refers to an IRC bot, so it's plausible that Boten Anna is skynet.

I will neither confirm nor deny, and refer to my previous official statement, "Jag är ingen bot / Jag är en väldigt, väldigt vacker tjej" ("I'm not a bot / I'm a very, very pretty girl")

Sagebrush
Feb 26, 2012

ERM... Actually I have stellar scores on the surveys, and every year students tell me that my classes are the best ones they’ve ever taken.

Linedance posted:

It's an easy, no-lose position to take. EVs are a gimmick and a fad

no

Linedance posted:

and Tesla is a fraud company

maybe a little, if you get technical about it

Linedance posted:

run by charlatans.

yes.

Throatwarbler posted:

The Tesla specifically does not use GPS map information as an input in its autonomous driving. I forget the reason.

GPS data is garbage for that kind of job. It relies on a database that may be out of date, incomplete, or just totally wrong, and the data can be quickly and randomly degraded to an unusable level by environmental factors.

GPS will be used for general routing, i.e. telling the car where to turn when it gets to that point, but not for lane keeping.

fake edit: I wonder how a Tesla with next-level assists would handle being the big rig in the situation we're all discussing? Do sensors even exist that could scan a whole highway accurately enough and to a great enough distance that the car could choose when to pull out?

Mercury Ballistic
Nov 14, 2005

not gun related

atomicthumbs posted:


Giving drivers a system that allows them to take themselves out of the vehicle's control loop but which requires them to return to full alertness and integration with it at a moment's notice is actively dangerous and is going to kill more people.

I don't have the extension installed right now but you can probably get the full text on Sci-Hub.

How is this not common sense though?
How many times have we all been goofing off in school and then called upon to participate? We have no idea what is happening and have an awkward few seconds to get back on track. Google's approach where you are trusted with an off button is, in my mind, a much safer approach.
You are either 100% driving or 100% not; the laws need to be written to support this.

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.

Sagebrush posted:

fake edit: I wonder how a Tesla with next-level assists would handle being the big rig in the situation we're all discussing? Do sensors even exist that could scan a whole highway accurately enough and to a great enough distance that the car could choose when to pull out?

lidar at least has a chance of detecting a white semi truck that hasn't been chopped and slammed to lower it into the radar's field of view

Mercury Ballistic posted:

How is this not common sense though?
How many times have we all been goofing off in school and then called upon to participate? We have no idea what is happening and have an awkward few seconds to get back on track. Google's approach where you are trusted with an off button is, in my mind, a much safer approach.
You are either 100% driving or 100% not; the laws need to be written to support this.

Exactly. Ban all self-driving cars.

drgitlin
Jul 25, 2003
luv 2 get custom titles from a forum that goes into revolt when its told to stop using a bad word.
Ola posted:

Developing a sensor and software package that do the same sensing and cautious braking that I did AND not caution braking unnecessarily every 2 minutes, is a ridiculous project. You have to account for so many millions of combinations of things which sane, slightly experienced humans react to unthinkingly. A nation state level exercise in hubris which would forever be one weird accident away from total failure. You basically have to develop AI, which ironically Elon Musk fears, in something I suspect is a bit of a "look how sci-fi I am" put-on. I wonder if what Elon Musk fears the most is a future where things aren't all that different from today, only computers are a bit better and a few more people have them.

Subjunctive posted:

Yes, Teslas should not be driving unsupervised. They tell you that every time you engage Autopilot, and they test that the driver has hands on wheel periodically. They are not autonomous vehicles. They have sophisticated driver assistance systems, straight L2 stuff. This isn't the endgame any more than the first mail-sorting or check-reading or face-recognizing systems were.

What's the interval like? Even in Audi's latest it's 15 seconds before it deactivates (I'm in the middle of a Q7 road trip, so I checked that duration on the drive up).

Grim Up North
Dec 12, 2011

drgitlin posted:

What's the interval like? Even in Audi's latest it's 15 seconds before it deactivates (I'm in the middle of a Q7 road trip, so I checked that duration on the drive up).

three loving minutes and you can just sorta let your legs touch the wheel

MrOnBicycle
Jan 18, 2008
Wait wat?

Saukkis posted:

My understanding is that the second-gen Pilot Assist works on any road with clear lane markings, but I just read a short review of the S90/V90 that complained that the system drifts around too much in the lane and sometimes crosses the lines before making a pronounced correction.

Oh I didn't know the S90/V90 had driver assist like that. I was referring to this concept: https://www.youtube.com/watch?v=xYqtu39d3CU

wargames
Mar 16, 2008

official yospos cat censor

Mercury Ballistic posted:

Tesla Autopilot Enthusiast Killed In First Self-Driving Car Death
http://www.forbes.com/sites/briansolomon/2016/06/30/the-first-self-driving-car-death-launches-tesla-investigation/



Here comes some fun discussion I guess.

Autopilot is a terrible name; it should have been named lane keeping, because drivers are idiots and feel they can just sleep through driving.

wargames fucked around with this message at 21:01 on Jul 4, 2016

blugu64
Jul 17, 2006

Do you realize that fluoridation is the most monstrously conceived and dangerous communist plot we have ever had to face?

Sagebrush posted:

Do sensors even exist that could scan a whole highway accurately enough and to a great enough distance that the car could choose when to pull out?

Yes, eyeballs

kimbo305
Jun 9, 2007

actually, yeah, I am a little mad

atomicthumbs posted:

just quoting this before you can edit out that you'd be fine with letting machines kill people in order to attempt to figure out a way not to kill them

Isn't that how every transportation revolution has been? Boats. Airplanes. Cars. It's never going to be perfect off the bat.

Powershift
Nov 23, 2009


The Google car sees and tracks vehicles off on side streets and stuff.

https://www.youtube.com/watch?v=MqUbdd7ae54
https://www.youtube.com/watch?v=tiwVMrTLUWg&t=469s


Then again, in its at-fault accident, the Google car made the same mistake the trucker presumably made in this situation: it assumed the other vehicle wasn't just going to carry on at full speed into a collision course with it.

Mercury Ballistic
Nov 14, 2005

not gun related
Perhaps if all autopilotesque driver assists were essentially invisible to the operator and always on, such as traction control and the like, the transition might be less apt to kill folks?

Powershift
Nov 23, 2009


Mercury Ballistic posted:

Perhaps if all autopilotesque driver assists were essentially invisible to the operator and always on, such as traction control and the like, the transition might be less apt to kill folks?

Well, they are, for the most part. Auto-brake, emergency braking, and collision avoidance are always on in vehicles that have them. Radar cruise control is as simple to set as regular cruise control, and lane assist is just one more thing to set. Tesla combined them and called it Autopilot.


Volvos and Hyundais have very similar systems, but you don't see people making videos of themselves loving around with those systems set, because they're billed as driver assists, not the first step towards driver replacement. It still seems like a failure of marketing rather than a failure of technology.

https://www.youtube.com/watch?v=UgNhYGAgmZo
https://www.youtube.com/watch?v=NqnJRo4FQNo

There seem to be a lot fewer of these videos now; perhaps the results related to the accident have flooded them out, perhaps people have pulled them down so the video doesn't end up on the news when their head is found in a corn field, who knows. In reality, these videos should be pursued to the same degree as someone filming themselves doing 150 mph on public roads.

Powershift fucked around with this message at 21:33 on Jul 4, 2016

Cockmaster
Feb 24, 2002

kimbo305 posted:

Isn't that how every transportation revolution has been? Boats. Airplanes. Cars. It's never going to be perfect off the bat.

Really now, the choice here is "let machines kill people in order to attempt to figure out a way not to kill them" versus "let human drivers kill way more people because we're too chicken to trust new technology".

duz
Jul 11, 2005

Come on Ilhan, lets go bag us a shitpost


wargames posted:

Autopilot is a terrible name; it should have been named lane keeping, because drivers are idiots and feel they can just sleep through driving.

Yeah, they should call it cruise control; no one has ever confused what that meant and caused accidents.

ilkhan
Oct 7, 2004

You'll be sorry you made fun of me when Daddy Donald jails all my posting enemies!

Cockmaster posted:

Really now, the choice here is "let machines kill people in order to attempt to figure out a way not to kill them" versus "let human drivers kill way more people because we're too chicken to trust new technology".
Pretty much.

fknlo posted:

Tesla still not meeting their own delivery estimates. I'm sure they'll have that fixed in time to be delivering a few hundred thousand 3's within 2 years. Really.
Tesla missed delivery estimates by 2630 vehicles. Had 2535 additional vehicles produced and in transit to their owners (2535 net increase over in-transit number for Q1). Yeah, they are *way* off track.

And comparing S/X to 3 production is still stupid. The S/X are ridiculously time-intensive to build compared to what they are saying about the highly optimized 3.

ilkhan fucked around with this message at 18:52 on Jul 5, 2016

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.

kimbo305 posted:

Isn't that how every transportation revolution has been? Boats. Airplanes. Cars. It's never going to be perfect off the bat.

how exactly are you comparing "new kinds of vehicle" with "vehicle that drives itself under trucks"
