Flint_Paper
Jun 7, 2004

This isn't cool at all Looshkin! These are dark forces you're titting about with!

tactlessbastard posted:

My wife once had a judge correct her on the pronunciation of our last name.

"B'stard"


tactlessbastard
Feb 4, 2001

Godspeed, post
Fun Shoe

He certainly was!

Eighties ZomCom
Sep 10, 2008




tactlessbastard posted:

My wife once had a judge correct her on the pronunciation of our last name.

You're married to Hyacinth Bucket?

zedprime
Jun 9, 2007

yospos

Griefor posted:

If the brakes fail while underway, sure. But if the driver knows the brakes are not functioning before she even enters the bus, and it is company policy to drive buses with broken brakes, she does not get to hide behind orders from her corporate overlord and is, in fact, responsible. Being asked to do something by your corporate overlords does not absolve you of having to obey the law.

I'd like to stress that there is no dichotomy here; both can be responsible. Like with murder-for-hire, there is no need to ask whether the murderer or the one putting out the hit is responsible: they can both be guilty of a crime.
You normally do get to hide behind the company in cases where diffuse responsibility ends up with something bad happening. For all the hand wringing when it protects a poo poo lord middle manager, it actually has an important use in protecting line employees, who often aren't educated or informed enough to make any other decision. As demonstrated by this case, where the company has been able to roll that protection back by getting the exact rules legislated so as to avoid implicating themselves, while giving employees the contradictory instruction: "this car drives itself. You legally drive this car. Deal with it." These people aren't computer engineers, road safety experts, or ethicists. Why would we expect them to come up with the right answer to that command?

Slippery
May 16, 2004


Muscles Boxcar

jojoinnit posted:

A little mint and some vodka and you've got yourself an Egg de Menthe my friend.

Now that sounds egg-tastic. Just the thing to take the edge off a Thursday.

Doc Hawkins
Jun 15, 2010

Dashing? But I'm not even moving!


Absurd Alhazred posted:

I'm not saying I'm not already getting the popcorn to watch this movie, but it ain't in the Torah.

that's right: it's only in heaven :allears:

Ornamental Dingbat
Feb 26, 2007

tactlessbastard posted:

My wife once had a judge correct her on the pronunciation of our last name.

My middle name's Hrafn, which I put on my resume. The guy who hired me for my current job commented that I misspelled my middle name on the resume and asked what it actually was.

Son of Rodney
Feb 22, 2006

ohmygodohmygodohmygod

Ornamental Dingbat posted:

My middle name's Hrafn, which I put on my resume. The guy who hired me for my current job commented that I misspelled my middle name on the resume and asked what it actually was.

Are you a distant descendant of Hvidserk Ragnarsson?

Ishamael
Feb 18, 2004

You don't have to love me, but you will respect me.

Isn't that Ben Affleck's voice?

Also, what??

Veni Vidi Ameche!
Nov 2, 2017

by Fluffdaddy

Charging this person is like busting a street-level dealer. It is 100% useless as far as solving the problem, and it shifts all the blame and pain down to the chumps at the bottom while the guy at the top escapes all punishment and inconvenience, and profits from the misery he’s creating.

Yes, the driver should have been paying attention and maintaining control of the car, but the real problem is that the car shouldn’t have been on the road in the first place. The blame doesn’t even lie with the shithead CEOs of the companies involved; it lies with the legislators who haven’t stomped on this yet.

Dick Trauma
Nov 30, 2007

God damn it, you've got to be kind.
I've been nearly run over many times while crossing in a marked crosswalk at a red light. Sometimes the driver was even stopped, but as I crossed in front of them they just... decided to go. These robot cars are probably not any worse, and in most cases much better at not running people over.

Griefor
Jun 11, 2009

zedprime posted:

You normally do get to hide behind the company in cases where diffuse responsibility ends up with something bad happening. For all the hand wringing when it protects a poo poo lord middle manager, it actually has an important use in protecting line employees, who often aren't educated or informed enough to make any other decision. As demonstrated by this case, where the company has been able to roll that protection back by getting the exact rules legislated so as to avoid implicating themselves, while giving employees the contradictory instruction: "this car drives itself. You legally drive this car. Deal with it." These people aren't computer engineers, road safety experts, or ethicists. Why would we expect them to come up with the right answer to that command?

This person has a driver's license. I don't know what the current curriculum of the US driver's license test is, as I have a Dutch driver's license that I got 20 years ago, but I would say that staying in control of the car and not handing it over to an AI is a very important thing for anyone who holds a driver's license today to know. If I kill someone by breaking a traffic rule that was changed after I already got my license, I'm still responsible for that. I'd say the driver's license creates a higher standard of being required to know the rules than just being a citizen does.

That said, I don't know how pervasive this knowledge of the law actually is, and sure, Uber definitely intentionally created a pretty terrible scenario for their "driver" to be in. I'm not a lawyer, and maybe a case could be made for this lady, but that would mean the current standard of knowledge of this particular part of the traffic rules among license holders is woefully inadequate, and that all drivers need to be educated that they definitely must not watch a TV show while in the driver's seat of a moving car, AI driver or not. And after that, people should be held responsible for it. It's a rule just like not running red lights, not speeding, and stopping at crosswalks. It is an important rule that can cost someone their life if you don't follow it.

Ornamental Dingbat
Feb 26, 2007

Son of Rodney posted:

Are you a distant descendant of hvidserk ragnarsson?

Nah, Ingólfr Arnarson.

graventy
Jul 28, 2006

Fun Shoe

Dick Trauma posted:

These robot cars are probably not any worse, and in most cases much better at not running people over.

That's how they're being sold to the public, but that is not even remotely accurate.

Pinning the blame on the driver here is complete bullshit, particularly when they were undoubtedly instructed "do not touch that wheel or pedals unless you absolutely have to".

ultrafilter
Aug 23, 2007

It's okay if you have any questions.


Report: Uber’s Software Flaws Led to Fatal Crash

quote:

The documents pinpointed the six-second period when the SUV’s software sensed Herzberg, but was confused by her movements, then kept speeding at 45 miles per hour without slowing down. The documents also show that Uber had disabled Volvo’s automatic emergency brake system to prevent stop-and-go driving, instead setting up an alarm that would alert the test driver to take over the car. But that system was flawed, too:

Exactly 1.2 seconds before the car struck Herzberg, the car realized that it would neither be able to stop in time nor swerve around the pedestrians. But it did not sound the alarm for test driver Rafaela Vasquez until .2 seconds before the crash, leaving Vasquez, 44, far too little time to get control of the vehicle. She hit the brakes a full second after the car hit Herzberg.

“Although the [Automatic Driving System] sensed the pedestrian nearly six seconds before the impact, the system never classified her as a pedestrian — or predicted correctly her goal as a jaywalking [sic] pedestrian or a cyclist — because she was crossing … at a location without a crosswalk,” the report said. “The system design did not include a consideration for jaywalking pedestrians.”

There's probably something the driver should've done differently but that car never should've been on the road in the first place.
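
For a sense of scale on that 0.2-second alarm, here's a quick back-of-the-envelope sketch. The reaction time and braking figures are my own ballpark assumptions, not numbers from the report.

code:

# back-of-the-envelope: how much warning would a human safety driver
# actually have needed? ballpark assumptions, not from the NTSB report.
MPH_TO_MS = 0.44704

speed = 45 * MPH_TO_MS        # ~20.1 m/s
reaction_time = 1.5           # s, assumed takeover/reaction time
decel = 7.0                   # m/s^2, assumed hard braking on dry pavement

# distance covered before the driver even touches the brakes,
# plus the braking distance itself
stopping_distance = speed * reaction_time + speed**2 / (2 * decel)

# distance actually available in the two windows from the report
dist_at_first_detection = speed * 6.0   # ADS first sensed her ~6 s out
dist_at_alarm = speed * 0.2             # alarm fired 0.2 s before impact

print(f"needed to stop:             ~{stopping_distance:.0f} m")        # ~59 m
print(f"available at 6 s warning:   ~{dist_at_first_detection:.0f} m")  # ~121 m
print(f"available at 0.2 s warning: ~{dist_at_alarm:.0f} m")            # ~4 m

Even on generous assumptions the car needed roughly 60 m to stop from 45 mph, so an alarm that fires about 4 m before impact is decorative. The six seconds of detection, on the other hand, would have been plenty.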

Mr. Fall Down Terror
Jan 24, 2018

by Fluffdaddy
the driver gets some of the blame for blatantly not paying attention to the road

management also gets some of the blame for compromising on the safety protocol. for example, it was documented on driver-facing video that the driver was not paying attention; this documentation of driver behavior should have been reviewed more frequently - that was the entire purpose of recording the driver on video

also higher management catches blame for creating such a vehicle in the first place

and the arizona government for being terribly lax in both self driving car test regulations as well as just having bad pedestrian safety laws

really the NTSB is going to assign blame where blame needs to be assigned, here's the report

https://dms.ntsb.gov/public/62500-62999/62978/631030.pdf

it is important though to point out driver behavior, since this is exactly how this technology will be used if it ever gets into mass adoption. like, if you thought it was bad enough with people dicking with their phones behind the wheel, that's going to become hugely more common when people believe their cars drive themselves


Cocaine Bear
Nov 4, 2011

ACAB

AI cars: good or bad? Discuss.

Cocaine Bear
Nov 4, 2011

ACAB

Lotron
Aug 15, 2006

Still clownin'

Cocaine Bear posted:

AI cars: good or bad? Discuss.

I forgot whose joke it was but

quote:

Yeah, let's leave the driving to the things that apparently can't figure this out:


And now this--

https://twitter.com/HomeiMiyashita/status/1306068425601478656

Trapick
Apr 17, 2006

ultrafilter posted:

There's probably something the driver should've done differently but that car never should've been on the road in the first place.

quote:

“The system design did not include a consideration for jaywalking pedestrians.”

Yah, that is a system that is just not ready for the road. I think the driver has responsibility here, but so does Uber and whoever allowed this to be on the road.

Now a tweet (that may be a bit too inside baseball, but anyone who uses Jira will at least enjoy it):
https://twitter.com/isislovecruft/status/1292331568330022913?s=20

Veni Vidi Ameche!
Nov 2, 2017

by Fluffdaddy

Trapick posted:

Yah, that is a system that is just not ready for the road. I think the driver has responsibility here, but so does Uber and whoever allowed this to be on the road.

Now a tweet (that may be a bit too inside baseball, but anyone who uses Jira will at least enjoy it):
https://twitter.com/isislovecruft/status/1292331568330022913?s=20

I am not surprised that the people who make Jira don’t want to use Jira.

number 1 snake fan
Jul 16, 2018

Those robot dogs are terrifying even when they're 1/4 size

ThisIsJohnWayne
Feb 23, 2007
Ooo! Look at me! NO DON'T LOOK AT ME!



JIRA is the most worthless agile-adjacent tool I swear to godddd

Post-it methods on the other hand are the best, easiest, and most honest of all the project overviews, I AM NOW ANGRY ABOUT PAPER ON THE COMPUTER gaaaaaah

ultrafilter
Aug 23, 2007

It's okay if you have any questions.


https://twitter.com/SakagamiValerie/status/1306518623574392837

That's a hell of a time period to miss.

Aramoro
Jun 1, 2012




I mean yeah, yeah, she should have been in control. Except it's a self-driving car; they're being marketed and sold as self-driving cars. Uber are telling people explicitly to let the car drive itself. Now they're turning round saying 'well obviously she should have had the car under control, in exactly the way we told her not to'. Then you get all the big brains on the internet defending them with 'well I was always taught to never let an AI drive the car, she was totally at fault here'.

This is despite the fact that the car was programmed not to recognise pedestrians who weren't on a crosswalk, and it misidentified the pedestrian multiple times, switching between two choices. Every time it switched what it thought she was, it threw away all the previous data and started working out whether it was going to crash all over again, right up until it hit her.
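
To make the "threw away all the previous data" part concrete, here's a toy sketch of that failure mode. This is purely hypothetical illustrative code, nothing like Uber's actual stack: a tracker that wipes an object's motion history whenever the classifier changes its mind never accumulates the couple of observations it needs to estimate a velocity and predict a path.

code:

# toy illustration (hypothetical, not Uber's code): resetting an object's
# history on every reclassification means the tracker can never build up
# enough observations to estimate velocity and predict a path.

def track(observations):
    history = []          # retained positions for the current classification
    last_label = None
    for label, position in observations:
        if label != last_label:
            history = []  # reclassified: throw away everything we knew
            last_label = label
        history.append(position)
        if len(history) >= 2:
            velocity = history[-1] - history[-2]   # crude per-tick estimate
            print(f"{label}: velocity {velocity:+.1f} m/tick, can predict a path")
        else:
            print(f"{label}: first sighting, no velocity, no prediction")

# a pedestrian walking steadily across the road, but the classifier keeps
# flip-flopping, so the track restarts on every single tick
track([
    ("vehicle", 0.0),
    ("other",   1.5),
    ("bicycle", 3.0),
    ("other",   4.5),
    ("bicycle", 6.0),
])

# the same motion with a stable label: a velocity shows up on tick two
track([
    ("pedestrian", 0.0),
    ("pedestrian", 1.5),
    ("pedestrian", 3.0),
])

With a stable label the toy tracker can predict a path by the second observation; with the flip-flopping labels it is permanently meeting her for the first time, which is essentially the failure described above.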

Griefor
Jun 11, 2009

Aramoro posted:

I mean yeah, yeah, she should have been in control. Except it's a self-driving car; they're being marketed and sold as self-driving cars. Uber are telling people explicitly to let the car drive itself. Now they're turning round saying 'well obviously she should have had the car under control, in exactly the way we told her not to'. Then you get all the big brains on the internet defending them with 'well I was always taught to never let an AI drive the car, she was totally at fault here'.

This is despite the fact that the car was programmed not to recognise pedestrians who weren't on a crosswalk, and it misidentified the pedestrian multiple times, switching between two choices. Every time it switched what it thought she was, it threw away all the previous data and started working out whether it was going to crash all over again, right up until it hit her.

Literally nobody here is defending Uber.

Trevor Hale
Dec 8, 2008

What have I become, my Swedish friend?


drat. She’s gonna find out about those celeb deaths at the same time

Dick Trauma
Nov 30, 2007

God damn it, you've got to be kind.

graventy posted:

That's how they're being sold to the public, but that is not even remotely accurate.

It's accurate in the sense that human drivers are terrible and constantly run people over. I would rather be run over by a robot car than by a human being who did it because they literally don't care if they kill someone as long as they can have just one more moment of texting.

Would it be nice to have the choice to not be run over at all? Yes! But with humans driving cars that is never, ever going to be an option, and years of near-death pedestrian incidents have made me hope for an alternative.

OwlFancier
Aug 22, 2013

I don't know if I have ever sat down and formed a pre-emptive opinion of who or what I would like to run me over.

zoux
Apr 28, 2006

Is there a way I can be placed into a coma until November 4

NoEyedSquareGuy
Mar 16, 2009

Just because Liquor's dead, doesn't mean you can just roll this bitch all over town with "The Freedoms."

zoux posted:

Is there a way I can be placed into a coma until November 4

https://www.youtube.com/watch?v=Dbr7B1OVa0g

Moo the cow
Apr 30, 2020

zoux posted:

Is there a way I can be placed into a coma until November 4

Walk outside and get coughed on by someone.

zoux
Apr 28, 2006

https://twitter.com/chrislhayes/status/1306630069608161285

Oh there we go

Air Skwirl
May 13, 2007

Neither snow nor rain nor heat nor gloom of night stays these couriers from the swift completion of their appointed shitposting.

Griefor posted:

If the brakes fail while underway, sure. But if the driver knows the brakes are not functioning before she even enters the bus, and it is company policy to drive buses with broken brakes, she does not get to hide behind orders from her corporate overlord and is, in fact, responsible. Being asked to do something by your corporate overlords does not absolve you of having to obey the law.

I'd like to stress that there is no dichotomy here; both can be responsible. Like with murder-for-hire, there is no need to ask whether the murderer or the one putting out the hit is responsible: they can both be guilty of a crime.

In my hypothetical, the driver was assured the brakes worked, just like I'm sure the driver in the self-driving car was told the self-driving thingies worked.

Cocaine Bear
Nov 4, 2011

ACAB

But car AI, is it good or not? Please, help me get to the one definitive answer, goons!

wizzardstaff
Apr 6, 2018

Zorch! Splat! Pow!
https://twitter.com/snrrkk/status/1306432972577558528

Griefor
Jun 11, 2009

Skwirl posted:

In my hypothetical, the driver was assured the brakes worked, just like I'm sure the driver in the self-driving car was told the self-driving thingies worked.

Whether the self-driving thingy works is beside the point. A driver is not allowed to depend on it for the safety of other road users, and having a driver's license carries a due-diligence obligation to know the traffic rules.

PirateDentist
Mar 28, 2006

Sailing The Seven Seas Searching For Scurvy

Cocaine Bear posted:

But car AI, is it good or not? Please, help me get to the one definitive answer, goons!

Every time Waymo gives an update they seem to get more skeptical about being able to make a truly self-driving car. They probably have orders of magnitude more research and testing than anyone else, and they're basically going "poo poo, it'll be a while. This is hard."

ThisIsJohnWayne
Feb 23, 2007
Ooo! Look at me! NO DON'T LOOK AT ME!



Griefor posted:

Whether the self-driving thingy works is beside the point. A driver is not allowed to depend on it for the safety of other road users, and having a driver's license carries a due-diligence obligation to know the traffic rules.

This is not praxis in America.


zoux
Apr 28, 2006

What if no one simply went anywhere ever
