 
Arsenic Lupin
Apr 12, 2012

This particularly rapid💨 unintelligible 😖patter💁 isn't generally heard🧏‍♂️, and if it is🤔, it doesn't matter💁.


Jose Valasquez posted:

An AI trained extensively on those conditions would probably do better than the average human driver in those conditions.

"Probably".

Civilized Fishbot
Apr 3, 2011
Yeah I'd imagine the situations where human drivers perform best relative to robot drivers aren't characterized by heavy environmental obstacles (where the range of sensors and precise controls used by the computer would be a huge advantage). I'd imagine that human drivers have our comparative advantage in making sense of other human pedestrians/cyclists/drivers and really novel situations.

So maybe in that sense San Francisco would be the hardest city for a robot driver to beat human baseline effectiveness.

Jose Valasquez
Apr 8, 2005


I don't have an AI trained on icy conditions so I can't say for certain but I've seen lots of human drivers and they have no clue what to do. Basic heuristics for a self driving car of what to do when a wheel starts slipping would outperform most human drivers
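
That "basic heuristic" is simple enough to sketch. A toy version in Python, with every name and threshold invented for illustration (real traction control is far more involved than this):

```python
def slip_ratio(wheel_speed: float, ground_speed: float) -> float:
    """Fraction by which a driven wheel outruns the vehicle (0 = no slip)."""
    if ground_speed <= 0.1:  # avoid dividing by ~zero at a standstill
        return 0.0
    return max(0.0, (wheel_speed - ground_speed) / ground_speed)

def throttle_command(requested: float, wheel_speed: float, ground_speed: float,
                     slip_limit: float = 0.15) -> float:
    """Cut throttle proportionally once slip exceeds the limit."""
    slip = slip_ratio(wheel_speed, ground_speed)
    if slip <= slip_limit:
        return requested
    # Scale the requested throttle down as slip grows past the limit.
    return requested * max(0.0, 1.0 - (slip - slip_limit) / slip_limit)
```

The point is the rule itself is trivial; the hard part for a human is noticing the slip and reacting within a fraction of a second.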

Vegetable
Oct 22, 2010

Even a braindead heuristic like “gently caress ton of snow? Drive at 10mph” would probably be better than whatever the gently caress humans do right now.

Dirk the Average
Feb 7, 2012

"This may have been a mistake."

Jose Valasquez posted:

I don't have an AI trained on icy conditions so I can't say for certain but I've seen lots of human drivers and they have no clue what to do. Basic heuristics for a self driving car of what to do when a wheel starts slipping would outperform most human drivers

Even then, there's already been non-AI improvements to systems like antilock brakes that help to deal with icy conditions. You don't actually need a self driving car to make things like that safer for a motorist.
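
Antilock braking is a good example of how simple the core rule is. A toy sketch of the idea, with the threshold and interface invented here (real ABS modulates hydraulic pressure many times per second in hardware):

```python
def abs_brake_command(driver_pressure: float, wheel_speed: float,
                      vehicle_speed: float, lock_threshold: float = 0.8) -> float:
    """Return brake pressure to apply; 0.0 means 'release and let the wheel recover'."""
    if vehicle_speed <= 0.1:
        return driver_pressure  # stopped or nearly so: nothing to modulate
    # A wheel spinning far slower than the car is moving is about to lock.
    if wheel_speed / vehicle_speed < lock_threshold:
        return 0.0
    return driver_pressure
```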

evilweasel
Aug 24, 2002

Civilized Fishbot posted:

Yeah I'd imagine the situations where human drivers perform best relative to robot drivers aren't characterized by heavy environmental obstacles (where the range of sensors and precise controls used by the computer would be a huge advantage). I'd imagine that human drivers have our comparative advantage in making sense of other human pedestrians/cyclists/drivers and really novel situations.

So maybe in that sense San Francisco would be the hardest city for a robot driver to beat human baseline effectiveness.

yeah i think this is correct. humans do better on anything where you need "common sense" and a theory of mind because we have no idea how to program a computer to deal with those.

"how do i navigate this car to stay on its path in icy lovely weather" seems like the kind of situation that's tailor-made for a computer to beat the pants off a person, because it is a knowable circumstance where instant diagnosis and reaction give a computer a huge advantage over a person. to the point that if i were running a self-driving car company i would be lobbying to include as many weather-obstacle problems as possible in whatever testing was demonstrating safety, as opposed to "what should i do in this unfamiliar circumstance" or "how do i anticipate the reactions of another human" situations.

that is not to say any particular driving AI would be safe under those circumstances, but that this is a situation where the bar to beat a human is about as low as it can be

Kwyndig
Sep 23, 2006

Heeeeeey


And yet all the trials have been in areas where the weather is nearly always perfect, which I think says something about AI cars and inclement weather conditions.

Steve French
Sep 8, 2003

“An AI that’s good at that would be good at that” is some next level begging the question poo poo

Main Paineframe
Oct 27, 2010
Nobody should be driving in icy conditions or heavy snow, robot or not. Where I'd expect AI cars to fall short of humans is with stuff that requires more contextual information.

For example, I've seen winter roads so heavily salted that the actual road markings literally aren't even visible, because the white lane markings and such blend into the white salty residue covering the road. Human drivers can handle that easily, but AVs still largely need to be able to see the lane lines.

Another example is big snowpiles on the side of the road obscuring pedestrians or cars. A human driver might be able to remember that there's a busy crosswalk up ahead behind that big snowpile and drive with more care, but autonomous driving systems are a long way from being able to apply that sort of contextual and situational knowledge.

karthun
Nov 16, 2006

I forgot to post my food for USPOL Thanksgiving but that's okay too!

Deuce posted:

I wanna see Waymo handle Duluth in the winter.

You know full well it's going to try to go up Lake Ave rather than take Mesaba or 6th around.

Nothingtoseehere
Nov 11, 2010


You would think adverse road conditions would be a benefit, but the mark 1 eyeball is a much better sensor (or more precisely, the brain is a very good processor of the eyeball) than a high tech suite of sensors. It's a lot easier to trick the sensors, or for the AI algorithms to fail outright, than it is to fool the brain, as flawed as humans are.

Neito
Feb 18, 2009

😌Finally, an avatar the describes my love of tech❤️‍💻, my love of anime💖🎎, and why I'll never see a real girl 🙆‍♀️naked😭.

And it's not like there would be a ton of carryover; remember that these things are black boxes and we don't actually know how they... well, not think, but certainly at least arrive at conclusions. We don't know if "Rainy day training data" has any crossover with "Snowy day training data". It could be that the range of things we'd have to train them on is oddly large.

And it's not even "thinking" in a sense that we would understand. All these "AI" things are just black boxes that take in input and spit out an answer; it can't go "Well, this is rain; I know in snow my traction is reduced, so Rain probably has a similar effect".

evilweasel
Aug 24, 2002

Steve French posted:

“An AI that’s good at that would be good at that” is some next level begging the question poo poo

no. understanding the relative strengths and weaknesses of a computer vs a brain lets you make very good guesses at what a computer would be good for vs. a person, and at where the answer is that an AI that can handle the situation better than a person is effectively a general-purpose AI (in other words, lol)

driving on snow is the sort of thing where fairly basic heuristics can meaningfully improve on a person without even a complex neural network, which is why things like anti-lock brakes and skid detection exist. and the point is, since people are bad at driving on snow, you set the bar to beat a person very low, thus disguising that you have not really done meaningfully complex work. you've just programmed the car to go slowly, identify a skid, and then execute the well-understood method for getting out of a skid that people are lovely at.

evilweasel fucked around with this message at 19:52 on Oct 30, 2023

Oxyclean
Sep 23, 2007


I'm curious how well AIs could be set up to handle stuff like black ice or other loss-of-control scenarios. Humans don't handle these well, but I feel like you have scenarios where humans *can* do the kind of on-the-fly decision making that would be extraordinarily hard to design an AI to do?

Like the fact a bunch of self driving cars were incapacitated by placing a traffic cone on the hood makes me feel like self driving cars will always have weird sorts of blind spots where a human could easily correct.

evilweasel
Aug 24, 2002

Oxyclean posted:

I'm curious how well AIs could be set up to handle stuff like black ice or other loss-of-control scenarios. Humans don't handle these well, but I feel like you have scenarios where humans *can* do the kind of on-the-fly decision making that would be extraordinarily hard to design an AI to do?

Like the fact a bunch of self driving cars were incapacitated by placing a traffic cone on the hood makes me feel like self driving cars will always have weird sorts of blind spots where a human could easily correct.

"steer into the direction of the skid until you regain control and under no circumstances slam on the brakes" is taught in driving classes, but i would expect the number of people who remember it in the middle of a skid if they haven't trained specifically on that is...well, low. that's the sort of thing i mean: the heuristic of what to do is not hard. what's hard is instantly identifying the problem and executing the solution for a person.
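
The heuristic in that driving-class advice is mechanical enough to write down. A simplified sketch, assuming the car can measure its heading and its direction of travel; the gain and threshold are made up, and a real vehicle controller would be considerably more careful:

```python
import math

def skid_angle(heading: float, travel_direction: float) -> float:
    """Signed difference (radians) between where the car points and where it moves."""
    return math.atan2(math.sin(travel_direction - heading),
                      math.cos(travel_direction - heading))

def countersteer(heading: float, travel_direction: float,
                 gain: float = 1.0, threshold: float = 0.05) -> float:
    """Steering command: zero when tracking straight, into the skid otherwise."""
    angle = skid_angle(heading, travel_direction)
    if abs(angle) < threshold:
        return 0.0
    return gain * angle  # steer toward the direction of travel
```

Which is the argument in code form: the recovery rule fits in a few lines, and the computer never panics or forgets it mid-skid.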

karthun
Nov 16, 2006


Main Paineframe posted:

Nobody should be driving in icy conditions or heavy snow, robot or not. Where I'd expect AI cars to fall short of humans is with stuff that requires more contextual information.

For example, I've seen winter roads so heavily salted that the actual road markings literally aren't even visible, because the white lane markings and such blend into the white salty residue covering the road. Human drivers can handle that easily, but AVs still largely need to be able to see the lane lines.

Another example is big snowpiles on the side of the road obscuring pedestrians or cars. A human driver might be able to remember that there's a busy crosswalk up ahead behind that big snowpile and drive with more care, but autonomous driving systems are a long way from being able to apply that sort of contextual and situational knowledge.

It is not realistic to have people not drive during icy conditions. Icy conditions can occur in Minnesota basically from now to late April. I'm not talking about blizzards or heavy snow. Every single morning and every single afternoon there is a serious chance of hitting ice on what looks like a perfectly clear road. In the morning it's because the morning sun hasn't cleared off the ice that formed overnight; in the afternoon, the daytime sun has melted everything but ice is reforming wherever there is heavy shade. Bridges can form ice quite easily too. These are all things that human drivers need to deal with for several months of the year, and it's just not realistic to say people shouldn't be driving in it.

The real danger is when you have human customs like "downhill has the right of way". I've slid through a red light intersection at 5mph and gone another 40 ft until I finally came to a stop. Glad the rest of the cars all stopped on their green and let me slide through very slowly.

Neito
Feb 18, 2009


That's not the convincing argument that you think it is.

SerthVarnee
Mar 13, 2011

It has been two zero days since last incident.
Big Super Slapstick Hunk
I keep getting reminded of that video where a bunch of Russians had to drive through a burning forest because that was the only road from the village and that village was about to be overrun by the flames.
I would not want to trust an autopilot in that situation when stopping might mean the death of every single driver behind me.

withak
Jan 15, 2003


Fun Shoe
Being able to say “can’t go out today, AI driver says it isn’t safe” might be the biggest point in their favor.

Foxfire_
Nov 8, 2010

Oxyclean posted:

Like the fact a bunch of self driving cars were incapacitated by placing a traffic cone on the hood makes me feel like self driving cars will always have weird sorts of blind spots where a human could easily correct.

I don't think the cone thing is any sort of weird blind spot. That is an intentional "There is an object on the hood blocking sensor view" => "Don't start driving when the sensors aren't working" behavior.

That's a "cars don't have hands" problem, not some unexpected emergent edge case.
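
In other words, the cone behavior is plausibly just a guard clause. A sketch of what that intentional check might look like; the sensor interface here is invented for illustration, not any company's actual code:

```python
from dataclasses import dataclass

@dataclass
class SensorStatus:
    name: str
    view_blocked: bool  # e.g. an object on the hood is occluding the view

def safe_to_depart(sensors: list[SensorStatus]) -> bool:
    """A stationary car should stay stationary if it cannot see."""
    return not any(s.view_blocked for s in sensors)
```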

Vegetable
Oct 22, 2010

Mercedes now offers L3 autonomous driving on California highways for a subscription fee. You can watch youtube or gently caress around while the car drives itself.

The weak thing is the speed is ludicrously low — something like 45 mph.

The interesting thing is they assume full liability if the car crashes during that time — if you aren’t using the system in an inappropriate way.

Of course the caveat is a big one. But the liability part is a big freaking deal. This probably won’t be a common thing for another 15 or 20 years. But it’s the kind of systemic shift that will move the needle for safety in the aggregate. Carmakers will optimize much more heavily towards safety if they have to bear the costs.

For what it’s worth Mercedes has apparently been testing this program in Germany for a year now and not had any accidents. Driverless cars still won’t replace every scenario of driving but if it can replace just perfect weather highway driving, it’ll already be saving a poo poo ton of lives.

Mega Comrade
Apr 22, 2004

Listen buddy, we all got problems!

Vegetable posted:

You can watch youtube or gently caress around while the car drives itself.

...

The interesting thing is they assume full liability if the car crashes during that time — if you aren’t using the system in an inappropriate way.

Platystemon
Feb 13, 2012

BREADS
Who cares about liability?

Like Tesla autopilot is awful and only an idiot would use it, but they’re not all ending up bankrupt and in the slammer for wrecks that “their car” caused.

Ordinary car insurance is all they need, and owning a Mercedes wouldn’t get them out of paying for that.

Oxyclean
Sep 23, 2007


Foxfire_ posted:

I don't think the cone thing is any sort of weird blind spot. That is an intentional "There is an object on the hood blocking sensor view" => "Don't start driving when the sensors aren't working" behavior.

That's a "cars don't have hands" problem, not some unexpected emergent edge case.

But for what is supposed to be an autonomous taxi, that sort of feels like A Problem.

If self driving cars remain a thing where a driver with full driving experience needs to remain behind the wheel because someone putting a cone on the hood, or worse, blowing wet snow blocking the sensors, is enough to incapacitate the car, it seems like a pretty loving useless technology.

e: I'm not really convinced AI cars are ever going to reach the point people like to imagine they will. "Cars don't have hands" basically does mean there's going to be all sorts of little problems the car won't be able to solve by itself that a human driver could.

Oxyclean fucked around with this message at 23:21 on Oct 30, 2023

Kagrenak
Sep 8, 2010

Platystemon posted:

Who cares about liability?

Like Tesla autopilot is awful and only and idiot would use it, but they’re not all ending up bankrupt and in the slammer for wrecks that “their car” caused.

Ordinary car insurance is all they need, and owning a Mercedes wouldn’t get them out of paying for that.

No, but your premiums do go up if your Tesla decides to run into another car and they make a claim. I assume you also still get points on your license. If Merc is taking full liability then the other person wouldn't file a claim on your insurance but would instead deal with Mercedes. Not sure what the deal would be with any license penalties though.

Boris Galerkin
Dec 17, 2011

I don't understand why I can't harass people online. Seriously, somebody please explain why I shouldn't be allowed to stalk others on social media!

Oxyclean posted:

But for what is supposed to be an autonomous taxi, that sort of feels like A Problem.

If self driving cars remain a thing where a driver with full driving experience needs to remain behind the wheel because the car can't handle someone putting a cone on the hood is enough to incapacitate it, or worse, blowing wet snow blocks the sensors, it seems like a pretty loving useless technology.

I mean it makes no sense to design your robotaxi car for the extreme edge case of “douchebag physically covering up the sensor array”. Like some rear end in a top hat could just as easily walk into an elevator and rip out all the buttons so nobody could use it anymore. Doesn’t seem like much you can do to prevent this other than like idk putting guns on the cars/elevators.

Platystemon
Feb 13, 2012

They should make a car that doesn’t have windows and has all the seats face the rear for safety.

BRB trademarking “DymAxIon”.

Oxyclean
Sep 23, 2007


Boris Galerkin posted:

I mean it makes no sense to design your robotaxi car for the extreme edge case of “douchebag physically covering up the sensor array”. Like some rear end in a top hat could just as easily walk into an elevator and rip out all the buttons so nobody could use it anymore. Doesn’t seem like much you can do to prevent this other than like idk putting guns on the cars/elevators.

People have a hell of a lot more reason to sabotage robotaxis than they do to sabotage elevators. Just take a look at the ways Ubers were protested by older taxi services, or the fact that it's less "douchebags" putting cones on the taxis and more people annoyed that these lovely, barely tested taxis are constantly blocking traffic and creating problems.

"The sensor array might be blocked, either unintentionally or maliciously" is absolutely something you need to design for. What if a branch falls on a robotaxi, causing it to become stuck, blocking a road?

Boris Galerkin
Dec 17, 2011


Oxyclean posted:

"The sensor array might be blocked, either unintentionally or maliciously" is absolutely something you need to design for.

Who’s to say it hasn’t been designed for? The car probably phones home for help.

Oxyclean posted:

What if a branch falls on a robotaxi, causing it to become stuck, blocking a road?

If the passengers aren’t hurt I would assume there’s a help button in the car/app they can press to alert the robotaxi company the car got wrecked, assuming that the company hasn’t already been alerted by onboard sensors. Then they’d prolly call friends/family to come pick them up. And if the passengers were hurt call 911?

Much like what one would do if the same happened to them in a regular person driven car?

Jose Valasquez
Apr 8, 2005

If someone put a traffic cone on the hood of my self driving car I'd just make it stop and then get out and remove the cone just like if someone put a cone on the hood of my regular car

Platystemon
Feb 13, 2012

lol if you don’t pull a J-turn like the hero in an action movie

Oxyclean
Sep 23, 2007


Jose Valasquez posted:

If someone put a traffic cone on the hood of my self driving car I'd just make it stop and then get out and remove the cone just like if someone put a cone on the hood of my regular car

The issue has been people coning autonomous taxis while they are (I assume) unoccupied. Meaning they just get stuck until someone comes along to help.

Boris Galerkin posted:

Who’s to say it hasn’t been designed for? The car probably phones home for help.

If the passengers aren’t hurt I would assume there’s a help button in the car/app they can press to alert the robotaxi company the car got wrecked, assuming that the company hasn’t already been alerted by onboard sensors. Then they’d prolly call friends/family to come pick them up. And if the passengers were hurt call 911?

Much like what one would do if the same happened to them in a regular person driven car?

I'm sure all the efficiency of AI driven cars will do wonders while waiting for a human to show up and troubleshoot a stuck car while it blocks traffic.

but I don't even mean "a branch falls on the car, wrecking it." I mean a car's sensors become blocked unintentionally due to something like nature. Now a car is stuck and needs someone to show up to fix it, all while possibly blocking a road or otherwise creating a hazard.

I'm basically not convinced a world of fully autonomous robotaxis is a realistic thing because there's just a lot of little problems that are hard to account for.

Vegetable
Oct 22, 2010

If driverless cars become a normal thing and people are just coning them, I assume society will eventually arrive at the point where the coners are prosecuted like any other offender who’s blocking or sabotaging human-driven cars.

The only reason this isn’t happening is because driverless cars are a novelty and the impact on everyone else is localized and fairly minimal.

Lyesh
Apr 9, 2003

Main Paineframe posted:

Nobody should be driving in icy conditions or heavy snow, robot or not. Where I'd expect AI cars to fall short of humans is with stuff that requires more contextual information.

For example, I've seen winter roads so heavily salted that the actual road markings literally aren't even visible, because the white lane markings and such blend into the white salty residue covering the road. Human drivers can handle that easily, but AVs still largely need to be able to see the lane lines.

Another example is big snowpiles on the side of the road obscuring pedestrians or cars. A human driver might be able to remember that there's a busy crosswalk up ahead behind that big snowpile and drive with more care, but autonomous driving systems are a long way from being able to apply that sort of contextual and situational knowledge.

Yeah, this is much more what I was thinking of when I mentioned snow. It obliterates markings and narrows streets even after it's been cleared. An AI that's trained on the specific roads in (say) Duluth would probably be able to get around that, but that also gets into the issue that similar life-or-death systems in aerospace or medicine are regulated to the point where code changes need regulatory approval. The mantra of Silicon Valley, "move fast and break things," is an idiot's mantra that gets people killed in those contexts. This doesn't even get into the fact that ML systems are complete black boxes where you can't pursue safety concepts like provable correctness.

This is all a complex set of tradeoffs that I don't expect industry to make itself in a remotely reasonable way, and the history of self-driving over-promising and under-delivering is not making me worry about it less. Especially the talk about how humans are terrible at driving. Humans are GREAT at driving. Proof: a naive system will go straight and accelerate to a speed that's either dangerous or useless. That system would almost instantly get into a collision and/or kill someone. For a slightly less lovely system, look at old cruise control. It needs a speed preset by the human operator, but it can't steer at all. Stuff like rain can cause terrible accidents because the speed indicator doesn't work right if the tire is no longer contacting the road (hydroplaning). These systems have death rates in the range of several-miles-driven per death. Humans in the US are around a hundred million miles per fatality.

The data right now is way too sparse for driverless cars (those companies are running cars on the order of one million miles per year in areas that have lower speeds and crashes that are less lethal in general), and if you restricted human drivers to, say, OBEY SPEED LIMITS, you'd likely have a much rougher competition for the AIs.

The problem isn't that humans suck at driving. The problem is that driving is a common activity that has a low rate of extremely negative consequences.
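
The arithmetic behind that sparsity point is worth making explicit. Using the round numbers from the post (roughly one human fatality per hundred million miles, and fleets logging on the order of a million miles a year), not real fleet data:

```python
HUMAN_MILES_PER_FATALITY = 100_000_000
FLEET_MILES_PER_YEAR = 1_000_000

# How many fatal crashes we'd *expect* per year if the fleet were exactly
# as safe as the human baseline, and how long until we'd expect one.
expected_fatalities_per_year = FLEET_MILES_PER_YEAR / HUMAN_MILES_PER_FATALITY
years_to_expect_one = HUMAN_MILES_PER_FATALITY / FLEET_MILES_PER_YEAR

print(expected_fatalities_per_year)  # 0.01
print(years_to_expect_one)           # 100.0
```

At those rates a fleet would need on the order of a century of driving just to expect one fatal crash, so a few clean years can't statistically distinguish an AV from the human baseline.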

Vegetable
Oct 22, 2010

Lyesh posted:

if you restricted human drivers to, say, OBEY SPEED LIMITS, you'd likely have a much rougher competition for the AIs.
Somehow you’ve articulated a more fantastical reality than all the driverless car shills

notwithoutmyanus
Mar 17, 2009

Jose Valasquez posted:

An AI trained extensively on those conditions would probably do better than the average human driver in those conditions.

The correct phrase here is "probably not". You think these things have difficulty in the best of times and yet assume it'll do better because reasons.

Chronojam
Feb 20, 2006

This is me on vacation in Amsterdam :)
Never be afraid of being yourself!


You can put a series of cones across some traffic lanes, right now, and stop most humans from using that lane or even turning down an entire street.

They don't even need to be big cones, the little soccer practice ones work well and are easier to carry around -- a place nearby does this routinely, and only a few people get out to move them and drive past.

Lyesh
Apr 9, 2003

Vegetable posted:

Somehow you’ve articulated a more fantastical reality than all the driverless car shills

True dat. I can't tell if true speed limits or mandatory breath-alcohol interlocks would be the less popular idea. The tools for speed limits even exist in current cars, so it's really just a pure social issue. Not a surmountable one though.

Oxyclean
Sep 23, 2007


Chronojam posted:

You can put a series of cones across some traffic lanes, right now, and stop most humans from using that lane or even turning down an entire street.

They don't even need to be big cones, the little soccer practice ones work well and are easier to carry around -- a place nearby does this routinely, and only a few people get out to move them and drive past.

A human might be able to discern if they're being hosed with, and take actions to get past.

and again, this is sort of the root of my point: humans can make decisions an AI will never be able to. They won't always be good ones, but they will be ones an AI can't make.

Oxyclean fucked around with this message at 01:57 on Oct 31, 2023


Clarste
Apr 15, 2013

Just how many mistakes have you suffered on the way here?

An uncountable number, to be sure.

evilweasel posted:

"how do i navigate this car to stay on its path in icy lovely weather" seems like the kind of situation that's tailor-made for a computer to beat the pants off a person because it is a knowable circumstance where instant diagnosis and reaction allow a computer a huge advantage over a person.

Okay, but can it see ice? Can it distinguish ice from other types of ground? Given the track record I feel like it would be much, much worse than an average person at diagnosing this "knowable circumstance," even if it might be better at remembering what to do in it (theoretically).
