|
Pigsfeet on Rye posted:Code poorly writ? You must acquit! Jaywalking laws are what got them off the hook. Remember, Auto Clubs pushed for jaywalking laws that largely blamed victims for their demise rather than drivers
|
# ? Nov 10, 2019 15:56 |
|
Pyroclastic posted:The article I read had a little more detail--it re-categorized the victim several times, but even after a re-categorization, it was still detecting an object in the road that it may have to brake for. The real issue wasn't 'didn't see a pedestrian because it wasn't in the crosswalk' (although if it was able to recognize a pedestrian outside of a crosswalk, maybe it wouldn't've gotten confused at what it was seeing), it was because every time it re-categorized her, it threw out its prior motion data, so the system literally didn't recognize that she was moving across its path. Every time the system changed what it thought it saw, she had a known starting speed of 0. It detected her as a bicycle several times, but the system still would have braked for a bicycle moving across its path at any point in a roadway, but the re-categorizations just meant it had near-static snapshots of a variety of obstacles that didn't have a vector that crossed the car's. Holy poo poo, that is so absurd I didn't even consider it.
|
# ? Nov 10, 2019 16:13 |
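The failure mode Pyroclastic describes can be sketched in a few lines. This is a hypothetical toy tracker, not Uber's actual code; the only point is that wiping an object's history on every relabel leaves its estimated speed stuck at zero, exactly the "known starting speed of 0" problem.

```python
from dataclasses import dataclass, field

@dataclass
class Track:
    """Minimal object track: x-positions observed at 1 Hz, plus a class label."""
    label: str
    history: list = field(default_factory=list)  # observed x-positions (m)

    def observe(self, x, label, reset_on_relabel):
        # The failure mode: a new label throws away prior motion data,
        # so the object restarts with a "known speed" of 0.
        if reset_on_relabel and label != self.label:
            self.history = []
        self.label = label
        self.history.append(x)

    def speed(self):
        # Need at least two points in the *current* history to estimate speed.
        if len(self.history) < 2:
            return 0.0
        return self.history[-1] - self.history[-2]

# A pedestrian crossing at 1 m/s, re-labelled on every frame.
labels = ["vehicle", "other", "bicycle", "other", "bicycle"]

resetting = Track(label="unknown")
persistent = Track(label="unknown")
for t, label in enumerate(labels):
    resetting.observe(float(t), label, reset_on_relabel=True)
    persistent.observe(float(t), label, reset_on_relabel=False)

print(resetting.speed())   # 0.0 — every relabel wiped the history
print(persistent.speed())  # 1.0 — the crossing motion is visible
```

With `reset_on_relabel=True`, the tracker never has two consecutive observations of "the same" object, so nothing ever appears to be moving.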
|
Burt Sexual posted:Osha? pitting workers against each other is an OSHA hazard
|
# ? Nov 10, 2019 16:25 |
|
HardDiskD posted:And someone mentioned that Uber got acquitted, right? Acquittal implies criminal charges were filed. They barely even considered holding Uber criminally liable.
|
# ? Nov 10, 2019 16:35 |
|
evil_bunnY posted:lmao that's criminally incompetent. I doubt any of these systems have object permanence. I know Tesla’s “smart summon” feature doesn’t.
|
# ? Nov 10, 2019 16:55 |
|
Yes it is criminally negligent. Also it's completely and utterly bugfuck insane that anyone is allowed to test something like this on a public road. A giant multi-ton projectile with unknown behavioural characteristics with the public as the guinea pig. It's like testing a self-shooting gun in a public park.
|
# ? Nov 10, 2019 19:18 |
|
Spatial posted:Yes it is criminally negligent. Also it's completely and utterly bugfuck insane that anyone is allowed to test something like this on a public road. A giant multi-ton projectile with unknown behavioural characteristics with the public as the guinea pig. It's like testing a self-shooting gun in a public park. and yet so many people, goons included, were breathless to tell us that this was going to be great. Meanwhile, anyone who had any experience with software or big companies like uber knew that we were in for some serious loving bullshit.
|
# ? Nov 10, 2019 19:29 |
|
drgitlin posted:I doubt any of these systems have object permanence. I know Tesla’s “smart summon” feature doesn’t. Uber's system did. Ars Technica posted:Second, the constantly switching classifications prevented Uber's software from accurately computing her trajectory and realizing she was on a collision course with the vehicle. You might think that if a self-driving system sees an object moving into the path of the vehicle, it would put on its brakes even if it wasn't sure what kind of object it was. But that's not how Uber's software worked. And, yes, Uber was cleared of criminal liability almost immediately, well before the NTSB finished its investigation (maybe the feds could file their own charges?). Uber's just fine with throwing their 'safety' driver under the bus. Or under the Uber.
|
# ? Nov 10, 2019 19:30 |
|
Spatial posted:Yes it is criminally negligent. Also it's completely and utterly bugfuck insane that anyone is allowed to test something like this on a public road. A giant multi-ton projectile with unknown behavioural characteristics with the public as the guinea pig. It's like testing a self-shooting gun in a public park. ED-209, except with random members of the public instead of executives
|
# ? Nov 10, 2019 19:30 |
|
https://i.imgur.com/Cfwhj6N.mp4
Somebody fucked around with this message at 00:23 on Nov 11, 2019 |
# ? Nov 10, 2019 19:52 |
|
I can't imagine that guy is ok...
|
# ? Nov 10, 2019 20:14 |
|
That is definitely the "instantaneously vaporized" sort of arc flash, yes.
|
# ? Nov 10, 2019 20:22 |
|
if that guy died, you should probably remove that one.
|
# ? Nov 10, 2019 20:32 |
|
It's the adjacent electric box that seems to explode to me?
|
# ? Nov 10, 2019 20:37 |
|
Please link and nms those ty. I’m mobile
|
# ? Nov 10, 2019 20:37 |
|
Video make me real scared
|
# ? Nov 10, 2019 21:56 |
|
Pyroclastic posted:Uber's system did. That quote says it didn’t, because changing classification made it forget it.
|
# ? Nov 10, 2019 22:43 |
|
Enkmar posted:Video make me real scared It’s in the op you idiot.
|
# ? Nov 10, 2019 22:52 |
|
car: *hurtles towards pedestrian* AI: *screams and covers face with hands*
|
# ? Nov 10, 2019 23:32 |
|
corgski posted:That is definitely the "instantaneously vaporized" sort of arc flash, yes. It really depends on how long he was exposed, and how well his PPE protected him. If the explosion was at chest level or above (or not located in the panel closest to him), he probably had a pretty decent shot at minor to no burns, assuming he moved away immediately. He was in the open, not an enclosed space, and appeared to have everything covered... Hopefully with material that was FR and did not continue to burn after the accident. Worst case would be a close explosion that was below chest level, because as the heat rolls up it would get behind the face shield, or an untucked shirt. This poo poo right here is why I harp on equipment grounding and fault protection in the wiring thread. There is a stupid amount of fault current potential at the utility level, even at residential voltage in many situations. You want that mess to be outside and enclosed, with anything coming inside protected to limit the fault potential.
|
# ? Nov 11, 2019 00:19 |
|
If you want to spin up cutting discs to twice their rated speed you need to upgrade from standard safety glasses. (only watermelons were harmed in this video by the Hydraulic Press Channel people) https://www.youtube.com/watch?v=4ttaUA4y1c4
|
# ? Nov 11, 2019 02:09 |
|
Pyroclastic posted:The system used an object's previously observed locations to help compute its speed and predict its future path. However, "if the perception system changes the classification of a detected object, the tracking history of that object is no longer considered when generating new trajectories," the NTSB reports. I get the batshit logic that they were using here -- if you've calculated, say, the motion path for a pedestrian and then you decide that the object is actually a bicycle, you need to update your predictions because a bicycle can move faster and in different ways than a pedestrian does. However, just throwing out the entire prediction and starting from scratch is so loving dumb. That's like first-year computer science student level work. In many ways it doesn't matter what the object is; if it's moving at 3 meters per second into the path of the car, you can reasonably anticipate that it will continue to move in a similar fashion in the near future, and any changes in its motion will be restricted to a narrow range of vectors and accelerations that are available to earthbound objects obeying the laws of physics. You can keep the model and be aware that something is about to intersect with the car even if you're not completely sure of what it is.
|
# ? Nov 11, 2019 02:28 |
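Sagebrush's point — that a crossing trajectory should trigger braking no matter what the object is classified as — can be sketched as a class-agnostic time-to-lane check. A toy model, all names hypothetical; real planners use far richer motion models, but the object's class never needs to gate the decision:

```python
def will_cross_path(positions, car_lane_x, horizon_s):
    """Class-agnostic check: extrapolate the last observed velocity and
    flag the object if its straight-line path reaches the car's lane
    within the horizon. What the object *is* never enters the decision."""
    if len(positions) < 2:
        return False
    x = positions[-1]
    v = positions[-1] - positions[-2]  # m/s, from 1 Hz samples
    if v == 0:
        return False
    t = (car_lane_x - x) / v  # seconds until the object reaches the lane
    return 0 <= t <= horizon_s

# Object at x=2 m, moving +1 m/s toward the lane at x=5 m: reachable in 3 s.
print(will_cross_path([1.0, 2.0], car_lane_x=5.0, horizon_s=4.0))  # True
```

Keeping the kinematic model even while the classifier dithers is exactly what discarding the tracking history on relabel throws away.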
|
The car has to calculate how much of an insurance liability it will be for Uber for every scenario before it will stop. It starts at 0 every time.
|
# ? Nov 11, 2019 02:35 |
|
Green Intern posted:The car has to calculate how much of an insurance liability it will be for Uber for every scenario before it will stop. It starts at 0 every time. Apparently it ends at 0 every time, too.
|
# ? Nov 11, 2019 02:56 |
|
angryrobots posted:This poo poo right here is why I harp on equipment grounding and fault protection in the wiring thread. There is a stupid amount of fault current potential at the utility level, even at residential voltage in many situations. You want that mess to be outside and enclosed, with anything coming inside protected to limit the fault potential. Seriously. The worst I've personally been witness to was some clown using a CAT II multimeter to directly meter a 250kVA generator outputting (what was supposed to be) 208v. In that case a decent portion of the meter vaporized.
|
# ? Nov 11, 2019 06:11 |
|
Sagebrush posted:I get the batshit logic that they were using here -- if you've calculated, say, the motion path for a pedestrian and then you decide that the object is actually a bicycle, you need to update your predictions because a bicycle can move faster and in different ways than a pedestrian does. It is not batshit or dumb at all. It is easy to point and laugh afterwards, but it is very easy to conceive of scenarios that would lead to such an implementation. The object permanence has to compete against objects disappearing (maybe only from view) and appearing. In the real world, objects don't just change, so it is quite possible the code considered it a new object instead of flakey object detection (which is again explainable, as the correct answer was excluded here). Everybody would also have called it stupid if a Tesla hit a pedestrian because a car went past it, so it inherited the speed of the car and was hit because it was too darn slow. This is of course a ridiculous example, as a Tesla would never have object permanence. I'm not defending Uber, but saying that the problem is a lot more complex than people give it credit for, and insights like this that are obvious in hindsight are not beforehand. They should have caught it in testing, but that is literally what they did. People (well, more people than have already) are going to die before the technology gets good enough to be used safely. That raises the question of whether it is worth it. I dunno, maybe? It turns out that the trolley problem we have to deal with is not whether a murdercar should kill 10 hobos to save one businessman, but instead whether incomplete pieces of unpredictable heaps of mess should kill people now in order to potentially save people in 20+ years when the technology is good enough.
|
# ? Nov 11, 2019 06:42 |
|
klafbang posted:It is not batshit or dumb at all. It is easy to point and laugh afterwards, but it is very easy to conceive of scenarios that would lead to such an implementation. Current consumer-facing SDKs have no trouble tracking the multitude of objects in view in a driving scenario by unique ID, regardless of whether the classification changed. Not keeping a useful running tally of each object's classification history in that case is extremely batshit and dumb, among other things. There are lines of research around advanced behaviors like maintaining tracking even if an object deforms in view. That sounds like the sort of thing that should be legally required to be implemented and validated before any of these things hit the road with inattentive/low-paid 'safety' drivers as fallbacks.
|
# ? Nov 11, 2019 07:07 |
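Tracking by stable ID with a running classification tally, as described above, might look like this toy sketch (hypothetical names, not any real SDK's API): the track keeps its motion history across relabels, and past classifications accumulate instead of being overwritten.

```python
from collections import Counter, defaultdict

# Tracks keyed by a stable ID survive relabelling; a running tally of
# past classifications is kept instead of throwing the old one away.
class_history = defaultdict(Counter)
positions = defaultdict(list)

def update(track_id, x, label):
    class_history[track_id][label] += 1  # accumulate, don't overwrite
    positions[track_id].append(x)        # motion history is never reset

for t, label in enumerate(["vehicle", "bicycle", "other", "bicycle"]):
    update(track_id=7, x=float(t), label=label)

print(class_history[7].most_common(1))  # [('bicycle', 2)]
print(len(positions[7]))                # 4 — full motion history retained
```

Even with a flaky classifier, the consensus label and the full trajectory both remain available to the planner.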
|
klafbang posted:insights like this that are obvious in hindsight are not beforehand. They should have caught it in testing, but that is literally what they did. People (well, more people than have already) are going to die before the technology gets good enough to be used safely. Ow Oof this take is so loving hot If the system is anticipated to kill innocent people in its testing, then you find a better loving way to test it or you cancel the research. We don't conduct experiments like that any more. We don't involve unwilling participants in scientific research at all, let alone research that poses a risk to their life. That's literal Nazi poo poo. klafbang posted:whether incomplete pieces of unpredictable heaps of mess should kill people now in order to potentially save people in 20+ years when the technology is good enough. No. They should not. End of debate. There is no reason that anyone should die in the development of self-driving cars. Yes, the process may take longer and cost more if we have to use motorized crash test dummies on a closed course instead of letting half-baked CS 102 projects loose on public streets, but that is the only ethical way to approach the problem. Uber's project only made it onto the streets because of greed and stupidity and it's a goddamn travesty that no one is in jail for this woman's murder. E: like I just want you to really grasp that the trolley problem is not applicable to this situation or to the real world in general. We have decided as a society that it is never acceptable to violate a human being's right to safety and life because doing so might save more people down the line. You can't even make someone donate their blood or bone marrow to save someone else's life at little risk to their own. 
Saying "well, some people will be killed by our experiments, but maybe in a couple of decades we will have improved outcomes somewhat so it balances out" is just incredibly callous and, again, literally the kind of justifications used by Nazi doctors. Sagebrush fucked around with this message at 08:51 on Nov 11, 2019 |
# ? Nov 11, 2019 08:40 |
|
klafbang posted:It is not batshit or dumb at all. It is easy to point and laugh afterwards, but it is very easy to conceive of scenarios that would lead to such implementation. Apparently there are still people in the world so very willing to sacrifice other unwilling people they don’t personally know on the altar of science so they can maybe enjoy some kind of unknown benefit at some indeterminate point in the future.
|
# ? Nov 11, 2019 08:57 |
|
Techbros view safety the same way they do quality: expect errors, so just patch them out. Like it's acceptable to have a buggy launch where people die so long as it gets fixed.
|
# ? Nov 11, 2019 08:59 |
|
Maybe if the trolley runs over enough people it will become enlightened and stop running over people.
|
# ? Nov 11, 2019 09:02 |
|
Platystemon posted:Maybe if the trolley runs over enough people it will become enlightened and stop running over people. What if we connect the trolley to the internet? with a app?
|
# ? Nov 11, 2019 09:05 |
|
Sagebrush posted:We don't conduct experiments like that any more. We don't involve unwilling participants in scientific research at all, let alone research that poses a risk to their life. That's literal Nazi poo poo. Are we really not? There's no real way to say. The US was still doing that poo poo in secret up until the 1960's, that we know of. We live in hellworld, I wouldn't put anything past governments/big business.
|
# ? Nov 11, 2019 09:26 |
|
Sagebrush posted:Ow Oof this take is so loving hot I really appreciate this post, and it needs to be quoted in full again and again and again.
|
# ? Nov 11, 2019 09:28 |
|
C.M. Kruger posted:What if we connect the trolley to the internet? with a app? And people can place money in the app to increase the 'cost' the trolley applies to running them over! Genius!
|
# ? Nov 11, 2019 09:29 |
|
Platystemon posted:Maybe if the trolley runs over enough people it will become enlightened and stop running over people. Or maybe it becomes so enlightened it actively seeks out humans to run over for the betterment of the planet?
|
# ? Nov 11, 2019 09:33 |
|
Bloody Hedgehog posted:Are we really not? There's no real way to say. The US was still doing that poo poo in secret up until the 1960's, that we know of. The reason it was happening in secret is that we as a society generally agree that it's completely and totally un-loving-acceptable. Undoubtedly there are sociopaths still doing that sort of thing in various dark corners of the world, and also in silicon valley - but when such things come to light, it's the responsibility of anyone with any sort of moral center to stand up and say that yes, this is still and has always been completely unacceptable. Don't make excuses for murderous sociopaths.
|
# ? Nov 11, 2019 09:34 |
|
Sagebrush posted:Ow Oof this take is so loving hot Thanks for jumping to the nazi card. That really convinces me I should take your comments seriously. I am not saying that the testing method is acceptable and that Uber should not be punished. I am saying that the error is not as obvious as everybody would make it seem. It is very easy to point afterwards and say that people should have caught it. Even if off-the-shelf APIs can follow objects, that does not imply that this also works with feature interactions that have conflicting goals. You can catch some interactions in testing, but you can never catch all. Uber should probably have caught this one. And the next 10 that will also kill people. And Tesla should be forcibly dissolved for continuing to play crash test dummies with real people and their insane auto pilot. But even after this and decades of testing there will still be issues. At some point it will be put to the test on real people; it should not have happened yet, but no matter how much it is tested, at some point people are killed for this idea. The reference to the trolley problem is not an attempt at a hot take, but more a reference to the same nerd-wankery that leads to people talking about whether to offer free wifi and the color of the interior in Elon's deathtubes (not the child tube, the hypertube) when they should probably talk about making giant vacuum tubes work in the first place. Similarly, the discussion about who to kill in an accident, which is a discussion that needs to be had, is second to the very real and current discussion about whether to kill people for making a technology that could potentially be good. That is also a discussion that needs to be had; right now, the answer should probably be to forbid any real-world testing (maybe Waymo can be allowed, they seem relatively careful). But what in 5 years? 10? 20? 
I don't claim to know the answer and am happy I don't have to make the decision (I would just outlaw all cars period, but realize that not everybody can get where they need by bike/public transport).
|
# ? Nov 11, 2019 09:40 |
|
Jabor posted:... in various dark corners of the world, and also in silicon valley But you repeat yourself.
|
# ? Nov 11, 2019 09:45 |
|
klafbang posted:I am saying that the error is not as obvious as everybody would make it seem. Hard disagree.
|
# ? Nov 11, 2019 09:46 |