I find that hard to believe given the amount of vehicles in the US.
|
|
# ? Mar 22, 2018 16:31 |
|
Owlofcreamcheese posted:Render what? data clouds? depth maps? concentric circles? volumetric? false color? color overlay? wow who made you an export in the data recorded by automated cars, what makes you think you know more than gary uber
|
# ? Mar 22, 2018 16:31 |
|
BENGHAZI 2 posted:wow who made you an export in the data recorded by automated cars, what makes you think you know more than gary uber It's me, I'm the export
|
# ? Mar 22, 2018 16:34 |
|
Invalid Validation posted:I find that hard to believe given the amount of vehicles in the US. Consider that the bulk of travel by cars in the US is done on highways, where the incidence of pedestrian fatalities is extremely low, whereas the self-driving cars mostly test in town traffic, so they don't really get to rack up that many miles very quickly.
|
# ? Mar 22, 2018 16:41 |
|
what does an automated car slamming into a pedestrian sound like? a good start
|
# ? Mar 22, 2018 16:41 |
|
Teal posted:I didn't bother to check the source or maths so feel free to dismiss this figure as bullshit but somebody calculated it and allegedly self driven cars now are statistically 40 times more lethal for pedestrians per distance traveled than driver driven cars. Who alleged that?
|
# ? Mar 22, 2018 16:44 |
|
Teal posted:I didn't bother to check the source or maths so feel free to dismiss this figure as bullshit but somebody calculated it and allegedly self driven cars now are statistically 40 times more lethal for pedestrians per distance traveled than driver driven cars
|
# ? Mar 22, 2018 16:53 |
|
Jesus Christ, lighten up. That statistic was a joke. We haven't murdered enough people with robot cars yet to get a reliable number.
|
# ? Mar 22, 2018 17:09 |
Any statistics on how many accidents automated cars have compared to the general public?
|
|
# ? Mar 22, 2018 17:24 |
|
CopperHound posted:Jesus Christ, lighten up. That statistic was a joke. We haven't murdered enough people with robot cars yet to get a reliable number. I assure you, there will be a bunch of people arguing about that “statistic” completely seriously. Invalid Validation posted:Any statistics on how many accidents automated cars have compared to the general public? Not until they stop using safety drivers. So maybe Waymo’s next report will give us an inkling.
|
# ? Mar 22, 2018 17:30 |
|
Owlofcreamcheese posted:Render what? data clouds? depth maps? concentric circles? volumetric? false color? color overlay? You’re trying to make this way more complicated than it is.
|
# ? Mar 22, 2018 18:40 |
|
Condiv posted:You’re trying to make this way more complicated than it is. Not really, those super pretty lidar videos use a ton of interpolation, smoothing, and sanding of the models it generates. It's really cool, but if you're looking for, like, an accident reconstruction, they aren't what you should use. If you just released a video of the raw point cloud it'd look like a crazy mess of points jittering all around. They haven't said they won't release any of the data, and they probably will, but generating a video that would be usable in the court system wouldn't be a one-click, no-decisions thing. Figuring out how to saw off the random noise to make a usable video without hiding noise that may have impacted the event isn't something you can do as easily or as fast as just posting sensor camera footage unmodified.
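The "saw off the random noise" step being argued about here is a real processing stage: pretty lidar renders typically run outlier removal before anything else. A minimal sketch of classic statistical outlier removal on a synthetic cloud (all names and thresholds are illustrative, nothing here is Uber's actual pipeline):

```python
import numpy as np

def remove_outliers(points, k=8, std_ratio=2.0):
    """Drop points whose mean distance to their k nearest
    neighbours is more than std_ratio standard deviations
    above the cloud-wide average (the classic statistical
    outlier removal scheme used by libraries like PCL/Open3D)."""
    # Full pairwise distance matrix -- fine for a small demo cloud.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    d.sort(axis=1)
    mean_knn = d[:, 1:k + 1].mean(axis=1)  # skip self-distance at column 0
    keep = mean_knn < mean_knn.mean() + std_ratio * mean_knn.std()
    return points[keep]

rng = np.random.default_rng(0)
cloud = rng.normal(0, 0.1, size=(200, 3))   # dense "object" returns
noise = rng.uniform(-5, 5, size=(10, 3))    # stray jitter points
cleaned = remove_outliers(np.vstack([cloud, noise]))
print(len(cleaned))  # most of the 200 dense points survive; the strays go
```

The tuning problem described above is visible even in this toy: loosen `std_ratio` and jitter leaks through; tighten it and you start deleting real returns, which is exactly the "hiding noise that may have impacted the event" worry.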
|
# ? Mar 22, 2018 18:54 |
|
Invalid Validation posted:Any statistics on how many accidents automated cars have compared to the general public? not yet. california requires 'disengagement' tracking, aka how often a vehicle requires intervention from a driver, and there's a handful of fender-bender accidents that don't mean much (many of these between human drivers go unreported). so the limited data we have on self-driving cars isn't really comparable to the massive dataset of human-driver accident/fatality stats
|
# ? Mar 22, 2018 19:17 |
|
Owlofcreamcheese posted:Not really, those super pretty videos of lidar use a ton of interpolation, and smoothing and sanding the models it generaties. It's really cool, but if you are looking for like, an accident reconstruction they aren't what you should use. But if you just released a video of the raw point cloud or something it'd look like a crazy mess of points jittering all around. No one said it’s as fast as raw camera footage. It’s deffo not a multi day process though. Like I said, you’re making it out to be more difficult than it is
|
# ? Mar 22, 2018 19:21 |
|
Condiv posted:No one said it’s as fast as raw camera footage. It’s deffo not a multi day process though. Like I said, you’re making it out to be more difficult than it is It is always multiple days to get things ready. And I already explained why to you. I did E-Discovery work for a couple of years. So I'm pretty informed about how long things take when you're talking about prepping data for potential legal proceedings. You don't know what you're talking about and you're wrong. Go away.
|
# ? Mar 22, 2018 19:24 |
|
Xae posted:Do the LIDAR sensors generate a visual recording by default? No, they generally do things like create point clouds or other geometric representations of the FOV. Condiv posted:it doesn't take a day or two days or three days to produce that rendering oocc I don't know how useful a visual representation of a point cloud is going to be, that's not how this poo poo works. Even if you as a human see a visual representation and can say "I can see a person", that doesn't mean that's how the model they're working with sees it. Cicero posted:According to wikipedia stats + math I just did, the US has a traffic fatality rate of 1 per 88 million miles. Waymo has done 4 million miles, so at most self-driving cars would be 22 times more lethal, but I dunno how many miles all the other companies combined have driven; adding those in would lower the rate. If you count the "virtual miles" (like all their marketing does) they "drive" 10 million miles a day.
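Cicero's arithmetic in the quote checks out; a quick sketch using only the figures quoted (88 million miles per US fatality, 4 million real-world Waymo miles, one fatality charged to them as the worst case):

```python
miles_per_fatality_us = 88e6  # quoted: one US traffic death per 88 million miles
waymo_miles = 4e6             # quoted real-world mileage

# Worst case: treat the single fatality as charged to those 4M miles,
# so the rate ratio is just the mileage ratio.
ratio = miles_per_fatality_us / waymo_miles
print(ratio)  # → 22.0
```

Adding other companies' real-world miles to the denominator lowers the ratio, and counting the marketing "virtual miles" would lower it drastically, which is the point being argued.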
|
# ? Mar 22, 2018 19:25 |
|
Condiv posted:No one said it’s as fast as raw camera footage. It’s deffo not a multi day process though. Like I said, you’re making it out to be more difficult than it is Like what are they releasing, and to whom? The investigators certainly would want the raw data and wouldn't care about some pretty PR video made out of it, and they are probably handing that stuff over asap. For something to release to you, they've got to think hard about what specifically to release, so they aren't sanding it down so far that absolutely perfect 3d models misrepresent the data, but also aren't giving it out so unfiltered that the public is gonna do some "the car drives by this random swarm of dots! of course it crashed! you can't tell a thing that is happening!"
|
# ? Mar 22, 2018 19:35 |
|
Uber isn't going to release the LIDAR data publicly, because there's no possible way it could make them look good - after all, they can't fall back on the "it was dark" excuse with LIDAR. Either the LIDAR did see the pedestrian and ran them over anyway, or the LIDAR couldn't see a human-sized figure in the middle of the road in basically ideal conditions for a LIDAR. Either way, it's in their best interest to insist that the car wasn't any worse than a human driver, and that means avoiding calling any attention to the car's sensory advantages.
|
# ? Mar 22, 2018 20:00 |
|
I am more interested in them releasing information on what was at fault in their automation system: which module was misconfigured or malfunctioned, or which comm between modules bugged out. If it were as simple as showing a log file dump, I'd be ok with that. Is there a timestamped entry with an alert about some obstacle/moving object on a collision course? It's probably not the LIDAR, and I wasn't aware that it would take some processing to make its data presentable for public "consumption". I'd guess it would suffice if they showed something on which some expert could then point out when the LIDAR detects the person (or whether it totally missed them for some reason).
|
# ? Mar 22, 2018 20:03 |
|
Owlofcreamcheese posted:seriously "why is this sensor camera look so bad compared to this iphone video" is really really dumb. You’re the only person here who is talking about iPhone cameras dude. If you really think it’s dumb then change starts with you.
|
# ? Mar 22, 2018 20:07 |
|
Main Paineframe posted:that means avoiding calling any attention to the car's sensory advantages.
|
# ? Mar 22, 2018 20:09 |
|
Solkanar512 posted:You’re the only person here who is talking about iPhone cameras dude. If you really think it’s dumb then change starts with you. people are literally posting an iphone video and saying "why doesn't their camera look as nice as this!???" and saying they intentionally darkened the camera or something. Owlofcreamcheese fucked around with this message at 20:24 on Mar 22, 2018 |
# ? Mar 22, 2018 20:19 |
|
Main Paineframe posted:Uber isn't going to release the LIDAR data publicly, because there's no possible way it could make them look good - after all, they can't fall back on the "it was dark" excuse with LIDAR. Either the LIDAR did see the pedestrian and ran them over anyway, or the LIDAR couldn't see a human-sized figure in the middle of the road in basically ideal conditions for a LIDAR. Either way, it's in their best interest to insist that the car wasn't any worse than a human driver, and that means avoiding calling any attention to the car's sensory advantages. she isn't invisible to light or anything; there isn't going to be an issue with her being visible on lidar. the problem is going to be some form of the system not recognizing what she was. It'll be something like the camera detecting her wide bike as the back of a car instead of a pedestrian and not handling the horizontal movement correctly.
|
# ? Mar 22, 2018 20:24 |
|
Owlofcreamcheese posted:people are literally posting an iphone video and saying "why doesn't their camera look as nice as this!???" and saying they intentionally darkened the camera or something. This is Uber, the company that constantly engages in unethical and illegal activities and cuts corners and lies about it. It's completely plausible that they altered the video.
|
# ? Mar 22, 2018 20:35 |
|
Owlofcreamcheese posted:people are literally posting an iphone video and saying "why doesn't their camera look as nice as this!???" and saying they intentionally darkened the camera or something. Hold the gently caress on, I did resist referring to you directly before but now you're just being daft. What exactly is it you're arguing here? That the data looks like poo poo because that mode of operation where you indeed can't see poo poo is somehow advantageous to the car guidance? Or that one shouldn't expect the primary, forward-facing camera to be comparable with the $35-$43 camera on an iPhone? Or are you trying to pile it in with lidar and imply that the cameras intended to read human-readable signs and signals record in some bizarre, super-complex, too-smart-for-us format that has to be intensely processed to give a human-readable image, and that the fact you can't see loving toss is a result of that? And by the way, gently caress you for the lidar argument anyway; no, many of us would be very well able to interpret the raw LIDAR data, and while I absolutely expect them not to release it to the public because it would be damning as gently caress, I and a bunch of other people have both the know-how and the equipment to do our own interpretation for fun, if for no better reason. Teal fucked around with this message at 20:43 on Mar 22, 2018 |
# ? Mar 22, 2018 20:41 |
|
ElCondemn posted:I don't know how useful a visual representation of a point cloud is going to be, that's not how this poo poo works. Even if you as a human see a visual representation and can say "I can see a person" doesn't mean that's how the model they're working with sees it. that's a problem with their model then Owlofcreamcheese posted:Like what are they releasing to who? again, you're making it more complicated than it is. don't claim they couldn't release a video rendering of their lidar data in a day next time, cause you're dead wrong Xae posted:It is always multiple days to get things ready. And I already explained why to you. that wasn't what was being discussed though xae. sorry, but try to not wander into a conversation if you can't keep up
|
# ? Mar 22, 2018 20:46 |
|
Condiv posted:that wasn't what was being discussed though xae. sorry, but try to not wander into a conversation if you can't keep up You're trying to say the discussion about why things take time isn't relevant to why things take time? You got caught running your mouth off about poo poo you don't understand. Go away and grow up a bit. Xae fucked around with this message at 20:52 on Mar 22, 2018 |
# ? Mar 22, 2018 20:49 |
|
Owlofcreamcheese posted:It'll be something like the camera detecting her wide bike as the back of a car instead of a pedestrian and not dealing with moving horizontally correctly. This sounds like Uber's fault.
|
# ? Mar 22, 2018 20:56 |
|
Xae posted:You're try to say the discussion about why things take time isn't relevant to why things take time? xae, we were discussing technical poo poo. no-one was talking about courtrooms or discovery or anything until you decided to interject. and quite frankly, i don't care about any of that if you wanna argue with someone about discovery or legal proceedings, do it with someone else. ok?
|
# ? Mar 22, 2018 21:05 |
|
Condiv posted:xae, we were discussing technical poo poo. no-one was talking about courtrooms or discovery or anything until you decided to interject. and quite frankly, i don't care about any of that You think that the footage and data won't be part of the inevitable lawsuit? You think Uber's lawyers aren't heavily involved in this and aren't running the show? Uber is dumb as gently caress, but even they can figure out that hitting a pedestrian is going to be a lawsuit.
|
# ? Mar 22, 2018 21:13 |
|
Xae posted:You think Uber's lawyers aren't heavily involved in this and aren't running the show? i already said i don't care about any of that. oocc made a claim it wasn't technically feasible, i argued otherwise. go argue with someone else about what uber's lawyers are doing
|
# ? Mar 22, 2018 21:14 |
|
Teal posted:Hold the gently caress on, I did resist referring to you directly before but now you're just being daft, what exactly is it you're arguing here? That the data looks like poo poo because that mode of operation where you indeed can't see poo poo is somehow advantageous to the car guidance, or that one shouldn't expect the primary, forward facing camera to be comparable with the $35-$43 camera on an iPhone? Or you try to pile it in with Lidar and imply that the cameras intended to read human-readable signs and signals records in some bizarre super complex too-smart-for-us format that has to be intensely processed to give human readable image, and the fact that you can't see loving toss is a result of that? Go get a really nice professional digital camera and a really bad consumer camera and take a photo in bad light. Now open the picture as taken and you'll be shocked to see how much better the crap camera's photo is. It's not that the bad camera is secretly better and the nice camera was a scam or something. It's that the bad camera knows you don't care and just want it to work, so it'll chop off the top of the color range, push up the bottom, compress the color range way down, and call it good. Meanwhile the good camera will happily show you a 12,3,23 pixel and an 11,4,28 pixel which look nearly identical to your eye, because it knows you are a guy with a nice camera who can make your own color balance choices, and would rather do that than throw out half the color range and get a grainy orange mess just because it makes the initial photo look better. A camera in a sensor rig is way more into knowing whether a pixel got a red value of 28 or 31 than into making a nice-looking picture by just declaring that pixel "red" and color-balancing it up till it looks good.
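The "chop off the top, push up the bottom" tradeoff described here can be shown with a toy tone curve (the window and gamma constants are made up for the demo, not any real camera's processing): clipping to a narrow window makes the in-window part of a dim scene look good while collapsing distinct raw readings outside it to the same display value.

```python
def consumer_curve(raw, black=10, white=180, gamma=0.45):
    """Clip a raw sensor value to a narrow window, stretch the
    window to full range, and gamma-lift the shadows -- roughly
    the kind of thing a consumer camera does so a dim scene
    "looks good", at the cost of destroying detail outside it."""
    x = min(max(raw, black), white)          # clip bottom and top
    norm = (x - black) / (white - black)     # stretch the window
    return round(255 * norm ** gamma)        # gamma-lift and quantize

# 5 vs 9 and 200 vs 250 are distinct raw measurements, but the
# display curve maps each pair to the same pixel value: the
# "nice-looking" picture threw that information away.
print([consumer_curve(v) for v in (5, 9, 200, 250)])  # → [0, 0, 255, 255]
```

A linear sensor pipeline keeps all four values distinct, which is why a perception stack cares about the raw 28-vs-31 distinction more than about a viewable image.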
|
# ? Mar 22, 2018 21:20 |
|
Condiv posted:i already said i don't care about any of that. oocc made a claim it wasn't technically feasible where?
|
# ? Mar 22, 2018 21:21 |
|
Owlofcreamcheese posted:where? Owlofcreamcheese posted:I mean, they released the video first because it's a human readable format. It's a lot easier to tweet out a video than like, an array of hundreds of thousands of [x,y,z] points. They might release a rendering from the lidar data later, but it's more steps to make that than just release a video that is viewable as is. hth. i hope the rest of the argument doesn't need to be traced out again for you
|
# ? Mar 22, 2018 21:26 |
|
This thread has gone beyond the scope I had intended. If you wish to further talk about uber's murder machine, make a new thread.
|
# ? Mar 22, 2018 21:27 |