|
That seems like a problem with time zone management more than the encoding format. Although I might be misunderstanding because everything else makes sense. In any case I think we can all agree that time itself is the horror.
|
# ? Oct 29, 2016 22:14 |
|
|
Time zones are the horror. People's dumb obsession with the idea that the numbers must correlate with their observations of the sun. Smartest thing the Chinese ever did was picking a single time zone for their entire country.
|
# ? Oct 29, 2016 22:20 |
|
Time being politicized is also what makes it a horror. Daylight savings time changes by Congress, for example, and even the names of the months of the year going back to ancient Roman times, all leave their mark upon the world for a long time. If you look at all the years and dates Oracle (the database) supports, they include pre-Gregorian calendars, lunar calendars (different civilizations', like Cherokee vs. Chinese), and I think even a goddamn Mayan calendar. Someone wanted that and Oracle evidently felt they were worth writing that crap for. I can only imagine who the hell wanted these calendars in an RDBMS of all places.
|
# ? Oct 29, 2016 22:31 |
|
Decimal time. Let's try it again.
|
# ? Oct 29, 2016 22:48 |
|
Time is an illusion cause relativity, let's just get rid of it
|
# ? Oct 29, 2016 22:50 |
|
necrobobsledder posted:Time being politicized is also what makes it a horror. Daylight savings time changes by Congress, for example, and even the names of the months of the year going back to ancient Roman times, all leave their mark upon the world for a long time. Honestly, it was probably a bored programmer who did it as kind of an Easter egg, though pre-Gregorian calendars are useful for databases that have to track dates from before the calendar change.
|
# ? Oct 29, 2016 22:50 |
|
xtal posted:Time is an illusion cause relativity, let's just get rid of it That's not what relativity means. It just means that the question "what time is it?" makes no sense unless you also know where it is and what coordinate system you're using.
|
# ? Oct 29, 2016 22:52 |
|
xzzy posted:Decimal time. Let's try it again. What do you mean, again? Today is Black Salsify day in the Foggy month of the 225th year!
|
# ? Oct 29, 2016 22:57 |
|
Dr. AA Hazredstein posted:That's not what relativity means. It just means that the question "what time is it?" makes no sense unless you also know where it is and what coordinate system you're using. I'm not aware of any 4 dimensional date encoding formats
|
# ? Oct 29, 2016 23:03 |
|
xtal posted:I'm not aware of any 4 dimensional date encoding formats A time-zone is nothing but a coarse-grained coordinate providing the "space" in "spacetime".
|
# ? Oct 29, 2016 23:05 |
|
Holy poo poo
|
# ? Oct 29, 2016 23:13 |
|
ExcessBLarg! posted:The number of elapsed seconds is actually somewhere between 26 and 36 seconds more since 26 leap seconds have been added to UTC since "new UTC" went into place in 1972, and fractional seconds were added to UTC prior to that since it was synchronized to TAI in 1958. Could you elaborate on that? I'm working on a small project relating to this and as far as I knew the relationship between TAI and UTC wasn't well-defined prior to the beginning of 1961.
|
# ? Oct 29, 2016 23:20 |
|
Suspicious Dish posted:IMO JSON has flaws. The biggest one, which I actually ran into in practice, is that parsers are actually completely random if there are duplicate keys. I even ran into one parser that turned: {"foo": "bar", "foo": "butts"} into {"foo": ["bar", "butts"]}. that's because in json {"foo": "bar", "foo": "butts"} is exactly {"foo": "bar", "foo": "butts"} and not anything else. its completely useless and means that most every parser and mapper doesn't roundtrip json correctly.
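For what it's worth, JavaScript's own JSON.parse is at least deterministic here: ECMA-262 specifies that the last duplicate wins. Other languages' parsers are free to keep the first, throw, or merge, which is the randomness being complained about. A quick sketch:

```javascript
// Duplicate keys are syntactically legal JSON, so what you get back is
// parser-defined. In JavaScript's built-in JSON.parse, the last value
// wins (that much IS specified), but other parsers may differ.
const raw = '{"foo": "bar", "foo": "butts"}';

const parsed = JSON.parse(raw);
console.log(parsed.foo);          // "butts" -- last duplicate wins in JS
console.log(Object.keys(parsed)); // only one "foo"; the first value is silently lost

// Round-tripping destroys information: the re-serialized string no
// longer matches the input.
console.log(JSON.stringify(parsed)); // '{"foo":"butts"}'
```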
|
# ? Oct 30, 2016 02:35 |
|
JSONx does!
|
# ? Oct 30, 2016 12:22 |
|
FamDav posted:that's because in json {"foo": "bar", "foo": "butts"} is exactly {"foo": "bar", "foo": "butts"} and not anything else. its completely useless and means that most every parser and mapper doesn't roundtrip json correctly. why the gently caress is this even legal jason. it's so bad.
|
# ? Oct 30, 2016 12:50 |
|
Soricidus posted:why the gently caress is this even legal jason. it's so bad. Because Doug Crockford doesn't understand the difference between specifying the syntax of a data format and specifying its semantics.
|
# ? Oct 30, 2016 14:51 |
|
Soricidus posted:why the gently caress is this even legal jason. it's so bad. I think in an ordinary use case, the duplicate key should not come up - like in JS, if you assign value["key1"]=2 and then value["key1"]=3, the object just ends up with value["key1"]=3... Like the intended use of JSON is to serialize an object to a string - how do you end up with duplicate keys in the same object? Maybe it could happen if you're creating your JSON through string concatenation...
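That overwrite behavior is why a plain JS object can never produce duplicate keys through the normal serialization path; a sketch:

```javascript
// In a live JS object, assigning the same key twice just overwrites:
const value = {};
value["key1"] = 2;
value["key1"] = 3;
console.log(value["key1"]); // 3 -- the first assignment is gone

// So JSON.stringify can never emit a duplicate key from a plain object;
// duplicates only appear when JSON is built by hand.
console.log(JSON.stringify(value)); // '{"key1":3}'
const handRolled = '{"key1": 2, "key1": 3}'; // legal syntax, made by concatenation
console.log(JSON.parse(handRolled)["key1"]); // 3 -- last one wins on parse
```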
|
# ? Oct 30, 2016 14:56 |
|
Bruegels Fuckbooks posted:I think in an ordinary use case, the duplicate key should not come up Yes, it's not like there have been a bazillion serious bugs caused by people producing inputs that would never be seen in ordinary use cases, let's go ahead and invent more standards that make huge assumptions about what inputs people will produce!
|
# ? Oct 30, 2016 15:23 |
|
Bruegels Fuckbooks posted:Maybe it could happen if you're creating your JSON through string concatenation... Protocol buffers is one encoding that allows this kind of thing, where you can concatenate the text format strings or even the raw bytes of the binary encoding to "merge" messages. For singular fields, the last value for a particular field is taken; for repeated fields, the values are all appended together (in fact, non-packed repeated fields in a regular message are stored that way already). So it's not totally insane that JSON might allow this too in theory, but it doesn't quite hold up, because the concatenation of two JSON strings doesn't make sense: you'd have "{ ... }{ ... }", which isn't valid. (...right? )
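Right: gluing two JSON documents together isn't itself a JSON document, so JSON can't get protobuf-style merge-by-concatenation. A quick check, with object-level merging as the closest equivalent:

```javascript
// Two JSON documents concatenated are not valid JSON.
const a = JSON.stringify({ foo: "bar" });
const b = JSON.stringify({ foo: "butts" });

let threw = false;
try {
  JSON.parse(a + b); // '{"foo":"bar"}{"foo":"butts"}'
} catch (e) {
  threw = e instanceof SyntaxError;
}
console.log(threw); // true -- the parser chokes at the second '{'

// To "merge" you have to parse both and combine at the object level:
const merged = { ...JSON.parse(a), ...JSON.parse(b) };
console.log(merged.foo); // "butts" -- last wins, like a singular protobuf field
```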
|
# ? Oct 30, 2016 15:36 |
|
Soricidus posted:Yes, it's not like there have been a bazillion serious bugs caused by people producing inputs that would never be seen in ordinary use cases, let's go ahead and invent more standards that make huge assumptions about what inputs people will produce! If you're taking JSON from the outside world and not linting it, then you deserve what happens to you. Flobbster posted:{ ... }{ ... } Those are some pretty misshapen triple-nippled boobs you got there, friend.
|
# ? Oct 30, 2016 15:52 |
|
TooMuchAbstraction posted:Those are some pretty misshapen triple-nippled boobs you got there, friend. Hey, we're not here to kinkshame Jason.
|
# ? Oct 30, 2016 17:45 |
|
Internet Janitor posted:Because Doug Crawford doesn't understand the difference between specifying the syntax of a data format and specifying its semantics. Exactly. Even if the semantics was "invalid JSON, throw error", I'd be OK with it.
|
# ? Oct 30, 2016 18:32 |
|
TooMuchAbstraction posted:If you're taking JSON from the outside world and not linting it, then you deserve what happens to you. I can't stop laughing at the idea of a data serialisation format that requires a linter because the spec is so poorly written that merely accepting all standard-compliant inputs leaves you facing undefined behavior.
|
# ? Oct 30, 2016 19:15 |
|
Soricidus posted:I can't stop laughing at the idea of a data serialisation format that requires a linter because the spec is so poorly written that merely accepting all standard-compliant inputs leaves you facing undefined behavior. Somebody took a poorly-thought-out shortcut about a decade ago and now everybody's using it and there's no room for a four-lane but they did it anyway and there's a bunch of overhang and sinkholes pop up all the time but are YOU going to explain why we need to rewrite all the maps and who has time for that innovate innovate innovate disrupt disrupt disrupt?
|
# ? Oct 30, 2016 19:46 |
|
Bruegels Fuckbooks posted:I think in an ordinary use case, the duplicate key should not come up - like in JS, if you assign value["key1"]=2 and then value["key1"]=3, the object just ends up with value["key1"]=3... Like the intended use of JSON is to serialize an object to a string - how do you end up with duplicate keys in the same object? Maybe it could happen if you're creating your JSON through string concatenation... Where I ran into this: this was a library that was serializing URL query strings, where duplicate keys are allowed and supported. It did not output correct JSON for the use case where multiple keys were passed.
|
# ? Oct 30, 2016 20:42 |
|
Suspicious Dish posted:Where I ran into this: this was a library that was serializing URL query strings, where duplicate keys are allowed and supported. It did not output correct JSON for the use case where multiple keys were passed. Multiple instances of the same query string key are an edge case that HTTP parsers seem to get regularly tripped up by. That was probably a bad idea itself.
|
# ? Oct 30, 2016 21:10 |
|
That only matters with parameters in the format of whatever[] right? I would expect that to get converted to an array.
|
# ? Oct 30, 2016 21:16 |
|
Don't serialize querystrings to JSON is the only correct answer. If you want them to round-trip then you have to store it as something that preserves order. Actually you could still use JSON: [["key", "value"], ["key", "value"], ...].
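That array-of-pairs round trip can be sketched with the platform's URLSearchParams (assuming a Node-style environment where it's a global):

```javascript
// Round-tripping a query string through an array of pairs instead of an
// object keeps both the duplicate keys and their order.
const qs = "key=value1&key=value2&other=x";

// [["key","value1"], ["key","value2"], ["other","x"]]
const pairs = [...new URLSearchParams(qs)];
const json = JSON.stringify(pairs);

// And back again, losslessly, since URLSearchParams accepts pair lists:
const restored = new URLSearchParams(JSON.parse(json)).toString();
console.log(restored === qs); // true
```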
|
# ? Oct 30, 2016 21:33 |
|
xtal posted:That only matters with parameters in the format of whatever[] right? I would expect that to get converted to an array. That's a convention established by PHP, not part of any standard.
|
# ? Oct 30, 2016 21:49 |
|
If you want two definitions of 'key' to turn into [val1, val2] why don't you just make it that way in the first place? And if you don't, what the hell do you expect is going to happen?
|
# ? Oct 30, 2016 22:26 |
|
I don't care what happens, as long as it's specified and consistent. I just don't want people thinking JSON or serialized formats are super simple when they don't specify or seem to care about edge cases.
|
# ? Oct 30, 2016 22:45 |
|
Soricidus posted:Yes, it's not like there have been a bazillion serious bugs caused by people producing inputs that would never be seen in ordinary use cases, let's go ahead and invent more standards that make huge assumptions about what inputs people will produce! This is the same line of thinking that led to him being all cutesy with the 'may not be used for evil' license.
|
# ? Oct 30, 2016 23:05 |
|
Suspicious Dish posted:Where I ran into this: this was a library that was serializing URL query strings, where duplicate keys are allowed and supported. It did not output correct JSON for the use case where multiple keys keys were passed. I promise this is not nitpicking: "serialising URL query strings" is an incorrect phrase. The serialisation of a string is the string itself, appropriately wrapped. What that library must have been doing is DE-serialising the query string into an object, and then RE-serialising the object into a JSON string. I don't know the stack you're working with, so is that correct? If it is, the question then becomes: how were the key-value pairs represented in the intermediate object? If they were represented as object properties or as a dictionary, then duplicate keys should not have been allowed to exist and the JSON serialiser is (partially!) excused because it was asked to serialise an already-illegal object. If, however, they were represented as a list of key-value pairs, or as a dictionary with arrays for values (either of which can be a correct representation when duplicate keys must be supported), then yeah, the serialiser absolutely shat the bed.
|
# ? Oct 31, 2016 14:47 |
|
NihilCredo posted:I promise this is not nitpicking
|
# ? Oct 31, 2016 18:01 |
|
Parsing GET parameters into a JSON string sounds like one heck of a coding horror to me. I can sort of see why someone might want to do that, but it sounds like a huge injection exploit waiting to be discovered. Validate that poo poo before it gets to the string-building stage.
|
# ? Oct 31, 2016 18:04 |
|
NihilCredo posted:I promise this is not nitpicking: "serialising URL query strings" is an incorrect phrase. The serialisation of a string is the string itself, appropriately wrapped. Paraphrased in JS, the code (in a third-party library) was roughly: JavaScript code:
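The actual snippet didn't survive the archive. Purely as a guess at the shape of the bug described (the function name and everything else here is invented, not the real library's code), a string-concatenating serializer that happily emits duplicate keys might look like:

```javascript
// HYPOTHETICAL reconstruction -- not the actual library code, which was
// lost from this post. Builds JSON by string concatenation, one pair per
// query parameter, with no check for repeated names.
function queryToJson(pairs) {
  const parts = pairs.map(
    ([k, v]) => JSON.stringify(k) + ": " + JSON.stringify(v)
  );
  return "{" + parts.join(", ") + "}";
}

const out = queryToJson([["foo", "bar"], ["foo", "butts"]]);
console.log(out); // '{"foo": "bar", "foo": "butts"}' -- legal syntax, cursed semantics
```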
|
# ? Oct 31, 2016 18:09 |
|
HTML code:
|
# ? Oct 31, 2016 20:10 |
|
SO, please never change. code:
|
# ? Oct 31, 2016 20:42 |
|
Oh my, that switch is interesting.
|
# ? Oct 31, 2016 20:46 |
|
|
ratbert90 posted:
I see red touching yellow so I'm pretty sure it's poisonous.
|
|
# ? Oct 31, 2016 20:55 |