|
ISO 8601?
|
# ? Oct 27, 2016 16:48 |
|
Volte posted:If I ever consider using JSON for cross-server interop I just imagine trying to send a date from Javascript to Python reliably and my brain kind of short circuits and resets. How do you want your code to receive a date? As some kind of IPC object? Newsflash: those have to be serialized too.
|
# ? Oct 27, 2016 17:01 |
|
IMO JSON has flaws. The biggest one, which I actually ran into in practice, is that parsers are completely random if there are duplicate keys. I even ran into one parser that turned {"foo": "bar", "foo": "butts"} into {"foo": ["bar", "butts"]}.
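For what it's worth, the divergence is easy to demonstrate. A quick Python sketch (CPython's json module silently keeps the last duplicate; the object_pairs_hook trick is just one possible workaround, not anything from the spec):

```python
import json

# RFC 8259 leaves duplicate-key handling undefined, so parsers diverge.
# CPython's json module silently keeps the last value it sees:
doc = '{"foo": "bar", "foo": "butts"}'
print(json.loads(doc))  # {'foo': 'butts'}

# If duplicates should be an error, object_pairs_hook sees every pair
# before the dict is built, so you can check for collisions yourself:
def reject_duplicates(pairs):
    keys = [k for k, _ in pairs]
    if len(keys) != len(set(keys)):
        raise ValueError("duplicate keys: %r" % keys)
    return dict(pairs)

try:
    json.loads(doc, object_pairs_hook=reject_duplicates)
except ValueError as e:
    print(e)  # duplicate keys: ['foo', 'foo']
```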
|
# ? Oct 27, 2016 17:02 |
|
qntm posted:ISO 8601? The format everyone suggests but nobody actually implements. Show me a full ISO-8601 compliant parser in any language.
|
# ? Oct 27, 2016 17:03 |
|
JSON's biggest flaw is its refusal to accept commas after the last element in a list or dict. Everything else is buggy parsers.
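A quick Python illustration of the pain. The regex pre-pass here is a deliberately crude workaround of my own, not a real fix: it will mangle any string value that happens to contain ",]" or ",}".

```python
import json
import re

doc = '{"items": ["a", "b",], "n": 1,}'

# Strict JSON (RFC 8259) rejects trailing commas outright:
try:
    json.loads(doc)
except json.JSONDecodeError:
    print("strict parser rejects trailing commas")

# A crude pre-pass that strips them. NOT safe in general: it also
# rewrites any string value containing ",]" or ",}".
lenient = re.sub(r',\s*([\]}])', r'\1', doc)
print(json.loads(lenient))  # {'items': ['a', 'b'], 'n': 1}
```

Dialects like JSON5 and HJSON exist largely to make this (and comments) legal for config files.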
|
# ? Oct 27, 2016 17:11 |
|
TooMuchAbstraction posted:JSON's biggest flaw is its refusal to accept commas after the last element in a list or dict. Everything else is buggy parsers. Working with puppet for 5+ years has trained me to always leave a comma after the last item in a list. The only effect it's had is making me painfully aware how many languages/parsers freak the gently caress out when you do that.
|
# ? Oct 27, 2016 17:14 |
|
qntm posted:ISO 8601? But in a string? I was actually impressed how MongoDB handled dates, they use ISODate(...) or new Date(...) which is novel but ultimately really lovely on how much extra development would be required to work with it outside of JS.
|
# ? Oct 27, 2016 17:17 |
|
Suspicious Dish posted:The format everyone suggests but nobody actually implements. Show me a full ISO-8601 compliant parser in any language. In my experience this is because what they mean by ISO-8601 is YYYY-MM-DDThh:mm:ssZ, possibly since that's what `date --iso-8601` does, since they've never had to look at the crawling horror that is the ISO-8601 specification.
|
# ? Oct 27, 2016 17:18 |
|
Suspicious Dish posted:The format everyone suggests but nobody actually implements. Show me a full ISO-8601 compliant parser in any language. https://github.com/csnover/js-iso8601 MrMoo posted:But in a string? I was actually impressed how MongoDB handled dates, they use ISODate(...) or new Date(...) which is novel but ultimately really lovely on how much extra development would be required to work with it outside of JS. JSON doesn't have any kind of date object, so, yeah. Alternatively, send an integer containing a count of Unix seconds. Edison was a dick posted:In my experience this is because what they mean by ISO-8601 is YYYY-MM-DDThh:mm:ssZ, possibly since that's what `date --iso-8601` does, since they've never had to look at the crawling horror that is the ISO-8601 specification. Agh, well, there you go. I didn't know ISO 8601 had nonsense like "--10-27" in it. I revise my suggestion to "YYYY-MM-DDThh:mm:ssZ". qntm fucked around with this message at 17:31 on Oct 27, 2016 |
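qntm's fallback, an integer count of Unix seconds, round-trips cleanly as long as both ends agree the value is UTC. A Python sketch (the "sent_at" key is made up for illustration):

```python
from datetime import datetime, timezone

# Sender: serialize an instant as integer Unix seconds. The instant is
# unambiguous, but sub-second precision and calendar context are lost.
dt = datetime(2016, 10, 27, 17, 27, 0, tzinfo=timezone.utc)
payload = {"sent_at": int(dt.timestamp())}
print(payload)  # {'sent_at': 1477589220}

# Receiver: always pass an explicit tz, otherwise fromtimestamp() hands
# back a naive datetime in server-local time.
received = datetime.fromtimestamp(payload["sent_at"], tz=timezone.utc)
assert received == dt
```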
# ? Oct 27, 2016 17:27 |
|
This library strictly implements the simplified ISO 8601 date time string format specified in the ES5 Errata (§15.9.1.15) and will not parse ISO 8601 variants that would otherwise be considered valid under ISO 8601:2004(E).
|
# ? Oct 27, 2016 17:31 |
|
qntm posted:https://github.com/csnover/js-iso8601 Also from that link quote:If you are attempting to parse date strings coming from non-ES5-conformant backends, please consider using the non-conformant edition.
|
# ? Oct 27, 2016 17:36 |
|
TooMuchAbstraction posted:JSON's biggest flaw is its refusal to accept commas after the last element in a list or dict. Everything else is buggy parsers. This. Also no comments! Though that may be more of a using-JSON-as-config-file-format horror.
|
# ? Oct 27, 2016 17:41 |
|
quote:ECMAScript revision 5 adds native support for *simplified* ISO 8601 dates Emphasis mine.
|
# ? Oct 27, 2016 17:53 |
|
Supporting a subset of a standard is the same thing as not supporting a standard.
|
# ? Oct 27, 2016 18:03 |
|
Suspicious Dish posted:Emphasis mine. Yeah, my bad. Up until now I didn't realise ISO 8601 was any more complicated than "YYYY-MM-DDThh:mm:ss" plus "Z" or a time zone offset on the end of it. If someone is sending "2016-W43-4" or "--10-27" then that is troublesome, I don't know offhand of a parser which can handle those cases. I also assumed we were sending specific instants in time, not specific moments on a calendar. Again, if someone sends e.g. "2016-11-06T01:30:00" then that is troublesome and I'm not sure what I would do to handle that using JSON.
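qntm's revised suggestion is easy to pin down in code. A Python sketch of that subset (assuming Python 3.7+, where strptime's %z accepts both a literal 'Z' and offsets with a colon):

```python
from datetime import datetime

# The de-facto "ISO 8601": date, 'T', time, then 'Z' or a numeric offset.
# strptime's %z directive has accepted a literal 'Z' (and offsets with a
# colon) since Python 3.7, so one format string covers both spellings:
FMT = "%Y-%m-%dT%H:%M:%S%z"

print(datetime.strptime("2016-10-27T18:03:00Z", FMT))
print(datetime.strptime("2016-10-27T18:03:00+01:00", FMT))

# Week dates ("2016-W43-4") and truncated forms ("--10-27") are simply
# out of scope; strptime raises ValueError on them:
try:
    datetime.strptime("2016-W43-4", FMT)
except ValueError:
    print("week dates not handled")
```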
|
# ? Oct 27, 2016 18:03 |
|
If you're messing with configuration files, you should be using some actually-known configuration file format with existing parser libraries, or something like Augeas that can convert to and from different formats for whatever data structure your abominable program is using, so that you can define a grammar of some sort. xzzy posted:Working with puppet for 5+ years has trained me to always leave a comma after the last item in a list. The only effect it's had is making me painfully aware how many languages/parsers freak the gently caress out when you do that. I also saw someone argue that it makes it easier to write templates without needing to use a join function. However, I think there's no templating system I'd ever use in production that won't support a ubiquitous foo.join(",") syntax, which is easier to read than something like for(e in foo) { e + "," }
|
# ? Oct 27, 2016 18:05 |
|
qntm posted:Yeah, my bad. Up until now I didn't realise ISO 8601 was any more complicated than "YYYY-MM-DDThh:mm:ss" plus "Z" or a time zone offset on the end of it. If someone is sending "2016-W43-4" or "--10-27" then that is troublesome, I don't know offhand of a parser which can handle those cases. It's funny that this always happens. Someone complains about doing X with some format/specification/API. Someone says "nah, you just do this and it's fine", and then, because engineers are the worst and specifications are always loving huge labyrinthine beasts, it turns out that X is really easy only if you implement the small subset of the spec that is all 90% of people need. OK, well, maybe that's not what always happens, but it's not uncommon either.
|
# ? Oct 27, 2016 18:09 |
|
I have no opinion on whether Augeas is a coding horror, but it is definitely a horror of some sort. The lenses are a new level of hell, I'm sure of it.
|
# ? Oct 27, 2016 18:09 |
|
Volguus posted:And about JS: are the existing JS engines so performant that nobody wants to replace them? Even Chrome doesn't come with a native Dart engine (it did in a developer edition). Or Edge with TypeScript. Everyone just compiles to JS, like it's the holy loving grail of languages. I saw a presentation from Mozilla at MidwestJS that described how they took Python code, compiled it to asm.js and got a performance increase over PyPy. Modern JS engines are stupid fast.
|
# ? Oct 27, 2016 18:13 |
|
xzzy posted:I have no opinion on whether Augeas is a coding horror, but it is definitely a horror of some sort. I had not previously heard of Augeas but it's interesting that they named it after the owner of literally the biggest pile of poo poo in Greek mythology. (or maybe that's the joke, who knows).
|
# ? Oct 27, 2016 18:45 |
|
Thermopyle posted:It's funny that this always happens. I ran into this when parsing OpenGraph data from HTML which suggests that datetimes are "just ISO8601". One website had a timezone spec. Another had "Z" at the end. There was no Python library I could find that actually parsed these drat things properly. For some reason I haven't been able to figure out, everyone thinks ISO8601 is a really simple spec.
|
# ? Oct 28, 2016 19:07 |
|
Also, Augeas solves the problem of "computers should be able to manipulate hand-written configuration files" without ever asking if maybe you should instead have a better mechanism for supporting both machine and human configuration, like the ".d" drop-in snippet approach.
|
# ? Oct 28, 2016 19:12 |
|
fritz posted:I had not previously heard of Augeas but it's interesting that they named it after the owner of literally the biggest pile of poo poo in Greek mythology. (or maybe that's the joke, who knows). Considering it's a tool that attempts to provide a standard interface for managing any one of the billion types of config files that might exist in /etc, yeah, it's 100% intentional. (plus they mention it in their faq) I actually enjoy using the tool, when it works. When it doesn't?
|
# ? Oct 28, 2016 19:13 |
|
Suspicious Dish posted:I ran into this when parsing OpenGraph data from HTML which suggests that datetimes are "just ISO8601". One website had a timezone spec. Another had "Z" at the end. There was no Python library I could find that actually parsed these drat things properly. iso8601 is more parseable than the alternative, namely "whatever the gently caress random poo poo the author thought would be nice and familiar to people from the same country they grew up in" like, at least you can reliably tell which bit is supposed to be the year
|
# ? Oct 28, 2016 19:29 |
|
ISO 8601 is apparently a massive, massive superset of "2016-10-28T19:53:45+00:00 or possibly 2016-10-28T19:53:45Z", but "2016-10-28T19:53:45+00:00 or possibly 2016-10-28T19:53:45Z" by itself, which is what most people evidently mean when they say "ISO 8601", seems fairly tractable to me.
|
# ? Oct 28, 2016 22:39 |
|
Don't forget fractional seconds like 2016-10-28T19:53:45.123456789Z. ISO 8601 is really nice when extending to periods and durations though, and that is where the Java ThreeTen implementation is still a bit lacking.
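In Python terms this is where hand-written format strings fall over. A hedged sketch: datetime.fromisoformat only learned to accept a trailing 'Z' in 3.11, hence the fallback, and a nanosecond string like the one above would still need truncating to microseconds first.

```python
from datetime import datetime

stamp = "2016-10-28T19:53:45.123456Z"

# Python 3.11+ parses the 'Z' suffix directly; older versions need it
# spelled out as an explicit numeric offset.
try:
    dt = datetime.fromisoformat(stamp)
except ValueError:
    dt = datetime.fromisoformat(stamp.replace("Z", "+00:00"))

print(dt.microsecond)  # 123456
```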
|
# ? Oct 29, 2016 00:46 |
|
I build the back end for an app and serve the client dates in ISO8601 format, which is the default when you're using Rails. I got in a lot of trouble for serving strings that failed to parse and crashed the client. When I checked what the problem was, they were parsing the date with a hand written format string that didn't account for decimal seconds.
|
# ? Oct 29, 2016 02:05 |
|
xtal posted:I build the back end for an app and serve the client dates in ISO8601 format, which is the default when you're using Rails. I got in a lot of trouble for serving strings that failed to parse and crashed the client. When I checked what the problem was, they were parsing the date with a hand written format string that didn't account for decimal seconds. Tell them to gently caress off and be proper defensive clientside devs. You didn't serve strings that crashed the client, they wrote a lovely client that's crashable for dumb reasons. I can pretty much picture their lovely code right now. Is it Android? It's Android isn't it. This is why we can't have nice things.
|
# ? Oct 29, 2016 13:50 |
|
Edison was a dick posted:In my experience this is because what they mean by ISO-8601 is YYYY-MM-DDThh:mm:ssZ, possibly since that's what `date --iso-8601` does, since they've never had to look at the crawling horror that is the ISO-8601 specification. It's not YYYY-MM-DDTHH:mm:ss.ssZ, it's technically YYYY-MM-DDTHH:mm:ss.sssZ. YYYY-MM-DDTHH:mm:ss.ssZ just works by convention in modern browsers; however, in IE9, passing an ISO string in the format YYYY-MM-DDTHH:mm:ss.ssZ to the Date constructor will return the "Invalid Date" object, whereas with the sss alternative it will work. (Btw, use moment.js, don't parse dates using the Date object in browsers.) Bruegels Fuckbooks fucked around with this message at 14:54 on Oct 29, 2016 |
# ? Oct 29, 2016 14:51 |
|
Decimal ISO8601 is a stretch goal. They also misuse string formatting in the other direction, doing manual URL encoding on the request bodies. Our solution to the first issue was rounding the times to the second at the server view layer Because Deadlines.
|
# ? Oct 29, 2016 14:51 |
|
Soricidus posted:iso8601 is more parseable than the alternative, namely "whatever the gently caress random poo poo the author thought would be nice and familiar to people from the same country they grew up in" The alternative is not "some ad-hoc bullshit that someone made up for their own case", it's a sane standard that is trivially parseable. Something compatible with strptime for example.
|
# ? Oct 29, 2016 14:56 |
|
Volte posted:The alternative is not "some ad-hoc bullshit that someone made up for their own case", it's a sane standard that is trivially parseable. Something compatible with strptime for example. What is this alternative standard called, and why does nobody ever recommend it instead of iso8601? I'm living in the real world here buddy, I agree that a good standard would be nice but the only choice I'm actually aware of is: some subset of iso8601, or mm/dd/yy hell
|
# ? Oct 29, 2016 17:49 |
|
There's also RFC 2822 for what it's worth but I don't think any of them are sensible. Is there anything wrong with Unix timestamps? xtal fucked around with this message at 18:03 on Oct 29, 2016 |
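For what it's worth, Python's stdlib has shipped an RFC 2822 date parser (the email-header format) forever. A sketch:

```python
from email.utils import parsedate_to_datetime, format_datetime
from datetime import datetime, timezone

# RFC 2822 date-times are what email headers use; the stdlib can both
# parse and emit them:
dt = parsedate_to_datetime("Sat, 29 Oct 2016 17:52:00 +0000")
print(dt.isoformat())  # 2016-10-29T17:52:00+00:00

# And the round trip back to the RFC 2822 spelling:
print(format_datetime(datetime(2016, 10, 29, 17, 52, tzinfo=timezone.utc)))
```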
# ? Oct 29, 2016 17:52 |
|
Soricidus posted:What is this alternative standard called, and why does nobody ever recommend it instead of iso8601?
|
# ? Oct 29, 2016 18:08 |
|
RFC 3339? Or has that been superseded too?
|
# ? Oct 29, 2016 18:21 |
|
xtal posted:Is there anything wrong with Unix timestamps? So Unixtime is good for encoding near-past and near-future date-times accurately to within a few seconds, where the actual representation of the date-time needs to be converted to and from some local time representation by a library or operating system. More specifically, if your application only cares about time accuracy to the point where regular-old NTP is sufficient for synchronization, then Unixtime is probably good enough for serialization. This includes most web apps. Hell, if you're using JSON at all, Unixtime is probably good enough. Unixtime has a bunch of flaws though. It's basically an encoding of UTC with a trivial mathematical formula that doesn't account for leap seconds. Right now my computer says it's "2016-10-29T17:40:11Z", which is 1477762811 in Unixtime. You can calculate that manually by adding up the number of days between 1970-01-01 and 2016-10-29 (17,103), then using the formula: unix_time = elapsed_days*86400 + hours*3600 + minutes*60 + seconds, or, 1477762811 = 17103*86400 + 17*3600 + 40*60 + 11. The problem is that it's commonly believed that Unixtime is "the number of seconds that elapsed since 1970-01-01T00:00:00Z UTC", which isn't actually true. The number of elapsed seconds is actually somewhere between 26 and 36 seconds more, since 26 leap seconds have been added to UTC since "new UTC" went into place in 1972, and fractional seconds were added to UTC prior to that since it was synchronized to TAI in 1958. Thus, Unixtime isn't accurate in a scientific sense. There are discontinuities in Unixtime whenever a leap second is removed, and a repetition when one is added. Now, for a machine where NTP regularly says "oops you're -3 seconds off!" it's fine, but for things like GPS it's not fine. Unixtime also doesn't deal with the representation of date-times. That's one of its advantages, since its trivial representation makes it relatively unambiguous, aside from the aforementioned leap second issues or cases where it uses a different timescale (such as "seconds elapsed TAI"). But there are applications where storage of a specific date-time representation is actually important. Birthdays are a good example. Nobody cares about their birth time as elapsed seconds from 1970, but about the actual date in the local timezone in which they were born. For example, if you were born at 9pm on July 4, 1976 in US Eastern, you might be proud of your bicentennial birth date, but you would have an ISO birth date-time of 1976-07-05T01:00Z, and you would never claim to have an "ISO birth date of 1976-07-05". ExcessBLarg! fucked around with this message at 19:18 on Oct 29, 2016 |
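The leap-second-free formula in the post checks out, and it's exactly what datetime uses under the hood. A quick verification in Python:

```python
from datetime import date, datetime, timezone

# Unixtime is a civil-time formula over UTC, not a true count of elapsed
# SI seconds (leap seconds are ignored). Check the post's arithmetic:
days = (date(2016, 10, 29) - date(1970, 1, 1)).days
assert days == 17103

unix_time = days * 86400 + 17 * 3600 + 40 * 60 + 11
assert unix_time == 1477762811

# datetime agrees, because timestamp() applies the same formula:
dt = datetime(2016, 10, 29, 17, 40, 11, tzinfo=timezone.utc)
assert int(dt.timestamp()) == unix_time
print("ok")
```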
# ? Oct 29, 2016 19:14 |
|
Oh I should stop writing my birthday in Unix time?
|
# ? Oct 29, 2016 19:21 |
|
xtal posted:Oh I should stop writing my birthday in Unix time? Yes, and remember to include the timezone in future, and adjust which day it is depending on where you are.
|
# ? Oct 29, 2016 19:29 |
|
I summarize it as "Unix timestamps do not address all the problems in this article and the follow-up article". This doesn't mean that everyone should go crazy and use protocols like PTP, but if you know that Unix timestamps aren't sufficient then you should be able to articulate where they fall short for your use cases. For what I've been working on, trying to get timing accuracy and precision to within even a millisecond is hard on commodity commercial compute systems because virtualization basically ruins everything, NTP's correction systems can make your accuracy look real wonky, and everyone addresses flaws in timekeeping systems differently (AWS and GCP do not handle leap second adjustments exactly the same, and different NTP servers implement yet other approaches). Then there's the issue that most programs don't use monotonic clock syscalls for externally visible code (see: log files v. internal timers), so I can't rely on most programs' reported timestamps for accuracy, although protocols typically handle precision well enough to within a few milliseconds.
|
# ? Oct 29, 2016 19:33 |
|
xtal posted:Oh I should stop writing my birthday in Unix time?
|
# ? Oct 29, 2016 22:03 |