|
Shout out to CMake, which treats all of the following as false (matched case-insensitively): 0, OFF, NO, FALSE, N, IGNORE, NOTFOUND, the empty string, and any string that ends in "-NOTFOUND".
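For reference, those string rules are easy to restate in code. A minimal Python sketch of just the falsiness list quoted above (real CMake's if() also dereferences variables and treats other nonzero numbers as true, which this ignores):

```python
def cmake_falsy(value: str) -> bool:
    """Return True if CMake's if() would treat this string as false.

    Sketch of the quoted string rules only; CMake's actual if()
    command has extra behavior (variable dereferencing, numeric
    parsing, policy-dependent quirks) not modeled here.
    """
    v = value.upper()  # matching is case-insensitive
    if v in ("0", "OFF", "NO", "FALSE", "N", "IGNORE", "NOTFOUND", ""):
        return True
    return v.endswith("-NOTFOUND")

# All of these are false to CMake:
assert all(cmake_falsy(s) for s in ["off", "No", "Ignore", "", "LIBFOO-NOTFOUND"])
assert not cmake_falsy("ON")
```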
|
# ? Feb 25, 2020 07:28 |
|
Winter Stormer posted:Shout out to CMake, which treats all of the following as false (matched case-insensitively): 0, OFF, NO, FALSE, N, IGNORE, NOTFOUND, the empty string, and any string that ends in "-NOTFOUND". EverythingOKProblemsNotFound
|
# ? Feb 25, 2020 07:29 |
|
Real men use the ISO alpha-3 codes anyway.
|
# ? Feb 25, 2020 08:02 |
|
Less "helpful" YAML would be really nice though
|
# ? Feb 25, 2020 08:36 |
|
Xarn posted:Less "helpful" YAML would be really nice though Or just better specified json, like with an actual standard behavior for numbers. And comments and trailing commas goddamnit
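Both gripes are real in any spec-compliant parser; for instance, Python's stdlib json module rejects comments and trailing commas outright:

```python
import json

# Standard JSON allows neither comments nor trailing commas.
def parses(text: str) -> bool:
    """True if the text is valid JSON per the strict stdlib parser."""
    try:
        json.loads(text)
        return True
    except json.JSONDecodeError:
        return False

assert parses('{"a": 1}')
assert not parses('{"a": 1,}')          # trailing comma: rejected
assert not parses('{"a": 1} // note')   # comment: rejected as extra data
```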
|
# ? Feb 25, 2020 09:54 |
|
Let's just agree on s-expressions, instead of JSON: a JS programmer's wish to use something like s-expressions.
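For the curious, here is a toy Python sketch of what JSON-ish data could look like as an s-expression. The `to_sexp` function and its output format are invented for illustration; there is no standard s-expression encoding of JSON data:

```python
def to_sexp(value) -> str:
    """Serialize nested dicts/lists/scalars to a toy s-expression.

    Hypothetical format, for illustration only: dicts become lists of
    (key value) pairs, lists become plain parenthesized sequences.
    """
    if isinstance(value, dict):
        inner = " ".join(f"({k} {to_sexp(v)})" for k, v in value.items())
        return f"({inner})"
    if isinstance(value, list):
        return "(" + " ".join(to_sexp(v) for v in value) + ")"
    if isinstance(value, str):
        return f'"{value}"'
    return str(value)

assert to_sexp({"a": 1, "b": [2, 3]}) == "((a 1) (b (2 3)))"
```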
|
# ? Feb 25, 2020 10:10 |
|
Soricidus posted:Or just better specified json, like with an actual standard behavior for numbers. And comments and trailing commas goddamnit So XML?
|
# ? Feb 25, 2020 10:24 |
|
Thanks, I hate it
|
# ? Feb 25, 2020 11:16 |
|
YAML is actually ok to write. (XML is preferable to JSON though, esp. as an interchange format...)
|
# ? Feb 25, 2020 11:19 |
|
Are we back to XML again? Because XML is fine, actually.
|
# ? Feb 25, 2020 11:37 |
|
dick traceroute posted:So XML? An extra paragraph of semantics and a rule for comments doesn't get you anywhere near XML. And trailing commas are free.
|
# ? Feb 25, 2020 11:44 |
|
Use CSV for everything. Especially that huge numerical data dump. Be sure to include thousands separators in approximately 10% of the entries. For a real pro move, create it as 5 different files with different column separators, then cat them before sending
|
# ? Feb 25, 2020 12:23 |
|
The point of YAML or JSON is not to be "feature complete" - it's not SGML or XML. Anyone pointing out that it's not a valid tool for some use case is missing the point
|
# ? Feb 25, 2020 13:13 |
|
DoctorTristan posted:Use csv for everything. Especially that huge numerical data dump. Be sure to include thousands separators in approximately 10% of the entries CSV/TSV is fine. Not great, but fine. The horror in your case is that your ETL flow apparently doesn't include a validation step. However the file is delivered, up to and including sneakernet, the specs should give you the option to respond with the equivalent of a 400 Bad Request "Every line must contain exactly 25 commas, sorry fuckos".
|
# ? Feb 25, 2020 13:26 |
|
Tei posted:The point of yaml or json is not to be "feature complete" - Its not sgml or xml The point of a thing is related to how it is actually used, not how you think it is intended to be used. And the prevalence of json parsers with support for extensions like comments, and the endless arguments over whether it’s ok to silently lose precision by interpreting large integers as doubles, indicate that these things do matter in the real world.
|
# ? Feb 25, 2020 13:32 |
|
Soricidus posted:The point of a thing is related to how it is actually used, not how you think it is intended to be used. No. Do I need to write a longer rejection? People put light bulbs inside their rear end and then these are stuck. I don't care what people use these formats for - if it's a bad use I will point and laugh, and I will not use the wrong format myself if I can avoid it
|
# ? Feb 25, 2020 13:42 |
|
Tei posted:The point of yaml or json is not to be "feature complete" - Its not sgml or xml JSON, the JavaScript Object Notation, is incapable of representing all POD JavaScript objects.
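Python has the same gap from the other direction, for what it's worth: json.dumps rejects perfectly ordinary values that have no JSON analogue, much like undefined and Symbol on the JavaScript side. A quick illustration:

```python
import json
from datetime import datetime

def is_json_serializable(value) -> bool:
    """True if the stdlib encoder can represent the value at all."""
    try:
        json.dumps(value)
        return True
    except TypeError:
        return False

assert is_json_serializable({"a": [1, 2.5, None, True, "text"]})
assert not is_json_serializable({1, 2, 3})              # sets: no JSON analogue
assert not is_json_serializable(datetime(2020, 2, 25))  # dates: no JSON analogue
```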
|
# ? Feb 25, 2020 13:48 |
|
NihilCredo posted:CSV/TSV is fine. Not great, but fine. Did I mention the file was also 300GB? The story has a happy ending: that supplier got fired.
|
# ? Feb 25, 2020 13:52 |
|
If you have to write enough JSON or XML or YAML by hand that you feel the need for comments and types, give writing Dhall a try and then print it to whatever format your tools ask for. It's really good.
|
# ? Feb 25, 2020 13:58 |
|
If you have a solution that only solves 90% of your problems, then take that solution and use it for those 90% of problems. If you try to create a solution that solves 100% of your problems, you have created a problem so large that it may take your entire life to solve, and the result may still be unusable, economically unsound, hard to use, ugly looking, and so on.

JSON is more popular today than XML because it is not trying to be the best tool for every use. One of the starting points of JSON was "okay, let's use UTF-8 for strings". From that decision came limitations, but it also made life easier. XML declares the character set inside the document -- so to know what character set the document is written in, you have to read the document, then go back and start again with that knowledge.

From flexibility come hard-to-solve problems, social problems, political problems, problems that create other problems.
|
# ? Feb 25, 2020 14:47 |
|
b0lt posted:JSON, the JavaScript Object Notation, is incapable of representing all POD JavaScript objects. What is "POD" there? Plain Old Dumb? Also, are you complaining that JSON can't natively represent a reference cycle?
|
# ? Feb 25, 2020 15:37 |
|
Tei posted:If you have a solution for a problem that only solve 90% of your problems - then get that solution and use it for these 90% of problems quote:If you try to create a solution to solve 100% of your problems, you have created a problem so large that it may take your entire life to solve and the result may still be unusuable, economic unsound, hard to use, ugly looking and so on so on. JSON is pretty objectively bad for round-tripping arbitrary data, which is what it's mostly used for, and it's pretty bad at being human readable/writable. There's no use case I can think of where JSON is appropriate and something else wouldn't be equally or more appropriate. Volte fucked around with this message at 15:40 on Feb 25, 2020 |
# ? Feb 25, 2020 15:37 |
|
Munkeymon posted:What is "POD" there? Plain Old Dumb? POD means 'Plain Old Data' i.e. just a structure full of data without any object semantics involved, like a C struct. The issue is that undefined and Symbol are primitive JavaScript types that have no analogue in JSON. That's not even mentioning function types or dates, but I dunno if I'd call those plain old data.
|
# ? Feb 25, 2020 15:46 |
|
Volte posted:JSON is pretty objectively bad for round-tripping arbitrary data which is what it's mostly used for, and it's pretty bad at being human readable/writable. There's no use case that I can think where JSON is appropriate and something else wouldn't be equally or more appropriate. Delivering data to a web browser, hth Volte posted:POD means 'Plain Old Data' i.e. just a structure full of data without any object semantics involved, like a C struct. The issue is that undefined and Symbol are primitive JavaScript types that have no analogue in JSON. That's not even mentioning function types or dates, but I dunno if I'd call those plain old data. Symbols are newer than the JSON spec, AFAIK, so that's unsurprising, and I'm having a lot of trouble coming up with a use case for sending an undefined other than simply not sending it because it's not defined. I would not consider functions or dates plain old data, either, but we at least have ISO date formats to exchange a very usefully large subset of them in. Munkeymon fucked around with this message at 16:01 on Feb 25, 2020 |
# ? Feb 25, 2020 15:52 |
|
Munkeymon posted:Delivering data to a web browser, hth Cool, what happens if I need to send a date, the number 2^59, one out of an enumeration of possible values, or a list of values that are all the same type? I need to rely on the client-side having the exact right bespoke JSON extensions to validate and process the data, rendering the entire point of a standardized interchange format moot. If I'm expecting XML, I only need an XML library and the schema to validate and parse it. If I'm expecting JSON, I need a JSON library to parse the string into a JSON object, and then a bespoke parser on top of that to parse and validate the JSON object into something else that I can use. Munkeymon posted:I would not consider functions or dates plain old data, either, but we at least have ISO date formats to exchange a very usefully large subset of them in. Yeah ISO date formats are cool, too bad JSON doesn't support any of them. "Just throw it in a string and let the client's a priori knowledge of the expected format carry them through" is not actually a solution, it's the entire problem that standardized interchange formats are meant to solve. Look at the absolute fiasco that is JSON.NET date handling if you need more info. Volte fucked around with this message at 16:09 on Feb 25, 2020 |
# ? Feb 25, 2020 16:06 |
|
Volte posted:Cool, what happens if I need to send a date, the number 2^59, one out of an enumeration of possible values, or a list of values that are all the same type? I need to rely on the client-side having the exact right bespoke JSON extensions to validate and process the data, rendering the entire point of a standardized interchange format moot. If I'm expecting XML, I only need an XML library and the schema to validate and parse it. If I'm expecting JSON, I need a JSON library to parse the string into a JSON object, and then a bespoke parser on top of that to parse and validate the JSON object into something else that I can use. Oh, hey, we're right back to the "just never write bugs, or integrate with people who do, and this is the best possible solution" approach. Yes, XML makes it possible to define an ironclad schema. The mechanisms for doing this are an opaque pain in the rear end. Many, many people will bypass them with shortcuts and do their own thing to make their lives easier. Since we're talking about an interchange format, just doing things right on your own end isn't enough to fix the whole problem - everybody in the space you're integrating with has to do the annoying and difficult but technically correct thing. That never happens. What you end up with is a half-baked schema, with a whole bunch of special cases and poorly documented extensions that you have to integrate with anyway. And, at that point, it becomes pretty attractive to throw up your hands, say "gently caress it, it's all strings now," and go to JSON. It makes no promises, it has no spec, but at least that means it won't tell you lies.
|
# ? Feb 25, 2020 16:32 |
|
JSON is not for everything. It doesn't need to be. And it is great because it is not trying to be everything for everyone. Can I write some fiction to explain it again? It is not a concept that comes naturally to some people, it seems.

------------------

Forks can't cut meat, you use a knife for that. That doesn't mean forks are wrong, or that it is a good idea to eXtend forks with a blade. Only pain and misery waits down that path. It means that if you need to cut meat, you use a knife.

People in this thread point to a fork and say "it can't hammer a nail". People in this thread say "maybe we should make the fork weigh a lot, so it can hammer the nail". People in this thread say "the author of Forks probably forgot that nails exist". People in this thread say "people often need to hammer nails; Forks are clearly flawed".

Here I say: if you need to hammer a nail, use a hammer. A fork is for food. Tei fucked around with this message at 16:48 on Feb 25, 2020 |
# ? Feb 25, 2020 16:35 |
|
Ever heard of pthreads? Well we're very much in a pee-thread right now
|
# ? Feb 25, 2020 16:54 |
|
Pee-threads are very useful in certain cases. Sure they have their flaws, but you clearly don't know what you are talking about, talking poo poo about pee. edit: Context: a C codebase where objects carry their own type, and it's full of explicit comparisons (e.g. A->type == B->type). I made some modifications to the existing code to add serialization/deserialization of those objects. Of course I notice after a week of debugging (print debugging, no gdb for platform) that my modifications are breaking randomly BECAUSE TYPES ARE POINTERS TO GLOBAL VARIABLES. WHO DOES THIS Beef fucked around with this message at 17:10 on Feb 25, 2020 |
# ? Feb 25, 2020 16:57 |
|
Volte posted:Cool, what happens if I need to send a date, the number 2^59, one out of an enumeration of possible values, or a list of values that are all the same type? I need to rely on the client-side having the exact right bespoke JSON extensions to validate and process the data, rendering the entire point of a standardized interchange format moot. If I'm expecting XML, I only need an XML library and the schema to validate and parse it. If I'm expecting JSON, I need a JSON library to parse the string into a JSON object, and then a bespoke parser on top of that to parse and validate the JSON object into something else that I can use. Someone has to do the work of turning your internal business object graph into reasonable user-facing output at some point(s) no matter how you serialize it. I've worked with people who made thoughtful JSON APIs and people who just poo poo the graph out over the wire and made front-end figure it out. There are advantages to each, actually. Volte posted:Yeah ISO date formats are cool, too bad JSON doesn't support any of them. "Just throw it in a string and let the client's a priori knowledge of the expected format carry them through" is not actually a solution, it's the entire problem that standardized interchange formats are meant to solve. Look at the absolute fiasco that is JSON.NET date handling if you need more info. I'm really baffled as to why you're pretending exchanging ISO date formats as strings is some terrible abomination here? It's at worst a minor inconvenience until a library author makes a bad decision about handling them but that's not evidence of some massive underlying problem with the format the library deals with. Yeah, real date support would be nice, but dates are complicated as gently caress so fobbing that off to the user makes the transport format that much easier to deal with. I'm guessing date format validation would have added a whole lot of JSON parser bugs over the years, just off the top of my head.
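The string convention being argued over looks like this in practice - a sketch with Python's stdlib, where both ends simply agree on ISO 8601 and the JSON layer only ever sees a string:

```python
import json
from datetime import datetime, timezone

# Sender: encode the date as an ISO 8601 string inside ordinary JSON.
when = datetime(2020, 2, 25, 17, 3, tzinfo=timezone.utc)
payload = json.dumps({"event": "post", "at": when.isoformat()})

# Receiver: the JSON layer hands back a plain string; the application
# layer applies the agreed-upon format on top.
decoded = json.loads(payload)
parsed = datetime.fromisoformat(decoded["at"])

assert parsed == when  # round-trips exactly, given both sides agree
```

The catch, as the thread notes, is that the "given both sides agree" part lives entirely outside the format.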
|
# ? Feb 25, 2020 17:03 |
|
Tei posted:Forks can't cut meat, you use a knife for that. Space Gopher posted:Oh, hey, we're right back to the "just never write bugs, or integrate with people who do, and this is the best possible solution" approach. Space Gopher posted:What you end up with is a half-baked schema, with a whole bunch of special cases and poorly documented extensions that you have to integrate with anyway.
|
# ? Feb 25, 2020 17:03 |
|
Munkeymon posted:Someone has to do the work of turning your internal business object graph into reasonable user-facing output at some point(s) no matter how you serialize it. I've worked with people who made thoughtful JSON APIs and people who just poo poo the graph out over the wire and made front-end figure it out. There are advantages to each, actually. Have you ever actually had to deal with these issues? Do you think I'm making these things up? Date handling in JSON has taken up literal weeks of my time. Dates are complicated. That's exactly why there should be one single place that's responsible for the uniform interpretation of these dates that everyone can rely on, so the only thing the clients need to care about is "what do I do with this date" and not "what the gently caress is this string, is this a date? Let's hope so"
|
# ? Feb 25, 2020 17:05 |
|
Tei posted:People put light bulbs inside their rear end and then these are stuck Pro tier programming technique
|
# ? Feb 25, 2020 17:08 |
|
Volte posted:Have you ever actually had to deal with these issues? Do you think I'm making these things up? Date handling in JSON has taken up literal weeks of my time. Dates are complicated. That's exactly why there should be one single place that's responsible for the uniform interpretation of these dates that everyone can rely on, so the only thing the clients need to care about is "what do I do with this date" and not "what the gently caress is this string, is this a date? Let's hope so" Blaming JSON on date formatting incompatibilities makes as much sense to me as blaming TCP
|
# ? Feb 25, 2020 17:09 |
|
Munkeymon posted:Blaming JSON on date formatting incompatibilities makes as much sense to me as blaming TCP Edit: given that JSON has no concept of dates, I shouldn't have said "date handling in JSON". I should have said "utterly unstructured wild west date handling". Volte fucked around with this message at 17:26 on Feb 25, 2020 |
# ? Feb 25, 2020 17:13 |
|
Volte posted:Am I missing something again??? How is this not the JSON experience to a T? Perfect-world XML experience: everything is rigorously defined by a machine-readable schema. Import that schema, point a client at the right endpoints, and your integration work is done in a few minutes. You can get to work building out logic. The sun shines and you get a lollipop. Real-world XML experience: there's still a machine-readable schema, but importing it doesn't begin to describe the data. You need to implement a bunch of mapping logic on top of that schema to handle a legacy of bad decisions and quick hacks. If you're lucky, there's a second set of human-readable documentation that describes these special cases and extensions. Everything you implement needs to take multiple sources of truth into account. Good luck if they conflict. It's easy to say "just don't do this, do it the perfect way," but that just doesn't happen. I've integrated with generally well-architected systems that run major business operations for a company you've heard of, which used mostly coherent and mostly standards-compliant XML. The system I was working against was one of the better-maintained ones I've seen in my career. We still spent weeks going over everything that would, in a perfect world, be in the schema, and all the special cases we needed to know about, before we could start integrating anything. That was probably the best experience with XML that I've ever had - and I've had quite a few others. Sometimes it gets to the point where you need to write your own "XML" parser, because it's really just "we put some tokens in angle brackets with no other consideration." Real-world JSON experience: there's no schema, just some API documentation. It's still probably out of date and doesn't cover everything, but at least it's easier for everyone to keep up to date, and there's less trouble integrating two sources of truth. 
It's nowhere near as good as the perfect-world XML experience could be, but it's miles better than the one that crops up whenever an XML-based system evolves over time. There are reasons people go for JSON.
|
# ? Feb 25, 2020 17:27 |
|
Beef posted:Of course I notice after a week of debugging (print debugging, no gdb for platform) that my modifications are breaking randomly BECAUSE TYPES ARE POINTERS TO GLOBAL VARIABLES. WHO DOES THIS Objective-C I think. Was the answer to serialize something other than the address of the type, then look up the type's address on deserialize?
|
# ? Feb 25, 2020 17:34 |
|
Space Gopher posted:Perfect-world XML experience: everything is rigorously defined by a machine-readable schema. Import that schema, point a client at the right endpoints, and your integration work is done in a few minutes. You can get to work building out logic. The sun shines and you get a lollipop. edit: Is programming the only craft where the accepted solution to "the people doing the work are loving everything up at every turn" is "just give them dumber tools with no sharp edges"? Volte fucked around with this message at 17:47 on Feb 25, 2020 |
# ? Feb 25, 2020 17:38 |
|
pokeyman posted:Objective-C I think. Yep, giant switch statement on read. And a giant printf gently caress you on the user-defined-type case, because there is no way to magically predict where malloc is going to ploink down that type object at some point in the future.
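The portable version of that fix - write a stable type *name*, not whatever address the type object happens to live at - can be sketched in a few lines. Python here rather than the original C, and the registry names are invented for illustration:

```python
# Registry mapping stable names to type objects. The addresses of these
# objects vary from run to run; the names do not, so the names are what
# gets serialized.
TYPE_REGISTRY = {"int32": int, "real64": float, "text": str}
NAME_BY_TYPE = {t: name for name, t in TYPE_REGISTRY.items()}

def serialize(value) -> tuple[str, str]:
    """Write (type_name, payload) instead of (type_pointer, payload)."""
    return (NAME_BY_TYPE[type(value)], str(value))

def deserialize(record: tuple[str, str]):
    """Look the type up by name on read - no address prediction needed."""
    type_name, payload = record
    return TYPE_REGISTRY[type_name](payload)

assert deserialize(serialize(42)) == 42
assert deserialize(serialize(2.5)) == 2.5
```

User-defined types still need to register themselves under a stable name before deserialization, which is the same constraint the giant switch statement enforces by hand.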
|
# ? Feb 25, 2020 17:57 |