|
I remember people talking about "fuskers" in dpph lol
|
# ? Sep 11, 2017 03:28 |
|
|
*fart*
|
# ? Sep 11, 2017 03:29 |
|
akadajet posted:*fart*
|
# ? Sep 11, 2017 03:57 |
|
LinYutang posted:generating javascript through Java string concatenation is extremely my poo poo

this, but via an xslt transform
|
# ? Sep 11, 2017 04:43 |
|
redleader posted:this, but via an xslt transform

this, but coldfusion
|
# ? Sep 11, 2017 07:53 |
|
Dude, there are some things you just don't talk about in public.
|
# ? Sep 11, 2017 08:07 |
i have worked with json data source for about two weeks and my blood is boiling jesus
|
|
# ? Sep 11, 2017 10:11 |
the data source in question is a steam of governmental financial reports for our customers, and the reporting "schema" within the span of 21 days consists of at least 81 distinct (from the processing perspective) variations
cinci zoo sniper fucked around with this message at 11:58 on Sep 11, 2017 |
|
# ? Sep 11, 2017 10:13 |
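(not from the thread: counting "81 distinct variations from the processing perspective" is the kind of thing you can automate by hashing each record's set of key paths and counting distinct fingerprints. a minimal sketch in python rather than the r actually used; the sample records are made up:)

```python
def key_paths(obj, prefix=""):
    """Recursively collect the dotted key paths in a JSON-like object."""
    paths = set()
    if isinstance(obj, dict):
        for k, v in obj.items():
            p = f"{prefix}.{k}" if prefix else k
            paths.add(p)
            paths |= key_paths(v, p)
    elif isinstance(obj, list):
        for item in obj:
            paths |= key_paths(item, prefix)
    return paths

def schema_fingerprint(record):
    """A hashable summary of a record's structure, ignoring the values."""
    return frozenset(key_paths(record))

# three records that look similar but are structurally distinct
records = [
    {"event": {"subevent": {"date": "2017-09-01"}}},
    {"event": {"date": "2017-09-02"}},
    {"event": {"subevent.date": "2017-09-03"}},
]
variants = {schema_fingerprint(r) for r in records}
print(len(variants))  # 3 distinct "schema" variations
```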
|
cinci zoo sniper posted:i have worked with json data source for about two weeks and my blood is boiling jesus i feel your pain
|
# ? Sep 11, 2017 10:25 |
MALE SHOEGAZE posted:i feel your pain

i ended up writing a fairly steamy pile of poo poo in r (i know, i know) that i can probably improve a bit but gently caress me if the original source was made in good faith, or if json is something i would seriously ever consider for anything
|
|
# ? Sep 11, 2017 13:05 |
|
i can't understand why anyone consuming data chooses to use a store that guarantees exactly nothing about what's in a record. projecting our collections (which were being used as de facto tables) into real tables was the only way to see all the nulls where different records used slightly different element names.
|
# ? Sep 11, 2017 17:12 |
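(not from the thread: the "project collections into real tables to see all the nulls" trick from the post above, sketched in plain python with made-up records:)

```python
def project(records):
    """Project heterogeneous records onto the union of their keys,
    filling None where a record lacks a column -- the "real tables
    full of nulls" view described above."""
    columns = sorted({k for r in records for k in r})
    return columns, [[r.get(c) for c in columns] for r in records]

# sample records with slightly different element names (made up)
cols, table = project([
    {"date": "2017-09-11", "amount": 100},
    {"Date": "2017-09-12", "amount": 200},
    {"date": "2017-09-13", "amt": 300},
])
print(cols)      # ['Date', 'amount', 'amt', 'date']
print(table[1])  # ['2017-09-12', 200, None, None]
```

the Nones make the inconsistent naming obvious at a glance.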
|
cinci zoo sniper posted:i ended up writing a fairly steamy pile of poo poo in r (i know, i know) that i can probably improve a bit but gently caress me if the original source was made in good faith, or if json is something i would seriously ever consider for anything

are you manually parsing a raw json string in loving r of all things? because then i don't think json is the problem here!!! (its u)
|
# ? Sep 11, 2017 17:24 |
HoboMan posted:are you manually parsing a raw json string in loving r of all things? because then i don't think json is the problem here!!! (its u)

to inspire this thread even more, the pipeline looks like this

1) go to mongo
2) run a lovely select query
3) save data to json
4) load it into r
5) do automated (3rd party) decomposition of the remaining micro-blobs*
6) select relevant parts from them, and the singular things extracted in mongo
7) shittily convert everything to 2d representation*
8) export the file to csv
9) skype it to relevant coworker

* parts im confident i can improve by a wide margin now that i have more experience with this whole fuckery
|
|
# ? Sep 11, 2017 18:14 |
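(not from the thread: steps 7-8 of the pipeline above, the 2d-conversion and csv export, can be sketched with the stdlib csv module; python here rather than the r actually used, and the records are made up:)

```python
import csv
import io

def export_csv(flat_records):
    """Roughly steps 7-8 of the pipeline above: take already-flattened
    records, use the union of their fields as the header, and let
    restval="" blank out the fields a given record is missing."""
    fieldnames = sorted({k for r in flat_records for k in r})
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames, restval="")
    writer.writeheader()
    writer.writerows(flat_records)
    return buf.getvalue()

csv_text = export_csv([
    {"id": 1, "event.date": "2017-09-11"},
    {"id": 2, "event.subevent.date": "2017-09-12"},
])
print(csv_text)
```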
motedek posted:i can't understand why anyone consuming data chooses to use a store that guarantees exactly nothing about what's in a record. projecting our collections (which were being used as de facto tables) into real tables was the only way to see all the nulls where different records used slightly different element names.

so much this, the thing i worked on was full of poo poo like

event1
-subevent
--date
event2
-date
event3
-1
--date
event4
-subevent.date

and then the loving nulls everywhere that i ended up hunting down with a gigantic fuckoff lapply^2 statement so they don't gently caress something else (semi-pointlessly, which is even more infuriating (let me tell you where i think the fwrite author should go gently caress themselves))
|
|
# ? Sep 11, 2017 18:17 |
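(not from the thread: the four nesting variants in the post above can be handled with one lookup that tries each known shape in turn. a python sketch; the event/subevent names just mirror the post's illustration:)

```python
def find_date(event):
    """Try each schema variation listed above and return the first
    date found, else None. The paths are illustrative."""
    candidates = [
        ("subevent", "date"),  # event1: nested subevent
        ("date",),             # event2: date at top level
        ("1", "date"),         # event3: numeric-string level
        ("subevent.date",),    # event4: dotted literal key
    ]
    for path in candidates:
        node = event
        for key in path:
            if not isinstance(node, dict) or key not in node:
                node = None
                break
            node = node[key]
        if node is not None:
            return node
    return None

print(find_date({"subevent": {"date": "d1"}}))  # d1
print(find_date({"subevent.date": "d4"}))       # d4
print(find_date({}))                            # None
```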
...can you make that last step easier too?
|
|
# ? Sep 11, 2017 18:17 |
silvergoose posted:...can you make that last step easier too? yes, but it's literally one person except me that is interested in this, and they don't need it regularly so i'm too lazy to change working directory to google drive
|
|
# ? Sep 11, 2017 18:20 |
|
cinci zoo sniper posted:the data source in question is a steam of governmental financial reports
|
# ? Sep 11, 2017 18:28 |
|
JewKiller 3000 posted:one of our data providers changed the format of their files. last time i checked they were using the terrible oracle dba's favorite format: a sort-of csv but with pipes instead of commas, and no quoting/escaping rules. well i guess they ran into a situation with pipes in the data? so instead of using a well specified format, it's still kinda-csv, but with '~' (3 characters) as the comma and #@#@# as the newline

followup: some of the data files still use pipes. there is no indication which ones they are, you just have to look at them
|
# ? Sep 11, 2017 18:47 |
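(not from the thread: "you just have to look at them" can at least be done by the program. a hedged python sketch of the vendor format above -- '~' follows the post's description, so swap in the real token if it differs:)

```python
def parse_kinda_csv(text):
    """Heuristic parser for the vendor format described above: newer
    files use '#@#@#' as the row separator and '~' as the field
    separator, older ones are plain pipe-delimited lines -- and since
    nothing in the file says which, we sniff for the row token."""
    if "#@#@#" in text:
        rows, sep = text.split("#@#@#"), "~"
    else:
        rows, sep = text.splitlines(), "|"
    return [row.split(sep) for row in rows if row]

print(parse_kinda_csv("a~b~c#@#@#d~e~f"))  # [['a', 'b', 'c'], ['d', 'e', 'f']]
print(parse_kinda_csv("a|b|c\nd|e|f"))     # same rows, old pipe style
```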
|
i mean the fact that you're getting garbage in garbage out isn't the fault of the serialization format you are ostensibly using

you do realize that people stringbash together "xml" that isn't actually valid xml, right? and then tell you to go gently caress yourself if you politely ask them to fix their poo poo
|
# ? Sep 11, 2017 18:59 |
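(not from the thread: catching stringbashed pseudo-xml is cheap -- run it through a real parser before anything else touches it. a python stdlib sketch:)

```python
import xml.etree.ElementTree as ET

def is_well_formed(xml_text):
    """A real parser rejects stringbashed pseudo-XML outright.
    Schema validation would be a separate, stricter step."""
    try:
        ET.fromstring(xml_text)
        return True
    except ET.ParseError:
        return False

print(is_well_formed("<a><b>ok</b></a>"))  # True
print(is_well_formed("<a><b>oops</a>"))    # False: <b> is never closed
```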
|
at least people don't make xml become invalid just by adding comments to their files
|
# ? Sep 11, 2017 19:05 |
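(not from the thread: the jab above is at json's lack of a comment syntax -- xml has one, strict json does not, so a "commented" json file simply fails to parse. a two-line demonstration:)

```python
import json

# strict JSON has no comments, so this raises JSONDecodeError
# ("Extra data" after the object)
try:
    json.loads('{"a": 1}  // helpful note')
    accepted = True
except json.JSONDecodeError:
    accepted = False
print(accepted)  # False
```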
|
JewKiller 3000 posted:followup: some of the data files still use pipes. there is no indication which ones they are, you just have to look at them
|
# ? Sep 11, 2017 19:09 |
|
Sapozhnik posted:you do realize that people stringbash together "xml" that isn't actually valid xml, right? and then tell you to go gently caress yourself if you politely ask them to fix their poo poo people are gonna be idiots no matter what, but at least with xml there's a spec and a schema (i hope) and automatic validators you can run the data through John Big Booty posted:That's what you get for killing all those Jews. Karma, bitch. why does everyone always assume i kill jews? maybe i am the jew who does the killing of 3000?
|
# ? Sep 11, 2017 19:12 |
|
hey what's a good issue tracker these days
|
# ? Sep 11, 2017 19:12 |
|
quiggy posted:hey what's a good issue tracker these days
|
# ? Sep 11, 2017 19:13 |
|
quiggy posted:hey what's a good issue tracker these days it's definitely not zenhub
|
# ? Sep 11, 2017 19:13 |
|
JewKiller 3000 posted:people are gonna be idiots no matter what, but at least with xml there's a spec and a schema (i hope) and automatic validators you can run the data through because a shark killer kills killer sharks.
|
# ? Sep 11, 2017 19:13 |
|
|
JewKiller 3000 posted:why does everyone always assume i kill jews? maybe i am the jew who does the killing of 3000? Change names to 3000-Killer Jew?
|
# ? Sep 11, 2017 19:18 |
|
quiggy posted:hey what's a good issue tracker these days taiga seems okayish, if you're working for an actual business with money though use jira
|
# ? Sep 11, 2017 19:20 |
|
JewKiller 3000 posted:why does everyone always assume i kill jews? maybe i am the jew who does the killing of 3000? u so edgy
|
# ? Sep 11, 2017 19:24 |
|
quiggy posted:hey what's a good issue tracker these days pivotal tracker, always and forever.
|
# ? Sep 11, 2017 19:34 |
|
obviously the one on github, I mean your project is open sores right? I mean what's the point of working on it if it isn't
|
# ? Sep 11, 2017 19:53 |
|
motedek posted:i can't understand why anyone consuming data chooses to use a store that guarantees exactly nothing about what's in a record. projecting our collections (which were being used as de facto tables) into real tables was the only way to see all the nulls where different records used slightly different element names. lmao, ouch
|
# ? Sep 11, 2017 20:51 |
|
quiggy posted:hey what's a good issue tracker these days
|
# ? Sep 11, 2017 21:01 |
|
quiggy posted:hey what's a good issue tracker these days literally anything except rally gently caress rally
|
# ? Sep 11, 2017 21:41 |
|
quiggy posted:hey what's a good issue tracker these days Jira software/service desk w/ confluence for documentation
|
# ? Sep 11, 2017 22:09 |
|
motedek posted:i can't understand why anyone consuming data chooses to use a store that guarantees exactly nothing about what's in a record. projecting our collections (which were being used as de facto tables) into real tables was the only way to see all the nulls where different records used slightly different element names. but mongo is web scale...
|
# ? Sep 11, 2017 22:10 |
|
JewKiller 3000 posted:why does everyone always assume i kill jews? maybe i am the jew who does the killing of 3000? See also: edgelord.
|
# ? Sep 11, 2017 22:20 |
|
JewKiller 3000 posted:why does everyone always assume i kill jews? maybe i am the jew who does the killing of 3000? that would be Jew Killer3000 you have been foiled by spacing (once again?)
|
# ? Sep 12, 2017 00:44 |
|
|
JewKiller 3000 posted:maybe i am the jew who does the killing of 3000? hmmmmmm... where were you exactly 16 years ago?
|
# ? Sep 12, 2017 01:39 |