|
programmer shitposting
|
# ? Nov 15, 2017 16:55 |
|
|
|
cinci zoo sniper posted:c tp mysql s:
|
# ? Nov 15, 2017 16:59 |
|
gonadic io posted:lol nice. would be better with "terrible programmers:" at the front though no room
|
# ? Nov 15, 2017 17:00 |
|
you don't truly become a terrible programmer until you know just enough to really gently caress yourself and your colleagues
|
# ? Nov 15, 2017 17:07 |
|
St Evan Echoes posted:you don't truly become a terrible programmer until you know just enough to really gently caress yourself and your colleagues this is the thing, newbie programmers aren't terrible programmers. they're bad, but they're bad and ineffectual. it's the subtle bugs that kill you not the glaring syntax errors
|
# ? Nov 15, 2017 17:23 |
St Evan Echoes posted:what do you want the result to be let me rewrite the query better first SQL code:
what i want to do is to retrieve foo.* for every 2016 record, and every historical foo.a (which is incremental) that matches the group id number foo.b, as a string. the problem im running into at the moment is oddball scenarios where i get loads of duplicates (for example, if i replace the WHERE clause with "WHERE foo.a = 21969420", which matches some late 2016 record). i can fix this by slamming DISTINCT into GROUP_CONCAT, or by doing some seemingly painful refactoring (this query is humongous irl) so that the innermost loop contains all foos, which seems to invoke a lot of problems related to filtering and ranking. im reluctant to do the latter, and i wonder if there is something faster than the DISTINCT solution, which appears to be significantly slowing down the already terrible query (100k target foo rows with joins to a couple dozen tables, lots of GROUP_CONCATs and general querying without existing indices).
also i would like to understand what is going on in the first place, since the DISTINCT option sounds like a bandaid, and it really shouldnt be spitting out 10 copies of bar.a, each with all 5 baz.a in it (for that WHERE substitution example). my best guess was that it's something related to joining a value onto an array, but COALESCE(bar.b) in the ON clause did not change anything. understanding it will help fix the other dozen subqueries with the exact same problem
cinci zoo sniper fucked around with this message at 17:28 on Nov 15, 2017 |
|
# ? Nov 15, 2017 17:23 |
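[ed: a minimal sketch of the duplicate problem described above, using python's sqlite3 (which has GROUP_CONCAT like MySQL); the schema and data are invented to mirror the foo/baz names in the post, not taken from the real query]

```python
import sqlite3

# toy reproduction of the fan-out: one group of foos joined to several baz
# rows multiplies the result rows, so GROUP_CONCAT then sees each foo.a
# once per matching baz row
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
    CREATE TABLE foo (a INTEGER, b INTEGER);   -- b is the group id
    CREATE TABLE baz (a TEXT,    b INTEGER);
    INSERT INTO foo VALUES (1, 100), (2, 100), (3, 100);
    INSERT INTO baz VALUES ('x', 100), ('y', 100);
""")

# plain GROUP_CONCAT over the joined rows: 3 foos * 2 bazes = 6 values
dup = cur.execute(
    "SELECT GROUP_CONCAT(foo.a) FROM foo JOIN baz ON baz.b = foo.b"
).fetchone()[0]
print(dup)    # six comma-separated values, each foo.a repeated

# DISTINCT collapses the repeats back to one copy per foo.a
dedup = cur.execute(
    "SELECT GROUP_CONCAT(DISTINCT foo.a) FROM foo JOIN baz ON baz.b = foo.b"
).fetchone()[0]
print(dedup)  # the three foo.a values once each
```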
|
gonadic io posted:this is the thing, newbie programmers aren't terrible programmers. they're bad, but they're bad and ineffectual. it's the subtle bugs that kill you not the glaring syntax errors It's like president trump vs president pence
|
# ? Nov 15, 2017 17:27 |
gonadic io posted:It's like president trump vs president pence there's nothing subtle about shocking the gay away
|
|
# ? Nov 15, 2017 17:29 |
|
javascript is losing some steam in the language popularity contest, it may become the new ruby I think
|
# ? Nov 15, 2017 17:53 |
|
cinci zoo sniper posted:let me rewrite the query better first hmm i got lost like a sentence and a half into your post, just throw away the whole query and start again imo
|
# ? Nov 15, 2017 17:53 |
|
Symbolic Butt posted:javascript is losing some steam in the language popularity contest, it may become the new ruby I think not as long as it's the default language for web browsers.
|
# ? Nov 15, 2017 17:55 |
|
this thread title sucks, I thought maybe the terrible programmers thread got deleted or something.
|
# ? Nov 15, 2017 17:55 |
St Evan Echoes posted:hmm i got lost like a sentence and half into your post, just throw away the whole query and start again imo will see what i can do i guess. tl;dr on my post was that i want to look up same table entries preceding the scope i'm querying, and i keep getting duplicates without hacky solutions
|
|
# ? Nov 15, 2017 18:15 |
|
developer holding a screw driver and jtag cable: “this is a lot easier than fixing the JavaScript”
|
# ? Nov 15, 2017 18:23 |
|
cinci zoo sniper posted:will see what i can do i guess. tl;dr on my post was that i want to look up same table entries preceding the scope i'm querying, and i keep getting duplicates without hacky solutions what happens if you put code:
|
# ? Nov 15, 2017 18:31 |
|
hobbesmaster posted:developer holding a screw driver and jtag cable: “this is a lot easier than fixing the JavaScript” this but unironically
|
# ? Nov 15, 2017 18:35 |
St Evan Echoes posted:what happens if you put
|
|
# ? Nov 15, 2017 18:35 |
|
idk, but then idk why you would be able to do a group function with no group by in the first place, loving mysql
|
# ? Nov 15, 2017 18:37 |
|
hobbesmaster posted:developer holding a screw driver and jtag cable: “this is a lot easier than fixing the JavaScript”
|
# ? Nov 15, 2017 18:50 |
|
akadajet posted:not as long as it's the default language for web browsers. getting there: https://caniuse.com/#feat=wasm
|
# ? Nov 15, 2017 19:05 |
|
cinci zoo sniper posted:will see what i can do i guess. tl;dr on my post was that i want to look up same table entries preceding the scope i'm querying, and i keep getting duplicates without hacky solutions why not split it into two sub queries? like code:
and then concat that?
|
# ? Nov 15, 2017 19:06 |
|
no wait can't you do this instead?code:
i guess you can concat bar? If this was mssql I'd stick a for xml path('') on there to get the concatenation because there's no built in function for it yet but I assume group_concat will do that for you...
|
# ? Nov 15, 2017 19:12 |
Powerful Two-Hander posted:no wait can't you do this instead? i can't use external variables inside a subquery for mysql reasons. at least in the join part i can't - i think i did successfully do this in the select part, but with the number of variables im getting through this query it might cause repeated full scans (not sure how these things work out under the hood if i pull 10 variables from the same table that way). what do you mean by concatenating bar itself, you mean inside the subquery? ill try that at work tomorrow cinci zoo sniper fucked around with this message at 19:26 on Nov 15, 2017 |
|
# ? Nov 15, 2017 19:23 |
|
gonadic io posted:this is the thing, newbie programmers aren't terrible programmers. they're bad, but they're bad and ineffectual. it's the subtle bugs that kill you not the glaring syntax errors worse: broken, unfixable architecture
|
# ? Nov 15, 2017 19:49 |
|
has anyone here used D? do you have opinions on D? i wanna write a network daemon thingy and I’m looking at D idk
|
# ? Nov 15, 2017 19:56 |
|
a witch posted:has anyone here used D? do you have opinions on D?
|
# ? Nov 15, 2017 20:02 |
St Evan Echoes posted:giggled like a child at this entire post this but unsuccessfully tried to make an adequate joke about it
|
|
# ? Nov 15, 2017 20:06 |
|
cinci zoo sniper posted:this but unsuccessfully tried to make an adequate joke about it we know your D is an inadequate joke
|
# ? Nov 15, 2017 20:10 |
Captain Foo posted:we know your D is an inadequate joke gdi, i walked right into this, didnt i
|
|
# ? Nov 15, 2017 20:12 |
|
From what I heard, Rust sucked away pretty much all consumers of D.
|
# ? Nov 15, 2017 20:12 |
|
cinci zoo sniper posted:gdi, i walked right into this, didnt i
|
# ? Nov 15, 2017 20:15 |
|
Zemyla posted:From what I heard, Rust sucked away pretty much all consumers of D. what if I love rust but hate tokio?
|
# ? Nov 15, 2017 20:18 |
|
the prequel to b-trees, a-hives
|
# ? Nov 15, 2017 20:26 |
|
is there a reason you are using a subquery instead of a normal join? also you can just use CONCAT() i think e: like SQL code:
HoboMan fucked around with this message at 20:40 on Nov 15, 2017 |
# ? Nov 15, 2017 20:34 |
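[ed: HoboMan's SQL got eaten by the archive, but the normal-join version he's gesturing at would look something like this; sketched with python's sqlite3 standing in for MySQL, with an invented schema (foo.a incremental, foo.b the group id, plus a made-up yr column for the 2016 filter)]

```python
import sqlite3

# hypothetical self-join version of the query: for each 2016 foo, pull the
# concatenated history of earlier foo.a values sharing the same group id b
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
    CREATE TABLE foo (a INTEGER PRIMARY KEY, b INTEGER, yr INTEGER);
    INSERT INTO foo VALUES
        (1, 100, 2014), (2, 100, 2015), (3, 100, 2016),
        (4, 200, 2015), (5, 200, 2016);
""")

rows = cur.execute("""
    SELECT f.a, f.b, GROUP_CONCAT(hist.a) AS history
    FROM foo AS f
    JOIN foo AS hist
      ON hist.b = f.b
     AND hist.a <= f.a      -- foo.a is incremental, so <= means "preceding"
    WHERE f.yr = 2016
    GROUP BY f.a, f.b
""").fetchall()
print(rows)
```

because the history rows come from a single join arm there is no fan-out, and no DISTINCT is needed.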
|
a witch posted:what if I love rust but hate tokio? if i understand correctly
|
# ? Nov 15, 2017 20:45 |
HoboMan posted:is there a reason you are using a subquery instead of a normal join? i'm joining the table onto itself, but i havent tried doing it without a subquery. im not sure i can do that, but i can try it in the morning to frame the question the other way: i can "solve" this via code:
cinci zoo sniper fucked around with this message at 20:57 on Nov 15, 2017 |
|
# ? Nov 15, 2017 20:50 |
|
cinci zoo sniper posted:what i want to do is to retrieve foo.* for every 2016 record, and every historical foo.a (which is incremental) that matches the group id number foo.b, as a string. if more than one foo can have the same group id then you are going to have duplicates. (period) the solution is to use GROUP_CONCAT(DISTINCT foo.whatever), it's not hacky. the reason COALESCE does nothing is because comparing null to anything (including null) is never true - it evaluates to NULL. it is slow because you are concatenating 100k rows into a goddamn string wtf are you doing!?!?!
|
# ? Nov 15, 2017 20:58 |
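[ed: the NULL behaviour is easy to check directly; a quick sqlite3 demo (MySQL behaves the same way, except that it also has the null-safe <=> operator)]

```python
import sqlite3

cur = sqlite3.connect(":memory:").cursor()

# NULL = NULL is not TRUE, it's NULL (unknown), so a join predicate
# comparing two NULLs matches nothing
print(cur.execute("SELECT NULL = NULL").fetchone()[0])           # None

# COALESCE only helps if you give it a non-NULL fallback; with nothing
# but NULLs it still yields NULL, which is why sticking bar.b into
# COALESCE in the ON clause changed nothing
print(cur.execute("SELECT COALESCE(NULL, NULL)").fetchone()[0])  # None

# a null-safe comparison needs IS (or <=> in MySQL)
print(cur.execute("SELECT NULL IS NULL").fetchone()[0])          # 1
```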
|
redleader posted:yeah, you're right. i misspoke; i said 'overengineered' when i actually meant 'verbose and boilerplatey' I get this argument, however you can always build fewer components if you don’t need as many for templating. either way, it’s a super strong way of assembling components into a functional UI that doesn’t tend to bite you in the rear end, so much so that we use React for pretty much every client job no matter what. server side rendering, client side rendering, doesn’t matter, use React all the same and that way we can change seamlessly, because at the end of the day it’s just JavaScript objects in, DOM/HTML out, and React is a solid way to express that. definitely feel the verbose and boilerplatey though, so I made a project specific snippet/templating tool that asks a few questions about each component and where it needs to go, so you don’t have to worry about the boilerplate so much and don’t feel the friction of not wanting to write new components to avoid the boilerplate. works for me anyway.
|
# ? Nov 15, 2017 21:04 |
|
gonadic io posted:YOSPOS > i actually have reasons to post here unlike half of the people with impostor syndrome is the lowercase a's a nethack reference
|
# ? Nov 15, 2017 21:08 |
|
|
HoboMan posted:if more than one foo can have the same group id then you are going to have duplicates. (period) check the pic above. category can have a few dozen requests, and i have around 100k categories, where i "forward concatenate" every request into the latest one. no idea if that makes it any faster or slower than concatenating 100k rows into a string. a bit confused about coalesce, since docs say it just selects the first non-null value. quote:if more than one foo can have the same group id then you are going to have duplicates. (period) going to face the inevitable bloodshed and roll with GROUP_CONCAT(DISTINCT i guess, and cry when i need to refactor it because it runs for eternity or something. and our new etl dev is not here for another 3.5 weeks cinci zoo sniper fucked around with this message at 21:35 on Nov 15, 2017 |
|
# ? Nov 15, 2017 21:12 |