Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe
"Peer-to-peer" was a dot-com-bubble thing after the explosion of Napster, where a bunch of companies did exactly this stupid idea (no need to buy servers!!!), but it was quickly found by basically everyone involved that client connections are worse than rubbish, upload speeds are bad, their normal computers are really slow, so you need servers anyway.

Skype tried to solve this by closely monitoring every peer in the network, and upgrading really consistent client connections to "super-peers". But this resulted in some hilarious big outages when super-peers went down at the same time. Microsoft seems to have wiped the old Skype blogs off the planet, but there are several outages that are just "oops turns out our super peers were just some phones and we think the battery ran out on them, your calls might be a bit rocky now"

I just think it's hilarious that Yahoo! (yodel noise) got snookered into peer-to-peer hype, 10 years after it died.


Cuntpunch
Oct 3, 2003

A monkey in a long line of kings
Wasn't the peer-to-peer boom also a response to the fact that content abruptly grew in size while bandwidth costs were slow to drop, so there was a period where it really was cost-prohibitive for most companies?
As in... there was some logic to it? Unlike the current tidal wave of "my startup is making a realtime chat app built on blockchain tech for conversation integrity!"

Factor Mystic
Mar 20, 2006

Baby's First Post-Apocalyptic Fiction

Suspicious Dish posted:

Skype tried to solve this by closely monitoring every peer in the network, and upgrading really consistent client connections to "super-peers". But this resulted in some hilarious big outages when super-peers went down at the same time. Microsoft seems to have wiped the old Skype blogs off the planet, but there are several outages that are just "oops turns out our super peers were just some phones and we think the battery ran out on them, your calls might be a bit rocky now"

And every time this comes up on HN, they claim it was because this is how Microsoft installs the back doors. No, it's much simpler: the old architecture didn't scale at all, sorry!

SupSuper
Apr 8, 2009

At the Heart of the city is an Alien horror, so vile and so powerful that not even death can claim it.

Suspicious Dish posted:

Skype tried to solve this by closely monitoring every peer in the network, and upgrading really consistent client connections to "super-peers". But this resulted in some hilarious big outages when super-peers went down at the same time. Microsoft seems to have wiped the old Skype blogs off the planet, but there are several outages that are just "oops turns out our super peers were just some phones and we think the battery ran out on them, your calls might be a bit rocky now"
It was kinda impressive when Skype still worked when the rest of the internet was down though.

xtal
Jan 9, 2011

by Fluffdaddy
Skype was used for gaming chat between the eras of TeamSpeak and Discord, and you could query anybody's IP address by their public username, then DDOS them during important matches. Good times

Cuntpunch posted:

Wasn't the peer-to-peer boom also a response to the fact that content abruptly grew in size while bandwidth costs were slow to drop, so there was a period where it really was cost-prohibitive for most companies?
As in... there was some logic to it? Unlike the current tidal wave of "my startup is making a realtime chat app built on blockchain tech for conversation integrity!"

Well, it still is cost-prohibitive for most companies, which is why everything hosted online goes through a megacorp. P2P went away partly because of security issues like the one I mentioned above, but mostly because it's more complicated and harder to build. You make more profit with centralized infrastructure, fixing the scale issues by throwing (ideally VC) money at them instead.


MrMoo
Sep 14, 2000

Absurd Alhazred posted:

I mean, peer-to-peer content distribution isn't that bad of an idea. It's like realtime bittorrent! :v:

The various illegal IPTV services out of China still use a real-time BitTorrent approach.

Ruggan
Feb 20, 2007
WHAT THAT SMELL LIKE?!


This probably doesn't qualify as a coding horror but I'm going to post it anyway.

I work at a big tech company. We have a division that installs our software at customer sites, and another division that supports the software once it is live. Shouldn't be a surprise that Excel sees pretty wide use for various purposes (project management, ad-hoc trackers, random data analysis, etc). I would expect that through working here you'd pretty quickly pick up a basic understanding of Excel.

Over the past week, I've met two people who have worked here for 7 years and I think are top contenders for the "most proficient at excel" award:

  • Employee #1: Needed 30 minutes of instruction to understand how to write an Excel formula that adds two cells together, à la "=A1+B1".
  • Employee #2: Was creating a CSV document from Excel by manually concatenating cell values with a delimiter, copy-pasting the result out of Excel cell by cell into Notepad++, and saving that file as a CSV. Instead of... you know... saving the Excel file as a CSV directly.

#2 in particular is hilarious to me because it seems like a ham-fisted parody of the "roll your own CSV parser" horror that keeps coming up in tech circles.
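Since it comes up: the manual concatenate-and-paste routine in #2 is exactly what a csv library replaces, quoting rules included. A minimal Python sketch (the rows and filename here are made up):

```python
import csv

# Hypothetical rows; hand-concatenating these with commas would break on the
# embedded comma and quotes below, which csv.writer escapes correctly.
rows = [
    ["id", "name", "notes"],
    [1, "Acme", 'contains, a comma and "quotes"'],
]

with open("export.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```

The point isn't the three lines; it's that the quoting edge cases the manual approach silently mangles are handled for free.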

canis minor
May 4, 2011

My previous company. The myth: "we're doing Machine Learning to track which routes are profitable, so that our network of connections adapts itself to demand". The reality: meet Bob. Bob will download a database as a CSV, open it on a custom-purchased machine with loving water cooling and all the fancy specs needed to handle the amount of data being imported into Excel, make alterations to what the routes should be on given days and how they should be priced, then upload it to the server.

That's my most recent Excel related horror story :v:


Munkeymon
Aug 14, 2003

Motherfucker's got an
armor-piercing crowbar! Rigoddamndicu𝜆ous.



Excel is too powerful and access to it should be restricted

Eggnogium
Jun 1, 2010

Never give an inch! Hnnnghhhhhh!
Excel is a tough one for me because I end up using it like once every month or two, which is just enough time to forget all the minutiae, like how to make a cell reference row-bound vs. column-bound and so on. This makes it always a slightly frustrating experience, which makes me avoid it, so I never get those details ingrained.

Janitor Prime
Jan 22, 2004

PC LOAD LETTER

What da fuck does that mean

Fun Shoe
Just push F4 and then gently caress with the $ signs if you only want the row or column locked

Doom Mathematic
Sep 2, 2008
Excel is crying out for a strict mode.

Ruggan
Feb 20, 2007
WHAT THAT SMELL LIKE?!


Munkeymon posted:

Excel is too powerful and access to it should be restricted

NihilCredo
Jun 6, 2011

iram omni possibili modo preme:
plus una illa te diffamabit, quam multæ virtutes commendabunt

From HN:

https://news.ycombinator.com/item?id=20818398 posted:

Apple's libc used to shell out to perl in a function: https://github.com/Apple-FOSS-Mirror/Libc/blob/2ca2ae74647714acfc18674c3114b1a5d3325d7d/gen/wordexp.c#L192

Jazerus
May 24, 2011


https://twitter.com/Parent5446/status/1166179218188881920

ultrafilter
Aug 23, 2007

It's okay if you have any questions.


canis minor posted:

My previous company. The myth: "we're doing Machine Learning to track which routes are profitable, so that our network of connections adapts itself to demand". The reality: meet Bob. Bob will download a database as a CSV, open it on a custom-purchased machine with loving water cooling and all the fancy specs needed to handle the amount of data being imported into Excel, make alterations to what the routes should be on given days and how they should be priced, then upload it to the server.

That's my most recent Excel related horror story :v:

Hammerite
Mar 9, 2007

And you don't remember what I said here, either, but it was pompous and stupid.
Jade Ear Joe

amazing how goatkcd is so apt so often

Kuule hain nussivan
Nov 27, 2008

I'd love it if someone could confirm that the below way of doing this is crazy and a true coding horror.

We have an application that fetches images from a hard drive that go with a form. Each form has a unique ID and each image is named according to the ID (so ID.jpg). All the files exist in the same folder, either in the root or in a subfolder which can be easily parsed from the ID. When fetching these files, the application doesn't use a path to the file, but instead calls the terminal's find command. The folder has (at the moment) about 700,000 images in it and this is likely to increase to about a million sooner rather than later.

Is there any sane reason to use find? Why not just try opening the file with a path in the root, then in the clustered folder if that fails (or the other way around, since most of the files are clustered)? This seems like a huge loving waste of resources.
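For reference, the direct-lookup alternative described above is only a few lines. A hedged Python sketch: the "first two characters of the ID" subfolder rule is my invention, since the post only says the subfolder can be parsed from the ID:

```python
from pathlib import Path
from typing import Optional

def locate_image(root: str, form_id: str) -> Optional[Path]:
    """Try the known locations for ID.jpg directly instead of scanning with find."""
    base = Path(root)
    candidates = [
        base / f"{form_id}.jpg",                # some files sit in the root itself
        base / form_id[:2] / f"{form_id}.jpg",  # assumed: subfolder parsed from the ID
    ]
    for path in candidates:
        if path.is_file():
            return path
    return None
```

Two stat() calls in the worst case, versus a full directory walk per lookup.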

Hammerite
Mar 9, 2007

And you don't remember what I said here, either, but it was pompous and stupid.
Jade Ear Joe

Kuule hain nussivan posted:

I'd love it if someone could confirm that the below way of doing this is crazy and a true coding horror.

We have an application that fetches images from a hard drive that go with a form. Each form has a unique ID and each image is named according to the ID (so ID.jpg). All the files exist in the same folder, either in the root or in a subfolder which can be easily parsed from the ID. When fetching these files, the application doesn't use a path to the file, but instead calls the terminal's find command. The folder has (at the moment) about 700,000 images in it and this is likely to increase to about a million sooner rather than later.

Is there any sane reason to use find? Why not just try opening the file with a path in the root, then in the clustered folder if that fails (or the other way around, since most of the files are clustered)? This seems like a huge loving waste of resources.

have you asked your coworkers why it does it that way?

Kuule hain nussivan
Nov 27, 2008

Hammerite posted:

have you asked your coworkers why it does it that way?
Closed source, outside supplier. When I asked, I got "It is how the system works" as an answer from a senior developer.

Hammerite
Mar 9, 2007

And you don't remember what I said here, either, but it was pompous and stupid.
Jade Ear Joe

Kuule hain nussivan posted:

Closed source, outside supplier. When I asked, I got "It is how the system works" as an answer from a senior developer.

oh ok. They are probably idiot hell fuckers, op.

Kuule hain nussivan
Nov 27, 2008

Hammerite posted:

oh ok. They are probably idiot hell fuckers, op.
Yeah, I personally can't figure out any reason why find should be used. This loving system is such a mess.

ynohtna
Feb 16, 2007

backwoods compatible
Illegal Hen
If you just learnt it, flaunt it all over the loving project base.

Tei
Feb 19, 2011
Probation
Can't post for 47 hours!

Kuule hain nussivan posted:

I'd love it if someone could confirm that the below way of doing this is crazy and a true coding horror.

We have an application that fetches images from a hard drive that go with a form. Each form has a unique ID and each image is named according to the ID (so ID.jpg). All the files exist in the same folder, either in the root or in a subfolder which can be easily parsed from the ID. When fetching these files, the application doesn't use a path to the file, but instead calls the terminal's find command. The folder has (at the moment) about 700,000 images in it and this is likely to increase to about a million sooner rather than later.

Is there any sane reason to use find? Why not just try opening the file with a path in the root, then in the clustered folder if that fails (or the other way around, since most of the files are clustered)? This seems like a huge loving waste of resources.

This sometimes happens and can be caused by unforeseen changes in an application.

The solution is easy, I think: either organize these photos by year (2018/14324234.jpg, 2019/23232442.jpg), or by the first two digits (14/4324234.jpg, 23/232442.jpg).

Most filesystems will start to struggle with that many files in one directory, with operations taking longer and longer.

Trying to browse the folder with a graphical file manager would probably lock it into a long read operation.
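The bucketing scheme is a one-liner. A sketch in Python, using a two-level variant of the digit split suggested above (the exact depth and jpg naming are assumptions):

```python
from pathlib import Path

def sharded_path(base: str, form_id: str) -> Path:
    """Bucket by leading digits so no single directory holds a million entries.

    14324234 -> base/14/32/14324234.jpg. With two 2-digit levels, a million
    files spread over 10,000 leaf directories averages ~100 files each.
    """
    return Path(base) / form_id[:2] / form_id[2:4] / f"{form_id}.jpg"
```

Because the path is derived from the ID, lookups never scan a directory at all.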

Ruggan
Feb 20, 2007
WHAT THAT SMELL LIKE?!


suggest they store the image binary in a local mysql database running on your local gaming laptop

putin is a cunt
Apr 5, 2007

BOY DO I SURE ENJOY TRASH. THERE'S NOTHING MORE I LOVE THAN TO SIT DOWN IN FRONT OF THE BIG SCREEN AND EAT A BIIIIG STEAMY BOWL OF SHIT. WARNER BROS CAN COME OVER TO MY HOUSE AND ASSFUCK MY MOM WHILE I WATCH AND I WOULD CERTIFY IT FRESH, NO QUESTION
Honestly it is a weird way of doing it, but it's not uncommon for weird poo poo like that to stick around long after requirements have changed. The important question is: does it matter? Is it causing a problem? If it's not, but an incoming change makes you think it might become one, it's part of your job to optimise for that newly introduced problem.

But yeah, I'm trying hard to be kind here just in case.

Hughlander
May 11, 2005

Kuule hain nussivan posted:

I'd love it if someone could confirm that the below way of doing this is crazy and a true coding horror.

We have an application that fetches images from a hard drive that go with a form. Each form has a unique ID and each image is named according to the ID (so ID.jpg). All the files exist in the same folder, either in the root or in a subfolder which can be easily parsed from the ID. When fetching these files, the application doesn't use a path to the file, but instead calls the terminal's find command. The folder has (at the moment) about 700,000 images in it and this is likely to increase to about a million sooner rather than later.

Is there any sane reason to use find? Why not just try opening the file with a path in the root, then in the clustered folder if that fails (or the other way around, since most of the files are clustered)? This seems like a huge loving waste of resources.

On a Unix-like that won't scale very well; the time to walk the inodes increases more than linearly. A million years ago this used to be a big problem on news servers, since each message was a unique file in a directory whose path corresponded to the newsgroup name (i.e. /var/spool/usenet/alt/binaries/linux). The more active the newsgroup, the longer it would take to determine whether you had a particular message. I did a hack to the Linux kernel that let open() operate on a substring inode:%d, where if it matched it would just open that inode directly rather than treat it as a filename. That was a 20x speed increase at the time, and it would only have gotten better with retention.

If you want to be a hero and have good tests, put a shellscript 'find' on the PATH that checks whether it's being called by your app; if so, check whether the file exists at the right spot and return that, and if not, exec the real find.
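That shim can be sketched in Python. Everything layout-specific below is an assumption, not a drop-in: the real find's location, the `find <root> -name <ID>.jpg` call shape, and the cluster scheme.

```python
"""Fake `find` dropped earlier on PATH than the real one."""
import os
import sys

REAL_FIND = "/usr/bin/find"  # assumed location of the genuine binary

def short_circuit(argv, root):
    """If argv looks like `-name <ID>.jpg`, return the file's expected path, else None."""
    if "-name" not in argv:
        return None
    name = argv[argv.index("-name") + 1]
    for candidate in (os.path.join(root, name),
                      os.path.join(root, name[:2], name)):  # assumed cluster scheme
        if os.path.isfile(candidate):
            return candidate
    return None

def main():
    # Called as: find <root> -name <ID>.jpg  (the shape the app is assumed to use)
    root = sys.argv[1] if len(sys.argv) > 1 else "."
    hit = short_circuit(sys.argv[2:], root)
    if hit:
        print(hit)  # mimic find's one-line output and stop here
    else:
        # Anything we don't recognize falls through to the real find.
        os.execv(REAL_FIND, [REAL_FIND] + sys.argv[1:])
```

Installed for real, this would get a `main()` call under `if __name__ == "__main__":`, the executable bit, and a spot on PATH ahead of /usr/bin.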

tak
Jan 31, 2003

lol demowned
Grimey Drawer

Hughlander posted:


If you want to be a hero and have good tests, put a shellscript 'find' on the PATH that checks whether it's being called by your app; if so, check whether the file exists at the right spot and return that, and if not, exec the real find.

If it's causing problems, propose a change to the implementation. If it's not causing problems, leave it alone and do something productive instead

Kuule hain nussivan
Nov 27, 2008

Hughlander posted:

If you want to be a hero and have good tests, put a shellscript 'find' on the PATH that checks whether it's being called by your app; if so, check whether the file exists at the right spot and return that, and if not, exec the real find.
This would actually be pretty great, but I can't trust this piece of poo poo not to break at the slightest modification. Maybe I'll do a small shellscript to log the time each find takes, and complain to the provider with those numbers.

tak posted:

If it's causing problems, propose a change to the implementation. If it's not causing problems, leave it alone and do something productive instead
Or in this particular case, where it's only one of several likely issues causing piss-poor performance: I can propose a change to the implementation, I'll be sent follow-up questions that have nothing to do with said issue, there'll be a week or two of back and forth, they'll finally send me an email stating that they'll send it over to the technical department, and in only 6-12 months I'll receive an email saying that this is a standard implementation in the software which is used by dozens of customers and cannot be the cause of bad performance, therefore it must be our server, despite said server installation having been done by them (it was poo poo by the way) and the server currently running roughly 4 times the resources they specified for about 10 more concurrent users.

I loving hate our software provider.

Edit: I seriously can't overstate how bad they are. A simple issue where copying a form didn't empty some text fields which it should have took half a year to fix. HALF...A...loving...YEAR!


Volmarias
Dec 31, 2002

EMAIL... THE INTERNET... SEARCH ENGINES...

Kuule hain nussivan posted:


Edit: I seriously can't overstate how bad they are. A simple issue where copying a form didn't empty some text fields which it should have took half a year to fix. HALF...A...loving...YEAR!

You didn't dump them over that so they're putting in exactly the level of support that your organization will tolerate.

WRT using find, does it specify the exact path or does it use find with -name or grep? If the latter, I guaran-loving-tee it's because Karen at some other customer company wants to nicely arrange all of the files for all of the forms into their own folders, and this is their solution.

Edit: also their "stress test" if they have one involves maybe 1000 files, it's fast on the dev machine so what's the problem??????


Kuule hain nussivan
Nov 27, 2008

Volmarias posted:

You didn't dump them over that so they're putting in exactly the level of support that your organization will tolerate.

WRT using find, does it specify the exact path or does it use find with -name or grep? If the latter, I guaran-loving-tee it's because Karen at some other customer company wants to nicely arrange all of the files for all of the forms into their own folders, and this is their solution.

Edit: also their "stress test" if they have one involves maybe 1000 files, it's fast on the dev machine so what's the problem??????

I've been saying for the past year or so that we just need to stop loving around and say that since they don't meet the support levels specified in the contract, we won't pay for support. The problem is that the rest of the team (it's a small team with only two technical people and two field-specific people in it) are being pansies about it. It doesn't help that when the contract was written, back before I even joined the organization, the support levels were specified by the provider. So only critical errors (which in this case means the app is unreachable) have an actual time to fix allotted to them. Everything else is given a vague time limit, like "next update". So basically, they can postpone the deadline forever by just not delivering any updates to us. They don't do this because they don't give a poo poo about support levels, but they theoretically could.

It's find with -name, which is possibly limited to the uppermost level of clustering in some cases. I'd have to double check. But considering their track record, I always assume the worst.

And I wish they did even a simple stress test on this thing. I once spent 3 weekends in a row doing updates which never did anything, because these fucks don't test poo poo. When I brought it up, I was told that they have very rigorous tests for their updates, and to show proof of any updates that didn't do anything. When I offered to look them up, I was told "Well, that's not going to be constructive at all".

Edit: But enough about me raging. I'll try and look up some actual horrors for my next post. Shouldn't be hard!

Volmarias
Dec 31, 2002

EMAIL... THE INTERNET... SEARCH ENGINES...
Sup Wrote The Goddamned Tests For The Vendor Because They Didn't buddy. I had to deal with that in my last job, and their support portal only worked in MSIE (unless you changed the user agent) so that was really the first clue of what you were getting. The only reason that we had gone with them was that their demo code actually worked. Most of the rest didn't even compile, let alone work.

Hammerite
Mar 9, 2007

And you don't remember what I said here, either, but it was pompous and stupid.
Jade Ear Joe
Probably a benign WTF rather than a horror per se. Just learned while idly looking down the questions on Stack Overflow that PHP now has nullable type declarations, a bit like C# and probably other languages... but you have to put the ? at the start of the type name, e.g. ?int

I wonder whether they had to put it at the start because putting it at the end would be a breaking change in their crazy frankenstein language because it would be parsed as a ternary operator or something

Hammerite
Mar 9, 2007

And you don't remember what I said here, either, but it was pompous and stupid.
Jade Ear Joe
FUN FACT: "PHP" is actually a recursive acronym. It stands for "Pretty Horrible PHP"

YO MAMA HEAD
Sep 11, 2007

Hammerite posted:

Probably a benign WTF rather than a horror per se. Just learned while idly looking down the questions on Stack Overflow that PHP now has nullable type declarations, a bit like C# and probably other languages... but you have to put the ? at the start of the type name, e.g. ?int

I wonder whether they had to put it at the start because putting it at the end would be a breaking change in their crazy frankenstein language because it would be parsed as a ternary operator or something

https://news-web.php.net/php.internals/92276

"It's better to use ? position before the type, to reduce fragmentation with HHVM."

so blame hhvm i think

McGlockenshire
Dec 16, 2005

GOLLOCKS!

Hammerite posted:

I wonder whether they had to put it at the start because putting it at the end would be a breaking change in their crazy frankenstein language because it would be parsed as a ternary operator or something

You might wanna go back over the PHP7 RFCs, NikiC rewrote the entire parser. None of the things that used to be "we can't do it because the parser is dumb" are still valid.

Hammerite
Mar 9, 2007

And you don't remember what I said here, either, but it was pompous and stupid.
Jade Ear Joe

McGlockenshire posted:

You might wanna go back over the PHP7 RFCs, NikiC rewrote the entire parser. None of the things that used to be "we can't do it because the parser is dumb" are still valid.

You flatter me (wait, or do you?) by suggesting I might have any recent familiarity with PHP. I was just making an uninformed "lol PHP sucks" shitpost.

Tei
Feb 19, 2011
Probation
Can't post for 47 hours!
My problem with PHP is that it doesn't have a personality. For a long time the PHP devs copied things from Java; now the inspiration seems to be the good parts of JavaScript.
The original inspiration of PHP seems to be C and Perl with bits of C++.
In some places, the language devs weren't organized enough or up to the task.
The language has the elegance of a universal TV remote; unfortunately it is useful and hard to replace.
Maybe things will get better: if JavaScript can redeem itself, maybe PHP can too.

Soricidus
Oct 21, 2010
freedom-hating statist shill

Tei posted:

Maybe things will get better: if JavaScript can redeem itself, maybe PHP can too.

if


Pollyanna
Mar 5, 2005

Milk's on them.


Can someone sanity check me on this? We upload a file to a customer daily that looks like this:

code:
CompanyName_2019-08-30.csv
They came to us today with (effectively) this request:

quote:

We’re unable to get daily data from your file because we can’t overwrite our old copy of your file since the name is different cause of the timestamp. Can you please remove the timestamp in the below file?

code:
46962270854 August 30, 2019 456642454344.CompanyName_2019-08-30.csv.dat
We tried to remove it but failed.

First of all, that is not our file. They're loving like, processing our file and deriving something from it. That's their goddamn responsibility, not ours. Every one of our customers gets the file with the timestamp and might depend on it; we're not going to implement special-casing just for you. Our responsibility is to put the file on your server in the specified location with the specified file name prefix, and that's it.

My question is this: is it unreasonable to put our foot down and say that we're not going to change functionality just because they're too incompetent to strip a timestamp from a file name? Because we're not going to do their engineering work for them.
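For scale, here is roughly what the customer keeps failing to do on their side, sketched in Python. The inbox path is hypothetical, and "copy the newest drop to a fixed name" is my guess at the behavior they want:

```python
import glob
import os
import shutil
from typing import Optional

def promote_latest(inbox: str, prefix: str = "CompanyName") -> Optional[str]:
    """Copy the newest CompanyName_YYYY-MM-DD.csv drop to a stable, overwritable name."""
    drops = sorted(glob.glob(os.path.join(inbox, f"{prefix}_*.csv")))
    if not drops:
        return None
    stable = os.path.join(inbox, f"{prefix}.csv")
    shutil.copy(drops[-1], stable)  # ISO dates sort chronologically as plain strings
    return stable
```

Five lines on the receiving end, versus asking every sender to drop the timestamp from their feed.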
