|
Could a compression algorithm like this theoretically be used to help process larger files quicker? I currently deal with stupidly big data sets clocking in at over a gig for a csv and I can almost hear the excel reports that process this screaming at me as they slowly chug through the calculations.
|
# ¿ May 4, 2016 19:38 |
|
enki42 posted:
"Depends on the bottleneck. If it's transferring the files over a network or something, maybe (this wouldn't be the case with CSV and excel reports). But you'd need to decompress it, at least in memory, to do anything useful with it."

Oh yeah, don't get me wrong, I fully understand Excel isn't the best tool for this (we typically use it as a test case before running it through our server). I'm just not familiar in the slightest with how processors handle/work through the data at a memory level, and wasn't sure if there would be some nifty compression-based McGuffin that would mean Pied Piper's application had the potential for much grander uses beyond data transfer and storage.
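For what it's worth, the "decompress it, at least in memory" point doesn't have to mean decompressing the whole file at once. A sketch in Python, assuming a gzip-compressed CSV and a hypothetical numeric column name: the file is decompressed lazily as rows are streamed, so memory use stays roughly constant no matter how big the file is, and you only pay disk/network transfer for the compressed size.

```python
import csv
import gzip

def sum_column(path, column):
    """Sum a numeric column from a gzip-compressed CSV, row by row.

    gzip.open in text mode ("rt") decompresses on the fly as lines are
    read, so the full decompressed file never sits in memory at once.
    """
    with gzip.open(path, mode="rt", newline="") as f:
        reader = csv.DictReader(f)
        return sum(float(row[column]) for row in reader)
```

This only helps when I/O is the bottleneck; if the slowdown is in the calculations themselves (as with heavy Excel formulas), compression won't speed anything up.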
|
# ¿ May 4, 2016 23:33 |