|
ConfusedUs posted:My experience, both personal and amongst friends/internet buddies, is that Crashplan really chokes speed- and stability-wise once you get a lot of data, more than about 2TB or so. There used to be a way to modify some registry keys to allocate more memory to Crashplan, which would usually help. Last I looked (~9 months ago) it was on their website.

Looks like these days it's not in the config file anymore; you have to open a CLI to change the setting. But yeah, the problem is that the Crashplan client by default only permits itself to use up to a certain amount of RAM, and the bigger your backup set, the more RAM it uses. I have a 4TB backup set, and the Crashplan service is currently using just under 2GB of RAM. Code42 appears to recommend setting the maximum to 1GB per 1TB.
|
# ¿ Mar 9, 2015 16:24 |
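|
For what it's worth, a sketch of the CLI route for raising the memory cap. The hotkey and command syntax here are from memory of Code42's support docs and may have changed between versions, so treat them as assumptions and check the current documentation before running anything:

```text
# Open the CrashPlan desktop app, then double-click the CrashPlan
# logo (Ctrl+Shift+C in some versions) to reveal the hidden CLI area.
# Raise the max Java heap -- here 4096 MB for a ~4TB backup set,
# following the rough 1GB-of-heap-per-1TB-backed-up guideline --
# and restart the service so it takes effect:
java mx 4096, restart
```

Scale the number to your own backup set; too low and the service grinds or crashes on big sets, too high and you're just starving the rest of the machine.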
|
ConfusedUs posted:Hey all, it's World Backup Day! Who's offering deals and discounts? I'm not seeing any so far.
|
# ¿ Mar 31, 2015 17:34 |
|
Eargesplitten posted:This Black Friday I want to get an SSD and something approaching a backup system. For backups, a full-blown NAS is out of the current budget (maybe I'll ask larches about Buffalo's offerings). I'm thinking of getting a ~2TB external for now. Is there a way to keep it hooked up physically, but only have it recognized when I actually want it to back up? I want to automate backups, but not have it connected 24/7 in case of CryptoWall. I could write a batch or PowerShell script for it if I need to, I just want to make sure the idea is feasible.

What about making the backup destination read-only to your normal user account, and configuring a special user account for the backup job to run as?
|
# ¿ Dec 2, 2015 22:24 |
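|
Following up on the script idea: a minimal PowerShell sketch of the "offline until backup time" approach. The disk number and paths are placeholders, and the whole thing is an illustration rather than a tested script; it needs an elevated prompt, and you should confirm the right disk number with `Get-Disk` first:

```powershell
# Hypothetical backup script: bring the external disk online only for
# the duration of the backup, then take it offline again so nothing
# running as the logged-in user can touch it between runs.

$diskNumber = 2                          # assumption: verify with Get-Disk
$source     = 'C:\Users\Me\Documents'    # placeholder paths
$dest       = 'E:\Backups\Documents'

Set-Disk -Number $diskNumber -IsOffline $false   # attach the volume

# Mirror source to the backup drive. Note /MIR also deletes files on
# the destination that were removed from the source.
robocopy $source $dest /MIR /R:2 /W:5

Set-Disk -Number $diskNumber -IsOffline $true    # detach again
```

Run it from Task Scheduler and the drive stays invisible between backups. It doesn't protect against malware that strikes while a backup is actually running, which is where the separate backup-only user account suggestion above helps.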