This is a thread where we discuss backups. There are tools and services available for every operating system, for every need, at every price range. Feel free to ask questions! It doesn't matter if you want to back up your phone pictures or your production database. It all needs a backup. I may eventually include a list of recommended backup applications here, but there are so many that it'd probably be easier to just post your requirements and ask what fits your needs best.

Why do backups?

Everyone should have a backup in place because failure is inevitable. You will face the loss of data eventually. Maybe a hard drive fails, your building burns down, or your laptop takes a nose dive into a speculum bucket. Without a backup, you're in trouble.

You should back up any unique data you have, as well as any data that would take too long to replace through some other method if the original were to go missing. "Too long" is entirely subjective based on your needs. This means your documents, pictures, and anything of the sort with user-generated content should be backed up. Your movies that took weeks to rip from disc should be backed up. Your business-critical Excel spreadsheet with its Access backend (oh god) should be backed up. Your email server should be backed up. And so on. But you don't necessarily need to back up something you can download and install in 30 seconds, like Skype. When in doubt, back it up.

Are there any good practices to follow?

Of course! Other than just having a backup in the first place, there are a few very basic considerations. First and foremost: if you have only one copy of something, it is not backed up. If you have a single point of failure, your data is not backed up. Your RAID array is still a single point of failure. A CD in a vault is still a single point of failure. Your stuff is not backed up unless at least two copies exist separately.

RAID is not backup. RAID is not backup. RAID is not backup.
Beyond that, I feel there are three things every backup plan should consider:

1. Have an on-site and an off-site backup. Also known as a "hybrid" backup solution, this means you have a backup that is easily accessible and another that is not. This allows for speedy restores for anything short of disaster, and any restore at all if, say, your building burns down. For home or small business users, the off-site backup will often be in the cloud. For people/businesses with large data and/or slow connections, off-site backup often means physically taking some form of media to another location.

2. Keep more than one version. Every backup solution will handle this a bit differently, but the core idea is that you want to keep multiple versions of everything going back some period of time. This allows you to call forth a past version of any/all data to recover from an issue that affects the latest version. This is especially important these days, thanks to Cryptolocker and other ransomware. A backup solution that doesn't let you pick a restore point prior to infection is virtually useless when faced with Cryptolocker.

3. Test your restores. A backup that you cannot restore is no backup at all. You should, at the very least, know how to do the restore, so you're able to actually do it when the time comes. If this is business-critical data, you should be testing your restores on some regular basis.

Now, go forth and back that data up!

ConfusedUs fucked around with this message at 02:07 on Mar 5, 2015 |
|
# ¿ Mar 5, 2015 01:58 |
To get the ball rolling, I figure I'll post my home backup scheme! My primary storage for my home is a Synology NAS. All of my laptops/desktops (I have four in my home) back up to the NAS daily. The Macs use Time Machine for this. The NAS itself is backed up to an external hard drive every day and to the cloud once a week. This gives me three layers of redundancy (NAS + EHD + cloud) for all my regular systems, and the NAS itself has two layers (EHD + cloud).
|
|
# ¿ Mar 5, 2015 02:06 |
fattredd posted:What's the most cost-effective way of backing up my junk? The most "valuable" information I have is about a terabyte of movies/shows. How many copies should I have? Is it best stored on an external drive, or in "the cloud"? Can you replace this stuff easily? Like would it be a pain in the rear end to re-rip everything? I assume it would. I'm also assuming you got this stuff legally and can't just re-torrent it in an afternoon. If you can't easily replace it, you should follow the best practices up there: get yourself an on-site and an off-site backup with versioning. It could be rotating a couple of hard drives, or cloud-based. Cloud is fine if you can upload a TB in a reasonable time frame, since your movies and music won't be changing a lot. I'd look into something like Crashplan. A NAS could be really nice if you wanted to stream the content to multiple devices simultaneously, but I wouldn't get one JUST for backup.
|
|
# ¿ Mar 5, 2015 16:59 |
AgentCow007 posted:Wow, great timing! Crashplan can save local backups, I believe. I'd just use that. Crashplan > NAS & Cloud Edit: Unless you're looking for a complete "image" backup? If so, Acronis is actually one of the best. Carbonite has a "Mirror Image" option also, but it has some pretty steep limitations.
|
|
# ¿ Mar 5, 2015 17:00 |
The Gunslinger posted:I really dislike Crashplan; numerous times I've encountered issues between clients due to version differences from fubar'd upgrades. Becomes an absolute nightmare to troubleshoot, often requiring multiple reinstalls to sync everything back up. This is a problem with auto-update on their end and it continues to surface every now and then; I've seen it happen to a number of clients over the years. They don't seem to bother with official packages for various NAS distros either, and the user ones are often broken by changes. The backup inheritance is also confusing for users and I've had a few clients accidentally kill their archive set because of a stupid pop up related to it, often precipitated by a client version mismatch and some other nonsense. All of the consumer-level backup applications are more or less the same. Each has its own quirks, limitations, and drawbacks. They all have mostly the same result, and even act mostly the same on the backend. This includes Carbonite, Crashplan, and Backblaze. So try one of the others and see what you think. Crashplan tends to freak out if your data size is large (over 2TB you're more or less guaranteed to have issues). Carbonite hates large file counts (several million) but size doesn't matter much. I'm not as familiar with Backblaze. They all cost about the same and all have free trials, so give them a shot and see which works for you. Crashplan is my favorite of the three, but YMMV. Carbonite's support is way better than Crashplan's, but their client lacks some of the features. This is assuming you mean Windows; it's really hard to beat Time Machine for Macs. Just do a TM backup and upload that stuff to Glacier or something if you have a Mac.
|
|
# ¿ Mar 5, 2015 17:33 |
Peever posted:Anyone have experience with Amazon Glacier? I currently use Crash Plan as an off site backup but was thinking about adding Glacier since it is so dirt cheap. Glacier is just cloud storage. You just upload your stuff to it however you want. The big gotcha is in how long it takes to get stuff back. You can upload any time, but to download you have to request the data and there's a delay of a few hours before it's eligible for download. It's cheap, but it's not unlimited for one price, like Crashplan or Carbonite.
|
|
# ¿ Mar 5, 2015 18:38 |
jammyozzy posted:Could you elaborate a little more on what issues are likely? I don't have 2TB of stuff at the moment but my long term plan is to build a NAS and chuck my DVD collection on there amongst other things. Especially if I sign up for the family plan and get my sister involved we'll easily surpass 2TB without much effort. My experience, both personal and amongst friends/internet buddies, is that Crashplan really chokes speed- and stability-wise once you get a lot of data, more than about 2TB or so. There used to be a way to modify some registry keys to allocate more memory to Crashplan, which would usually help. Last I looked (~9 months ago) it was on their website.
|
|
# ¿ Mar 6, 2015 00:36 |
Yeah, the client back-end is pretty solid. No problems that I'm aware of.
|
|
# ¿ Mar 6, 2015 14:38 |
I hate tapes. I recognize that they have their uses, but I still hate tapes.
|
|
# ¿ Mar 6, 2015 23:51 |
Lots of home backup chat in here, which I expected, but we're open to business backup chat too! I'm pretty knowledgeable about Windows Server backups, so if anyone wants to know anything, such as why VSS flips out if you back up the same MSSQL database with two different applications, just let me know.
|
|
# ¿ Mar 7, 2015 03:14 |
Not that I know of. You could roll your own thing if your systems are on the same network. Robocopy or rsync could do it for you, if you can script.
|
|
# ¿ Mar 8, 2015 03:38 |
alkanphel posted:If your NAS is pretty large, for example 10TB, how would you back that up to external HDs?

You have a few choices:

1) Use a backup program that allows you to create customized backup sets, and use multiple EHDs.
2) Roll your own backup script (robocopy or rsync, as appropriate), and use multiple EHDs.
3) Invest in a second large array to use as a backup target.
4) Invest in a tape drive.

I prefer #1. Rolling your own backup is perfectly fine, but I dislike maintaining it. #3 is expensive. I hate tapes.
|
|
# ¿ Mar 8, 2015 06:04 |
Sheep posted:Good timing on this. We've got like forty laptops at work that aren't part of the domain and don't have any sort of backup software on them, so it's just a matter of time before one dies and some user loses all their poo poo because they ignore my warnings to always work on the server via RDP (which IS backed up). Good luck. If you were on a domain you could force a solution through software deployments and GPOs, but since you're not... You could invest in something like a Crashplan or Carbonite sub for each computer, and hope the users don't disable it. Someone probably will. Or you could make it official policy to work on the server via RDP (or create some network shares, or whatever) and just say "tough poo poo" when things go badly. When, not if. If it goes badly enough, you can leverage that to get your systems on a goddamned domain so you can prevent the issue.
|
|
# ¿ Mar 8, 2015 06:07 |
SlayVus posted:What is the best way to back up when you have slow upload speeds? My cable provider sucks for their upload speeds, 50 Mbps down with 4 Mbps up. I really don't have anything sensitive that can't be stored on a flash drive though. Really, I don't have anything sensitive at all. Store all my passwords with last pass, don't have anything like wills or such. All my data is games or TV or movies. 4Mbps isn't terrible. If you have a couple TB, yeah, maybe it'll take a few weeks. But you should be good after that. (Edit: Crashplan offers a seeding option in the US where you ship them your data on an EHD) If you really can't do cloud, you can always rotate some externals and physically take one off-site on a regular basis. ConfusedUs fucked around with this message at 07:31 on Mar 8, 2015 |
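To put a number on that "few weeks": here's the back-of-the-envelope math for seeding a couple of TB over a 4 Mbps uplink, assuming an ideal link saturated around the clock (no protocol overhead, no throttling):

```shell
#!/bin/sh
# Ideal-case initial-upload time for 2 TB at 4 Mbps upstream.
awk -v tb=2 -v mbps=4 'BEGIN {
    bits = tb * 8e12               # 2 TB expressed in bits
    seconds = bits / (mbps * 1e6)  # at 4,000,000 bits per second
    printf "%.0f days\n", seconds / 86400
}'
# prints: 46 days
```

So "a few weeks" is on the optimistic side: about six and a half weeks even in the ideal case, which is exactly why the ship-them-a-drive seeding option is attractive. Once the seed is done, daily incrementals over 4 Mbps are usually painless.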
|
# ¿ Mar 8, 2015 07:16 |
Farmer Crack-rear end posted:Looks like these days it's not in the config file any more, you have to open a CLI to change the setting. That's a truly ridiculous amount of memory, in my opinion.
|
|
# ¿ Mar 9, 2015 17:05 |
Hey all, it's World Backup Day! http://www.worldbackupday.com/en/ Many backup services are offering deals and discounts for their products today, if you've been waiting for something like that.
|
|
# ¿ Mar 31, 2015 15:49 |
Farmer Crack-rear end posted:Who's offering deals and discounts? I'm not seeing any so far. Backblaze is for sure, for new subs. Carbonite is, but via email campaign I think.
|
|
# ¿ Mar 31, 2015 17:44 |
GuyGizmo posted:I'm glad this thread is here, since I have a Windows related backup question, and I'm hoping someone here can help me out. This is the $64,000 question for Windows users. Frankly, Time Machine is so goddamned good! Everyone who uses it wonders why the hell Windows doesn't have something like that. The answers are quite complicated, but it really comes down to basic platform design and the massive number of possible hardware configurations. Anyway, you can come close in a couple of different ways. Imaging/cloning backups like Acronis True Image and Carbon Copy Cloner create an image of the entire drive(s) involved. You can restore those, sometimes over top of your existing Windows install, sometimes to a bare system, and almost always to a second drive. Depending on the program, you can sometimes restore to different hardware too. System State backups are something that you may run across. When combined with a backup of your entire file system, you can get everything back. All your applications, all your files, everything. Windows Server Backup, NTBackup, and a bazillion third-party apps will back up system state. Big failings here are that you need to have Windows already installed, then you restore this over it. Identical hardware is best, similar usually works, but big differences cause issues, because system state includes device drivers.
|
|
# ¿ Mar 31, 2015 19:48 |
Sheep posted:FWIW I can't find Backblaze's discounts (if they exist?) and it doesn't look like Crashplan is doing anything. I had a link earlier, lost it, and now I can only find this http://www.appsumo.com/backblaze-wo...%3D%3D&bdp=1581 YMMV ?
|
|
# ¿ Mar 31, 2015 21:16 |
I put this in the OP, but man, file-sync options like Dropbox and OneDrive aren't really backups. They're great for content access but lack crucial functionality you need in a true backup solution. You should look into a NAS for "everywhere" access and back that up using some service of your choosing.
|
|
# ¿ Apr 1, 2015 03:08 |
stevewm posted:Didn't notice there was a Backup thread, so I had originally posted this in the Ticket Came in thread.... The "general consensus" you speak of is a very old school of thought. Sure, 12 years ago, when VSS was new, third-party backup stuff sucked. Today? Nah. There's a dozen applications that can do this for you. Carbonite Server Backup can do your MSSQL databases (and all sorts of other things), and is the one I'm most familiar with. It supports Differential (delta) and Incremental (log-based, for FULL or BULK databases) backups, uploads automatically, has throttling you can use to limit it during business hours, allows you to schedule at what time you want, etc. It also has compression. Databases typically compress very well--over 80% isn't uncommon, and over 90% is possible. I wouldn't be at all surprised if your full backups compressed to 10GB or less. Almost certainly under 15GB. You'll still need to perform a full backup on some regular basis (weekly, monthly, etc) but that's true for any backup process. The only real gotcha here is that MSSQL itself has a limitation where it doesn't keep track of what is backing it up; only that a backup occurs. If you throw multiple applications at the same database, it'll break your inc/diff backups for one or both until that application performs its next full backup. This is true for any application, but some people don't realize this, and then get pissed when they try to run both during a trial phase.
|
|
# ¿ Apr 2, 2015 17:48 |
stevewm posted:I kinda figured that, but really couldn't find much to support it. I am not a DBA by any means. Just the lowly computer janitor responsible for making sure it gets backed up. CSB will do local backups also. It can replace the entire system. Or you can just use it to back up and compress your .bak files if you want.
|
|
# ¿ Apr 2, 2015 19:37 |
I'd try the extra space.
|
|
# ¿ Apr 4, 2015 20:15 |
alanthecat posted:I've MS SQL Server running on 2008r2 and I'm just using Windows Server Backup. Is this ok? It's better than nothing, but I'd look into some way to get more granular backups. It's often nice to be able to restore just one database to a specific point in time.
|
|
# ¿ Apr 13, 2015 18:05 |
havenwaters posted:I'm curious what do you use to actually backup the windows computers to your NAS. Do you just use Windows 7 Backup/Windows 8 File History or do you use something else. I have some 300 GB of documents and photos from work that would suck to lose and I should probably do more than just copying them over to an external hard drive once in a while. I'm trying to keep this thread company/product agnostic, focusing on best practices, so I didn't name it. And it would be absolute overkill for the average home user, as it's the server-level product I work on. Anything that backs up to a disk would work in this scenario. Windows backup would be fine. Crashplan would work. Acronis or other imaging software would work. You could do it with a robocopy batch script, even. Seriously, whatever.
|
|
# ¿ Apr 22, 2015 22:08 |
wyoak posted:Is it possible to restore transaction logs on top of a SQL backup made with VSS? As far as I can tell, VSS backups (or maybe just these VSS backups) are only the DB and log files, so I can mount the databases from the restore, but can't restore anything on top of them since they won't be in norecovery You would probably need to do some command-line work, but I don't know exactly what commands. I'm pretty sure you can apply logs to a database though, if you're in a consistent state without a gap. Most backup applications worth a drat will just do it for you. Pick your backup to restore, select a time, everything's there as of that time. Edit: https://msdn.microsoft.com/en-us/library/ms177446.aspx?f=255&MSPPError=-2147217396 Looks like restore everything, one at a time, in order, with special flags like NORECOVERY.
|
|
# ¿ Apr 22, 2015 22:41 |
teagone posted:Just to contribute, I have a Plex Media Server I built running Windows 8.1 with a little over 7TB of storage that's currently using DrivePool to organize/protect my media. I was looking for a super simple backup solution I could use with my existing setup after I had 2 hard drives crash, and so far the DrivePool app has been pretty solid. It was super easy to set up (for someone like myself who had never looked into backup solutions or even knew what drive pooling was) and it only cost me $20 too. DrivePool is pretty sweet, but I wouldn't call it a backup solution. It's sort of a weird mix of software RAID and Storage Spaces. And RAID is not backup! Had a customer case a year ago where whatever funky drive-spanning thing that DrivePool does was playing hell with one of our products. Caused a lot of files to be backed up twice.
|
|
# ¿ Apr 22, 2015 22:50 |
teagone posted:Hmm, what would you suggest be a good alternative in my case? Cloud backups? I do plan on sticking with DrivePool on my Plex Server for the foreseeable future, but another added layer of protection wouldn't hurt Local external drives, a NAS, cloud backups, or some combination. You want a separate, independent copy of the data.
|
|
# ¿ Apr 22, 2015 23:04 |
So you think it's a good solution when it doesn't cover two of the top three reasons for data loss? It's good software but come on.
|
|
# ¿ Apr 22, 2015 23:47 |
redeyes posted:I think its a great solution for home users. Top reason for data loss is hard drive crashing, at least around the computers I get to mess with. Yeah, maybe that's the top reason, but it's certainly not the only reason. Fire, theft, and power-related failures are the other primary reasons. If only one copy of your data exists, it's not backed up. RAID is not a backup solution for this very reason. DrivePool is no different. All it does is add some redundancy to address one specific potential cause of data loss. Backup is a catch-all for any problem.
|
|
# ¿ Apr 23, 2015 00:47 |
Shaocaholica posted:Looking for a recommendation for a backup service, preferably unlimited. I have many many computas but mostly work with 3. I like backblaze but I'm not sure if its worth the cost to scale to ~3 machines. I mostly have photos and video to backup which make up the biggest share. I don't care about full OS/App backups. The rest of my files are small project files. In all right now less than 2T although that could explode to 4T in the next 18 months. Thing is each of my machines has a different purpose so my main photo machine would have all the photos, my laptop would have misc small files and my 2nd desktop might have projects. Get a NAS. Back up all of the machines to the NAS. Get one subscription to one provider of your choice, and back up the NAS to it. This is what I do. People back up their ripped movies because it's often easier to just re-download them from your backup provider than it is to re-rip. Assuming you didn't just sell the disc when you were done.
|
|
# ¿ Apr 23, 2015 22:20 |
Shaocaholica posted:So...is there any way to have a backup service run on my NAS as opposed to running on a PC with the NAS mounted? Synology NAS devices have both Crashplan and Amazon Glacier packages available.
|
|
# ¿ Apr 23, 2015 23:10 |
alkanphel posted:I assume you partition the NAS for each machine to backup to? Would you still need to backup the NAS physically, besides to the cloud? I just have a separate folder for each machine on mine. So my wife's MacBook Air goes to one folder, my MacBook Pro to another, and my son's PC to a third folder. I do back up my NAS locally, to an external HDD, because there's a lot of stuff that only exists on it. All my music, pictures, and other media are on the thing. It's not just a backup drive for me. I do exclude the backup folders from that backup, though. No point in backing up the backups. So my setup looks like this:

Endpoints > folders on NAS
NAS (excluding backup folders) > external HDD
NAS (everything) > Cloud
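That "no point in backing up the backups" exclusion is usually a one-flag affair. As a sketch, with rsync and throwaway /tmp demo paths standing in for the NAS and the external drive (real NAS backup apps expose the same idea as folder-selection checkboxes):

```shell
#!/bin/sh
# Demo: copy the NAS contents to the external drive, skipping the
# folders that hold the endpoints' backups (those files already exist
# on the endpoints themselves). All paths are made-up demo paths.
NAS="/tmp/nas-demo"
EHD="/tmp/ehd-demo"
mkdir -p "$NAS/music" "$NAS/backups/laptop" "$EHD"
echo "song" > "$NAS/music/track.flac"
echo "laptop image" > "$NAS/backups/laptop/img.bin"

# --exclude keeps the endpoint-backup folders out of the copy
rsync -a --exclude="backups/" "$NAS/" "$EHD/"
```

After the run, the media is on the external drive but the backups/ tree is not, which keeps the EHD sized for the unique data only.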
|
|
# ¿ Apr 24, 2015 03:17 |
Shaocaholica posted:How are you personally doing this step? My NAS is a Synology and I've used both the Crashplan and Amazon Glacier packages. Both work fine. Currently I use a server-level product that I work on--it's free for me, so why not? But totally overkill for normal use.
|
|
# ¿ Apr 24, 2015 04:52 |
Shaocaholica posted:So what did people do before online backup services became mainstream? Backup to external and pray the house doesn't burn down? Keep an external offsite and rotate every so often? Backup to optical media? Tapes. I wouldn't trust a used mechanical HDD. Nor do most people. There's not much of a market there. It's one of those things that you're better off just buying new.
|
|
# ¿ Apr 27, 2015 02:47 |
Shaocaholica posted:I meant home enthusiasts. Or did that crowd use tapes? Oh, home users, if they backed up at all (most didn't) it was usually with external hdds, or CDs, or DVDs. The cloud age has really brought backup to the home market.
|
|
# ¿ Apr 27, 2015 03:00 |
brylcreem posted:I (re)read the OP, but there's nothing about those options in there? Wow, you're right. I know I wrote up something, but I must have somehow left it out. I'll add it back to the OP soon, but the biggest part of it is that most (all?) of these services don't have mass restore options to previous versions. The line between file sync and backup is getting increasingly blurry, but that one thing is a big differentiator. Sync services are great if you need to pull down a few files or the latest version of everything, but you're screwed if you get something that trashes a large segment of your stuff or, worse, something like Cryptolocker that requires a mass restore to a previous version of mostly everything.
|
|
# ¿ Apr 27, 2015 16:17 |
Flipperwaldt posted:Saw someone link Veeam Endpoint Backup in another thread here in SH/SC and it looks pretty good. I've not (yet) had the opportunity to use their Endpoint backup, but their VM backups are amazing.
|
|
# ¿ Apr 27, 2015 19:30 |
DrBouvenstein posted:Oddly enough, I just came into here to complain about Veeam! That's actually a really common problem on Windows systems. Blame MS for imposing a maximum path length way back in the day, and preserving it for backwards compatibility reasons to this day. This isn't an issue unique to Veeam. This is a pretty complex topic once you start digging into the technical stuff. I think the max path length for NTFS is actually like 32k characters, and there are ways for applications to allow lengths like that through various APIs. But what happens if Program-A puts a long path object somewhere, and then Program-B doesn't know how to deal with it? What if Program-C uses a completely different method and doesn't understand Program-A's stuff? So in short, there are ways around the max path length, but they kinda get flaky when you start interacting with other functions/programs that don't use those workarounds or use different ones entirely. Many applications just honor the max path length to avoid the headache. I don't really have a lot of advice here except, maybe, reconsider your folder structures if they're long enough for this to cause you problems on a regular basis. You might also be able to shorten paths by mapping a deep folder to a virtual drive letter with the 'subst' command, like mapping C:\Some\Crazy\long\path to X:\ ConfusedUs fucked around with this message at 21:04 on Apr 28, 2015 |
|
# ¿ Apr 28, 2015 21:01 |
Acer Pilot posted:I'm thinking about getting a Dropbox Pro account. Any negatives aside from the 30 day only file history? Are there ways to prevent accidentally pushing corrupted files? It'll push any changes. It can't differentiate changes of one type from another. Make sure you can restore folders or groups of files to a past version. Last time I tried, you could only restore to the latest; everything else was one at a time. Fine if you deleted a spreadsheet. Sucks if you get Crypto viruses. This is why Dropbox isn't a true backup solution.
|
|
# ¿ Apr 29, 2015 04:29 |