|
I just upgraded to a new laptop, but haven't bothered to transfer any files over because I don't need everything on my old laptop. My old laptop, though, is a pain in the rear end to use because it's very slow. Using my new laptop, is there a way to have my old laptop work sort of like a cloud server or accessible storage device over wifi, so anytime I need an old file I can just open it up from my new laptop and grab it? What's the easiest way to do this?
|
# ? Aug 1, 2015 05:20 |
|
|
# ? May 4, 2024 13:39 |
Melian Dialogue posted:Is there a way to have my old laptop just work sort of like a cloud server or accessible storage device over wifi so anytime I need to grab an old file I can just open it up in my new laptop and grab it? Uh, turn on something like Windows File Sharing?
|
|
# ? Aug 1, 2015 07:43 |
|
ConfusedUs posted:Uh, turn on something like Windows File Sharing? Would that work across different networks? I.e., could I be on a separate wifi network and access it like I would Dropbox or Google Drive?
|
# ? Aug 3, 2015 00:24 |
|
Melian Dialogue posted:Would that work across different networks? I.e., could I be on a separate wifi network and access it like I would Dropbox or Google Drive? \\ipaddress\share. IIRC the default firewall settings for "Home" networks only allow Windows file sharing on the same network; you'll need to sit and edit firewall settings, or change it to a "Work" network, which has its own restrictions that I don't remember.
|
# ? Aug 3, 2015 05:43 |
|
Much like that last dude, I just got a new laptop. It's a desktop replacement, and will be my everything machine. I'm going to be on the road with a wireless hotspot, so I'll have limited bandwidth and NAS isn't really an option. Any advice on what options I have for backup? I'm about to upgrade to Win 10 as soon as my new SSD shows up, if that makes a difference. (Came with 8.1.)
|
# ? Aug 5, 2015 01:03 |
|
surc posted:Much like that last dude, I just got a new laptop. It's a desktop replacement, and will be my everything machine. I'm going to be on the road with a wireless hotspot, so I'll have limited bandwidth and NAS isn't really an option. I mean a locally attached external drive is about the only thing I can think of. Veeam endpoint can be configured to run the backup job when a specific disk is attached. Should be fairly seamless.
|
# ? Aug 5, 2015 01:55 |
surc posted:Much like that last dude, I just got a new laptop. It's a desktop replacement, and will be my everything machine. I'm going to be on the road with a wireless hotspot, so I'll have limited bandwidth and NAS isn't really an option. The last guy wanted some kind of file-sharing thing, which isn't really the point of backup. Anyway, you're looking at doing some kind of backup when online backup isn't possible. In those cases, you're looking at something that will back up to an external HDD (or two), either on demand or automatically. The previous suggestion of Veeam Endpoint isn't a bad one for this use case. You could also run some kind of script (robocopy!) that will move files to the EHD on demand.
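For the on-demand copy, robocopy /MIR already does the mirroring natively on Windows. If you'd rather see the logic spelled out, here's a rough cross-platform Python sketch of the same idea (the paths in the comment are placeholders, and this skips robocopy's retry and logging niceties):

```python
import filecmp
import shutil
from pathlib import Path

def mirror(src: Path, dst: Path) -> None:
    """One-way mirror: copy new/changed files from src to dst, and
    delete anything in dst that no longer exists in src
    (roughly what robocopy /MIR does)."""
    dst.mkdir(parents=True, exist_ok=True)
    src_names = {p.name for p in src.iterdir()}
    # Remove entries that vanished from the source
    for item in dst.iterdir():
        if item.name not in src_names:
            shutil.rmtree(item) if item.is_dir() else item.unlink()
    # Copy new or changed entries
    for item in src.iterdir():
        target = dst / item.name
        if item.is_dir():
            mirror(item, target)
        elif not target.exists() or not filecmp.cmp(item, target, shallow=False):
            shutil.copy2(item, target)

# Example (placeholder paths):
# mirror(Path("C:/Users/me/Documents"), Path("E:/backup/Documents"))
```

Stick that behind a desktop shortcut and "on demand" becomes one double-click when the drive is plugged in.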
|
|
# ? Aug 5, 2015 02:43 |
|
So, apologies if this is the wrong thread. I'm trying out Comodo Backup and also trying to learn some useful stuff like batch files. I have a virtual machine running on this computer that I'd like to back up after Comodo is finished backing up other things like my documents. If I used Comodo's "run task after backup" setting to run the following as a .bat, would it work? Is there a better way to back up the virtual machine? code:
|
# ? Aug 13, 2015 04:40 |
Sorry I don't know either program well enough to comment.
|
|
# ? Aug 13, 2015 14:32 |
|
I've been given a somewhat odd backup task that I would love to get some suggestions on. One of my colleagues wants to periodically mirror roughly 30TB of data from one of our institutional network shares onto a pair of Drobos he has in his office. Both Drobos are configured with eight 4TB drives, giving each device ~22TB usable for storage, exposed as a pair of NTFS-formatted "16" TB volumes, for a total of 4 volumes with ~11TB (usable) each.

Since the total size of the source data to be mirrored exceeds the size of any single Drobo volume, my colleague's objective is to have a tool that will automatically distribute the source data across the four target volumes. (Of course, since optimal bin packing is NP-hard, he's not expecting a perfectly even distribution; he just doesn't want to have to manually partition the data.) The core of this objective could be satisfied by a fairly simple script backed by du and rsync. However, the data on the network share changes at a rate of ~500GB/week, so it's unlikely that the "ideal" distribution calculated during the first sync will remain ideal. The script would also have to be capable of rebalancing, which adds a layer of complexity that, while not intractable, I'd rather avoid.

So my question is: does there already exist software that addresses this need? We're open to solutions in the form of anything from an already-existing script doing the above, to a full-fledged backup system, to a method for simply exposing the four Drobo volumes as a single large volume. Free would be ideal, but we'd rather pay for a good solution that "just works" than spend lots of time on even a great solution. A couple of other notes that may be relevant:
I appreciate any suggestions anyone can provide!
|
# ? Aug 20, 2015 04:07 |
Break it down into chunks that fit your available storage, along some logical lines. Like maybe folders a-f go to one drive, g-m to another. And so on. Leave a bit of space for growth in each segment.
|
|
# ? Aug 20, 2015 06:07 |
|
ConfusedUs posted:Break it down into chunks that fit your available storage, along some logical lines. You could have it check the size of each folder in the root, sort them by size, then alternate them between each pool. That'd guarantee a more or less even split as long as the folder sizes aren't hilariously lopsided. And even then, you can split them into sub-chunks.
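That size-then-alternate approach is essentially greedy bin packing: biggest folders first, each one to the currently least-full volume. A minimal Python sketch of the idea (the folder names and sizes here are made up for illustration):

```python
def distribute(folders, n_pools):
    """Greedy size-balanced assignment: sort folders largest-first,
    then place each one on the currently least-full pool."""
    pools = [{"total": 0, "folders": []} for _ in range(n_pools)]
    for name, size in sorted(folders.items(), key=lambda kv: -kv[1]):
        target = min(pools, key=lambda p: p["total"])
        target["folders"].append(name)
        target["total"] += size
    return pools

# Hypothetical root-folder sizes in GB
folders = {"projects": 9000, "raw_video": 7500, "archive": 6000,
           "scans": 4000, "docs": 2500, "misc": 1000}
for i, pool in enumerate(distribute(folders, 4), 1):
    print(f"volume {i}: {pool['total']} GB -> {pool['folders']}")
```

The output of `du -s` per root folder would feed the sizes, and each pool's folder list becomes an rsync job targeting that volume. Rebalancing later is still the hard part, as noted.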
|
# ? Aug 20, 2015 14:37 |
Methylethylaldehyde posted:You could have it check the size of each folder in the root, sort them by size, then alternate them between each pool. That'd guarantee a more or less even split as long as the folder sizes aren't hilariously lopsided. And even then, you can split them into sub-chunks. That'd work too. My only concern is that it seems like it might be hard to find stuff for restore purposes if he's got lots of root folders. Most people look for things by location rather than by size. It'd be a pain if /folder_a/ and /folder_s/ were on drive 1, while b, c, f-k, and w are on drive 2. Folder Z is huge, and thus has drive 3 to itself. Everything else is on drive 4. I'd hate to find anything in a mess like that.
|
|
# ? Aug 20, 2015 16:53 |
|
Artine posted:I appreciate any suggestions anyone can provide!
|
# ? Aug 20, 2015 17:21 |
|
ConfusedUs posted:That'd work too. My only concern is that it seems like it might be hard to find stuff for restore purposes if he's got lots of root folders. A mountain of symlinks maintained by a million lines of shell script to make it all seamless.
|
# ? Aug 20, 2015 20:29 |
thebigcow posted:A mountain of symlinks maintained by a million lines of shell script to make it all seamless. Or...just split it up along some logical lines in the first place.
|
|
# ? Aug 20, 2015 21:04 |
|
Thanks for the suggestions, everyone. It sounds like there isn't going to be a clean and easy way for my colleague to get precisely what he wants using only the hardware resources already available to him. I think at this point that the path of least resistance will be to statically assign specific source directories to be mirrored to a specific destination volume, per ConfusedUs' recommendation. thebigcow posted:A mountain of symlinks maintained by a million lines of shell script to make it all seamless. Haha, yeah, I had that same thought for about 5 seconds before snapping back to my senses!
|
# ? Aug 21, 2015 00:49 |
|
Hey backup thread, I've been tasked with setting something up for our little office here, and I think I've got a decent plan, but I didn't know what I was doing before the few hours of research I put in, so please tear my idea apart - it can only help. I have four machines using CrashPlan's software to back up automatically to a dedicated hard drive (A) on my workstation. CrashPlan on my workstation will also be backing up to their online solution. Once every two weeks (or every month, or every week, we haven't decided yet) the owner will bring, from home, an identical hard drive (B) that I will mirror the hard drive (A) backup to, which the owner will take back home at the end of the day. If my thinking is right, this gives me a few points of failure before we can't recover, and even if everything fails we can always call them up and have them mail us a physical copy. Am I missing anything?
|
# ? Aug 26, 2015 17:04 |
|
Zigmidge posted:Hey backup thread, I've been tasked with setting something up for our little office here and I think I've got a decent plan but I didn't know what I was doing before the few hours of research I put in so please tear my idea apart - it can only help. Full disclosure upfront, I work for Code42. One thing that should be pointed out right off the bat is that you cannot back up the CrashPlan backups to the CrashPlan cloud. See here for more details. So the only thing getting sent to the cloud would be stuff from your workstation. It sounds like you're using a combo of our home and free products to enact this. Personally, I'd recommend looking at the CrashPlan PRO option (The blue box on the far right under "Business"). $50 a month for those 4 machines plus yours, you can back up to an external drive on-site as well as the cloud plus you get more control over the backups. Also, thank you thank you THANK YOU for trying to have multiple destinations. Ideally, you'd have one locally for "need it right the gently caress now" restores, plus one or more off-site for "holy poo poo the whole building went up in flames" kind of situation.
|
# ? Aug 26, 2015 23:49 |
Hi-5, backup industry buddy! My critique of his plan--outside of the technical limitations that I did not know about--is that the human part of the "swap drives" routine is always the first to fail. People are going to forget to swap drives, or leave the drat things in their car on a hot summer day, or something of the sort. If you go that route, you want it to be as automatic and painless as possible.

I'd suggest getting two drives to rotate in addition to the one you have permanently attached. Sync (via a script or something) to the rotating drives on a very frequent basis, then just have people swap them out on a regular schedule. That way they don't have to wait for it to do anything; it's always ready to go.

Dimestore Merlin posted:Also, thank you thank you THANK YOU for trying to have multiple destinations. Ideally, you'd have one locally for "need it right the gently caress now" restores, plus one or more off-site for "holy poo poo the whole building went up in flames" kind of situation. Local backups are for when you delete the wrong thing or you lose a hard drive. Off-site backups are for when your building burns down. Cloud backups have the benefit of working if your whole TOWN burns down.
|
|
# ? Aug 27, 2015 01:55 |
|
ConfusedUs posted:Hi-5, backup industry buddy! Oh yeah. Having what sounds like a single drive for the storage location sounds like a bad time. RAID may not be backup, but ideally you'd like your backup location to be a RAID. You get a bad sector with a single drive and you're SOL. More generally, for any backup situation, you need to consider your environment and your goals. What data are you backing up? Where is it? Will the solution you've come up with be able to get the data back in a reasonable amount of time? Seems like every day I'm reminding people that trying to cram a 20TB file server into a single backup archive is no bueno. Good luck doing a full restore in any reasonable amount of time if it's not local. You need to consider network, disk iops, permissions, all sorts of things. The good thing about taking care of that on your end is that if you've done your due diligence your users shouldn't really notice a thing.
|
# ? Aug 27, 2015 04:46 |
Dimestore Merlin posted:More generally, for any backup situation, you need to consider your environment and your goals. What data are you backing up? Where is it? Will the solution you've come up with be able to get the data back in a reasonable amount of time? Seems like every day I'm reminding people that trying to cram a 20TB file server into a single backup archive is no bueno. I feel your pain, buddy. I don't take direct support calls anymore, but when I did, I also saw this kind of thing every day. Why yes, sir, you backed up 15TB of data directly to the cloud. This is despite our defaults of going to disk also, which you deliberately changed against our advice in case #12345. No, we can't magically make your data appear on your system. You did not keep local backups. It is going to take you on the order of a month to download that at the absolute max speed your connection allows. Yes, I know your entire business is down. I cannot change the laws of physics. I don't miss those calls.
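The "on the order of a month" figure checks out with back-of-the-envelope math. Assuming a 50 Mbps sustained download speed (a made-up but plausible number; the case and customer above are the poster's, not mine):

```python
tb = 15                               # size of the cloud backup, in TB
mbps = 50                             # assumed sustained download speed
bits = tb * 8 * 10**12                # terabytes -> bits (decimal units)
seconds = bits / (mbps * 10**6)
print(f"{seconds / 86400:.1f} days")  # roughly 28 days
```

Halve the line speed and you double the restore window, which is exactly why a local copy for the big restores matters.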
|
|
# ? Aug 27, 2015 15:49 |
|
First, thank you so much for the insight; both of your input already helps a lot. We're a tiny company, like mom-and-pop-with-a-few-hands-in-the-warehouse tiny, and it seems to me like setting up RAID and paying for Pro is a bit of overkill in the overhead. There's about a hundred gigs of working files, raw and published media, and OS user configs. The owners wanted to avoid paying $30/month for just the three vital machines - $50 for Pro is a harder sell. If the higher-ups put the kibosh on that idea, is there an alternative I can look into? I have enough savvy to restore each machine myself from the raw files if everything falls apart. Could we use some kind of online large data storage, maybe? I know this plan sounds like a gigantic nightmare if poo poo hits the fan, but at least it'd be doable. Sounds like I should do a bit of cost-effectiveness analysis to see which would cost less.

About the human element: instead of building a whole new rig to handle cycling drives (our workstations have no expandable room), could I just extend the chain by dumping the contents of drive (B) onto a machine at home? The home machine would be out of date by up to two weeks if something happens in transit, but at least it's there, and barring a freak fire happening at work at the same time, we can set up a new drive (B) from the at-work backup. I think that makes sense? Downtime is not so huge a deal. Yeah, we need timeliness, but manufacturing is our deal and we can keep doing business even if it takes a day or two to restore the workstations back to normal. I hope I don't come across as ungrateful; I'm really appreciative of the help so far.

e: Actually, Dimestore, if you can, I'd love to talk more about this privately. Send me a line at zigmidge at gmail Zigmidge fucked around with this message at 20:44 on Aug 27, 2015 |
# ? Aug 27, 2015 17:38 |
|
I currently keep all my important files in a free Dropbox account; however, I have no external or physical off-site backups. My plan is to pick up two external drives and use Time Machine and Carbon Copy Cloner on both. One will stay connected to my MacBook, and the other I'll keep at work and bring in once a week to update. Does that seem like a reasonable backup plan, or is there something obvious I'm missing?
|
# ? Aug 31, 2015 13:59 |
It's a bit manual/old school, but it'll work if you keep up with it.
|
|
# ? Sep 5, 2015 02:32 |
|
ConfusedUs posted:Synology NAS devices have both Crashplan and Amazon Glacier packages available. Are you referring to patters' Crashplan package? If so, have you been having issues with it lately? It looks like Code42 updated something or other with the Crashplan client, and now people are having problems getting the client to work on their Synology NAS.
|
# ? Oct 5, 2015 22:18 |
I don't use it personally, so I dunno.
|
|
# ? Oct 5, 2015 22:23 |
|
Deacon of Delicious posted:Are you referring to patters' Crashplan package? If so, have you been having issues with it lately? It looks like Code42 updated something or other with the Crashplan client, and now people are having problems getting the client to work on their Synology NAS. It's a constant tug of war updating that thing all the time. Synology breaks it through some update, then you have to go to the retarded Oracle Java site and download the exact piece of crap you need for it, which also requires a stupid login. Then you have to hope and pray it works after you're all done, or you'll be forced to trawl blog comment sections for a fix. It is not worth it at all in my opinion; I ended up just using a different workaround so that Crashplan treated it like another drive.
|
# ? Oct 5, 2015 23:53 |
|
The Gunslinger posted:It's a constant tug of war updating that thing all the time. From what I've been reading, it looks like the easiest thing is to forget about Crashplan on a NAS. Just put it on a computer and treat the NAS like an external drive, like you said.
|
# ? Oct 6, 2015 00:28 |
|
So I HATE our Datto devices and have been shopping for a VM backup/cloud solution. I have a conference call with Intronis in the morning. Anyone have any experience with them, or have a better recommendation?
|
# ? Oct 6, 2015 03:58 |
Why do you hate Datto? What particular needs do you have that they don't meet?
|
|
# ? Oct 6, 2015 04:03 |
|
ConfusedUs posted:Why do you hate Datto? What particular needs do you have that they don't meet? The complete lack of control; part of it I blame on our sales people. They sold a device with more local storage than cloud storage, but the device tries to force everything to the cloud and overcharges us on storage. We can't just delete stuff from the cloud either; everything has to be a bloody ticket with their "white glove support". Also, they have messed up our seeding every single time, and we just found out they don't support Windows 2012 deduplication.
|
# ? Oct 6, 2015 05:19 |
socialsecurity posted:The complete lack of control; part of it I blame on our sales people. They sold a device with more local storage than cloud storage, but the device tries to force everything to the cloud and overcharges us on storage. I'm not too familiar with Datto from a consumer perspective. They're mostly a competitor (loosely), and one I have a very healthy respect for. They do a lot of things RIGHT. I'm not sure to whom I would recommend you if Datto doesn't fit the bill, especially from a technical perspective. As for the non-technical stuff, what I call the "soft" parts of the experience, like support and management and monitoring, I have next to no experience with how Datto does that. But from a technical perspective? They've got some good stuff.

Let's slip sideways a moment. I may be way off base, but your frustrated tone is something I frequently hear from my customers when they find something doesn't work the way they expect it to, or the way they were misled to believe it would (gently caress you, sales guys), or that they don't understand. The frustration isn't always because they've chosen the wrong product. Sometimes, sure, but often it's just a lack of understanding somewhere.

For example, take cloud storage issues. Common frustration among my customers. Lots of friction points. Maybe they don't realize how long it will take to upload. Or that we're not going to delete an old backup until a new one is finished. Or they're using more storage than they expected for any one of ten million reasons. Whatever; they're ticked because they're out of or low on online storage space. If you're feeling frustrated because it doesn't work how you expect it to, see if you can get some resources before you jump ship.
My recommendation would be to contact your account manager or your "white glove" support folks and express your frustrations. See if you can get them to explain how things work.
|
|
# ? Oct 6, 2015 06:08 |
|
Bit lower-level than most of the backup talk in here, but I need to give advice to a home business about backing up. The problem is the desire for daily local backups (to an external drive) of enormous (50GB) goddamn PST files that are kept open basically all day (and even a shadow copy might not be counted on, since Outlook might merrily send/receive halfway through a copy). Any suggestions besides forcing them to get into the habit of shutting Outlook down and manually running a backup script? There seems to be a variety of no-name commercial software out there that claims to handle it, but I have no idea about any of it, and it comes from sites I don't trust for recommendations.
|
# ? Oct 8, 2015 10:10 |
|
Most backup programs have an option to run a command before starting a job. You could just have it run taskkill /im outlook.exe before starting. At the simplest, a batch file that runs the taskkill and then a robocopy of the PST is a better backup process than about 99% of the home-business types I've dealt with have. To spice things up, you could use a simple VBS file with this to gracefully close Outlook first: code:
Edit: Found a better script that checks to see if Outlook is running first. JBark fucked around with this message at 05:51 on Oct 9, 2015 |
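In the same spirit as the taskkill-then-robocopy batch file, here's a hedged Python sketch: check whether Outlook is running, ask taskkill to close it, then copy the PST under a date-stamped name. The paths are placeholders, and unlike the VBS approach this doesn't do a graceful COM shutdown, so treat it as a starting point rather than the finished process:

```python
import shutil
import subprocess
import sys
from datetime import datetime
from pathlib import Path

def outlook_running() -> bool:
    """Windows-only check via tasklist; always False elsewhere."""
    if sys.platform != "win32":
        return False
    out = subprocess.run(["tasklist", "/FI", "IMAGENAME eq outlook.exe"],
                         capture_output=True, text=True).stdout
    return "outlook.exe" in out.lower()

def backup_pst(pst: Path, dest_dir: Path) -> Path:
    """Close Outlook if it holds the file open, then copy the PST
    to dest_dir under a date-stamped name."""
    if outlook_running():
        # Polite close; add /F to force, at the risk of a dirty PST
        subprocess.run(["taskkill", "/IM", "outlook.exe"], check=False)
    dest_dir.mkdir(parents=True, exist_ok=True)
    target = dest_dir / f"{pst.stem}-{datetime.now():%Y%m%d}{pst.suffix}"
    shutil.copy2(pst, target)
    return target

# Example (placeholder paths):
# backup_pst(Path("C:/Users/me/Documents/Outlook/mail.pst"),
#            Path("E:/pst-backups"))
```

The date-stamped names give you a few generations of the file for free; prune old copies once the external drive fills up.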
# ? Oct 9, 2015 05:49 |
|
I'm currently using CrashPlan, and while I haven't had (many) problems with it, I'm always looking for something better and/or cheaper. Is there any quality backup software that could back up to Google Cloud Storage with a similar feature set to the CrashPlan software? Notably, I'm not looking for a tape-equivalent system with fulls/partials/deltas, but instead a system that will properly keep so many versions/revisions of my files, encrypted, in the cloud storage system on a rolling basis. Based on my usage of the CrashPlan software, I'd be saving money by moving to a Google or Amazon system. And while I'm venting, why does CrashPlan only have "never" and "1 year" for the remove-deleted-files options? Why not 2, 3, or 5 years?
|
# ? Oct 14, 2015 22:52 |
You'll be hard-pressed to find a subscription-based service that backs up to Amazon or Google for $60 a year, which is the price point for most consumer backup. It's far cheaper for the company to run their own storage than to pay Amazon, and easier for them to manage. If you find one, it's probably a larger annual expense. Carbonite Server Backup uses Google cloud storage, for example, but is far more expensive than their endpoint product. And if they let you use your own Amazon account, there's less incentive for you to keep a subscription. Solutions like this usually cost more up front. You'll probably have to find something that backs up locally, perhaps for a larger up-front cost, and upload it to your own Amazon account. Something like Acronis, maybe?
|
|
# ? Oct 14, 2015 23:12 |
|
Email I got while I was on holiday: "I need some email archives from some ex-employees from about 15-20 years ago. Could you get me some PSTs when you're back on Monday?"

The backups from before my time consist of an archive box full of CDs/DVDs, about 50 DDS-2 tapes (I have no drive for these), and 200-300 DLT and SDLT tapes (which I do have drives for). Very few of the tapes are labelled, a few with dates and some with a server name, and I'm not even sure what was used to create the backups. I do know that the more recent DLT/SDLT backups were done using Retrospect, which is probably the worst application ever created.

Hilariously enough, I found a 12-year-old CD with a 100-part RAR file that contained a 1GB PST file for a person I was looking for. Not a single error extracting the RARs, and the PST is still good, with email going back to the mid-90s. I certainly didn't expect the CD to still be in perfect condition, and even some CDs I found going back to '98 or so were all perfect.

Once the ancient Adaptec 29160 I ordered off eBay gets here so I can actually connect the old DLT drives, it's back to the horror of ancient tapes and terrible backup apps. Even better, my predecessor tried to restore some data from tapes about 8 years ago, and every single tape was useless. From what I gather from his old emails, the tapes were likely written with garbage data from the beginning, and nobody had ever attempted a single restore, so they had no idea until they went to restore years later. A data recovery company found nothing of use on the tapes; even the tape headers were junk. Yep, it's pretty much the stuff a backup engineer's nightmares are made of.
|
# ? Oct 27, 2015 07:53 |
Wow that's a hell of a lucky story. Also, man, 15-20 years ago is a HELL of a long time. Is there a reason you keep backups from that long ago?
|
|
# ? Oct 27, 2015 15:26 |
|
|
|
I'm glad to find this thread today. I do part-time computer janitoring at a school, and I've finally had a chance to try to get some proper backups working on a new (old) server I've got my hands on, to supplement my current 2 servers. I've got 2 VM host servers with a shared folder (RAID1) on each that I have been using to robocopy the user files into twice a week. Not enough room to store any more than that, not enough money to buy anything better. I'm not happy with it, but so far it has sufficed and I've been able to restore a file every time I've needed to.

One server has lost both disks in its RAID1 now, not both at the same time obviously, but far enough apart that it was possible to get replacements and no data was lost. It's a bad feeling when you're sitting there with a degraded array waiting for the replacement disk to arrive, though, knowing you don't have anywhere else to put your data.

With the new (old) server, I've finally had enough extra space to do daily backups of user files, with additional weekly server backups (which I've never had space to do before wheeeee). The server has an LTO tape drive, and today I've been trying to work out how to use it to get a real, concrete, actual backup that isn't just sitting on a RAID disk in the same cabinet on the same network as everything else, waiting to be wiped out by fire or virus. I got the drivers installed; I even got the utility from HP that let me wipe a tape ready to use. Unfortunately for me, today I also learned that Windows Server 2008+ no longer supports tape drives in Windows Backup. I've got no budget to speak of, the internet connection is a flappy wifi link shitpile so cloud storage is out of the question (we're banned from using "the cloud" due to security concerns anyway), and I'd really pinned my hopes on getting the tapes working.
So far I haven't found a single piece of free backup software that will let me use my tapes on Server 2012 R2 without charging me money or limiting how much data I can back up at a time (I have about 400GB total, I think). Is what I want possible? Is there some kind of free, open-source software that will let me just use the drat tape drive? It doesn't appear as a drive letter in Windows, which, if I remember correctly, is how it was before this server was wiped of Server 2003 and handed to me. I don't have access to Server 2003, and I wouldn't really want to go backwards anyway if possible. Any recommendations?
|
# ? Oct 27, 2015 18:17 |