|
Don't do it unless you can run a hybrid system of some sort where there's a local cache on-site, or your users are comfortable with syncing the folders they actually need instead of everything in the namespace.
|
# ? Feb 27, 2015 23:17 |
|
Thanks Ants posted:Don't do it unless you can run a hybrid system of some sort where there's a local cache on-site, or your users are comfortable with syncing the folders they actually need instead of everything in the namespace.

Speaking of that, does anyone running Egnyte, Nasuni, Panzura, etc. have any experiences they can share?
|
# ? Feb 27, 2015 23:41 |
|
NevergirlsOFFICIAL posted:Anyone here work in an environment where the traditional file server environment was replaced with something like Box, Dropbox, Google Drive, etc.? What does it look like if you have like a 500gb shared drive that is moved to Dropbox - do all your users keep that entire 500gb folder locally and sync back and forth (like how my personal Dropbox does)? Do they just go via web interface and download on demand?

This is configurable to what you want to see. Generally speaking, using OneDrive or Dropbox leaves a local copy for user access when not connected to the internet. OneDrive is snazzy in that you can configure that web interface (a la SharePoint) to suit your needs -- particularly useful for securing out-of-org collaboration. The question seems kind of simple and I wonder if I'm missing something -- are you basically asking if a shared storage location on OneDrive/Dropbox behaves like your personal OneDrive/Dropbox? By default, yes.
|
# ? Mar 1, 2015 04:33 |
|
Potato Salad posted:This is configurable to what you want to see. Generally speaking, using OneDrive or Dropbox leaves a local copy for user access when not connected to the internet. OneDrive is snazzy in that you can configure that web interface (a la sharepoint) to suit your needs -- particularly useful for securing out-of-org collaboration.

Sorry, here's what I'm asking... My company has a shared file server with one main drive called S:\ and in it are a bunch of subfolders, for example:

S:\Finance
S:\HR
S:\Sales
S:\Utilities
S:\Marketing

Everyone sees S: and certain things are restricted; for example, only HR can access HR. If we switch to Dropbox, what would happen? I assume the following:

1. Everyone would get LOCAL COPIES on their desktop/laptop of everything they have permission to
2. HR would also have permission to the HR folder, so members of that dept would all have local copies of that folder
3. I'd remove and add permissions (probably through AD with Okta https://www.dropbox.com/en/help/362 or something like that) and that would remove or add local copies of stuff

all this as opposed to:

1. Users pick whatever folders they want to sync locally (for example, I don't want to waste 50gb on my SSD for marketing poo poo)
2. Anything that I didn't pick to sync I can access via web interface

Dans Macabre fucked around with this message at 22:10 on Mar 1, 2015 |
# ? Mar 1, 2015 21:57 |
|
Does anyone know if this live event was recorded and put up somewhere to view? I had to miss it http://www.microsoftvirtualacademy.com/liveevents/getting-started-with-powershell-desired-state-configuration-dsc
|
# ? Mar 2, 2015 23:37 |
|
ghostinmyshell posted:Does anyone know if this live event was recorded and put up somewhere to view? I had to miss it

I too had to miss it and was hoping to see a cool email in my inbox with a recording of the session. It hasn't shown up yet, so I'm guessing it's probably not going to happen. I would really love one, though.
|
# ? Mar 2, 2015 23:39 |
|
No but this looks similar https://www.youtube.com/watch?v=lP6noSW6Vr4
|
# ? Mar 2, 2015 23:40 |
|
They say it takes about two weeks for recordings of those academy events to be made available, so just be patient. I tuned in for a little bit, and I was pretty disappointed. It seems like the kind of thing that could be condensed into an hour, rather than spread out over 8 hours with crappy examples. For example, when I was watching, they kept rebooting a machine to demonstrate how DSC would remediate it, but their VM rebooted too quickly to actually show the change happening. It kind of soured me on the whole training.
|
# ? Mar 2, 2015 23:50 |
|
I feel that way about the Virtual Academy videos. It's a solid 10 minutes of two fat dudes cracking jokes before they get to the meat of the content. I'm a busy loving dude, just explain to me how this poo poo works so I can get on with my life.

*of the content I have watched.
|
# ? Mar 2, 2015 23:55 |
|
NevergirlsOFFICIAL posted:1. Everyone would get LOCAL COPIES on their desktop/laptop of everything they have permission to <-- BY DEFAULT - THIS CAN BE CONFIGURED IF YOU WANT SOMETHING ELSE

Yes. This is a good example of the kind of granular control you will have. You can pay for something like OneLogin or Okta for Dropbox SSO, but do look into Active Directory Federation Services / AD Sync with OneDrive as well. Note that, depending on how your volume licensing for Office works, the combination of Office licenses and cloud storage [edit: with O365] may end up saving you money in the long run. Are you by any chance looking at encryption / data control as well?
|
# ? Mar 3, 2015 01:12 |
|
Sorry for double post, but do look at WatchDox -- it may be worth your time. https://www.watchdox.com/en/
|
# ? Mar 3, 2015 01:14 |
|
ghostinmyshell posted:Does anyone know if this live event was recorded and put up somewhere to view? I had to miss it gently caress! Did anyone make it? When they put it up someone make sure to post it in this thread.
|
# ? Mar 3, 2015 01:26 |
|
Potato Salad posted:Yes. This is a good example of the kind of granular control you will have. You can pay for something like Onelogin or Otka for Dropbox SSO, but do look into Active Directory Federation Services / AD Sync with OneDrive as well. Note that, depending on how your volume licensing for Office works, the combination of Office licenses and cloud storage [edit: with O365] may end up saving you in the long run. thanks dude. the answers you gave were specifically for dropbox? onedrive is an option but honestly I've had poor luck with it just as an end user trying out onedrive for business. sync errors and stuff. dropbox I know ~just works~
|
# ? Mar 3, 2015 04:18 |
|
NevergirlsOFFICIAL posted:thanks dude. the answers you gave were specifically for dropbox? Yes, generally. Before you do stuff, consider calling https://www.dropbox.com/business/contact . My rep has been helpful in the past, even if we didn't end up buying Dropbox enterprise from her. Free trial = very important to make sure things behave as you want.
|
# ? Mar 3, 2015 04:24 |
|
for sure -- thank you
|
# ? Mar 3, 2015 04:40 |
|
What are you guys using where you have local copies of data but once you jump back online there aren't a bunch of sync errors? That's the problem I've had with most "sync" solutions, and with most phones able to tether I haven't seen a reason to bother with it.
|
# ? Mar 3, 2015 05:41 |
|
devmd01 posted:Speaking of years of awful with an AD environment, I just inherited one. No GPOs other than default domain policy, 2003 domain controllers, passwords not set to expire, no password complexity, dhcp isn't centralized for proper dns registration, literally everything in Computers and Users OUs, everyone local admin, the one admin at the site runs his normal account as domain admin, etc etc etc. Hello me from two years ago.
|
# ? Mar 3, 2015 16:01 |
|
Do you guys know of any website/blog where there are PoC guides for the System Center suite, for example? Something like "for a SCOM PoC you generally need this, this and this and it will take x days to finish all tasks". I'm going to have to do it, so I'm just seeing if there's some kind of shortcut I can take right now.
|
# ? Mar 3, 2015 16:30 |
|
Rhymenoserous posted:Hello me from two years ago. Is your liver still functional?
|
# ? Mar 3, 2015 17:48 |
|
devmd01 posted:Speaking of years of awful with an AD environment, I just inherited one. No GPOs other than default domain policy, 2003 domain controllers, passwords not set to expire, no password complexity, dhcp isn't centralized for proper dns registration, literally everything in Computers and Users OUs, everyone local admin, the one admin at the site runs his normal account as domain admin, etc etc etc.
|
# ? Mar 3, 2015 17:58 |
|
If you'd like another option, I just put through an enterprise wide Box.com rollout. I'd consider it to be a bit more feature-rich than Dropbox. Happy to answer any questions you might have.
|
# ? Mar 3, 2015 20:07 |
|
AlternateAccount posted:If you'd like another option, I just put through an enterprise wide Box.com rollout. I'd consider it to be a bit more feature-rich than Dropbox. How big is your environment? Any regulatory/compliance issues? Are these primarily people out in the field? What did you replace, just a standard windows file server?
|
# ? Mar 3, 2015 20:32 |
|
Tab8715 posted:What are you guys using where you have local copies of data but once you jump back online there aren't a bunch of sync errors? I have literally no sync errors ever with my personal dropbox account.
|
# ? Mar 3, 2015 21:11 |
|
AlternateAccount posted:If you'd like another option, I just put through an enterprise wide Box.com rollout. I'd consider it to be a bit more feature-rich than Dropbox.

I have a ton of questions, starting with:
- did you move your entire server to box.com or is it strictly for specific purposes
- how are you backing it up
- do you maintain any sort of centralized local copy
- how's the mobile apps
|
# ? Mar 3, 2015 21:12 |
|
NevergirlsOFFICIAL posted:I have literally no sync errors ever with my personal dropbox account.

Sorry, I should have been more specific, but when Bob goes off-site, edits his local copy of MarchSales.xls, and re-joins the company network, what occurs? What if Janet also edited the same doc?
|
# ? Mar 3, 2015 21:24 |
|
loving HP printers. I have three on the floor that stop printing after a few jobs, and I have to unplug them and plug them back in to get them to resume printing. When they stop printing, the big hint is that they'll still respond to a ping over ethernet, but they stop serving their self-hosted webpage. They will always print over USB even when ethernet printing is stuck.

- Win7/8 laptops both have the issue
- I just bought and received a totally different model of HP printer, also having the issue, so I guess I've ruled out drivers and model-specific hardware
- I checked to make sure they're auto-negotiating the right switch settings; they are, 100mbps/Full
- Laptops on the same VLAN as the printers have the issue

aaaahhh wtf HP I hate you
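If you wanted to catch that stuck state automatically instead of waiting for jobs to pile up, the symptom described (ping answers, embedded web page dead) reduces to a simple classification. A rough sketch of the decision logic only — the actual probes (ICMP ping, HTTP GET against the printer's web page) are left to whatever your monitoring already does, and nothing here is HP-specific:

```python
def printer_state(ping_ok: bool, web_ui_ok: bool) -> str:
    """Classify the HP hang described above.

    ping_ok:   the printer answered an ICMP ping
    web_ui_ok: the printer's embedded web page responded
    """
    if ping_ok and web_ui_ok:
        return "healthy"
    if ping_ok and not web_ui_ok:
        # Answers ping but the embedded web server has died:
        # the wedged state that needs a power cycle.
        return "stuck"
    return "offline"
```

Polling this every few minutes and alerting on "stuck" would at least get you to the printer before users notice.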
|
# ? Mar 3, 2015 21:37 |
|
Maneki Neko posted:How big is your environment? Any regulatory/compliance issues? Are these primarily people out in the field? What did you replace, just a standard windows file server?

>500 users, ~12 different locations. No regulatory/compliance issues that we could come up with; that was mostly handled by our audit/legal departments, though. People are mostly stationary, but spread out. A good percentage do a fair amount of traveling. We're sort of letting a natural progression happen to phase out some windows file servers and some external-facing sharepoint and sharing.

NevergirlsOFFICIAL posted:I have a ton of questions, starting with: - did you move your entire server to box.com or is it strictly for specific purposes - how are you backing it up - do you maintain any sort of centralized local copy - how's the mobile apps

No, our server infrastructure is intact, but this is going to replace basically anything that involves heavy collaboration or external sharing. I really, really hate just using a directory on a server for any sort of sharing or multi-person work. It's terrible.

We're working on setting up a nightly FTP back to a local server. Not quite set yet.

See above, but not for users to access. They can use Box Sync if they want to sync a few folders down to their desktops.

The mobile apps are pretty fantastic; I am basing that on getting basically zero user questions or problems with them so far.
|
# ? Mar 3, 2015 22:24 |
|
Tab8715 posted:Sorry, I should have been more specific but when Bob goes off-site, edits his local company of MarchSales.xls and re-joins the company network what occurs? What if Janet also edited the same doc?

Then it creates two documents, with the date of the conflict appended to the loser's filename. Users have to merge them manually.
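For anyone scripting around or testing that behavior, the renaming scheme can be mimicked with a small helper. This is a hypothetical sketch — the exact filename format Dropbox uses may differ slightly, but it is along these lines (both versions are kept; the extension survives so Excel still opens the file):

```python
from pathlib import PurePosixPath

def conflicted_copy_name(filename: str, owner: str, date: str) -> str:
    """Build a Dropbox-style conflicted-copy filename, preserving the extension."""
    p = PurePosixPath(filename)
    return f"{p.stem} ({owner}'s conflicted copy {date}){p.suffix}"
```

So Janet's edit stays as MarchSales.xls and Bob's lands beside it under the renamed copy when he re-syncs.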
|
# ? Mar 4, 2015 04:06 |
|
AlternateAccount posted:No, our server infrastructure is intact, but this is going to replace basically anything that involves heavy collaboration or external sharing. I really, really hate just using a directory on a server for any sort of sharing or multi-person work. It's terrible.

OK, so how are you defining this policy? Like, are you letting users decide what constitutes "heavy collaboration"? This is where I see it getting tricky for me. Users will say "oh, Box, cool" and save everything there. And then it's "why is this super important folder all the way on the file server when all my other docs are in ~the cloud~, so let's move that folder there too." I'm not inherently against that, but I'm having trouble with how to set the company policy on what goes in a traditional file server vs. what goes in Box.

quote:We're working on setting up a nightly FTP back to a local server. Not quite set yet.
|
# ? Mar 4, 2015 04:09 |
|
Edit: figures 30 seconds after I type my problem out, I figure out that it's a group policy not applying properly issue. Hooray.
Sheep fucked around with this message at 18:12 on Mar 4, 2015 |
# ? Mar 4, 2015 17:17 |
|
box.lol
|
# ? Mar 4, 2015 17:31 |
|
NevergirlsOFFICIAL posted:OK so how are you defining this policy? Like are you letting users decide what constitutes "heavy collaboration"? Our policy is pretty loose right now. Some of the various divisions have jumped all over it, others might need some more coaxing. I am not sure why there has to be a hardcore policy about what goes where, but for our organization, issues and decisions beyond simple data integrity are left to individual departments to manage for themselves. We don't dictate too many things from the top down. And apparently yeah, direct ftp from their servers, but I haven't made it work yet. More meetings in the coming weeks.
|
# ? Mar 4, 2015 18:58 |
|
This is a pretty stupid question, I'm sure, but I'm still new to working in IT, and I'd rather look like an idiot to other goons than to my coworkers by asking them.

I work for an MSP, and one of our customers requested that we back up a directory for one of their mission-critical programs onto their server (small company, one server). The directory contains backups of a database used by the program (the program makes its own backups of the database). It lives on one single workstation, because this company is cheap and didn't want to shell out a few thousand more for the server-based version that would allow easier access from multiple workstations.

They want these files copied over to the server because on more than one occasion, something happened in the program/its database backups and a bunch of data was lost/corrupted. The software provider worked with them for a while but couldn't restore anything. They want the directory copied over to the server daily, so that it will then get backed up by their Barracuda appliance nightly, and so if the data gets lost again, they can restore the server-based folder from the last known good configuration, then re-copy that back to the workstation.

I know there are multiple ways to get the files from the workstation to the server (Win 7 laptop, Server 2012 R2 server), so I'm wondering what I should use. My initial thought was a scheduled task running robocopy. The files that will be getting copied over aren't the working files for the program; they are backups created from the working database, so as long as I time it to not coincide with the program's own backups, robocopy should be able to copy them and they won't be flagged as open files. I also want it to be as unobtrusive as possible. These aren't tech-savvy users, so seeing a command window pop up in the middle of the day while it does its thing would freak him out, I'm sure. On the flip side, he frequently takes his laptop home, turns it off at night, etc., so just telling it to run at midnight or something isn't an option, either.

I'm wondering if instead it would be better to use a logon script. The advantage here is that I don't think I'd have to worry about scheduling it around when the program does its own backups, and users are used to seeing random "things" flash on the screen when they log in, so he might not be as concerned about it. But I'm not sure how often this user actually logs in, and if he logs in to another machine that doesn't have this program, wouldn't that create problems if the script looks for directories that don't exist?

I'm also wondering if it's possible to use Offline Files/Sync Center in some way to do this... but I don't think so, since Offline Files works from the server side, right? I.e., it lets workstations access existing folders/files on the server and sync changes in both directions, but it won't take an existing folder on a workstation, place it on the server, and keep it updated automatically, right?

And then there's the possible option of simply putting a Barracuda agent on the laptop and configuring it to back up straight from there... probably the simplest, but again, with this guy taking his laptop out of the office and turning it off so much, that might not be reliable. Though maybe the schedule for that job could be changed to a different time of day. I'm also not familiar with Barracuda's licensing scheme, so I'm not sure if another agent has to be paid for or anything like that.

DrBouvenstein fucked around with this message at 20:13 on Mar 4, 2015 |
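For what it's worth, the copy-if-newer behavior a scheduled robocopy would give you can be sketched like this (a rough, hypothetical illustration of the logic only — the folder names are made up, and in practice `robocopy C:\AppBackups \\server\share /E /R:5 /W:30` in Task Scheduler does this natively):

```python
import shutil
from pathlib import Path

def mirror_newer(src: Path, dst: Path) -> list[str]:
    """Copy files from src to dst when missing or newer -- robocopy's default
    skip-unchanged behavior. Returns the relative paths that were copied."""
    copied = []
    dst.mkdir(parents=True, exist_ok=True)
    for f in src.rglob("*"):
        if not f.is_file():
            continue
        target = dst / f.relative_to(src)
        # Copy only when the target is missing or the source is newer.
        if not target.exists() or f.stat().st_mtime > target.stat().st_mtime:
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves timestamps
            copied.append(str(f.relative_to(src)))
    return copied
```

Because `copy2` preserves the modification time, a second run over an unchanged tree copies nothing, which is what makes a daily scheduled run cheap.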
# ? Mar 4, 2015 20:00 |
|
I'm not saying robocopy is the best answer (though it probably is), but a) you can schedule a task to run hidden and as a different user so no one will see it, and b) robocopy can be set to wait and retry when it can't access a file, so it's not a problem if it hits one during a backup, assuming the file is inaccessible to it during the backup. A bigger issue might be if it actually does copy the backup file while it's being written to, but presumably the date will change on the file when it's finished, and robocopy will then copy it the next time around.
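The wait-and-retry in (b) amounts to a small loop like this (a hypothetical helper purely for illustration — robocopy's /R:n and /W:m flags give you retries and wait time without writing any code):

```python
import time

def copy_with_retry(copy_fn, retries: int = 5, wait_seconds: float = 30.0):
    """Call copy_fn, retrying on PermissionError the way robocopy /R:5 /W:30 would."""
    for attempt in range(retries):
        try:
            return copy_fn()
        except PermissionError:
            # File locked (e.g. the app is writing its backup); wait and retry.
            if attempt == retries - 1:
                raise
            time.sleep(wait_seconds)
```

If the file is still locked after the last attempt, the error propagates, which is the same "skipped, logged as failed" outcome robocopy reports.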
|
# ? Mar 4, 2015 20:10 |
|
Fonts! Does anyone know a bit about managing the system fonts on Windows automatically?

The situation I have is that we deploy a server that uses the system fonts to do some document processing. These get pulled from an AWS auto-scaling configuration based on an image, so if someone requests a new font I have to go through the trouble of rebuilding the entire image for it. That is all annoying, and there's no way of knowing whether you have the correct set of fonts, because it's done by hand.

I have figured out how to pass PowerShell scripts that run at boot via AWS's CloudFormation templates. I have this working to configure the web server post-deployment. If it can pull a font package from some place like S3 and install it, maybe it can do that on boot. It still kinda sucks because it will add to the time to spin up the server, but usually these things are running for a while before they get terminated anyway.

The other method I saw was building the fonts into an MSI. If building that can be automated somehow (we are using Bamboo to build and deploy), it might take the headache out of it, and if people need new fonts they will just have to be added to the source repository.
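One piece of this that's easy to automate regardless of the install mechanism is answering "does this box have the correct set of fonts?" — a boot-time script can diff the desired set against what's installed before doing anything. A hypothetical sketch (directory names made up; on a real server you'd point it at the unpacked S3 archive and C:\Windows\Fonts):

```python
from pathlib import Path

FONT_EXTENSIONS = {".ttf", ".otf"}

def missing_fonts(desired_dir: Path, installed_dir: Path) -> list[str]:
    """Return font files present in the desired set but absent from the
    installed set, compared case-insensitively by filename."""
    installed = {f.name.lower() for f in installed_dir.glob("*") if f.is_file()}
    return sorted(
        f.name
        for f in desired_dir.glob("*")
        if f.is_file()
        and f.suffix.lower() in FONT_EXTENSIONS
        and f.name.lower() not in installed
    )
```

An empty result means the instance already matches the desired set and the (slow) install step can be skipped entirely.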
|
# ? Mar 4, 2015 22:18 |
|
Not sure if you've seen this, but there's some Powershell here to deploy fonts http://blogs.technet.com/b/deploymentguys/archive/2010/12/04/adding-and-removing-fonts-with-windows-powershell.aspx
|
# ? Mar 4, 2015 23:39 |
|
Erwin posted:I'm not saying robocopy is the best answer (though it probably is), but a) you can schedule a task to run hidden and as a different user so no one will see it and b) robocopy can be set to wait and retry when it can't access the file, so it's not a problem if it hits it during a backup, assuming the file is inaccessible to it during the backup. A bigger issue might be if it actually does copy the backup file while it's being written to, but presumably the date will change on the file when its finished and robocopy will then copy it the next time around.

There is an open-source program called HoboCopy that is just like robocopy, but it can leverage the Volume Shadow Copy Service to copy files that are in use or otherwise locked. I used to use it with Scheduled Tasks and it kicked rear end.
|
# ? Mar 4, 2015 23:55 |
|
Zero VGS posted:There is an open-source program called Hobocopy that is just like Robocopy, but it can leverage Shadow Volumes to copy files that are in-use or otherwise locked. I used to use that with Schedule Tasks and it kicked rear end. Just for the record (I know he was talking about backing up .baks, and not .mdbs) but do NOT loving do this to live database files. This will result in 100% unusable backups.
|
# ? Mar 5, 2015 00:03 |
|
Thanks Ants posted:Not sure if you've seen this, but there's some Powershell here to deploy fonts http://blogs.technet.com/b/deploymentguys/archive/2010/12/04/adding-and-removing-fonts-with-windows-powershell.aspx

Nope, that particular one didn't come up when I searched. That seems more fleshed out, which is good. So I could do something like have CloudFormation unzip an archive of fonts, put the .ps1 script in a specific location, and then have it install them automatically. If I get it working with cfn-hup, which checks for changes in the script, it should automatically re-run the command when the font archive changes. At least this is all in theory; I haven't done much more than have it run embedded code in the template at this point. Eventually everything will be machine-controlled and I won't have to log into anything. It's going to be so sweet.
|
# ? Mar 5, 2015 03:05 |
|
nexxai posted:Just for the record (I know he was talking about backing up .baks, and not .mdbs) but do NOT loving do this to live database files. This will result in 100% unusable backups.

Yeah, no worries there. I spoke with a support tech from the software vendor (software called UDA Construction Suite) and he made very sure I knew what folder I CAN back up and what folders I cannot touch under any circumstances.

I suspect the underlying problem relates to that. Since they are using the client-based version, if another person wants to work with the software on their workstation, they have to do a manual sync of the database. My guess is that both of them were working on it at the same time, something with the sync screwed up the database, and they didn't notice until a day or two later, by which point the backup files had been overwritten by the corrupted database. I don't know a lot about how the software backs up its own data, but for whatever reason, there wasn't a way to get data from before the corruption. I mean... you'd think it would have a retention policy of at least a week or so, so you'd notice and still have backups to restore from before all of them were overwritten with corrupted data, but I guess not.

The more I think about it, the more I'm leaning towards a Barracuda agent on his system, and then seeing if I can set the schedule so that agent backs up during the day. Since the goal is to get the data onto the Barracuda anyway, it seems simpler to just go there directly rather than do an end-around to the server first and THEN to the Barracuda.

DrBouvenstein fucked around with this message at 17:58 on Mar 5, 2015 |
# ? Mar 5, 2015 17:55 |