Hypnobeard
Sep 15, 2004

Obey the Beard



GreenNight posted:

Yeah I love this. Mapping drives based on security groups is awesome.

Yep, I quite agree. Except that in my case, it won't work, because the GPO will fire and try to map drives before the NAC authenticates the user and gives them access to the network. So, I have to fall back to using a script, which will eventually be run from a GPO.
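
For what it's worth, the script side doesn't have to be anything fancy. A minimal sketch (the group name and share path are made up) that only maps a drive when the user is in a given security group:
code:
# Minimal sketch: map a drive only if the current user is in a given AD group.
# "CONTOSO\Sales" and the UNC path are placeholders.
$id        = [Security.Principal.WindowsIdentity]::GetCurrent()
$principal = New-Object -TypeName Security.Principal.WindowsPrincipal -ArgumentList $id

if ($principal.IsInRole("CONTOSO\Sales")) {
    # net use keeps it compatible with older PowerShell versions on the clients
    net use S: \\fileserver\sales /persistent:yes
}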


oblomov
Jun 20, 2002

Meh... #overrated

CanOfMDAmp posted:

I'm working on building something of a VDI using Hyper-V (Yeah, I know it might not be ideal but just roll with it) and I can't figure out how licensing will work.

I'll be building a Windows Server 2012 based back-end, with a few machines handling domain and file server capabilities, then with 5+ Hyper-V Server 2012 hosts connected to those two. If the Hyper-V hosts are running the operating systems that each user will be connecting to, would I need a User CAL for each person connecting to those systems, or can I operate under the impression that it's the guest OS they're using, which won't be directly or indirectly connected to the server back-end?

EDIT: I suppose they would "be connected" as they're running on the Hyper-V Hosts, but they'll be operating independently of the domain and network entirely, connecting to a separate network setup for remote access. The current setup is those same OSes running on physical hardware, this is just a transition to a more manageable virtualized setup.

So, there are different sorts of CALs you need here.

1. Every user connecting needs either a server CAL or a user (potentially a device) CAL (unless this got changed during the last year; we have an EA, so I'm not sure).
2. You need to cover the VMs running on Hyper-V. Since it's 2012, you need to buy Datacenter licenses for the hosts' processors.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
What's the advantage of using item-level targeting over defining the scope of the GPO? Obviously scope can only do inclusive and not exclusive, but the only way to really prevent someone from accessing the data is to also take away their NTFS access, and I'd rather do that by taking them out of the group that has access than by creating a deny ACL.

Maybe that last part's just me.

GreenNight
Feb 19, 2006
Turning the light on the darkest places, you and I know we got to face this now. We got to face this now.

Because I have 1 GPO that maps 10 or so drives, and some drives all users get while other drives only a certain group needs. It's nice to have all your drive maps in 1 place rather than in 10 GPOs.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
Didn't even think about that. Makes me want to change all my file and printer GPOs to be in a single GPO.

Sudden Loud Noise
Feb 18, 2007

peak debt posted:

Nightmare as in "extract the MSI then install that"?

Little old but...

Java is an undeniable nightmare (although I haven't had to deal with it for a couple of months, so maybe things have improved?). 32-bit and 64-bit can coexist on the same machine, the automatic update only works on newish versions, and they increment MSI product codes for minor releases. The in-place upgrade literally just uses an "msiexec /x" to uninstall the previous version, so if you have a broken install the upgrade will do absolutely nothing to fix the issue, and if IE is open during the upgrade you can completely wreck the install, so you either have to force a restart or write your own script to notify the user and close IE.

It is without a doubt the worst piece of software I've ever had to manage the distribution of, although third-party encryption with SCCM 2007 is a close second.
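
A rough sketch of the kind of script that ends up handling the "close IE, then rip out the old version" dance (the product code GUID below is a placeholder, not a real Java code - every release has its own):
code:
# Rough sketch - the GUID is a placeholder; look up the real product code per Java release.
$oldJavaProductCode = "{00000000-0000-0000-0000-000000000000}"

# Kill Internet Explorer first so the upgrade doesn't wreck the install
Get-Process -Name iexplore -ErrorAction SilentlyContinue | Stop-Process -Force

# Silently uninstall the old version; the new MSI gets pushed as a separate step
Start-Process msiexec.exe -ArgumentList "/x $oldJavaProductCode /qn /norestart" -Wait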

vty
Nov 8, 2007

oh dott, oh dott!
I need a sanity check.

I've got a forest with multiple domains. Two of which have a one way trust (prod1, prod2).

We've recently been deprecating the prod2 and moving everything to prod1.

The method to the madness was;

1. Remove all servers from prod2 by switching them to a workgroup
2. Rejoin all servers to prod1
3. Test logins, walk away, sip tea

What I'm experiencing now is that the servers were never fully removed from prod2 (they're still in ADUC), and this is causing (and this is what's confusing me the most):

1. Kerberos errors (Event ID 4, Kerberos), which is basically a duplicate entry error
2. The servers can't access prod1's SYSVOL now (due to said Kerberos error)

So I'm sure I just need to remove them from prod2 completely, although I may need to remove them from both prod2 and prod1 and then rejoin them to prod1.

What I'm not understanding is this: the servers were in separate domains?! Why would duplicate server entries with different suffixes (server01.prod1.com, server01.prod2.com) matter? Apparently, with whatever is currently going on, I CANNOT have any objects between the two domains that have the exact same hostname, and that is baffling me.
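
In case it's useful for anyone following along, this is roughly how I'm sanity-checking the duplicate-entry theory (a sketch: setspn ships with 2008+, the AD cmdlet needs RSAT, and the DC name is a placeholder):
code:
# Look for duplicate SPNs across the whole forest; Kerberos Event ID 4
# (KRB_AP_ERR_MODIFIED) is commonly caused by duplicate SPNs or stale computer accounts.
setspn -X -F

# Confirm the stale computer object really is still sitting in prod2
Get-ADComputer -Identity "server01" -Server "dc01.prod2.com"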

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

What's the recommended book for Windows server? Something that covers AD and GPO and bonus points for getting into WSUS and other stuff. I've been away for two years and am not really sure what all has improved or changed since Server 2003.

vty
Nov 8, 2007

oh dott, oh dott!

Bob Morales posted:

What's the recommended book for Windows server? Something that covers AD and GPO and bonus points for getting into WSUS and other stuff. I've been away for two years and am not really sure what all has improved or changed since Server 2003.

Typically the "Inside Out" books if you don't want to bore yourself with an entire MCP/MCITP course.

http://www.amazon.com/Windows-Server-2012-Inside-Out/dp/0735666318/ref=sr_1_1?s=books&ie=UTF8&qid=1371828037&sr=1-1&keywords=windows+server

Nebulis01
Dec 30, 2003
Technical Support Ninny

Bob Morales posted:

What's the recommended book for Windows server? Something that covers AD and GPO and bonus points for getting into WSUS and other stuff. I've been away for two years and am not really sure what all has improved or changed since Server 2003.

If you want the book on AD and GPO, I would highly recommend Jeremy Moskowitz's Group Policy book (http://www.amazon.com/Group-Policy-Fundamentals-Security-Managed/dp/1118289404/ref=la_B001ILM9BS_1_1?ie=UTF8&qid=1371829668&sr=1-1)

As for the server side, which version will you be working with? 2008 R2 is significantly different from 2012.

My go-to for Server 2012 has been the Unleashed book, which has been a great resource (http://www.amazon.com/Windows-Server-2012-Unleashed-Morimoto/dp/0672336227/ref=sr_1_1?s=books&ie=UTF8&qid=1371829778)

kzin602
May 14, 2007




Grimey Drawer
I am going to preface this by saying that I am not certified for any of this, and I know I'm going to mess up a lot of the terminology, but here goes.

Warning: long and ranty

I was hired as a part-time 'computer janitor' for a company with 30 employees in 6 locations; however, in the past year and a half we have grown to about 75 employees in a dozen locations. These locations are all across the country, and are literally just a bunch of Windows machines on a local network hooked up to a router and then to whatever the local DSL / cable / T1 provider is. As the company grew I was promoted to full-time to take over a web developer's job, and have been engaged in a long-term project to basically revamp all of the company's public websites and get a handle on our lead generation and management system (there wasn't one).

I also took on the duties of the IT guy. I use that term loosely because there was, until I came in, no policy whatsoever. The guy I was hired to take some workload off had every user set up as a local admin, and had some ancient XP machine at each location with a free trial of LogMeIn installed, which he would then use with VNC (also a free trial) to jump to whatever machine at the location he needed.

Because of the industry our company is in, most of our retail staff on these machines are completely computer illiterate, and at least once a week we would get a machine completely hosed by people messing around with Windows settings or, more often, some awful virus... The one that says the FBI has locked your machine and you have to pay a fine via anonymous wire to unlock the computer seemed to be a favorite of our staff. Having a local tech come out and run some antivirus off a thumb drive to clear these issues was costing us about a grand a month across our locations.

Of course the previous IT guy had installed antivirus: he had a million subscriptions to whatever Trend Micro's consumer-level AV was, and this antivirus application would sit there saying 'Your PC is Protected' while the machine was being hijacked by the FBI MoneyPak virus or was in the middle of a remote-control session due to Zeus.

I moved us to a paid subscription to LogMeIn and got it going on every machine in the company, removed Trend Micro and replaced it with Microsoft Security Essentials, and have not had any infections since. The next phase was creating a new local user on each machine, migrating all their documents out of the admin account, creating a new 'clean' admin account with a secret password, and then deactivating the old local admin account.
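
For what it's worth, the per-machine shuffle boils down to a handful of commands run over LogMeIn; roughly this (a sketch - the account names and passwords are placeholders):
code:
# Rough sketch of the per-machine account cleanup; names and passwords are placeholders.
net user CleanAdmin S0meSecretPassword /add    # new "clean" admin account
net localgroup Administrators CleanAdmin /add
net user StaffUser TempPassword1 /add          # new non-admin account for daily use
net user OldAdmin /active:no                   # disable the old local admin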

As the company grows, it's becoming unsustainable to remote into every machine every time we gain or lose an employee. I badly need some kind of Windows account management. Active Directory and a domain controller seem to be what I need, except we have a dozen sales offices, and if I were to ship a local AD controller to each location, I don't trust them with the physical maintenance. I know that it will get put in a closet, covered with papers, sold for copper, or eaten by a gator, and have a hardware failure within a year.

My actual question

It looks like you can actually run AD over the internet, and local credentials are cached, so even if a location loses internet connectivity, they can still use and log into that machine as long as they are not switching from one workstation to another. My users are not moving from one PC to another very often.

I don't need complete roaming profiles; each location now has a Dropbox account that they can use if they want to move their documents from one station to another, but I would like things like their Outlook settings to follow them, and perhaps to take advantage of SSO in the future. It looks like I can use Samba 4 as the actual AD controller (http://wiki.samba.org/index.php/Samba#Samba_AD) and somehow people could log into its domain.

However, I was told by a friend of mine that this is impossible and that you always need n+1 AD servers, where n is the number of locations, plus one master that the slaves replicate from and sync to. This seems strange to me, because how do you handle somebody on a laptop, as long as the central server is reachable? And from what I've seen, as long as you are not mapping their actual local Documents folder, it's not very bandwidth-intensive.

I would like to make this as 'cloud'-based as possible. I really don't want to have a big old Windows server, and it appears that as long as Samba acts as the central server, I can use a Windows machine to build Group Policy or manage user accounts that are then stored and distributed by the Samba server. I can easily host Samba on a Linux host somewhere (I know a lot more about Linux administration than I do Windows) and make it highly available.

edit: I'm not a Linux evangelist and am open to using MS/Windows tools to do this, but I'm not exactly sure what I would need; the licensing structure for Windows Server seems very complex, not to mention the cost of the software itself. I'm also surprised MS hasn't introduced a web/Azure-based management tool along the same lines as Office 365 and Intune, with a much easier-to-manage $X / user / month. Or maybe they have and I'm not looking in the right place.

So I'm asking you goons: is this even possible, or is there a better way to approach my situation? My goal is, at the least, to be able to enforce Group Policy and enable/disable Windows accounts; having application preferences and credentials pushed to users would be nice.

kzin602 fucked around with this message at 18:28 on Jun 21, 2013

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
You don't need a server at each location, whoever told you that is wrong.

What you really want to do is set up a VPN at each site that connects to the main office where your AD servers are; that way you don't need to expose everything to the open internet.

Thanks Ants
May 21, 2004

#essereFerrari


I'd recommend a VPN endpoint built into the router you drop at each location, split tunnel it so only the traffic that needs to travel back to the head office does, and possibly drop a NAS at some locations for stuff like profile storage if you think it's necessary.

The Meraki access points do this pretty much by themselves, and then you have the advantage of decent wifi at each location if you need it.

kzin602
May 14, 2007




Grimey Drawer

FISHMANPET posted:

You don't need a server at each location, whoever told you that is wrong.

That way you don't need to expose everything to the open internet.

What would I be exposing? The AD server requires credentials to access, and would be on a domain not known to the public (security through obscurity is not security, I know). The connection between the AD controller and the Windows client is encrypted, isn't it? I'm not dismissing the need for security, but it seems that a VPN is an unnecessary complication. It doesn't seem like it would be any more or less exposed than any other server on the internet. Are AD DCs notoriously easy to hack or something? Is there some backdoor/default account that is always enabled?

He was saying you need a server at each location because of what happens if the domain controller is unreachable. I would assume that things would continue as normal using cached credentials (as long as a user is not changing machines), and once the server is reachable again things would sync back up.

Erwin
Feb 17, 2006

We're looking into different methods of maintaining configurations on our production application environment (5 domain-joined Windows VMs, not including infrastructure like domain controllers and file servers). We've played with Puppet and Chef, and now we're looking into the idea of "Immutable Servers" - essentially cloning your production machines from templates, and updating those templates with code and software changes - never changing production, only tearing it down and deploying new VMs.

Since our application environment is pretty small, we thought about doing it like this:

1) Clone production VMs to an isolated network. Also clone a DC and any necessary file servers.
2) Make code changes/software upgrades there, and test test test.
3) If all is well, destroy production and clone the staging machines out to the production network.

Has anyone done this? My biggest concern with this is that they're joined to the domain. If the VMs are separated for too long, we'll end up with domain trust relationship issues. Is it better to rejoin them to the domain? Sysprep them maybe? Not do this at all because it's crazy?

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams

kzin602 posted:

What would I be exposing? The AD server requires credentials to access, and would be on a domain not known to the public (security through obscurity is not security, I know). The connection between the AD controller and the Windows client is encrypted, isn't it? I'm not dismissing the need for security, but it seems that a VPN is an unnecessary complication. It doesn't seem like it would be any more or less exposed than any other server on the internet. Are AD DCs notoriously easy to hack or something? Is there some backdoor/default account that is always enabled?

He was saying you need a server at each location because of what happens if the domain controller is unreachable. I would assume that things would continue as normal using cached credentials (as long as a user is not changing machines), and once the server is reachable again things would sync back up.

Generally you don't want stuff like that open to the wide-open internet unless you have a really good reason to. Not wanting to bother getting a couple of VPN-in-a-box routers (I can't recommend any because I don't work in that space, but based on what Caged said they shouldn't be too hard to find) isn't a very good reason. You shouldn't be asking yourself why you need to bother taking your domain controllers off the public network, you should be asking yourself why you'd want them on the public network in the first place.

If you lose your link you won't be able to authenticate users whose credentials aren't already cached and your group policies won't refresh, so it's not that big of a deal. Is there any kind of centralized inventory database or anything that these clients are accessing? You said retail. If you're not already dependent on a steady connection back to the mothership, you probably will be soon, so you should make sure your network connections are robust and worry less about outages. If you really wanted to put DCs in each location you could do something like a read-only domain controller at each site, but I don't really think there's a need for it.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams

Erwin posted:

We're looking into different methods of maintaining configurations on our production application environment (5 domain-joined Windows VMs, not including infrastructure like domain controllers and file servers). We've played with Puppet and Chef, and now we're looking into the idea of "Immutable Servers" - essentially cloning your production machines from templates, and updating those templates with code and software changes - never changing production, only tearing it down and deploying new VMs.

Since our application environment is pretty small, we thought about doing it like this:

1) Clone production VMs to an isolated network. Also clone a DC and any necessary file servers.
2) Make code changes/software upgrades there, and test test test.
3) If all is well, destroy production and clone the staging machines out to the production network.

Has anyone done this? My biggest concern with this is that they're joined to the domain. If the VMs are separated for too long, we'll end up with domain trust relationship issues. Is it better to rejoin them to the domain? Sysprep them maybe? Not do this at all because it's crazy?

Welp double posting.

You'll want to sysprep the machines, as otherwise the clone is going to have the same GUID as the source machine, and that's bad bad not good. If you're using VMware, some of the cloning and template deployment will do sysprep for you, so make sure you're aware of what's going on under the hood.

Also, if it were me (and as above, I don't work in this area either), I would build a clean machine every time and deploy the new code onto it, test test test, then put the new machine in production. Being able to stand up a new machine with something like SCCM/MDT at a moment's notice is pretty amazing and can really change the way things are done (no idea how much build stuff Puppet/Chef can do on the Windows side, so you may already be able to do this with what you've got).
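
If you do end up sysprepping by hand rather than letting VMware's guest customization handle it, it's basically one command (a sketch; run it inside the template VM before cloning):
code:
# Generalize the image (resets the machine's SID/identity) and shut down, ready to clone
& "$env:SystemRoot\System32\Sysprep\sysprep.exe" /generalize /oobe /shutdown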

Erwin
Feb 17, 2006

FISHMANPET posted:

Welp double posting.

You'll want to sysprep the machines, as otherwise the clone is going to have the same GUID as the source machine, and that's bad bad not good. If you're using VMware, some of the cloning and template deployment will do sysprep for you, so make sure you're aware of what's going on under the hood.

Also, if it were me (and as above, I don't work in this area either), I would build a clean machine every time and deploy the new code onto it, test test test, then put the new machine in production. Being able to stand up a new machine with something like SCCM/MDT at a moment's notice is pretty amazing and can really change the way things are done (no idea how much build stuff Puppet/Chef can do on the Windows side, so you may already be able to do this with what you've got).

Right, but I'll be replacing the production machine with the upgraded production machine. Basically the same machines will exist in two different networks. Since the domain controller in the isolated staging network will be cloned from production, it'll be basically the same domain. When I push to production, I'll basically script out 'delete prod1 -> clone prod1 from isolated network -> power on prod1'. Same GUID because it is the same machine, just like you don't need to sysprep when restoring from backup.

So, I don't need to sysprep to avoid duplicate GUIDs, because only one machine at a time will have the same GUID (well, two at a time, but in two different networks), but I'm worried about the trust relationship. Roll back a machine several weeks from backup or a snapshot (not that I'd do this) and you get a broken trust relationship. This is the part I'm not sure how to work around.

We've got the 'quickly standing up a new server' part down with VMware templates and Chef, but it just feels inefficient joining and unjoining servers every two weeks. I guess that's not a bad thing, but we thought we'd explore this avenue as well.
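
If the trust does break when a clone comes back after a few weeks away, my thinking is to repair the secure channel in place rather than unjoining/rejoining, something like this (a sketch; the DC name and credential are placeholders):
code:
# Check the machine's secure channel with the domain and repair it in place
if (-not (Test-ComputerSecureChannel)) {
    Test-ComputerSecureChannel -Repair -Credential (Get-Credential CONTOSO\Administrator)
}

# Alternative: force a machine account password reset against a specific DC
Reset-ComputerMachinePassword -Server "dc01.contoso.com" -Credential (Get-Credential CONTOSO\Administrator)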

peak debt
Mar 11, 2001
b& :(
Nap Ghost
In general you don't really clone stuff anymore these days. The problems you get with unwanted settings and drivers being carried over to the other machine are just too troublesome. Automated installations handle quick provisioning of new machines much better, whether you do it homebrew-style with unattend files and scripts, fancier with SCCM task sequences, or really fancy with Orchestrator. That way you have a clean new install where you know exactly what is and isn't on the machine.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
Why are you replacing the DCs each time?

Erwin
Feb 17, 2006

I'm not. Staging would get a DC cloned from prod, but it wouldn't be cloned back out.

wyoak
Feb 14, 2005

a glass case of emotion

Fallen Rib
Has anyone messed with DirectAccess in Server 2012? I got it set up and going, but the setup was probably a little TOO transparent, because when I screwed something up I couldn't get it fixed. I accidentally made the client GPO apply to the server, which caused the server to be unable to communicate with the domain, which in turn caused the server to be unable to de-apply the client GPO until I fixed its NRPT settings.

Once I fixed the name resolution policy table I was able to get everything green in the DA dashboard again, and DirectAccess clients would connect via IP-HTTPS, but they couldn't resolve any internal names. They could ping internal IPv6 addresses though, so the tunnel was at least partly working. The DA server was responding to DNS64 queries from internal hosts, but not from external hosts, and those are the ones that matter, obviously. The firewall on the server didn't seem to be blocking anything, but the server didn't seem to be seeing requests on port 53 from the external clients, period.

I eventually scrapped the config and just started over, since we're still in a testing phase, but that sucks because the wizard re-does all of the server IPv6 addresses, so any clients that already have the settings get locked out until they see the new GPO, which in a real deployment would require them connecting via VPN or something. It'd probably be possible to figure out how the IPv6 addressing and routing works between all the different adapters and re-set up the config manually, but the wizard does everything on its own so you don't really see anything.

Anyway, wondering if anyone else has messed with it and seen anything similar? Also, when it's working (IE until I break it), DirectAccess is really really cool and I recommend it.

GreenNight
Feb 19, 2006
Turning the light on the darkest places, you and I know we got to face this now. We got to face this now.

Does anyone back up all their enterprise data to the cloud? The boss and I are sick to death of tapes, tape libraries, a poo poo ton of B2D storage and so forth, and are looking at the feasibility of just backing everything up to a cloud provider such as Rackspace.

Frag Viper
May 20, 2001

Fuck that shit
I recently started at a medium-sized non-profit company, and one of my goals is to update the workstations from XP Pro to Win 7. We currently use Ghost, but I feel that it's old and outdated. I've been looking at MDT, and another thread suggested WDS and said I should bring my question to this thread.

Long story short, am I better off learning how to roll out images with MDT, or WDS? Which of the two is more user-friendly?

dotalchemy
Jul 16, 2012

Before they breed, male Mallards have bright green/blue heads. After breeding season, they molt and become brown all over, to make it easier to hide in the brush while nesting.

~SMcD
WDS is your Microsoft equivalent of Ghost, for the most part. You capture a prebuilt, sysprepped image, and WDS pushes it down to a client that you PXE boot into WDS.

MDT lets you do all sorts of awesome scripting and customization, but it's a bit more complicated. You can use USMT, push programs during the build, run updates - pretty much everything OSD that SCCM can do, you can do via MDT. Amusingly, you can "extend" SCCM's OSD capability by plugging MDT into it.

Long story short, throw WDS out there (I think MDT requires it for actual deployment of its images anyway - it's the boot server) and then learn as you go with MDT once you have a base image that you're pushing.

Frag Viper
May 20, 2001

Fuck that shit
OK, WDS sounds a little more right for the job then. I'll go bug the sysadmin about getting it set up.

Is this documentation good to use? Or should I go with a non-Microsoft set of instructions?
http://www.microsoft.com/en-us/download/details.aspx?id=7258

Sacred Cow
Aug 13, 2007

spidoman posted:

Little old but...

Java is an undeniable nightmare (although I haven't had to deal with it for a couple of months, so maybe things have improved?). 32-bit and 64-bit can coexist on the same machine, the automatic update only works on newish versions, and they increment MSI product codes for minor releases. The in-place upgrade literally just uses an "msiexec /x" to uninstall the previous version, so if you have a broken install the upgrade will do absolutely nothing to fix the issue, and if IE is open during the upgrade you can completely wreck the install, so you either have to force a restart or write your own script to notify the user and close IE.

It is without a doubt the worst piece of software I've ever had to manage the distribution of, although third-party encryption with SCCM 2007 is a close second.

After futzing around with deploying Java 7u25, I've found the best solution (for my SCCM 2012 environment anyways) is to have two simultaneous deployments: one Java package that is set to only run when the user is not logged in, and a second package that is deployed as "Available" in Software Center. I set a forced restart over the weekend and Java gets installed after the machines come back up. I'll keep an eye out for any failed deployments and follow up to have the user manually install it using Software Center. We also send out an email letting users know how to download and install it on their own using Software Center but, well... users.

I haven't had luck Googling this, but is there a decent process to uninstall unapproved software remotely without having to remote into a user's computer and manually remove it through Control Panel? My boss has declared that software like iTunes and Steam should not be on company computers and just wants them gone with as little impact to the user as possible.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
If you can figure out their MSI product codes you can run msiexec to uninstall them. You can set up either an application or a program to do it and run that with SCCM.
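
A quick sketch of what that can look like - digging the product code out of the registry by display name and feeding it to msiexec ("iTunes" here is just an example display name; wrap something like this in an SCCM program or script):
code:
# Sketch: find the MSI product code for an app by display name and uninstall it silently.
$keys = "HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*",
        "HKLM:\SOFTWARE\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*"

Get-ItemProperty $keys -ErrorAction SilentlyContinue |
    Where-Object { $_.DisplayName -like "iTunes*" -and $_.PSChildName -match '^\{.*\}$' } |
    ForEach-Object {
        Start-Process msiexec.exe -ArgumentList "/x $($_.PSChildName) /qn /norestart" -Wait
    }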

Erwin
Feb 17, 2006

dotalchemy posted:

MDT lets you do all sorts of awesome scripting and customization, but it's a bit more complicated. You can use USMT, push programs during the build, run updates - pretty much everything OSD that SCCM can do, you can do via MDT. Amusingly, you can "extend" SCCM's OSD capability by plugging MDT into it.

Hey, speaking of this, we're looking at a way to automate deployment of our application servers. We've tried Chef and Puppet and I was going to set up SCCM as a comparison, but maybe MDT is a better answer? We need something that will:

-Install Apache Tomcat, Java, some other stuff
-Set up some config files, preferably by checking them out of version control
-Set environment variables
-Change some start-up types for services (e.g. set them to manual instead of automatic)

It would also be cool if you could check a server to make sure it's compliant (install missing software, reset variables, etc). Is this a job for MDT?

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
I think you're looking for PowerShell Desired State Configuration. A friend of mine who works at Microsoft just showed this to me yesterday; it pretty much looks like Puppet for Windows:
http://technet.microsoft.com/en-us/library/dn249918.aspx

Also, if you can afford SCCM, do it, and then bolt MDT on top of it. You can run MDT deployments from within SCCM, so it's the best of both worlds. I think MDT has a more comprehensive setup for installing an OS, but SCCM is better at everything else.

Sudden Loud Noise
Feb 18, 2007

FISHMANPET posted:

If you can figure out their MSI product codes you can run msiexec to uninstall. You can setup either an application or program to do it and run that with SCCM.

You can also uninstall many programs using the Win32_Product WMI class.

But you'll run into challenges with both.
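
For completeness, the Win32_Product route looks roughly like this (a sketch; note that merely enumerating Win32_Product triggers a consistency check of every installed MSI on the box, which is one of those challenges):
code:
# Sketch: WMI-based uninstall. Querying Win32_Product is slow and kicks off an MSI
# consistency check for every installed product, so use it sparingly.
$app = Get-WmiObject -Class Win32_Product -Filter "Name = 'iTunes'"
if ($app) { $app.Uninstall() }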

In my experience, mandatory (and invisible) uninstalls of users' programs are a very messy process that managers shove onto the technical support team when in reality it's a personnel/communication issue.

The technical people will get tons of calls complaining that their stuff got deleted, when an email should at least have been sent out by management saying "If you have the following products installed, please uninstall them. If you do not uninstall them by x date they will automatically be uninstalled."

dotalchemy
Jul 16, 2012

Before they breed, male Mallards have bright green/blue heads. After breeding season, they molt and become brown all over, to make it easier to hide in the brush while nesting.

~SMcD

FISHMANPET posted:

I think you're looking for PowerShell Desired State Configuration. A friend of mine who works at Microsoft just showed this to me yesterday; it pretty much looks like Puppet for Windows:
http://technet.microsoft.com/en-us/library/dn249918.aspx

Also, if you can afford SCCM, do it, and then bolt MDT on top of it. You can run MDT deployments from within SCCM, so it's the best of both worlds. I think MDT has a more comprehensive setup for installing an OS, but SCCM is better at everything else.

So, the problem with Microsoft and their DCM solutions (remember, SCCM has one too, if you can call it that) is that it's a significant time sink to set up, and a lot of the time it just generates reports - it doesn't do anything in an enforcement sense. DCM in SCCM will let you specify a configuration baseline, but it'll only report against it. If there's skew in the configuration, you gotta go fix that yourself - be it something removed or something added.

I'm not hugely sure how Puppet works, but one thing I've seen that I like for configuration management is CFEngine - annoyingly, the Windows version isn't free. It does a "promise"-based deployment, where x group of systems is promised y software - they go out and get it, install it, configure it based upon scripts, and you're done. You can use the same mechanism for uninstalling or for modifying the configuration.

I used it a lot with Linux to basically remove manual configuration on whatever we needed to deploy. As far as I understand, the Windows version has capabilities mostly on par.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
CFEngine is ooooooold, and Puppet/Chef do the same thing. Also, from my understanding, PowerShell DSC enforces as well; it doesn't just generate reports.

My Microsoft friend also showed me this, a comparison between a Puppet manifest and a DSC whatever it's called:
code:
PowerShell DSC:

File MyFileExample
{
    Ensure          = "Present"
    Type            = "Directory"
    Recurse         = $true
    SourcePath      = $WebsiteFilePath
    DestinationPath = "C:\inetpub\wwwroot"
    DependsOn       = "[WindowsFeature]MyRoleExample"
}


Puppet:

file { 'MyFileExample':
    ensure  => directory,
    recurse => true,
    source  => $WebsiteFilePath,
    path    => 'C:\\inetpub\\wwwroot',
    require => Package['MyRoleExample'],
}

Sacred Cow
Aug 13, 2007

dotalchemy posted:

So, the problem with Microsoft and their DCM solutions (remember, SCCM has one too, if you can call it that) is that it's a significant time sink to set up, and a lot of the time it just generates reports - it doesn't do anything in an enforcement sense. DCM in SCCM will let you specify a configuration baseline, but it'll only report against it. If there's skew in the configuration, you gotta go fix that yourself - be it something removed or something added.

I'm not hugely sure how Puppet works, but one thing I've seen that I like for configuration management is CFEngine - annoyingly, the Windows version isn't free. It does a "promise"-based deployment, where x group of systems is promised y software - they go out and get it, install it, configure it based upon scripts, and you're done. You can use the same mechanism for uninstalling or for modifying the configuration.

I used it a lot with Linux to basically remove manual configuration on whatever we needed to deploy. As far as I understand, the Windows version has capabilities mostly on par.

Configuration Baseline in SCCM 2012 has remediation now. I can't vouch for how dependable it is since I've never used it outside of the lab in my training class but the option is there now.

Moey
Oct 22, 2010

I LIKE TO MOVE IT

Caged posted:

I'd recommend a VPN endpoint built into the router you drop at each location, split tunnel it so only the traffic that needs to travel back to the head office does, and possibly drop a NAS at some locations for stuff like profile storage if you think it's necessary.

The Meraki access points do this pretty much by themselves, and then you have the advantage of decent wifi at each location if you need it.

A little old but seconding Meraki. Dead simple to use and not too costly either. Pick up some Z1 boxes for the remote offices and an MX60 for your main site. Meraki also does dynamic DNS so even if the remote offices are on a consumer line and their IP changes, you don't have to bat an eyelash.

We have around 10 Z1s, 2 MX60s, and 2 MX80s deployed. The amount of visibility it will give you is amazing. If a site is yelling about crappy internet speeds, you can go and see which device is consuming the bandwidth and where it's going.

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

I have PTR records in a Windows 2003 DNS server with a name of 10_21_win7 which maps to x.x.10.21
Also have one for other clients such as 10_22 (without the _win7, it's an XP machine) which maps to x.x.10.22 (all of our clients are static IP).

I can't get an IPv4 address through DNS for just that record, but can get an IPv6 address.
code:
ping 10_21_win7 -4
Ping request could not find host 10_21_win7. Please check the name and try again

C:\Users\rob.vasquez>ping 10_21_win7

Pinging 10_21_win7.example.local [2002:8001:a15::8001:a15] with 32 bytes
Another host looks up fine:
code:
ping 10_22 -4

Pinging 10_24.example.local [128.1.10.24] with 32 bytes
I can, however, ping the host by IP

code:
ping  128.1.10.21

Pinging 128.1.10.21 with 32 bytes of data:
Reply from 128.1.10.21: bytes=32 time=51ms TTL=122
Reply from 128.1.10.21: bytes=32 time=114ms TTL=122
Reply from 128.1.10.21: bytes=32 time=93ms TTL=122
Ideas? It also seems to reverse look up fine using the IPv4 address:

code:
nslookup 128.1.10.21
Server:  r-21.example.local
Address:  128.1.2.41

Name:    10_21_win7.example.local
Address:  128.1.10.21
I checked to make sure the hosts file didn't have anything but I'm not sure where to go from here. I don't see any errors in the Event Log on the DNS server either.

Bob Morales fucked around with this message at 16:54 on Jun 28, 2013

wyoak
Feb 14, 2005

a glass case of emotion

Fallen Rib
PTR records are for IP -> hostname lookups, A records are for hostname -> IPv4 lookups; do you have an A record set up for 10_21_win7?
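
If not, adding one from the DNS server is a one-liner (a sketch reusing the names from your post; dnscmd is the 2003-era tool for this):
code:
# Sketch: add the missing A record on the 2003 DNS server, then verify it resolves over IPv4.
dnscmd r-21.example.local /RecordAdd example.local 10_21_win7 A 128.1.10.21
nslookup 10_21_win7.example.local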

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

wyoak posted:

PTR records are for IP -> hostname lookup, A records are for hostname -> IPv4 lookup; do you have an A record setup for 10_21_win7?

Derp... that was it, it didn't have an A record, only an AAAA. Thanks.

Cpt.Wacky
Apr 17, 2005

Moey posted:

A little old but seconding Meraki. Dead simple to use and not too costly either. Pick up some Z1 boxes for the remote offices and an MX60 for your main site. Meraki also does dynamic DNS so even if the remote offices are on a consumer line and their IP changes, you don't have to bat an eyelash.

We have around 10 Z1s, 2 MX60s, and 2 MX80s deployed. The amount of visibility it will give you is amazing. If a site is yelling about crappy internet speeds, you can go and see which device is consuming the bandwidth and where it's going.

Do these effectively replace whatever existing router you have at the main and remote sites? Are there any ongoing costs after the initial purchase, like a subscription to their cloud management stuff?


incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010
It looks like a VSS writer has poo poo the bed on my R2 install, and it seems Microsoft has no way of fixing it. Symantec has done all they could and pointed me in the direction of Microsoft. Basically the issue is outlined here, but instead of Windows Backup, it's any backup solution that uses VSS.

Kind of at a loss. Everything says "DON'T RUN REGSVR32" for R2 machines, and I can't really flatten and replace.
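
For anyone poking at the same thing, the writer state is at least visible from the command line (a sketch; the service name in the second command is just an example, match it to whichever writer is failing):
code:
# List VSS writers and their last error state; a writer stuck in a failed state
# is often what breaks VSS-based backup jobs.
vssadmin list writers

# Restarting the service that owns the failed writer sometimes clears the state
# without resorting to regsvr32 (service name here is an example).
Restart-Service -Name "SQLWriter" -ErrorAction SilentlyContinue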
