|
Noghri_ViR posted: "I can add clients onto my ESET install at 9 bucks per client per year. No way I'm paying 11 bucks a month." Does your ESET install include Windows Enterprise (upgrade), internet-based MS patch management, and remote support tools?
|
# ? Jun 10, 2011 12:28 |
|
|
ESET does monitor updates for the computer it is installed on, yes. It's pretty nice.
|
# ? Jun 10, 2011 13:57 |
|
Are there any good sites out there for setting up Remote Desktop Services (Terminal Services)? The initial setup and app installation is straightforward, but I'd appreciate any blogs etc that have good information.
|
# ? Jun 10, 2011 21:33 |
|
Just wanted to say thanks for the exposure to PowerShell. I am now in love. Did a few extremely simple things with it and it's already making my life easier.
|
# ? Jun 11, 2011 06:45 |
|
Drumstick posted: "Just wanted to say thanks for the exposure to powershell. I am now in love. Did a few extremely simple things using it and its already making my life easier," PowerShell is the poo poo. I have a thread in Cavern of COBOL that gets occasional attention with some examples if you want more help: http://forums.somethingawful.com/showthread.php?threadid=3286440 It's definitely a lifesaver for admin work and incredibly versatile.
|
# ? Jun 11, 2011 06:48 |
|
Help me Goons, I'm doing an upgrade in an environment by ripping out their Windows 2000 Server and Windows 2000/XP infrastructure and replacing it with Server 2008 R2 and Windows 7 Enterprise. Thanks to this thread I've got a functioning deployment server set up, doling out Windows 7 Lite Touch installs with a press of F12 and nothing more. However, I'm running into a problem pulling over the old profiles/data from the previous servers. Right now I am creating the new users in AD and setting up their profiles. After a profile is created and populated, I go back and paste in the files from the old Win 2000 profile. I've got about 100 users, so it's not impossible, it's just a lot of manual point-and-click, and I figure there's an easier way. I read some papers on USMT 4.0 but I couldn't get it functioning. The only way it would work is if I called up the USMT script from MDT 2010; it would drop the files into the migration folder on the server and error out, then die when I tried to manually run a LoadState after the machine was imaged. I was running out of time, so I started doing it manually. Is USMT what I need to use? Is there a good how-to video out there? The MDT 2010 video that was posted a few pages back was excellent.
|
# ? Jun 14, 2011 06:16 |
|
In SCCM, is it possible to use both Auto Apply Drivers and Apply Driver Package within one task sequence? Auto Apply Drivers doesn't seem to be working for me. There's no DP for imported drivers, correct? (Non-driver packages, that is.) I'm used to creating driver packages, but now I'm trying to import drivers based on some examples I read in the forums. Under the drivers section, I created folders, e.g. Display -> ATI RADEON 57XX Series, then imported the drivers. I'd imagine this is not the reason why it's not working.
|
# ? Jun 14, 2011 07:46 |
|
johnnyonetime posted: "Is USMT what I need to use? Is there a good howto video out there? The MDT2010 video that was posted a few pages back was excellent." USMT is an amazing tool, but it can take some time to set up properly, or at least it did with version 3. The 2008 R2 domain is a completely new domain, I assume?
|
# ? Jun 14, 2011 13:31 |
|
lol internet. posted: "In SCCM is it possible to use both Auto apply drivers and apply driver packages within one task sequence?" My understanding is that you are correct: imported drivers do not live on the DP. However, from this TechNet article, it looks like the drivers you want to use in the 'Auto Apply Drivers' step have to be in a driver package, it just doesn't matter which one: http://technet.microsoft.com/en-us/library/bb680990.aspx I've personally always been warned never to use auto-apply, but of course never been told why.
|
# ? Jun 14, 2011 13:36 |
|
quackquackquack posted: "USMT is an amazing tool, but it can take some time to set up properly, or at least it did with version 3." Yes, it's a completely new domain. The Windows 2000 server had a strange naming scheme, so I set up a brand new forest name.
|
# ? Jun 14, 2011 15:44 |
|
quackquackquack posted: "My understanding is that you are correct, imported drivers do not live on the DP. However, from this technet article, it looks like the drivers you want to use in the 'Auto Apply Drivers' step have to be in a driver package, it just doesn't matter which: http://technet.microsoft.com/en-us/library/bb680990.aspx" Seriously, sucks to find this out 6 months into using SCCM. I think driver packages work better anyway, since they keep everything clean. The reason I've switched is that there are just too many different hardware configurations within the organization.
|
# ? Jun 14, 2011 19:17 |
|
johnnyonetime posted: "Yes it's a completely new domain. The Windows 2000 server had a strange naming scheme so I setup a brand new forest name." I would consider the "semi-automated" method: write a script to create each of the users with dummy passwords and create their profiles on the new server, then copy the data (robocopy or similar) on top of their fresh profiles, reset their passwords, and they should be good to go. Not that much different from what you were doing already, mind you. USMT is pretty awesome though, and moving from 2k/XP to 7 I would personally want some of the behind-the-scenes magic it uses.
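That "semi-automated" method could be sketched in PowerShell along these lines. This is only a rough sketch under assumptions: the CSV format, the share paths, the server name, and the temp password are all hypothetical, and it assumes the ActiveDirectory module from the 2008 R2 RSAT tools. Dry-run it against two or three accounts before letting it loose on all 100.

```powershell
# Sketch: create users with dummy passwords, then mirror old profile data
# on top of the fresh profile shares. CSV columns assumed:
#   SamAccountName,OldProfilePath
Import-Module ActiveDirectory

$users = Import-Csv "C:\migration\users.csv"

foreach ($u in $users) {
    # Create the user, forcing a password change at first logon
    New-ADUser -Name $u.SamAccountName `
               -SamAccountName $u.SamAccountName `
               -AccountPassword (ConvertTo-SecureString "TempP@ss123" -AsPlainText -Force) `
               -ChangePasswordAtLogon $true `
               -Enabled $true

    # Copy the old Win2000 profile data onto the new profile share.
    # /E = include subdirectories, /R:1 /W:1 = don't hang forever on locked files
    robocopy $u.OldProfilePath "\\newserver\profiles$\$($u.SamAccountName)" /E /R:1 /W:1
}
```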
|
# ? Jun 14, 2011 21:26 |
|
Could use some Goon help. Currently I work in an office where the majority of the office uses Linux-based thin clients (LTSP). They work fairly well but have the occasional lockup, and generally the users hate them because they are used to either Windows or OS X at home. Therefore I have been asked to look into what it would take to convert everyone to Windows. Oh joy. Anyway, I'm much more of a Linux person than a Windows one, though I do have some experience with Windows Server 2003 and 2008 (and Exchange). What am I looking at for managing people's computers using Windows? I want to be able to have the users' machines on total lockdown and not allow them to install any sort of software (and of course do my best to combat the viruses these guys will no doubt install by clicking on things they don't understand). Can all of this be managed through Active Directory and group policies? What should I be looking at for this kind of control? It would also be nice to be able to do things like remote patch management; basically anything that would allow me to manage their computers while not being at their desk. It was so easy with LTSP clients... oh well. Ugh, not looking forward to this. Also, since we will be purchasing new computers for this, would you say in the long run it would be cheaper to purchase computers without operating systems, then contact Microsoft and get some sort of bulk install of Windows? I'll have somewhere between 30-50 desktops.
|
# ? Jun 15, 2011 20:48 |
|
Regarding Windows licensing, you'll get a better deal at your numbers by getting OEM licenses attached to your hardware. Assuming you go to Dell or something, just make sure you're getting Windows 7 Professional on the computers. Managing Windows machines is pretty dang easy. By default your users can't do much without administrative rights, and you can use Group Policy to push certain settings and preferences to the clients. This is assuming you have a proper Windows Active Directory domain set up, which you probably will want at least on a basic level to manage 50 machines. Patch management is easy with a WSUS server: approve the patches in the console and your Group Policy for patching will take care of the rest. Honestly, Windows shines when it comes to this kind of setup.
|
# ? Jun 15, 2011 21:08 |
|
Got Haggis? posted: "Could use some Goon help" If you're going to buy entirely new machines you should look into XenApp and VMware. I'm not sure what your price range is, but we've found it substantially cheaper to deploy thin Wyse Xenith terminals paired with Citrix XenApp on beefy servers. What kind of terminals are you using right now?
|
# ? Jun 15, 2011 21:14 |
|
Got Haggis? posted: "Also, since we will be purchasing new computers for this....would you say in the long run it would be cheaper to purchase computers without operating systems, then contact microsoft and get some sort of bulk install of windows? I'll have somewhere between 30-50 desktops." Can you even do this, assuming you're buying from Dell or something? Just buy them with the OS you need (which is Windows 7 Professional).
|
# ? Jun 15, 2011 21:20 |
|
sanchez posted: "Can you even do this? Assuming you're buying from Dell or something. Just buy them with the OS you need (which is windows 7 professional)" Nope. Unless something has changed, what you're generally volume licensing for client software is an upgrade + Software Assurance, so you get anything new for free. They're expecting you to buy OEM licenses.
|
# ? Jun 15, 2011 22:30 |
|
If you're going to be buying Office as well as Windows, it's probably close to as cheap to get a volume licensing deal, and it might save you money on upgrades down the road.
|
# ? Jun 15, 2011 23:11 |
|
Thanks guys. I was thinking that MS volume licensing might be cheaper, but yeah, we will need Office as well, so I will look into some cheap Dell computers. I already have a Windows Server 2008 machine set up that is acting as our Exchange server as well as being used for AD authentication for our Linux machines. I'll have to look more into Group Policy administration, I suppose. Any good resources for that?
|
# ? Jun 16, 2011 14:23 |
|
Got Haggis? posted:I already have a Windows Server 2008 machine set up that is acting as our Exchange server as well as being used for AD Authentication for our linux machines. I'll have to look more into group policy administration I suppose..any good resources for that? Take a look at the GPO megathread. BangersInMyKnickers is a wizard when it comes to GPO stuff.
|
# ? Jun 16, 2011 15:14 |
|
You don't save a lot of money by going with Volume Licensing unless you are buying large quantities. The savings come from ease of administration and centralization of management.

With OEM licensing you have to enter a key every time you install a piece of software; there's no way to automate it. You also need to keep track of all the keys, licenses, etc. Last, there's no way to get an upgrade: if something new comes out, you buy that license then and there.

Volume Licensing lets you install on multiple computers using the same key. You can go over your license agreement by a little bit and just true up. This allows for automated installs of software because the keys are static. Licensing is tracked through your agreement with Microsoft, and any licensing partner can look up your license counts and availability for you. You can purchase Software Assurance, which is about 15% of the original price tacked on; this allows for unlimited upgrades to new versions of the software so long as you keep it (think Windows XP Pro to Windows 7 Pro, or Office 2007 Pro Plus to Office 2010 Pro Plus). Over the long run you can save money, but it takes a larger investment up front. If you check a 4+ year budget spread you will see the savings.
|
# ? Jun 16, 2011 15:18 |
|
Your management team is going to be super happy when they see the licensing bill for that new windows environment, too!
|
# ? Jun 16, 2011 15:27 |
|
Just started playing with MDT2010 the other day and it seems pretty slick so far. Haven't set up much besides a basic Win7 + Office 2010 deployment, but I was impressed with how little time it took to set up (~2.5 hours including installing WDS/MDT/WAIK, setting everything up, and testing the deployment to a VM.) One tip for anyone that's just starting out with MDT: I've been looking into it passively for a while now, and the amount of info out there was a bit overwhelming for someone who has never used anything other than Ghost. I'd recommend grabbing the MDT step-by-step ebook from here; it'll step you through getting a basic MDT environment set up as a proof of concept.
|
# ? Jun 16, 2011 23:59 |
|
chizad posted: "Just started playing with MDT2010 the other day and it seems pretty slick so far. Haven't set up much besides a basic Win7 + Office 2010 deployment, but I was impressed with how little time it took to set up (~2.5 hours including installing WDS/MDT/WAIK, setting everything up, and testing the deployment to a VM.)" I have always used ImageX.exe to take a clone of a Windows install + sysprep + OOBE; what does MDT really do differently?
|
# ? Jun 17, 2011 00:02 |
|
Corvettefisher posted: "I have always used ImageX.exe to take a clone of a windows install + sysprep + oobe, what does MDT do really different?" I'm still very much getting my head around what all MDT can do (and how I want to use it), so this may not be the best explanation. (Those of you more familiar with MDT are more than welcome to jump in and help me out.) In a nutshell, MDT (built on top of Windows Deployment Services) provides a lot more flexibility and options than just capturing an image with ImageX/Ghost/etc. and deploying it to the machine. It also provides a lot of canned scripts/tasks to automate deployment/refresh tasks so you don't have to waste time building that stuff yourself (more on this later). From what I understand, MDT doesn't do a whole lot that you couldn't already do before; it just automates things so you don't have to mess with the details.

From what I've read, a lot of the usefulness of MDT will depend on the specific organization. Where I work there's a baseline of software every machine gets, software that everyone in a given department gets, and then stuff that's on a case-by-case basis, so right now our images only have that baseline and everything else gets loaded manually. Being able to automate a lot of that stuff will mean less time spent babysitting software installs and more time doing other stuff. We also have IT staff deploying machines in three different locations, so we have the challenges of 1) making sure we all have the latest images and 2) making sure the manual post-imaging tasks are done by all of us in exactly the same way. The linked deployment shares (I make changes here and push them out to servers in the other offices) would eliminate those problems and make it so that no matter who does the build, every machine that goes out has the exact same config. If you do refreshes on a fixed cycle (we don't), every employee gets the exact same set of applications, and you only have one location/all the people doing deployments are in the same place, then obviously you don't care about any of that stuff.

- This isn't unique to MDT, but there are a couple of different approaches you can take with your images. You can do a traditional thick image that has Windows + patches + your standard software. You can do a thin image that's just Windows + patches and then use the logic MDT provides to automate the installation of everything else. Or you can do a hybrid approach where the image is Windows + patches + only the software that absolutely every single PC in the organization needs, and then use logic to deploy the rest of the software based on department/location/etc. IMO a lot of the cool stuff about MDT only really comes into play if you're using thin or hybrid images.

- With the thin image approach, maintaining your image(s) becomes a lot easier, especially if you decide to have images for each model with the drivers already installed. Instead of deploying the image to a reference machine of model X, doing the updates, and then capturing a new image, all you'd have to do is update the application install folder you've got MDT set to use, and everything you deploy afterwards has those updates.

- If you've got multiple models of machines in your environment, you can load all the different drivers into MDT and then let it figure out the rest (at least theoretically; I haven't tested this, so I don't know how well it actually works). So conceivably you'd have one base image (two, if you're doing a mix of x86 and x64 builds) that you could deploy to any machine you want; MDT would deploy the image and then install the right drivers automatically. And if you don't trust the automatic detection, you can also set up groups of drivers for different models. Then you would still have one/two base images, plus a deployment task for model X that installs X's drivers, another task for model Y that installs Y's drivers, etc.

- It also has logic built in for refresh (new hardware)/upgrade (new OS on same hardware) tasks. In the case of an upgrade, MDT runs the User State Migration Tool to capture all the user's data and save it to the network, and can optionally take a full backup of the machine. It then deploys the image, does whatever other post-imaging tasks you have set up, and restores the user's data to the machine. Similarly, with a hardware refresh it grabs the user's data from the old machine and saves it to the network, does the deployment to the new machine, and then restores all their data.

- If you've got it set up so PCs can PXE boot from the server, you can (if you want to) make deployment tasks available to end users. I don't see myself ever using this, but I'm guessing it's so you can walk the end user through reloading their machine instead of sending a tech out to do it. With PXE boot in general you can also prestage PCs in AD so only the machines you choose are able to PXE boot from the server.

- If you have multiple deployment servers for different locations, you can set up linked deployment shares (the term for where the images/applications/drivers/etc. are stored so the clients can access them). One person/team can maintain either just the baseline images/applications/etc. or all of them, and have the changes replicated to everyone else.

- There are some other common tasks MDT can help automate: run Windows Update after the image is deployed, and optionally again after all applications have been installed (for stuff like Office updates or security updates for an MS component installed by one of the applications); join the machine to the domain, including putting it in the right OU; enable BitLocker; check the specs of the system (CPU/RAM/disk space, and make sure it currently has a client OS installed) and only do the deployment if it meets whatever minimum requirements you set up.

Phew, that's a lot, but I hope it helps.
|
# ? Jun 17, 2011 03:58 |
|
I have about 1500 user folders that need to be cleaned off my NAS. However, a few of them have removed the admin account from their folder's security. I tried to take ownership of the folder they all belong to, but I'm getting errors saying that there is not enough space on the disk, and I can't add myself to their folders until I take ownership. Not all of the folders are like this. Any ideas to quickly get rid of these files?
|
# ? Jun 21, 2011 18:42 |
|
Drumstick posted:I have about 1500 user folders that need to be cleaned off my nas. However, a few of them have removed the admin account from their folders security. I tried to take ownership of the folder they all belong too but im getting errors saying that there is not enough space on the disk. And I can add myself to their folders till I take ownership. Not all of the folders have done this. Any ideas to quickly get rid of these files? This sounds like a job for... powershell code:
There is a nice scripting guys article that covers most of this: http://blogs.technet.com/b/heyscriptingguy/archive/2008/04/15/how-can-i-use-windows-powershell-to-determine-the-owner-of-a-file.aspx adaz fucked around with this message at 19:27 on Jun 21, 2011 |
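The Get-Acl approach from that Scripting Guys article boils down to something like the sketch below. This is a sketch only: the NAS path is made up, and (as the follow-up posts in this thread show) Get-Acl will throw "Attempted to perform an unauthorized operation" on any folder where the admin account has been stripped of read permissions.

```powershell
# List the owner of each user folder so the ones that no longer have
# BUILTIN\Administrators as owner stand out.
Get-ChildItem "\\nas\users" |
    Select-Object Name, @{Name = 'Owner'; Expression = { (Get-Acl $_.FullName).Owner }}
```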
# ? Jun 21, 2011 19:17 |
|
When I'm running it I'm getting this error: Get-Acl : Attempted to perform an unauthorized operation
|
# ? Jun 21, 2011 20:16 |
|
Drumstick posted: "When im running it im getting this error:" If you're using Windows 7/Vista with UAC (gently caress uac), make sure you are launching PowerShell as an administrator.
|
# ? Jun 21, 2011 20:21 |
|
Yep, it is launched as administrator and the same error is happening. Here is how I have this set up; I probably have something wrong. code:
$user = New-object system.security.principal.ntaccount("domain.local","adminaccount")
$dirs = Get-Acl \\server\folder\*
foreach($dir in $dirs) {
    if($dir.owner -notmatch "\\builtin\administrators"){
        $dir.SetOwner($user)
        Set-Acl -aclobject $dir -path $dir.path
    }
}
|
# ? Jun 21, 2011 20:27 |
|
Nope, I don't think you are doing anything wrong; the behavior wasn't what I expected. I had to reproduce the error myself by removing myself from all permissions and setting a different owner than BUILTIN\Administrators. Hrrm, this is going to be more difficult than I thought with a script. Give me a few.
|
# ? Jun 21, 2011 20:54 |
|
Drumstick posted: "I have about 1500 user folders that need to be cleaned off my nas. However, a few of them have removed the admin account from their folders security." Quick question: why did they have the rights to modify the folders' permissions?
|
# ? Jun 21, 2011 21:46 |
|
This was surprisingly annoying, but it'll probably come up again for me at some point. Essentially, since you can't read the ACLs, most of the cmdlets will fail, or alternatively attempt to automatically set the owner for you but fail anyway, since the type you are passing to them won't be in the ObjectSecurity format it needs. code:
Also, strictly speaking, takeown.exe is probably easier but I got annoyed and wanted to do it in powershell adaz fucked around with this message at 21:55 on Jun 21, 2011 |
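For anyone who hits this later, the takeown.exe route mentioned above looks roughly like this. A sketch under assumptions: the NAS path is hypothetical; both tools ship with Vista/2008 and later, and you'd run them from an elevated prompt.

```powershell
# Force ownership of every folder to the Administrators group (/A),
# recursing into subfolders (/R) and answering Y to each
# "take ownership of this directory?" prompt (/D Y)
takeown /F \\nas\users /R /A /D Y

# Then re-grant Administrators full control so the folders can be deleted.
# (OI)(CI)F = full control, inherited by files and subfolders; /T = recurse
icacls \\nas\users /grant "BUILTIN\Administrators:(OI)(CI)F" /T
```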
# ? Jun 21, 2011 21:50 |
|
Moey posted: "Quick question, why did they have the rights to modify the folders permissions?" The guy before me set everything up. He's also no longer working here. My last month was spent unfucking everything. It's been a mess; the more I dig into things, the more I find to correct. Thank you, adaz! I'll check it out when I get in tomorrow. I really appreciate the help.
|
# ? Jun 21, 2011 22:01 |
|
Can anyone recommend something to manage software registrations/serials? My outfit is currently storing hundreds of pieces of software with thousands of registration records for various employees. When a computer blows up or an employee leaves, our records (a Word document/text file) instantly become incorrect. We have a piece of junk software that tracks the initial purchase of the software and who it was bought for, but there's no way for us to track any transfers of ownership our HelpDesk team might have to do when a computer breaks or an employee leaves. Free would be awesome, but we can afford to spend.
|
# ? Jun 22, 2011 06:10 |
|
How do I document my domain's GPOs? The documentation I need to create is intended for my only-semi-technical managers, and possibly the not-yet-competent tech they hire to replace me when I leave. If they see a setting that's greyed out in Outlook, they need to be able to easily find out three things: 1) a GPO was used to configure that setting, 2) the reason it was set, and 3) what GPO it is in/security group considerations for that GPO. I want to create an extremely thorough doco, but I am profoundly poo poo at documentation. How have you done yours? Just dumping the settings .htm is not good enough in this case.
|
# ? Jun 22, 2011 11:47 |
|
Swink posted: "How do I document my domains GPOs?" What I'd probably do is go through each GPO you have and ask:
- Who does this affect?
- What does it do?
- Why is it in place?
Then once you've done that, if you're still feeling enthusiastic, cross-index it by "what does it do". If that's too much effort, just categorise them (i.e. 'Mail', 'Printers', 'Login', etc.)
|
# ? Jun 22, 2011 12:16 |
|
Speaking of GPOs and AD, how does everyone organize their computer accounts in AD? Do you just leave client computers in the Computers OU, or do you separate them out into different OUs? I already have the Computers OU and one I created for servers, but I was thinking about creating Clients, Tablets, and Old Computer Accounts OUs to sort things out a bit more. The Old Computer Accounts OU would just be a dropbox for accounts that aren't active anymore but need to be tested before they are deleted.
|
# ? Jun 23, 2011 01:57 |
|
Our AD is broken down into geographical OUs, and then further departmental/functional OUs under that. I personally like separating things as much as is reasonable; it makes life easier on me. It makes GPOs easier as well, since you're usually applying a policy to a group of folks.
|
# ? Jun 23, 2011 02:55 |
|
|
Machines on the network get blobbed into one OU, machines off the network lobbed into another. Servers are separated out more; they have 6 or so OUs: one for DCs, one for virtual machines, one for Citrix boxes, etc. We don't really use too many group policies, though, and they tend to be universal to all users/computers. I can see why you'd want to separate it out a bit more, but I've seen some orgs with incredibly complicated OU structures that make things a bit confusing.
|
# ? Jun 23, 2011 04:10 |