|
I don't know why you guys are working so hard on delegation, I'm the only enterprise admin
|
# ? Jan 24, 2014 05:36 |
|
|
|
Pffft, look at these scrubs that aren't schema admins.
|
# ? Jan 24, 2014 06:42 |
|
At the simplest level, do the delegation by going to the OU everybody is in and delegating Full Control to some "AD Administrators" group you create, then put everybody into that group. That's functionally identical to people being Domain Admins, so nobody will be able to bitch about not having the rights to do their AD jobs, but you have the certainty they can't do additional stuff like shut down DCs and move FSMO roles. You can always fine-grain it down further by location and job role later (aka never, but the important part of the job has been done).
|
# ? Jan 24, 2014 12:46 |
|
Anyone done a server-side mailbox archive on Exchange 2010 SP3 before? I'm cleaning a few (dozen) ex-employees' mail off our Exchange server. I'm running:

New-MailboxExportRequest -Mailbox XYZ -FilePath \\unc\archive\xyz.pst

and for every few users I try I get:

The operation couldn't be performed because object 'XYZ' couldn't be found on 'domaincontroller.domain.local'. + CategoryInfo : NotSpecified: (0:Int32) [New-MailboxExportRequest], ManagementObjectNotFoundException + FullyQualifiedErrorId : D3736D58,Microsoft.Exchange.Management.RecipientTasks.NewMailboxExportRequest

...when I'm clearly looking at the user object XYZ while logged into domaincontroller.domain.local. I've tried adding the -DomainController flag pointing at our other domain controller, but I get the same error. Then for the next user on my list it'll work just fine. I look at the users and they're identical in every way. They're both marked "disabled", etc. I've tried re-enabling them, adding them to weirdo groups, etc. Tried using the display name instead of the shortname, their @domain.local ID, their @domain.com email address.

edit: I'm a big dumb babby who didn't bother to check whether the user had a mailbox in the first place. The users in question aren't set up in Exchange at all. loving DUH. In my defence, that's literally the worst error message you can throw when you mean to say "I can't find a mailbox for user XYZ"

some kinda jackal fucked around with this message at 23:22 on Jan 24, 2014 |
# ? Jan 24, 2014 19:37 |
|
Couple SCCM 2012 R2 questions: 1. For applications, is there any way to actually change the "application size" after it's created? In the app catalog, everything's now reporting as 1MB, heh. Having a hard time finding that field. 2. I have a VBScript that enables wake-on-LAN; what would be the best way to detect that? I'm thinking have the VBScript create a text file, and the detection rule would check for that. Just curious if there's a better way to approach this scenario.
|
# ? Jan 29, 2014 02:40 |
|
So, we finally got a non-zero budget this year and ordered 200 workstations to replace our Windows XP boxes. Since we've never ordered in bulk before, I took the initiative to set up a WDS server so we can sysprep and capture an image. This is my first time doing this, but I believe I have everything ready to go. I put the computer in audit mode, configured it and installed apps, then I sysprepped it and captured the image to WDS. Then I downloaded the Windows 8.1 ADK and built some unattended answer files. I did a test deployment and everything works great. My question is: what is MDT, and why do people say to use WDS + MDT? I haven't touched MDT yet, but what benefits will it give me over just using WDS + ADK? edit: These are the tutorials I followed. Part 1: http://www.petenetlive.com/KB/Article/0000735.htm Part 2: http://www.petenetlive.com/KB/Article/0000737.htm Part 3: http://www.petenetlive.com/KB/Article/0000738.htm kiwid fucked around with this message at 15:34 on Jan 29, 2014 |
# ? Jan 29, 2014 15:30 |
|
Think of WDS as the service that handles the actual deployment - network booting, pushing the images out etc. MDT is a nice way of interfacing with WDS in terms of driver packaging, and ties it in with System Center if that's your bag.
|
# ? Jan 29, 2014 15:35 |
|
Caged posted:Think of WDS as the service that handles the actual deployment - network booting, pushing the images out etc. MDT is a nice way of interfacing with WDS in terms of driver packaging, and ties it in with System Center if that's your bag. So with MDT, would I technically be able to build one image for a range of different models of desktops and then through MDT inject the drivers, or would I still be doing separate images for each model?
|
# ? Jan 29, 2014 15:40 |
|
You don't have to bother with an image if you don't want to - you can pull the install.wim straight off the OS media and then deploy software in SCCM if you want. It's a trade-off between speed of imaging and flexibility. But you are right, you don't need to make an image per system type any more; just have the drivers for each system on your server and it will use them.
|
# ? Jan 29, 2014 15:47 |
|
Unfortunately we don't have SCCM yet.
|
# ? Jan 29, 2014 16:31 |
|
SCCM is a beast as you can see with all the SCCM questions that aren't getting answered.
|
# ? Jan 29, 2014 17:00 |
|
Anyone on SCOM 2012 or 2012 R2? I'm curious how you have it set up, VM vs physical, SAN vs local disks, etc. I'd be looking at monitoring 600+ servers.
|
# ? Jan 30, 2014 00:51 |
|
I need a scriptable, enterprise-level FTP product with a maintenance contract, under $2000. We have outgrown our script-kiddie implementation of CuteFTP's console support and it's becoming painful. Another department has a license + maintenance for Globalscape EFT, which is something like $20K MSRP, and it's not looking like they're willing to share. Caveats: need both SFTP and FTPS, as well as FTP, along with the standard suite of encryption ciphers. I guess I *could* just script stuff through PuTTY for the SFTP and continue using script-kiddie junk for the handful of FTPS connections we do. I'm tempted to have my friend sell our company a "maintenance contract" for the open-source PuTTY
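For what it's worth, if you do end up scripting it yourself, explicit FTPS is covered by Python's standard library out of the box - a minimal sketch (host, credentials, and paths here are placeholders; SFTP would need a third-party library like paramiko, which isn't shown):

```python
# Sketch: scripted upload over explicit FTPS (AUTH TLS) using only the
# Python standard library. All connection details are made-up examples.
import ftplib


def upload_ftps(host, user, password, local_path, remote_name):
    """Upload one file over explicit FTPS."""
    with ftplib.FTP_TLS(host) as ftp:
        ftp.login(user, password)
        ftp.prot_p()  # encrypt the data channel, not just the control channel
        with open(local_path, "rb") as f:
            ftp.storbinary(f"STOR {remote_name}", f)


# Example call (hypothetical server):
# upload_ftps("ftps.example.com", "svc_ftp", "hunter2",
#             r"C:\exports\batch.csv", "batch.csv")
```

It's no substitute for a supported product with a maintenance contract, but it beats shelling out to CuteFTP's console from a batch file.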
|
# ? Jan 30, 2014 01:21 |
|
Does WS_FTP Server fit the bill?
|
# ? Jan 30, 2014 01:26 |
|
WDS is just a service that allows you to PXE boot; MDT actually allows you to deploy an OS. It builds a bootable disc which connects to the MDT server to finish the rest of the deployment. You take that boot disc and pass it off to WDS. WDS gets a PXE request, and it throws that bootable MDT disc at the machine that is trying to boot off LAN. Also, in regards to Caged's comment... the ideal MDT setup in my opinion would be:

1. You hit F12 on a computer to PXE boot off LAN
2. The MDT boot environment loads (WDS passes the boot environment to the computer PXE booting)
3. The wizard starts; it asks you what the hostname is
4. Asks you what OS you want to install (i.e. Win7, Server 2008, etc.)
5. Asks you what apps you want to install (just check off whichever apps; you can group apps, e.g. a "base install" checkbox that installs Office and Skype)
6. MDT formats the drive and installs a blank OS that has nothing on it (aka install.wim)
7. MDT scans the computer and applies the necessary drivers (from a driver database where you imported the drivers previously)
8. MDT then installs each application one by one
9. Computer deployment done

There are ways to hack it together to be fully automated, but SCCM is the actual fully automated approach. To be honest, there's not that much user involvement with MDT when booting; it takes less than 2 minutes. The pro of this setup is that in the long run it will save you time from maintaining multiple images and adding/removing different software in the deployment. The con is it will be time consuming at the beginning, and you need to learn a bit about scripted/silent installs.

edit: Oh, to answer your actual question: I've never even heard of the WDS + ADK option. WDS is pretty easy to set up... there's almost no configuring. You just add the role.

GreenNight posted:SCCM is a beast as you can see with all the SCCM questions that aren't getting answered.

Seems like it. I have 2 years under my belt for 2007, new to 2012, but yeah, just mind boggling. I also actually have a couple more which I now have forgotten.

lol internet. fucked around with this message at 02:23 on Jan 30, 2014 |
# ? Jan 30, 2014 02:03 |
|
Hadlock posted:I need a scriptable, enterprise level FTP product, a maintenance contract and under $2000. We have outgrown our script kiddie implementation of CuteFTP's console support and it's becoming painful. The only thing I'm personally familiar with in this space is Sterling Connect:Direct, which is now owned by IBM. It's also crazy expensive, as are most EDI software packages. Try looking for EDI SFTP and see what pops up; you might be able to find something. EDI is the keyword that will probably help you find what you want.
|
# ? Jan 30, 2014 02:12 |
|
Caged posted:Does WS_FTP Server fit the bill? I'd seen that, but the price is a little high. We might be able to swing it, though.
|
# ? Jan 30, 2014 02:39 |
|
double postin'~ Yeah, we are using S:CD right now and the company is scrambling to get off of it, as IBM is raising the maintenance prices by about 15% a year; also gently caress everything about NDM, I hope it dies in a fire. NDM lines are super expensive too, so everything we convert to SFTP saves us mid-5-figures a year once you figure in DR duplication costs.
|
# ? Jan 30, 2014 03:33 |
|
lol internet. posted:Cons is it will be time consuming at the beginning, and you need to learn a bit about scripted\silent installs Also this isn't actually a con, because "built the entire imaging infrastructure and process from the ground up for X number of workstations. Saved the company Y man hours and $Z a month." is a great resume bullet point edit: quoted wrong thing at first
|
# ? Jan 30, 2014 03:38 |
|
kiwid posted:My question is, what is MDT and why do people say to use WDS + MDT? I haven't touched MDT yet but what benefits will it give me over just using WDS + ADK? MDT gives you automation and dynamic deployment. Dynamic, meaning different PCs will come out differently based on rules that you define. You can make certain PCs deploy with different apps, for example. Also, you can use a single image and task sequence for all models by making it select different driver packages. By automation, I mean you are practically scripting the whole build process, which runs on every deploy. It can start with a clean DVD install, then it adds all the settings and packages you want on top of that. This way, if there's something you want to change in the build, you can make adjustments to the task sequence, rather than go the old-school way of manually installing, clicking stuff, capturing, then re-adjusting and re-capturing, then finding out your build is screwed beyond repair, then starting all over again from scratch. For another example, you can automate naming and domain joins, so all PCs you deploy can be ready to use without having to touch them. For 20 PCs it's probably not worth the effort; you're gonna have to pour time into making every part come out the way you want. But it's pretty useful for hundreds, and SCCM is even better for thousands.
|
# ? Jan 31, 2014 19:59 |
|
If the 20 PCs are identical it's probably worth doing. Good to have in 6 months' time when you need to flatten and reinstall due to malware or something.
|
# ? Jan 31, 2014 23:44 |
|
Docjowles posted:Also this isn't actually a con, because "built the entire imaging infrastructure and process from the ground up for X number of workstations. Saved the company Y man hours and $Z a month." is a great resume bullet point Possible con; depends on how the person views it, of course. I've done it about 5 times on SCCM/MDT, so I don't even bother with the man-hours savings on my resume. Anyways, remembered my other SCCM question: for application packages, is there any way to access the content directly when deploying the application to a machine? The default SCCM client cache size is 5 GB, which causes problems for the CS5.5 Design Suite (7 GB) and the AutoCAD design build (30 GB). You can change the cache size manually, but it's not a realistic option for AutoCAD.
|
# ? Feb 1, 2014 03:47 |
|
lol internet. posted:Possible con, depends on how the person views it of course. I've done it about 5 times on SCCM/MDT so.. I don't even bother with the saving man hours on my resume. No. SCCM does some sort of deduplication of Application data on distribution points that makes it impossible to execute content directly from the DP. You can up the cache though. I set mine to 10GB and haven't had any problems. There should be a way using Compliance Settings to increase the cache size across your whole environment easily.
|
# ? Feb 1, 2014 06:23 |
|
Anyone want to critique my driver management logic? We've got over 50 computer models that we support, and are doing the total control method of driver management in SCCM 2007. What does this mean? We have 4000 drivers accounting for 20 GB of space. Because we're doing total control, we have lots of repeat drivers. Querying the database shows 2700 potential duplicates: exact same inf name, version, model info, and manufacturer. However, there are a couple of different issues:

-Drivers are often grouped together since they will use some of the same reference files. Simply separating each driver with its referenced files results in a ton of duplicate files; no space is actually saved.
-Those duplicate drivers? They may not be duplicates. Manufacturers are lazy with their infs. All the inf info may be the same, but the files may be different.

So, I'm thinking my process will be:

1. Get all inf info from the database, including SCCM categories and driver packages.
2. Get the list of referenced files in each inf.
3. Get the hashes of each referenced file.
4. Check whether or not there is another inf in the inf's directory.
5. Find duplicates by checking inf info and hashes.
6. For duplicate drivers, join the SCCM category and driver package info.
7. Make a copy of the directory if there are multiple drivers.
8. Mark all drivers in said directory as done.
9. Mark any duplicates as done.
10. Go through the remaining drivers not marked as done.
11. Then it's just a fairly simple process of scripting the import of drivers to a new SCCM environment using the source, category, and driver package info.

It's a huge, terrible, convoluted process, but I'm not sure if there's a better solution?
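The hashing and duplicate-detection steps could be sketched roughly like this. It's a rough sketch only: it assumes each driver sits in its own folder with one inf plus its payload files, and it treats every non-inf file in the folder as that inf's payload, whereas a real version would parse the referenced files out of the inf itself:

```python
# Sketch: group driver folders whose inf name + payload file hashes match.
# Assumes one inf per folder; a real tool would parse file references
# from the inf instead of hashing everything beside it.
import hashlib
import os
from collections import defaultdict


def file_hash(path, chunk=65536):
    """SHA-256 of a file, read in chunks so big .sys files don't eat RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk)
            if not block:
                break
            h.update(block)
    return h.hexdigest()


def find_duplicate_infs(root):
    """Map (inf name, payload hashes) -> list of folders; keep only repeats."""
    groups = defaultdict(list)
    for dirpath, _, files in os.walk(root):
        infs = [f for f in files if f.lower().endswith(".inf")]
        if not infs:
            continue
        payload = sorted(f for f in files if not f.lower().endswith(".inf"))
        hashes = tuple(file_hash(os.path.join(dirpath, f)) for f in payload)
        for inf in infs:
            groups[(inf.lower(), hashes)].append(dirpath)
    return {key: dirs for key, dirs in groups.items() if len(dirs) > 1}
```

Folders that land in the same group are byte-identical candidates for step 7 onward; infs with matching metadata but different hashes (the lazy-manufacturer case) fall into separate groups automatically.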
|
# ? Feb 2, 2014 23:05 |
|
To be honest, 2007 for me was a gong show when it came to drivers. No matter what you do, it will never be organized properly. I just ended up dropping everything into one directory and importing it, added it to a package called All Drivers, and auto-apply drivers worked fine for me. I don't see the point of total control, due to the duplicate issues, as SCCM doesn't let you import duplicates. It will never be perfect. In 2012, driver management is a lot better, as you can create folders in the SCCM console under the drivers section and have duplicate drivers in the database. edit: If you want to cut down on size, your best bet would be to use auto-apply drivers. Rebuild your driver store by importing drivers one by one and importing only the missing drivers on the next machine, i.e. import drivers from machine 1 and test, then test machine 2, see which drivers are missing, and import those. (Perhaps drivers from machine 1 also cover machine 2.) I ended up doing it this way. lol internet. fucked around with this message at 23:18 on Feb 2, 2014 |
# ? Feb 2, 2014 23:12 |
|
So here's a secret to 2007 driver management: you don't have to import them into the console. The only drivers you need to import are drivers you need to inject into a boot image (so network, storage, maybe USB3). Here's what you do. 1) Create a driver package for each model of computer you manage. Part of this will be setting a folder to store the driver package; each driver package should have its own unique folder. 2) Put all your driver files into that folder. We have Dells, so I can just dump the extracted driver cab for the model into the folder, but you should be able to figure out a way to do it. 3) Push the driver package to distribution points, and use it. The secret here is that when you push to the distribution point, it just copies all the files. In the OSD, it just injects those files into the driver store for the OS to find. Note this doesn't work in 2012, because of the way it manages files, but it's also much easier to manage drivers in the console with 2012.
|
# ? Feb 2, 2014 23:21 |
|
^^^ Yeah, this is exactly how we've been doing it. And it's resulted in so much bloat. lol internet. posted:To be honest, 2007 for me was a gong show when it came to drivers. No matter what you do, it will never be organized properly. I just ended up dropping everything into one directory and imported it. Added to a package called All Drivers and auto-apply drivers worked fine for me. I don't see the point of total control due to the duplicate issues as SCCM doesn't let you import duplicates. It will never be perfect. Yeah, this is all basically hoping to start clean in 2012. I wish I could do auto-apply, but we would run into issues of newer driver versions being applied to models where the manufacturer doesn't support that driver version. A big no-no in our environment. Also, we use standalone media quite heavily, which means we have to keep model-specific driver packages. Ooof. Normally I wouldn't mind the bloat that much, but with such a heavy reliance on DVD standalone media, 10 GB of drivers (the other 10 is XP drivers) is making the OSD experience a pain. Edit: But I guess I won't get rid of the bloat in standalone, will I? Since each package will have its own copy of the driver. Sudden Loud Noise fucked around with this message at 23:34 on Feb 2, 2014 |
# ? Feb 2, 2014 23:22 |
|
spidoman posted:Anyone want to critique my driver management logic? Save yourself some trouble and don't make drivers more complicated than they already are. 20 GB is cheap, and that's actually pretty good for 50 models. I'd let it go, considering the amount of effort it takes to make things efficient. If anything, you might want to cut down on unused drivers IF some driver packages get big enough to slow down deploys. But even then, it's not worth doing that 50 times. The best way to stay out of trouble is to download all the drivers for your model from the OEM, unpack them, then import them into a new driver package for that model. Take them at face value - the OEM says you should use those files for that model, and you can waste lots of time by second-guessing them. Then test-deploy that model. If it works, call it done. I personally try to delete un-needed driver files when I'm importing (32-bit folders, etc.), but it's only worth it if your downloads include extra GPU drivers that you don't need, only because those tend to be hundreds of MB. I'm experienced with SCCM 2012, so your experience might be more difficult. I think 2012 actually cross-references duplicate driver files in its database, so that's one thing you can look forward to in the future.
|
# ? Feb 3, 2014 01:40 |
|
.
Hadlock fucked around with this message at 06:57 on Feb 3, 2014 |
# ? Feb 3, 2014 04:05 |
|
What I found is that many driver packages as downloaded from the manufacturer are extremely oversized. I think I once saw a 50MB ZIP download that consisted of 700kB of drivers and 49.3MB of poo poo nobody needs. So you can definitely slim down your packages if you want to do the work.
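One low-effort way to see where the bloat lives before importing: walk the extracted package and tally bytes per extension. The PnP-relevant files (.inf/.sys/.cat plus co-installer DLLs) are usually the small slice; the .exe/.msi installers, docs, and utilities are the rest. A quick sketch (no SCCM involvement, just a filesystem scan):

```python
# Sketch: tally bytes per file extension under an extracted driver
# package, to spot installer/doc bloat vs. actual driver files.
import os
from collections import Counter


def size_by_extension(root):
    """Return a Counter mapping file extension -> total bytes under root."""
    sizes = Counter()
    for dirpath, _, files in os.walk(root):
        for name in files:
            ext = os.path.splitext(name)[1].lower() or "<none>"
            sizes[ext] += os.path.getsize(os.path.join(dirpath, name))
    return sizes


# Example (hypothetical path):
# for ext, total in size_by_extension(r"C:\drivers\model-x").most_common():
#     print(f"{ext:>8}  {total / 1024:10.1f} KB")
```

If .exe and .msi dominate the totals, that's the 49.3MB of poo poo you can usually delete without affecting OSD driver injection - though it's worth a test deploy after trimming, since a few packages do run installers for a reason.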
|
# ? Feb 3, 2014 19:04 |
|
I have a GPO question for y'all. I've been tasked with pushing out our corporate wifi via GPO. I created it, went to Computer Configuration -> Policies -> Windows Settings -> Security Settings -> Wireless Network (802.11) Policies, and did my thing. Everything looks good. I linked it to our Users OU, removed the Authenticated Users group, and applied it to my account for testing. My test laptop shows, via gpresult /h, the new wireless GPO as Denied. Reason: Empty. The gently caress? The settings are clearly there in the GPMC.
|
# ? Feb 3, 2014 20:48 |
|
Nevermind, I'm an idiot. A computer policy won't work if you only link it to a User OU. Duh.
|
# ? Feb 3, 2014 21:16 |
|
Was just gonna post that. If you enable loopback processing, you can apply a computer policy to a user object.
|
# ? Feb 3, 2014 21:19 |
|
Yeah, I just applied it to the entire desktop OU and added my test PC in the Security Filtering. What's better when we go live - add the Domain Computers group, or apply loopback processing and add the Authenticated Users group?
|
# ? Feb 3, 2014 21:22 |
|
peak debt posted:What I found out is that many driver packages as downloaded from the manufacturer are extremely oversized. I think I once saw a 50MB ZIP download that consisted out of 700kB of drivers and 49.3MB of poo poo nobody needs. So you can definitely slim down your packages if you want to do the work. My favorite: Bluetooth driver package 450MB. Actual driver size: 3MB.
|
# ? Feb 3, 2014 21:25 |
|
AFAIK you can't selectively loopback process a single GPO, so just be aware that turning loopback processing on turns it on for all the user objects in the OU that you are using it on.
|
# ? Feb 3, 2014 21:26 |
|
Demie posted:The best way to stay out of trouble is download all drivers for your model from the OEM, unpack them, then import them into a new driver package for that model. Take them at face value, the OEM says you should use those files for that model, and you can waste lots of time by second-guessing them. Then test deploy that model. If they work, call it done. Lenovo has "SCCM packages" of drivers for most of their models, which is a nice thought. Except they have a bunch of "gotcha"s where you have to install one driver before the other etc. And the touchpad driver made OSD crash.
|
# ? Feb 3, 2014 22:29 |
|
zapateria posted:Lenovo has "SCCM packages" of drivers for most of their models, which is a nice thought. Except they have a bunch of "gotcha"s where you have to install one driver before the other etc. And the touchpad driver made OSD crash. Dell does the same thing http://en.community.dell.com/techcenter/enterprise-client/w/wiki/2065.dell-driver-cab-files-for-enterprise-client-os-deployment.aspx
|
# ? Feb 3, 2014 22:31 |
|
Nebulis01 posted:Dell does the same thing zapateria posted:Lenovo has "SCCM packages" of drivers for most of their models, which is a nice thought. Except they have a bunch of "gotcha"s where you have to install one driver before the other etc. And the touchpad driver made OSD crash. I've personally never run into these driver issues (HP\Lenovo\custom-built PCs) which I always hear people have, and I've always used auto-apply in both 2007 and 2012. I don't bother downloading driver packs from the manufacturer's website. When you get a new laptop from Dell\HP\Lenovo, there's the driver database in C:\SWSHARE, which I just import. Yeah, there are probably some outdated drivers, but meh. On a side note, since I'm replying: has anyone found any advantages or real-life implementation examples of OSD VHD and App-V in SCCM 2012 R2? And I finally got a deployment strategy for software updates
|
# ? Feb 4, 2014 02:02 |
|
|
|
One issue that was 100% reproducible was that if you added the touchpad driver for the 2530p to SCCM, all installations of 2510p laptops bluescreened. I had to edit the inf file for that driver to remove the hardware ID of the 2510 device to get that to work.
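That kind of inf surgery can be scripted if you end up doing it for more than one model. A crude sketch (the hardware IDs below are made up; it simply drops every line mentioning the ID rather than properly parsing the decorated [Models] sections, and remember that editing a signed inf invalidates its catalog signature, so your OSD environment has to tolerate unsigned drivers):

```python
# Sketch: remove model lines referencing a given hardware ID from inf
# text. Crude line filter, not a real inf parser; IDs here are examples.
def strip_hardware_id(inf_text, hwid):
    """Return inf text with lines mentioning hwid removed (case-insensitive)."""
    needle = hwid.lower()
    kept = [line for line in inf_text.splitlines()
            if needle not in line.lower()]
    return "\n".join(kept)
```

Run it over the offending touchpad inf with the 2510p's device ID and the 2530p driver stops claiming hardware it bluescreens on.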
|
# ? Feb 4, 2014 16:25 |