|
I'm having a really weird problem with SCCM 2007 and Software Updates. I've created a package for Office 2010 updates and the clients just won't detect the updates. SCCM says that the clients need them but the clients themselves don't seem to detect them at all. It's probably something really simple and stupid but I've just been banging my head against a wall with it for a day now and I'm not getting anywhere. Anyone have any ideas? e: I should mention that other software updates appear to be working fine.
|
# ¿ Sep 15, 2011 23:11 |
|
I've been forcing the client to update and watching the logs. The regular Windows updates show up as installed but the Office ones never show up in the logs at all. I have a distribution point and the update package is installed on it. It's set up exactly the same as my other package for regular Windows updates so I'm pretty confused. I don't get how they took something that works so well (WSUS) and broke it so horribly for SCCM. e: gently caress it, I'm just going to delete my SUP and use WSUS. This isn't worth it. e2: my "naw gently caress man it wouldn't work like this" try worked. Something about having two packages targeting the same collection...they get detected consecutively and the second one overrides the first or something vv Number19 fucked around with this message at 00:01 on Sep 16, 2011 |
# ¿ Sep 15, 2011 23:16 |
|
It was actually even stupider than that. I was trying to deploy SP1 for Office 2010. It appears that if you don't download certain languages then the update fails to deploy because...uh...gently caress if I know. Yay Microsoft
|
# ¿ Sep 23, 2011 19:16 |
|
FISHMANPET posted:Yeah, I just hit that poo poo. It's because Office 2010 installs a bunch of secret language proofing packs, so you basically need to put that in a separate package and tell that package to download all the languages. This falls into that special kind of stupid where I'm actually in awe of it. It's just so...Microsoft.
|
# ¿ Sep 23, 2011 22:52 |
|
Quarantining devices is very handy. I get the odd user that tries to add their phone without getting company approval and they always get all sheepish when they have to come ask to have it activated.
|
# ¿ Dec 30, 2011 04:42 |
|
vladimir posted:This might have been answered before, but I'm having a hell of a time finding any decent results. Open Active Directory Users and Computers, click on the View menu and select Advanced Features. Then right click on the OU, select Properties and go to the Security tab. Click on Advanced in there and you'll be able to see who's been given what. It's still a bit murky to figure out what's what, but you should be able to work it out. It might help to create a test OU so you can see what the defaults look like, then test delegations on it to see which permissions each delegation actually changes.
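If clicking through the Security tab gets tedious, the same information can be pulled with PowerShell. A minimal sketch, assuming the RSAT ActiveDirectory module is available; the OU distinguished name is a placeholder:

```powershell
# List explicit (non-inherited) permissions on an OU.
# The DN below is an example - substitute your own OU.
Import-Module ActiveDirectory
$ou = "OU=TestOU,DC=example,DC=com"
(Get-Acl -Path "AD:\$ou").Access |
    Where-Object { -not $_.IsInherited } |
    Select-Object IdentityReference, ActiveDirectoryRights, AccessControlType
```

Comparing this output between a fresh test OU and a delegated OU makes it easier to spot what a delegation actually changed.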
|
# ¿ Jan 25, 2012 03:31 |
|
It was nice of Microsoft to completely wreck the build and capture mechanism in SCCM 2012. You're no longer able to let a DP fall back to HTTP if the client is roaming or in a workgroup (which is where you should do BnCs from). No, now you have to either have your DP in HTTP mode or have a separate HTTP DP. I tried this: http://www.jamesbannanit.com/2012/05/how-to-build-and-capture-in-configuration-manager-2012-using-https/ Which isn't working due to the problems people mention in the comments. I've fallen back to manually building my reference image and then capturing it when I'm done. This has been a huge waste of my time overall.
|
# ¿ Feb 7, 2013 22:10 |
|
I'm looking into building a new file server. Part of it will use DFS to create a central location for all shares. I've found that if I mount the namespace root to a drive letter, it reports the size and free space of the share as whatever the DFS root drive has. Everything I've read indicates that this is the expected behaviour, but that just seems goofy to me and will end up confusing the users. Is there some way around this without making a fake DFS root volume that has nothing on it just so it looks like it has "space"? Can I just make Windows not report the size of mapped network drives?
|
# ¿ Mar 1, 2013 02:04 |
|
devmd01 posted:Our setup does this as well. Users shouldn't be concerned about disk space on a server; that's your job, to set up monitoring and alerting on disk space thresholds so you can address space issues before they become a problem. Oh I know they shouldn't be concerned, but that doesn't stop them from sending in tickets or running to my office telling me THE FILESERVER ONLY HAS 30GB WE NEED MORE!!!!!! I was just seeing if there was a way to mask that, but since there isn't I'll just deal with it in documentation or something and have a quick link available.
|
# ¿ Mar 1, 2013 17:21 |
|
I've had some users freak out about something similar before, which is why I'm even bothering to mention it at all. Most of my users are smart but sometimes they're a bit too smart...
|
# ¿ Mar 1, 2013 23:05 |
|
Openfire is very good and should be the default option if you need an in-house IM system. I get the feeling I'm going to be asked to switch us to HipChat soon though...
|
# ¿ May 1, 2013 17:16 |
|
We use Psi since it's pretty lightweight and covers most of what we need from a Jabber client. Spark needs Java and hogs a lot of RAM for what it does.
|
# ¿ May 1, 2013 18:12 |
|
El_Matarife posted:Warning though, Backup Exec 2012 SP2 won't use Server 2012 as a server, just as a target for backups and restores. And they don't have granular restore on 2013 yet. You've still got to run it off 2008 R2 and there are a few other things it can't target, like SharePoint 2012 granular. This is why I'm ditching Backup Exec. Why am I paying for support and maintenance if they can't even support current platforms properly anymore? It took them almost a year to support vSphere 5.1 and 9 months or so for Server 2012 as a backup target. It's completely unacceptable, and I'm looking forward to the phone call asking me to renew so I can tell them that I chose someone else who actually keeps their product up to date.
|
# ¿ Sep 13, 2013 16:30 |
|
I didn't even bother following it after about 6 months or so. I just started making plans to ditch the software. Even now with SP2 out they're going to fall behind again now that 2012 R2 is coming out. I know they're promising to be faster this time around but I'm skeptical to say the least. I'm also tired of backups randomly failing but that's another issue.
|
# ¿ Sep 13, 2013 16:41 |
|
I just figured out a new way to handle Nvidia/AMD video driver installs in SCCM 2012 using applications instead of packages. Shame on Nvidia for not providing one MSI I can key off of for detections, though.
|
# ¿ Sep 13, 2013 23:06 |
|
They will give you a KMS key if you ask them for it.
|
# ¿ Jan 6, 2014 21:52 |
|
lol internet. posted:Oh yay, someone I can talk SCCM 2012 R2 with. I just spent the last month setting up SCCM at my new company. I've set up 2007 in the past. I thought the Update component would be better, but at the end of the day, it still sucks. A bit more manageable but still overhead since software update groups handle a max of 1000 updates.
1. Before you reimage the machine, delete it from SCCM and AD. That will make SCCM detect it as an unknown computer and give it a MININT name.
2. I'd have to look tomorrow but I think you can specify in the client settings whether the users can override.
3. This is also something I've wanted to get working. I'm tired of chasing down computers for people who are on vacation to turn them in and let updates install.
4. I do apply all updates with no issue during OSD so yeah, look through your logs.
Also don't forget that SCCM lets you do offline servicing of your images now so you can just roll your new updates into your image and drastically reduce your imaging time.
|
# ¿ Jan 14, 2014 04:26 |
|
FISHMANPET posted:I'm not sure why you would want the computer to get a MININT name when you could let it have its actual name. I often don't know the final name for a computer until after the build process is done (a preliminary build order will come in before hiring is completed). We also don't tend to recycle computer names all that often, so I usually just delete it from AD and create it all from scratch.
|
# ¿ Jan 14, 2014 21:30 |
|
Zaepho posted:Also, Applications frigging rock for software deployment. This cannot be said enough. They rock for hotfix deployment too once you figure out how to detect them. The big pro to Applications is that they do installation checks before running, so you can deploy them to devices that already have them installed and (so long as your checks work right) they will just install on the computers that are missing the Application. I just converted one of the client hotfixes to an application and it works perfectly. Also, being able to specify requirements allows you to easily include x86 and x64 installs in that one Application and reduce your deployment count. Or if you really want to get creative, you can use those requirements to do fun things like deploy all the different OS-dependent versions of something like video drivers in one Application and have the requirements filter out all the wrong ones and install only the correct one. It's very powerful and I'm glad I've managed to wrap my head around it. Edit: RE: hotfixes. I don't get why they don't let SCCM deploy updates in the Hotfix category that are imported from the WSUS catalogue. It seems like a silly limitation.
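For the hotfix detection piece, a script detection method works when there's no MSI product code to key off. A rough sketch, with the file path and version made up for illustration; SCCM treats any output from the detection script as "installed" and no output as "not installed":

```powershell
# SCCM script detection method sketch (path and version are placeholders).
# Writing anything to STDOUT tells SCCM the application is present.
$file = "C:\Program Files\ExampleApp\example.dll"
if (Test-Path $file) {
    $ver = (Get-Item $file).VersionInfo.FileVersion
    if ([version]$ver -ge [version]"1.2.0.0") {
        Write-Output "Installed"
    }
}
```

The same pattern works for hotfixes that only drop updated binaries: check for the file the fix replaces and compare its version.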
|
# ¿ Jan 22, 2014 21:14 |
|
lol internet. posted:Possible con, depends on how the person views it of course. I've done it about 5 times on SCCM/MDT so.. I don't even bother with the saving man hours on my resume. No. SCCM does some sort of deduplication of Application data on distribution points that makes it impossible to execute content directly from the DP. You can up the cache though. I set mine to 10GB and haven't had any problems. There should be a way using Compliance Settings to increase the cache size across your whole environment easily.
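One way to do the Compliance Settings piece is a remediation script that talks to the client's local COM object; a sketch using the 10GB figure from above (cache size is in MB):

```powershell
# Remediation script sketch for a configuration item:
# raise the ConfigMgr client cache to 10 GB if it's currently smaller.
$cache = (New-Object -ComObject UIResource.UIResourceMgr).GetCacheInfo()
if ($cache.TotalSize -lt 10240) {
    $cache.TotalSize = 10240  # value is in MB
}
```

Paired with a discovery script that checks the same property, this lets a baseline enforce the cache size across the whole environment.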
|
# ¿ Feb 1, 2014 06:23 |
|
IMO learning how to make everything possible into an application instead of a package is one of the big SCCM 2012 skills. Also learn how to make dynamic collections using queries, collection inclusion/exclusion, and collection limiting. I tend to use queries as building blocks and then assemble them into more specific collections with inclusion and limiting.
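As an example of those query building blocks, a query rule is just a WQL statement pasted into the collection's query editor. A sketch; the OS caption filter is an arbitrary example:

```sql
-- WQL collection query sketch: all systems reporting a Windows 7 OS caption
SELECT SMS_R_System.ResourceId, SMS_R_System.Name
FROM SMS_R_System
INNER JOIN SMS_G_System_OPERATING_SYSTEM
    ON SMS_G_System_OPERATING_SYSTEM.ResourceId = SMS_R_System.ResourceId
WHERE SMS_G_System_OPERATING_SYSTEM.Caption LIKE "%Windows 7%"
```

A query collection like this can then be limited by, or included in, a more specific collection to build the final deployment target.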
|
# ¿ Mar 7, 2014 03:58 |
|
Microsoft.
|
# ¿ May 2, 2014 18:24 |
|
Sacred Cow posted:My CIO has been requesting reports from SCCM about every hour on our deployment of those updates. Thankfully we're small enough that I could just say "This is serious. It's getting deployed by lunchtime, you might have to reboot, sorry for the inconvenience" and had no issues.
|
# ¿ May 2, 2014 19:03 |
|
skipdogg posted:My eye just involuntarily twitched a whole lot after reading that. The next DC upgrade I do, I'm installing Server Core for all the DC's dammit. Mine are on Server Core just for this reason, even though it's just me administering them for now. It's a bit of protection against a future stupid idea.
|
# ¿ May 9, 2014 20:05 |
|
Orcs and Ostriches posted:Right now I'm trying to stop a group of users (can be either an OU or Security Group) from logging in to any workstations. I don't want to disable their email or other web application access, so I can't just disable the AD account. Put the users in a group and add that group to the "Deny log on locally" setting in a GPO (Computer Configuration\Policies\Windows Settings\Security Settings\Local Policies\User Rights Assignment). Apply that GPO to the OU where the workstations reside, and put the target computers in a security group to filter it further if you need to.
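The group half of that can be scripted; a sketch assuming the RSAT ActiveDirectory module, with example group, OU, and user names throughout:

```powershell
# Create the deny group and populate it (all names are placeholders).
Import-Module ActiveDirectory
New-ADGroup -Name "Deny-Workstation-Logon" -GroupScope Global `
    -Path "OU=Groups,DC=example,DC=com"
Add-ADGroupMember -Identity "Deny-Workstation-Logon" -Members "jdoe", "asmith"
```

Once the GPO references this group, blocking or unblocking a user is just a group membership change rather than a GPO edit.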
|
# ¿ Jul 2, 2014 16:54 |
|
Setting up a CA is worth it if you have more than a couple of things that use SSL certificates, if only to get rid of the nagging about self-signed certs.
|
# ¿ Jul 9, 2014 14:16 |
|
I always use my firewall/gateway as a time sync source instead of an internet host. Everything internal syncs from the firewall, and the firewall itself syncs from the internet. The firewall is always going to be up and I know its IP address will always be good. I also only have to change my upstream time source in one location instead of all over the place.
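On the Windows side this comes down to a couple of w32tm commands; a sketch, with a placeholder IP standing in for the firewall:

```powershell
# Point this machine (typically the PDC emulator) at the gateway for time.
# 192.0.2.1 is a placeholder for the firewall/gateway address.
w32tm /config /manualpeerlist:"192.0.2.1" /syncfromflags:manual /reliable:yes /update
w32tm /resync
```

Everything else in the domain then follows the normal domain hierarchy, so only this one peer list ever needs to change.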
|
# ¿ Jul 12, 2014 03:45 |
|
I was running my build and capture routine today and ran into this: http://ardamis.com/2014/06/12/microsoft-security-update-kb2965788-requires-multiple-restarts/ I normally get around this poo poo by using offline servicing to apply any update that causes multiple reboots. Unfortunately, it seems that offline servicing determines that this update is not required and won't install it, yet it somehow becomes required during the build and capture routine. The funny thing is that if I run the routine without that update and then try to apply it to the captured image using offline servicing, it installs. This all makes sense because...uh...Microsoft? I get why it's marked as not applicable and then becomes applicable later, but nonetheless come the gently caress on, Microsoft. Either fix SCCM so these updates don't break task sequences or fix the updates so they don't break SCCM. These updates are like landmines in the update catalog that you have to be careful of lest you step on one and waste hours of your time. I need a drink. FYI: http://support.microsoft.com/kb/2894518 is a good link to have on hand as a list of lovely updates that cause this issue.
|
# ¿ Aug 8, 2014 03:50 |
|
I work around them by offline servicing them in, which has the added benefit of making the build and capture routine a lot faster. This one is just an oddball because it relies on a previous update, and that previous update is one of those "pending reboot" ones that doesn't fully apply until after Windows has booted. I got around it by removing that one update from the BnC group and then offline servicing it in after the BnC had run. I really should start using MDT though.
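Outside of SCCM's built-in scheduled servicing, the same thing can be done by hand with DISM; a sketch with placeholder paths, using the KB number from the post above as the example package:

```powershell
# Manual offline servicing sketch: inject one update into a captured image.
# Paths are placeholders; the .cab is extracted from the update's .msu.
Dism /Mount-Wim /WimFile:C:\Images\win7-reference.wim /Index:1 /MountDir:C:\Mount
Dism /Image:C:\Mount /Add-Package /PackagePath:C:\Updates\KB2965788.cab
Dism /Unmount-Wim /MountDir:C:\Mount /Commit
```

Doing it this way avoids ever running the troublesome update inside the task sequence at all.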
|
# ¿ Aug 9, 2014 18:08 |
|
peak debt posted:You can't do those updates with "Apply Updates" but you can wrap them up in a package and put them onto the machine with "Install Package" if you select "Installer reboots the computer on its own". Since I had to pull it out of the update group anyways, I just left it out of the build and capture update group and offline serviced it into the built image. I've also made a note to check KB2894518 every time I do a BnC to find any of these trouble updates and pull them from the BnC update group. It's annoying in the long run but that's part of administering enterprise Microsoft stuff.
|
# ¿ Aug 12, 2014 21:02 |
|
Just in case anyone missed it, there's a bad batch of Windows Updates this month: http://www.infoworld.com/t/microsof...-2975331-248582 They have the potential to get machines stuck in a bluescreen loop so be sure to back them out if you've already approved them for install.
|
# ¿ Aug 18, 2014 16:54 |
|
What graphics card is in the system? If it doesn't have an EFI option ROM then you'll need to enable the legacy video support option in the UEFI setup.
|
# ¿ Aug 19, 2014 14:09 |
|
The .NET Framework is a component of Server 2008/2008 R2. You have to use DISM to install it:
32-bit: DISM /online /enable-feature /featurename:NetFx3-ServerCore-WOW64
64-bit: DISM /online /enable-feature /featurename:NetFx3-ServerCore
Number19 fucked around with this message at 21:55 on Sep 7, 2014 |
# ¿ Sep 7, 2014 21:53 |
|
Oh that's right 2008 is different. I skipped that one entirely so I keep forgetting how different 2008 and R2 are. According to this post: http://social.msdn.microsoft.com/Fo...orum=netfxsetup that error seems to indicate some form of OS corruption. That post is talking about Vista though and it might be different on 2008. Number19 fucked around with this message at 02:07 on Sep 8, 2014 |
# ¿ Sep 8, 2014 02:01 |
|
I rebuild my reference images when there's a major feature I want to add that can't be offline serviced. A good example is when we moved to the .NET Framework 4.5.1. Windows 7 can't install that update offline, and installing it during deployment adds a lot of time and extra update rounds. It's better to just do a build and capture with it included. Upgrading Internet Explorer versions is another good example of why you'd want to do this.
|
# ¿ Sep 29, 2014 01:21 |
|
If they're domain joined then the MS account thing won't be an issue. I have a few I've deployed lately and they ask for domain credentials just like any other version of Windows.
|
# ¿ Oct 31, 2014 06:48 |
|
Holy poo poo BRB, buying some more scotch.
|
# ¿ Nov 18, 2014 19:21 |
|
The SChannel update also got reissued to disable the new ciphers that were causing problems. You might want to patch that in while you're at it.
|
# ¿ Nov 18, 2014 19:45 |
|
As per this link: http://blogs.technet.com/b/srd/archive/2014/11/18/additional-information-about-cve-2014-6324.aspx You only need to patch domain controllers immediately. The rest of the updates are just for completeness and can be patched normally.
|
# ¿ Nov 18, 2014 19:52 |
|
incoherent posted:Another month, another set of patches breaking poo poo. Discuss. It seems to only break Windows 7 / 2008 R2 in my testing so far. My Windows 8.1 and Server 2012 R2 installs are behaving properly. e: it also appears that it only affects 64-bit Windows 7/2008 R2. Number19 fucked around with this message at 22:55 on Dec 10, 2014 |
# ¿ Dec 10, 2014 22:41 |