|
So I want to query all my machines to figure out which ones are statically assigned and not on DHCP. There's a property Network Configuration > DHCP Enabled, which is great. So I wrote a query that returns the System Name and the value of DHCP Enabled, with no criteria (with the expectation of setting criteria to DHCP Enabled = 0). With no criteria, every computer shows up in the list twice, once with a value of 0, once with a value of 1, EXCEPT (as far as I can tell) clients that truly do not have DHCP enabled; they only show up once, as DHCP Enabled = 0. Is there any way to make this report a bit more sane?
|
# ? Nov 15, 2011 23:23 |
|
FISHMANPET posted:So I want to query all my machines to figure out which ones are statically assigned and not on DHCP. There's a property Network Configuration > DHCP Enabled, which is great. What exactly are you pulling? I assume we are talking SCCM? Pull some other data, like the name of the connection, or the name of the adapter, subnet, or whatever. Remember that anything that is a network connection will show up in this list. I seem to remember having issues because I was getting results from 6to4 adapters etc. And check whether it is pulling the DHCP setting for IPv4 or IPv6. In my case, I only included results that had the proper subnet mask (255.255.0.0 in my case) and were part of our class A (xx.%.%.%). I find it's always better to include too much data to be sure you are getting what you want, and then pare it down. It gets around mistakes when a query is not returning what you think it should be returning.
|
# ? Nov 15, 2011 23:32 |
|
quackquackquack posted:What exactly are you pulling? I assume we are talking SCCM? Yeah, SCCM. Well, if I could query on null values I could easily do IP Address is NOT null, and DHCP server is Null. This is pretty silly. I've manually gone through various outputs of my various queries and found the machines I'm looking for, but now I just want to figure out if this is even possible. E: I'm a big babby and using the query builder instead of manually typing in the SQLish query, if I did it manually can I make the query above?
|
# ? Nov 15, 2011 23:52 |
|
FISHMANPET posted:Yeah, SCCM. Ooooh, you're using the query builder. Yeah, unless you are making a collection, stop that. Use the "Reporting" section instead. It's a learning curve, but it's just SQL, and MUCH more powerful.
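For reference, here's a sketch of what that report might look like on the SQL side, assuming the standard ConfigMgr 2007 reporting views (v_R_System and v_GS_NETWORK_ADAPTER_CONFIGURATION; the column names with the trailing 0, like DHCPEnabled0, are the usual inventory-view suffixes, but verify them against your own site database):

```sql
SELECT sys.Name0          AS [System Name],
       nac.IPAddress0,
       nac.DHCPEnabled0
FROM v_R_System sys
JOIN v_GS_NETWORK_ADAPTER_CONFIGURATION nac
  ON sys.ResourceID = nac.ResourceID
WHERE nac.IPEnabled0 = 1           -- drops 6to4/tunnel adapters with no IP
  AND nac.IPAddress0 IS NOT NULL   -- the "IP Address is NOT null" part
  AND nac.DHCPEnabled0 = 0         -- statically assigned
ORDER BY sys.Name0
```

Filtering on IPEnabled0 and IPAddress0 is what keeps each machine from showing up once per adapter, which is likely where the duplicate rows in the query builder were coming from.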
|
# ? Nov 16, 2011 00:34 |
|
IT Guy posted:
If I recall my studies correctly, R2 should allow you to take any sort of backup with wbadmin manually, but you'll run into limitations when doing scheduled jobs in that when saving to a remote share, you can only store that day's backup; the next day will overwrite that information unless you set up some elaborate scripting.
|
# ? Nov 16, 2011 04:30 |
|
KomradeVirtunov posted:If I recall my studies correctly, R2 should allow you to take any sort of backup with wbadmin manually, but you'll run into limitations when doing scheduled jobs in that when saving to a remote share, you can only store that day's backup; the next day will overwrite that information unless you set up some elaborate scripting. Actually, you are correct. It just didn't affect us because the Windows backup is backing up to a NAS and then the NAS syncs to an offsite location to create archives. But yes, you are correct that it only allows one day's backup when backing up to a network location. I'm slowly starting to lean back towards Backup Exec. However, I love the way that Windows backup creates a virtual hard disk that you can just attach to a machine. It's so handy.
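For what it's worth, the "elaborate scripting" usually just amounts to pointing wbadmin at a dated subfolder each night, since a network target only ever holds one backup. A rough sketch as a scheduled batch job (the share path is a placeholder, and the %DATE% format is locale-dependent, so adjust for your environment before trusting it):

```bat
@echo off
rem Build a per-day target folder so last night's backup isn't overwritten.
rem \\nas\backups is a placeholder share; %DATE:/=-% swaps slashes for dashes.
set TARGET=\\nas\backups\%COMPUTERNAME%\%DATE:/=-%
if not exist "%TARGET%" mkdir "%TARGET%"
wbadmin start backup -backupTarget:%TARGET% -include:C: -allCritical -quiet
```

You lose wbadmin's own retention handling this way, so pruning old dated folders is on you (or on the NAS's sync/archive job, as described above).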
|
# ? Nov 16, 2011 14:01 |
|
IT Guy posted:
Wait, how do you do this? Everything I found said that this isn't possible :o
|
# ? Nov 18, 2011 19:53 |
|
Hey everybody, new to the thread, but just thought I would say hi and offer my help when I can. Been working with SCCM since 2007 came out; right now I'm focusing on automating SCCM with PowerShell. Also, I'll complain that my new job is doing SCCM in the most convoluted way ever. Every single thing is distributed through task sequences; with over 5,000 machines, literally nothing is standardized, there is no tracking of licensing, and because it's government we don't force any changes, which means we're still deploying IE6 to machines and are planning on fully supporting both WinXP 32-bit and Windows 7 64-bit for the next 2 years at least. And despite a heavy focus on security, it takes us literally 2 months to update Flash.
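Since SCCM 2007 ships no native cmdlets, automating it with PowerShell mostly means talking to the SMS Provider over WMI. A minimal sketch, with the site code and server name as placeholders:

```powershell
# Placeholders: ABC = your site code, sccm01 = server hosting the SMS Provider
$site   = "ABC"
$server = "sccm01"

# Enumerate all collections via the SMS Provider's WMI namespace
Get-WmiObject -ComputerName $server -Namespace "root\SMS\site_$site" `
    -Class SMS_Collection |
    Select-Object CollectionID, Name
```

The same pattern (Get-WmiObject against root\SMS\site_<code>) reaches packages, advertisements, and the rest of the provider classes.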
|
# ? Dec 3, 2011 07:37 |
|
spidoman posted:Every single thing is distributed through task sequences... Everything else you listed is hosed up, but I personally think Task Sequences are the way to go most of the time.
|
# ? Dec 3, 2011 19:26 |
|
quackquackquack posted:Everything else you listed is hosed up, but I personally think Task Sequences are the way to go most of the time. We're doing it because of the task sequence reporting, but DCM does that better anyway. Task Sequences work well for advanced installs, but for simple installs it's just extra work, not to mention not best practice.
|
# ? Dec 3, 2011 19:41 |
|
True, I suppose it depends what you mean by "everything". I prefer Task Sequences for installs run through "Run Advertised Program", as it is an easy way to show a window and progress bar for their install (even if the progress bar is mostly useless).
|
# ? Dec 3, 2011 20:40 |
|
Does anyone have any suggestions for free centralized log monitoring software?
|
# ? Dec 5, 2011 16:33 |
|
I know this is the Windows Enterprise thread, can anyone point me to a good Mac Enterprise thread? I just started a job where I'll be running Casper. I have extensive SCCM and MDT experience, so the concepts are not foreign. I'd love a place to talk about Casper, Netboot, binding to AD, the benefits of imaging vs modifying the OEM install, etc. And don't worry, I can't escape this thread, I'll still be answering questions about SCCM, and asking questions about Altiris (my project after Casper).
|
# ? Dec 7, 2011 01:47 |
|
Start one? I'm sure people will pitch in. I'm about to start having to add Macs to AD and I know I'd find it interesting as well as contribute as I learn more about the whole thing.
|
# ? Dec 7, 2011 02:33 |
|
As little traffic as this thread gets, we could just rename it and let it be about both (and throw in some Linux if anybody cares). Especially since most management at that level just comes down to getting those things to play nice with AD.
|
# ? Dec 7, 2011 02:39 |
|
FISHMANPET posted:As little traffic as this thread gets, we could just rename it and let it be about both (and throw in some Linux if anybody cares). That's a really good point. This thread is relatively low traffic, and a Mac one would be even more so, perhaps by joining forces we can catch a few more eyes. My first question is about NetBoot images. On the Windows side, I would PXE into WinPE, which would write an image to the disk, etc etc. For Mac, it seems that I create a NetBoot image by installing Lion (any downsides to installing it to a VM for this?), customizing it, then making an image of it. This seems like overkill to have a full OS X install being run over the network without local swap. How much bandwidth does it take? Does it also seem like a security concern? Can I delete things like iTunes? Am I overreacting, and I shouldn't try and pare down the NetBoot image?
|
# ? Dec 7, 2011 05:52 |
|
quackquackquack posted:That's a really good point. This thread is relatively low traffic, and a Mac one would be even more so, perhaps by joining forces we can catch a few more eyes. Unless your VM server is an Apple machine, you can't legally run OS X in a VM. Time to pony up for a Mac Mini running OS X Lion. And I'll just say this here: Apple doesn't care about the Enterprise. They took a ton of stuff out of Lion to make Macs even harder to manage. They discontinued their rack-mounted server. And none of that matters, because in this day and age, the easiest way to tell that a company doesn't care about the enterprise? They won't let you virtualize their product. We've solved the hardware problem. We no longer care how powerful a single machine is, how reliable it is, how the drives are arranged, etc etc. Virtualize it properly, and stop worrying about it. Apple doesn't get that. They want you to buy Mac Minis. Or just not manage them, I guess?
|
# ? Dec 7, 2011 06:20 |
|
Is there a way to put a computer in a collection as part of a task sequence in SCCM?
|
# ? Dec 7, 2011 07:55 |
|
FISHMANPET posted:Unless your VM server is an Apple machine, you can't legally run OS X in a VM. Time to pony up for a Mac Mini running OS X Lion. The VM is on Apple hardware. In this case, the VM is going to be the "reference" NetBoot image. I just wanted to make sure there weren't any issues with creating the NetBoot reference image in VMware instead of on bare metal. I completely agree about Apple abandoning the Enterprise. That's why I'm not trying to duplicate what's done on the Windows side, but instead trying to fit "the Apple way". For example, Casper will not be used to push any software; it will only be used to make packaged software available in what is essentially a private App Store (Self Service). However, I would still like some way to make my testing environment (fresh VMs to practice imaging, software deployment, etc) less ridiculous than a bunch of Mac Minis running two Lion VMs each. Understanding that I don't care that it is not on the HCL, and that it is not a real server, how does vSphere 5 run on a Mac Pro?
|
# ? Dec 7, 2011 14:04 |
|
spidoman posted:Is there a way to put a computer in a collection as part of a task sequence in SCCM? This might shed some light on it: http://social.technet.microsoft.com/Forums/en-US/configmgrsdk/thread/7ec9af4d-d84f-41ef-8fe0-ecb2d158b80a In that case especially, they could do what they wanted to do with a query instead of a direct membership, so I'm wondering what you're doing that can't be accomplished with a query.
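If the script route from that thread is the way to go, the gist is to create an SMS_CollectionRuleDirect instance and hand it to the collection. A sketch with placeholder site code, server, collection ID, and ResourceID — treat it as pseudocode until it's been tested against your own site:

```powershell
# All names/IDs below are placeholders for illustration.
$ns   = "root\SMS\site_ABC"
$coll = Get-WmiObject -ComputerName "sccm01" -Namespace $ns `
        -Class SMS_Collection -Filter "CollectionID='ABC00012'"

# Build a direct membership rule for one machine's ResourceID
$rule = ([WmiClass]"\\sccm01\$ns`:SMS_CollectionRuleDirect").CreateInstance()
$rule.ResourceClassName = "SMS_R_System"
$rule.ResourceID        = 12345

$coll.AddMembershipRule($rule) | Out-Null
$coll.RequestRefresh($false)   | Out-Null   # trigger a membership update
```

Calling it from inside a task sequence step is the trickier part, since the client usually lacks rights on the SMS Provider; a web service or a script running with a service account is the common workaround.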
|
# ? Dec 7, 2011 16:07 |
|
Going from DS 6.8 only to setting up SMP 7.1 from scratch is loving melting my brain.
|
# ? Dec 8, 2011 19:49 |
|
FISHMANPET posted:This might shed some light on it: The issue is that our AD queries are set to refresh every two hours at Microsoft's suggestion, and with a reimage creating an obsolete record, it takes up to 3 hours to get a computer completely rebuilt.
|
# ? Dec 8, 2011 23:13 |
|
I have joined a Lion machine to AD. Is there a way to query what OU it lives in from the Mac?
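One way to peek at it from the Mac itself, assuming the built-in AD plugin did the bind: the distinguishedName attribute on the computer record spells out the OU path. (EXAMPLE is a placeholder for the forest name; the computer record is the short hostname with a trailing $.)

```shell
# Confirm the bind and the computer account name first
dsconfigad -show

# Read the computer object's DN, which includes the OU it lives in
dscl "/Active Directory/EXAMPLE/All Domains" -read \
    "/Computers/$(hostname -s)$" distinguishedName
```

The exact Directory Service node path varies with how the bind was configured, so check `dscl / -list "/Active Directory"` if the path above doesn't resolve.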
|
# ? Dec 8, 2011 23:19 |
|
Is there any way I can get Windows 7's "Previous Versions" of files to make a copy of whatever the user touches on a network share, but then store it on the user's local machine? I know it's not a proper backup solution, but my outfit has a huge server drive space problem that's going to last a while, and I need a quick and cheap solution.
|
# ? Dec 10, 2011 04:31 |
|
Italy's Chicken posted:Is there any way I can get Windows 7's "Previous Versions" of files to make a copy of whatever the user touches on a network share, but then store it on the user's local machine? I know it's not a proper backup solution, but my outfit has a huge server drive space problem that's going to last a while, and I need a quick and cheap solution. No, there is no way to do this. It would completely defeat the point of having a Previous Version of a network file if only one person could access it.
|
# ? Dec 10, 2011 07:19 |
|
If you tell us why you're trying to do this, we'll just go ahead and make fun of you. Seriously though, an MD1000 full of 7.2k drives is really not that expensive. Also, don't back up to local disks.
|
# ? Dec 10, 2011 16:52 |
|
evil_bunnY posted:If you tell us why you're trying to do this, we'll just go ahead and make fun of you. Seriously though, an MD1000 full of 7.2k drives is really not that expensive. Also don't backup to local disks.
|
# ? Dec 10, 2011 22:07 |
|
Take a few hard drives out of old PCs and put them into the server to extend the drive space. If they are too slow for server use, make a RAID0 out of them.
|
# ? Dec 10, 2011 23:44 |
|
Italy's Chicken posted:Completely out of server storage. Staff are too inept to delete old files and I have no authority to make them.
|
# ? Dec 10, 2011 23:56 |
|
I've got a server that runs 2k3 and has a license for 50 terminal server seats. Does anyone know of a dirt-cheap Thin Client that'll work to just RDP into the server? It'd be on the local network so I don't need VPN support, just solid RDP. Do you think one of these is worth testing out? http://www.dealextreme.com/p/nc600-multi-user-100mbps-lan-network-workstation-terminal-59666 Or maybe someone could recommend a specific Wyse or HP Thin Client that are easy to obtain on eBay for sub-$100? Thanks for the help.
|
# ? Dec 12, 2011 20:04 |
|
Italy's Chicken posted:Next to no money left till sometime in January. Italy's Chicken posted:Completely out of server storage. Italy's Chicken posted:2000-3000 users Italy's Chicken posted:making a local copy of their work just for a month or so.... So basically you need a budget for whiskey and a gun. Because someone at that company needs shot.
|
# ? Dec 12, 2011 20:13 |
|
I've got a bit of a confusing issue that someone with more knowledge could help me out with. I just inherited a few hundred machines to manage. About 50-60 of these machines are having problems connecting properly to our domain. These computers (Windows 7 machines) all have the same symptoms: *The firewall is turned off and can't be turned back on, even if I'm logged in as a local admin (I get permission denied errors.) *They can't be managed through Active Directory, or pinged (by name or IP), or remoted into. *They tend to have problems properly PXE booting to our WDS server (this might be a separate issue, but if it's connected it probably ties in with the Active Directory management problems.) *They do have access to network resources (I can log on using my domain account, I can log into servers from the machines, I can remote OUT from them, etc.) I can fix it if I flatten and re-install everything on them, which I'm going to do here in a few days, but out of curiosity, does anyone have any idea what might have caused this? |
# ? Dec 15, 2011 23:38 |
|
Check for any third-party software firewalls (I had a domain join blocked by Sophos before) and also double-check that the correct DNS servers are entered on the clients. If they're in there, try shuffling around the primary and secondary DNS servers. I can't vouch for Win7, but on XP we have to have corporate's DNS as the primary or it'll sit there forever after you enter the login/pass and time out on the join. Lastly, log in locally and check the hard drive's NTFS file permissions. Add a permission for the "Everyone" group, and check the box to have it replicate to child objects. Apply, and you should see the filesystem blast through the entire hard drive. See if you still get the permission issue after that. That's all I can think of.
|
# ? Dec 15, 2011 23:42 |
|
I'd echo a lot of what Zero VGS suggested. Start with DNS and the basic IP stuff - DNS counts for a lot with Windows on a domain. Try running a gpresult from an admin command prompt too, so you know what group policies the PCs themselves think they should be getting. Being cynical, college = students, which probably = every dodgy loving thing you can think of, so it may simply be that they've been infected and screwed around with very badly.
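A few gpresult variants worth knowing for that (all standard flags on Win7; the output path is just an example):

```bat
rem Summary of applied GPOs for computer and user
gpresult /r

rem Verbose output, computer settings only
gpresult /scope computer /v

rem Full HTML report (Vista/7 and later)
gpresult /h C:\gpreport.html
```

The HTML report is the easiest one to compare side by side against a known-good machine.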
|
# ? Dec 17, 2011 18:20 |
|
I'm planning on doing my thesis on self-service user information management software and implementing one such product at my work to help ease the burden of incorrect user information. I'm currently trying to find source material to use in the work, but so far without any great success. Anybody have any hints for good books concerning management of user information, or anything that comes even close to it?
|
# ? Dec 17, 2011 21:07 |
|
So, how are you all provisioning file volumes on your file servers? Our SAN has a single big virtualised storage pool, so I don't have to deal with RAID groups at the SAN level; I just get an X TB storage pool to play with. I currently provision multiple volumes of 2 TB each to the file server, but of course you end up with some data on volumes with no free space while other volumes have lots of free space. A single big volume doesn't seem sensible, as it makes things like backup and chkdsk "fun".
|
# ? Dec 18, 2011 14:23 |
|
Edit: Looks like the system needs to be patched to support it. |
# ? Dec 20, 2011 15:31 |
|
Bitch Stewie posted:I currently provision multiple volumes of 2tb each to the file server, but of course you end up with some areas data on volumes with no free space and other volumes have lots of free space. What kind of SAN do you have? What kind of files do you serve?
|
# ? Dec 20, 2011 23:31 |
|
Bitch Stewie posted:So, how are you all provisioning file volumes on your file servers?
|
# ? Dec 21, 2011 05:59 |
|
Misogynist posted:We avoid this problem by not running NTFS file servers. There are too many good storage platforms out there to waste time trying to roll our own and have them subject to these sorts of problems. What the hell does that have to do with NTFS? NTFS supports a volume size of 256 TB using a GPT disk. As long as your server is Windows 2003 SP1 or above, this hasn't been an issue in years.
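The one catch with that 256 TB ceiling is cluster size: NTFS tops out at 2^32 clusters, so you only reach it with 64 KB clusters. A diskpart sketch for a big GPT data disk (the disk number is a placeholder):

```bat
rem Run inside the diskpart prompt; disk 1 is a placeholder
select disk 1
convert gpt
create partition primary
rem unit=64k sets 64 KB clusters, needed to approach the 256 TB NTFS maximum
format fs=ntfs unit=64k quick
assign
```

With the default 4 KB clusters the same math caps a volume at 16 TB, which is plenty for the 2 TB volumes above but worth knowing before consolidating.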
|
# ? Dec 21, 2011 06:20 |