|
Anyone have a quick script I can use to get the email address field out of AD groups and throw it in a CSV? I know the end is basically just going to be | Export-Csv emails.csv, and I have a bunch of Get-AD* commands, but I didn't see any that would get that specific field.
|
# ? Feb 8, 2017 22:35 |
|
|
|
I think something like get-adgroupmember -identity <group> | get-aduser | select fullname, email | export-csv emails.csv would do it, but I'm at home and don't have an AD to test with.
|
# ? Feb 8, 2017 22:49 |
|
Collateral Damage posted:I think something like get-adgroupmember -identity <group> | get-aduser | select fullname, email | export-csv emails.csv would do it, but I'm at home and don't have an AD to test with. You might have to worry about nested groups, but that should get you started.
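The one-liner from this post (quoted verbatim in the reply downthread) was:

```powershell
# Pull each group member's Name and Mail, skipping members with no mail attribute
Get-ADGroupMember -Identity "Group Name" |
    Get-ADUser -Properties Mail |
    Where-Object { $_.Mail -ne $null } |
    Select-Object Name, Mail |
    Export-Csv -NoTypeInformation C:\path\to\emails.csv
```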
|
# ? Feb 8, 2017 23:02 |
|
anthonypants posted:Get-ADGroupMember -Identity "Group Name" | Get-ADUser -Properties Mail | Where-Object {$_.Mail -ne $null} | Select-Object Name,Mail | Export-Csv -NoTypeInformation C:\path\to\emails.csv How would I make this recursive in a range, or say all of the ones I'm interested in start with XYZ, and do all those without running the command 30 times?
|
# ? Feb 8, 2017 23:21 |
|
Avenging_Mikon posted:How would I make this recursive in a range, or say all of the ones I'm interested in start with XYZ, and do all those without running the command 30 times? Is the range a list of given groups? If so, make an array containing them ($grouplist = "group1", "group2", "group3") and then put that one-liner in a foreach. code:
If you want all of the groups to dump into one CSV, add "-Append" to the end of the Export-Csv command; if you want a different CSV for each group, make the path "C:\path\to\$group.csv". You can also automagically pull the group list by starting with a Get-ADGroup -Filter call. code:
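A sketch of the loop described above; the group names, XYZ prefix, and paths are placeholders:

```powershell
# Either list the groups by hand...
$grouplist = "group1", "group2", "group3"
# ...or pull them automatically by name prefix:
# $grouplist = (Get-ADGroup -Filter 'Name -like "XYZ*"').Name

foreach ($group in $grouplist) {
    Get-ADGroupMember -Identity $group |
        Get-ADUser -Properties Mail |
        Where-Object { $_.Mail } |
        Select-Object Name, Mail |
        Export-Csv -NoTypeInformation -Append C:\path\to\emails.csv
    # for one CSV per group, drop -Append and use "C:\path\to\$group.csv" instead
}
```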
Inspector_666 fucked around with this message at 23:52 on Feb 8, 2017 |
# ? Feb 8, 2017 23:44 |
|
Avenging_Mikon posted:anyone have a quick script I can use to get the email address field out of AD groups, and throw it in a csv? Do you want the group's email address, or the email addresses of the users in the groups? If the latter, everyone has you covered. If the former... code:
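The original code block is missing here; a minimal sketch of pulling the groups' own mail attributes (the XYZ prefix and output path are placeholders):

```powershell
# Export the mail attribute of the groups themselves, not of their members
Get-ADGroup -Filter 'Name -like "XYZ*"' -Properties mail |
    Select-Object Name, mail |
    Export-Csv -NoTypeInformation C:\path\to\groupemails.csv
```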
|
# ? Feb 8, 2017 23:56 |
|
Zaepho posted:Do you want the Group's email address or the users in the groups' email addresses? If your environment has a lot of groups you're gonna want to filter in the Get-ADGroup cmdlet, not in a separate where-object. Even if it doesn't, you should still do it. Filter left!
|
# ? Feb 8, 2017 23:58 |
|
Inspector_666 posted:If your environment has a lot of groups you're gonna want to filter in the Get-ADGroup cmdlet, not in a separate where-object. You're right. I'm terrible about filtering in the AD commandlets. I can NEVER get it right without a few tries.
|
# ? Feb 8, 2017 23:59 |
|
Zaepho posted:You're right. I'm terrible about filtering in the AD commandlets. I can NEVER get it right without a few tries. I'm the same, I always try to cram a -match or -include in there before I remember -like is about as fancy as I can get.
|
# ? Feb 9, 2017 00:08 |
|
Inspector_666 posted:Is the range a list of given groups? If so, make an array containing them ($grouplist = "group1", "group2", "group3") and then put that one-liner in a foreach. code:
|
# ? Feb 9, 2017 01:30 |
|
Get-ADGroupMember is slow as balls and starts to go to poo poo if you've got groups with lots of members. It's quicker to pull the member attribute of the group object and then pass that to Get-ADObject:code:
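The original code block is missing; a hedged sketch of the approach described (the group name and path are placeholders):

```powershell
# Read the group's member attribute (a list of distinguished names) directly,
# then resolve each DN with Get-ADObject instead of calling Get-ADGroupMember
$members = (Get-ADGroup -Identity "Group Name" -Properties member).member
$members |
    Get-ADObject -Properties mail |
    Where-Object { $_.mail } |
    Select-Object Name, mail |
    Export-Csv -NoTypeInformation C:\path\to\emails.csv
```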
|
# ? Feb 9, 2017 03:43 |
|
anthonypants posted:You might have to worry about nested groups, but that should get you started.
|
# ? Feb 9, 2017 10:01 |
|
Zaepho posted:Do you want the Group's email address or the users in the groups' email addresses? Actually, it is the former. We do distro lists in AD, and right now don't need the members, just the email associated with the group, for one department. There's like 30 of the groups, each with a different email. Thanks all, this is great info
|
# ? Feb 9, 2017 16:32 |
|
This page is super helpful if you're having trouble constructing AD cmdlet filters. Especially for confusing things like having to escape characters you wouldn't normally care about.
|
# ? Feb 10, 2017 16:30 |
|
For those of you in NYC: I'm presenting at the NYC PowerShell meetup at Microsoft in Times Square on Monday Feb 13. If you're in the area come get some free pizza and heckle me.
|
# ? Feb 10, 2017 16:33 |
|
Briantist posted:For those of you in NYC: I'm presenting at the NYC PowerShell meetup at Microsoft in Times Square on Monday Feb 13. Ugh, Times Square. But hey, free pizza and the topics sound pretty interesting.
|
# ? Feb 10, 2017 16:39 |
|
Because Reasons I wrote this:code:
|
# ? Feb 10, 2017 22:01 |
|
A developer's asked me to write a script that takes a look at a file on our IIS servers, checks the modified date, then sends an email to the developers. I have no problems with the date and email parts of it, but I'm really scratching my head about how to connect to all of these different servers. As far as I can tell, this file isn't publicly accessible, so I have to connect to each of these servers separately. The servers are in different environments and different states (not all of them are domain-joined), so I don't know how to specify and securely store several sets of credentials for each server. How do other people approach working with multiple machines like this?
|
# ? Feb 11, 2017 02:33 |
|
beepsandboops posted:A developer's asked me to write a script that takes a look at a file on our IIS servers, checks the modified date, then sends an email to the developers Have service accounts with identical credentials on all of the machines, store non-sensitive connection info in a csv. Iterate over the csv, do your thing. If you want to fully automate, you can store the credentials as a secure string.
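A sketch of the CSV-driven approach described above. The file paths, column names, and account name are all made up for illustration, and note that a file written with ConvertFrom-SecureString can (by default) only be decrypted by the same user on the same machine:

```powershell
# Assumes servers.csv has columns: ComputerName, CredFile
# Each CredFile was created once, interactively, with something like:
#   Read-Host -AsSecureString | ConvertFrom-SecureString | Set-Content cred.txt
$servers = Import-Csv C:\path\to\servers.csv

foreach ($server in $servers) {
    $password = Get-Content $server.CredFile | ConvertTo-SecureString
    $cred = New-Object System.Management.Automation.PSCredential ('svc_checkfile', $password)

    # Check the file's modified date on the remote machine
    Invoke-Command -ComputerName $server.ComputerName -Credential $cred -ScriptBlock {
        (Get-Item 'C:\inetpub\wwwroot\somefile.txt').LastWriteTime
    }
}
```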
|
# ? Feb 11, 2017 06:00 |
|
beepsandboops posted:A developer's asked me to write a script that takes a look at a file on our IIS servers, checks the modified date, then sends an email to the developers
|
# ? Feb 12, 2017 18:53 |
|
I currently do a weekly report (by hand) which uses 9 other reports as source data and takes a really long time to put together. If I want to automate some or all of the process, is Powershell a good option or is that better handled with Excel macros or Python or something? I can get the source files as csvs, and my dream is being able to just throw them into a folder and hit a button and get a pretty report. Jack the Lad fucked around with this message at 10:31 on Feb 13, 2017 |
# ? Feb 13, 2017 10:28 |
|
Jack the Lad posted:I currently do a weekly report (by hand) which uses 9 other reports as source data and takes a really long time to put together. It depends on how your data is set up, but I have written scripts that do something similar. You can convert the csv files into powershell objects, and then pick through them, saving what you want as a custom powershell object.
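A minimal sketch of that pattern; the file names and column names (Name, Hours, Sales) are hypothetical:

```powershell
# Turn each source CSV into objects, then join rows on a shared key
# into one custom summary object per person
$timesheet = Import-Csv C:\reports\timesheet.csv
$sales     = Import-Csv C:\reports\sales.csv

$report = foreach ($row in $timesheet) {
    $match = $sales | Where-Object { $_.Name -eq $row.Name }
    [PSCustomObject]@{
        Name  = $row.Name
        Hours = $row.Hours
        Sales = $match.Sales
    }
}
$report | Export-Csv -NoTypeInformation C:\reports\summary.csv
```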
|
# ? Feb 13, 2017 15:10 |
|
Yes, but: depending on the size of all this, the complexity of the report you want, and whether you want to look back at these things, you may want to look into shoveling your data into MSSQL with PowerShell and using its reporting services instead. edit: the first time this works you're going to have a weird grin on your face, and then kick yourself for not doing it months ago.
|
# ? Feb 13, 2017 17:04 |
|
And if you don't have a sql server already you can download SQL Server express with advanced features which includes reporting services. You want to grab the SQLEXPRADV_x64_ENU.exe installer.
|
# ? Feb 13, 2017 19:38 |
|
Briantist posted:For those of you in NYC: I'm presenting at the NYC PowerShell meetup at Microsoft in Times Square on Monday Feb 13.
|
# ? Feb 13, 2017 21:04 |
|
Random Bitch: cmdlets that require multiple other cmdlets in order to be functional. Take a look at this:code:
"Well, that should be easy to fix, just allow the scheduled task to start when on batteries and make sure to also allow it to continue when on batteries!" Well sure, but through schtasks.exe you can't actually set that through the CLI; you'd have to import a loving XML file in and I wasn't gonna do that for just this one problem. So I start looking into the *-scheduledtask* cmdlets (side bitch: Set-scheduledtask and new-scheduledtasksettingsset are names that are FAR too similar gently caress you microsoft). The long and short of it is, if I want to configure this in powershell I have to first define a variable = new-scheduledtasksettingSet with the proper parameters, and then use that cmdlet as a parameter itself in set-scheduledTask. Which makes it frustrating to figure out when you're approaching a new problem. Turns out microsoft is doing a lot of this sort of thing though, as a bunch of AzureCLI poo poo is the same way (you can't just run like new-azureVM, you have to pass it a bunch of parameters that are themselves cmdlets in order to configure things like memory, network adapters, etc). I'm happy now though, this was the last bug to fix and now the project builds all the way through :3
|
# ? Feb 14, 2017 06:07 |
|
You can make a scheduled task once and then export the XML though.
|
# ? Feb 14, 2017 08:16 |
|
Write it all on one line for giggles.
|
# ? Feb 14, 2017 17:03 |
|
anthonypants posted:You can make a scheduled task once and then export the XML though. You absolutely can, but I was irritated by having to do that for this one thing so I was trying to work around it. I think if I had started from the beginning that's the way I would've gone. This way I learned something, though!
|
# ? Feb 14, 2017 17:21 |
|
Jowj posted:Random Bitch: cmdlets that require multiple other cmdlets in order to be functional. Take a look at this: To be clear, the parameter values you pass are not cmdlets; they are objects that are generated by cmdlets. This starts to make a lot of sense when the number and scope of objects starts to become really huge. The objects are instances of classes, and the classes handle validation of the options. The class determines which options are valid together, which is something that might need to be evaluated at runtime. In PowerShell this is possible with dynamic parameters, but they can be flaky, and when you start to have lots of them that can't all be used together, encapsulation just makes sense. Sometimes though, it's just the fact that the cmdlets are thin wrappers around existing APIs and classes (*cough* WSUS cmdlets). If your use cases are narrow enough in scope and used often enough, you should write a function that takes only the parameters you need to specify and wraps the whole process.
|
# ? Feb 14, 2017 17:31 |
|
Briantist posted:To be clear, the parameter values you pass are not cmdlets; they are objects that are generated by cmdlets. Yeah, this is a really good point. It makes *logical* sense when I think about it; it's just frustrating when I'm trying to learn how to do something. However, can you elaborate more on this? quote:If your use cases are narrow enough in scope and used often enough, you should write a function that takes only the parameters you need to specify that wraps the whole process. I'm not sure I'm following what you mean here. Do you have an example I could look at?
|
# ? Feb 15, 2017 20:25 |
|
You can make functions that themselves behave like cmdlets. https://technet.microsoft.com/en-us/library/hh360993.aspx If you really wanted to, you could make a function that takes what you feel is important in a scheduled task as parameters and then does all the work for you. The problem with this is that after you learn the scheduled task cmdlets well enough to make a function to handle them you may feel like you don't need one at all.
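A sketch of such a wrapper for the scheduled-task case discussed above; the function name, parameters, and defaults are all made up for illustration:

```powershell
# Wrap the whole settings-object dance behind the few parameters you care about
function New-BatteryFriendlyTask {
    param(
        [Parameter(Mandatory)][string]$TaskName,
        [Parameter(Mandatory)][string]$Execute,
        [string]$At = '3am'
    )
    $action   = New-ScheduledTaskAction -Execute $Execute
    $trigger  = New-ScheduledTaskTrigger -Daily -At $At
    $settings = New-ScheduledTaskSettingsSet -AllowStartIfOnBatteries -DontStopIfGoingOnBatteries
    Register-ScheduledTask -TaskName $TaskName -Action $action -Trigger $trigger -Settings $settings
}

# Usage: New-BatteryFriendlyTask -TaskName 'NightlySync' -Execute 'C:\scripts\sync.cmd'
```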
|
# ? Feb 15, 2017 23:27 |
|
thebigcow posted:You can make functions that themselves behave like cmdlets. https://technet.microsoft.com/en-us/library/hh360993.aspx When you do stuff like this, is the best way to get it into your workflow to add a Set-Alias line to your profile.ps1 file? That's what I've been doing with my scripts but it seems hack-y.
|
# ? Feb 15, 2017 23:53 |
|
thebigcow posted:You can make functions that themselves behave like cmdlets. https://technet.microsoft.com/en-us/library/hh360993.aspx OH, yeah I already do that. I thought there was something more complex I'm missing. Thanks for clarifying! That snippet of code I posted earlier is actually within a function, I just sanitized it. I'll post when I get home if I remember to show the full thing.
|
# ? Feb 16, 2017 00:05 |
|
Inspector_666 posted:When you do stuff like this, is the best way to get it into your workflow to add a Set-Alias line to your profile.ps1 file? That's what I've been doing with my scripts but it seems hack-y. Make your own modules.
|
# ? Feb 16, 2017 00:34 |
|
The Fool posted:Make your own modules. Welp, guess it's time to learn a thing.
|
# ? Feb 16, 2017 00:48 |
|
Inspector_666 posted:Welp, guess it's time to learn a thing. Well worth learning. Sapien PowerShell Studio actually has pretty good boilerplate code / module projects for learning what those look like. I think it's a 30 or 60 day trial, too. Makes getting started with modules really easy.
|
# ? Feb 16, 2017 15:21 |
|
Walked posted:Well worth learning. Apparently all I had to do was add two lines to the existing script, rename it something that fits with the proper verb-noun style, save it as a .psm1 in the right place, and then import it the usual way. I know there's the whole manifest file and I need to actually write proper help documentation for stuff, but that was a lot easier than I was expecting.
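A sketch of what that minimal conversion might look like; the function name and folder layout are hypothetical, and the "two lines" here are assumed to be the function wrapper and the export:

```powershell
# Save as e.g. Get-Widget\Get-Widget.psm1 under a folder on PSModulePath.
# Wrap the existing script body in a properly named function...
function Get-Widget {
    [CmdletBinding()]
    param([string]$Name)
    # ...existing script body goes here...
    "widget: $Name"
}
# ...and export it from the module
Export-ModuleMember -Function Get-Widget
```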
|
# ? Feb 16, 2017 17:02 |
|
Walked posted:Well worth learning. Seconded, modules are awesome and well worth learning if you use PowerShell frequently. Inspector_666 posted:Apparently all I had to do was add two lines to the existing script, rename it something that fits with the proper verb-noun style, save it as a .psm1 in the right place, and then import it the usual way. If you put your module into one of the locations referenced by the PSModulePath environment variable then you can take advantage of automatic module loading which negates the need to manually import the module: https://msdn.microsoft.com/en-us/library/dd878284(v=vs.85).aspx. Pile Of Garbage fucked around with this message at 17:07 on Feb 16, 2017 |
# ? Feb 16, 2017 17:03 |
|
|
|
Yeah modules are the way to go. Ensure they are well-formed modules with proper paths and manifests and then put them in a path that's included in the PSModulePath variable cheese-cube mentioned; that way you can import them by name only. I started separating out every function into its own .ps1 file (that matches the function name) for my modules, in folders that determine whether they get exported or not (like Public and Private). Then my .psm1 becomes boilerplate code like this: code:
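The original code block is missing; a common version of this dot-sourcing boilerplate looks roughly like:

```powershell
# Dot-source every .ps1 under Public and Private, then export only
# the Public function names (which match their file names)
$Public  = @(Get-ChildItem -Path $PSScriptRoot\Public\*.ps1 -ErrorAction SilentlyContinue)
$Private = @(Get-ChildItem -Path $PSScriptRoot\Private\*.ps1 -ErrorAction SilentlyContinue)

foreach ($import in ($Public + $Private)) {
    . $import.FullName
}

Export-ModuleMember -Function $Public.BaseName
```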
|
# ? Feb 16, 2017 19:23 |