Bunni-kat
May 25, 2010

Service Desk B-b-bunny...
How can-ca-caaaaan I
help-p-p-p you?
anyone have a quick script I can use to get the email address field out of AD groups, and throw it in a csv?

I know the end is just going to be |export emails.csv basically, and I have a bunch of get-ad commands, but I didn't see any that would get that specific field.

Collateral Damage
Jun 13, 2009

I think something like get-adgroupmember -identity <group> | get-aduser | select fullname, email | export-csv emails.csv would do it, but I'm at home and don't have an AD to test with.

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum

Collateral Damage posted:

I think something like get-adgroupmember -identity <group> | get-aduser | select fullname, email | export-csv emails.csv would do it, but I'm at home and don't have an AD to test with.
Get-ADGroupMember -Identity "Group Name" | Get-ADUser -Properties Mail | Where-Object {$_.Mail -ne $null} | Select-Object Name,Mail | Export-Csv -NoTypeInformation C:\path\to\emails.csv

You might have to worry about nested groups, but that should get you started.
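If nesting does turn out to matter, Get-ADGroupMember has a -Recursive switch that flattens nested membership. Untested sketch, same placeholder group name:

```powershell
# -Recursive expands nested groups and returns the leaf members.
# Filter to user objects first, since groups can also contain computers.
Get-ADGroupMember -Identity "Group Name" -Recursive |
    Where-Object { $_.objectClass -eq 'user' } |
    Get-ADUser -Properties Mail |
    Where-Object { $_.Mail -ne $null } |
    Select-Object Name,Mail |
    Export-Csv -NoTypeInformation C:\path\to\emails.csv
```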

Bunni-kat
May 25, 2010

Service Desk B-b-bunny...
How can-ca-caaaaan I
help-p-p-p you?

anthonypants posted:

Get-ADGroupMember -Identity "Group Name" | Get-ADUser -Properties Mail | Where-Object {$_.Mail -ne $null} | Select-Object Name,Mail | Export-Csv -NoTypeInformation C:\path\to\emails.csv

You might have to worry about nested groups, but that should get you started.

How would I make this recursive in a range, or say all of the ones I'm interested in start with XYZ, and do all those without running the command 30 times?

Inspector_666
Oct 7, 2003

benny with the good hair

Avenging_Mikon posted:

How would I make this recursive in a range, or say all of the ones I'm interested in start with XYZ, and do all those without running the command 30 times?

Is the range a list of given groups? If so, make an array containing them ($grouplist = "group1", "group2", "group3") and then put that one-liner in a foreach.

code:
$grouplist = "group1", "group2", "group3"

foreach ($group in $grouplist) {
     Get-ADGroupMember -Identity $group | Get-ADUser -Properties Mail | Where-Object {$_.Mail -ne $null} | 
     Select-Object Name,Mail | Export-Csv -NoTypeInformation C:\path\to\emails.csv
}
(linebreak for the sake of tables, but it wouldn't break anything anyway)

If you want all of the groups to dump into one CSV, add "-Append" to the end of the Export-Csv command; if you want a different CSV for each group, make the path "C:\path\to\$group.csv".

You can also automagically pull the groups by starting with a
code:
Get-ADGroup -filter {name -like "XYZ*"} |
and skipping the foreach. (I'm realizing that practically all of my scripts make use of at least one foreach :v:)
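Combined, that looks something like this (untested sketch; assumes the XYZ* groups are all you want):

```powershell
# Find every group starting with XYZ, expand the members, and export
# one row per user that actually has a mail attribute set.
Get-ADGroup -Filter {Name -like "XYZ*"} |
    Get-ADGroupMember |
    Get-ADUser -Properties Mail |
    Where-Object { $_.Mail -ne $null } |
    Select-Object Name,Mail |
    Export-Csv -NoTypeInformation C:\path\to\emails.csv
```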

Inspector_666 fucked around with this message at 23:52 on Feb 8, 2017

Zaepho
Oct 31, 2013

Avenging_Mikon posted:

anyone have a quick script I can use to get the email address field out of AD groups, and throw it in a csv?

I know the end is just going to be |export emails.csv basically, and I have a bunch of get-ad commands, but I didn't see any that would get that specific field.

Do you want the Group's email address or the users in the groups' email addresses?

If the latter, everyone else has you covered. If the former...

code:
get-ADGroup -Filter * -Properties Mail | where-object {$_.Name -like "XYZ*"} | select-object Name,Mail | Export-Csv -NoTypeInformation C:\path\to\emails.csv

Inspector_666
Oct 7, 2003

benny with the good hair

Zaepho posted:

Do you want the Group's email address or the users in the groups' email addresses?

If the latter, everyone else has you covered. If the former...

code:
get-ADGroup -Filter * -Properties Mail | where-object {$_.Name -like "XYZ*"} | select-object Name,Mail | Export-Csv -NoTypeInformation C:\path\to\emails.csv

If your environment has a lot of groups you're gonna want to filter in the Get-ADGroup cmdlet, not in a separate where-object. Even if it doesn't, you should still do it. Filter left!

Zaepho
Oct 31, 2013

Inspector_666 posted:

If your environment has a lot of groups you're gonna want to filter in the Get-ADGroup cmdlet, not in a separate where-object. Even if it doesn't, you should still do it. Filter left!
You're right. I'm terrible about filtering in the AD cmdlets. I can NEVER get it right without a few tries.

Inspector_666
Oct 7, 2003

benny with the good hair

Zaepho posted:

You're right. I'm terrible about filtering in the AD cmdlets. I can NEVER get it right without a few tries.

I'm the same, I always try to cram a -match or -include in there before I remember -like is about as fancy as I can get.

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum

Inspector_666 posted:

Is the range a list of given groups? If so, make an array containing them ($grouplist = "group1", "group2", "group3") and then put that one-liner in a foreach.

code:
$grouplist = "group1", "group2", "group3"

foreach ($group in $grouplist) {
     Get-ADGroupMember -Identity $group | Get-ADUser -Properties Mail | Where-Object {$_.Mail -ne $null} | 
     Select-Object Name,Mail | Export-Csv -NoTypeInformation C:\path\to\emails.csv
}
(linebreak for the sake of tables, but it wouldn't break anything anyway)

If you want all of the groups to dump into one CSV, add "-Append" to the end of the Export-Csv command; if you want a different CSV for each group, make the path "C:\path\to\$group.csv".

You can also automagically pull the groups by starting with a
code:
Get-ADGroup -filter {name -like "XYZ*"} |
and skipping the foreach. (I'm realizing that practically all of my scripts make use of at least one foreach :v:)
And to extend this you'd do
code:
Get-ADGroup -Filter {(Name -like "XYZ*") -or (Name -like "*ABC")}

Pile Of Garbage
May 28, 2007



Get-ADGroupMember is slow as balls and starts to go to poo poo if you've got groups with lots of members. It's quicker to pull the member attribute of the group object and then pass that to Get-ADObject:

code:
(Get-ADGroup -Identity 'Users' -Properties Member).Member | Get-ADObject
That is usually 3-5x faster than Get-ADGroupMember (you can check for yourself with Measure-Command). With some logic and a loop you can do it recursively, although that may end up being costlier: Get-ADGroupMember returns unique objects, so with the above you'd probably have to pipe to Select-Object -Unique, which is expensive.
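If you want to measure it in your own environment, something like this ('Users' is a placeholder group name):

```powershell
# Measure-Command swallows the pipeline output and returns a TimeSpan.
$slow = Measure-Command { Get-ADGroupMember -Identity 'Users' }
$fast = Measure-Command { (Get-ADGroup -Identity 'Users' -Properties Member).Member | Get-ADObject }
"{0:N0} ms vs {1:N0} ms" -f $slow.TotalMilliseconds, $fast.TotalMilliseconds
```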

Collateral Damage
Jun 13, 2009

anthonypants posted:

You might have to worry about nested groups, but that should get you started.
If you can install extra software, the Quest AD tools has a get-qadMemberof with an -Indirect switch that traverses nested groups.

Bunni-kat
May 25, 2010

Service Desk B-b-bunny...
How can-ca-caaaaan I
help-p-p-p you?

Zaepho posted:

Do you want the Group's email address or the users in the groups' email addresses?

If the latter everyone has you. If the former...

code:
get-ADGroup -Filter * -Properties Mail | where-object {$_.Name -like "XYZ*"} | select-object Name,Mail | Export-Csv -NoTypeInformation C:\path\to\emails.csv

Actually, it is the former. We do distro lists in AD, and right now don't need the members, just the email associated with the group, for one department. There's like 30 of the groups, each with a different email.

Thanks all, this is great info

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy
This page is super helpful if you're having trouble constructing AD cmdlet filters. Especially for confusing things like having to escape characters you wouldn't normally care about.

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy
For those of you in NYC: I'm presenting at the NYC PowerShell meetup at Microsoft in Times Square on Monday Feb 13.

If you're in the area come get some free pizza and heckle me.

Inspector_666
Oct 7, 2003

benny with the good hair

Briantist posted:

For those of you in NYC: I'm presenting at the NYC PowerShell meetup at Microsoft in Times Square on Monday Feb 13.

If you're in the area come get some free pizza and heckle me.

Ugh, Times Square. But hey, free pizza and the topics sound pretty interesting.

The Fool
Oct 16, 2003


Because Reasons I wrote this:
code:
get-msoluser -all | Where {$_.isLicensed -eq "True"} | Select DisplayName,@{Label="License";Expression={$_.Licenses.AccountSKUID | Where {$_ -contains "contoso:DESKLESSPACK" -or $_ -contains "contoso:ENTERPRISEPACK" -or $_ -contains "contoso:STANDARDPACK"}}}

beepsandboops
Jan 28, 2014
A developer's asked me to write a script that takes a look at a file on our IIS servers, checks the modified date, then sends an email to the developers

I have no problems with the date and email parts of it, but I'm really scratching my head about how to connect to all of these different servers. As far as I can tell, this file isn't publicly accessible so I have to connect to each of these servers separately.

The servers are in different environments and different states (not all of them are domain-joined), so I don't know how to specify and securely store several sets of credentials for each server

How do other people approach working with multiple machines like this?

The Fool
Oct 16, 2003


beepsandboops posted:

A developer's asked me to write a script that takes a look at a file on our IIS servers, checks the modified date, then sends an email to the developers

I have no problems with the date and email parts of it, but I'm really scratching my head about how to connect to all of these different servers. As far as I can tell, this file isn't publicly accessible so I have to connect to each of these servers separately.

The servers are in different environments and different states (not all of them are domain-joined), so I don't know how to specify and securely store several sets of credentials for each server

How do other people approach working with multiple machines like this?

Have service accounts with identical credentials on all of the machines, store non-sensitive connection info in a csv.

Iterate over the csv, do your thing.

If you want to fully automate, you can store the credentials as a secure string.
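One common way to do that (sketch only; the paths and $server are made up, and Export-Clixml protects the password with DPAPI, so only the same user on the same machine can read it back):

```powershell
# One-time, interactive: save the credential to disk.
Get-Credential | Export-Clixml -Path C:\scripts\svc-cred.xml

# In the scheduled script, running as the same user on the same machine:
$cred = Import-Clixml -Path C:\scripts\svc-cred.xml
Invoke-Command -ComputerName $server -Credential $cred -ScriptBlock {
    (Get-Item 'C:\inetpub\wwwroot\version.txt').LastWriteTime
}
```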

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

beepsandboops posted:

A developer's asked me to write a script that takes a look at a file on our IIS servers, checks the modified date, then sends an email to the developers

I have no problems with the date and email parts of it, but I'm really scratching my head about how to connect to all of these different servers. As far as I can tell, this file isn't publicly accessible so I have to connect to each of these servers separately.

The servers are in different environments and different states (not all of them are domain-joined), so I don't know how to specify and securely store several sets of credentials for each server

How do other people approach working with multiple machines like this?
Set up certificate-based auth for PowerShell remoting on each one of them.

Jack the Lad
Jan 20, 2009

Feed the Pubs

I currently do a weekly report (by hand) which uses 9 other reports as source data and takes a really long time to put together.

If I want to automate some or all of the process, is Powershell a good option or is that better handled with Excel macros or Python or something?

I can get the source files as csvs, and my dream is being able to just throw them into a folder and hit a button and get a pretty report.

Jack the Lad fucked around with this message at 10:31 on Feb 13, 2017

Dr. Arbitrary
Mar 15, 2006

Bleak Gremlin

Jack the Lad posted:

I currently do a weekly report (by hand) which uses 9 other reports as source data and takes a really long time to put together.

If I want to automate some or all of the process, is Powershell a good option or is that better handled with Excel macros or Python or something?

I can get the source files as csvs, and my dream is being able to just throw them into a folder and hit a button and get a pretty report.

It depends on how your data is set up, but I have written scripts that do something similar.

You can convert the csv files into powershell objects, and then pick through them, saving what you want as a custom powershell object.
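As a rough sketch of that shape (folder and column names invented for illustration):

```powershell
# Merge every csv in the drop folder into one set of custom objects,
# tagging each row with the report it came from.
$rows = Get-ChildItem C:\reports\incoming -Filter *.csv | ForEach-Object {
    $source = $_.BaseName
    Import-Csv $_.FullName | ForEach-Object {
        [PSCustomObject]@{
            Source = $source
            Name   = $_.Name   # assumes the source csvs share these columns
            Value  = $_.Value
        }
    }
}
$rows | Export-Csv -NoTypeInformation C:\reports\weekly-report.csv
```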

thebigcow
Jan 3, 2001

Bully!
Yes, but:

Depending on the size of all this, the complexity of the report you want, and whether you want to look back at these things, you may want to look into shoveling your data into MSSQL with PowerShell and using its Reporting Services instead.




edit: the first time this works you're going to have a weird grin on your face, and then kick yourself for not doing it months ago.

Collateral Damage
Jun 13, 2009

And if you don't have a SQL Server already you can download SQL Server Express with Advanced Services, which includes Reporting Services. You want to grab the SQLEXPRADV_x64_ENU.exe installer.

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

Briantist posted:

For those of you in NYC: I'm presenting at the NYC PowerShell meetup at Microsoft in Times Square on Monday Feb 13.

If you're in the area come get some free pizza and heckle me.
Last minute reminder

Jowj
Dec 25, 2010

My favourite player and idol. His battles with his wrists mirror my own battles with the constant disgust I feel towards my zerg bugs.
Random Bitch: cmdlets that require multiple other cmdlets in order to be functional. Take a look at this:

code:
#schtasks.exe is used to create the scheduled task initially but I removed the line to avoid tablebreaking

# schtasks.exe cannot modify specific battery arguments without importing XML (not gonna do.dat). Modify it here:
$settings = New-ScheduledTaskSettingsSet -allowStartIfonBatteries -dontStopIfGoingOnBatteries 
$currentUser = [Security.Principal.WindowsIdentity]::GetCurrent().Name
# SchTasks.exe cannot specify a user for the LOGON schedule - it applies to all users. Modify it here:
$trigger = New-ScheduledTaskTrigger -AtLogon -User $currentUser
# SchTasks.exe cannot specify an action with long arguments (maxes out at like 200something chars). Modify it here: 
$action = New-ScheduledTaskAction -Execute "$PSHome\Powershell.exe" -Argument "-File `"$tempRestartScriptPath`""
Set-ScheduledTask -taskname $taskName -settings $settings -action $action -trigger $trigger
First, to know how I got here, I am working on a project with a friend who had already written most of this. However, we were running into a problem where this scheduled task we were creating wouldn't run. Turns out this is because I'm working from my laptop and it wasn't plugged in.

"Well, that should be easy to fix, just allow the scheduled task to start when on batteries and make sure to also allow it to continue when on batteries!"

Well sure, but through schtasks.exe you can't actually set that through the CLI; you'd have to import a loving XML file in and I wasn't gonna do that for just this one problem. So I start looking into the *-scheduledtask* cmdlets (side bitch: Set-scheduledtask and new-scheduledtasksettingsset are names that are FAR too similar gently caress you microsoft).

The long and short of it is, if I want to configure this in powershell I have to first define a variable = new-scheduledtasksettingSet with the proper parameters, and then use that cmdlet as a parameter itself in set-scheduledTask. Which makes it frustrating to figure out when you're approaching a new problem. Turns out microsoft is doing a lot of this sort of thing though, as a bunch of AzureCLI poo poo is the same way (you can't just run like new-azureVM, you have to pass it a bunch of parameters that are themselves cmdlets in order to configure things like memory, network adapters, etc).

I'm happy now though, this was the last bug to fix and now the project builds all the way through :3

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum
You can make a scheduled task once and then export the XML though.
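Roughly like this (task name and path are placeholders):

```powershell
# Export the task you built once, then re-create it elsewhere from the XML.
Export-ScheduledTask -TaskName "MyTask" | Out-File C:\tasks\MyTask.xml
schtasks.exe /Create /TN "MyTask" /XML C:\tasks\MyTask.xml
```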

thebigcow
Jan 3, 2001

Bully!
Write it all on one line for giggles.

Jowj
Dec 25, 2010

My favourite player and idol. His battles with his wrists mirror my own battles with the constant disgust I feel towards my zerg bugs.

anthonypants posted:

You can make a scheduled task once and then export the XML though.

You absolutely can, but I was irritated by having to do that for this one thing so I was trying to work around it. I think if I had started from the beginning that's the way I would've gone. This way I learned something, though!

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

Jowj posted:

Random Bitch: cmdlets that require multiple other cmdlets in order to be functional. Take a look at this:

code:
#schtasks.exe is used to create the scheduled task initially but I removed the line to avoid tablebreaking

# schtasks.exe cannot modify specific battery arguments without importing XML (not gonna do.dat). Modify it here:
$settings = New-ScheduledTaskSettingsSet -allowStartIfonBatteries -dontStopIfGoingOnBatteries 
$currentUser = [Security.Principal.WindowsIdentity]::GetCurrent().Name
# SchTasks.exe cannot specify a user for the LOGON schedule - it applies to all users. Modify it here:
$trigger = New-ScheduledTaskTrigger -AtLogon -User $currentUser
# SchTasks.exe cannot specify an action with long arguments (maxes out at like 200something chars). Modify it here: 
$action = New-ScheduledTaskAction -Execute "$PSHome\Powershell.exe" -Argument "-File `"$tempRestartScriptPath`""
Set-ScheduledTask -taskname $taskName -settings $settings -action $action -trigger $trigger
First, to know how I got here, I am working on a project with a friend who had already written most of this. However, we were running into a problem where this scheduled task we were creating wouldn't run. Turns out this is because I'm working from my laptop and it wasn't plugged in.

"Well, that should be easy to fix, just allow the scheduled task to start when on batteries and make sure to also allow it to continue when on batteries!"

Well sure, but through schtasks.exe you can't actually set that through the CLI; you'd have to import a loving XML file in and I wasn't gonna do that for just this one problem. So I start looking into the *-scheduledtask* cmdlets (side bitch: Set-scheduledtask and new-scheduledtasksettingsset are names that are FAR too similar gently caress you microsoft).

The long and short of it is, if I want to configure this in powershell I have to first define a variable = new-scheduledtasksettingSet with the proper parameters, and then use that cmdlet as a parameter itself in set-scheduledTask. Which makes it frustrating to figure out when you're approaching a new problem. Turns out microsoft is doing a lot of this sort of thing though, as a bunch of AzureCLI poo poo is the same way (you can't just run like new-azureVM, you have to pass it a bunch of parameters that are themselves cmdlets in order to configure things like memory, network adapters, etc).

I'm happy now though, this was the last bug to fix and now the project builds all the way through :3

To be clear, the parameter values you pass are not cmdlets, they are objects that are generated by cmdlets.

This starts to make a lot of sense when the number and scope of objects starts to become really huge. The objects are instances of classes, and the classes handle validation of the options. The class determines which options are valid together, which is something that might need to be evaluated at runtime.

In PowerShell this is possible with dynamic parameters, but they can be flaky, and it gets messy when you have lots of them that can't all be used together. Encapsulation just makes sense at that point.

Sometimes though, it's just the fact that the cmdlets are thin wrappers around existing APIs and classes (*cough* WSUS cmdlets).

If your use cases are narrow enough in scope and used often enough, you should write a function that takes only the parameters you need to specify that wraps the whole process.

Jowj
Dec 25, 2010

My favourite player and idol. His battles with his wrists mirror my own battles with the constant disgust I feel towards my zerg bugs.

Briantist posted:

To be clear, the parameter values you pass are not cmdlets, they are objects that are generated by cmdlets.

This starts to make a lot of sense when the number and scope of objects starts to become really huge. The objects are instances of classes, and the classes handle validation of the options. The class determines which options are valid together, which is something that might need to be evaluated at runtime.

In PowerShell this is possible with dynamic parameters, but they can be flaky, and it gets messy when you have lots of them that can't all be used together. Encapsulation just makes sense at that point.

Sometimes though, it's just the fact that the cmdlets are thin wrappers around existing APIs and classes (*cough* WSUS cmdlets).

Yeah this is a really good point. It makes *logical* sense when I think about it, it's just frustrating when I'm trying to learn how to do something. However, can you elaborate more on this?

quote:

If your use cases are narrow enough in scope and used often enough, you should write a function that takes only the parameters you need to specify that wraps the whole process.

I'm not sure I'm following what you mean here. Do you have an example I could look at?

thebigcow
Jan 3, 2001

Bully!
You can make functions that themselves behave like cmdlets. https://technet.microsoft.com/en-us/library/hh360993.aspx

If you really wanted to, you could make a function that takes what you feel is important in a scheduled task as parameters and then does all the work for you. The problem with this is that after you learn the scheduled task cmdlets well enough to make a function to handle them you may feel like you don't need one at all.
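For instance, something like this (illustrative sketch only; the name and parameters are invented, and it assumes the task already exists, matching the schtasks.exe-then-modify flow above):

```powershell
function Set-MyLogonTask {
    # Hypothetical wrapper: hides the settings/trigger/action plumbing
    # behind the two parameters that actually vary.
    param(
        [Parameter(Mandatory)][string]$TaskName,
        [Parameter(Mandatory)][string]$ScriptPath
    )
    $settings = New-ScheduledTaskSettingsSet -AllowStartIfOnBatteries -DontStopIfGoingOnBatteries
    $user     = [Security.Principal.WindowsIdentity]::GetCurrent().Name
    $trigger  = New-ScheduledTaskTrigger -AtLogon -User $user
    $action   = New-ScheduledTaskAction -Execute "$PSHome\powershell.exe" -Argument "-File `"$ScriptPath`""
    Set-ScheduledTask -TaskName $TaskName -Settings $settings -Action $action -Trigger $trigger
}
```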

Inspector_666
Oct 7, 2003

benny with the good hair

thebigcow posted:

You can make functions that themselves behave like cmdlets. https://technet.microsoft.com/en-us/library/hh360993.aspx

When you do stuff like this, is the best way to get it into your workflow to add a Set-Alias line to your profile.ps1 file? That's what I've been doing with my scripts but it seems hack-y.

Jowj
Dec 25, 2010

My favourite player and idol. His battles with his wrists mirror my own battles with the constant disgust I feel towards my zerg bugs.

thebigcow posted:

You can make functions that themselves behave like cmdlets. https://technet.microsoft.com/en-us/library/hh360993.aspx

If you really wanted to, you could make a function that takes what you feel is important in a scheduled task as parameters and then does all the work for you. The problem with this is that after you learn the scheduled task cmdlets well enough to make a function to handle them you may feel like you don't need one at all.

OH, yeah I already do that. I thought there was something more complex I'm missing. Thanks for clarifying! That snippet of code I posted earlier is actually within a function, I just sanitized it. I'll post when I get home if I remember to show the full thing.

The Fool
Oct 16, 2003


Inspector_666 posted:

When you do stuff like this, is the best way to get it into your workflow to add a Set-Alias line to your profile.ps1 file? That's what I've been doing with my scripts but it seems hack-y.

Make your own modules.

Inspector_666
Oct 7, 2003

benny with the good hair

The Fool posted:

Make your own modules.

Welp, guess it's time to learn a thing.

Walked
Apr 14, 2003

Inspector_666 posted:

Welp, guess it's time to learn a thing.

Well worth learning.

Sapien PowerShell Studio actually has pretty good boilerplate code / module projects for learning what those look like. I think it's a 30 or 60 day trial, too. Makes getting started with modules really easy.

Inspector_666
Oct 7, 2003

benny with the good hair

Walked posted:

Well worth learning.

Sapien PowerShell Studio actually has pretty good boilerplate code / module projects for learning what those look like. I think it's a 30 or 60 day trial, too. Makes getting started with modules really easy.

Apparently all I had to do was add two lines to the existing script, rename it something that fits with the proper verb-noun style, save it as a .psm1 in the right place, and then import it the usual way.

I know there's the whole manifest file and I need to actually write proper help documentation for stuff, but that was a lot easier than I was expecting.
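For anyone following along, the whole thing is roughly this (names invented):

```powershell
# Get-Widget.psm1 -- an ordinary script renamed to .psm1,
# with its function explicitly exported.
function Get-Widget {
    [CmdletBinding()]
    param([string]$Name)
    "Widget: $Name"   # stand-in for the original script body
}
Export-ModuleMember -Function Get-Widget
```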

Pile Of Garbage
May 28, 2007



Walked posted:

Well worth learning.

Seconded, modules are awesome and well worth learning if you use PowerShell frequently.

Inspector_666 posted:

Apparently all I had to do was add two lines to the existing script, rename it something that fits with the proper verb-noun style, save it as a .psm1 in the right place, and then import it the usual way.

I know there's the whole manifest file and I need to actually write proper help documentation for stuff, but that was a lot easier than I was expecting.

If you put your module into one of the locations referenced by the PSModulePath environment variable then you can take advantage of automatic module loading which negates the need to manually import the module: https://msdn.microsoft.com/en-us/library/dd878284(v=vs.85).aspx.

Pile Of Garbage fucked around with this message at 17:07 on Feb 16, 2017

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy
Yeah, modules are the way to go. Ensure they are well-formed modules with proper paths and manifests and then put them in a path that's included in the PSModulePath variable cheese-cube mentioned, that way you can import them by name only.

I started separating out every function into its own .ps1 file (that matches the function name) for my modules, in folders that determine whether they get exported or not (like Public and Private). Then my .psm1 becomes boilerplate code like this:

code:
#Requires -Version 4.0

$Subs = @(
    @{
        Path = 'Private'
        Export = $false
        Recurse = $false
    } ,

    @{
        Path = 'Public'
        Export = $true
        Recurse = $true
    }
) 

$Subs | ForEach-Object -Process {
    $sub = $_
    $PSScriptRoot | Join-Path -ChildPath $sub.Path |
    Get-ChildItem -Filter *-*.ps1 -Recurse:$sub.Recurse -ErrorAction Ignore | ForEach-Object -Process {
        $file = $_  # capture the file here; inside catch, $_ is the error record
        try {
            . $file.FullName
            if ($sub.Export -or $Global:__MyModuleName_Export_All) {
                Export-ModuleMember -Function $file.BaseName
            }
        } catch {
            Write-Error -Message "Could not import $($file.FullName)"
        }
    }
}
