Jelmylicious
Dec 6, 2007
Buy Dr. Quack's miracle juice! Now with patented H-twenty!

adaz posted:

You need to set inheritance and propagation flags when you create your new access rule.

Right. It's been a while since I did this and wasn't near my domain. Totally forgot.
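
For anyone who finds this exchange later, a rule with both flags set looks roughly like this (untested sketch; the group, rights and path are placeholders):

code:
# Applies to the folder itself, subfolders and files (ContainerInherit + ObjectInherit)
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule(
    "CONTOSO\SomeGroup", "Modify",
    "ContainerInherit, ObjectInherit",   # InheritanceFlags
    "None",                              # PropagationFlags
    "Allow")
$acl = Get-Acl C:\Shares\Example
$acl.AddAccessRule($rule)
Set-Acl C:\Shares\Example $acl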


Wicaeed
Feb 8, 2005
Before I spend too much time writing a new script, does anyone know of a pre-made Powershell script that I could use to pull a list of user-created scheduled tasks off of a server and tell me what credential and/or username it is set to run as?

adaz
Mar 7, 2009

Wicaeed posted:

Before I spend too much time writing a new script, does anyone know of a pre-made Powershell script that I could use to pull a list of user-created scheduled tasks off of a server and tell me what credential and/or username it is set to run as?

I'm pretty sure it's a simple WMI query to win32_scheduledjob... which would be wrong. Win32_ScheduledJob, for some incredibly stupid reason, excludes any jobs created with the UI. Good job, Microsoft.

Check out this article:
http://www.windowsitpro.com/article/windows-powershell/how-to-powershell-scheduled-tasks-140978
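
If you'd rather not touch WMI at all, another route is shelling out to schtasks.exe and parsing its verbose CSV output, which does include UI-created tasks — an untested sketch (the server name is a placeholder):

code:
# "Run As User" is one of the columns in schtasks' verbose CSV output
schtasks /query /s SERVER01 /v /fo csv |
    ConvertFrom-Csv |
    Select-Object TaskName,"Run As User" |
    Sort-Object TaskName -Unique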

Korlac
Nov 16, 2006

A quintessential being known throughout the Realm as the 'Dungeon Master'. :rolldice:
Here's another diagnostic tool for any Exchange administrators out there. Sometimes you need to look up events across several servers; this will let you choose which server role to query (including all Exchange servers), which event log you wish to parse, the event level, and finally the Event ID. Once all that is selected through the PowerShell menu, it will generate a text file on your desktop with all the matching results.

code:
Function Get-EventLogs
	{
	$SRnumber = read-host -prompt "What is the SRX number for your incident?"
	Write-Host -fore green "Which Server would you like to search?"
	Write-Host -fore yellow "M: Mailbox Servers."
	Write-Host -fore yellow "H: Hub Transport Servers."
	Write-Host -fore yellow "C: Client Access Servers."
	Write-Host -fore yellow "A: All Exchange Servers."
	$a = Read-Host "Select M, H, C, or A"
	Write-Host " "
		Switch ($a)
		{
			M {$Servers = Get-MailboxServer}
			H {$Servers = Get-TransportServer}
			C {$Servers = Get-ClientAccessServer}
			A {$Servers = Get-ExchangeServer}
		}
	Write-Host -fore green "Which Event Log would you like to search?"
	Write-Host -fore yellow "A: Application logs."
	Write-Host -fore yellow "S: Security logs."
	Write-Host -fore yellow "E: Setup logs."
	Write-Host -fore yellow "Y: System logs."
		$b = Read-Host "Select A, S, E, or Y"
	Write-Host " "
		Switch ($b)
		{
			A {$LogName = "Application"}
			S {$LogName = "Security"}
			E {$LogName = "Setup"}
			Y {$LogName = "System"}
		}
	Write-Host -fore green "Which event level are you searching for?"
	Write-Host -fore yellow "C: Critical."
	Write-Host -fore yellow "W: Warning."
	Write-Host -fore yellow "V: Verbose."
	Write-Host -fore yellow "E: Error."
	Write-Host -fore yellow "I: Information."
			$c = Read-Host "Select C, W, V, E, or I"
		Write-Host " "
			Switch ($c)
			{
				C {$EntryType = "Critical"}
				W {$EntryType = "Warning"}
				V {$EntryType = "Verbose"}
				E {$EntryType = "Error"}
				I {$EntryType = "Information"}
			}
	$EventID = read-host -prompt "What is the EventID number?"
	foreach ($server in $servers)
			{
			$Content = get-eventlog -ComputerName $server -LogName $LogName -EntryType $EntryType | Where {$_.EventID -eq $EventID} | FL
			$Content | Out-File ~\desktop\$SRnumber.txt -append -width 2000
			}
	}

Jelmylicious
Dec 6, 2007
Buy Dr. Quack's miracle juice! Now with patented H-twenty!

Korlac posted:

Here's another diagnostic tool for any Exchange administrators out there. Sometimes you need to look up events across several servers; this will let you choose which server role to query (including all Exchange servers), which event log you wish to parse, the event level, and finally the Event ID. Once all that is selected through the PowerShell menu, it will generate a text file on your desktop with all the matching results.


That looks really nice, but it doesn't account for fat-fingering. You can easily fix that by using the switch's default case and a recursive function:

code:
Function Select-Mailserver
{
    Write-Host -fore green "Which Server would you like to search?"
    Write-Host -fore yellow "M: Mailbox Servers."
    Write-Host -fore yellow "H: Hub Transport Servers."
    Write-Host -fore yellow "C: Client Access Servers."
    Write-Host -fore yellow "A: All Exchange Servers."
    $a = Read-Host "Select M, H, C, or A"
    Write-Host " "
    	Switch ($a)
	{
		M {return Get-MailboxServer}
		H {return Get-TransportServer}
		C {return Get-ClientAccessServer}
		A {return Get-ExchangeServer}
                default {Write-Host -fore red "Wrong Input, try again."; Select-MailServer}
	}
}

$servers = Select-MailServer
The next step in upgrading your script would be to make it into a full cmdlet, giving it a helpfile and letting it take arguments. I'll see if I can write it out for you, if I have time later.

Jelmylicious
Dec 6, 2007
Buy Dr. Quack's miracle juice! Now with patented H-twenty!
Alright, I had fun making this into a real cmdlet. I mainly wanted to explain autohelp and parameters, and your script was a good excuse to do it. Feel free to ask about any of it, of course. First the code, then I will explain what I did. I kept the functionality of your script the same.

code:
<#  
.SYNOPSIS  
    This script reads eventlogs of exchange servers by role or eventlog level. 
.DESCRIPTION  
    This script reads eventlogs of exchange servers. You can filter by server role and/or event level. Default will output to a file on your desktop.
.NOTES    
    Author:              Korlac & Jelmylicious
    First Creation Date: DATE UNKNOWN
    Last Edited Date:    15-May-2012
    Created For:         SomethingAwful

    ChangeLog: 
    DATE UNKNOWN: Korlac: Functional creation of script
    15-May-2012: Jelmylicious: Added parameters, helpfile and basic error control.
.LINK  
    Sources used to create this script:
    Base Script:
    [url]http://forums.somethingawful.com/showthread.php?threadid=3286440&pagenumber=13#post403624126[/url]
.EXAMPLE  
    Get-ExchangeEventlogs
    
    Description
    
    -----------
    This command prompts for all variables and will save the output to your desktop in a text-file named after the SRX number you provide.
.EXAMPLE  
    Get-ExchangeEventlogs.ps1 -Outfile ./DNSErrors.txt -LogName "System" -EntryType "Warning" -EventID 1014
    
    Description
    
    -----------
    This will prompt for server role, then find all warning events with ID 1014 in the systemlog, which are DNS errors. All entries will be saved in DNSErrors.txt in the current directory.
.PARAMETER Outfile  
    Output filename.txt will output to a file/location of your choice.
.PARAMETER Servertype  
    Select servers by Exchange role. Possible values: Mailbox, Hub, Client or All
.PARAMETER LogName  
    Select Eventlog to probe. Possible values: Application, Security, Setup, System
.PARAMETER Entrytype
    Select the type of entry you want to collect. Possible values: Error, Information, FailureAudit, SuccessAudit and Warning
.PARAMETER EventID
    Event ID of the event you are looking for. Simple number.    
#> 

param 
    (  
    [string] $Outfile 
    ,
    [parameter(HelpMessage="Possible options: Mailbox, Hub, Client or ALL")]
    [ValidateSet("Mailbox", "Hub", "Client", "All")]
    [string] $Servertype
    ,
    [parameter(HelpMessage="Possible options: Application, Security, Setup, System")]
    [ValidateSet("Application", "Security", "Setup", "System")]
    [string] $LogName
    ,
    [parameter(HelpMessage="Possible options: Error, Information, FailureAudit, SuccessAudit and Warning")]
    [ValidateSet("Error", "Information", "FailureAudit", "SuccessAudit", "Warning")]
    [string] $EntryType
    ,
    [int] $EventID
    )


Function Select-Mailserver
{
    param ($a)
    if ($a -eq $null)
    {
        Write-Host -fore green "Which Server would you like to search?"
	    Write-Host -fore yellow "M: Mailbox Servers."
	    Write-Host -fore yellow "H: Hub Transport Servers."
	    Write-Host -fore yellow "C: Client Access Servers."
	    Write-Host -fore yellow "A: All Exchange Servers."
        $a = Read-Host "Select M, H, C, or A"
	    Write-Host " "
    }
	Switch ($a[0])
	{
		M {return Get-MailboxServer}
		H {return Get-TransportServer}
		C {return Get-ClientAccessServer}
		A {return Get-ExchangeServer}
        default {Write-Host -fore red "$a is not one of the options, please try again."; Select-Mailserver}
	}
}

Function Select-Eventlog
{
    Write-Host -fore green "Which Event Log would you like to search?"
    Write-Host -fore yellow "A: Application logs."
    Write-Host -fore yellow "S: Security logs."
    Write-Host -fore yellow "E: Setup logs."
    Write-Host -fore yellow "Y: System logs."
	$b = Read-Host "Select A, S, E, or Y"
    Write-Host " "
	Switch ($b)
	{
		A {return "Application"}
		S {return "Security"}
		E {return "Setup"}
		Y {return "System"}
        default {Write-Host -fore red "$b is not one of the options, please try again."; Select-Eventlog}
	}
}
function Select-Entrytype
{
    Write-Host -fore green "Which event level are you searching for?"
    Write-Host -fore yellow "E: Error."
    Write-Host -fore yellow "W: Warning."
    Write-Host -fore yellow "F: FailureAudit."
    Write-Host -fore yellow "S: SuccessAudit."
    Write-Host -fore yellow "I: Information."
	$c = Read-Host "Select E, W, F, S, or I"
	Write-Host " "
	Switch ($c)
	{
		E {return "Error"}
		W {return "Warning"}
		F {return "FailureAudit"}
		S {return "SuccessAudit"}
		I {return "Information"}
        default {Write-Host -fore red "$c is not one of the options, please try again."; Select-Entrytype}
	}
}

######################
# Start of main body #
######################

#Gather missing information
#The [string] parameters default to "", not $null, so test for emptiness instead
if (-not $Servertype) 
    {$Servers = Select-Mailserver}
Else
    {$Servers = Select-Mailserver $Servertype} 

if (-not $LogName) 
    {$LogName = Select-Eventlog}
if (-not $EntryType) 
    {$EntryType = Select-Entrytype}
if (-not $OutFile)  
    {$SRnumber = read-host -prompt "What is the SRX number for your incident?"
    $OutFile = "~\desktop\$SRnumber.txt"}
if (-not $EventID)   
    {$EventID = read-host -prompt "What is the EventID number?"}

foreach ($server in $servers)
	{
	$Content = get-eventlog -ComputerName $server -LogName $LogName -EntryType $EntryType | Where {$_.EventID -eq $EventID} | FL
	$Content | Out-File $OutFile -append -width 2000
	}
First off, you had some illegal values for get-eventlog's -EntryType parameter. If we look at its helpfile, you will see the following:

code:
PS D:\> get-help Get-EventLog -Parameter Entrytype

-EntryType <string[]>
    Gets only events with the specified entry type. Valid values are Error, Information, FailureAudit, SuccessAudit, and Warning. The default is all events.

    Required?                    false
    Position?                    named
    Default value                All events
    Accept pipeline input?       false
    Accept wildcard characters?  false
So, I fixed that part. I also changed the switch in Select-Mailserver to switch on $a[0], so I could simply pass the full name and not have to write extra switch options. Since the first letter is the thing being switched on, and a string is just an array of characters, $a[0] was the easiest adjustment for that.
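
To see what that indexing does:

code:
PS D:\> $a = "Mailbox"
PS D:\> $a[0]
M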

Apart from putting in the errorhandling I talked about in the previous post, I added in the following:

PARAMETERS
To make this script accept parameters, I included a param() block. I will explain using one parameter as an example:

code:
[parameter(HelpMessage="Possible options: Mailbox, Hub, Client or ALL")]
I only know that you can add a help message here; I've never actually had the help message come up. I just always include it, because I am anal.
code:
[ValidateSet("Mailbox", "Hub", "Client", "All")]
This declares all the legal values for the parameter, so no illegal options can be passed in; it will error out on anything else. Because we switch on the first letter, we could make this less stringent, for instance by also accepting just the first letters as options, but I didn't do that. Note: this isn't case sensitive.
code:
[string] $Servertype
The actual name of your parameter. Just from this name, you can call your script with -ServerType as an option. I explicitly defined the object type only so that get-help will show it.

You could do more fun stuff with parameters, like setting default values. A fun example for that is:

code:
[string] $LDAP = ([ADSI]"").distinguishedname
Which will default to your current domain root, and which you can override by calling the script with -LDAP "DC=somethingawful,DC=com". The output of ([ADSI]"").distinguishedname is in the form DC=contoso,DC=com

Other fun you can have with parameters is making them required, giving them positions (so you don't have to specify the parameter name) or making parameter sets, where only certain combinations of parameters do something. There is more fun to be had, of course.

Helpfile
And now for the awesome part: get-help works for this script! The only thing you need to get that going is to include that huge comment block at the beginning, in that syntax. I always use a base comment block as the start of my scripts, where I just fill in the specifics as I go. I suggest anyone writing big scripts do the same! How awesome is it if you can just tell your coworkers to RTFM in a Windows environment! (NB: man is an alias for get-help, for added fun)
So, to break it down, this is part of my base script, an adaptation of this example:
code:
<#  
.SYNOPSIS  
    A summary of what this script does  
    In this case, this script documents the auto-help text in PSH CTP 3  
    Appears in all basic, -detailed, -full, -examples  
.DESCRIPTION  
    A more in depth description of the script 
    Should give script developer more things to talk about  
    Hopefully this can help the community too  
    Becomes: "DETAILED DESCRIPTION"  
    Appears in basic, -full and -detailed  
.NOTES    
    Author:              Your name here 
    First Creation Date: not set
    Last Edited Date:    not set
    Created For:         not set
    ChangeLog:
    Date: Author: Changes
    TODO:
    Parts not yet implemented, Ideas for the future.
.LINK  
    The sources I used to create this script:
    [url]http://pshscripts.blogspot.com/2008/12/get-autohelpps1.html[/url]
    This is the source of the help part of my base script
.EXAMPLE  
    The first example - just text documentation  
    You should provide a way of calling the script, plus expected output  
    Appears in -detailed and -full  
.EXAMPLE  
    The second example - more text documentation  
    This would be an example calling the script differently. You can have lots  
    and lots, and lots of examples if this is useful.  
    Appears in -detailed and -full  
.PARAMETER foo  
   The .PARAMETER sections in the script are used to derive the contents of PARAMETERS in Get-Help output, which   
   documents the parameters in the param block. Each section takes a value (in this case foo,  
   the name of the first actual parameter), and only appears if there is a parameter of that name in the param block. Having a section for a parameter that does not exist generates no output at all.  
   Appears in -det, -full (with more info than in -det) and -Parameter (need to specify the parameter name)  
.PARAMETER bar  
   Example of a parameter definition for a parameter that does not exist.  
   Does not appear at all.  
#> 
I hope that was long enough for people.

Glans Dillzig
Nov 23, 2011

:justpost::justpost::justpost::justpost::justpost::justpost::justpost::justpost:

knickerbocker expert
So the decree came down from on high: the CEO wants a quarterly report of every single share on our network, along with their permissions. Maintaining a list of the shares is easy, but not so much the permissions. Is there a way to write a Powershell script to aggregate all this info?

The final product is going to be uploaded to Sharepoint (:suicide:), but I think for now it just needs to be dumped to Notepad. More of a proof-of-concept thing, I suppose.

I've looked at Dumpsec and this tool already, and I'm not sure whether either would work better, or if Powershell would be sufficient.

Jelmylicious
Dec 6, 2007
Buy Dr. Quack's miracle juice! Now with patented H-twenty!
get-acl will do what you want, but you might have to parse the output a bit to make it readable.
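
For a single share, something like this (untested; the path is a placeholder):

code:
# Expand the Access property into one row per access rule
(Get-Acl \\fileserver\share).Access |
    Format-Table IdentityReference,FileSystemRights,AccessControlType -AutoSize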

adaz
Mar 7, 2009

Walter_Sobchak, do you mean every SHARE or "every folder in each share"? Does it need to recurse? And do you mean share permissions, or are you (please lord) following best practice and setting Everyone Full Control on the shares and doing the actual permissions via NTFS?

and Jelmylicious that's some mighty fine function craftin'

Jelmylicious
Dec 6, 2007
Buy Dr. Quack's miracle juice! Now with patented H-twenty!
I just realized: I did not specify in my helpfile that this script requires the exchange modules. This would be one of the more important things to put in there. Whoops :downs:

And for Walter_Sobchak: indeed, what exactly do you want to do? Do you want to check the share permissions too, in case those aren't right? How many levels deep will the permissions be unique? Do you need to inventory who is a member of the security groups as well? Want to output it in a pretty Excel file for management to swoon over?

Anyway, a simple option would be:

code:
$List = get-content ListOfComputers.txt
foreach ($computer in $list)
{
   Get-WmiObject win32_share -ComputerName $Computer | get-acl | fl
}
This will error out on the IPC$ share, since it is not a real path, so you'd need to filter that out. You might also want to filter out the admin shares, unless you suspect something is wrong there or want a complete list. Note that the output isn't pretty: special permissions come out as numeric values that are hard for a human to parse. But this could get you started in the right direction.

e: Why didn't I look at this earlier? If you convert it to hex, it gets a lot more readable: 268435456 is GENERIC_ALL, and 268435456 in hex is 0x10000000.
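
For example:

code:
PS D:\> '0x{0:X}' -f 268435456
0x10000000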

Jelmylicious fucked around with this message at 11:20 on May 16, 2012

Jelmylicious
Dec 6, 2007
Buy Dr. Quack's miracle juice! Now with patented H-twenty!

Walter_Sobchak posted:

So the decree came down from on high- The CEO wants a quarterly report of every single share on our network, along with their permissions. Now maintaining a list of the shares is easy, but not so much with the permissions. Is there a way to write a Powershell script to have it aggregate all this info?

Another big post ahoy! First off, as you can see in the comment notes, there is a lot more you could do with this that isn't implemented yet. I don't filter out admin shares, but I do filter out shares that are unreachable. I chose to output one access right per line, to keep things flat.
The conversion table for making the rights human readable is definitely incomplete, but it is easy to extend. I know this script might seem big and daunting for a first-timer, but that is because I made it into a full script that includes a helpfile and can be run with parameters. Save it as Get-ShareRights.ps1 and you can run it from the command line, or run it as a scheduled job. Then have something compare previous results and you have quick and dirty rights auditing! But I'm digressing. First the full script, after that, some explanation.

code:
<#  
.SYNOPSIS  
    Exports all filesystem rights of all shares on given computers into CSV file
.DESCRIPTION  
    Exports all filesystem rights of all shares on given computers into a CSV file. This script assumes the current account has the right to access these shares. Also, this script does not look at share permissions, just NTFS rights.
.NOTES    
    Author:              Jelmylicious  
    First Creation Date: 15-May-2012
    Last Edited Date:    15-May-2012
    Created For:         SomethingAwful
    TODO:
    -Create a better conversion table for accessrights.
    -Make it accept several computernames
    -Make it go one level deeper
    -Make it accept a single share
    -Make it also accept a list of shares, rather than computers
    -Have the option to save output per share or per host.
    -Make it accept a filterlist, to filter out adminshares
.LINK  
    Sources Used:
    For the custom objects:
    adaz' awesome post at [url]http://forums.somethingawful.com/showthread.php?threadid=3286440&userid=148148&perpage=40&pagenumber=2#post394234303[/url]
    For the rights conversion table:
    [url]http://msdn.microsoft.com/en-us/library/windows/desktop/aa369774[/url](v=vs.85).aspx
.EXAMPLE  
    Get-ShareRights

    Description
    
    -----------
    This command gets all shares from localhost and enumerates all rights in csv file.
.EXAMPLE
    Get-ShareRights -Computers "Fileserver" -DontConvert -OutFile ".\output.csv"

    Description
    
    -----------
    Probe server named fileserver, while not converting to human readable strings and outputting to output.csv in current working directory.
.PARAMETER DontConvert  
    When this switch is added, the strings that get-acl returns aren't converted into human readable format. Default is to convert.
.PARAMETER Computers
    List of computers that need to be scanned. Default is localhost.
.PARAMETER OutFile  
    Specify where you want the file to be saved to. Default is to prompt for a location.
.PARAMETER InputFile
    List of computers in text format, one hostname per line.
#> 

param
    (
    [Switch]$DontConvert
    ,
    [string]$Computers = "Localhost"
    ,
    [string]$OutFile = "NotSet"
    ,
    [String]$InputFile = "NotSet"
    )


Function Convertto-Readable
#This function takes the cryptic strings that Get-ACL returns, and makes them human readable. (Incomplete)
{
    param 
        (
        [string] $Rights
        )
    If ($DontConvert) {return $Rights}
    Else
    {
        switch ($Rights)
        {
            268435456  {return "ReadWriteExecute"}
            536870912  {return "Execute"}
            1073741824 {return "Write"}
            default    {return $Rights}
        }
    }
}


#Collect all necessary info and instantiate variables
$ExportTable = @()
if ($OutFile -eq "NotSet") {$OutFile  = Read-Host "Type path you want to save output to: "}
if ($InputFile -ne "NotSet") {[array]$Computers = Get-Content $InputFile}


#########################
#Actual Code starts here#
#########################

foreach ($Computer in $Computers)
{
    $ShareHost = '\\' + $computer + '\'
    $shares = Get-WmiObject win32_share -ComputerName $Computer  
    foreach ($share in $shares)
    {
        $FullShare = $ShareHost + $share.name
        #Not all paths are valid paths, like the default share IPC$, so test before you try
        if (Test-Path $FullShare)
        {
            $ACL = Get-Acl $FullShare 
            foreach ($Access in $ACL.access)
            {
                #Create an empty custom object with all the properties needed, then add it to the table.
                $ExportLine = "" | select Computer,Share,Path,Identity,Type,Rights,ParentInheritance,Childinheritance,Propagation
                $ExportLine.Computer = $Computer
                $Exportline.Share = $Fullshare
                $Exportline.Path = $Share.path
                $Exportline.Identity = $Access.IdentityReference
                $Exportline.Type = $Access.AccessControlType
                $ExportLine.Rights = (Convertto-Readable $Access.FileSystemRights)
                $ExportLine.ParentInheritance = $Access.IsInherited
                $Exportline.Childinheritance = $Access.InheritanceFlags
                $ExportLine.Propagation = $Access.PropagationFlags
                $ExportTable += $Exportline
            }
        }
        Else
        {
            Write-Host -fore Red "$FullShare does not exist or isn't active"
        }
    }
}
$ExportTable | Export-Csv  $OutFile
And now for the explaining! I am not going to explain the help or the parameter parts again; for that, you can scroll up a few posts.
Let me start with the only function in this script. All it does is take a simple string as input and either return a different string, if it knows the conversion, or return the same string if it doesn't. The script-level switch $DontConvert is checked first, to see whether any conversion should be done at all.

code:
Function Convertto-Readable
#This function takes the cryptic strings that Get-ACL returns, and makes them human readable. (Incomplete)
{
    param 
        (
        [string] $Rights
        )
    If ($DontConvert) {return $Rights}
    Else
    {
        switch ($Rights)
        {
            268435456  {return "ReadWriteExecute"}
            536870912  {return "Execute"}
            1073741824 {return "Write"}
            default    {return $Rights}
        }
    }
}
Now on to the meat of it, which boils down to this:

code:
foreach ($Computer in $Computers)
{
    $ShareHost = '\\' + $computer + '\'
    $shares = Get-WmiObject win32_share -ComputerName $Computer  
    foreach ($share in $shares)
        {
        $FullShare = $ShareHost + $share.name
        Get-Acl $FullShare 
        }
}
This simply takes each computer in your list, then asks the computer to list all its shares. Then, for each share, it asks for the access list. But if this part is so short, why did I add all that other crud? First off, get-acl output is ugly! It basically crams all the rights info into one property called Access, which is itself an array of arrays. So I split that out and put it into a flat object (no nested properties) so I could easily export it to CSV. The way to do that I cribbed from this post by adaz, and then I added each line to a table for easy exporting:
code:
$ExportLine = "" | select Computer,Share,Path,Identity,Type,Rights,ParentInheritance,Childinheritance,Propagation
$ExportLine.Computer = $Computer
$Exportline.Share = $Fullshare
#[Redacted for brevity]
$Exportline.Childinheritance = $Access.InheritanceFlags
$ExportLine.Propagation = $Access.PropagationFlags
$ExportTable += $Exportline

I also put in a small test to see if the share is a valid path, so the script doesn't error out but instead gives you a small message saying the share doesn't exist:
code:
if (Test-Path $FullShare)

And there you have it. If you need it adjusted, I can do so. I might make this script bigger for auditing purposes in my company.

Jelmylicious fucked around with this message at 20:02 on May 18, 2012

Swink
Apr 18, 2006
Left Side <--- Many Whelps
This will be an easy one. The following lists all subfolders of all our exchange mailboxes.


code:
get-mailbox | get-mailboxfolderstatistics | Select Name,FolderPath,FolderType,ItemsInFolder,FolderSize,Identity | Sort-Object FolderSize -descending | Export-Csv c:\folders.csv
When I export to CSV, the FolderSize comes out as '97.04 MB (101,752,614 bytes)', which Excel cannot sort. Is it possible to spit that out as just MBs so I don't have to do another layer of fixing up in Excel?

brosmike
Jun 26, 2009
This article explains how to format a ByteQuantifiedSize (like FolderSize) in detail. Short version: replace FolderSize in your Select call with @{expression={$_.FolderSize.Value.ToMB()}; label="FolderSize (MB)"} (you may need to move the sort to before the select).
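
Put together, the whole pipeline would look something like this (untested sketch):

code:
# Sort first, while FolderSize is still a ByteQuantifiedSize, then format it as MB
get-mailbox | get-mailboxfolderstatistics |
    Sort-Object FolderSize -Descending |
    Select Name,FolderPath,FolderType,ItemsInFolder,@{expression={$_.FolderSize.Value.ToMB()}; label="FolderSizeMB"},Identity |
    Export-Csv c:\folders.csv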

Swink
Apr 18, 2006
Left Side <--- Many Whelps
Thanks, that sorted me out.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
I'm using cwrsync to pull some files from a Unix server, and I've decided to do this little project in Powershell, because why not.

To get it to run I have to modify some environment variables, which I can do in a bat script like this:
code:
SET HOME=E:\folder
SET PATH=%programfiles(x86)%\cwrsync\bin;%PATH%
rsync -av -e "ssh -i e:\ssh\id_rsa" unixhost:~/file .
Now I'm trying to duplicate this in powershell. I've found $env: and used that to modify the PATH variable, but I can't figure out how to invoke the rsync command so that it looks in that path.

I can't just run it with the full path because it keeps grabbing at binaries in PATH to figure out what to do (aside: I bet if I specified the full path for both rsync and ssh it would work, but I'd rather figure this out).

So, what am I missing here?

E: Solved my own problem. This is what I was doing:
code:
$env:path = "$env:programfiles(x86)" + "\cwrsync\bin;" + $env:path
For some reason it was interpreting "$env:programfiles(x86)" as C:\Program Files(x86), which isn't a real path: the variable expansion stops at $env:programfiles and the (x86) gets appended literally. This instead worked:
code:
$env:path = ${env:programfiles(x86)} + "\cwrsync\bin;" + $env:path
Now it's in my path, and I can call the rsync binary just fine.

FISHMANPET fucked around with this message at 23:08 on May 25, 2012

angrytech
Jun 26, 2009
I'm working with Powershell and AD here and I have no idea where to begin.
I've got an OU full of machine accounts that I need to move into different ones. I can use a DirectorySearcher to get an array of all the objects in the OU, but when I use
code:
foreach ($x in $results)
{
    Write-Host $x.Properties.distinguishedname
}
I also get back a list of all the bitlocker keys stored inside those objects. How can I keep the search from recursing?

edit: holy poo poo I'm dumb.
I had set the directorysearcher scope to "subtree", which recurses through all the objects and OUs that it encounters. Using the "onelevel" scope causes it to only search the specified OU.
Hope this helps someone in the future!
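
For reference, the scope setting in question looks like this (untested sketch; the OU path and filter are placeholders):

code:
$searcher = New-Object System.DirectoryServices.DirectorySearcher
$searcher.SearchRoot = [ADSI]"LDAP://OU=Staging,DC=contoso,DC=com"
$searcher.SearchScope = "OneLevel"   # only direct children; "Subtree" recurses
$searcher.Filter = "(objectCategory=computer)"
$searcher.FindAll() | ForEach-Object { $_.Properties.distinguishedname }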

angrytech fucked around with this message at 17:32 on May 30, 2012

Phone
Jul 30, 2005

親子丼をほしい。
Ugh, this is gonna hurt: 800k+ files in 7 directories; delete all files older than 60 days. I know I saw something about Powershell performance tapering off before, but I can't remember where I saw it.

I just hope that my gci -recurse works. :toot:

Jelmylicious
Dec 6, 2007
Buy Dr. Quack's miracle juice! Now with patented H-twenty!

Phone posted:

Ugh, this is gonna hurt: 800k+ files in 7 directories; delete all files older than 60 days. I know I saw something about Powershell performance tapering off before, but I can't remember where I saw it.

I just hope that my gci -recurse works. :toot:

I think this is what you are looking for:

from: http://blogs.msdn.com/b/powershell/archive/2009/11/04/why-is-get-childitem-so-slow.aspx
Since the sweet spot seems to be around 300k files, why not specify the 7 directories and do a simple gci, without the recurse, on each of them? I feel dirty for removing some automation, but sometimes doing it yourself really is better. Or, to get all the directories automatically, either:
- do a gci -Directory (PowerShell 3 option) or
- filter with gci -filter *. (to specify a native filesystem filter for files with no extension)
The last option assumes that directories have no extension and files do. Or, if you can distinguish by name, you could use a different filter.
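
A per-directory version of the cleanup might look like this (untested sketch; the share path is a placeholder):

code:
$cutoff = (Get-Date).AddDays(-60)
# Enumerate only the top-level directories, then gci each one without -recurse
$dirs = Get-ChildItem \\server\share | Where-Object { $_.PSIsContainer }
foreach ($dir in $dirs)
{
    Get-ChildItem $dir.FullName |
        Where-Object { -not $_.PSIsContainer -and $_.LastWriteTime -lt $cutoff } |
        Remove-Item -WhatIf   # drop -WhatIf once the output looks right
}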

e: changed image host to imgur, even though msdn.com can probably handle the load from this thread...

Jelmylicious fucked around with this message at 16:42 on Jun 1, 2012

Phone
Jul 30, 2005

親子丼をほしい。
Looks like I barely got by, haha. It was 7 folders in a share that has close to 30 folders total, so each folder had its own gci call. It went surprisingly fast and created a 110MB log file (while eating up 2.3GB of RAM :catstare: ). The largest folder had about 315k files in it, so bullet = dodged.

Thanks for that link!!

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

This is a theoretical question I guess, but in cases like the above where performance degrades after x number of files, would something DOS-based be faster? e.g. http://stackoverflow.com/questions/51054/batch-file-to-delete-files-older-than-n-days

I had no idea PowerShell crapped out like that after a certain amount of files, though I'm usually working with at least 300k files when I'm using it.

stubblyhead
Sep 13, 2007

That is treason, Johnny!

Fun Shoe

adaz posted:

As a note, and I missed this last week, but the beta of powershell 3.0 is out: http://www.microsoft.com/download/en/details.aspx?id=28998

Has the public beta for Powershell 3 been discontinued? This link is dead now.

e: n/m, found a new link http://www.microsoft.com/en-us/download/details.aspx?id=29939

stubblyhead fucked around with this message at 04:25 on Jun 9, 2012

adaz
Mar 7, 2009

Scaramouche posted:

This is a theoretical question I guess, but in cases like the above where performance degrades after x number of files, would something DOS based be faster? e.g. http://stackoverflow.com/questions/51054/batch-file-to-delete-files-older-than-n-days

I had no idea PowerShell crapped out like that after a certain amount of files, though I'm usually working with at least 300k files when I'm using it.

The answer is to use Powershell w/ the .NET 4 framework, which fixes the lovely performance issue with a ton of files. Currently PS 2.0 uses the 2 framework by default, although it can be changed (warning: causes issues, but if you want to know how see http://stackoverflow.com/questions/2094694/how-can-i-run-powershell-with-the-net-4-runtime). The issue with file performance is a .NET 2 issue, not an intrinsic PowerShell problem. Your other option is to use DOS or something else that uses the Win32 API directly instead of going through .NET.
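For reference, the workaround from that Stack Overflow thread boils down to dropping a config file next to powershell.exe. A sketch (unsupported by Microsoft, and the exact CLR version strings may vary by install):

```xml
<?xml version="1.0"?>
<!-- Saved as powershell.exe.config in $pshome. Unsupported workaround:
     forces PowerShell 2.0 to load on the .NET 4 CLR. -->
<configuration>
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <supportedRuntime version="v4.0.30319" />
    <supportedRuntime version="v2.0.50727" />
  </startup>
</configuration>
```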

PS 3.0 uses the 4.5 framework, I believe, but I haven't checked that for sure, so don't quote me on it.

e: also any PS goons going to be down in Orlando for Tech-Ed next week? I'll be attending some of the Powershell sessions

adaz fucked around with this message at 08:25 on Jun 9, 2012

Jelmylicious
Dec 6, 2007
Buy Dr. Quack's miracle juice! Now with patented H-twenty!

Scaramouche posted:

This is a theoretical question I guess, but in cases like the above where performance degrades after x number of files, would something DOS based be faster? e.g. http://stackoverflow.com/questions/51054/batch-file-to-delete-files-older-than-n-days

I had no idea PowerShell crapped out like that after a certain amount of files, though I'm usually working with at least 300k files when I'm using it.

I would keep using PowerShell. The advantage is that it returns objects, not plain text. So, if you can filter it down with -Filter (a native NTFS filter, like dir uses) or if you can break it up into chunks, I would keep using Get-ChildItem for flexibility's sake.
For this example, the batch script would of course work; it's just that you can do so much more with the objects. If you'd ever want to expand on your script, the PowerShell one would be very easy to modify.
Granted, the batch script will probably be a lot faster.

Wicaeed
Feb 8, 2005
So I saw this posted on Reddit. It gives a really good overview of Powershell for those trying to learn it, and also a lot of good tips on script creation: https://www.youtube.com/watch?v=-Ya1dQ1Igkc

For example, I didn't even know of the show-command cmdlet.

Blew my fuckin' MIND, man.

Jelmylicious
Dec 6, 2007
Buy Dr. Quack's miracle juice! Now with patented H-twenty!

Wicaeed posted:

So I saw this posted on Reddit. It gives a really good overview of Powershell for those trying to learn it, and also a lot of good tips on script creation: https://www.youtube.com/watch?v=-Ya1dQ1Igkc

For example, I didn't even know of the show-command cmdlet.

Blew my fuckin' MIND, man.

Just gave the first day of our internal two day powershell course. Most important commands I taught were Get-Command (in conjunction with filters), Get-Help and Get-Member. With those three, you can find out almost all you need to know or at least find specific terms to google.
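For anyone following along, the three of them chain together nicely. A quick tour (Get-Process is just a stand-in target):

```powershell
Get-Command -Noun Process          # discover cmdlets that act on processes
Get-Help Get-Process -Examples    # worked examples for one of them
Get-Process | Get-Member          # inspect the objects it emits
```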

e: Just watched most of that video, it is really good. I think I am going to change the structure of my course a bit, because of this.

Jelmylicious fucked around with this message at 23:27 on Jun 11, 2012

adaz
Mar 7, 2009

Saw a good talk at teched on powershell remoting, the lecture guy was awesome and had a TON of tips I had no idea about. He wrote a free book you can check out on the remoting here: http://powershellbooks.com/. I highly recommend it, takes a lot of the pain out of weird remoting options that aren't documented anywhere.

Wicaeed
Feb 8, 2005
The remoting feature looked pretty awesome, but it strikes me that third-party applications that run on the server are going to be negatively impacted by removing the Windows Server gui. Our monitoring system (PRTG) relies on a GUI-based control panel that runs on the server. Is there a way (with ps remoting) to redirect GUI content to another computer?

Jelmylicious
Dec 6, 2007
Buy Dr. Quack's miracle juice! Now with patented H-twenty!
Alright, I found something weird, which is probably just something in the datetime format WMI returns. First, let me lay a little background. To get the install date through WMI you can ask it like this:
code:
(gwmi win32_operatingsystem).InstallDate
Which outputs: 20120412025148.000000+120
Well, that was helpful. I think I see a 2012 at the beginning, but yeah.... How long is that thing anyway?
code:
((gwmi win32_operatingsystem).InstallDate).length
Twenty-five characters! Let me google if I can convert that.
code:
$LongNumber = (gwmi win32_operatingsystem).InstallDate
[System.Management.ManagementDateTimeconverter]::todatetime($longNumber)
Hmmm, Thursday April 12, sounds about right. Let's check systeminfo for that: Original Install Date: 12-4-2012. It works. Now let's play with that long number, with a random number, 25 digits long, starting with 1999:
code:
$LongNumber = 1999193125501212345678901 
[System.Management.ManagementDateTimeconverter]::todatetime($longNumber)
Exception calling "ToDateTime" with "1" argument(s): "Year, Month, and Day parameters describe an un-representable DateTime." Hey, there does seem to be a pattern here. Ok, fix the month and party like it's 1999:
code:
$LongNumber = 1999123125501212345678901 
[System.Management.ManagementDateTimeconverter]::todatetime($longNumber)
Now I get "Hour, Minute, and Second parameters describe an un-representable DateTime." Ok final try:
code:
$LongNumber = 1999123123501212345678901 
[System.Management.ManagementDateTimeconverter]::todatetime($longNumber)
Saturday 25 December 1999 20:29:12. YAY. Wait! What?
What I would expect to be 1999-12-31 23:50:12.12345678901 yields 1999-12-25 20:29:12. That's 6 days, 3 hours and 21 minutes of difference! Let's time travel! Change the 1999 to 1899, 'cause I'm old-fashioned: Monday 25 December 1899 20:29:12. Exact same difference! Anyone know what's up? Or should I ask in the .NET thread, since the datetime class is technically more their domain? I trust the conversion, I am just intrigued by this all.
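My best guess so far: the DMTF format the converter expects is strictly positional, and an unquoted 25-digit literal has no "." at position 14 or sign at position 21, so the converter appears to read the trailing digits as fractional seconds plus a huge UTC offset in minutes, which would produce exactly this kind of multi-day shift. A sketch of the expected layout (the sample string below is made up, not a real install date):

```powershell
# DMTF datetime layout: yyyyMMddHHmmss.ffffff+UUU (25 characters)
#   positions 0-13  : date and time digits
#   position 14     : literal '.'
#   positions 15-20 : fractional seconds
#   position 21     : '+' or '-'
#   positions 22-24 : UTC offset in minutes
$dmtf = '19991231235012.123456+060'   # hypothetical CET sample

$dmtf.Length   # 25
$dmtf[14]      # .
$dmtf[21]      # +
```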

adaz
Mar 7, 2009

Wicaeed posted:

The remoting feature looked pretty awesome, but it strikes me that third-party applications that run on the server are going to be negatively impacted by removing the Windows Server gui. Our monitoring system (PRTG) relies on a GUI-based control panel that runs on the server. Is there a way (with ps remoting) to redirect GUI content to another computer?

No, remoting uses the WSMAN protocol and doesn't do GUI redirection. For that particular case you'd be better off using something like the Server 2012 minimal install, which is kind of like Core but still installs some GUI for software that needs it. Or, more correctly, it installs the entire GUI except Windows Explorer and Internet Explorer.

Also, just spent today ogling the new PowerShell 3.0 features. IntelliSense in the ISE? Yes. Debug tools that show you loop variables and all the rest as you hover over them? Yes please. A scheduled job cmdlet? Yesssss. Simplified foreach syntax & finally using the 4 framework yessssssssssssss

Negatives:
Remoting is still balls bad if you need to cross forests that have no trust, as in it's so complex that the lead architect was trying to get it working before his presentation and couldn't.

Export-CSV remains dumb :argh:

But seriously go download the 3.0 beta.
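For anyone who hasn't seen the simplified syntax yet, it drops the script block and $_ entirely. A contrived comparison (both forms filter the same snapshot):

```powershell
# Snapshot once so both filters see identical input
$procs = Get-Process

# PowerShell 3.0 simplified syntax: no braces, no $_
$a = $procs | Where-Object Id -gt 0

# Equivalent 2.0-style script block
$b = $procs | Where-Object { $_.Id -gt 0 }

$a.Count -eq $b.Count   # True
```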

adaz fucked around with this message at 00:14 on Jun 14, 2012

stubblyhead
Sep 13, 2007

That is treason, Johnny!

Fun Shoe

adaz posted:

Negatives:
Remoting is still balls bad if you need to cross forests that have no trust, as in so complex the lead architect was trying to get it working before their presentation and couldn't.

Export-CSV remains dumb :argh:

It also inserts spaces instead of tab characters, which bothers me to no end. I really don't want to bring back the Great Indent Wars though, so I'll say no more about it.

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Hola powershell companeros, another day, another powershell question. I'm making a powershell script to:
1. Log into an ftp (done)
2. Get a directory listing (done)
3. Find the last file in the directory listing, which is randomly named (failing hard)
4. Download said last file (not done but easy)

I'm doing all this using native objects because this thing has to be somewhat portable, otherwise I'd be using the great PSFTP client (http://gallery.technet.microsoft.com/scriptcenter/PowerShell-FTP-Client-db6fe0cb). Basically what I've done returns a stream, and I'm boggled as to how to get the last line of it.

Here's what I've got so far, hacked together from a couple guys' FTP examples:
code:
[void] [System.Reflection.Assembly]::LoadWithPartialName("system.net")

$ftpserver = "ftp://adaz.com/Orders"
$ftpuser = "Jelmylicious"
$ftppassword = "passworddonotsteal"

$ftpconn = [system.net.ftpwebrequest] [system.net.webrequest]::create($ftpserver)
$ftpconn.Credentials = New-Object System.Net.NetworkCredential($ftpuser,$ftppassword)

$ftpconn.method = [system.net.WebRequestMethods+ftp]::listdirectorydetails
$ftpresponse = $ftpconn.getresponse()
$ftpstream = $ftpresponse.getresponsestream()

  $buffer = new-object System.Byte[] 1024 
  $encoding = new-object System.Text.AsciiEncoding 

  $outputBuffer = "" 
  $doMore = $false 

  do 
  { 
    start-sleep -m 1000 

    $doMore = $false 
    $ftpstream.ReadTimeout = 1000 

    do 
    { 
      try 
      { 
        $read = $ftpstream.Read($buffer, 0, 1024) 

        if($read -gt 0) 
        { 
          $doMore = $true 
          $outputBuffer += ($encoding.GetString($buffer, 0, $read)) 
        } 
      } catch { $doMore = $false; $read = 0 } 
    } while($read -gt 0) 
  } while($doMore)
My problems are twofold:
1. I've got no idea how to do anything with this stream. I've tried my VB tricks with Seek and split(\n), but I get gobbledygook, null, or errors all the time. I think there's something different about how PowerShell handles them.
2. Even after that I'll still have to isolate the file name itself. The output I'm looking at looks like:
code:
Date   Time      Size(bytes)   Filename
Ideally I guess I'd want to find EOF, go back one CR/LF, and then grab the string until the first TAB (assuming that's the separator used). I don't need you guys to write me a whole solution, but just give me either a way to do seek/etc. like I'm used to or (thinking outside the box) turn the listing into something I can do get-child type operations on.

Mario
Oct 29, 2006
It's-a-me!
I think you could simplify this a great deal by using a StreamReader to wrap the response stream. Haven't tested it though.

Something like this, picking up partway through your code. It checks against an empty string in case there are trailing newlines.
code:
$ftpresponse = $ftpconn.getresponse()
$ftpstream = $ftpresponse.getresponsestream()

$reader = New-Object System.IO.StreamReader($ftpStream)

while ($reader.EndOfStream -eq $false)
{
	$curLine = $reader.readLine()
	if ($curLine -ne '')
	{
		$lastLine = $curLine
	}
}

# Parse out $lastLine by tabs or whatever
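Picking up where that last comment leaves off, one way to pull the name out, assuming the listing really is tab-separated (servers vary a lot here, so check the raw output first; the sample line below is made up):

```powershell
# Hypothetical listing line: Date, Time, Size(bytes), Filename
$lastLine = "06-25-12`t22:01`t1024`torders_20120625.csv"

# Take everything after the last tab as the file name
$fileName = ($lastLine -split "`t")[-1]
$fileName   # orders_20120625.csv
```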

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Thanks mang, sorry for the late reply I took a little 'computer break' this weekend. I'll check that out.

Wicaeed
Feb 8, 2005
So, does anyone know if it's possible to import all of the PowerCLI modules directly into Powershell or Powershell ISE?

edit: nm, apparently

code:
 if ((Get-PSSnapin "VMware.VimAutomation.Core" -ErrorAction SilentlyContinue) -eq $null) { Add-PSSnapin "VMware.VimAutomation.Core" } 
gets it done

Wicaeed fucked around with this message at 18:30 on Jun 21, 2012

Nebulis01
Dec 30, 2003
Technical Support Ninny
Any ideas on why the following returns items that are from yesterday instead of just the stuff 5 days old and older?
code:
$targetFolder = "c:\backups"
Get-ChildItem -Path $targetFolder -recurse | WHERE {($_.CreationTime -le $(Get-Date).AddDays(-5))} | Remove-Item -recurse -force -whatif 
The directory has items from 6/16-6/21 and when run it returns the following:

code:
What if: Performing operation "Remove Directory" on Target "C:\Backups\2012-06-16".
What if: Performing operation "Remove Directory" on Target "C:\Backups\2012-06-16\Flat File".
What if: Performing operation "Remove Directory" on Target "C:\Backups\2012-06-16\SystemState".
What if: Performing operation "Remove File" on Target "C:\Backups\2012-06-16\backup.log".
What if: Performing operation "Remove File" on Target "C:\Backups\2012-06-21\Flat File\Daily FlatFile Backup.bks".
What if: Performing operation "Remove File" on Target "C:\Backups\2012-06-21\Flat File\FlatFile.bkf".
What if: Performing operation "Remove File" on Target "C:\Backups\2012-06-21\SystemState\Daily SystemState Backup.bks".
What if: Performing operation "Remove File" on Target "C:\Backups\2012-06-21\SystemState\SystemState.bkf".

Nebulis01 fucked around with this message at 00:36 on Jun 23, 2012

kampy
Oct 11, 2008

Nebulis01 posted:

Any ideas on why the following returns items that are from yesterday instead of just the stuff 5 days old and older?
code:
$targetFolder = "c:\backups"
Get-ChildItem -Path $targetFolder -recurse | WHERE {($_.CreationTime -le $(Get-Date).AddDays(-5))} | Remove-Item -recurse -force -whatif 

Have you tried LastWriteTime instead of CreationTime? It may be that CreationTime does not reflect the correct date for whatever reason. I'd recommend checking the attributes of the incorrectly matched files with

code:
select fullname, creationtime, lastwritetime
in place of the Remove-Item part.
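Put together, the diagnostic version of the original one-liner would look something like this (same path as above; nothing gets deleted):

```powershell
$targetFolder = "c:\backups"
# List what the filter actually matches before trusting Remove-Item with it
Get-ChildItem -Path $targetFolder -Recurse -ErrorAction SilentlyContinue |
    Where-Object { $_.CreationTime -le (Get-Date).AddDays(-5) } |
    Select-Object FullName, CreationTime, LastWriteTime
```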

Phone
Jul 30, 2005

I want oyakodon.
LastWriteTime and LastModifiedTime work a lot better.

e: i think lastwritetime = modified time

Swink
Apr 18, 2006
Left Side <--- Many Whelps
How can I validate user input? I have a script that creates AD users, and I need to specify the users' location. There are four options; how can I ensure I don't misspell the city when I'm typing it in? Is there any way I could choose from a list of options rather than having to type?

Jelmylicious
Dec 6, 2007
Buy Dr. Quack's miracle juice! Now with patented H-twenty!

Swink posted:

How can I validate user input? I have a script that creates AD users, and I need to specify the users' location. There are four options; how can I ensure I don't misspell the city when I'm typing it in? Is there any way I could choose from a list of options rather than having to type?

With parameters, you could use a ValidateSet:
code:
param
    (
    [Parameter(Mandatory=$True)]
    [validateset("Mars","Mordor","Sodom","Gomorra")]
    [string]$City
    )
If you use interactive input, you could check the input with an if statement:
code:
if ("Mars","Mordor","Sodom","Gomorra" -eq $city) {write-host "Wow! you can type correctly!"}
For this to work, you need the list of correct cities on the left-hand side, making the statement read "if any of these match". Note also that this isn't case sensitive. Or you could use a switch statement:
code:
$Choice = Read-Host "Select 1 for Mars, 2 for Mordor, 3 for Sodom, 4 for Gomorra"
Switch ($Choice)
     {
     1 {$City = "Mars"}
     2 {$City = "Mordor"}
     3 {$City = "Sodom"}
     4 {$City = "Gomorra"}
     }
VVV Indeed, you can do fun things with validation. You could ensure that $Max is always bigger than $Min by using a [ValidateScript()] expression, for instance. VVV
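A single-parameter example of that kind of check (the function name and rule are made up for illustration):

```powershell
function Set-Range {
    param(
        [Parameter(Mandatory=$true)]
        # ValidateScript runs at binding time; a $false result rejects the call
        [ValidateScript({ $_ -gt 0 })]
        [int]$Max
    )
    "Max is $Max"
}

Set-Range -Max 5     # Max is 5
# Set-Range -Max 0   # throws a parameter validation error
```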

Jelmylicious fucked around with this message at 06:32 on Jun 27, 2012

kampy
Oct 11, 2008

Jelmylicious posted:

With parameters, you couldt use a validateset:
code:
param
    (
    [Parameter(Mandatory=$True)]
    [validateset('Mars','Mordor','Sodom','Gomorra')]
    [string]$City
    )

And for more help and samples with those, check
code:
help about_functions_advanced_parameters
