Hadlock
Nov 9, 2004

Welp, the head of AppDev at our company realized we've written over 5,000 lines of PowerShell in the last year, which correlates pretty directly with the billable hours they didn't get from us for internal C# projects. So we're not allowed to write any large PowerShell projects anymore; they have to go through AppDev now.


Bluffoon
Jun 15, 2005
hhheeeeeeeeeeehaaaaaaaaawww

myron cope posted:

I think python (or something that isn't powershell, at least) is ultimately a better option, but I need to, you know, learn some python first.

So in the meantime, I've settled on this horrid workaround:

code:
Compare-Object (gc "file1.txt") (gc "file2.txt") -PassThru -IncludeEqual | Where-Object { $_.SideIndicator -eq '==' } | Sort-Object 
Which seems to work. It would seem -ExcludeDifferent would be what I want, but it never had any output (that I could pipe, at least) and I can't figure out why.

If it's not abundantly clear, I basically hacked this together without really knowing more than the basics of powershell, just from searching online.

Would something like this work?

code:
# Get the contents of all of the files
$fileOne = Get-Content "test1.txt"
$fileTwo = Get-Content "test2.txt"
$fileThree = Get-Content "test3.txt"

# Create an array to store the IP addresses that are found in all 3 files
$commonIPAddresses = @()


foreach($ipAddress in $fileOne)
{
    # Check each IP address in File One to see if it is also contained in both of the other files, and add to the array if it is
    if ($fileTwo -contains $ipAddress -and $fileThree -contains $ipAddress)
    {       
        $commonIPAddresses += $ipAddress
    }
}

if ($commonIPAddresses.Count -gt 0)
{
    # Print the results out to another file, if any matches were found
    $commonIPAddresses | Out-File "CombinedIPAddressList.txt"
}
Not super heavily tested - I just created three text files that each had a couple of IP addresses (each on a new line), with one of those IP addresses listed in all of them. After confirming that worked, I added another common one to all of the files, and it seemed to be returning the right results.
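As an aside on the -ExcludeDifferent mystery from the quote: in Windows PowerShell, -ExcludeDifferent only produces output when -IncludeEqual is also specified (newer PowerShell versions make -ExcludeDifferent imply -IncludeEqual). With both switches, Compare-Object can be chained across three files; a minimal sketch, reusing the hypothetical test file names:
code:
# Lines common to the first two files; -IncludeEqual is required for -ExcludeDifferent to emit anything
$commonOneTwo = Compare-Object (gc "test1.txt") (gc "test2.txt") -IncludeEqual -ExcludeDifferent -PassThru

# Intersect that result with the third file
Compare-Object $commonOneTwo (gc "test3.txt") -IncludeEqual -ExcludeDifferent -PassThru | Sort-Object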

Sefal
Nov 8, 2011
Fun Shoe
I'm trying to use a PowerShell script to retrieve all empty groups, but it also lists Domain Computers, which has 2000+ members.
To work around that I made the script count how many members there are in each group. That works and is good enough to use, but I'm curious as to why it keeps seeing Domain Computers as an empty group.

This is the script I'm using.

code:
Get-QADGroup -Empty:$True -SizeLimit 0 |
    Select-Object Name,
        @{n='MemberCount';e={ (Get-QADGroupMember $_ -SizeLimit 0 | Measure-Object).Count }},
        @{n='MemberOfCount';e={ ((Get-QADGroup $_).MemberOf | Measure-Object).Count }}
I googled around and saw that it may have to do with the member attribute not being set? I'm not sure how to test that. I went into the attribute editor in AD and couldn't find it.

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

Sefal posted:

I'm trying to use a PowerShell script to retrieve all empty groups, but it also lists Domain Computers, which has 2000+ members.
To work around that I made the script count how many members there are in each group. That works and is good enough to use, but I'm curious as to why it keeps seeing Domain Computers as an empty group.

This is the script I'm using.

code:
Get-QADGroup -Empty:$True -SizeLimit 0 |
    Select-Object Name,
        @{n='MemberCount';e={ (Get-QADGroupMember $_ -SizeLimit 0 | Measure-Object).Count }},
        @{n='MemberOfCount';e={ ((Get-QADGroup $_).MemberOf | Measure-Object).Count }}
I googled around and saw that it may have to do with the member attribute not being set? I'm not sure how to test that. I went into the attribute editor in AD and couldn't find it.
I've never used the Quest AD cmdlets since Microsoft has their own ActiveDirectory module. (My guess on the "empty" result: computers belong to Domain Computers through their primaryGroupID attribute rather than through the group's member attribute, and the member attribute is presumably all that -Empty checks.)

With the one from MS I'd do it like this:
code:
Import-Module ActiveDirectory
Get-ADGroup -Filter * | Where-Object { -not ($_ | Get-ADGroupMember | Select-Object -First 1) }

Malcolm XML
Aug 8, 2009

I always knew it would end like this.
You all know that powershell has a multi-line mode that's like perl.net, right?

And a debugger that's nicer than python's, at least.


Please don't create long rear end pipeline one-liners, as they are unmaintainable.


Lol @ 5k of PS not being billable though, just charge it like any other .NET language.

skipdogg
Nov 29, 2004
Resident SRT-4 Expert

Microsoft is doing a couple of live events later this month covering DSC, if anyone is interested. These tend to be recorded and can be watched at a later date as well.

http://www.microsoftvirtualacademy.com/liveevents/getting-started-with-powershell-desired-state-configuration-dsc

http://www.microsoftvirtualacademy.com/liveevents/advanced-powershell-desired-state-configuration-dsc-and-custom-resources

Weaponized Autism
Mar 26, 2006

All aboard the Gravy train!
Hair Elf
Feel like I am doing this all wrong, but need some help on how to break out of an area of the code. I am basically writing a script with a GUI that will do certain things to a database via a SQL connection. The GUI itself takes inputs (servername, database, username, password) and I have added buttons that will trigger particular events. So button1 might update a particular column on the database, button2 might dump data into an XML, etc. Pseudo-written like this:

code:
$button1 =
{
    # CODE HERE
    if ($connection -eq valid)
    {
        Write-Host "Connection is OK!"
    }
    else
    {
        Write-Host "Connection is invalid!"
        break
    }
}

Function GUI_Screen {
    # CODE HERE
    Add_Click($button1)
}
So basically what I have is Add_Click going to various variables ($button1, $button2, etc.), and all the work is defined within those variables. The PowerShell console or command prompt displays what is actually going on, while the GUI is just used for input. If everything is input correctly, the code executes fine: it walks through to the end, where it waits for another action from the GUI. However, if the database connection is invalid, I want it to stop what it is doing, clear the input boxes, and prompt the user to try again. I tried try/catch, but even when it catches the exception it still tries to run the rest of the code, which would be pointless anyway since there is no database connection.


Edit: Nevermind I figured it out! Instead of calling variables, I just changed it to a function definition. Then, I just put in a "return" after the Connection Is Invalid line. It throws the appropriate SQL errors and goes back to the GUI as expected. :)

Weaponized Autism fucked around with this message at 15:32 on Feb 13, 2015

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

Tailored Sauce posted:

Edit: Nevermind I figured it out! Instead of calling variables, I just changed it to a function definition. Then, I just put in a "return" after the Connection Is Invalid line. It throws the appropriate SQL errors and goes back to the GUI as expected. :)
If return worked in the function, then it should have worked in the scriptblock the same way. It's kind of the same thing with a foreach loop vs. a call to ForEach-Object.
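For illustration, a minimal sketch of that equivalence ($connectionValid is a stand-in for the real connection test):
code:
$button1 = {
    if (-not $connectionValid) {
        Write-Host "Connection is invalid!"
        return  # exits this scriptblock only, exactly as it would exit a function
    }
    Write-Host "Connection is OK!"
    # ...rest of the click handler...
}

& $button1  # invoking the scriptblock behaves the same as calling a function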

CapMoron
Nov 20, 2000
Forum Veteran
I posted this over in the Working in IT thread, and they kindly directed me here; any suggestions?

I have a project at work to help cull some of the decade-plus of data sitting on our Windows file servers that hasn't been accessed in a decent amount of time. Googling has indicated that a PowerShell script would probably be best. The problem is that I know next to nothing about PowerShell. It's on my to-learn list, but I'm currently enrolled at WGU, so my brain's in-depth learning bandwidth is maxed out by my current courses.

I found a script that works pretty well:

code:
Function Get-NeglectedFiles
{
    Param([string[]]$path,
          [int]$numberDays)

    $cutOffDate = (Get-Date).AddDays(-$numberDays)

    Get-ChildItem -Recurse -Path $path |
        Where-Object {$_.LastAccessTime -le $cutOffDate}
}
The problem is that there is so much data nested in subdirectories upon subdirectories that it is hitting the 248-character PathTooLong issue. Is there a better way to do this, or a non-complicated way around the path limit, or am I just hosed?

vanity slug
Jul 20, 2010

Have you tried the \\?\ prefix? That should get you up to 32k characters. I haven't tried this in a while, but read this article: http://blogs.msdn.com/b/bclteam/archive/2007/02/13/long-paths-in-net-part-1-of-3-kim-hamilton.aspx

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

CapMoron posted:

I posted this over in the Working in IT thread, and they kindly directed me here; any suggestions?

I have a project at work to help cull some of the decade-plus of data sitting on our Windows file servers that hasn't been accessed in a decent amount of time. Googling has indicated that a PowerShell script would probably be best. The problem is that I know next to nothing about PowerShell. It's on my to-learn list, but I'm currently enrolled at WGU, so my brain's in-depth learning bandwidth is maxed out by my current courses.

I found a script that works pretty well:

code:
Function Get-NeglectedFiles
{
    Param([string[]]$path,
          [int]$numberDays)

    $cutOffDate = (Get-Date).AddDays(-$numberDays)

    Get-ChildItem -Recurse -Path $path |
        Where-Object {$_.LastAccessTime -le $cutOffDate}
}
The problem is that there is so much data nested in subdirectories upon subdirectories that it is hitting the 248-character PathTooLong issue. Is there a better way to do this, or a non-complicated way around the path limit, or am I just hosed?
It's unlikely you'll be able to directly use the \\?\ prefix within PowerShell since PowerShell relies on the .Net objects behind the scenes.

There's a janky workaround that might work but you would have to run the script on the file server itself (you can't use UNC paths for this).

Basically, you could create directory junctions that are short, but lead into the deep paths. It would be a bit of a pain in the rear end, since you would constantly have to be using the junction path to traverse the share, and create new junction paths deeper inside as needed.

To create the junctions, you can use mklink, which does support the \\?\ prefix:

code:
mklink /J "C:\mnt\Test" "\\?\Z:\v\e\r\y\long\path\name\here\to\folder"
To call that from powershell:
code:
cmd.exe /c mklink /J "C:\mnt\Test" "\\?\Z:\v\e\r\y\long\path\name\here\to\folder"
Since the junctioning happens at the filesystem level, the Win32 APIs (the source of the problem) should not be able to tell that the shortened path is not the real path (their normalization techniques won't resolve this type of link).

It's really just the logistics of creating and deleting the junctions as needed that will be really annoying.

To delete, use rmdir (or rd):
code:
cmd.exe /c rd "C:\mnt\Test"

Briantist fucked around with this message at 22:09 on Feb 25, 2015

Vulture Culture
Jul 14, 2003

I was never enjoying it. I only eat it for the nutrients.

Briantist posted:

It's unlikely you'll be able to directly use the \\?\ prefix within PowerShell since PowerShell relies on the .Net objects behind the scenes.

There's a janky workaround that might work but you would have to run the script on the file server itself (you can't use UNC paths for this).

Basically, you could create directory junctions that are short, but lead into the deep paths. It would be a bit of a pain in the rear end, since you would constantly have to be using the junction path to traverse the share, and create new junction paths deeper inside as needed.

To create the junctions, you can use mklink, which does support the \\?\ prefix:

code:
mklink /J "C:\mnt\Test" "\\?\Z:\v\e\r\y\long\path\name\here\to\folder"
To call that from powershell:
code:
cmd.exe /c mklink /J "C:\mnt\Test" "\\?\Z:\v\e\r\y\long\path\name\here\to\folder"
Since the junctioning happens at the filesystem level, the Win32 APIs (the source of the problem) should not be able to tell that the shortened path is not the real path (their normalization techniques won't resolve this type of link).

It's really just the logistics of creating and deleting the junctions as needed that will be really annoying.

To delete, use rmdir (or rd):
code:
cmd.exe /c rd "C:\mnt\Test"
One other option I absolutely have not tested, but should at least work locally, would be to handle your own recursion and change into each enumerated child directory using Push-Location and Pop-Location instead of bookkeeping junctions. Since you're dealing with relative paths instead of absolute paths at that point, you're very unlikely to hit PATH_MAX issues.
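A rough, untested sketch of that idea, recursing manually so that every path handed to the provider stays short and relative:
code:
function Get-NeglectedFilesRelative {
    param([datetime]$CutOffDate)

    foreach ($item in Get-ChildItem -Force) {
        if ($item.PSIsContainer) {
            # Descend using only the child's name, never the full path
            Push-Location -LiteralPath $item.Name
            Get-NeglectedFilesRelative -CutOffDate $CutOffDate
            Pop-Location
        }
        elseif ($item.LastAccessTime -le $CutOffDate) {
            $item
        }
    }
}

Push-Location 'D:\SomeShare'   # hypothetical starting point
Get-NeglectedFilesRelative -CutOffDate (Get-Date).AddDays(-365)
Pop-Location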

Vulture Culture fucked around with this message at 00:06 on Feb 26, 2015

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

Misogynist posted:

One other option I absolutely have not tested, but should at least work locally, would be to handle your own recursion and change into each enumerated child directory using Push-Location and Pop-Location instead of bookkeeping junctions. Since you're dealing with relative paths instead of absolute paths at that point, you're very unlikely to hit PATH_MAX issues.

I thought about that, but looking at the articles that Jeoh posted: part of what the Win32 APIs do (if they aren't processing a long path preceded with \\?\) is normalize the path, which includes, among other things, "converting relative paths into full paths," so I don't think that would end up working.

Harry Lime
Feb 27, 2008


So I'm having a bizarre issue with Remove-Item that maybe the brain trust here can shed some light on. The goal of this script is to pull specific files, on a daily basis, out of an inactive archive directory. These archive folders are created one per day by an application, and the folder names are always in the format yyyyMMdd_1_Some random string of numbers. When I run the script below I get zero files found for removal.
code:
$Date = ([DateTime]::Today.AddDays(-30)) | Get-Date -Format yyyyMMdd

Remove-Item -Path D:\Path\$Date*\* -Include *_153_44_*,*_154_45_*  -Force 
If I use -Exclude instead of -Include, while still using \$Date*\* to identify the path, I get files found. If I replace $Date with the exact folder name, -Include will produce results, so there are files that meet the -Include parameters. If I run just

code:
$Date = ([DateTime]::Today.AddDays(-30)) | Get-Date -Format yyyyMMdd

Get-Item -Path D:\Path\$Date*\*
It will pull all the files in this directory successfully. Is there something simple I am missing? Using -Exclude really isn't a solution for me, because out of the 1K or so file name patterns only about 30 need to be removed.

Newf
Feb 14, 2006
I appreciate hacky sack on a much deeper level than you.
PowerShell on my work machine seems unable to save any configuration/installations. For example, if I install PsGet with the command (new-object Net.WebClient).DownloadString("http://psget.net/GetPsGet.ps1") | iex, I'm able to use it, but after I exit PowerShell and restart it, PsGet isn't there anymore.

What's broken here? How do I fix it?

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

Harry Lime posted:

So I'm having a bizarre issue with Remove-Item that maybe the brain trust here can shed some light on. The goal of this script is to pull specific files, on a daily basis, out of an inactive archive directory. These archive folders are created one per day by an application, and the folder names are always in the format yyyyMMdd_1_Some random string of numbers. When I run the script below I get zero files found for removal.
code:
$Date = ([DateTime]::Today.AddDays(-30)) | Get-Date -Format yyyyMMdd

Remove-Item -Path D:\Path\$Date*\* -Include *_153_44_*,*_154_45_*  -Force 
If I use -Exclude instead of -Include, while still using \$Date*\* to identify the path, I get files found. If I replace $Date with the exact folder name, -Include will produce results, so there are files that meet the -Include parameters. If I run just

code:
$Date = ([DateTime]::Today.AddDays(-30)) | Get-Date -Format yyyyMMdd

Get-Item -Path D:\Path\$Date*\*
It will pull all the files in this directory successfully. Is there something simple I am missing? Using -Exclude really isn't a solution for me, because out of the 1K or so file name patterns only about 30 need to be removed.
I'm not 100% certain why you're seeing this behavior. When you say that you replace $Date with the exact name, are you still using the wildcards? -Include and -Exclude require wildcards to be used in the Path, so if you aren't doing that, then Include might actually be taking the place of Path.

In any case, while it's not exactly the powershell way, I would probably circumvent the parameters with something like this:
code:

$Patterns = @(
    '*_153_44_*',
    '*_154_45_*' 
)

Get-ChildItem -Path D:\Path\$Date*\* -Recurse -File | Where-Object { 
    foreach($pat in $Patterns) {
        if($_.Name -like $pat) {
            return $true
        }
    }
    $false
} | Remove-Item -Force -WhatIf  # Remove the WhatIf when you know it's working
This is a workaround... I would need to create a test set of files and folders to really figure out the behavior of Include, since I rarely use it. Maybe I'll do that later.

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

Newf posted:

Powershell on my work machine seems unable to save any configuration / installations. For example, if I install PsGet with the command (new-object Net.WebClient).DownloadString("http://psget.net/GetPsGet.ps1") | iex , I'm able to use it, but after I exit powershell and restart it PsGet isn't there anymore.

What's broken here? How do I fix it?
Are you importing the module in the new sessions? The install script is importing it for you so it's available right away. In a new session, you may need to import it yourself. You can't always rely on module autoloading.
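If that's what's happening, the fix (assuming the module installed under its default name, PsGet) is to import it explicitly, or to add the import to your profile so every new session gets it:
code:
# In each new session:
Import-Module PsGet

# Or make it permanent:
if (-not (Test-Path $PROFILE)) { New-Item -ItemType File -Path $PROFILE -Force | Out-Null }
Add-Content $PROFILE 'Import-Module PsGet'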

Harry Lime
Feb 27, 2008


Briantist posted:

I'm not 100% certain why you're seeing this behavior. When you say that you replace $Date with the exact name, are you still using the wildcards? -Include and -Exclude require wildcards to be used in the Path, so if you aren't doing that, then Include might actually be taking the place of Path.

When I say exact name, it becomes D:\Path\Actual folder name\*, so the trailing wildcard is still present. That is also the only way I have been able to get -Include to pull results.

quote:

In any case, while it's not exactly the powershell way, I would probably circumvent the parameters with something like this:
code:

$Patterns = @(
    '*_153_44_*',
    '*_154_45_*' 
)

Get-ChildItem -Path D:\Path\$Date*\* -Recurse -File | Where-Object { 
    foreach($pat in $Patterns) {
        if($_.Name -like $pat) {
            return $true
        }
    }
    $false
} | Remove-Item -Force -WhatIf  # Remove the WhatIf when you know it's working
This is a workaround.. I would need to create a test set of files and folders to really figure out the behavior of Include, since I rarely use it. Maybe I'll do that later.

I've been working with a coworker on this problem and a script very similar to yours is the only way we have been able to get it to work so far.

Halo14
Sep 11, 2001
For beginners like me I found this series immensely helpful:

http://www.microsoftvirtualacademy.com/training-courses/getting-started-with-powershell-3-0-jump-start

Not sure if it's been posted here before.

Video Nasty
Jun 17, 2003

Can Compare-Object be used to compare more than just two objects/references?
Would I have to break things down into multiple comparisons so I can compare four directories of similar filenames?

myron cope
Apr 21, 2009

Jake Blues posted:

Can Compare-Object be used to compare more than just two objects/references?
Would I have to break things down into multiple comparisons so I can compare four directories of similar filenames?

I'm still a powershell nincompoop but something close to this answer from above would probably work?

Bluffoon posted:

Would something like this work?

code:
# Get the contents of all of the files
$fileOne = Get-Content "test1.txt"
$fileTwo = Get-Content "test2.txt"
$fileThree = Get-Content "test3.txt"

# Create an array to store the IP addresses that are found in all 3 files
$commonIPAddresses = @()


foreach($ipAddress in $fileOne)
{
    # Check each IP address in File One to see if it is also contained in both of the other files, and add to the array if it is
    if ($fileTwo -contains $ipAddress -and $fileThree -contains $ipAddress)
    {       
        $commonIPAddresses += $ipAddress
    }
}

if ($commonIPAddresses.Count -gt 0)
{
    # Print the results out to another file, if any matches were found
    $commonIPAddresses | Out-File "CombinedIPAddressList.txt"
}

Video Nasty
Jun 17, 2003

Dang, that's actually really close to what I was hoping to do. I'm going to steal that and modify it to suit my needs.
I'll definitely paste my finished piece when it's working but I'm just glad I don't need to feed results into more comparisons. Thanks!

12 rats tied together
Sep 7, 2006

Briantist posted:

code:

$Patterns = @(
    '*_153_44_*',
    '*_154_45_*' 
)

Get-ChildItem -Path D:\Path\$Date*\* -Recurse -File | Where-Object { 
    foreach($pat in $Patterns) {
        if($_.Name -like $pat) {
            return $true
        }
    }
    $false
} | Remove-Item -Force -WhatIf  # Remove the WhatIf when you know it's working

Honestly I've run into something like this before, and I'm about 99% sure I was able to successfully use:

code:
gci "path" -recurse | Where-Object { $_.Name -like $patterns }
The script is on a server at my previous employer, but I recall being fairly impressed that powershell did the 'dirty work' of comparing a property against an array of multiple values for me, without me needing to write anything special for it. This was especially nice in my case because the comparison patterns inside $Patterns were generated dynamically by searching through other files.

In any case, I would strongly recommend that you try to use -like and -notlike whenever possible (no matter how kludgey), because I've never had good luck with -include or -exclude.

This also seems like a great use case for regular expressions, but then I have this problem where I try to justify every problem as a use case for regular expressions. :)
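For example, the two -like patterns above collapse into a single alternation (path and patterns borrowed from the earlier post, so adjust to taste):
code:
# One regex replaces the whole foreach-over-patterns loop
$regex = '_(153_44|154_45)_'

Get-ChildItem -Path D:\Path\$Date*\* -Recurse -File |
    Where-Object { $_.Name -match $regex } |
    Remove-Item -Force -WhatIf  # drop -WhatIf once it looks right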

Weaponized Autism
Mar 26, 2006

All aboard the Gravy train!
Hair Elf
I am trying to create a GUI that will display all output triggered by PowerShell. I have a Start-Transcript writing to transcript.log, and I was able to display it successfully in a RichTextBox via:

$outputBox.Text = Get-Content ./transcript.log | Out-String

However, the problem is Get-Content. From what I'm reading online, this just won't work, because it basically locks the file it reads. This leads to transcript.log not taking in new data, and I don't even know if new content would show in the GUI form if I refreshed it. So I think the alternative is to set up some sort of filestream into the GUI. I tried tail, but that actually prevents the GUI from appearing. Any ideas? My end goal is to create a "Live Log Viewer" at the bottom of all my programs so I (and others) can see all PowerShell activity without viewing it in a shell window.


code:
[void] [System.Reflection.Assembly]::LoadWithPartialName("System.Drawing")
[void] [System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms")

Start-Transcript -Path .\transcript.log -NoClobber -Append

Function Generate-Form {

    Add-Type -AssemblyName System.Windows.Forms
    Add-Type -AssemblyName System.Drawing

    $objForm = New-Object System.Windows.Forms.Form
    $objForm.Text = "Output Tester"
    $objForm.Size = New-Object System.Drawing.Size(1000,700)
    $objForm.StartPosition = "CenterScreen"

    $outputBox = New-Object System.Windows.Forms.RichTextBox
    $outputBox.Location = New-Object System.Drawing.Size(10,370)
    $outputBox.Size = New-Object System.Drawing.Size(900,250)
    $outputBox.MultiLine = $True
    $outputBox.ReadOnly = $True
    $outputBox.ScrollBars = "Both"

    $outputBox.Text = Get-Content ./transcript.log | Out-String
    $outputBox.SelectionStart = $outputBox.Text.Length
    $outputBox.ScrollToCaret()
    $objForm.Controls.Add($outputBox)

    $objForm.Topmost = $True

    $objForm.Add_Shown({$objForm.Activate()})
    $objForm.ShowDialog() | Out-Null
}

Generate-Form
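One possible all-PowerShell approach: poll the transcript with a WinForms timer and Get-Content -Tail (available in PowerShell 3.0+), wired up inside Generate-Form before the ShowDialog() call. A sketch, untested against a live transcript (Start-Transcript holds the file open, so reads may behave differently):
code:
$timer = New-Object System.Windows.Forms.Timer
$timer.Interval = 1000  # refresh once a second
$timer.Add_Tick({
    # -Tail re-reads only the end of the file rather than the whole transcript
    $outputBox.Text = Get-Content .\transcript.log -Tail 200 | Out-String
    $outputBox.SelectionStart = $outputBox.Text.Length
    $outputBox.ScrollToCaret()
})
$timer.Start()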

vanity slug
Jul 20, 2010

Why not output it to a log file and use something like CMTrace for live updates?

Weaponized Autism
Mar 26, 2006

All aboard the Gravy train!
Hair Elf

Jeoh posted:

Why not output it to a log file and use something like CMTrace for live updates?

Oh I absolutely could, but I wanted to figure out a way to do this solely through PowerShell and not rely on other applications.

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
I'm fighting badly with a script to scan a list of machines for installed KBs. I create a remote WindowsUpdate session and everything goes pretty well until I start trying to filter the data. I was never able to get the Search method of IUpdateSearcher to return anything meaningful, so I used some methodology I found in a TechNet article to walk the collection. The line where everything goes sideways is the loop that iterates across the results. It gives me exactly what I'm looking for, but it spins up a ton of network activity and takes about 20 seconds to process. That feels like an awfully long time to iterate across a completed query (like, 300 lines of text), and it's fairly unwieldy when you multiply it by 100 or 1000 machines. I'm not anything approaching a PS expert, but is there a way to just pull the entire collection over locally once so that I can run all my comparisons against that?

code:
Param(
	[Parameter(Mandatory=$true,Position=1)]
	[string]$kbs,
	
	[Parameter(Mandatory=$true,Position=2)]
	[string]$computers
)

function Get-Matches($Pattern) { 
  begin { $regex = New-Object Regex($pattern) }
  process { foreach ($match in ($regex.Matches($_))) { ([Object[]]$match.Groups)[-1].Value } }
}

function Get-KBs($pcname){
    if (Test-Connection -Count 1 -Quiet $pcname){
        $OS = Get-WmiObject -Computer $pcname -Class Win32_OperatingSystem
        $Report = @()
        $objSession = [activator]::CreateInstance([type]::GetTypeFromProgID("Microsoft.Update.Session",$pcname))
        $objSearcher= $objSession.CreateUpdateSearcher()
        $colSuccessHistory = $objSearcher.QueryHistory(0, $objSearcher.GetTotalHistoryCount())
        Foreach($objEntry in $colSuccessHistory | where {$_.ResultCode -eq '2'}) {
            $Report += $objEntry.Title
        }
        $objSession = $null
        
        $kb_regex = "(" + [string]::Join("|",$kbItems) + ")"
        $kb_matches = $Report | Get-Matches $kb_regex
        if ($kb_matches){
            return ($pcname + "`t" + $OS.caption + "`t" + $kb_matches.Length + " KBs Found`t" + [string]::Join("`t",$kb_matches))
        } else {
            return ($pcname + "`t" + $OS.caption + "`tNo KBs Found")
        }
    } else {
        Write-Output $($pcname + "`tCannot connect to PC.")
    }
}

#################################################

Clear-Host

Write-Output $("Machine Name`tOS Version`tKBs Installed")
$kbItems = Get-Content $kbs
$pcs = Get-Content $computers
	
foreach($pc in $pcs){
	Write-Host "Querying $pc, please wait..."
    Get-KBs $pc
}

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

Toshimo posted:

I'm fighting badly with a script to scan a list of machines for installed KBs. I create a remote WindowsUpdate session and everything goes pretty well until I start trying to filter the data. I was never able to get the Search method of IUpdateSearcher to return anything meaningful, so I used some methodology I found in a TechNet article to walk the collection. The line where everything goes sideways is the loop that iterates across the results. It gives me exactly what I'm looking for, but it spins up a ton of network activity and takes about 20 seconds to process. That feels like an awfully long time to iterate across a completed query (like, 300 lines of text), and it's fairly unwieldy when you multiply it by 100 or 1000 machines. I'm not anything approaching a PS expert, but is there a way to just pull the entire collection over locally once so that I can run all my comparisons against that?

code:
Param(
	[Parameter(Mandatory=$true,Position=1)]
	[string]$kbs,
	
	[Parameter(Mandatory=$true,Position=2)]
	[string]$computers
)

function Get-Matches($Pattern) { 
  begin { $regex = New-Object Regex($pattern) }
  process { foreach ($match in ($regex.Matches($_))) { ([Object[]]$match.Groups)[-1].Value } }
}

function Get-KBs($pcname){
    if (Test-Connection -Count 1 -Quiet $pcname){
        $OS = Get-WmiObject -Computer $pcname -Class Win32_OperatingSystem
        $Report = @()
        $objSession = [activator]::CreateInstance([type]::GetTypeFromProgID("Microsoft.Update.Session",$pcname))
        $objSearcher= $objSession.CreateUpdateSearcher()
        $colSuccessHistory = $objSearcher.QueryHistory(0, $objSearcher.GetTotalHistoryCount())
        Foreach($objEntry in $colSuccessHistory | where {$_.ResultCode -eq '2'}) {
            $Report += $objEntry.Title
        }
        $objSession = $null
        
        $kb_regex = "(" + [string]::Join("|",$kbItems) + ")"
        $kb_matches = $Report | Get-Matches $kb_regex
        if ($kb_matches){
            return ($pcname + "`t" + $OS.caption + "`t" + $kb_matches.Length + " KBs Found`t" + [string]::Join("`t",$kb_matches))
        } else {
            return ($pcname + "`t" + $OS.caption + "`tNo KBs Found")
        }
    } else {
        Write-Output $($pcname + "`tCannot connect to PC.")
    }
}

#################################################

Clear-Host

Write-Output $("Machine Name`tOS Version`tKBs Installed")
$kbItems = Get-Content $kbs
$pcs = Get-Content $computers
	
foreach($pc in $pcs){
	Write-Host "Querying $pc, please wait..."
    Get-KBs $pc
}
I have used the update searcher before, but I don't do remote sessions with it. The delays you're seeing are probably the result of the "completed" query not really containing all the data needed. It's doing a sort of lazy loading so that it only talks to the other side when the value is requested. I'm speculating; I'm not super familiar with it.

But there are 2 ways to mitigate this:

1. Parallelize the process by using PowerShell Jobs. Then the 20 second pauses run in parallel (not perfectly, it will probably take more than 20 seconds just to set up all the jobs).

Be a bit careful with this; jobs are not threads. You spawn at least one new process with each background job. If you have 1000 servers, then consider whether the machine you're running this on can handle 1001+ PowerShell sessions and the associated DCOM (?) traffic from the Update Searcher concurrently.

2. Don't use a remote searcher. Instead, use a PowerShell remoting session so that your code runs on the remote machine, then just use the local searcher. This requires having PowerShell remoting set up and configured on your servers. If you don't have this already, then you should; it's incredibly useful. I have an article about configuring it through Group Policy. Then, you can use:
code:
Invoke-Command -ComputerName $pcname -ScriptBlock {
    # All your code
}
You can actually use this with option 1 as well, by using the -AsJob parameter with Invoke-Command. Now the local PowerShell sessions are basically just waiting for the remote side (which is doing all the work) to give back a result. Plus, the remote side might have a smaller delay or none at all (the 20 seconds might go away).


I highly recommend option 2, whether you use jobs or not.

Here's a good article on running jobs out of a queue, in case you have enough servers that jobs will wreak havoc on the local machine.
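As a concrete sketch of option 2: Invoke-Command will even do the fan-out for you if you hand it the whole list of names (it throttles to 32 concurrent connections by default; tune with -ThrottleLimit). The scriptblock below reuses the update-history logic from the original script:
code:
$results = Invoke-Command -ComputerName $pcs -ThrottleLimit 32 -ScriptBlock {
    # This runs locally on each remote machine, so no DCOM chatter back and forth
    $searcher = (New-Object -ComObject Microsoft.Update.Session).CreateUpdateSearcher()
    $history  = $searcher.QueryHistory(0, $searcher.GetTotalHistoryCount())
    $history | Where-Object { $_.ResultCode -eq 2 } | ForEach-Object { $_.Title }
}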

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

Briantist posted:

But there are 2 ways to mitigate this:

1. Parallelize the process by using PowerShell Jobs. Then the 20 second pauses run in parallel (not perfectly, it will probably take more than 20 seconds just to set up all the jobs).

Be a bit careful with this; jobs are not threads. You spawn at least one new process with each background job. If you have 1000 servers, then consider whether the machine you're running this on can handle 1001+ PowerShell sessions and the associated DCOM (?) traffic from the Update Searcher concurrently.

2. Don't use a remote searcher. Instead, use a PowerShell remoting session so that your code runs on the remote machine, then just use the local searcher. This requires having PowerShell remoting set up and configured on your servers. If you don't have this already, then you should; it's incredibly useful. I have an article about configuring it through Group Policy. Then, you can use:

I'll take a look when I get back in on Monday, but I'm pretty sure PowerShell remoting is not enabled, and I don't have the pull to get the GPO changed. I'm running my scripts from a Win7 x86 box with 4 GB of RAM and about a billion layers of cruft. I've done the parallelization both through jobs and through good old "foreach -parallel". Either one helps, but I don't get too many concurrent jobs. Thanks for the links, though. I'll give them a read.

Vulture Culture
Jul 14, 2003

I was never enjoying it. I only eat it for the nutrients.
If you need better performance than background jobs can give you for tasks that are embarrassingly parallel, you can also consider runspaces, or workflows if you need more control over the way results are aggregated and returned. Workflows might be really helpful here just for the built-in checkpointing capabilities.
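A minimal runspace-pool sketch for the embarrassingly-parallel case (assumes $pcs holds the machine names; swap the scriptblock body for the real per-machine work):
code:
$pool = [runspacefactory]::CreateRunspacePool(1, 16)  # at most 16 concurrent runspaces
$pool.Open()

$jobs = foreach ($pc in $pcs) {
    $ps = [powershell]::Create()
    $ps.RunspacePool = $pool
    [void]$ps.AddScript({ param($name) Test-Connection -ComputerName $name -Count 1 -Quiet }).AddArgument($pc)
    @{ Name = $pc; PS = $ps; Handle = $ps.BeginInvoke() }
}

foreach ($job in $jobs) {
    "$($job.Name): $($job.PS.EndInvoke($job.Handle))"
    $job.PS.Dispose()
}
$pool.Close()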

Zaepho
Oct 31, 2013

Microsoft just posted the Advanced PowerShell Desired State Configuration (DSC) and Custom Resources Jump Start session video.
Available at: http://www.microsoftvirtualacademy.com/training-courses/advanced-powershell-desired-state-configuration-dsc-and-custom-resources

I guess I know what I'll be watching tonight.

AreWeDrunkYet
Jul 8, 2006

Toshimo posted:

I'm fighting badly with a script to scan a list of machines for installed KBs.

I know this is a PowerShell thread, but isn't this better handled by SCCM?

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

AreWeDrunkYet posted:

I know this is a PowerShell thread, but isn't this better handled by SCCM?

You would think so. But in the last month of listening to our conference calls with Microsoft, I've begun to suspect that SCCM reporting is currently held together with baling wire and bubble gum. There's some goofy stuff going on, like machines reporting as compliant immediately on release of a SUS package because the machines are unreachable, so the SUS server defaults to the last known catalog, which of course doesn't have the current package in it. Also, I don't have access to enough DBA time to write anything big and custom to get just this sort of info (which would be an arseload of work).

In regards to the other suggestions, I have verified that we don't have PowerShell remoting enabled, and every variant of parallelization I've tried has capped out rather low, because as soon as it tries a significant number of simultaneous connections, the networking on the box breaks down (it was dropping my RDP session at some points).

Zaepho
Oct 31, 2013

Toshimo posted:

like machines reporting as compliant immediately on release of a SUS package because the machines are unreachable, so the SUS server defaults to the last known catalog, which of course doesn't have the current package in it.

OK, that's really goofy and something I haven't seen. I'll have to see if I can reproduce it. Can you elaborate further? As I understand it, a machine only reports compliance/requirement for each individual update; from there, SCCM reporting summarizes that into compliance with the entire update group. WSUS itself has very little to do with it and should be completely ignored 99% of the time. I'm really interested in what you're seeing and understanding it better so I can answer any questions about it that customers might have. I'd hate to be caught with my pants down.

Newf
Feb 14, 2006
I appreciate hacky sack on a much deeper level than you.
I'd like to define a macro to clear everything from a directory. I tried
code:
set-alias delAll "li | %{ del $_.Name }"
but running it pops the error
code:
delAll : The term 'ls | %{del .Name}' is not recognized as the name of a cmdlet, function, script file, or operable program...
I guess what I want is to define a function or script - how do I do that?

12 rats tied together
Sep 7, 2006

I'd start here: http://blogs.technet.com/b/heyscriptingguy/archive/2012/07/07/weekend-scripter-cmdletbinding-attribute-simplifies-powershell-functions.aspx

If you're going to be using powershell a lot, you should also look into creating a module. I created and maintained a module full of useful niche commands for my previous employer, and it was quite enjoyable and useful.

You might want to wait until you have more custom scripts, though, because you can probably just do "del *" inside powershell and it will clear out the directory you are in.
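For reference, a function version of what Newf asked for might look like this (destructive, so sanity-check with -WhatIf first); drop it in your profile and it's available in every session:
code:
function delAll {
    # Deletes everything under the current directory
    Get-ChildItem -Force | Remove-Item -Recurse -Force
}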

epswing
Nov 4, 2003

Soiled Meat
I once had a shell script in ubuntu that I could call to watch apache logs.

Behold, logwatch.sh:
pre:
tailf $1 | awk '{if ($7 !~ /favicon.ico/ && $7 !~ /robots.txt/) print $1 " " $4 $5 $6 " " $7 " " $8 " " $9}'
Basically, "for each line that is appended to the log, if the 7th column (which is the file being requested) isn't a request for favicon.ico or robots.txt, print the columns I'm interested in, which are 1, 4, 5, 6, 7, 8 and 9".

I would just run logwatch.sh /path/to/logfile.txt which would tailf the file and pipe it to awk and bam, magic.

Now I'm streaming log files from azure with azure site log tail mywebsite. How do I do this in powershell? I want to pipe the lines output from one program into another, filter out the ones I don't want, and print the parts of the line I want to see. I've been googling in circles for half an hour just to figure out how to read from stdin in a loop and I'm still not sure how to do it.

Edit: I figured this would be a good start, just echoing the lines on [Console]::In, but it sits there waiting for user input. (If I type 'hello' and press enter, I get 'hello' back.)

pre:
while (($line = [Console]::In.ReadLine()) -ne '') {
    write-host $line
}

C:\> azure site log tail mywebsite | thescript.ps1

epswing fucked around with this message at 05:01 on Mar 31, 2015

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

epalm posted:

I once had a shell script in ubuntu that I could call to watch apache logs.

Behold, logwatch.sh:
pre:
tailf $1 | awk '{if ($7 !~ /favicon.ico/ && $7 !~ /robots.txt/) print $1 " " $4 $5 $6 " " $7 " " $8 " " $9}'
Basically, "for each line that is appended to the log, if the 7th column (which is the file being requested) isn't a request for favicon.ico or robots.txt, print the columns I'm interested in, which are 1, 4, 5, 6, 7, 8 and 9".

I would just run logwatch.sh /path/to/logfile.txt which would tailf the file and pipe it to awk and bam, magic.

Now I'm streaming log files from azure with azure site log tail mywebsite. How do I do this in powershell? I want to pipe the lines output from one program into another, filter out the ones I don't want, and print the parts of the line I want to see. I've been googling in circles for half an hour just to figure out how to read from stdin in a loop and I'm still not sure how to do it.

Edit: I figured this would be a good start, just echoing the lines on [Console]::In, but it sits there waiting for user input. (If I type 'hello' and press enter, I get 'hello' back.)

pre:
while (($line = [Console]::In.ReadLine()) -ne '') {
    write-host $line
}

C:\> azure site log tail mywebsite | thescript.ps1

Typically if you're piping into powershell like that, you can just use the $input variable in your script and it will contain everything you piped. Note that powershell reads all of the input before your script starts executing.
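So a rough thescript.ps1 equivalent of the awk filter might be (untested against real azure output; the field positions assume the same space-separated log format):
code:
# thescript.ps1
$input |
    Where-Object { ($_ -split '\s+')[6] -notmatch 'favicon\.ico|robots\.txt' } |
    ForEach-Object {
        $f = $_ -split '\s+'
        '{0} {1}{2}{3} {4} {5} {6}' -f $f[0], $f[3], $f[4], $f[5], $f[6], $f[7], $f[8]
    }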

wonderboy
Aug 15, 2001
What is the secret of your power?
While I do have a modest amount of programming background, I've never done much in Windows and I'm basically a novice with PowerShell.

We've started using Microsoft's Group Chat, the thing that's part of Office Communications Server 2007 R2. Before you ask: yes, I know that Lync replaced this and has its own "persistent chat" that replaces the Group Chat functionality, but we're part of a woefully sluggish corporate environment that can't be bothered to get up to speed with that, and I have no idea how long it might be before they get their act together.

What I'd like is a PowerShell script that can send a message to one of the groups I'm a part of in my existing/running instance of Group Chat, and I'm having a hell of a time finding either examples or much in the way of documentation on how to go about it. I do, on the other hand, have Lync 2010 running as well, and found a nice and simple example of how to start a Lync "conversation" with members of a Lync group, which I'll paste below in case anyone finds it interesting.

If anyone knows of or can help create something similar for Group Chat, I'd very much appreciate it.

(the below assumes you have the Lync SDK installed in the standard location; change the first line if that isn't true for you)
code:
$assemblyPath = "C:\Program Files (x86)\Microsoft Lync\SDK\Assemblies\Desktop\Microsoft.Lync.Model.DLL"
Import-Module $assemblyPath

$IMType = 1
$PlainText = 0

$cl = [Microsoft.Lync.Model.LyncClient]::GetClient()
$conv = $cl.ConversationManager.AddConversation()

$gs = $cl.ContactManager.Groups

foreach ($g in $gs)
{
    if ($g.Name -eq "Target Group Name Goes Here")
    {
        foreach ($contact in $g)
        {
            $null= $conv.AddParticipant($contact)
        }
    }
}

$d = New-Object "System.Collections.Generic.Dictionary[Microsoft.Lync.Model.Conversation.InstantMessageContentType,String]"
$d.Add($PlainText, "This is a test message.")

$m = $conv.Modalities[$IMType]
$null = $m.BeginSendMessage($d, $null, $d)


monkey
Jan 20, 2004

by zen death robot
Yams Fan
I want to do something in Windows 7 that should be trivially simple. I don't have PowerShell, but it looks like it's the right tool for the job, so I'm installing it now.

All I want to do is get every subfolder in a specific folder and overwrite the modified date with the created date. Is this the sort of thing PowerShell does?
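It is; a minimal sketch (assuming the target folder is C:\Target; PowerShell 2.0 on Windows 7 lacks Get-ChildItem -Directory, hence the PSIsContainer filter):
code:
Get-ChildItem 'C:\Target' |
    Where-Object { $_.PSIsContainer } |
    ForEach-Object { $_.LastWriteTime = $_.CreationTime }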
