Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy
WMF 5.0 Production Preview is here!

orange sky
May 7, 2007

This is a reaaaaally niche request, so I'm not really hoping to strike gold, but you never know.

So, we're performing a Domino to Exchange migration and the migration product doesn't migrate permissions on anything (bit of a nightmare, really). As such, we're better off building a script that gets all the permissions on Domino and then another script that puts those permissions onto exchange automatically. Most of the stuff we'll probably be able to do, but I'm no expert at working with Domino and Powershell.

Do any of you guys have a script that might get a list of the permissions/delegations on Domino?

Vulture Culture
Jul 14, 2003

I was never enjoying it. I only eat it for the nutrients.

orange sky posted:

This is a reaaaaally niche request, so I'm not really hoping to strike gold, but you never know.

So, we're performing a Domino to Exchange migration and the migration product doesn't migrate permissions on anything (bit of a nightmare, really). As such, we're better off building a script that gets all the permissions on Domino and then another script that puts those permissions onto exchange automatically. Most of the stuff we'll probably be able to do, but I'm no expert at working with Domino and Powershell.

Do any of you guys have a script that might get a list of the permissions/delegations on Domino?
Oddly, the first thing I found Googling "powershell domino" to see how remotely possible this even is covers ACLs specifically:

http://baldwin-ps.blogspot.com/2013/08/lotus-notes-and-powershell-retrieve-acl.html

You'll need to use their COM objects for basically all the automation.
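
The skeleton from that post boils down to something like this (untested sketch; the server, database path, and password are placeholders, and you need the Notes client installed locally so the COM classes are registered):
code:
# Sketch only: requires the Lotus Notes client for the COM registration
$session = New-Object -ComObject Lotus.NotesSession
$session.Initialize('NotesIdPasswordHere')   # placeholder password

# Server and database path are placeholders
$db = $session.GetDatabase('YourDominoServer/YourOrg', 'mail\someuser.nsf')

# Walk the database ACL entry by entry
$acl = $db.ACL
$entry = $acl.GetFirstEntry()
while ($entry -ne $null) {
    [pscustomobject]@{ Database = $db.Title; Name = $entry.Name; Level = $entry.Level }
    $entry = $acl.GetNextEntry($entry)
}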

MJP
Jun 17, 2007

Are you looking at me Senpai?

Grimey Drawer
Another question. I've again got an array, $computers. It's a list of computers that I copied out from a spreadsheet of possible stale accounts.

I need to find out, if possible, who last logged into them.

What I have is this:

code:
$computers | %{get-wmiobject win32_computersystem.username}
I get an error that it's an invalid query. If I just do the above but without .username it just repeats WMI info for my machine.

I also tried:

code:
foreach ($computer in $computers) | %{get-wmiobject win32_computersystem.username}
I get an error that it's missing the statement body in the foreach loop.

I'm not fixed on using an array for this - totally open to ideas to do this better.

Dr. Arbitrary
Mar 15, 2006

Bleak Gremlin

MJP posted:

Another question. I've again got an array, $computers. It's a list of computers that I copied out from a spreadsheet of possible stale accounts.

I need to find out, if possible, who last logged into them.

What I have is this:

code:
$computers | %{get-wmiobject win32_computersystem.username}
I get an error that it's an invalid query. If I just do the above but without .username it just repeats WMI info for my machine.

I also tried:

code:
foreach ($computer in $computers) | %{get-wmiobject win32_computersystem.username}
I get an error that it's missing the statement body in the foreach loop.

I'm not fixed on using an array for this - totally open to ideas to do this better.

Just throwing this out there before I start testing, could you try this syntax:
(Get-WMIObject -Computername $computer -Class Win32_computersystem).username

Edit:
I'm phoneposting so this is hard, but it looks like your structure is sorta weird. I don't think you want to use a pipe after that foreach.

Dr. Arbitrary fucked around with this message at 21:53 on Sep 3, 2015

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
$computers | %{$(get-wmiobject -ComputerName $_ -class win32_computersystem).username}

Toshimo fucked around with this message at 22:16 on Sep 3, 2015

Swink
Apr 18, 2006
Left Side <--- Many Whelps
I hate that % alias.

12 rats tied together
Sep 7, 2006

MJP posted:

Another question. I've again got an array, $computers. It's a list of computers that I copied out from a spreadsheet of possible stale accounts.
[...]
I get an error that it's an invalid query. If I just do the above but without .username it just repeats WMI info for my machine.

To attempt to explain this a little better, you have an array or collection or whatever. A list of computers that you copied from a spreadsheet. You're using % or ForEach-Object and then doing something. It might help if you visualize it like this:
code:
$computers = 
    @(
        computer1 => take this and { Get-WMIObject win32_computersystem }
        computer2 => take this and { Get-WMIObject win32_computersystem }
        computer3 => take this and { Get-WMIObject win32_computersystem }
     )
What your script is going to do is, for every object inside $computers, run the command inside {}. If you have 3 computers, you run the command 3 times. The command inside {} doesn't contain anything that might tell it to look anywhere other than your computer, so that's what it does, and that's why you get WMI info for your machine a bunch of times in a row. Also, as far as I can tell, "win32_computersystem.username" is not a valid WMI class name, and that's why you're getting the invalid query with your other attempt.

The thing you want it to do, what Toshimo posted, is take each object inside your array $computers and "drop" it into the {}s. "$_" is a built-in variable that generally refers to "what am I working on right now?" -- since you are going through a list of computer names, in each pass through the loop "$_" is going to be the first, second, third, etc, computer name.

Try this: $computers | % { write-output "Pretend that I got info for $_" }
Compare it to: $computers | % { write-output "Pretend that I got info for username" }

So, if you rewrite it to be "$computers | % { get-wmiobject -computername $_ win32_computersystem }" it turns into this:
code:
$computers = 
    @(
        computer1 => take this and { Get-WMIObject -computername computer1 win32_computersystem }
        computer2 => take this and { Get-WMIObject -computername computer2 win32_computersystem }
        computer3 => take this and { Get-WMIObject -computername computer3 win32_computersystem }
     )
Hope that makes sense.
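
If you want something you can paste wholesale, it's roughly this shape (untested; assumes $computers holds plain hostname strings):
code:
$computers | ForEach-Object {
    try {
        $cs = Get-WmiObject -ComputerName $_ -Class Win32_ComputerSystem -ErrorAction Stop
        # Note: UserName is whoever is logged on right now; it's empty if nobody is
        [pscustomobject]@{ Computer = $_; LoggedOnUser = $cs.UserName }
    }
    catch {
        # Offline/stale machines land here instead of killing the loop
        [pscustomobject]@{ Computer = $_; LoggedOnUser = "error: $($_.Exception.Message)" }
    }
}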

12 rats tied together fucked around with this message at 16:56 on Sep 4, 2015

Hadlock
Nov 9, 2004

I run a mixed environment of 32 and 64 bit machines at work and have to gently caress with the registry periodically. Right now I'm just looking for wow64node or whatever in the registry path to move the project along but there's got to be a better way.

What's the best practice for determining the bit count of the machine?

Is there a way to dynamically navigate to the 64 or 32 bit version of a registry path?

I guess you could do a bit check at the beginning of the script and say if 64 bit,

$a="wow64node\"

And then in all your paths say,

$reg = 'hlkm:software\$a path\to\stuff'

? My syntax is probably off a little (phone posting) but that seems a little obtuse still.

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

Hadlock posted:

I run a mixed environment of 32 and 64 bit machines at work and have to gently caress with the registry periodically. Right now I'm just looking for wow64node or whatever in the registry path to move the project along but there's got to be a better way.

What's the best practice for determining the bit count of the machine?

Is there a way to dynamically navigate to the 64 or 32 bit version of a registry path?

I guess you could do a bit check at the beginning of the script and say if 64 bit,

$a="wow64node\"

And then in all your paths say,

$reg = 'hlkm:software\$a path\to\stuff'

? My syntax is probably off a little (phone posting) but that seems a little obtuse still.

code:
[Environment]::Is64BitOperatingSystem
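That returns $true or $false, so you can build the registry prefix off of it, something like (sketch; the vendor key is made up):
code:
# On 64-bit Windows, 32-bit software's keys live under Wow6432Node
$wow = if ([Environment]::Is64BitOperatingSystem) { 'Wow6432Node\' } else { '' }
$reg = "HKLM:\SOFTWARE\${wow}SomeVendor\SomeProduct"
Get-ItemProperty -Path $reg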

Walked
Apr 14, 2003

What is everyone doing for version/source control?

I'm growing in both the number of scripts and the number of people I share them with, to the point where I need something in place to manage it all a bit better.

Looking at git or TFS Online options. Any suggestions on workflows I can move to?

Any good git primer you guys can recommend if that's the way to go?

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

Walked posted:

What is everyone doing for version/source control?

I'm growing in both the number of scripts and the number of people I share them with, to the point where I need something in place to manage it all a bit better.

Looking at git or TFS Online options. Any suggestions on workflows I can move to?

Any good git primer you guys can recommend if that's the way to go?

Use VSO + Git. That way, you have a local repo and you can make sure if your PC explodes you're still covered by source control.

Vulture Culture
Jul 14, 2003

I was never enjoying it. I only eat it for the nutrients.

Ithaqua posted:

Use VSO + Git. That way, you have a local repo and you can make sure if your PC explodes you're still covered by source control.
Bitbucket is another free option with unlimited private repositories.

Walked
Apr 14, 2003

Followup on the git question:

Spent part of today reading up, and fairly comfortable with the basic concepts.

However, my use case is this:

I have my local repo; makes sense
Which I'll push to VSO for changes I'm happy with.

How would I go about also getting the VSO repo synced up with a DFS namespace/fileshare at work? Basically I'd like to have a share at work that is only (and automatically) updated with changes that are pushed to the server.

Any suggestions there? I'd like as much automation as possible so the repo and share are as in-sync as possible without human intervention.

Vulture Culture
Jul 14, 2003

I was never enjoying it. I only eat it for the nutrients.

Walked posted:

Followup on the git question:

Spent part of today reading up, and fairly comfortable with the basic concepts.

However, my use case is this:

I have my local repo; makes sense
Which I'll push to VSO for changes I'm happy with.

How would I go about also getting the VSO repo synced up with a DFS namespace/fileshare at work? Basically I'd like to have a share at work that is only (and automatically) updated with changes that are pushed to the server.

Any suggestions there? I'd like as much automation as possible so the repo and share are as in-sync as possible without human intervention.
Create a repository on a shared drive. Create a scheduled task to automatically git pull your wanted branch(es) from VSO. Ensure it's read-only (i.e. only the user running that pull script can write to it).
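
The pull script the task runs can be dead simple, something like (sketch; path and branch are placeholders):
code:
Set-Location 'D:\Shares\PSScripts'   # the repo on the shared drive
git fetch origin
git reset --hard origin/master       # force the share to exactly match what was pushed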

Walked
Apr 14, 2003

Vulture Culture posted:

Create a repository on a shared drive. Create a scheduled task to automatically git pull your wanted branch(es) from VSO. Ensure it's read-only (i.e. only the user running that pull script can write to it).

That's where I'm stumbling; I can't find any documentation on how to cook the authentication for VSO into a scheduled task. Otherwise this is 100% exactly what I'd like to do. Any tips on what to look for on that? I'm normally pretty good with google-fu, but between not being a developer, not having used VSO before, and not knowing git, I'm struggling :negative:

edit: Figured it out; dang. Easy! The documentation on this isn't everywhere, but you can just use a VSO Personal Access Token:
code:
git clone https://{{ACCESS_TOKEN}}@project.visualstudio.com/DefaultCollection/_git/info
Thanks :)

Walked fucked around with this message at 21:52 on Sep 5, 2015

bgreman
Oct 8, 2005

ASK ME ABOUT STICKING WITH A YEARS-LONG LETS PLAY OF THE MOST COMPLICATED SPACE SIMULATION GAME INVENTED, PLAYING BOTH SIDES, AND SPENDING HOURS GOING ABOVE AND BEYOND TO ENSURE INTERNET STRANGERS ENJOY THEMSELVES
So I'm running into another permissions issue with my Powershell application deployment script.

If I remote (using remote desktop) into the "jump box" where the build is staged, I can run the deployment script just fine, after the changes I made in the discussion around this post.

However, I'm trying to remotely invoke the script from the build machine where the build actually takes place. Basically the flow is: do build, copy package and deployment script to jump box, try to use invoke-command to execute script on jump box. The script is being executed, but large portions of it are bombing out, presumably due to the same double-hop issues I had to fix in the script itself.

So basically I'm doing this:

code:
$cred = New-Object -TypeName System.Management.Automation.PSCredential ($userName, $password)
$sesh = New-PSSession -ComputerName path.to.jumpbox -Credential $cred
Invoke-Command -Session $sesh { J:\path\to\AutoDeploy.ps1 }
All the Copy-Items and Get-Services in the script seem to be failing.

If I want to maintain the workflow posted above, how would I go about handling this? Can I somehow forward the credential on to the remote execution of the script?

That is, would something like this work?

code:
$cred = New-Object -TypeName System.Management.Automation.PSCredential ($userName, $password)
$sesh = New-PSSession -ComputerName path.to.jumpbox -Credential $cred
Invoke-Command -Session $sesh { Invoke-Command -ComputerName "." -Credential $using:cred { J:\path\to\AutoDeploy.ps1 } }
These permissions issues with remoting have me so confused.

Edit: Tried the above, no luck. Same access denied errors from the script, running on the jump box, when it tries to copy files to other machines. RDP-ing into the jump box and running the script causes no issues.

bgreman fucked around with this message at 17:46 on Sep 8, 2015

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

bgreman posted:

So I'm running into another permissions issue with my Powershell application deployment script.

If I remote (using remote desktop) into the "jump box" where the build is staged, I can run the deployment script just fine, after the changes I made in the discussion around this post.

However, I'm trying to remotely invoke the script from the build machine where the build actually takes place. Basically the flow is: do build, copy package and deployment script to jump box, try to use invoke-command to execute script on jump box. The script is being executed, but large portions of it are bombing out, presumably due to the same double-hop issues I had to fix in the script itself.

So basically I'm doing this:

code:
$cred = New-Object -TypeName System.Management.Automation.PSCredential ($userName, $password)
$sesh = New-PSSession -ComputerName path.to.jumpbox -Credential $cred
Invoke-Command -Session $sesh { J:\path\to\AutoDeploy.ps1 }
All the Copy-Items and Get-Services in the script seem to be failing.

If I want to maintain the workflow posted above, how would I go about handling this? Can I somehow forward the credential on to the remote execution of the script?

That is, would something like this work?

code:
$cred = New-Object -TypeName System.Management.Automation.PSCredential ($userName, $password)
$sesh = New-PSSession -ComputerName path.to.jumpbox -Credential $cred
Invoke-Command -Session $sesh { Invoke-Command -ComputerName "." -Credential $using:cred { J:\path\to\AutoDeploy.ps1 } }
These permissions issues with remoting have me so confused.

You can forward credentials by enabling, configuring, and using CredSSP, but I find using CredSSP is usually not needed, and other workarounds are cleaner.

For example in your case, the entire purpose of remoting is to start a script without giving it any parameters. For that, I would probably create a scheduled task (on-demand, no actual schedule) that runs your script with the specified credentials.

Then your invocation becomes:
code:
Invoke-Command -Session $sesh -ScriptBlock { Start-ScheduledTask -TaskName "AutoDeploy" }
or if you're using older versions of PowerShell that don't have that cmdlet, this will work (and will work in the newer ones too):
code:
Invoke-Command -Session $sesh -ScriptBlock { schtasks.exe /Run /TN "AutoDeploy" }
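Creating the task can be scripted too, something along these lines (a sketch, not tested here; needs the ScheduledTasks module from Win8/2012+, and the account, password handling, and paths are placeholders):
code:
$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -File J:\path\to\AutoDeploy.ps1'
# No -Trigger: the task exists only to be started on demand
Register-ScheduledTask -TaskName 'AutoDeploy' -Action $action -User 'DOMAIN\deploysvc' -Password 'P@ssw0rd'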
A scheduled task can be created through group policy if you need it on lots of domain-joined machines, and it's easy to invoke the task, but it also gives anyone who can start a task the ability to run that script. That might be bad depending on your security requirements.

Another thing you can do is to create a new PSSession configuration on the target machine, and give that configuration a RunAs credential. You can finely control who is allowed to connect to this endpoint, but it's a bit more work upfront.
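
For reference, the endpoint route looks something like this (sketch; the names are examples, and you'd want to think through the security descriptor):
code:
# Once, on the jump box: register an endpoint that always runs as the deploy account.
# -ShowSecurityDescriptorUI pops a dialog to control exactly who may connect.
Register-PSSessionConfiguration -Name 'AutoDeploy' -RunAsCredential (Get-Credential) -ShowSecurityDescriptorUI

# Then, from the build machine:
Invoke-Command -ComputerName path.to.jumpbox -ConfigurationName 'AutoDeploy' -ScriptBlock { J:\path\to\AutoDeploy.ps1 }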

bgreman
Oct 8, 2005

ASK ME ABOUT STICKING WITH A YEARS-LONG LETS PLAY OF THE MOST COMPLICATED SPACE SIMULATION GAME INVENTED, PLAYING BOTH SIDES, AND SPENDING HOURS GOING ABOVE AND BEYOND TO ENSURE INTERNET STRANGERS ENJOY THEMSELVES

Briantist posted:

You can forward credentials by enabling, configuring, and using CredSSP, but I find using CredSSP is usually not needed, and other workarounds are cleaner.

For example in your case, the entire purpose of remoting is to start a script without giving it any parameters. For that, I would probably create a scheduled task (on-demand, no actual schedule) that runs your script with the specified credentials.

Then your invocation becomes:
code:
Invoke-Command -Session $sesh -ScriptBlock { Start-ScheduledTask -TaskName "AutoDeploy" }
or if you're using older versions of PowerShell that don't have that cmdlet, this will work (and will work in the newer ones too):
code:
Invoke-Command -Session $sesh -ScriptBlock { schtasks.exe /Run /TN "AutoDeploy" }
A scheduled task can be created through group policy if you need it on lots of domain-joined machines, and it's easy to invoke the task, but it also gives anyone who can start a task the ability to run that script. That might be bad depending on your security requirements.

Another thing you can do is to create a new PSSession configuration on the target machine, and give that configuration a RunAs credential. You can finely control who is allowed to connect to this endpoint, but it's a bit more work upfront.

I'll give this a go, thanks for the suggestions! I only need to run the script (for now) on the jump box. If someone unauthorized is able to get on the box and run the script, we've got bigger issues than someone accidentally running a deploy, so I'm not too worried about the security of the scheduled task.

Hadlock
Nov 9, 2004

I have to dynamically parse an XML file

This is what my XML file looks like. Each line looks like <component Registration="None" Name="Filename.dll"/>
I can't figure out how to auto-pretty a single line XML so here's a screenshot


Basically what I want to do is parse $manifest.manifest.Machines.Machine where Code="BAT" and then write these values

$path = $manifest.manifest.Machines.Machine.installpath.Default (should = C:\path\)
$file = $manifest.manifest.Machines.Machine.installpath.component.name (should = Filename.dll)

So far I have this to parse up to the BAT part but getting the actual data out is stumping me.

code:
$xml = "\\path\to\manifest.xml"
[xml]$manifest = Get-Content $xml

foreach($machine in $manifest.manifest.Machines.Machine){
    if($machine.Code -match "BAT"){
        Write-Host "BAT found"

        foreach ($Component in $manifest.Manifest.Machines.Machine.InstallPath) {
            if($Component.Name -match "dll"){
                #Do something
                Write-Host "dll found:" $component.name
            }
        }
    }
}

# http://stackoverflow.com/questions/18032147/parsing-xml-using-powershell
Thoughts? This is probably really easy, I've just never had to parse one before :confused:

Video Nasty
Jun 17, 2003

Hadlock posted:

I have to dynamically parse an XML file

So far I have this to parse up to the BAT part but getting the actual data out is stumping me.

code:
$xml = "\\path\to\manifest.xml"
[xml]$manifest = Get-Content $xml

foreach($machine in $manifest.manifest.Machines.Machine){
    if($machine.Code -match "BAT"){
        Write-Host "BAT found"

        foreach ($Component in $manifest.Manifest.Machines.Machine.InstallPath) {
            if($Component.Name -match "dll"){
                #Do something
                Write-Host "dll found:" $component.name
            }
        }
    }
}
Thoughts? This is probably really easy, I've just never had to parse one before :confused:

Isn't $component.name supposed to be capitalized [$Component.name], or was that just mis-typed in the example? (PowerShell isn't actually case-sensitive about variable names, so that alone shouldn't break anything.)
Everything else checks out and should function properly as long as you have access to that network-shared file through PowerShell.

Video Nasty fucked around with this message at 00:27 on Sep 9, 2015

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

Hadlock posted:

I have to dynamically parse an XML file

This is what my XML file looks like. Each line looks like <component Registration="None" Name="Filename.dll"/>
I can't figure out how to auto-pretty a single line XML so here's a screenshot


Basically what I want to do is parse $manifest.manifest.Machines.Machine where Code="BAT" and then write these values

$path = $manifest.manifest.Machines.Machine.installpath.Default (should = C:\path\)
$file = $manifest.manifest.Machines.Machine.installpath.component.name (should = Filename.dll)

So far I have this to parse up to the BAT part but getting the actual data out is stumping me.

code:
$xml = "\\path\to\manifest.xml"
[xml]$manifest = Get-Content $xml

foreach($machine in $manifest.manifest.Machines.Machine){
    if($machine.Code -match "BAT"){
        Write-Host "BAT found"

        foreach ($Component in $manifest.Manifest.Machines.Machine.InstallPath) {
            if($Component.Name -match "dll"){
                #Do something
                Write-Host "dll found:" $component.name
            }
        }
    }
}

# http://stackoverflow.com/questions/18032147/parsing-xml-using-powershell
Thoughts? This is probably really easy, I've just never had to parse one before :confused:

Yeah you were really close, this is the only change you need:
code:
foreach ($Component in $manifest.Manifest.Machines.Machine.InstallPath.Component) {
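You could also skip the -match loops and filter on the attributes directly, roughly like this (untested; attribute names taken from your post):
code:
$bat  = $manifest.Manifest.Machines.Machine | Where-Object { $_.Code -eq 'BAT' }
$path = $bat.InstallPath.Default
$file = $bat.InstallPath.Component | Where-Object { $_.Name -like '*.dll' } | Select-Object -ExpandProperty Name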

bgreman
Oct 8, 2005

ASK ME ABOUT STICKING WITH A YEARS-LONG LETS PLAY OF THE MOST COMPLICATED SPACE SIMULATION GAME INVENTED, PLAYING BOTH SIDES, AND SPENDING HOURS GOING ABOVE AND BEYOND TO ENSURE INTERNET STRANGERS ENJOY THEMSELVES
Does the PowerShell ISE have a different implementation of Test-Path or something?



Top window is a normal Powershell command line where I've populated those values by manually running some lines from my script. Bottom window is from the PowerShell ISE that was paused after running the appropriate lines.

The problem I'm having is that Test-Path $BuildZip is returning false if I run it from the command line (or if I just have PowerShell execute my script), but it returns true and the script executes normally if I run this stuff from within the ISE. What gives?

(I know I could probably fix this by using Test-Path $BuildZip.FullName, but I want to understand what's happening here if possible).

Edit: Right as I clicked post I understood. It returns True within the ISE because the execution directory is the same as the folder containing $BuildZip, but in the standalone command line, it's not. When I changed directory to that dir in the command line and ran Test-Path $BuildZip it worked just fine. Derp.
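
For anyone else who trips over this, the gotcha is easy to demo (made-up paths):
code:
Set-Location C:\Builds
Test-Path 'build.zip'            # True  -- resolves against C:\Builds
Set-Location C:\
Test-Path 'build.zip'            # False -- same string, different working directory
Test-Path 'C:\Builds\build.zip'  # True either way; $BuildZip.FullName gives you this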

Swink
Apr 18, 2006
Left Side <--- Many Whelps
Is anyone using the http://psappdeploytoolkit.com and is it a worthwhile thing to use in an SME?

edit - for context, we install\update software via GPO, but with more and more mobile staff, fewer people are actually booting up in the office, so I need a method to push out software whenever the machine is online.

Swink fucked around with this message at 08:24 on Sep 15, 2015

Doug
Feb 27, 2006

This station is
non-operational.

Swink posted:

Is anyone using the http://psappdeploytoolkit.com and is it a worthwhile thing to use in an SME?

edit - for context, we install\update software via GPO, but with more and more mobile staff, fewer people are actually booting up in the office, so I need a method to push out software whenever the machine is online.

I can't comment specifically on that, but we use PDQ Deploy and it's been really great. You can automate patches, set heartbeat triggers so installs/updates are pushed as soon as a computer comes online, and it provides a repository for updates so you can restrict update access to only that server. It pairs well with their other product PDQ Inventory but isn't necessary. http://www.adminarsenal.com/pdq-deploy

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
So I'm writing a script to clean up after an SCCM bug and I'm wondering how to approach the problem.

The problem is that SCCM has created multiple folders that have identical contents and I need to find them. The folders have random names, but they're named with a GUID. So I'd probably need a regex to say "is this a GUID of form X" or not. I also don't know how many levels deep the folders go. So it will look something like this
code:
Folder A
-Folder B
--Folder C
---Folder ???
----{GUID 1}
----{GUID 2}
So I need to find a folder that's full of GUID folders, then iterate over every GUID folder and match identical contents. I'm thinking for that a double for-each loop, so for each GUID folder I compare to every other GUID folder. But I'm just wondering what would be an easy way to compare the folders in Powershell? The files are identical, the same number, the same size, I'd imagine the hashes are the same (but I don't want to hash gigabytes of data for no reason). Any tips?

Swink
Apr 18, 2006
Left Side <--- Many Whelps

Doug posted:

I can't comment specifically on that, but we use PDQ Deploy and it's been really great.

I'm all about PDQ :) the beauty of the powershell toolkit (for me) is the notifications for the users. Allows them to defer etc.

Might be question for the windows thread.

Swink
Apr 18, 2006
Left Side <--- Many Whelps
Double post for this: https://twitter.com/Johnmroos/status/643915357384740865


PowershellGet Support in PS 3.0 and 4.0

Hadlock
Nov 9, 2004

How is that going to work? Some sort of updater package that pushes you from PS 3.0 and 4.0 to 3.1 and 4.1?

adaz
Mar 7, 2009

FISHMANPET posted:

So I'm writing a script to clean up after an SCCM bug and I'm wondering how to approach the problem.

The problem is that SCCM has created multiple folders that have identical contents and I need to find them. The folders have random names, but they're named with a GUID. So I'd probably need a regex to say "is this a GUID of form X" or not. I also don't know how many levels deep the folders go. So it will look something like this
code:
Folder A
-Folder B
--Folder C
---Folder ???
----{GUID 1}
----{GUID 2}
So I need to find a folder that's full of GUID folders, then iterate over every GUID folder and match identical contents. I'm thinking for that a double for-each loop, so for each GUID folder I compare to every other GUID folder. But I'm just wondering what would be an easy way to compare the folders in Powershell? The files are identical, the same number, the same size, I'd imagine the hashes are the same (but I don't want to hash gigabytes of data for no reason). Any tips?

So you basically want something like this right?

code:

$folders = gci C:\temp -recurse
foreach($folder in $folders){
     $name = [Guid]::Empty
     if([Guid]::TryParse($folder.Name,[ref]$name))
     {
          ## We're a guid folder! do something!
     }

}
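
If you'd rather have the regex from the original question, that works too, roughly:
code:
$guidRegex = '^\{?[0-9a-fA-F]{8}(-[0-9a-fA-F]{4}){3}-[0-9a-fA-F]{12}\}?$'
Get-ChildItem C:\temp -Recurse | Where-Object { $_.PSIsContainer -and $_.Name -match $guidRegex }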

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

Swink posted:

Double post for this: https://twitter.com/Johnmroos/status/643915357384740865

PowershellGet Support in PS 3.0 and 4.0
gently caress yeah!

adaz posted:

So you basically want something like this right?

code:

$folders = gci C:\temp -recurse
foreach($folder in $folders){
     $name = [Guid]::Empty
     if([Guid]::TryParse($folder.Name,[ref]$name))
     {
          ## We're a guid folder! do something!
     }

}
Welp, beat me to using the [Guid] type, though I might have just piped the gci directly into ForEach-Object.

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
pre:
set-strictmode -version 2.0


$folders = Get-ChildItem C:\Windows\System32\catroot -Force -Recurse | Where-Object{($_.PSIsContainer)} | Select Name, FullName

$filesize_list = @()

ForEach($folder in $folders){
     $name = [Guid]::Empty
     if([Guid]::TryParse($folder.Name,[ref]$name))
     {
        $filesize_list += Get-Item $folder.FullName | Select -Property FullName,@{Name="Size"; `
                          Expression = {(Get-ChildItem $_.FullName | Measure-Object -property length -sum).Sum + 0}}
     }
}

$file_groups = $filesize_list | Select FullName, Size | Group-Object -Property Size | Where-Object {($_.Count -gt 1) -and ($_.Name -gt 0)} `
               | Select Name, @{Name="Group"; Expression = {$_.Group.FullName}} 

$file_hashes = @{}

ForEach( $file_group in $file_groups){
    ForEach( $file_to_hash in $file_group.Group){
        $file_hashes.Add( $file_to_hash, (Get-ChildItem $file_to_hash -Force -Recurse | Get-FileHash | Select Hash))
    }
}
$file_hashes.GetEnumerator() | Group-Object -Property Value | Select Count, @{Name="Matches"; Expression = {$_.Group.Name}} `
                | Out-GridView -OutputMode Single | Select -ExpandProperty Group
Oh no what have I typed at 2am.

12 rats tied together
Sep 7, 2006

FISHMANPET posted:

So I need to find a folder that's full of GUID folders, then iterate over every GUID folder and match identical contents. I'm thinking for that a double for-each loop, so for each GUID folder I compare to every other GUID folder. But I'm just wondering what would be an easy way to compare the folders in Powershell? The files are identical, the same number, the same size, I'd imagine the hashes are the same (but I don't want to hash gigabytes of data for no reason). Any tips?

You probably don't need to match or even care about the guid, unless I'm really misunderstanding something here. This is an SCCM bug, yeah? So it probably barfed a bunch of poo poo out into some folder and you need to clean it up? Then everything is going to be a child of c:\sccm\bullshit\.

So, spin through c:\sccm\bullshit recursively, then do something like this:

code:
$ParentHash = @{}
gci -Recurse | ForEach-Object {

    $identifiable_property = get-identifiable_property # size, hash, whatever (placeholder)

    if (!($ParentHash.ContainsKey($identifiable_property))) {
        $ParentHash.$identifiable_property = @{ "match 1" = "$($_.FullName)" }
    }
    else {
        $ParentHash.$identifiable_property += @{ "match $($ParentHash.$identifiable_property.Count + 1)" = "$($_.FullName)" }
    }
}
Now you have $ParentHash which contains a key-value list of whatever property you decided to determine "uniqueness" on. The key is that unique property, the value is a pointer to another hash table that contains "match 1", "match 2" and so on, and then the full path to your instance of a matched object. You didn't have to compare anything, you went through everything once and "wrote" it all down.

So, if you know that what you're looking for has a hash of "$hash" or a size of "56000 or greater", and then you need to delete these things or move them somewhere, you can just do:
code:
$ParentHash.Keys.ForEach({ if ($_ -ge 56000) { $ParentHash.$_.Values } }) | % { Remove-Item $_ }
# or...
$ParentHash.$hash.Values | % { Remove-Item $_ }
Make sense? In my opinion you really, really don't want to start messing around with "Get | For | For | If -> assign to $variable, $variable | get | measure" nonsense when you are talking about big directory structures. Iterate through as few times as possible, record what you need and then iterate through that. These solutions are workable and probably your only option at the moment because your problem description was kind of vague. "Examine an arbitrary long list of things and record all possible matches they might have with each other" is not a pretty problem.

But, if you have some kind of "reference object" (or even a small list of them), you can probably do this much more easily with Compare-Object. If you know that SCCM hosed up with some specific set of folders and then just vomited them into subdirectories at random, you can get your "master" and use Compare-Object with a target of "gci /some/path -recurse". No need to reinvent the wheel here.
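
That route would look roughly like this (sketch; the reference path is hypothetical):
code:
$reference = Get-ChildItem \\server\drivers\known-good -Recurse
$suspect   = Get-ChildItem C:\sccm\bullshit -Recurse
# Match on name + size first; only bother hashing if this throws false positives
Compare-Object $reference $suspect -Property Name, Length -IncludeEqual -ExcludeDifferent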

12 rats tied together fucked around with this message at 17:10 on Sep 16, 2015

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
The specific bug is that when it imported a driver folder, it made a copy of the folder for every INF file. The worst case was a 500 MB Realtek driver that had 40 INF files, so it made 40 copies of that folder, resulting in a 20 GB driver pack. So yeah, SCCM spewed too many files out, but I don't know which files specifically it spewed; I just know that if something is there twice I need to mark it as bad (remediation will be done manually because it's kind of involved and scripting in SCCM sucks).

But anyway I've got another thing I have to work on today before I can dig into this but hopefully I can get to it yet today and figure out what's going on.

IAmKale
Jun 7, 2007

やらないか

Fun Shoe
Are any of you aware of a change in Powershell between Windows 8 and Windows 10 that would lead to a different date format being used for CSV output?

Here's an example of the difference I'm talking about :

quote:

Windows 8 output:
"User Name","Project_Name","10/1/2014 7:00:00 AM","Month",,"8","0","0","0","0","0"

Windows 10 output:
"User Name","Project_Name","Wed 10 1 7:00:00 AM","Month",,"8","0","0","0","0","0"

This is output from a script I wrote to export Sharepoint List data using CSOM. The issue is that Sharepoint won't recognize the new date format as a valid date, so these records won't import when I run my import script. These scripts worked perfectly on Windows 8 but I haven't been able to track down what's changed in PowerShell on Windows 10 that would cause this oddity. :iiam:

By the way here's how my export script converts a collection of List data into a CSV file:

code:
$allItems |
    %{
        # Certain columns won't always contain a value! Because of this we have to analyze each row's columns to
        # determine if there are any special values we might need to take into account
        $csvCols = @()

        foreach($col in $_listCols)
        {
            # Default to grabbing the column's value
            $expr_str = '$_["{0}"]' -f $col

            # Check to see if a column's value is a LookupValue
            # Note: We can only do that if there's an actual value in the column, hence the null check
            if($_["$col"] -ne $null)
            {
                $colType = $_["$col"].GetType().Name
                # If a particular column contains a FieldLookupValue, we need to grab it a particular way
                if(($colType -eq "FieldLookupValue") -or ($colType -eq "FieldUserValue"))
                {
                    $expr_str = '$_["{0}"].LookupValue' -f $col
                }
            }
            # This will tell select-object below which values to grab from each row before exporting to CSV
            $csvCols += @{Name=$col;expression=[scriptblock]::Create($expr_str)}
        }
        # Create objects containing only the values we need from the ListItem
        select-object -input $_ -prop $csvCols
    } |
        Export-Csv -Path ($_exportPath + '\' + $_listName + '.csv') -Encoding UTF8 -NoTypeInformation
        PrintMsg ('- {0}: Saved {1} items' -f $_listName, $allItems.Count) -f Green

IAmKale fucked around with this message at 21:20 on Sep 17, 2015

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
If you run this on both machines, what do you get:

code:
PS C:\> [System.Globalization.DateTimeFormatInfo]::CurrentInfo.FullDateTimePattern

IAmKale
Jun 7, 2007

やらないか

Fun Shoe

Toshimo posted:

If you run this on both machines, what do you get:

code:
PS C:\> [System.Globalization.DateTimeFormatInfo]::CurrentInfo.FullDateTimePattern
I don't have access to Windows 8 at the moment but here's what I get back on Windows 10:

quote:

dddd, MMMM d, yyyy h:mm:ss tt

I imagine it'll be different on Windows 8. If this is the case, is there anything I can incorporate into my script to help control date output in my CSV exports?

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

Karthe posted:

I don't have access to Windows 8 at the moment but here's what I get back on Windows 10:


I imagine it'll be different on Windows 8. If this is the case, is there anything I can incorporate into my script to help control date output in my CSV exports?

I'm not entirely sure that's the variable that's causing you problems then, since it doesn't match the output. You can try taking "-NoTypeInformation" off your export and seeing what kind of variable it is kicking out for the date.

Video Nasty
Jun 17, 2003

Karthe posted:

I don't have access to Windows 8 at the moment but here's what I get back on Windows 10:


I imagine it'll be different on Windows 8. If this is the case, is there anything I can incorporate into my script to help control date output in my CSV exports?

use get-date -format and dump it from both machines so you end up with the same format in the CSV file?

example:
code:
PS C:\> $now=Get-Date -format "dd-MMM-yyyy HH:mm"
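If you want it bulletproof no matter what locale each box is in, you can also pin the culture when the date gets turned into a string (sketch; 'Started' is a made-up column name):
code:
# Invariant culture gives identical output on both machines, e.g. 10/01/2014 07:00:00 AM
$dateCol = @{
    Name       = 'Started'
    Expression = { ([datetime]$_['Started']).ToString('MM/dd/yyyy hh:mm:ss tt', [System.Globalization.CultureInfo]::InvariantCulture) }
}
# ...then include $dateCol in the property list handed to Select-Object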

BaseballPCHiker
Jan 16, 2006

Wow, Exchange has gotten so much easier to use with PowerShell. I don't have to do a lot of Exchange administration, but the few times I have, PowerShell has made things much easier to work with. Seems like Microsoft really meant for Exchange 2013 to be run almost exclusively through it. Here is a little tiny script I wrote that checks messages sent from a specific user during a certain time frame. It's been useful for when someone claims an automated message didn't get sent:

code:
Get-MessageTrackingLog -ResultSize Unlimited -Sender yoursender@whatever.com -Start "09/01/2015 12:00:00 AM" -End "09/18/2015 12:00:00 PM" |
    Select-Object Sender, Recipients, Timestamp, MessageSubject | Format-Table
Anyone who works more with Exchange have any suggestions for this script? Would running this through an invoke-command speed the output up at all?
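
On my invoke-command question, I suspect the win is less about speed and more about where the filtering happens, but for reference this is the shape I'd test (the server name is a placeholder, and it assumes the Exchange cmdlets are loaded in the remote session):
code:
Invoke-Command -ComputerName EXCH01 -ScriptBlock {
    Get-MessageTrackingLog -ResultSize Unlimited -Sender 'yoursender@whatever.com' `
        -Start '09/01/2015 12:00:00 AM' -End '09/18/2015 12:00:00 PM' |
        Select-Object Sender, Recipients, Timestamp, MessageSubject
} | Format-Table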
