brosmike
Jun 26, 2009
Yes, you could do that with something like:

code:
Get-ChildItem 'C:\your\directory' -Directory | ForEach-Object { $_.LastWriteTime = $_.CreationTime }

monkey
Jan 20, 2004

by zen death robot
Yams Fan
Awesome, that worked, thanks! (I had to remove the -Directory filter, but that's fine)

So, it seems getting it to work as a script I can double click on is a bit of a challenge...

edit: weird, if I run powershell as administrator it does not see mapped network drives, but running it normally works.

edit2: got it working with right click, thanks again!

monkey fucked around with this message at 22:10 on Apr 5, 2015

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
Mapped drives are per user, and when you run as admin you're running as another user, and that other user doesn't have the drives that your user does.
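
If the elevated session needs the same drive, a minimal sketch (server, share, and drive letter are placeholders) that maps it again just for that session:
code:
# Run inside the elevated PowerShell session; \\server\share and Z: are hypothetical
New-PSDrive -Name Z -PSProvider FileSystem -Root '\\server\share' -Persist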

Newf
Feb 14, 2006
I appreciate hacky sack on a much deeper level than you.
How do you use powershell regex on multi-line files?

Given a file foo.txt:
code:
this file
has
three lines
The command (get-content -raw foo.txt) -replace ".*" , "hello" | write-output produces

code:
hellohello
hellohello
hellohello
instead of the expected
code:
hello
What am I doing wrong?

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

Newf posted:

How do you use powershell regex on multi-line files?

Given a file foo.txt:
code:
this file
has
three lines
The command (get-content -raw foo.txt) -replace ".*" , "hello" | write-output produces

code:
hellohello
hellohello
hellohello
instead of the expected
code:
hello
What am I doing wrong?

Well, for starters, ".*" is like a loaded gun. It also matches an empty string, so you get weird situations like this: on each line it matches the line's text once and then the empty string left at the end of the line, which is why you're getting a double "hello" per line. You probably want to match on ".+" for most situations like that, since it can't match nothing. Also, even with -raw the string still contains the line breaks, and "." won't cross them. If you really want it to be one big long stream, you'll want to -Replace "`n|`r","" to clear the linebreaks first.

To get your desired output, you'd want to use something like:
code:
((get-content -raw foo.txt) -Replace "`n|`r","") -replace ".+" , "hello" | write-output
If you gave us a better idea of what your real inputs/outputs look like, we could help you refine it further.

Newf
Feb 14, 2006
I appreciate hacky sack on a much deeper level than you.
This is helpful, thanks. The real situation is that I need to swap all of the text (.*, I thought) between a couple of 'tags' in a file. Eg,
code:
<swap>
This text should be replaced
This text should also go
</swap>
Then the goal is to run something like (get-content -raw file.txt) -replace "(<swap>).*(</swap>)" , '$1' + $newContent + '$2', except with a fixed expression to pull out all of the stuff between the tags. Currently having success with [\s\S]*.

Is there any documentation on the actual behavior of .? I think that in js, for example, it works the way that I expected with my first attempt.
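
For reference, in .NET regex (which -replace uses) "." matches any character except newline unless Singleline mode is on, and the inline (?s) flag turns that mode on; that's the documented alternative to [\s\S] (JavaScript behaves the same way by default, for what it's worth). A minimal sketch against the example above (file name and $newContent as in the post):
code:
# (?s) = Singleline mode, so . also matches newlines
(Get-Content -Raw file.txt) -replace '(?s)(<swap>).*(</swap>)', ('$1' + $newContent + '$2')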

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

Newf posted:

This is helpful, thanks. The real situation is that I need to swap all of the text (.*, I thought) between a couple of 'tags' in a file. Eg,
code:
<swap>
This text should be replaced
This text should also go
</swap>
Then the goal is to run something like (get-content -raw file.txt) -replace "(<swap>).*(</swap>)" , '$1' + $newContent + '$2', except with a fixed expression to pull out all of the stuff between the tags. Currently having success with [\s\S]*.

Is there any documentation on the actual behavior of .? I think that in js, for example, it works the way that I expected with my first attempt.
If you're parsing valid XML, you may want to actually cast your string as [XML] and then use the XML object to replace the string.

If you must use regex, you can use balanced regex to match tags like this (.NET is one of the few regex engines to support this).
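
For illustration, a minimal sketch of the [xml] approach (hypothetical file path and node name; assumes the file is well-formed XML and $newContent holds the replacement text):
code:
[xml]$doc = Get-Content -Raw 'C:\path\to\file.xml'
$doc.SelectSingleNode('//swap').InnerText = $newContent
$doc.Save('C:\path\to\file.xml')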

Newf
Feb 14, 2006
I appreciate hacky sack on a much deeper level than you.

Briantist posted:

If you're parsing valid XML, you may want to actually cast your string as [XML] and then use the XML object to replace the string.

If you must use regex, you can use balanced regex to match tags like this (.NET is one of the few regex engines to support this).

No XML here - this is a total hack and it's actually a .cpp file with the comments //<swap>(don't delete me) and //</swap>(don't delete me) as endpoints. Thanks for the link - that's neato. My working (well enough) solution is -replace "(//<swap>.*`n)[\s\S]*(`n//</swap>)", ('$1' + $newContent + '$2').

Venusy
Feb 21, 2007
Is there a way of using Get-WMIObject to determine which files are currently in use by a specific process? Trying to troubleshoot what I suspect could be a file locking issue with one of our internal apps.

zzMisc
Jun 26, 2002

Venusy posted:

Is there a way of using Get-WMIObject to determine which files are currently in use by a specific process? Trying to troubleshoot what I suspect could be a file locking issue with one of our internal apps.

I think you can actually do this with Get-Process. Here's a starting point, replace 'conhost' with the name of the process you're looking for or the whole expression with whatever you need:

code:
get-process | ? { $_.Name -eq 'conhost' } | foreach { write-output ($_.Modules).FileName }

Venusy
Feb 21, 2007
That shows the DLLs in use, but I'm looking for the same kind of information you could see in the Disk section of Resource Monitor, just for remote machines. I can't use Get-Process because either it's blocked by our firewall for some reason (though Get-Service and Get-WMIObject aren't) or it requires WinRM, which isn't enabled on these machines.

zzMisc
Jun 26, 2002

Venusy posted:

That shows the DLLs in use, but I'm looking for the same kind of information you could see in the Disk section of Resource Monitor, just for remote machines. I can't use Get-Process because either it's blocked by our firewall for some reason (though Get-Service and Get-WMIObject aren't) or it requires WinRM, which isn't enabled on these machines.

So you're trying to get the file handles of running processes on a remote machine without using WinRM / PS remoting. You may already know that you can probably get the win32_process WMI object from it using -ComputerName, but that object doesn't have the info you want. From what I can tell, you simply can't get the info you want through WMI. I did find something using .NET which may help as a starting point, but I can't get it to work for me:

$processlist = [System.Diagnostics.Process]::GetProcesses("ComputerName")

..unfortunately every time I try it I get a 'couldn't connect to remote machine' error, which makes me suspect it's using the same mechanism as Get-Process, or Get-Process is just a wrapper for that object anyway.

Honestly, I think the right answer here if you want to use Powershell for this is to turn on WinRM & PS Remoting on the remote system if at all possible, then you can just connect directly and run it locally. Or there's always PSExec.

Edit: Actually I just got the above .NET object, and get-process, to work by starting the RemoteRegistry service on the remote machine. I did that using PS remoting though..


Yeah, get-process and the above .NET object are literally exactly the same. Start RemoteRegistry on the remote machine, then use Get-Process -Computername X.

code:
get-service -ComputerName RemotePC | ? { $_.Name -eq 'RemoteRegistry' } | start-service
get-process -ComputerName RemotePC | ? { $_.Name -eq 'conhost' } | foreach { write-output ($_.Modules).FileName }
Edit..: Argh, for some reason that doesn't seem to contain anything in Modules like it would for the local machine. Not seeing a way around that.

code:
PS C:\> $a = get-process -computername demo -Module
get-process : Exception getting "Modules" or "FileVersion": "This feature is not supported for remote computers.".
At line:1 char:6

zzMisc fucked around with this message at 14:48 on Apr 9, 2015
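
If WinRM can be turned on after all, a minimal remoting sketch that runs Get-Process on the remote side, where Modules is populated (computer and process names are placeholders):
code:
Invoke-Command -ComputerName RemotePC -ScriptBlock {
    Get-Process -Name conhost | ForEach-Object { $_.Modules.FileName }
}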

snackcakes
May 7, 2005

A joint venture of Matsumura Fishworks and Tamaribuchi Heavy Manufacturing Concern

That's pretty cool. Is it possible to do the reverse? Enter a file and see what processes are using it?

Sefal
Nov 8, 2011
Fun Shoe
I'm trying to make a script to check if a login name is available.
This script is intended for HR, so it needs to be as easy as possible.

I've made a script that exports the samaccountname to a csv.
I'm a bit stumped on how to make a GUI or something for HR where they only enter the desired logon name and can see if it's taken or not for a new employee.

this is all I have

code:
Add-PSSnapin quest.activeroles.admanagement
Get-QADUser -sizelimit 0 -searchroot (domainname)/Accounts | select SamAccountName | Export-Csv "C:\Users\(username)\Desktop\test\listofsamaccounts.csv"
I'm still learning powershell. Is what I wanna do even possible?
Worst case scenario I let them run the script, open the csv and Ctrl+F the name.

Spazz
Nov 17, 2005

You could do a simple if/else statement looking for the user in the list.

code:
$UserName = "SomeName"
# -ExpandProperty makes $UserList a plain list of strings, so -contains can match against it
$UserList = (Get-QADUser -sizelimit 0 -searchroot (domainname)/Accounts | Select-Object -ExpandProperty SamAccountName)
If($UserList -contains $UserName){Write-Host "User exists!"}
Else{Write-Host "Nope"}
Might need some tweaks, but you can feed the username in using arguments with $args or use a pop-up input box.
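
A minimal sketch of the pop-up input box idea (assumes Windows PowerShell, where the Microsoft.VisualBasic assembly is available):
code:
Add-Type -AssemblyName Microsoft.VisualBasic
$UserName = [Microsoft.VisualBasic.Interaction]::InputBox('Enter the desired logon name:', 'Username check')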

Sefal
Nov 8, 2011
Fun Shoe

Spazz posted:

You could do a simple if/else statement looking for the user in the list.

code:
$UserName = "SomeName"
# -ExpandProperty makes $UserList a plain list of strings, so -contains can match against it
$UserList = (Get-QADUser -sizelimit 0 -searchroot (domainname)/Accounts | Select-Object -ExpandProperty SamAccountName)
If($UserList -contains $UserName){Write-Host "User exists!"}
Else{Write-Host "Nope"}
Might need some tweaks, but you can feed the username in using arguments with $args or use a pop-up input box.

Thank you! I didn't even know you can create a pop-up input box. Thank you for steering me in the right direction

Zaepho
Oct 31, 2013

Sefal posted:

I'm trying to make a script to check if a login name is available.
This script is intended for HR, so it needs to be as easy as possible.

I've made a script that exports the samaccountname to a csv.
I'm a bit stumped on how to make a GUI or something for HR where they only enter the desired logon name and can see if it's taken or not for a new employee.

this is all I have

code:
Add-PSSnapin quest.activeroles.admanagement
Get-QADUser -sizelimit 0 -searchroot (domainname)/Accounts | select SamAccountName | Export-Csv "C:\Users\(username)\Desktop\test\listofsamaccounts.csv"
I'm still learning powershell. Is what I wanna do even possible?
Worst case scenario I let them run the script, open the csv and Ctrl+F the name.

Take away the entire thought process on their part. Make them feed the script First, Middle and Last names. From there the script should construct a username and test it; if it's in use, try the next combination in your username format, and so on until the script gives HR the username back. The idea is to make their jobs easier. If they had rights to create user accounts in a specific OU, it might be worth actually creating the empty shell of the account for them.

Extra bonus points if the script can go into the HR system and get all of the data to completely fill out the user account data.
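
A rough sketch of that idea, assuming the Quest Get-QADUser cmdlet from the earlier post and a hypothetical first-initial-plus-surname format (adjust to the real naming convention):
code:
function Get-AvailableUserName {
    param([string]$First, [string]$Last)
    $base = ($First.Substring(0,1) + $Last).ToLower()
    $candidate = $base
    $i = 0
    # Keep appending a number until Get-QADUser finds no existing account with that name
    while (Get-QADUser $candidate) {
        $i++
        $candidate = "$base$i"
    }
    return $candidate
}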

Sefal
Nov 8, 2011
Fun Shoe

Zaepho posted:

Take away the entire thought process on their part. Make them feed the script First, Middle and Last names. From there the script should construct a username and test it; if it's in use, try the next combination in your username format, and so on until the script gives HR the username back. The idea is to make their jobs easier. If they had rights to create user accounts in a specific OU, it might be worth actually creating the empty shell of the account for them.

Extra bonus points if the script can go into the HR system and get all of the data to completely fill out the user account data.

That's actually a really good idea. Going to try to make this.

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug
This sounds more like a process issue. Why does HR care about user names? In my experience it goes like this:

HR says "hey we have a new person starting named Joe Smith, we need normal computer stuff for him"
IT grunts set up computers and AD accounts and stuff, go back to HR and say "okay it's all set up, his username is jsmith12334164" or whatever
HR gives the information to Joe on his first day

Tony Montana
Aug 6, 2005

by FactsAreUseless
ello Powershell thread :)

I thought there must be one somewhere. So for my new employer I've taken the plunge and (tried) to leave my beloved vbs behind. I can't really.. already I've found stuff I can't quite do as well in PS. For instance I write large scripts that query Active Directory and output to Excel, but in PS, because it's .NET and not COM, doing this is unreliable. Apparently there is no real way to access the exposed API of Excel; I've read of people going to HTML and then CSV, but opening the CSV using Excel, which preserves the formatting..

Yuck. Anyways, I'm just accepting that being good with both scripting languages and using the right ones for the right tasks is best.

I've got a pretty simple question for today, though.

So I've got a dirty great log from a DNS server and I've written a nice PS one-liner that uses Select-String to find IPs in it. The idea is to go through the file, parse it and give an output of unique IPs, just one unique IP on each line. I've got Select-String working fine and defined IPs via a regular expression, using Select Matches I get a return of just the IP (like using grep or something). But now I want to use the Sort-Object function followed by the Get-Unique function.

In the script I've tried the initial line and then just piping to each function, finishing with piping to an output file.. but it doesn't work. I've tried putting the results of the initial parse into a variable and then piping that variable to each function on a new line in order (like you'd do in vbs), but that didn't work either. How do I do progressive operations on a variable with functions?
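
For reference, one way to express the pipeline described above (log path and a simple IPv4 pattern are placeholders, not the poster's exact one-liner):
code:
Select-String -Path 'C:\logs\dns.log' -Pattern '\b(?:\d{1,3}\.){3}\d{1,3}\b' -AllMatches |
    ForEach-Object { $_.Matches.Value } |
    Sort-Object -Unique |
    Out-File 'C:\logs\unique-ips.txt'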

Sefal
Nov 8, 2011
Fun Shoe

Ithaqua posted:

This sounds more like a process issue. Why does HR care about user names? In my experience it goes like this:

HR says "hey we have a new person starting named Joe Smith, we need normal computer stuff for him"
IT grunts set up computers and AD accounts and stuff, go back to HR and say "okay it's all set up, his username is jsmith12334164" or whatever
HR gives the information to Joe on his first day

That's how we do it now. But we are now in the process of changing things around and, I guess, making it more streamlined? One of the wishes is that you get all the info you need when HR gives you the new employee data.

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

Tony Montana posted:

ello Powershell thread :)

I thought there must be one somewhere. So for my new employer I've taken the plunge and (tried) to leave my beloved vbs behind. I can't really.. already I've found stuff I can't quite do as well in PS. For instance I write large scripts that query Active Directory and output to Excel, but in PS, because it's .NET and not COM, doing this is unreliable. Apparently there is no real way to access the exposed API of Excel; I've read of people going to HTML and then CSV, but opening the CSV using Excel, which preserves the formatting..

Yuck. Anyways, I'm just accepting that being good with both scripting languages and using the right ones for the right tasks is best.
Accessing and managing AD in PowerShell is a dream compared to VBS. What you want to do is use the ActiveDirectory module, then look at the Active Directory Cmdlets.

Here's an example of exporting all the users in a certain OU with the last name "Lowtax" to a CSV:
code:
Import-Module ActiveDirectory

Get-ADUser -Filter { Surname -eq 'Lowtax' } -SearchBase 'OU=Whatever,DC=domain,DC=tld' | Export-Csv -Path C:\path\to\file.csv -NoTypeInformation
As far as using COM objects, you absolutely can:

code:
$excel = New-Object -ComObject Excel.Application

$excel.Workbooks.Add()

#etc

quote:

I've got a pretty simple question for today, though.

So I've got a dirty great log from a DNS server and I've written a nice PS one-liner that uses Select-String to find IPs in it. The idea is to go through the file, parse it and give an output of unique IPs, just one unique IP on each line. I've got Select-String working fine and defined IPs via a regular expression, using Select Matches I get a return of just the IP (like using grep or something). But now I want to use the Sort-Object function followed by the Get-Unique function.

In the script I've tried the initial line and then just piping to each function, finishing with piping to an output file.. but it doesn't work. I've tried putting the results of the initial parse into a variable and then piping that variable to each function on a new line in order (like you'd do in vbs), but that didn't work either. How do I do progressive operations on a variable with functions?
Can you post this code? Maybe some sample input from the log to go along with it? I'm having trouble visualizing it.

Wicaeed
Feb 8, 2005
Kind of a long shot, I recently found this script here that can be used to clean up folders on a server.

I'm going to use it, but I was wondering if there would be an easy way that I might be overlooking to include the name of the server on which it was run, either as the email subject or in the log file anywhere.

I'm kind of new with Powershell, but I figure I'd ask before I start trying to edit the script to do something like that.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
Find the lines where it sets the subject of the email and add another variable to that. You can pretty easily get the hostname of a computer using built in powershell variables.
Since you're new and sound like you want to learn I'm being kind of vague in the hopes that it will point you in the right direction of figuring it out yourself. Teach a man to fish blah blah blah. Also I'm too lazy to download the script myself and look at it ;)
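
(For the record, the built-in in question is just the environment variable; the subject text here is only an example:)
code:
$Subject = "deleteold.ps1 report from $env:COMPUTERNAME"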

Video Nasty
Jun 17, 2003

I scanned over the file, and the guy commented the dick out of it. If those comments were maintained even remotely like the file is on that site, you should be fine to read through it and understand what's happening.
It is daunting that there's a lot to read, but be cautious with powershell doing remove-item, because it will not go to the recycle bin. I am not certain you can recover those files once they are removed.
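
A small sketch of the usual safety net: -WhatIf previews what Remove-Item would delete without touching anything (path and age threshold are placeholders):
code:
Get-ChildItem 'C:\cleanup\target' -Recurse -File |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
    Remove-Item -WhatIf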

12 rats tied together
Sep 7, 2006

Briantist posted:

Accessing and managing AD in PowerShell is a dream

I agree, in fact, I would say you don't even need to worry about outputting to excel spreadsheets because you can just do whatever it is you need to do. Instead of "Get, Format, Output, Read, Do" you can just do "Get, Do".

quote:

So I've got a dirty great log from a DNS server and I've written a nice PS one-liner that uses Select-String to find IPs in it. The idea is to go through the file, parse it and give an output of unique IPs, just one unique IP on each line. I've got Select-String working fine and defined IPs via a regular expression, using Select Matches I get a return of just the IP (like using grep or something). But now I want to use the Sort-Object function followed by the Get-Unique function.

In the script I've tried the initial line and then just piping to each function, finishing with piping to an output file.. but it doesn't work. I've tried putting the results of the initial parse into a variable and then piping that variable to each function on a new line in order (like you'd do in vbs), but that didn't work either. How do I do progressive operations on a variable with functions?

I've never used get-unique but there should be no reason why you can't use it. Both of these work for me:

code:
$matchedIP = (Get-Content .\file)
$uniques = $null

[regex]$regex = 'a regular expression'

$newline = ($regex.matches($matchedIP).Value)

$newline | % {

if ($uniques -notlike "*$_*") {
    Write-Output "adding $_"
    $uniques += "$_ `r`n"
    }

else {
    Write-Output "duplicate at $_"
    }
}

$uniques
and (the easier way)
code:
[regex]$regex = 'a regular expression'

$output = ($regex.matches( (Get-Content .\file) ).Value) | Sort-Object -Unique

Wicaeed
Feb 8, 2005
:woop:

Forgot that there was an optional -EmailSubject field that isn't required, but is automatically set to a string with a few variables. Just set $ComputerName = gc env:computername and called the variable in the default subject line.

Not really too pretty, but it works.

code:
if ($EmailTo -or $EmailFrom -or $EmailSmtpServer) {
    if (($EmailTo,$EmailFrom,$EmailSmtpServer) -contains '') {
        Write-Warning 'EmailTo EmailFrom and EmailSmtpServer parameters only work if all three parameters are used, no email sent...'
    } else {
        $EmailSplat = @{
            To = $EmailTo
            From = $EmailFrom
            SmtpServer = $EmailSmtpServer
            Attachments = $LogFile
        }
        if ($EmailSubject) {
            $EmailSplat.Subject = $EmailSubject
        } else {
            $ComputerName = gc env:computername
            $EmailSplat.Subject = "deleteold.ps1 started at: $StartTime FolderPath: $FolderPath on $ComputerName"
        }
    }
}

Mr_Angry
May 15, 2003
A severe disappointment
College Slice
I've been converting my digital library over to MP4 using HandBrake and as I had thousands of files the UI would have been tedious to use. So, I wrote a quick PowerShell script that given a directory and file type will use the command line version of HandBrake to just auto convert everything to MP4. Just last weekend I completed everything and I'm streaming MP4 files via Plex to my Roku 3 just fine so I thought perhaps people here could use it. Usage is simple:

.\CLIBatch.ps1 -cliPath <path to where HandBrake is installed> -sourceDirectory <directory to convert> -filter <file types, i.e., *.avi>
Sample: .\CLIBatch.ps1 -cliPath 'C:\Program Files\Handbrake' -sourceDirectory D:\Scratch -filter *.avi

And here's the code (sorry about the long string but HandBrake has a lot of command line switches):
code:
Param(
    [string]$cliPath = $(throw "-cliPath is required"),
    [string]$sourceDirectory = $(throw "-sourceDirectory is required"),
    [string]$filter = $(throw "-filter is required")
)

# make sure the cliPath ends with a '\' character
if ($cliPath.EndsWith("\") -eq $false)
{
    $cliPath += "\"
}

# define the name of the Handbrake CLI executable
$hbCLIExe = $cliPath + "HandbrakeCLI.exe"

# set up the default options for HandBrake CLI; use a here-string (@'...'@) to easily handle all
# the double quotes and command line options
$hbCLIOptions = @'
    -i "<input_file>" -t 1 --angle 1 -c 1 -o "<output_file>" -f mp4 --crop 0.0:0.0 --loose-anamorphic --modulus 2 -e x264 -q 20 --vfr -a 1 -E av_aac -6 dpl2 -R Auto -B 160 -D 0 --gain 0 --audio-fallback ac3 --encoder-preset=veryfast --encoder-level="4.0" --encoder-profile=main --verbose=1
'@

# array to hold all files to be converted
$filesToConvert = @()

# iterate through all file types identified by $filter, building an entire list of files with full
# path information saved so that we can then convert each file one by one
Get-ChildItem -Path $sourceDirectory -Filter $filter -Recurse | ForEach-Object -Process {
    # save the current item in the pipeline
    $filesToConvert += $_.FullName
}

# keep track of how many files we have to convert and how many are left
$totalFiles = $filesToConvert.Length
$currentFile = 0

# output results so far
Write-Host "Have found $totalFiles file(s) of type `"$filter`""

# iterate through all files and start the conversion process
foreach ($file in $filesToConvert)
{
    $cmdLine = $hbCLIOptions
    $currentFile++

    # convert the output file name to .mp4
    $iIndex = $file.LastIndexOf('.')

    # if we did not find a '.' then we have a malformed string, raise an error
    # and exit out
    if ($iIndex -eq -1)
    {
        $errMessage = "Invalid file name `"" + $file + "`", exiting out"
        throw $errMessage
    }

    # create the output file name
    $outputFile = $file.Substring(0, $iIndex) + ".mp4"

    # replace the input and output files and trim off leading white spaces
    $cmdLine = $cmdLine.Replace("<input_file>", $file)
    $cmdLine = $cmdLine.Replace("<output_file>", $outputFile)
    $cmdLine = $cmdLine.TrimStart(' ')

    # call Handbrake to convert the file, be sure to trim off leading white spaces
    # and to wait for the process to complete before starting the next conversion
    # so as to not over-tax the CPU
    Write-Host "[$currentFile of $totalFiles] converting `"$file`" to `"$outputFile`""
    Start-Process $hbCLIExe $cmdLine -Wait -WindowStyle Minimized
}

Write-Host "All file conversions completed!"

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug
Use join-path to combine path parts, it will make your life easier.

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

Ithaqua posted:

Use join-path to combine path parts, it will make your life easier.
Yes, this. I love Join-Path. It takes pipeline input too, which is how I like to use it:

code:
$env:WinDir | Join-Path -ChildPath 'System32' | Join-Path -ChildPath 'Drivers' | Join-Path -ChildPath 'etc' | Join-Path -ChildPath 'hosts'
This can be verbose when you have a lot of components because you can't give it an array (which I think is stupid). You can also use the underlying .Net way:

code:
[System.IO.Path]::Combine($env:WinDir, 'System32', 'Drivers', 'etc', 'hosts')

Spazz
Nov 17, 2005

Is it ever actually worth it to force invoke the GC?

code:
[System.GC]::Collect()

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

Spazz posted:

Is it ever actually worth it to force invoke the GC?

code:
[System.GC]::Collect()

If you have to ask then the answer is no

Nahrix
Mar 17, 2004

Can't afford to eat out
I have a question about output formatting. Hours of googling hasn't helped.

I'm trying to output an array of server names within an array of server sets. I've tried building custom objects with properties, so that I can use format-table, but a serverset object contains an array of strings as a property, which doesn't output properly.

I can change the data structure if it helps solve the problem.

The kind of formatting I'm looking for is like this:

code:
ServerSet1             ServerSet2              ServerSet3
----------             ----------              ----------
Server1                Server4                 Server6
Server2                Server5                 Server7
Server3                                        Server8
                                               Server9


ServerSet4             ServerSet5
----------             ----------
Server10               Server15
Server11               Server16
Server12               Server17
Server13
Server14
Where both axes of the effective 2D-array data structure (serversets and servers) are variable, and the headers would fill the horizontal space of the window, and then wrap around vertically once they've reached the edge. There might be a very simple way of accomplishing this using Format-Table that's going over my head, but I've worked on it for a whole day with no success.

Thanks for any advice.

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

Nahrix posted:

I have a question about output formatting. Hours of googling hasn't helped.

I'm trying to output an array of server names within an array of server sets. I've tried building custom objects with properties, so that I can use format-table, but a serverset object contains an array of strings as a property, which doesn't output properly.

I can change the data structure if it helps solve the problem.

The kind of formatting I'm looking for is like this:

code:
ServerSet1             ServerSet2              ServerSet3
----------             ----------              ----------
Server1                Server4                 Server6
Server2                Server5                 Server7
Server3                                        Server8
                                               Server9


ServerSet4             ServerSet5
----------             ----------
Server10               Server15
Server11               Server16
Server12               Server17
Server13
Server14
Where both axes of the effective 2D-array data structure (serversets and servers) are variable, and the headers would fill the horizontal space of the window, and then wrap around vertically once they've reached the edge. There might be a very simple way of accomplishing this using Format-Table that's going over my head, but I've worked on it for a whole day with no success.

Thanks for any advice.
I think the best way for us to help you is for you to give us the array of serverset objects. You can do this by serializing it into clixml:

code:
$myServersets | Export-Clixml -Path 'C:\whatever\serversets.xml'
Then post that so that we can use the exact objects you're working with. Or post the code you're using to create them in the first place.

However going off of what you've given, I'm assuming you created a single object that contains a separate property (NoteProperty) for each server set, so what you might do is simultaneously create a ScriptProperty for each NoteProperty that joins the array with newlines, and then you can call Format-Table with those props only.

code:
# Previous code to build your object here

$myObject | Add-Member -MemberType ScriptProperty -Name ServerSet1_Display -Value { $this.ServerSet1 -join "`r`n" }

# More script properties here

$myObject | Format-Table ServerSet1_Display,ServerSet2_Display,ServerSet3_Display -Wrap
Using -Wrap should do the wrapping you want already.

You can also look into the -View parameter of Format-Table and try working with a formatting file. I've never tried this before, so if you go that way please post your experience.
Anyway, seeing some code of yours or having the object to work with would really go a long way in getting the output you want (not even touching on why you want that output yet).

Briantist fucked around with this message at 21:41 on Apr 24, 2015

Nahrix
Mar 17, 2004

Can't afford to eat out
You're correct; I have an object with NoteProperties to store the data.

I deliberately didn't provide any code because I'm aware that the way that I chose to store the data might not be the best way to accomplish the desired output. I would be willing to change the way data was stored, allowing anyone willing to help the freedom to choose the easiest implementation.

The reason I want that display format is because I'm working with a lot of servers / server sets, and I'm writing a script to manage deploying files to all of them, and I'd like it to open up with a view of all the servers on a single screen. That display format, in my opinion, maximizes the console's real-estate, so that I can view all servers simultaneously without having to scroll.

Let me gather the xml, and I'll post it in a follow-up reply shortly.

Edit: Here's 1 "serverset" object within the array of serversets

Edit2: And here's the function that creates the objects, in case it's easier to follow than the xml

Edit3: The ScriptProperty method you described works perfectly, thank you!

Nahrix fucked around with this message at 23:00 on Apr 24, 2015

Swink
Apr 18, 2006
Left Side <--- Many Whelps
Briantist: I just realised you're briantist.com. A blog I've referenced several times.

Nice shout out from Ned Pyle the other day.

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

Swink posted:

Briantist: I just realised you're briantist.com. A blog I've referenced several times.

Nice shout out from Ned Pyle the other day.
Thanks! Glad to hear I've been helpful. That article Ned Pyle contacted me about has been around for a little over a year and never really got any traffic or comments, so it was definitely a surprise.

Just curious, which articles have you referenced before?

zzMisc
Jun 26, 2002

snackcakes posted:

That's pretty cool. Is it possible to do the reverse? Enter a file and see what processes are using it?

Hey I know this was from weeks ago now, but I've been meaning to post this and now it's cleaned up, so here:

code:
function Get-LockingProcess() {
<#
    .SYNOPSIS
    Find the process that is locking a particular file.
    .PARAMETER FileName
    The name(s) of the file(s) to check.
    .PARAMETER LockedFile
    The System.IO.FileInfo object(s) to check.
    .OUTPUTS
    Returns a set of System.Diagnostics.Process objects with an additional "User" NoteProperty indicating which user has it locked.
    .EXAMPLE
    Get-LockingProcess *.dll | select Path,ProcessName,User
    .EXAMPLE
    Dir -Recurse -Path "$env:windir\system32" -Include powershell.exe | Get-LockingProcess | Stop-Process
    (Say goodnight!)
    .EXAMPLE
    Get-LockingProcess "$env:windir\system32\*.dll"
    (Grab a soda first)
#>
    [cmdletbinding()]
    param(
        [Parameter(ParameterSetName="ByName",Mandatory=$true,ValueFromPipeline=$true,Position=0)][String[]]$FileName,
        [Parameter(ParameterSetName="ByFile",Mandatory=$true,ValueFromPipeline=$true,Position=0)][System.IO.FileInfo[]]$LockedFile
    )
    begin{
        [System.Diagnostics.Process[]]$processes = Get-Process # Keep us from having to do this multiple times in cases of multiple files
    }

    process{
        foreach( $ThisName in $FileName ) {
            $TheseFiles = dir $ThisName -Force
            foreach ($ThisFile in $TheseFiles) {
                write-debug $ThisFile.FullName
                foreach ($p in $processes) {
                    if( ($p.Modules).FileName -Contains $ThisFile.FullName) {
                        $q = get-wmiobject -Class win32_process -Filter "ProcessId = $($p.Id)"  # exact PID match; LIKE '<pid>%' could also catch longer PIDs
                        $p | Add-Member -MemberType NoteProperty -Name 'User' -Value "$($q.GetOwner().Domain)\$($q.GetOwner().User)" -Force
                        Write-Output $p
                    }
                }
            }
        }
        foreach( $ThisFile in $LockedFile ) {
            foreach ($p in $processes) {
                if( ($p.Modules).FileName -Contains $ThisFile.FullName) {
                    $q = get-wmiobject -Class win32_process -Filter "ProcessId = $($p.Id)"  # exact PID match; LIKE '<pid>%' could also catch longer PIDs
                    $p | Add-Member -MemberType NoteProperty -Name 'User' -Value "$($q.GetOwner().Domain)\$($q.GetOwner().User)" -Force
                    Write-Output $p
                }
            }
        }
    }

    end{}
}

zzMisc fucked around with this message at 19:06 on Apr 28, 2015

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

Ali Aces posted:

Hey I know this was from weeks ago now, but I've been meaning to post this and now it's cleaned up, so here:

code:
function Get-LockingProcess() {
<#
    .SYNOPSIS
    Find the process that is locking a particular file.
    .PARAMETER FileName
    The name(s) of the file(s) to check.
    .PARAMETER LockedFile
    The System.IO.FileInfo object(s) to check.
    .OUTPUTS
    Returns a set of System.Diagnostics.Process objects with an additional "User" NoteProperty indicating which user has it locked.
    .EXAMPLE
    Get-LockingProcess *.dll | select Path,ProcessName,User
    .EXAMPLE
    Dir -Recurse -Path "$env:windir\system32" -Include powershell.exe | Get-LockingProcess | Stop-Process
    (Say goodnight!)
    .EXAMPLE
    Get-LockingProcess "$env:windir\system32\*.dll"
    (Grab a soda first)
#>
    [cmdletbinding()]
    param(
        [Parameter(ParameterSetName="ByName",Mandatory=$true,ValueFromPipeline=$true,Position=0)][String[]]$FileName,
        [Parameter(ParameterSetName="ByFile",Mandatory=$true,ValueFromPipeline=$true,Position=0)][System.IO.FileInfo[]]$LockedFile
    )
    begin{
        [System.Diagnostics.Process[]]$processes = Get-Process # Keep us from having to do this multiple times in cases of multiple files
    }

    process{
        foreach( $ThisName in $FileName ) {
            $TheseFiles = dir $ThisName -Force
            foreach ($ThisFile in $TheseFiles) {
                write-debug $ThisFile.FullName
                foreach ($p in $processes) {
                    if( ($p.Modules).FileName -Contains $ThisFile.FullName) {
                        $q = get-wmiobject -Class win32_process -Filter "ProcessId = $($p.Id)"  # exact PID match; LIKE '<pid>%' could also catch longer PIDs
                        $p | Add-Member -MemberType NoteProperty -Name 'User' -Value "$($q.GetOwner().Domain)\$($q.GetOwner().User)" -Force
                        Write-Output $p
                    }
                }
            }
        }
        foreach( $ThisFile in $LockedFile ) {
            foreach ($p in $processes) {
                if( ($p.Modules).FileName -Contains $ThisFile.FullName) {
                    $q = get-wmiobject -Class win32_process -Filter "ProcessId = $($p.Id)"  # exact PID match; LIKE '<pid>%' could also catch longer PIDs
                    $p | Add-Member -MemberType NoteProperty -Name 'User' -Value "$($q.GetOwner().Domain)\$($q.GetOwner().User)" -Force
                    Write-Output $p
                }
            }
        }
    }

    end{}
}

This is really cool, totally stealing this.

Just a comment about this:
code:
begin{
        [System.Diagnostics.Process[]]$processes = Get-Process # Keep us from having to do this multiple times in cases of multiple files
    }
In theory I agree with the concept, but the way powershell's pipeline works means that you can't tell how long it will be between process{} calls, so the information could be stale as you proceed through each pipeline object. Consider this pipeline (as an example):
code:
Get-LockingProcess | ForEach-Object { Start-Sleep -Seconds 300 }
Obviously this is contrived, but as a function author you don't know how long the steps in the rest of the pipeline will take. You can add a parameter to override that functionality, like a [Switch] called $Realtime and then when you call it with that, you call Get-Process each iteration. Lets the caller decide whether the accuracy or performance is more important.

Anyway it probably isn't important for this particular function, but I've been bitten by this before (not realizing how much time could pass between iterations).
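
A stripped-down sketch of that pattern (illustrative function and parameter names, not the full Get-LockingProcess):
code:
function Get-ProcessSnapshot {
    [CmdletBinding()]
    param(
        [Parameter(ValueFromPipeline = $true)][string]$Name,
        [Switch]$Realtime
    )
    begin   { if (-not $Realtime) { $processes = Get-Process } }
    process {
        if ($Realtime) { $processes = Get-Process }  # refresh for every pipeline object
        $processes | Where-Object { $_.Name -eq $Name }
    }
}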

zzMisc
Jun 26, 2002

Briantist posted:

In theory I agree with the concept, but the way powershell's pipeline works means that you can't tell how long it will be between process{} calls, so the information could be stale as you proceed through each pipeline object. Consider this pipeline (as an example):
code:
Get-LockingProcess | ForEach-Object { Start-Sleep -Seconds 300 }
Obviously this is contrived, but as a function author you don't know how long the steps in the rest of the pipeline will take. You can add a parameter to override that functionality, like a [Switch] called $Realtime and then when you call it with that, you call Get-Process each iteration. Lets the caller decide whether the accuracy or performance is more important.

Anyway it probably isn't important for this particular function, but I've been bitten by this before (not realizing how much time could pass between iterations).

I'm not sure I follow (still kind of a neophyte at this), but if I'm getting you correctly:

the process{} block runs, and each iteration of it sends its output down the pipeline before the next iteration? It's already weird to me that the foreach loop doesn't seem to work like you'd expect (I'd expect it to work in the begin{} block just as well, but it doesn't with pipeline input), but now I think I'm beginning to understand; so each call to write-output will actually send that output down the pipeline to the next command, even before this function is finished? I didn't even realize that.

In any case, yeah for this particular function I wasn't really concerned about the process list going stale while it's running, but it's possible to end up with multiple of the same process, pipe the first instance into stop-process then get errors on the rest since it's pulling them from a cache. Not sure that's really a problem worth fixing to me, but it is good to understand.

So I guess what happens with pipeline input is:

1. Every function in the pipeline's begin{} block is run
2. The process{} block for the first command that's receiving pipeline input is run on the first object in the pipeline
3. Any output from that process{} is sent on down to the following commands' process{} blocks
4. Then (or in parallel?) the first command process{}es the next element in its pipeline
5. Once all elements are finished the end{} blocks are called

Have I got that right?
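
For anyone who wants to see that ordering, a tiny sketch (hypothetical function name) that makes the streaming visible:
code:
function Test-Stream {
    param([Parameter(ValueFromPipeline = $true)]$InputObject)
    begin   { Write-Host 'upstream begin' }
    process { Write-Host "upstream process $InputObject"; $InputObject }
    end     { Write-Host 'upstream end' }
}

1..3 | Test-Stream | ForEach-Object { Write-Host "downstream $_" }
# prints: upstream begin, then upstream process 1 / downstream 1, upstream process 2 / downstream 2, ..., upstream end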
