Toshimo
Aug 23, 2012

He's outta line...

But he's right!
I'm fighting badly with a script to scan a list of machines for installed KBs. I create a remote Windows Update session and everything goes pretty well until I start trying to filter the data. I was never able to get the Search method of IUpdateSearcher to return anything meaningful, so I used some methodology I found in a TechNet article to walk the collection instead. The line where everything goes sideways is the loop that iterates across the results. It gives me exactly what I'm looking for, but it spins up a ton of network activity and takes about 20 seconds to process. That feels like an awfully long time to iterate across a completed query (like, 300 lines of text), and 20 seconds per box gets fairly unwieldy when you multiply it by 100 or 1,000 machines. I'm not anything approaching a PS expert, but is there a way to just pull the entire collection over locally once so that I can run all my comparisons against that?

code:
Param(
	[Parameter(Mandatory=$true,Position=1)]
	[string]$kbs,
	
	[Parameter(Mandatory=$true,Position=2)]
	[string]$computers
)

function Get-Matches($Pattern) { 
  begin { $regex = New-Object Regex($pattern) }
  process { foreach ($match in ($regex.Matches($_))) { ([Object[]]$match.Groups)[-1].Value } }
}

function Get-KBs($pcname){
    if (Test-Connection -Count 1 -Quiet $pcname){
        $OS = Get-WmiObject -Computer $pcname -Class Win32_OperatingSystem
        $Report = @()
        # Create the update session on the remote machine via its ProgID (DCOM) and walk its history.
        $objSession = [activator]::CreateInstance([type]::GetTypeFromProgID("Microsoft.Update.Session",$pcname))
        $objSearcher = $objSession.CreateUpdateSearcher()
        $colSuccessHistory = $objSearcher.QueryHistory(0, $objSearcher.GetTotalHistoryCount())
        # ResultCode 2 = succeeded; collect the titles of successfully installed updates.
        Foreach($objEntry in $colSuccessHistory | where {$_.ResultCode -eq '2'}) {
            $Report += $objEntry.Title
        }
        $objSession = $null
        
        $kb_regex = "(" + [string]::Join("|",$kbItems) + ")"
        $kb_matches = $Report | Get-Matches $kb_regex
        if ($kb_matches){
            return ($pcname + "`t" + $OS.caption + "`t" + $kb_matches.Length + " KBs Found`t" + [string]::Join("`t",$kb_matches))
        } else {
            return ($pcname + "`t" + $OS.caption + "`tNo KBs Found")
        }
    } else {
        Write-Output $($pcname + "`tCannot connect to PC.")
    }
}

#################################################

Clear-Host

Write-Output $("Machine Name`tOS Version`tKBs Installed")
$kbItems = Get-Content $kbs
$pcs = Get-Content $computers
	
foreach($pc in $pcs){
	Write-Host "Querying $pc, please wait..."
    Get-KBs $pc
}

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

Briantist posted:

But there are 2 ways to mitigate this:

1. Parallelize the process by using PowerShell Jobs. Then the 20 second pauses run in parallel (not perfectly, it will probably take more than 20 seconds just to set up all the jobs).

Be a bit careful with this; jobs are not threads. You spawn at least one new process with each background job. If you have 1000 servers, then consider whether the machine you're running this on can handle 1001+ PowerShell sessions and the associated DCOM (?) traffic from the Update Searcher concurrently.

2. Don't use a remote searcher. Instead use a PowerShell remoting session so that your code is running on the remote machine, then just use the local searcher. This requires having PowerShell remoting setup and configured on your servers. If you don't have this already, then you should; it's incredibly useful. I have an article about configuring it through Group Policy. Then, you can use:
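Something along these lines (a rough sketch of that approach, assuming WinRM/PowerShell remoting is enabled and that $pcname holds the target machine name):
code:
# Run the update-history query on the remote machine itself, so the slow COM
# iteration stays local to that box and only the titles come back over the wire.
Invoke-Command -ComputerName $pcname -ScriptBlock {
    $session  = New-Object -ComObject 'Microsoft.Update.Session'
    $searcher = $session.CreateUpdateSearcher()
    $history  = $searcher.QueryHistory(0, $searcher.GetTotalHistoryCount())
    $history | Where-Object { $_.ResultCode -eq 2 } | Select-Object -ExpandProperty Title
}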

I'll take a look when I get back in on Monday, but I'm pretty sure PowerShell remoting is not enabled and I don't have the pull to get the GPO changed. I'm running my scripts from a Win7x86 box with 4 gig of RAM and about a billion layers of cruft. I've done the parallelization both through jobs and through just good old "for-each -parallel". Either one helps, but I don't get too many concurrent jobs. Thanks for the links, though. I'll give them a read.

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

AreWeDrunkYet posted:

I know this is a PowerShell thread, but isn't this better handled by SCCM?

You would think so. But after the last month of listening to our conference calls with Microsoft, I've begun to suspect that SCCM reporting is currently held together with baling wire and bubble gum. There's some goofy stuff going on, like machines reporting as compliant immediately on release of a SUS package because the machines are unreachable, so the SUS server falls back to the last known catalog, which of course doesn't have the current package in it. Also, I don't have access to enough DBA time to write anything big and custom to get just this sort of info (which would be an arseload of work).

In regards to the other suggestions, I have verified that we don't have Powershell remoting enabled, and every variant of parallelization I've tried has capped out rather low because as soon as it tries to do a significant number of simultaneous connections, it causes the networking on the box to break down (it was dropping my RDP session at some points).

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

Newf posted:

How do you use powershell regex on multi-line files?

Given a file foo.txt:
code:
this file
has
three lines
The command (get-content -raw foo.txt) -replace ".*" , "hello" | write-output produces

code:
hellohello
hellohello
hellohello
instead of the expected
code:
hello
What am I doing wrong?

Well, for starters, ".*" is like a loaded gun: it also matches the empty string, so you get weird situations like this. On each line it matches the whole line once and then the zero-length string at the end of the line, which is why you're getting a double "hello" per line. You probably want to match on ".+" for most situations like that, since it can't match nothing. Also, even with -raw the string still contains the line breaks, so if you really want it treated as one big stream, you'll want to -replace "`n|`r","" to clear the line breaks first.

To get your desired output, you'd want to use something like:
code:
((get-content -raw foo.txt) -Replace "`n|`r","") -replace ".+" , "hello" | write-output
If you gave us a better idea of what your real inputs/outputs look like, we could help you refine it further.

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
I'm trying to script out getting the total size of a few directories using:

code:
$startFolder = "C:\Users"

$colItems = (Get-ChildItem $startFolder -Recurse -Force | Measure-Object -property Length -sum)
"$startFolder -- " + "{0:N2}" -f ($colItems.sum / 1MB) + " MB"
But I'm having a couple of hangups. First, it's crapping out on any path longer than 248 characters. Second, specifically for Users, it's seeing all the weirdo special folders (AppData, Documents, Music, etc.) multiple times and wildly overcounting the size of the containing folders.

Any ideas?

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

Briantist posted:

Yeah I agree, I meant to edit this in actually, but sure if you're going to copy all of the files, walking the tree first will probably be faster even if it starts later. But if you're going to have conditions on which files are included or not, like with a Where-Object condition, it could be desirable to take each object as it comes. You could go more advanced with it and background the copy with a powershell job or runspace or something, and then effectively you'll be walking the tree and copying concurrently. But this is far from the original question.. just trying to illustrate the differences.

I'd be more inclined to do any copy as I walk simply because the files may change during the walk and I wouldn't want stale metadata.

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

Wicaeed posted:

PowerCLI Time:

Trying to count the number of vCPUs that have been configured in a VM resource pool in our vCenter server. Problem is that we have similarly named resource pools across multiple clusters. I wrote something that can get me 90% of what I'm looking for, what I really need is the total count across all resource pool instances:

code:
Foreach ($rp in Get-Resourcepool -Name "ResourcePool") { 
    $vCPU = Get-VM -Location $rp | Measure-Object -Property NumCPU -SUM | Select -ExpandProperty Sum
    $rp | Select Name,
    @{N='vCPU assigned to VMs';E={$vcpu}}
    }
My results look something like this:

code:
Name                                                                                                            vCPU assigned to VMs
----                                                                                                            --------------------
prod-ubu14                                                                                                                   196
prod-ubu14                                                                                                                   108
prod-ubu14                                                                                                                   168
I'm wracking my brain trying to come up with a way to count all those instances of a result returned in a foreach loop and tally them up at the end. The problem is compounded by the fact that the results all have the same name.

Group-Object?
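A rough sketch of what that could look like, building on the loop above (untested; it just collects one object per pool instance and then groups by name to total the counts):
code:
# One record per resource-pool instance, then group by pool name and sum the vCPUs.
$perPool = Foreach ($rp in Get-ResourcePool -Name "ResourcePool") {
    [pscustomobject]@{
        Name = $rp.Name
        vCPU = Get-VM -Location $rp | Measure-Object -Property NumCPU -Sum | Select-Object -ExpandProperty Sum
    }
}

$perPool | Group-Object -Property Name |
    Select-Object Name, @{N='vCPU assigned to VMs'; E={ ($_.Group | Measure-Object -Property vCPU -Sum).Sum }}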

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
$computers | %{$(get-wmiobject -ComputerName $_ -class win32_computersystem).username}

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
pre:
set-strictmode -version 2.0


$folders = Get-ChildItem C:\Windows\System32\catroot -Force -Recurse | Where-Object{($_.PSIsContainer)} | Select Name, FullName

$filesize_list = @()

ForEach($folder in $folders){
     $name = [Guid]::Empty
     if([Guid]::TryParse($folder.Name,[ref]$name))
     {
        $filesize_list += Get-Item $folder.FullName | Select -Property FullName,@{Name="Size"; `
                          Expression = {(Get-ChildItem $_.FullName | Measure-Object -property length -sum).Sum + 0}}
     }
}

$file_groups = $filesize_list | Select FullName, Size | Group-Object -Property Size | Where-Object {($_.Count -gt 1) -and ($_.Name -gt 0)} `
               | Select Name, @{Name="Group"; Expression = {$_.Group.FullName}} 

$file_hashes = @{}

ForEach( $file_group in $file_groups){
    ForEach( $file_to_hash in $file_group.Group){
        $file_hashes.Add( $file_to_hash, (Get-ChildItem $file_to_hash -Force -Recurse | Get-FileHash | Select Hash))
    }
}
$file_hashes.GetEnumerator() | Group-Object -Property Value | Select Count, @{Name="Matches"; Expression = {$_.Group.Name}} `
                | Out-GridView -OutputMode Single | Select -ExpandProperty Group
Oh no what have I typed at 2am.

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
If you run this on both machines, what do you get:

code:
PS C:\> [System.Globalization.DateTimeFormatInfo]::CurrentInfo.FullDateTimePattern

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

Karthe posted:

I don't have access to Windows 8 at the moment but here's what I get back on Windows 10:


I imagine it'll be different on Windows 8. If this is the case, is there anything I can incorporate into my script to help control date output in my CSV exports?

I'm not entirely sure that's the variable that's causing you problems then, since it doesn't match the output. You can try taking "-NoTypeInformation" off your export and seeing what kind of variable it is kicking out for the date.
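For instance, with the type header left on, the first line of the CSV tells you what object type is actually being serialized (the property and path here are just examples):
code:
# Export without -NoTypeInformation and peek at the header lines.
Get-Date | Select-Object DateTime | Export-Csv C:\tmp\datecheck.csv
Get-Content C:\tmp\datecheck.csv | Select-Object -First 2
# Expect something like:
# #TYPE Selected.System.DateTime
# "DateTime"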

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
DUDE NO. What did you make me do to my machine?

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

FISHMANPET posted:

That's kind of impressive, but also shouldn't have worked as written because I was only "printing" the word match.

I uncommented the first print.

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
nvm

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

Harl37 posted:

I'm trying to put a script together in Powershell to reformat a file, and having some troubles.

code:
foreach ($line in $file)
{
	if ($line.StartsWith("2") -eq $true)
		{
			do stuff
		}
}
It's the StartsWith() part, if I specify a number, the script works like I want it to. But what I want the StartsWith() to match is any number 0-9. Any variation of [0-9] I put in there and the "do stuff" part doesn't happen. Is there an easy way to make that happen?

StartsWith() isn't a PowerShell thing; it's a .NET String method, and it only takes strings, not regexes.

Just use:
if ($line -match "^[0-9]")

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

22 Eargesplitten posted:

I'm getting an access denied error while trying to move items. Which is weird, because I could move one item successfully. It's only when I'm trying to move multiple that it doesn't work.

code:
$downloads = "C:\Users\me\downloads\test1"
$OldItems = gci $downloads | Where-Object {$_.LastAccessTime -lt (get-date).AddDays(-30)}
ForEach-Object { move-item -LiteralPath C:\users\me\Downloads -destination "C:\Users\me\Downloads\Test2"} -InputObject $OldItems
The error is

code:
Move-Item : Access to the path 'C:\users\me\Downloads' is denied.
At line:3 char:27
+ ForEach-Object { move-item <<<<  -LiteralPath C:\users\me\Downloads -destination "C:\Users\me
 -InputObject $OldItems
    + CategoryInfo          : WriteError: (C:\users\me\Downloads:DirectoryInfo) [Move-Item], IOExc
    + FullyQualifiedErrorId : MoveDirectoryItemIOError,Microsoft.PowerShell.Commands.MoveItemCommand
I'm running the ISE as administrator. Does anyone have any idea? I know it's probably ugly as hell, this is me learning by doing. I'm also reading A Month of Lunches, this is just something I've wanted to do for a while. Eventually it's going to move all of the old files from Downloads to Downloads Archive, and do the same thing with Documents and Documents archive.

Have you checked to make sure you are actually getting any $OldItems and that just touching the files hasn't reset all their LastAccessTime to Now?
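For example, something quick like this (illustrative only) will tell you whether the filter is matching anything at all:
code:
# How many items matched, and what do their access times actually look like?
($OldItems | Measure-Object).Count
$OldItems | Select-Object Name, LastAccessTime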

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

22 Eargesplitten posted:

Yes, I am. Moving the item isn't changing the access time. I ended up getting it working, but only if I'm doing \downloads\test1, which is what I meant it to be. Unfortunately, if I get it past testing, that's what it needs to be.

This does the job for me (assuming "test1" and "Test2" are directory names):
code:
$UserName  = "foo"
$Downloads = "C:\Users\$UserName\Downloads\test1"
$OldItems = Get-ChildItem $Downloads | Where-Object {$_.LastAccessTime -lt (Get-Date).AddDays(-30)} | Select Name
$OldItems | % { Move-Item -LiteralPath "$Downloads\$($_.Name)" -Destination "C:\Users\$UserName\Downloads\Test2" -WhatIf}
Obviously, remove the WhatIf when you are ready for prime time.

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

Uziel posted:

Just dove into powershell today when I got a request that seemed like it would be a pain in the rear end to write in a batch script: take a list of FQDNs and do an nslookup on them.
The only issue I'm encountering is getting the Dns Client info as I'm running on Windows 7 and Get-DnsClient is only available on Windows 8 and above despite Powershell 3.
How else can I grab the client DNS server name and IP address?

Does "Get-WMIObject -Class Win32_NetworkAdapterConfiguration" not contain what you are looking for?

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
Is there a way to perform arithmetic on the results of a regex inside a -replace statement?

Something like:
code:
7 .. 39 | % {Get-ChildItem "*$($($_.toString()).PadLeft(3,'0'))*"} | Rename-Item -NewName { $_.Name -replace '(\d{3})', "$($($($('$1').ToString() - 3).toString()).PadLeft(3,'0'))" }
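One approach that should work, sketched below: skip doing the math inside -replace itself and use the MatchEvaluator overload of [regex]::Replace, which accepts a scriptblock (the -WhatIf is just there for safety while testing):
code:
# Shift every 3-digit group in the matching file names down by 3, zero-padded.
7..39 | % { Get-ChildItem "*$($_.ToString().PadLeft(3,'0'))*" } |
    Rename-Item -NewName {
        [regex]::Replace($_.Name, '\d{3}', { param($m) ([int]$m.Value - 3).ToString().PadLeft(3,'0') })
    } -WhatIf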

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

Sefal posted:

i'm writing a script to only keep the latest 3 files of each file in 1 directory

These files are logs which are generated every week. They all start with <name>.conf<Date>.
There are 20 different names.
I found this script on the internet that is similar to what I want, but the trouble I'm having is that I'm not sure how to define each unique file in this directory.
Currently the script looks like this.
I've now copied this script and changed the filemask name for each other file.


code:
 # Defines how many files you want to keep?
$Keep = 3

# Specifies file mask
$FileMask = "<name>.conf*"

# Defines base directory path
$Path = "driveletter:\path\"

# Creates a full path plus file mask value
$FullPath = $Path + $FileMask

# Creates an array of all files of a file type within a given folder, reverse sort.
$allFiles = @(Get-ChildItem $FullPath) | sort-object -Property {$_.CreationTime} -Descending 

# Checks to see if there is even $Keep files of the given type in the directory.
If ($allFiles.count -gt $Keep) {

    # Creates a new array that specifies the files to delete, a bit ugly but concise.
    $DeleteFiles = $allFiles[$($allFiles.Count - ($allFiles.Count - $Keep))..$allFiles.Count]

    # ForEach loop that goes through the DeleteFile array
    ForEach ($DeleteFile in $DeleteFiles) {

        # Creates a full path and delete file value
        $dFile = $Path + $DeleteFile.Name

        # Deletes the specified file
        Remove-Item $dFile 
    }
}
But there has to be a way to filter each unique file in directory?
I'm not sure how to do that

What format is your date in, and do you want to determine age by that date code rather than by the last-write timestamp?

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
GAZE UPON MY WORKS, YE MIGHTY, AND DESPAIR!

Using the assumption that the log files will be in <NAME>.confMMDDYY format, like so:
code:
PS C:\tmp> ls


    Directory: C:\tmp


Mode                LastWriteTime         Length Name                                                                                        
----                -------------         ------ ----                                                                                        
-a----         03/07/16     16:48              0 abc.conf010115                                                                              
-a----         03/07/16     17:07              0 abc.conf020115                                                                              
-a----         03/07/16     17:11              0 abc.conf030115                                                                              
-a----         03/07/16     17:11              0 abc.conf040415                                                                              
-a----         03/07/16     16:48              0 cde.conf010215                                                                              
-a----         03/07/16     17:08              0 cde.conf010515                                                                              
-a----         03/07/16     17:11              0 cde.conf010715                                                                              
-a----         03/07/16     17:11              0 cde.conf011315                                                                              
And assuming we want to use the date codes in the filename, instead of relying on LastWriteTime:


PHP code:
$Logs_To_Keep = 3
$Log_Path = "C:\tmp"

Get-ChildItem -Path $Log_Path "*.conf*" -Force | Where-Object {-not $_.PSIsContainer} | Select Name | `
    % { [regex]::Replace($_.name, "\.conf.*", "") } | Group-Object | % { Get-ChildItem "$($Log_Path)\\$($_.Name).conf*" | `
    Select Name | % { [regex]::Replace($_.Name, "(.*)\.conf(\d{2})(\d{2})(\d{2})", '$4$2$3.conf$1') } | Sort-Object -Descending | `
    Select-Object -Skip $Logs_To_Keep | % { [regex]::Replace($_, "(\d{2})(\d{2})(\d{2})\.conf(.*)", '$4.conf$2$3$1') | `
    % { Write-Output "Pruning Log File $($Log_Path)\\$($_)"; Remove-Item "$($Log_Path)\\$($_)" -WhatIf } } }
Produces proper output of:
code:
Pruning Log File C:\tmp\\abc.conf010115
What if: Performing the operation "Remove File" on target "C:\tmp\abc.conf010115".
Pruning Log File C:\tmp\\cde.conf010215
What if: Performing the operation "Remove File" on target "C:\tmp\cde.conf010215".
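For anyone who would rather read it than despair, roughly the same logic in long form (a sketch, untested; same assumptions about the MMDDYY date code in the name):
code:
$Logs_To_Keep = 3
$Log_Path = "C:\tmp"

# Group the logs by <name>, sort each group by the date code (newest first),
# and prune everything past the newest $Logs_To_Keep.
Get-ChildItem -Path $Log_Path "*.conf*" -Force |
    Where-Object { -not $_.PSIsContainer -and $_.Name -match '\.conf\d{6}$' } |
    Group-Object { $_.Name -replace '\.conf\d{6}$', '' } |
    ForEach-Object {
        $_.Group |
            Sort-Object { $_.Name -replace '^.*\.conf(\d{2})(\d{2})(\d{2})$', '$3$1$2' } -Descending |
            Select-Object -Skip $Logs_To_Keep |
            Remove-Item -WhatIf
    }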

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

nielsm posted:

Does anyone know of a convenient way to interactively pick an OU (to create some object in) from AD, in a script? I can probably write something myself to make a basic menu kind of thing, but maybe something already exists.

If you are just looking for the UI part and not the interacting with AD part:

https://technet.microsoft.com/en-us/library/ff730949.aspx

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
Test-Path $location\foo
Test-Path "$location\foo"

See what you get?

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
Trying to write a short script to retime subtitles. The function works by itself, but trying to invoke it inside a -replace doesn't work at all. Is there another good method for feeding this data through and doing a mass on-the-fly replace?

php:
$ShiftAmount = -12500

function Shift-TimeCode {
Param ([string]$code)
Process{
$input = $code
[int[]] $timecode = $input.Split(":").Split(",")

$rawtimecode = 3600000*$timecode[0] + 60000*$timecode[1] + 1000*$timecode[2] + $timecode[3]

$rawtimecode += $ShiftAmount

if ($rawtimecode -lt 0) {
    echo "Error: Shift amount exceeds start time."
}

$timecode[0] = [int][math]::floor($rawtimecode / 3600000)
$rawtimecode -= 3600000*$timecode[0]
$timecode[1] = [int][math]::floor($rawtimecode / 60000)
$rawtimecode -= 60000*$timecode[1]
$timecode[2] = [int][math]::floor($rawtimecode / 1000)
$rawtimecode -= 1000*$timecode[2]
$timecode[3] = $rawtimecode

$output = "$($timecode[0].ToString().PadLeft(2,"0")):$($timecode[1].ToString().PadLeft(2,"0")):$($timecode[2].ToString().PadLeft(2,"0")),$($timecode[3].ToString().PadLeft(3,"0"))"

return $output


}
}

$test = "01:02:44,377" -replace "(\d{2}:\d{2}:\d{2},\d{3})", "$(Shift-TimeCode -code $1)"?>

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
http://ss64.com/ps/

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

Hughmoris posted:

Vague question but does anyone here use Powershell for things outside of sysadmin type work? For web scraping, text parsing, console applications etc...? I like exploring new languages but I'm not going to need it for any sort of administrator duties.

I did this the other night and it's gross but it worked so...

code:
$base_url = "http://killsixbilliondemons.com/"
$output_dir = "C:\tmp\k6bd\"
$files_to_download = New-Object System.Collections.ArrayList
$files_to_download.Add('http://killsixbilliondemons.com/chapter/wielder-of-names/page/12/') > $null

while($files_to_download.Count -ne 0){
    
    $current_file = $files_to_download[0]
    $current_file_base_array = $current_file.Trim('/').Split("/")
    $current_file_base = $current_file_base_array[$current_file_base_array.Count -1]
    
    if (!(Test-Path "$output_dir\$current_file_base")){
         Invoke-WebRequest -Uri "$current_file" -OutFile "$output_dir\$current_file_base"
    }

    $temp_array = @()

    Get-Content "$output_dir\$current_file_base" | % { $_ -match "http://(?:killsixbilliondemons`.com|[a-z0-9]+`.cloudfront.net)?/[^`"'?`)`#`<]+" > $null; $matches.GetEnumerator() | % { $temp_array += $_.Value } }

    $temp_array | Sort -Unique | % {
        $temp_name = $_
        $current_file_base_array = @($($temp_name -replace $base_url -replace 'http://[a-z[0-9]+`.cloudfront.net').Split("?", 1))[0].Trim('/').Split("/")
        $current_file_base = $current_file_base_array[$current_file_base_array.Count -1]
        if (!(Test-Path "$output_dir\$current_file_base")){
            if (!($files_to_download.Contains($temp_name))){
                $files_to_download.Add($temp_name) > $null
                echo "Test: $temp_name"
            }
        }
    }
            
    $files_to_download.Remove($current_file)
}

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
I've got a PS script that just makes a windows form and does everything from there. Is there a way to not hold a blank command window open in the background if running this from a shortcut?

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

anthonypants posted:

I've asked this question before, too, and if that's what you really want, you should rewrite your thing in C#.

Gross

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
Powershell is cool and good because I can assume that it's on every windows box at work and that I can send anyone a script and it'll work the same (pretty much) everywhere and they'll be able to edit it as needed without having Visual Studio or something on their box (also, we're trying to get the people at work limping into 1 language, I'm not going to try and teach them like 4, I want to retire some day).

Powershell is dumb and bad because its GUI support gives me Visual Aids.

If there's another language that's going to be (a) consistently available on all my windows boxes without having to install a whole 'nother software package and (b) can give me GUIs so that I can give things to non-programmers and not have them stare at me blankly, pls let me know.

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

anthonypants posted:

When I made my PowerShell/Forms abomination I made the black console emit a "doing thing..." prompt, so you could just do that.

I unironically like this.

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
Here's a thing I did for a guy who didn't like how obfuscated the shutdown options are in Win10's start menu. Just an example of the stuff I throw together for folks here:

code:
[void] [System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms")
[void] [System.Reflection.Assembly]::LoadWithPartialName("System.Drawing") 

$objForm = New-Object System.Windows.Forms.Form 
$objForm.Text = "Windows Power Options"
$objForm.Size = New-Object System.Drawing.Size(185,120) 
$objForm.StartPosition = "CenterScreen"
$objForm.FormBorderStyle = "FixedToolWindow"

$objForm.KeyPreview = $True

$objForm.Add_KeyDown({if ($_.KeyCode -eq "Escape") 
    {$objForm.Close()}})

$ShutdownButton = New-Object System.Windows.Forms.Button
$ShutdownButton.Left = 10
$ShutdownButton.Top = 10
$ShutdownButton.Size = New-Object System.Drawing.Size(75,23)
$ShutdownButton.Text = "&Shutdown"
$ShutdownButton.Add_Click({Stop-Computer -Force})
$objForm.Controls.Add($ShutdownButton)

$RestartButton = New-Object System.Windows.Forms.Button
$RestartButton.Left = 95
$RestartButton.Top = 10
$RestartButton.Size = New-Object System.Drawing.Size(75,23)
$RestartButton.Text = "&Restart"
$RestartButton.Add_Click({Restart-Computer -Force})
$objForm.Controls.Add($RestartButton)

$HibernateButton = New-Object System.Windows.Forms.Button
$HibernateButton.Left = 10
$HibernateButton.Top = 35
$HibernateButton.Size = New-Object System.Drawing.Size(75,23)
$HibernateButton.Text = "&Hibernate"
$HibernateButton.Add_Click({
$PowerState = [System.Windows.Forms.PowerState]::Suspend;
$Force = $false;
$DisableWake = $false;
[System.Windows.Forms.Application]::SetSuspendState($PowerState, $Force, $DisableWake);})
$objForm.Controls.Add($HibernateButton)

$LogoffButton = New-Object System.Windows.Forms.Button
$LogoffButton.Left = 95
$LogoffButton.Top = 35
$LogoffButton.Size = New-Object System.Drawing.Size(75,23)
$LogoffButton.Text = "&Logoff"
$LogoffButton.Add_Click({(Get-WmiObject -Class Win32_OperatingSystem).Win32Shutdown(4)})
$objForm.Controls.Add($LogoffButton)

$CancelButton = New-Object System.Windows.Forms.Button
$CancelButton.Left = 55
$CancelButton.Top = 60
$CancelButton.Text = "&Cancel"
$CancelButton.Add_Click({$objForm.Close()})
$objForm.Controls.Add($CancelButton)

$objLabel = New-Object System.Windows.Forms.Label
$objLabel.Location = New-Object System.Drawing.Size(10,20) 
$objLabel.Size = New-Object System.Drawing.Size(280,20) 
$objForm.Controls.Add($objLabel) 


$objForm.Add_Shown({$objForm.Activate()})
[void]$objForm.ShowDialog()
Now he can just pin that to his taskbar and have the functionality that he wants. There's probably a way simpler and prettier way to mock that up, but v0v.

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

The Fool posted:

I have a powershell script that has been giving me a headache today.

The script queries a subsidiary's AD for user information, then updates the user's account in our AD. If the account doesn't exist, it creates it.

This script works flawlessly if the account already exists, however, if the script has to create a new account, it does so, but all of the subsequent "Set-ADUser" commands fail. If I run the script a second time, it updates everything. Hell, if I even tell the function to run twice in a row, it spits out a bunch of errors on the first pass, then works fine on the second pass.

I have no idea what the gently caress to do at this point.

Paste ur script.

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

The Fool posted:

So, when I was sanitizing the variable names I noticed a Write-Output that should have been a Write-Error. I fixed that in the source script, and now it's working correctly.

Thanks virtual rubber ducks.

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
while(1) {gc -Wait "\\path\to\update.status" | Select-String "succeeded" | % {break}}

I'll show myself out.

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

The Iron Rose posted:

in the following format: MM/DD/YYYY HH/MM. An example would be "1/12/2018 19:16"

Just to be clear, which format is it? The format you posted and the example date you posted are not the same in several ways.

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

Nth Doctor posted:

I have what I hope is a silly question about Powershell vs. Powershell Core.
My boss is a Mac user and asked for me to give him access to a tool we wrote using powershell that tests our web services for basic functionality.
I see that Powershell Core is available for macs, so I tried running my script with pwsh and got this error:
pre:
Invoke-WebRequest : The format of value 'text/xml; charset=utf-8' is invalid.
At C:\test.ps1:9 char:13
+ $response = Invoke-WebRequest `
+             ~~~~~~~~~~~~~~~~~~~
+ CategoryInfo          : NotSpecified: (:) [Invoke-WebRequest], FormatException
+ FullyQualifiedErrorId : System.FormatException,Microsoft.PowerShell.Commands.InvokeWebRequestCommand
Here's my script, edited down to the Hello World version:
code:
$bodyContent = @"
        <soap:Envelope />
"@

$authorizationHeader = "Basic asdf"
$headers = @{ Authorization = $authorizationHeader }
$endpoint = "http://httpbin.org/anything"
$contentType = "text/xml; charset=utf-8"
$response = Invoke-WebRequest `
    -Method POST `
    -Uri $endpoint `
    -ContentType $contentType `
    -Headers $headers `
    -Body $bodyContent
Write-Host $response
The script runs just fine with powershell.exe

If I changed $contentType to just "text/xml" it works fine as well. Am I being paranoid about the lack of charset information? How should I be formatting the $contentType value if I were to keep the charset?

Try:
$contentType = "text/xml; charset=utf-8;"

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

Nth Doctor posted:

Good thought and something I didn't try earlier, but no dice:
pre:
pwsh ./test.ps1
Invoke-WebRequest : The format of value 'text/xml; charset=utf-8;' is invalid.
At C:\test.ps1:9 char:13
+ $response = Invoke-WebRequest `
+             ~~~~~~~~~~~~~~~~~~~
+ CategoryInfo          : NotSpecified: (:) [Invoke-WebRequest], FormatException
+ FullyQualifiedErrorId : System.FormatException,Microsoft.PowerShell.Commands.InvokeWebRequestCommand

[-SkipHeaderValidation]

-SkipHeaderValidation
Indicates the cmdlet should add headers to the request without validation.

This switch should be used for sites that require header values that do not conform to standards. Specifying this switch disables validation to allow the value to be passed unchecked. When specified, all headers are added without validation.

This will disable validation for values passed to both the -Headers and -UserAgent parameters.
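A sketch of how that might look against the script above (hedged: depending on the pwsh build, the non-conforming value may need to go through -Headers rather than -ContentType for the switch to cover it):
code:
# Same request, but pass the Content-Type as a plain header and skip validation.
$response = Invoke-WebRequest `
    -Method POST `
    -Uri $endpoint `
    -Headers ($headers + @{ 'Content-Type' = 'text/xml; charset=utf-8' }) `
    -SkipHeaderValidation `
    -Body $bodyContent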

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

PierreTheMime posted:

Yeah, for whatever reason I didn't think to do outgoing streams too, which is obvious now. It took 2.5 hours, but I'll make the update and see what the difference is.

I'm doing ~2.5gb in ~15sec with:

code:
function split($path)
{
    $chunkSize=[math]::Ceiling((Get-Item $path).Length/6)
    $fileName = [System.IO.Path]::GetFileNameWithoutExtension($path)
    $directory = [System.IO.Path]::GetDirectoryName($path)
    $extension = [System.IO.Path]::GetExtension($path)

    $file = New-Object System.IO.FileInfo($path)
    $totalChunks = [int]($file.Length / $chunkSize) + 1
    $digitCount = [int][System.Math]::Log10($totalChunks) + 1

    $reader = [System.IO.File]::OpenRead($path)
    $count = 0
    $buffer = New-Object Byte[] $chunkSize
    $hasMore = $true
    while($hasMore)
    {
        $bytesRead = $reader.Read($buffer, 0, $buffer.Length)
        $chunkFileName = "$directory\$fileName$extension.{0:D$digitCount}.part"
        $chunkFileName = $chunkFileName -f $count
        $output = $buffer
        if ($bytesRead -ne $buffer.Length)
        {
            $hasMore = $false
            $output = New-Object Byte[] $bytesRead
            [System.Array]::Copy($buffer, $output, $bytesRead)
        }
        [System.IO.File]::WriteAllBytes($chunkFileName, $output)
        ++$count
    }

    $reader.Close()
}

split C:\Path\File.txt

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

Judge Schnoopy posted:

I need to write an empty array to json instead of $null. What's the least hacky way to do this?

$json = @{}
[array]$array = @()
$json.add('Thing',$array)
$json | convertto-json

Results are:
"Thing" : $null

Desired results are:
"Thing" : [ ]

ConvertTo-Json doesn't handle this well when you pipe into it; pass the object as an argument instead. "ConvertTo-Json $json" (i.e. -InputObject) should do what you expect.
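A minimal sketch of the suggested form (the exact whitespace in the output will vary, but the empty array should come through as [] rather than null):
code:
$json = @{}
[array]$array = @()
$json.Add('Thing', $array)

# Pass the hashtable as -InputObject instead of piping it in.
ConvertTo-Json -InputObject $json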

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
Ok, so I'm trying to write up a tool we can use in our lab to quickly grab important info from the lab machines and display it in a way that our non-technical techs can handle. I'm not 100% on how the UI stuff works, though. I've got a working form, and it populates great, but when I moved the form population from the end of the scan to updating after each machine, it basically locks me out of the form until the scan completes. I can still see it populating, but the form won't respond to any inputs (I have to close the underlying powershell window to kill it).

https://pastebin.com/h2nb50W9
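For context, the usual cause is that the scan loop runs on the same thread that owns the form, so the message pump never gets serviced between machines. A minimal stopgap some people use is pumping events once per pass (a background runspace is the cleaner fix); $machines and $listView here are stand-ins for whatever the pastebin script actually uses:
code:
foreach ($machine in $machines) {
    # ... query $machine and add its row to the form's list control ...
    $listView.Refresh()
    # Let the form process pending paint/click messages so it doesn't
    # appear hung while the scan runs on the UI thread.
    [System.Windows.Forms.Application]::DoEvents()
}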
