The Fool
Oct 16, 2003


ChubbyThePhat posted:

Ah PowerShell 7 now has a ternary operator.... oh boy.

Actually happy about this


mllaneza
Apr 28, 2007

Veteran, Bermuda Triangle Expeditionary Force, 1993-1952




Djimi posted:

Here's the code and the error I'm getting:
code:
$extx= ".xlsx"
$Dir = (get-childitem -Path "p:\myfolder\" -recurse -force | ? {$_.Extension -eq $ext -or $_.Extension -eq $extx})

If that's a copy n' paste of all your code, there's your problem. You've got one reference to $extx, which has an assigned value, and one to $ext which doesn't.
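
A minimal fix along those lines might look like this (the ".xls" value for $ext is an assumption, since the original assignment wasn't shown):
code:
# Hypothetical: give both variables a value before filtering on them
$ext  = ".xls"
$extx = ".xlsx"
$Dir = Get-ChildItem -Path "p:\myfolder\" -Recurse -Force |
    Where-Object { $_.Extension -eq $ext -or $_.Extension -eq $extx }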

Toast Museum
Dec 3, 2005

30% Iron Chef
Ha, after posting about the last couple release candidates, I'm two days late to the party on PowerShell 7's GA release :toot:

Pile Of Garbage
May 28, 2007



What's the actual deal with these fanciful new PS releases? They're basically poo poo until baked into an LTS Windows Server release, yeah?

mystes
May 31, 2006

Pile Of Garbage posted:

What's the actual deal with these fanciful new PS releases? They're basically poo poo until baked into an LTS Windows Server release, yeah?
They only just officially positioned PowerShell Core as the replacement for the built-in Windows PowerShell with the PowerShell 7 release, but it seems like they intend to bundle it with Windows eventually.

Pile Of Garbage
May 28, 2007



As with previous releases, it's really only going to be useful for sysadmin stuff if your entire fleet is running the same version (which is trivial to do with SCCM or any other package-management software; the hard part is getting project funding/UAT/change approval/whatever else is needed so you can blat it out to several thousand devices).

This isn't exactly a new problem of course; the same issue already exists in environments with mixed OS versions (Server 2003 R2 with v2, Server 2008 R2 with v4, Server 2012 R2 with v5, etc.). What's going to suck even more than it already does is the bullshit shims people put in to make their scripts backwards compatible (Chocolatey is a good example of this poo poo if you ever look under the hood). Wish people would just fork their stuff to support new PS environments instead of layering in obscene backwards-compatibility garbage.

mllaneza
Apr 28, 2007

Veteran, Bermuda Triangle Expeditionary Force, 1993-1952




I've got a script that uninstalls a package, sets a scheduled task to reboot the machine, and then returns 0 to KACE. On the PS2 machines where the scheduled task stuff isn't supported, I just put in a catch block to just reboot the machine.

There's probably a better way to do it, but this gets the job done.
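
A rough sketch of that flow, going by the description above (the package name, task details, and the ScheduledTasks cmdlets are assumptions; the actual script may differ):
code:
# Uninstall the package (placeholder product code)
Start-Process msiexec.exe -ArgumentList '/x {PRODUCT-CODE-GUID} /qn' -Wait

try {
    # These ScheduledTasks cmdlets don't exist on PowerShell v2 hosts, so this block throws there
    $action  = New-ScheduledTaskAction -Execute 'shutdown.exe' -Argument '/r /t 300'
    $trigger = New-ScheduledTaskTrigger -Once -At (Get-Date).AddMinutes(5)
    Register-ScheduledTask -TaskName 'PostUninstallReboot' -Action $action -Trigger $trigger | Out-Null
}
catch {
    # v2 fallback: skip the scheduled task and reboot right away
    Restart-Computer -Force
}

# Tell KACE the script finished
exit 0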

Pile Of Garbage
May 28, 2007



So on the non-v2 machines if the scheduled task creation fails for whatever reason it just straight reboots? :lol:

CampingCarl
Apr 28, 2008




I have to clean up a lot of old files that my company has already delivered. Luckily I can get a list of the file names out of SQL of what has been delivered but the files I want to delete are in various subfolders and file types. Basically I have a list containing thing01.pdf and I want to delete(or move) everything named thing01.* from a folder structure. Is there a better/faster way than just foreach through the whole list? The list will probably have a couple hundred thousand filenames and that is just a starting set.

mllaneza
Apr 28, 2007

Veteran, Bermuda Triangle Expeditionary Force, 1993-1952




Pile Of Garbage posted:

So on the non-v2 machines if the scheduled task creation fails for whatever reason it just straight reboots? :lol:

The scheduled task is a reboot. I want to do an exit 0 so KACE knows the script actually finished. On the v2 machines the task creation fails, so gently caress it, reboot the machine to activate SEP, and I'll worry about the machines stuck in 'Running' state in KACE later.

If brute force didn't work, you weren't using enough of it. In this case, I think I'm using just the right amount of brute force.

Pile Of Garbage
May 28, 2007



mllaneza posted:

The scheduled task is a reboot. I want to do an exit 0 so KACE knows the script actually finished. On the v2 machines the task creation fails, so gently caress it, reboot the machine to activate SEP, and I'll worry about the machines stuck in 'Running' state in KACE later.

If brute force didn't work, you weren't using enough of it. In this case, I think I'm using just the right amount of brute force.

Ah, SEP. Condolences.

PierreTheMime
Dec 9, 2004

Hero of hormagaunts everywhere!
Buglord

CampingCarl posted:

I have to clean up a lot of old files that my company has already delivered. Luckily I can get a list of the file names out of SQL of what has been delivered but the files I want to delete are in various subfolders and file types. Basically I have a list containing thing01.pdf and I want to delete(or move) everything named thing01.* from a folder structure. Is there a better/faster way than just foreach through the whole list? The list will probably have a couple hundred thousand filenames and that is just a starting set.

Like most things in code, it depends on how you want to handle it. Aside from making sure your code uses the minimum number of loops and such, if you want to improve the speed you may want to split off significant chunks into separate processing threads. PowerShell is usually pretty decent about parsing lists and file operations, but I'm sure there's some speed to be gained if the list is big enough.
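
For instance, a hedged sketch of that chunk-and-thread idea with Start-Job (the list file and chunk size are assumptions, and it assumes the list already contains full paths):
code:
# Split the list of names into chunks and delete each chunk in a background job
$names     = Get-Content '.\delivered.txt'
$chunkSize = 1000

$jobs = for ($i = 0; $i -lt $names.Count; $i += $chunkSize) {
    $chunk = $names[$i..([Math]::Min($i + $chunkSize, $names.Count) - 1)]
    Start-Job -ScriptBlock {
        param($files)
        foreach ($f in $files) { Remove-Item -LiteralPath $f -ErrorAction SilentlyContinue }
    } -ArgumentList (, $chunk)
}

$jobs | Wait-Job | Receive-Job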

adaz
Mar 7, 2009

CampingCarl posted:

I have to clean up a lot of old files that my company has already delivered. Luckily I can get a list of the file names out of SQL of what has been delivered but the files I want to delete are in various subfolders and file types. Basically I have a list containing thing01.pdf and I want to delete(or move) everything named thing01.* from a folder structure. Is there a better/faster way than just foreach through the whole list? The list will probably have a couple hundred thousand filenames and that is just a starting set.

For absolutely FUCKALLYUGE deletes and moves PowerShell isn't really your best choice. It's a little too abstract for that stuff to work very well. With that said, you have some options!

1. For deletes, break the dirs into multiple blocks and use Start-Job to effectively multithread the delete.
2. This is sad but true - the plain old DOS DEL command is by far the quickest I/O for deletes.
3. If you fall back to the .NET EnumerateFiles/EnumerateDirectories methods, they are very quick in that they start returning results to PowerShell right away instead of waiting to grab _all_ the results.
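
A hedged sketch of option 3, since the .NET call streams results as it walks the tree (the path and pattern here are made up):
code:
# EnumerateFiles yields paths as they're found instead of buffering the full listing first
$files = [System.IO.Directory]::EnumerateFiles('P:\myfolder', 'thing01.*', [System.IO.SearchOption]::AllDirectories)
foreach ($path in $files) {
    Remove-Item -LiteralPath $path -WhatIf    # drop -WhatIf once the matches look right
}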

Toast Museum
Dec 3, 2005

30% Iron Chef

adaz posted:

For absolutely FUCKALLYUGE deletes and moves PowerShell isn't really your best choice. It's a little too abstract for that stuff to work very well. With that said, you have some options!

1. For deletes, break the dirs into multiple blocks and use Start-Job to effectively multithread the delete.
2. This is sad but true - the plain old DOS DEL command is by far the quickest I/O for deletes.
3. If you fall back to the .NET EnumerateFiles/EnumerateDirectories methods, they are very quick in that they start returning results to PowerShell right away instead of waiting to grab _all_ the results.

  1. If you're looking for a reason to try PowerShell 7, ForEach-Object -Parallel is loving dope.
  2. Related to that, Robocopy is significantly faster than Move-Item or Copy-Item. I think it might be usable for deletes too, but I don't use it often enough to be sure.
  3. While we're talking about large collections, there's a performance advantage to using the classes in System.Collections.Generic rather than PowerShell's arrays and hashtables. Maybe this has changed in PowerShell 7, but in 5.1, adding to an array or hashtable involves rebuilding the whole collection, which leads to slowdowns that increase with the size of the collection.
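
For point 2, a hedged Robocopy sketch for the move case (the paths and file pattern are assumptions; check robocopy /? before trusting the switches on your version):
code:
# /S recurses into non-empty subfolders (the structure is recreated under the destination),
# /MOV deletes each source file after it has been copied
robocopy 'P:\delivered' 'P:\archive' 'thing01.*' /S /MOV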

Pile Of Garbage
May 28, 2007



Toast Museum posted:

While we're talking about large collections, there's a performance advantage to using the classes in System.Collections.Generic rather than PowerShell's arrays and hashtables. Maybe this has changed in PowerShell 7, but in 5.1, adding to an array or hashtable involves rebuilding the whole collection, which leads to slowdowns that increase with the size of the collection.

Is there any actual evidence, like testing with Measure-Command, to support this? I ask because I've been told the same thing before but only for System.Collections.ArrayList and when I asked the same question re evidence I got only silence. It really feels like one of those myths that's just been propagated. I really don't see how something that extends the base System.Array which PS uses would improve performance.

Toast Museum
Dec 3, 2005

30% Iron Chef

Pile Of Garbage posted:

Is there any actual evidence, like testing with Measure-Command, to support this? I ask because I've been told the same thing before but only for System.Collections.ArrayList and when I asked the same question re evidence I got only silence. It really feels like one of those myths that's just been propagated. I really don't see how something that extends the base System.Array which PS uses would improve performance.

I'll play around with it later today, but if it's a myth, it's one that Microsoft is perpetuating:

Microsoft posted:

Generating a list of items is often done using an array with the addition operator:

code:
$results = @()
$results += Do-Something
$results += Do-SomethingElse
$results
This can be very inefficient because arrays are immutable. Each addition to the array actually creates a new array big enough to hold all elements of both the left and right operands, then copies the elements of both operands into the new array. For small collections, this overhead may not matter. For large collections, this can definitely be an issue.

There are a couple of alternatives. If you don't actually require an array, instead consider using an ArrayList
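
A hedged sketch of what that alternative looks like in practice (Do-Something and Do-SomethingElse are the doc's own placeholders):
code:
$results = New-Object System.Collections.ArrayList
[void]$results.Add((Do-Something))            # Add() returns the new index, so discard it
[void]$results.AddRange(@(Do-SomethingElse))  # AddRange takes a whole collection at once
$results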

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
The key here, I think, is what "large" means in this context. From experience it's not really thousands; I don't know about tens of thousands, hundreds of thousands, or millions of objects.

CampingCarl
Apr 28, 2008




adaz posted:

For absolutely FUCKALLYUGE deletes and moves PowerShell isn't really your best choice. It's a little too abstract for that stuff to work very well. With that said, you have some options!

1. For deletes, break the dirs into multiple blocks and use Start-Job to effectively multithread the delete.
2. This is sad but true - the plain old DOS DEL command is by far the quickest I/O for deletes.
3. If you fall back to the .NET EnumerateFiles/EnumerateDirectories methods, they are very quick in that they start returning results to PowerShell right away instead of waiting to grab _all_ the results.
For some reason I just assumed PowerShell and DEL would use the same underlying method to delete. The problem isn't so much the delete as quickly comparing files to the list, because I expect there to be some files left over. I figured that since the match is determined entirely by the filename in this case, PowerShell would at least be good for generating the list of files, and then I can use that to either delete or move.

I was not aware of that parallel flag so I will definitely look into that.
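
One hedged way to do that comparison quickly (the list file and folder paths are assumptions): load the delivered names into a HashSet and filter a recursive listing by base name.
code:
# Fast lookup of delivered base names (thing01, thing02, ...), case-insensitive
$delivered = [System.Collections.Generic.HashSet[string]]::new([System.StringComparer]::OrdinalIgnoreCase)
Get-Content '.\delivered.txt' | ForEach-Object {
    [void]$delivered.Add([System.IO.Path]::GetFileNameWithoutExtension($_))
}

# Any file whose base name is in the set is a candidate to delete or move
$candidates = Get-ChildItem -Path 'P:\delivered' -Recurse -File |
    Where-Object { $delivered.Contains($_.BaseName) }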

Happiness Commando
Feb 1, 2002
$$ joy at gunpoint $$

Pile Of Garbage posted:

Is there any actual evidence, like testing with Measure-Command, to support this?

We used phone timers instead of measure-command, but the end result was that we stopped using += with string arrays at work because of how slow it is.

adaz
Mar 7, 2009

CampingCarl posted:

For some reason I just assumed PowerShell and DEL would use the same underlying method to delete. The problem isn't so much the delete as quickly comparing files to the list, because I expect there to be some files left over. I figured that since the match is determined entirely by the filename in this case, PowerShell would at least be good for generating the list of files, and then I can use that to either delete or move.

I was not aware of that parallel flag so I will definitely look into that.

No, they definitely don't. For example, DEL is natively hooked into the kernel APIs that support long paths, whereas PowerShell goes through the Win32 APIs, which don't.

Parallel is cool but FYI it only works on Windows hosts, last I heard.

Pile Of Garbage posted:

Is there any actual evidence, like testing with Measure-Command, to support this? I ask because I've been told the same thing before but only for System.Collections.ArrayList and when I asked the same question re evidence I got only silence. It really feels like one of those myths that's just been propagated. I really don't see how something that extends the base System.Array which PS uses would improve performance.

Like most things, if you're doing it a couple of times it won't matter. But if you're adding dozens/hundreds/thousands of items it adds up rapidly. If you look at Microsoft's own implementation of stuff like System.Collections.Generic.List<T>, it internally keeps a backing array with spare capacity to add onto and grows it as needed, to avoid allocating a fresh array on every add. You can try it yourself; I added measuring the generic List to the code examples.

code:
$ArrayList = New-Object -TypeName 'System.Collections.ArrayList';
$Array = @();
$ListGeneric = New-Object System.Collections.Generic.List[string];

Measure-Command { for($i = 0; $i -lt 10000; $i++)  {  $null = $ListGeneric.Add("Adding item $i")   } };


Measure-Command { for($i = 0; $i -lt 10000; $i++) { $null = $ArrayList.Add("Adding item $i") } };

Measure-Command { for($i = 0; $i -lt 10000; $i++) { $Array += "Adding item $i" } };

adaz fucked around with this message at 18:19 on Mar 12, 2020

Toast Museum
Dec 3, 2005

30% Iron Chef

adaz posted:

Parallel is cool but FYI only works on windows hosts last I heard.

Just tried it in macOS.

code:
1..10 | ForEach-Object -Process {Start-Sleep -Seconds 1}
takes 10 seconds, whereas

code:
1..10 | ForEach-Object -Parallel {Start-Sleep -Seconds 1} -ThrottleLimit 10
takes one second. For all I know, there could be issues that make it choke in less trivial cases, but cross-platform implementation is there, at least.


Fake edit: I just realized you might be thinking of ForEach -Parallel for Workflows. Yeah, that's Windows-only and Workflows-only.

adaz
Mar 7, 2009

Toast Museum posted:

Just tried it in macOS.

code:
1..10 | ForEach-Object -Process {Start-Sleep -Seconds 1}
takes 10 seconds, whereas

code:
1..10 | ForEach-Object -Parallel {Start-Sleep -Seconds 1} -ThrottleLimit 10
takes one second. For all I know, there could be issues that make it choke in less trivial cases, but cross-platform implementation is there, at least.


Fake edit: I just realized you might be thinking of ForEach -Parallel for Workflows. Yeah, that's Windows-only and Workflows-only.

Well this is super awesome. I was indeed thinking of workflows and confusing the two. The fact it works in both is awesome and sort of makes sense since, under the hood, pretty sure it just uses the task parallel library. Nice!

Toast Museum
Dec 3, 2005

30% Iron Chef
It's easily the PowerShell 7 feature I've made the most use of. For dumb company culture reasons, I have to get by without most of the usual enterprise management tools, so

code:
Get-PSSession | ForEach-Object -Parallel {
    Copy-Item -Path 'c:\some\local.file' -Destination 'c:\remote\destination' -ToSession $_
}
has been pretty handy when I've got to distribute some small file on the fly. For my devices and network, the sweet spot for -ThrottleLimit is about 12, with a corresponding twelve-fold speedup compared to processing the loop serially.

Submarine Sandpaper
May 27, 2007


This is a silly one, but how can you determine all the stores on a machine, e.g. SQL Server, certificates, registry?

Potato Salad
Oct 23, 2014

nobody cares


Submarine Sandpaper posted:

This is a silly one, but how can you determine all the stores on a machine, e.g. SQL Server, certificates, registry?

Like, detect all common locations for storing configurations or data?

adaz
Mar 7, 2009

Submarine Sandpaper posted:

This is a silly one, but how can you determine all the stores on a machine, e.g. SQL Server, certificates, registry?

I think you're looking for Get-PSProvider.
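
For example (what shows up depends on which modules are loaded):
code:
# Lists every provider (FileSystem, Registry, Certificate, etc.) and the drives it exposes
Get-PSProvider

# Get-PSDrive shows the concrete drives, including ones added by modules such as SqlServer
Get-PSDrive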

BeastOfExmoor
Aug 19, 2003

I will be gone, but not forever.
I am running the following if statement to figure out if $Destfilename both exists and also is not equal to $File.Name in a ForEach loop. It works most of the time, but I noticed an issue when the filename(s) contain a '!' character. I'm assuming this is a regex issue, but in all my searching I'm not seeing a way to ignore regex and compare string variables as literals.

code:
    if($Destfilename -and $File.Name -ne $DestFilename) {

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

BeastOfExmoor posted:

I am running the following if statement to figure out if $Destfilename both exists and also is not equal to $File.Name in a ForEach loop. It works most of the time, but I noticed an issue when the filename(s) contain a '!' character. I'm assuming this is a regex issue, but in all my searching I'm not seeing a way to ignore regex and compare string variables as literals.

code:
    if($Destfilename -and $File.Name -ne $DestFilename) {

I do not understand.

When you use if($Destfilename), you are only checking whether the variable holds a non-empty value. If you want to see if the file exists, you should be using Test-Path.

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
I tried to mimic what I think your intent is, but don't see any issue.

code:
$working_dir = "C:\Users\toshimo\Documents\PowerShell\"
$File1 = "Test1.txt"
$File2 = "Test1!.txt"

$Destfilename = $File1
$File = Get-ChildItem ($working_dir + $File1)
if((Test-Path ($working_dir + $Destfilename)) -and ($File.Name -ne $DestFilename)) { echo "No Match" } else { echo "Match" }

$Destfilename = $File2
$File = Get-ChildItem ($working_dir + $File2)
if((Test-Path ($working_dir + $Destfilename)) -and ($File.Name -ne $DestFilename)) { echo "No Match" } else { echo "Match" }

$Destfilename = $File1
$File = Get-ChildItem ($working_dir + $File2)
if((Test-Path ($working_dir + $Destfilename)) -and ($File.Name -ne $DestFilename)) { echo "No Match" } else { echo "Match" }

$Destfilename = $File2
$File = Get-ChildItem ($working_dir + $File1)
if((Test-Path ($working_dir + $Destfilename)) -and ($File.Name -ne $DestFilename)) { echo "No Match" } else { echo "Match" }


    Directory: C:\Users\toshimo\Documents\PowerShell


Mode                LastWriteTime         Length Name
----                -------------         ------ ----
-a----        3/23/2020   1:34 AM              8 Test1!.txt
-a----        3/23/2020   1:32 AM              8 Test1.txt
Match
Match
No Match
No Match

BeastOfExmoor
Aug 19, 2003

I will be gone, but not forever.

Toshimo posted:

I do not understand.

When you use if($Destfilename), you are only checking whether the variable holds a non-empty value. If you want to see if the file exists, you should be using Test-Path.

Sorry, I should've explained that. Earlier in the script I'm grabbing a Get-Childitem list from different directories and building an object consisting of similar, but not exact, filenames, so I already know $DestFilename, if present, exists. The check to see if $DestFilename isn't $null is just a hacky way of solving an earlier logic issue when $SourceDir has more files than $DestDir.

Either way, I gave this a look with fresh eyes and ran some more tests, and it appears that the issue is happening somewhere in the code before this and I can't quite get it to recreate if I make a test directory, so I'm honestly not sure what's happening.

It's an edge case for a tool I have to run with a bit of human oversight anyway, so I'll ignore it for now and come back to trace it somewhere down the line.

Thanks for the write up.

Toast Museum
Dec 3, 2005

30% Iron Chef

BeastOfExmoor posted:

I am running the following if statement to figure out if $Destfilename both exists and also is not equal to $File.Name in a ForEach loop. It works most of the time, but I noticed an issue when the filename(s) contain a '!' character. I'm assuming this is a regex issue, but in all my searching I'm not seeing a way to ignore regex and compare string variables as literals.

code:
    if($Destfilename -and $File.Name -ne $DestFilename) {

Does -eq/-ne rely on regex? I didn't think so, but if they do, you could use -like or [string]::equals(string a, string b)

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

Is there a better way to build this object?

The end result here is importing a list of the last time a mailbox has been logged into, into a database.

This works:
code:
$bobsData = Get-MailboxStatistics -Identity "Bob Morales" | Select-Object DisplayName, Department, OU, LastLogonTime

DisplayName    Department OU LastLogonTime
-----------    ---------- -- -------------
Bob Morales               3/25/2020 3:02:59 PM

Write-SqlTableData -Credential $dbCredential -ServerInstance db01 -Database email_stats -SchemaName "dbo" -TableName LastLogon -InputData $bobsData
However, this is the original script. It does a few more things so it can get the OU/Department of the users:

code:
$mbxStats = @()

$mailboxes = Get-Mailbox -ResultSize Unlimited
$mailboxes | ForEach-Object {
    $stats = Get-MailboxStatistics -id $_
    $user  = Get-User -id $_

    # Create a new instance of a .Net object
    $mbx = New-Object System.Object

    $mbx | Add-Member -MemberType NoteProperty -Value $stats.DisplayName -Name DisplayName
    $mbx | Add-Member -MemberType NoteProperty -Value $user.Department -Name Department
    $mbx | Add-Member -MemberType NoteProperty -Value $user.OrganizationalUnit -Name OU
    $mbx | Add-Member -MemberType NoteProperty -Value $stats.LastLogonTime -Name LastLogon

    $mbxStats += $mbx
}

$mbxStats | Sort-Object OU, Department, LastLogon | Export-Csv "\\fs01\data\it\reports\email report\Lastlogon.csv" -NoTypeInformation
Someone manually imports that CSV right now, which doesn't make any sense when it can just be INSERTed directly to the database.

And if I try the Write-SqlTableData command like the first example, I get this error:
code:
Write-SqlTableData : The given value of type Object from the data source cannot be converted to type varchar of the
specified target column.
At line:1 char:1
+ Write-SqlTableData -Credential $dbCredential -ServerInstance db01 ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : WriteError: ([dbo].[LastLogon]:Table) [Write-SqlTableData], InvalidOperationException
    + FullyQualifiedErrorId : WriteToTableFailure,Microsoft.SqlServer.Management.PowerShell.WriteSqlTableData
I'm trying to read up on how objects/arrays/etc. work in PowerShell. Is it barking about the 'row' object itself or the DateTime value?

Would I be better off building the result set a slightly different way in the second example? Or would it make more sense to iterate through the first example's result set and add the rest of the data for the user?

The Fool
Oct 16, 2003


I don't have a lot of advice except:

1. I like using Select-Object to build custom objects:
code:
$customArray = @()

foreach ($item in $source) {
    $customObject = '' | Select-Object Prop1,Prop2
    $customObject.Prop1 = $item.Prop1
    $customObject.Prop2 = $item.Prop2
    $customArray += $customObject
}

$customArray
-------

Prop1     	Prop2
-----		-----
Property1 	Property2
Property1 	Property2
Property1 	Property2

2. Anytime I need to dump something into a database, I just switch to python + pandas because you can just do this with your dataframe:
code:
df.to_sql(db_tablename, con=engine, index=False, if_exists='replace')

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
I'd try making a PSCustomObject:
code:
$mailbox_list = Get-Mailbox -ResultSize Unlimited

ForEach ($mailbox in $mailbox_list) {
    $mail_stats = Get-MailboxStatistics -id $mailbox
    $user       = Get-User -id $mailbox

    $userObject = [PSCustomObject]@{
            id                = ''                     # You may omit this if your database doesn't need it.
            DisplayName       = $mail_stats.DisplayName
            Department        = $user.Department
            OU                = $user.OrganizationalUnit
            LastLogonTime     = $mail_stats.LastLogonTime
        }

    Write-SqlTableData -Credential $dbCredential -ServerInstance db01 -Database email_stats -SchemaName "dbo" -TableName LastLogon -InputData $userObject
}

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

Thanks guys, I'll play around with those if I get a break from laptop issuing and vpn support tomorrow.

I'm trying to learn more powershell since the new job is 90% windows.

Submarine Sandpaper
May 27, 2007


Is there a handy shorthand for getting array objects that are values in a hashtable into an array, without messing around with Out-String or some -join command to avoid the string conversion giving a System.Object.whatever?

Pile Of Garbage
May 28, 2007



Submarine Sandpaper posted:

Is there a handy shorthand for getting array objects that are values in a hashtable into an array, without messing around with Out-String or some -join command to avoid the string conversion giving a System.Object.whatever?

See the ExpandProperty parameter for Select-Object: https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/select-object?view=powershell-7#parameters

Example:

code:
PS C:\> Get-ChildItem | Select-Object -Property Name

Name
----
ESD
NVIDIA
PerfLogs
Program Files
Program Files (x86)
temp
Users
Windows


PS C:\> Get-ChildItem | Select-Object -ExpandProperty Name

ESD
NVIDIA
PerfLogs
Program Files
Program Files (x86)
temp
Users
Windows
As demonstrated, the former returns objects that still carry a Name property whilst the latter returns just the bare values.

The Fool
Oct 16, 2003


So you have a hashtable like this?
code:
$hashtable = @{ "array1" = 'this','is','array','1'; "array2" = 'hello','array','2' }
$hashtable

Name                           Value
----                           -----
array1                         {this, is, array, 1}
array2                         {hello, array, 2}


What's wrong with just doing this?
code:
$array1 = $hashtable.array1
$array1

this
is
array
1

Or am I missing something?

Pile Of Garbage
May 28, 2007



That's perfectly valid.

Edit: for further reading as usual I recommend the PS about topics, specifically the one for hashtables: https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_hash_tables?view=powershell-5.1

Pile Of Garbage fucked around with this message at 19:24 on Apr 29, 2020


Submarine Sandpaper
May 27, 2007


Oh, I half-assed my question. Specifically, when converting to CSV it attempts to .ToString() everything, so if $hashtable is a value in an array, that field comes out as System.Collections.Hashtable. So when I'm building my report I'm doing some absurd poo poo like

($hashtable | Out-String).trim()
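
A hedged alternative to the Out-String trick, assuming each report row carries its hashtable in a property (the property names here are made up): flatten it with a calculated property before Export-Csv.
code:
$report |
    Select-Object Name, @{ Name = 'Details'; Expression = {
        ($_.Details.GetEnumerator() | ForEach-Object { "$($_.Key)=$($_.Value)" }) -join '; '
    } } |
    Export-Csv '.\report.csv' -NoTypeInformation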

Submarine Sandpaper fucked around with this message at 04:34 on Apr 30, 2020
