|
ChubbyThePhat posted:Ah PowerShell 7 now has a ternary operator.... oh boy. Actually happy about this
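For reference, the new operator follows the usual C-style syntax; a quick sketch:

```powershell
# PowerShell 7's ternary: <condition> ? <value-if-true> : <value-if-false>
$n = 5
$parity = ($n % 2 -eq 0) ? 'even' : 'odd'   # 'odd'

# The pre-7 equivalent, for comparison:
$parity = if ($n % 2 -eq 0) { 'even' } else { 'odd' }
```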
|
# ? Mar 6, 2020 06:16 |
|
Djimi posted:Here's the code and the error I'm getting: If that's a copy n' paste of all your code, there's your problem. You've got one reference to $extx, which has an assigned value, and one to $ext which doesn't.
|
# ? Mar 6, 2020 07:15 |
|
Ha, after posting about the last couple release candidates, I'm two days late to the party on PowerShell 7's GA release.
|
# ? Mar 6, 2020 18:26 |
|
What's the actual deal with these fanciful new PS releases? They're basically poo poo until baked into an LTS Windows Server release, yeah?
|
# ? Mar 6, 2020 18:36 |
|
Pile Of Garbage posted:What's the actual deal with these fanciful new PS releases? They're basically poo poo until baked-into a LTS Windows Server release yeah?
|
# ? Mar 6, 2020 21:51 |
|
As with previous releases it's really only going to be useful for sysadmin stuff if your entire fleet is running the same version (Which is trivial to do with SCCM or any other package management software but the hard part is getting project funding/UAT/change approval/whatever else is needed so you can blat it out to several thousand devices). This isn't exactly a new problem of course, same issue already exists in environments with mixed OS versions (Server 2003 R2 with v2, Server 2008 R2 with v4, Server 2012 R2 with v5, etc). What's going to suck more so than it already does is the bullshit shims people put in to make their scripts backwards compatible (Chocolatey is a good example of this poo poo if you ever look under the hood). Wish people would just fork their stuff to support new PS environments instead of layering in obscene backwards compatible garbage.
|
# ? Mar 7, 2020 03:40 |
|
I've got a script that uninstalls a package, sets a scheduled task to reboot the machine, and then returns 0 to KACE. On the PS2 machines where the scheduled task stuff isn't supported, I just put in a catch block to just reboot the machine. There's probably a better way to do it, but this gets the job done.
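A minimal sketch of that pattern (task name, timing, and the shutdown arguments are made up; on hosts without the ScheduledTasks cmdlets the Register-ScheduledTask call throws and the catch falls back to an immediate reboot):

```powershell
try {
    # Schedule the reboot a few minutes out so the script can exit cleanly
    $action  = New-ScheduledTaskAction -Execute 'shutdown.exe' -Argument '/r /t 0'
    $trigger = New-ScheduledTaskTrigger -Once -At (Get-Date).AddMinutes(5)
    Register-ScheduledTask -TaskName 'DeferredReboot' -Action $action `
        -Trigger $trigger -ErrorAction Stop
    exit 0   # tell KACE the script actually finished
}
catch {
    # PS2 hosts have no ScheduledTasks module, so just reboot now
    Restart-Computer -Force
}
```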
|
# ? Mar 7, 2020 04:25 |
|
So on the non-v2 machines if the scheduled task creation fails for whatever reason it just straight reboots?
|
# ? Mar 7, 2020 06:49 |
|
I have to clean up a lot of old files that my company has already delivered. Luckily I can get a list of the file names out of SQL of what has been delivered but the files I want to delete are in various subfolders and file types. Basically I have a list containing thing01.pdf and I want to delete (or move) everything named thing01.* from a folder structure. Is there a better/faster way than just foreach through the whole list? The list will probably have a couple hundred thousand filenames and that is just a starting set.
|
# ? Mar 10, 2020 02:04 |
|
Pile Of Garbage posted:So on the non-v2 machines if the scheduled task creation fails for whatever reason it just straight reboots? The scheduled task is a reboot. I want to do an exit 0 so KACE knows the script actually finished. On the v2 machines the task creation fails, so gently caress it, reboot the machine to activate SEP, and I'll worry about the machines stuck in 'Running' state in KACE later. If brute force didn't work, you weren't using enough of it. In this case, I think I'm using just the right amount of brute force.
|
# ? Mar 10, 2020 03:56 |
|
mllaneza posted:The scheduled task is a reboot. I want to do an exit 0 so KACE knows the script actually finished. On the v2 machines the task creation fails, so gently caress it, reboot the machine to activate SEP, and I'll worry about the machines stuck in 'Running' state in KACE later. Ah, SEP. Condolences.
|
# ? Mar 10, 2020 04:25 |
|
CampingCarl posted:I have to clean up a lot of old files that my company has already delivered. Luckily I can get a list of the file names out of SQL of what has been delivered but the files I want to delete are in various subfolders and file types. Basically I have a list containing thing01.pdf and I want to delete (or move) everything named thing01.* from a folder structure. Is there a better/faster way than just foreach through the whole list? The list will probably have a couple hundred thousand filenames and that is just a starting set. Like most things in code, it depends on how you want to handle it. Aside from making sure your code uses the minimum number of loops and such, if you want to improve the speed you may want to split significant chunks of it off into separate processing threads. PowerShell is usually pretty decent about parsing lists and file operations but I'm sure there's some speed to be gained if the list is big enough.
|
# ? Mar 10, 2020 11:36 |
|
CampingCarl posted:I have to clean up a lot of old files that my company has already delivered. Luckily I can get a list of the file names out of SQL of what has been delivered but the files I want to delete are in various subfolders and file types. Basically I have a list containing thing01.pdf and I want to delete (or move) everything named thing01.* from a folder structure. Is there a better/faster way than just foreach through the whole list? The list will probably have a couple hundred thousand filenames and that is just a starting set. For absolutely FUCKALLYUGE deletes and moves PowerShell isn't really your best choice. It's a little too abstract for that stuff to work very well. With that said you have some options! 1. For deletes, break the dirs into multiple blocks and use Start-Job to effectively multithread the delete. 2. This is sad but true - the plain old DOS DEL command is by far the quickest IO for deletes. 3. If you fall back to the .NET EnumerateFiles/EnumerateDirectories methods, they are very quick in that they start returning results right away to PowerShell instead of waiting to grab _all_ the results.
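For reference, a sketch of option 3 ($root and the pattern are placeholders; -WhatIf is included so nothing actually gets deleted):

```powershell
# EnumerateFiles streams matches back as it walks the tree, instead of
# buffering the full result set the way Get-ChildItem -Recurse does.
$root = 'C:\delivered'   # placeholder path
[System.IO.Directory]::EnumerateFiles($root, 'thing01.*', 'AllDirectories') |
    ForEach-Object { Remove-Item -LiteralPath $_ -WhatIf }
```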
|
# ? Mar 11, 2020 16:13 |
|
adaz posted:For absolutely FUCKALLYUGE deletes and moves powershell isnt really your best choice. It's a little too abstract for that stuff to work very well. With that said you have some options!
|
# ? Mar 11, 2020 16:40 |
|
Toast Museum posted:While we're talking about large collections, there's a performance advantage to using the classes in System.Collections.Generic rather than PowerShell's arrays and hashtables. Maybe this has changed in PowerShell 7, but in 5.1, adding to an array or hashtable involves rebuilding the whole collection, which leads to slowdowns that increase with the size of the collection. Is there any actual evidence, like testing with Measure-Command, to support this? I ask because I've been told the same thing before but only for System.Collections.ArrayList and when I asked the same question re evidence I got only silence. It really feels like one of those myths that's just been propagated. I really don't see how something that extends the base System.Array which PS uses would improve performance.
|
# ? Mar 11, 2020 17:02 |
|
Pile Of Garbage posted:Is there any actual evidence, like testing with Measure-Command, to support this? I ask because I've been told the same thing before but only for System.Collections.ArrayList and when I asked the same question re evidence I got only silence. It really feels like one of those myths that's just been propagated. I really don't see how something that extends the base System.Array which PS uses would improve performance. I'll play around with it later today, but if it's a myth, it's one that Microsoft is perpetuating: Microsoft posted:Generating a list of items is often done using an array with the addition operator:
|
# ? Mar 11, 2020 19:58 |
|
The key here I think is what "large" means in this context. From experience it's not really thousands; I don't know whether the tipping point is tens of thousands, hundreds of thousands, or millions of objects.
|
# ? Mar 11, 2020 20:48 |
|
adaz posted:For absolutely FUCKALLYUGE deletes and moves powershell isnt really your best choice. It's a little too abstract for that stuff to work very well. With that said you have some options! I was not aware of that parallel flag so I will definitely look into that.
|
# ? Mar 12, 2020 02:37 |
|
Pile Of Garbage posted:Is there any actual evidence, like testing with Measure-Command, to support this? We used phone timers instead of measure-command, but the end result was that we stopped using += with string arrays at work because of how slow it is.
|
# ? Mar 12, 2020 03:53 |
|
CampingCarl posted:For some reason I just assumed powershell and DEL would use the same underlying method to delete. The problem isn't so much the delete as much as quickly comparing files to the list because I expect there to be some files left over. I figured since that is determined all by the filename in this case powershell would at least be good for generating the list of files and then I can use that to either delete or move. No they definitely don't. Like I know DEL is natively hooked into the kernel APIs that support long (beyond MAX_PATH) paths whereas PowerShell uses the Win32 APIs which don't, for example. Parallel is cool but FYI only works on Windows hosts last I heard. Pile Of Garbage posted:Is there any actual evidence, like testing with Measure-Command, to support this? I ask because I've been told the same thing before but only for System.Collections.ArrayList and when I asked the same question re evidence I got only silence. It really feels like one of those myths that's just been propagated. I really don't see how something that extends the base System.Array which PS uses would improve performance. Like most things, if you're doing it a couple times it won't matter. But if you're adding dozens/hundreds/thousands of items it will matter rapidly. If you look at Microsoft's own implementation of stuff like System.Collections.Generic.List<T>, it internally pre-allocates a backing array with spare capacity and grows it as needed, to avoid allocating a fresh array on every add. You can try it yourself; I added measuring the list to the code examples. code:
adaz fucked around with this message at 18:19 on Mar 12, 2020 |
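A rough Measure-Command sketch of the comparison being discussed (the size is arbitrary and exact numbers will vary by machine):

```powershell
$n = 10000

# Array +=: each append copies the whole array into a new, larger one.
$arrayMs = (Measure-Command {
    $a = @()
    foreach ($i in 1..$n) { $a += $i }
}).TotalMilliseconds

# List[T].Add: the backing buffer grows geometrically, so appends stay cheap.
$listMs = (Measure-Command {
    $l = [System.Collections.Generic.List[int]]::new()
    foreach ($i in 1..$n) { $l.Add($i) }
}).TotalMilliseconds

"array +=: {0:n0} ms   List.Add: {1:n0} ms" -f $arrayMs, $listMs
```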
# ? Mar 12, 2020 18:03 |
|
adaz posted:Parallel is cool but FYI only works on windows hosts last I heard. Just tried it in macOS. code:
code:
Fake edit: I just realized you might be thinking of ForEach -Parallel for Workflows. Yeah, that's Windows-only and Workflows-only.
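A minimal cross-platform sketch of the PowerShell 7 syntax, since the distinction matters here:

```powershell
# ForEach-Object -Parallel runs the script block in parallel runspaces.
# Variables from the caller's scope need the $using: prefix.
$base = 10
$results = 1..5 | ForEach-Object -Parallel {
    $using:base + $_
} -ThrottleLimit 3

($results | Measure-Object -Sum).Sum   # 65
```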
|
# ? Mar 12, 2020 19:07 |
|
Toast Museum posted:Just tried it in macOS. Well this is super awesome. I was indeed thinking of workflows and confusing the two. The fact it works in both is awesome and sort of makes sense since, under the hood, pretty sure it just uses the task parallel library. Nice!
|
# ? Mar 12, 2020 22:18 |
|
It's easily the PowerShell 7 feature I've made the most use of. For dumb company culture reasons, I have to get by without most of the usual enterprise management tools, so code:
|
# ? Mar 13, 2020 12:47 |
This is a silly one but how can you determine all the stores on a machine, e.g. SQL Server, certificates, registry?
|
|
# ? Mar 18, 2020 20:31 |
|
Submarine Sandpaper posted:This is a silly one but how can you determine all the stores on a machine eg sqlserver, certificates, registry Like, detect all common locations for storing configurations or data?
|
# ? Mar 18, 2020 21:25 |
|
Submarine Sandpaper posted:This is a silly one but how can you determine all the stores on a machine eg sqlserver, certificates, registry I think you're looking for get-psprovider
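For reference, a quick look at what that returns (output varies by platform and by which modules are loaded):

```powershell
# Providers are the subsystems that expose data stores as drives.
Get-PSProvider | Select-Object Name, Drives

# The drives themselves (C:, Env:, HKLM:, Cert:, plus anything a module
# like SqlServer adds):
Get-PSDrive | Select-Object Name, Provider, Root
```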
|
# ? Mar 18, 2020 21:48 |
|
I am running the following if statement to figure out if $Destfilename both exists and also is not equal to $File.Name in a ForEach loop. It works most of the time, but I noticed an issue when the filename(s) contains a '!' character. I'm assuming this is a regex issue, but in all my searching I'm not seeing a way to ignore regex and compare string variables as literals? code:
|
# ? Mar 23, 2020 04:50 |
|
BeastOfExmoor posted:I am running the following if statement to figure out if $Destfilename both exists and also is not equal for $File.Name in a ForEach loop. It works most of the time, but I noticed an issue when the filename(s) contains a '!' character. I'm assuming this is a regex issue, but in all my searching I'm not seeing a way to ignore regex and compare string variables as literal? I do not understand. When you use if($Destfilename), you are only checking to see if the variable exists. If you want to see if the file exists, you should be using Test-Path.
|
# ? Mar 23, 2020 06:43 |
|
I tried to mimic what I think your intent is, but don't see any issue. code:
|
# ? Mar 23, 2020 06:55 |
|
Toshimo posted:I do not understand. Sorry, I should've explained that. Earlier in the script I'm grabbing a Get-Childitem list from different directories and building an object consisting of similar, but not exact, filenames, so I already know $DestFilename, if present, exists. The check to see if $DestFilename isn't $null is just a hacky way of solving an earlier logic issue when $SourceDir has more files than $DestDir. Either way, I gave this a look with fresh eyes and ran some more tests, and it appears that the issue is happening somewhere in the code before this and I can't quite get it to recreate if I make a test directory, so I'm honestly not sure what's happening. It's an edge case for a tool I have to run with a bit of human oversight anyway, so I'll ignore for now and come back and try to trace it somewhere down the line. Thanks for the write up.
|
# ? Mar 23, 2020 16:31 |
|
BeastOfExmoor posted:I am running the following if statement to figure out if $Destfilename both exists and also is not equal for $File.Name in a ForEach loop. It works most of the time, but I noticed an issue when the filename(s) contains a '!' character. I'm assuming this is a regex issue, but in all my searching I'm not seeing a way to ignore regex and compare string variables as literal? Does -eq/-ne rely on regex? I didn't think so, but if they do, you could use -like or [string]::equals(string a, string b)
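For what it's worth, -eq and -ne compare literally; it's -match that takes a regex. A quick sketch:

```powershell
# -eq/-ne do a literal (case-insensitive by default) comparison; '!' is fine:
'thing!01.pdf' -eq 'thing!01.pdf'    # True

# -match is the regex operator, so metacharacters need escaping:
'file(1).pdf' -match 'file(1).pdf'                    # False: () form a group
'file(1).pdf' -match [regex]::Escape('file(1).pdf')   # True
```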
|
# ? Mar 23, 2020 20:29 |
|
Is there a better way to build this object? The end result here is importing a list of each mailbox's last logon time into a database. This works: code:
code:
And if I try the Write-SqlTableData command like the first example, I get this error: code:
Would I be better off building the result set a slightly different way in the second example? Or would it make more sense to iterate through the first examples result set, and add the rest of the data for the user?
|
# ? Mar 25, 2020 20:49 |
|
I don't have a lot of advice except: 1. I like using Select-Object to build custom objects: code:
code:
|
# ? Mar 25, 2020 21:16 |
|
I'd try making a PSCustomObject: code:
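Since the snippet didn't survive the copy, here's a hedged sketch of the shape being suggested ($mailboxes, the property names, and the table name are all made-up stand-ins for whatever the mailbox query returns):

```powershell
# Stand-in data: pretend these came from a mailbox statistics query.
$mailboxes = @(
    @{ UserPrincipalName = 'a@contoso.com'; LastLogonTime = Get-Date },
    @{ UserPrincipalName = 'b@contoso.com'; LastLogonTime = Get-Date }
)

# One [PSCustomObject] per row, property names matching the table columns:
$rows = foreach ($mbx in $mailboxes) {
    [PSCustomObject]@{
        UserPrincipalName = $mbx.UserPrincipalName
        LastLogonTime     = $mbx.LastLogonTime
    }
}

# $rows | Write-SqlTableData -ServerInstance $srv -DatabaseName $db `
#     -SchemaName dbo -TableName MailboxLogons   # needs a live SQL instance
```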
|
# ? Mar 25, 2020 21:44 |
|
Thanks guys, I'll play around with those if I get a break from laptop issuing and vpn support tomorrow. I'm trying to learn more powershell since the new job is 90% windows.
|
# ? Mar 26, 2020 01:00 |
Is there a handy shorthand for getting array objects that are values in a hashtable into an array, without messing around with Out-String or some -join command to avoid the string conversion giving a System.Object whatever?
|
|
# ? Apr 29, 2020 19:01 |
|
Submarine Sandpaper posted:Is there a handy shorthand for getting array objects as values in a hashtable into an array without messing around out-string or some -join command to avoid the string conversion giving a system.object.whatever? See the ExpandProperty parameter for Select-Object: https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/select-object?view=powershell-7#parameters Example: code:
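Something along these lines (the property names are invented for illustration):

```powershell
$obj = [PSCustomObject]@{ Names = @('alice','bob','carol') }

# Plain -Property wraps the value in a new object, which stringifies badly:
$obj | Select-Object -Property Names      # an object with a Names property

# -ExpandProperty hands back the raw array itself:
$names = $obj | Select-Object -ExpandProperty Names
$names.Count   # 3
```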
|
# ? Apr 29, 2020 19:16 |
|
So you have a hashtable like this? code:
code:
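A sketch of the kind of flattening being described (the key and value names are placeholders):

```powershell
$ht = @{
    GroupA = @('alice','bob')
    GroupB = @('carol')
}

# .Values enumerates each stored array; emitting them from ForEach-Object
# unrolls them into one flat array:
$all = @($ht.Values | ForEach-Object { $_ })
$all.Count   # 3
```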
|
# ? Apr 29, 2020 19:17 |
|
That's perfectly valid. Edit: for further reading as usual I recommend the PS about topics, specifically the one for hashtables: https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_hash_tables?view=powershell-5.1 Pile Of Garbage fucked around with this message at 19:24 on Apr 29, 2020 |
# ? Apr 29, 2020 19:22 |
|
Oh I half assed my question. Specifically, when converting to CSV it attempts to .ToString() everything, so if $hashtable is a value in an array that field returns System.Collections.Hashtable. So when I'm building my report I'm doing some absurd poo poo like ($hashtable | Out-String).trim() Submarine Sandpaper fucked around with this message at 04:34 on Apr 30, 2020 |
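One non-Out-String way to keep the CSV readable is flattening the hashtable into a delimited string before export (a sketch; the column names are made up):

```powershell
$ht = @{ a = 1; b = 2 }

$row = [PSCustomObject]@{
    Name    = 'example'
    # Flatten key=value pairs so ConvertTo-Csv doesn't emit the type name:
    Details = ($ht.GetEnumerator() |
        ForEach-Object { "$($_.Key)=$($_.Value)" }) -join '; '
}

$row | ConvertTo-Csv -NoTypeInformation
```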
|
# ? Apr 30, 2020 04:31 |