|
Jethro posted:I had no idea about , and the true meaning of @() until I looked it up the other day, so thank you, New Yorp New Yorp, for giving me a reason to finally learn about one of the parts of Powershell that no one gets right in posted snippets. The unary operator still doesn't behave "correctly" (for my personal definition of correctly, of course) when piping into ConvertTo-Json. It's definitely good to know, but it's not a solution to the specific scenario I was bitching about. code:
|
# ? Feb 4, 2019 17:53 |
|
|
|
Maybe ConvertTo-Json is just busted, like how -WhatIf is silently ignored on a bunch of AD cmdlets.
|
# ? Feb 5, 2019 02:35 |
|
Any of you folks use Chocolatey? I've got an Oracle client that is being an absolute fucker and I'm fairly sure that I'm missing something. We have a response file for the install, but the executable requires a full path to said file. No ./tools/foo.rsp here. I can run the install straight from PS just fine, but once it goes into Chocolatey it breaks all to hell; the exit code is consistent with being unable to find the response file. I've copied the file to C: just to have it in a static place outside of the packaging process, but it still fails during the packaged install. Anyone have any suggestions on potential next steps? I'm currently out of ideas.
|
# ? Feb 15, 2019 16:42 |
|
Trying to construct a function with some parameter sets; not sure if what I want to do is actually possible. This is a "get" function. By default I want it to get "all" the fields, but I want to allow using -AllFields:$false to only get "default" values. I ALSO want to, in a separate parameter set, allow a -Fields parameter where the user can specify which fields they want. It wouldn't make sense to pass AllFields and Fields in the same command, so I'd like to use parameter sets to prevent that... So I wrote this, but it doesn't seem very elegant: checking if AllFields was specified and acting on it, else checking if Fields was specified and acting on that, else building the default URL. And using a DefaultParameterSetName that doesn't exist. code:
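The layout being described could be sketched like this (function name, endpoint, and field values are illustrative, not the poster's actual code):

```powershell
function Get-Thing {
    [CmdletBinding(DefaultParameterSetName = 'AllFields')]
    param(
        # Defaulting a switch to $true is what the poster wants, even though
        # PSScriptAnalyzer flags it; -AllFields:$false then picks the default fields.
        [Parameter(ParameterSetName = 'AllFields')]
        [switch]$AllFields = $true,

        # Mutually exclusive with -AllFields because it lives in its own set
        [Parameter(ParameterSetName = 'NamedFields')]
        [string[]]$Fields
    )

    switch ($PSCmdlet.ParameterSetName) {
        'AllFields'   { $query = if ($AllFields) { '*' } else { 'default' } }
        'NamedFields' { $query = $Fields -join ',' }
    }
    "https://example.com/api?fields=$query"   # hypothetical endpoint
}
```

With DefaultParameterSetName pointing at a set that actually exists, the switch on $PSCmdlet.ParameterSetName only has to define the "default" behavior once.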
|
# ? Feb 20, 2019 00:11 |
|
Help me save my own sanity. I’m trying to scrape for a specific device on specific PCs using WMI wrapped in an Invoke-Command. If I do it manually, PC by PC, it works. It’s only when I add in Import-CSV file.csv | that it returns nothing. The CSV is just:
Name
PC1
PC2
PC3
Ideas?
|
# ? Feb 20, 2019 02:09 |
|
FISHMANPET posted:Trying to construct a function with some parameter sets, not sure if what I want to do is actually possible. This is a "get" function. By default I want it to get "all" the parameters, but I want to allow using -AllFields:$false to only get "default" values. I ALSO want to, in a separate parameter set, allow a -Fields parameter where the user can specify which fields they want. It wouldn't make sense to pass AllFields and Fields in the same command so I'd like to use Parameter Sets to prevent that... So I wrote this but it doesn't seem very elegant. Checking if AllFields was specified then acting on it, else checking if fields was specified and acting on that, else building the default URL. And using a DefaultParameterSetName that doesn't exist. Edit: Ignore this. Mario has it below. Irritated Goat posted:Help me save my own sanity. I’m trying to scrape for a specific device on specific PCs using WMI wrapped in an invoke-command. If I do it manually PC by PC, it works. It’s only when I add in Import-CSV file.csv | that it returns nothing. Can you tell if it's misbehaving on your computer, or on the remote computer? Are you passing local variables to the remote computer with $using:variable syntax? Credentials being passed successfully? Do you know at what point in your script/import it's failing? One thing that helps me when troubleshooting a new script is having Write-Host "Some-Command -and $argument1 -but $argument2 | Do-something -useful" right before whatever the command is, so I can see what the script is running versus what I think it should be running. All of a sudden I'm passing a $null argument? Easy to see. sloshmonger fucked around with this message at 03:42 on Feb 20, 2019
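The Write-Host trick, in minimal form (command and variable names are hypothetical):

```powershell
$computer = $row.Name   # suppose this unexpectedly came through empty

# Echo the exact command about to run; a $null argument shows up immediately
# as -ComputerName '' in the console output.
Write-Host "Invoke-Command -ComputerName '$computer' -ScriptBlock { Get-WmiObject ... }"

Invoke-Command -ComputerName $computer -ScriptBlock { Get-WmiObject Win32_PnPEntity }
```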
# ? Feb 20, 2019 02:32 |
|
FISHMANPET posted:I guess I could also do a switch statement on $PSCmdlet.ParameterSetName but that doesn't change the meat of the problem which is that I'm having to define my "default" twice. I also don't know which is better coding practice, what I've got above or switch on ParameterSetName. code:
Mario fucked around with this message at 03:08 on Feb 20, 2019 |
# ? Feb 20, 2019 02:37 |
|
Irritated Goat posted:Help me save my own sanity. I’m trying to scrape for a specific device on specific PCs using WMI wrapped in an invoke-command. If I do it manually PC by PC, it works. It’s only when I add in Import-CSV file.csv | does it return nothing. Is that a scope issue? Yeah, that's probably scope. At a guess, you're probably importing the CSV in one function and trying to do something with the contents in another.
|
# ? Feb 20, 2019 07:36 |
|
Mario posted:
Maybe I'm not quite understanding, because if you set the ValidateSet to only $True then you can't do -param:$false. Now granted, setting switches to true by default is apparently against best practice, so I guess it's kind of weird that I'm asking "what's the best way to follow best practices when I'm violating best practices." The reason I'm doing this is my team has a publicly available PowerShell module to interact with data from Google Sheets: https://github.com/umn-microsoft-automation/UMN-Google but I'm sure very few, if any, people are using it. And the function I'm working on, Get-GFilePermissions, is currently so useless that even if you're using the module, you're probably not using that function. But it's there, and it does something, and so it's possible that someone is using it. So I'm changing the function to be slightly smarter, and debating how much I want to preserve the current format of the data that it returns vs improving it to be actually useful (but making it easy for someone to fix their code if they want to keep using it the old dumb way?). I'm most assuredly thinking way too hard about this dumb cmdlet because I could make it work and make it pull all the data by adding 9 characters, but damnit I wanna do it right!
|
# ? Feb 20, 2019 17:51 |
|
mllaneza posted:Is that a scope issue ? Yeah, that's probably scope. At a guess, you're probably importing the CSV in one function and trying to do something with the contents in another. Now that I'm at a PC, the code I'm using is: code:
If I do code:
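For comparison, the usual shape of that kind of pipeline looks roughly like this (device filter and property names are hypothetical; the poster's actual snippet wasn't quoted):

```powershell
# Import-Csv yields objects with a .Name property (from the header row),
# not bare strings, so the computer name has to be dereferenced explicitly.
Import-Csv .\file.csv | ForEach-Object {
    Invoke-Command -ComputerName $_.Name -ScriptBlock {
        Get-WmiObject Win32_PnPEntity | Where-Object { $_.Name -like '*Fingerprint*' }
    }
}
```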
|
# ? Feb 20, 2019 20:11 |
|
Not sure if typo, but it should be code:
code:
|
# ? Feb 20, 2019 20:16 |
|
The Fool posted:Not sure if typo, but it should be Typo. I always get those swapped till Powershell yells at me.
|
# ? Feb 20, 2019 20:19 |
FISHMANPET posted:Maybe I'm not quite understanding, because if you set the ValidateSet to only $True then you can't do -param:$false. Now granted, setting switches to true by default is apparently against best practice so I guess it's kind of weird that I'm asking "what's the best way to follow best practices when I'm violating best practices." code:
Submarine Sandpaper fucked around with this message at 20:46 on Feb 20, 2019 |
|
# ? Feb 20, 2019 20:24 |
|
I decided I was probably overthinking this, and also not really keeping in line with the philosophy of the rest of the modules, so I went with this. Generally speaking, with the rest of the cmdlets, we'll grab everything and return it to you and let you filter it locally rather than trying to construct a REST call to get only what you specify. So if you only want some of the fields, just get them all and select-object those properties. By default it will append /?fields=* to the URI, unless you specify -Default fields in which case it will leave that out. It also lets you specify a PermissionID which gets you one specific permission object rather than all permissions for a file. code:
|
# ? Feb 20, 2019 21:00 |
|
Irritated Goat posted:Now that I'm at a PC, the code I'm using is: Stop doubling up on Select-Object and use Where-Object instead of Select-String. Something like code:
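The suggested shape, roughly (the filter string and selected properties are placeholders):

```powershell
Get-WmiObject Win32_PnPEntity -ComputerName $computer |
    Where-Object { $_.Name -like '*Fingerprint*' } |   # filter on object properties,
    Select-Object Name, Status, DeviceID               # then project once, at the end
```

Where-Object keeps the pipeline working on real objects; Select-String would coerce everything to text and match on the string form instead.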
|
# ? Feb 20, 2019 21:27 |
|
FISHMANPET posted:Maybe I'm not quite understanding, because if you set the ValidateSet to only $True then you can't do -param:$false. Now granted, setting switches to true by default is apparently against best practice so I guess it's kind of weird that I'm asking "what's the best way to follow best practices when I'm violating best practices."
|
# ? Feb 21, 2019 03:27 |
|
Yeah there's some vmware cmdlets that have -confirm be true by default, and to override that you need to specify -confirm:$false and so I thought I'd be clever and do that, but according to PSScriptAnalyzer setting switches to true by default is bad practice so I decided to stop overthinking it.
|
# ? Feb 21, 2019 03:30 |
|
Have any of you guys played around with PowerShell Core (6) on Linux or Mac OS X? How was it? Bonus points if you've also used it with OMI. I don't want to bash Bash (ehhhh), but I like the structured data that PowerShell offers a lot more than the pure text of UNIX-based systems. If I can manage a system just as well with pwsh, I don't see why I shouldn't.
|
# ? Mar 1, 2019 22:30 |
|
FISHMANPET posted:Yeah there's some vmware cmdlets that have -confirm be true by default, and to override that you need to specify -confirm:$false and so I thought I'd be clever and do that, but according to PSScriptAnalyzer setting switches to true by default is bad practice so I decided to stop overthinking it. Did it complain about setting switch arguments in general or about specifically setting Confirm switch arguments? The former doesn't make sense, but the latter kind of does, given how the Confirm argument is built-in and exposed when SupportsShouldProcess is enabled. This page provides a better run-down of how it works: https://docs.microsoft.com/en-us/powershell/developer/cmdlet/requesting-confirmation-from-cmdlets#supporting-confirmation-requests. From what I understand, if you're wrapping cmdlets which support should-process then you should be setting the $ConfirmPreference variable appropriately so that it can be read by the wrapped cmdlet. Edit: to maybe clarify, the intention is that things like confirmation and what-if should be passed through from the wrapping script/method. ThatNateGuy posted:Have any of you guys played around with PowerShell Core (6) on Linux or Mac OS X? How was it? Bonus points if you've also used it with OMI. I don't want to bash Bash (ehhhh) but I like the structured data that PowerShell offers a lot more than the pure text of UNIX-based systems. If I can manage a system just as well with pwsh, I don't see why I shouldn't. I haven't tried PowerShell Core, but I do have a deep dislike of Bash scripting due to its exceptionally obtuse syntax. I guess the usefulness of PowerShell Core will depend on how much of .NET has been ported across to .NET Core.
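The SupportsShouldProcess pattern from the linked docs, in minimal form (the function and its body are illustrative):

```powershell
function Remove-Widget {
    # SupportsShouldProcess wires up -WhatIf/-Confirm automatically;
    # ConfirmImpact vs. $ConfirmPreference decides whether a prompt appears.
    [CmdletBinding(SupportsShouldProcess, ConfirmImpact = 'High')]
    param([string]$Name)

    if ($PSCmdlet.ShouldProcess($Name, 'Remove')) {
        # destructive work goes here; wrapped cmdlets that also support
        # should-process inherit the caller's preference
    }
}
```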
|
# ? Mar 2, 2019 15:01 |
|
I had to fix a module that called a library because it was all lower case in the psm1 file but mixed case in the actual file name and that actually matters on Linux. I was *this* close to actually making a github account so I could ask them to fix it.
|
# ? Mar 3, 2019 03:56 |
|
I'm having a problem with trying to get a split or another way to break up a string to work. Overall, what I'm trying to do is go to my (GMAIL) mailbox and run a report on all the messages that are in there, including the file names of attachments. Here's what I have.code:
Unfortunately, the split method doesn't work on $ContentType - according to powershell, it does not contain that method. I also tried using a substring method, but that is also not present. I have tried converting the variable to string by using [string]$ContentType, but that didn't really do anything As an example, this is what $ContentType contains if there is an attachment: Content-Type: application/octet-stream; name=lead_attachment.xml I have a second issue where there could be multiple instances of Content-Type in the header, but if I can't get it to pull the string data anyway there's not much point in getting it to narrow down the right value.
|
# ? Mar 5, 2019 16:51 |
|
$ContentType.GetType() will tell you what that object actually is. If you just "call" the variable in your shell or use Write-Host to print it to the console, it will convert it to a string to display, but that error means it's probably not a string. It should have a .ToString() method if you really want to do string manipulation on it, but you may be able to do more with the original object in its native type. You can also pipe it into Get-Member ($ContentType | Get-Member) and it will show you the type as well as all properties and methods of the object.
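Concretely, those inspection steps look like:

```powershell
$ContentType.GetType().FullName   # the underlying .NET type name
$ContentType | Get-Member         # every property and method on the object
$ContentType.ToString()           # explicit string form, if string ops are needed
```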
|
# ? Mar 5, 2019 17:29 |
|
I tried the GetType and it turns out this is a System.ValueType and if I do a .Value, it returns that it is multipart/mixed. Which I guess explains part of the problem. So, I can type $ContentType.Value(), so how would I go about selecting just the second or third part of this multipart record?
|
# ? Mar 5, 2019 18:11 |
|
Hard to say without having an object of that type to play around with in the console, but everything is an object, so if it's got a Value property, you can do "$ContentType.Value | get-member" and see what's there. And so on and so on, it's objects all the way down. Also you wouldn't use () after Values, because Values is a property not a method. So GetType() is a method so it needs the parenthesis (because methods can potentially take parameters) but properties don't take parameters so you'd get a failure if you tried Values().
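The distinction in one place:

```powershell
$ContentType.GetType()   # method call: parentheses required
$ContentType.Value       # property access: no parentheses
# $ContentType.Value()   # would fail: Value is a property, not a method
```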
|
# ? Mar 5, 2019 18:28 |
|
CzarChasm posted:I'm having a problem with trying to get a split or another way to break up a string to work. Overall, what I'm trying to do is go to my (GMAIL) mailbox and run a report on all the messages that are in there, including the file names of attachments. Here's what I have. You may be able to use the PowerShell -split operation rather than the .NET .split() method code:
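Applied to the header value quoted earlier, the operator version would look something like this:

```powershell
$raw = 'Content-Type: application/octet-stream; name=lead_attachment.xml'

# -split works on anything coercible to a string, unlike the .Split() method,
# which only exists on objects that actually have it.
$parts = $raw -split ';'
($parts[1].Trim() -split '=')[1]   # lead_attachment.xml
```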
|
# ? Mar 5, 2019 18:28 |
|
Thanks for the help, the -split managed to get me what's in the field in a way I can use it.
|
# ? Mar 5, 2019 21:18 |
|
Finally updated to 5.1 and immediately got to use a class, which is super handy. Not that custom objects didn't work before, but this format is familiar and reads better.
|
# ? Mar 20, 2019 15:49 |
|
Ok, so I'm trying to write up a tool we can use in our lab to quickly grab important info from the lab machines and display it in a way that our non-technical techs can handle. I'm not 100% on how the UI stuff works, though. I've got a working form, and it populates great, but when I moved the form population from the end of the scan to updating after each machine, it basically locks me out of the form until the scan completes. I can still see it populating, but the form won't respond to any inputs (I have to close the underlying powershell window to kill it). https://pastebin.com/h2nb50W9
|
# ? Mar 20, 2019 22:30 |
|
Toshimo posted:Ok, so I'm trying to write up a tool we can use in our lab to quickly grab important info from the lab machines and display it in a way that our non-technical techs can handle. I'm not 100% on how the UI stuff works, though. I've got a working form, and it populates great, but when I moved the form population from the end of the scan to updating after each machine, it basically locks me out of the form until the scan completes. I can still see it populating, but the form won't respond to any inputs (I have to close the underlying powershell window to kill it). PowerShell isn't meant to do UIs; it's a console scripting language which performs best in situations with as little interaction as possible. In fact, in its truest form PowerShell is meant to run without interaction whilst doing the heavy lifting for things related to automation, orchestration and reporting. The only reason that it can do UIs at all is because it can leverage the entire .NET Framework, which happens to include classes for creating UIs. Maybe it'd be easier to just have PowerShell output the data you require into a location where it can be consumed by a real front-end application.
|
# ? Mar 22, 2019 13:34 |
|
That's exactly why we're lifting our json data into postgres. Powershell is good at consuming and writing json to ingest, transform, and act on data, but it's really not good at displaying it anywhere other than custom powershell views in the console. I have been known to do an out-gridview here and there in a pinch but it's not my favorite.
|
# ? Mar 22, 2019 13:48 |
|
Toshimo posted:Ok, so I'm trying to write up a tool we can use in our lab to quickly grab important info from the lab machines and display it in a way that our non-technical techs can handle. I'm not 100% on how the UI stuff works, though. I've got a working form, and it populates great, but when I moved the form population from the end of the scan to updating after each machine, it basically locks me out of the form until the scan completes. I can still see it populating, but the form won't respond to any inputs (I have to close the underlying powershell window to kill it). It's usually easier to just use C# once you're doing GUI stuff.
|
# ? Mar 22, 2019 13:55 |
|
Is there a simple way to install a standalone version of the SQL Server Module that I'm missing? MS docs say you have to download it from their servers via Install-Module, but all that's blocked on our network. We've attempted extracting the module from a SQL full install and manually doing it, but it becomes a nightmare of needing random .dlls and other files.
|
# ? Mar 22, 2019 15:13 |
|
You should figure out how to get that unblocked, it's like saying "work blocks Ubuntu update servers how do I apt-get install anything"
|
# ? Mar 22, 2019 16:25 |
|
They also block Github and Chocolatey. I was explicitly hired to work on the latter and have to use a personal laptop/wayback machine to reference docs. So not gonna happen.
|
# ? Mar 22, 2019 16:44 |
|
mystes posted:You need to run ShowDialog in a separate thread using Start-Job or something but it's kind of annoying in powershell. Thanks, I'll give it a shot. I don't really have the excess time and bandwidth to develop apps in another language, so I'm kinda at the point where if I can't make something work with the tools I've got, I'm just gonna not bother.
|
# ? Mar 22, 2019 17:25 |
|
Warbird posted:They also block Github and Chocolatey. I was explicitly hired to work on the latter and have to use a personal laptop/wayback machine to reference docs. So not gonna happen. From said other machine Save-Module then copy it over.
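The Save-Module route, sketched (paths are examples):

```powershell
# On the machine that can reach the PowerShell Gallery:
Save-Module -Name SqlServer -Path C:\Temp

# Then copy C:\Temp\SqlServer into a folder on $env:PSModulePath on the
# offline machine, e.g. C:\Program Files\WindowsPowerShell\Modules,
# after which it loads like any installed module:
Import-Module SqlServer
```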
|
# ? Mar 22, 2019 18:51 |
|
That appears to be the way to go, but you're then running into some dependencies on having NuGet installed and some other weirdness. I'll keep playing with it, but I suspect we're just going to use a full SQL Server 2017 install with just the tools and then do the standalone SSRS install before running the post-install config steps.
|
# ? Mar 22, 2019 19:11 |
|
Toshimo posted:Thanks, I'll give it a shot. I don't really have the excess time and bandwidth to develop apps in another language, so I'm kinda at the point where if I can't make something work with the tools I've got, I'm just gonna not bother. code:
mystes fucked around with this message at 20:11 on Mar 22, 2019 |
# ? Mar 22, 2019 20:08 |
|
Warbird posted:They also block Github and Chocolatey. I was explicitly hired to work on the latter and have to use a personal laptop/wayback machine to reference docs. So not gonna happen. To be clear that's the stupidest loving thing in the world and if I were you I'd be looking for another job if they've hired you to do something explicitly and then explicitly prevented you from doing that thing. Because yeah as you've figured out there's no good way to reinvent packaging.
|
# ? Mar 22, 2019 21:08 |
|
|
|
Oh believe me I'm either out of here or increasing my rate by a nice chunk when this contract is up in a few months.
|
# ? Mar 22, 2019 21:19 |