Toshimo
Aug 23, 2012

He's outta line...

But he's right!
So, this may sound dumb, and that's probably because it is, but uhhh let me kinda describe what I do and what I'm thinking about doing about that, so please give me enough room to hang myself before you bring out the pitchforks.

My team's sole focus is enterprise deployment of install scripts through CM, and we are like 95%+ PowerShell, with a tiny fraction of MSIs (I've done like 1 or 2 MSIs in my first year).

We have 5 people on the team and 1 shared "Standard" library that we include, which, like... sets up a few variables and logging, and has a handful of poorly documented "common" functions. Otherwise, I'm just sort of left to code each new thing from scratch, in part because this team moved over from WISE Scripting a few years back and a lot of the code from the first couple of years of PowerShell has not... aged gracefully.

All our applications are stored on a big ol' network share, scripts and payloads and everything, for posterity. I'm free to paw through and use anything I like, but that's largely dependent on having the institutional knowledge of how each app was done to find stuff for reuse, and again, a lot of the older stuff isn't great.

My working theory is to beg a Bitbucket setup off the team that manages that (even though it's not really supposed to be used for this) and at least get all our PS code up into it, so we've got something reasonably searchable and maybe there's a little more standardization and a little less reinventing the wheel.

I won't be able to put all the payload files up there, I expect, but maybe I can also export all the applications from CM and throw the XML up there as well (we've really only just started actually exporting and removing old obsolete CM applications, we had ~7 years of Everything Ever Written still clogging up CM before Microsoft put their foot down and told us the servers were keeling over and they weren't going to be able to support our infrastructure if we kept doing that).

Does any of this sound logical? I've got a lot of management and team goodwill I can burn, but effectively a $0 budget, and probably a million security restrictions, but I can't see us all just operating our 5 little independent code fiefdoms forever.


The Fool
Oct 16, 2003


Get stuff organized into repos ASAP. If your org is already on Bitbucket it's probably fine, but Azure DevOps is free for 5 users and you'll get some lightweight project management, repos, and pipelines.

Azure DevOps pipelines work amazingly well as task runners.

Toshimo
Aug 23, 2012


The Fool posted:

Get stuff organized into repos ASAP. If your org is already on Bitbucket it's probably fine, but Azure DevOps is free for 5 users and you'll get some lightweight project management, repos, and pipelines.

Azure DevOps pipelines work amazingly well as task runners.


Thanks.

I know like... 1 of those words, but I'll start looking into it. I'm here for the next 20ish years, most likely, so I've got time to grow it. I don't think we're going to get any MS stuff for free, even if only my team of 5 wanted to use it, though. But if I figure out what I'm doing, I can probably get any MS thing eventually, since we've got like a literal billion dollar MS contract.

adaz
Mar 7, 2009

Toshimo posted:

So, this may sound dumb, and that's probably because it is, but uhhh let me kinda describe what I do and what I'm thinking about doing about that, so please give me enough room to hang myself before you bring out the pitchforks.

My team's sole focus is enterprise deployment of install scripts through CM, and we are like 95%+ PowerShell, with a tiny fraction of MSIs (I've done like 1 or 2 MSIs in my first year).

We have 5 people on the team and 1 shared "Standard" library that we include, which, like... sets up a few variables and logging, and has a handful of poorly documented "common" functions. Otherwise, I'm just sort of left to code each new thing from scratch, in part because this team moved over from WISE Scripting a few years back and a lot of the code from the first couple of years of PowerShell has not... aged gracefully.

All our applications are stored on a big ol' network share, scripts and payloads and everything, for posterity. I'm free to paw through and use anything I like, but that's largely dependent on having the institutional knowledge of how each app was done to find stuff for reuse, and again, a lot of the older stuff isn't great.

My working theory is to beg a Bitbucket setup off the team that manages that (even though it's not really supposed to be used for this) and at least get all our PS code up into it, so we've got something reasonably searchable and maybe there's a little more standardization and a little less reinventing the wheel.

I won't be able to put all the payload files up there, I expect, but maybe I can also export all the applications from CM and throw the XML up there as well (we've really only just started actually exporting and removing old obsolete CM applications, we had ~7 years of Everything Ever Written still clogging up CM before Microsoft put their foot down and told us the servers were keeling over and they weren't going to be able to support our infrastructure if we kept doing that).

Does any of this sound logical? I've got a lot of management and team goodwill I can burn, but effectively a $0 budget, and probably a million security restrictions, but I can't see us all just operating our 5 little independent code fiefdoms forever.

I was in sort of your position 6 years or so ago, and your steps seem pretty logical. Here are some concrete steps I'd personally try - you're effectively trying to move your team from a disconnected bunch of people into a dev team:


GitLab offers free private repos as well and has a bit more of a mature devops stack too if you ever get into that. It's my personal favorite nowadays. Regardless...

1. Organize all your stuff into a repo.
2. Give it some sort of structure! Maybe around... program install type? Developer? Doesn't matter!
3. Set a default branch!
4. Tell folks to MR/PR in requests for new code, and let everyone peer review them to spread knowledge/best practices/code reuse!
5. Start insisting on documentation - PowerShell has pretty good built-in documentation abilities (comment-based help) in its scripts.
6. Set up a template everyone should use for their scripts that pulls in the (hopefully now) better-documented common functions.
7. More Advanced: you can add GitLab/ADO runners that do stuff like static code analysis to check for best practices when people put in PRs for install scripts!
8. Even More Advanced: eventually, you can have it automatically deploy the script/package to CM on merge and set up the advertisement package (is that what they still call it?), etc... and nobody ever manually touches the CM console ever again, all done through code :)
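For #5/#6, a bare-bones template might look something like this sketch (every name in it is a placeholder, including the shared module and its Write-Log function):

```powershell
<#
.SYNOPSIS
    Installs ExampleApp 1.0 - template only; all names here are made up.
.NOTES
    Author, date, and change history go here.
#>
[CmdletBinding()]
param()

# "DeployCommon" is a hypothetical name for the team's shared module
Import-Module -Name "$PSScriptRoot\DeployCommon.psm1"

Write-Log 'Starting install.'   # Write-Log assumed to come from the shared module
# ... actual install steps ...
Write-Log 'Install complete.'
exit 0
```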

Toshimo
Aug 23, 2012

Yeah, like today I wanted to stop a service in a script and I was like... I think I remember doing this before but idk which script it was in. So, I had to think it all out and do it all in steps because you can't just "stop a service".

It's:
  1. Verify the service exists.
  2. Bail if it doesn't.
  3. Verify the service is running.
  4. Log that you are stopping the service.
  5. Set the wait duration.
  6. Stop the service as a job.
  7. Cycle a while loop every second to check if it's still running.
  8. Bail and Log if the job fails.
  9. After the wait duration elapses, check if the service is running.
  10. If it is, log and bail.
  11. If it's good and stopped, log and continue.

And then do it all in reverse at the end of the script to start the service again.

I set myself a reminder to pull that code out and save it in my snippets folder at the end of the week (I've been in PowerBI training all week, so my bandwidth until Friday is limited or I'd do it while it's fresh).
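Roughly sketched out, those steps turn into something like this (Write-Log is a hypothetical stand-in for whatever logging function you use, and this uses a plain Stop-Service plus a polling loop rather than the job-based version):

```powershell
function Stop-ServiceWithTimeout {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)][string]$Name,
        [int]$TimeoutSeconds = 30
    )
    # 1-2: verify the service exists, bail if it doesn't
    $svc = Get-Service -Name $Name -ErrorAction SilentlyContinue
    if (-not $svc) { Write-Log "Service $Name not found; skipping."; return $false }

    # 3: already stopped? nothing to do
    if ($svc.Status -ne 'Running') { Write-Log "Service $Name already stopped."; return $true }

    # 4-6: log, then request the stop
    Write-Log "Stopping service $Name (timeout ${TimeoutSeconds}s)."
    Stop-Service -Name $Name -ErrorAction SilentlyContinue

    # 7-9: poll once a second until stopped or the wait duration elapses
    $deadline = (Get-Date).AddSeconds($TimeoutSeconds)
    while ((Get-Date) -lt $deadline) {
        if ((Get-Service -Name $Name).Status -eq 'Stopped') {
            Write-Log "Service $Name stopped."   # 11: good, continue
            return $true
        }
        Start-Sleep -Seconds 1
    }

    # 10: still running after the timeout - log and bail
    Write-Log "Service $Name failed to stop within ${TimeoutSeconds}s."
    return $false
}
```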

adaz
Mar 7, 2009

Toshimo posted:

Yeah, like today I wanted to stop a service in a script and I was like... I think I remember doing this before but idk which script it was in. So, I had to think it all out and do it all in steps because you can't just "stop a service".

It's:
  1. Verify the service exists.
  2. Bail if it doesn't.
  3. Verify the service is running.
  4. Log that you are stopping the service.
  5. Set the wait duration.
  6. Stop the service as a job.
  7. Cycle a while loop every second to check if it's still running.
  8. Bail and Log if the job fails.
  9. After the wait duration elapses, check if the service is running.
  10. If it is, log and bail.
  11. If it's good and stopped, log and continue.

And then do it all in reverse at the end of the script to start the service again.

I set myself a reminder to pull that code out and save it in my snippets folder at the end of the week (I've been in PowerBI training all week, so my bandwidth until Friday is limited or I'd do it while it's fresh).

For stuff like this it's really a good idea to keep an index of your functions and what they do - like a README.md in the directory. Also, naming conventions help; I try to follow PowerShell's own Verb-Noun suggestions. And then organize them into a module, which you can import into your scripts and use over and over, yay.

Public documentation of your own "internal API" of stuff is for sure one of the best benefits of getting a template/index/naming convention in place.
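You can even generate that index straight from the comment-based help; something like this sketch (the module name is a placeholder):

```powershell
# Build a README index from each exported function's .SYNOPSIS
# ("DeployCommon" is a hypothetical module name)
Import-Module .\DeployCommon.psm1

Get-Command -Module DeployCommon | ForEach-Object {
    "## $($_.Name)"                   # markdown heading per function
    (Get-Help -Name $_.Name).Synopsis # pulled from comment-based help
    ""                                # blank line between entries
} | Set-Content -Path .\README.md
```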

Zaepho
Oct 31, 2013

Toshimo posted:

We have 5 people on the team and 1 shared "Standard" library that we include with like... sets up a few variables and logging, and has a handful of poorly documented "common" functions. Otherwise, I'm just sort of left to code each new thing from scratch, in part because this team moved over from WISE Scripting a few years back and a lot of the code from the 1st couple of years of PowerShell has not... aged gracefully.

I'm going to address just this bit here. If y'all aren't using https://github.com/PSAppDeployToolkit/PSAppDeployToolkit you should really take a look at it. It may help with a ton of the common library stuff. It also helps encourage building uninstalls alongside the installs.
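To give a flavor of it: the toolkit's Deploy-Application.ps1 mostly just calls its built-in functions, roughly like this (v3-style function names; the app and MSI details are made up):

```powershell
# Inside PSAppDeployToolkit's Deploy-Application.ps1, the toolkit handles
# prompting, logging, and MSI plumbing for you:
Show-InstallationWelcome -CloseApps 'excel,winword' -CloseAppsCountdown 300
Execute-MSI -Action 'Install' -Path 'ExampleApp.msi'
Write-Log -Message 'ExampleApp installed.' -Source 'Deploy-Application'
```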

Toshimo
Aug 23, 2012


Zaepho posted:

I'm going to address just this bit here. If y'all aren't using https://github.com/PSAppDeployToolkit/PSAppDeployToolkit you should really take a look at it. It may help with a ton of the common library stuff. It also helps encourage building uninstalls alongside the installs.

We absolutely are not, and I'll take a look at it, thanks.

Part of the problem with moving the group wholesale to any new platform at this point is that we're very beholden to a lot of legacy logging requirements (self-inflicted, not mandated by management), so everything has to log "just so". Unfortunately, each team member does it ~just a little differently~ and even vary it up by script, which is part of why I want to standardize. I figure it's probably a three-step process at this point (for that particular thing): Step 1 is getting all our shared code synched up; Step 2 is to start finding a library system to move to so we aren't writing it ourselves all the time; Step 3 is to migrate our logging to use something compatible with the new system.

A good example of a basic problem I'm trying to fix:

  • Our logging writes to ~4 separate log files of varying verbosity by default, with some scripts adding more. This is in addition to any CM logs generated.
  • These 4 log files are written for different audiences and obscured away at different levels based on technical competency of the target audience.
  • One of the log files, the most public-facing, and the one we recommend end users and low-tier techs check, is written at script exit with an extension based on script function and status, replacing all previous logs of that type, regardless of extension.
  • The typical extensions are .S (successful install), .F (Any failure), .U (successful uninstall), .I (script incomplete, probably pending reboot).
  • Buuuuuttttt, there's no standard for what these logs contain. Some of them are 0-byte empty files, some include return codes, some include timestamps, some have human-readable summaries; it varies not only by who wrote it, but by how they were feeling that day.

I was asked during QC on a script by our senior team member why I had a line at the end of my script that returned a slightly verbose text description alongside my return code, and I replied that I thought it was good practice to put something in the short, public-facing log that would be informative to the casual observer, and that I was a bit concerned that a lot of them were empty or just a return code. He replied with "I don't know the last time I ever looked at one of those short logs, we just go straight to the verbose ones". I told him that we had 100,000 users and that there were 5 of us, so I thought it was important for the other 99,995 people who would look at a log to have something meaningful, especially if they were reporting a problem to the help desk. He's a pretty good guy, and said he'd look into updating our standard template to have a more standardized default message in the short log, thankfully, but it's very much just one thing on the Big Pile of Cruft That Needs Addressing.
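The kind of standardized short-log writer I'm pushing for would be something like this hypothetical sketch (the log path, function name, and exit-code mapping are all made up for illustration):

```powershell
# Writes the public-facing short log per the .S/.F/.U/.I extension scheme,
# replacing any previous short log for the app regardless of extension.
function Write-ShortLog {
    param(
        [string]$AppName,
        [int]$ExitCode,
        [string]$Summary,
        [switch]$Uninstall
    )
    $ext = switch ($ExitCode) {
        0       { if ($Uninstall) { 'U' } else { 'S' } }
        3010    { 'I' }   # pending reboot, so "incomplete"
        default { 'F' }
    }
    $logDir = 'C:\AppLogs'   # placeholder path
    Get-ChildItem -Path "$logDir\$AppName.*" -ErrorAction SilentlyContinue | Remove-Item
    Set-Content -Path "$logDir\$AppName.$ext" `
        -Value "$(Get-Date -Format s) exit=$ExitCode $Summary"
}
```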

adaz
Mar 7, 2009

I mean, that sounds awful. I would suggest just using something like NLog, which is a very popular, widely used .NET logging framework, with PowerShell and letting it handle most of that for you. It's not super hard to use from PowerShell. I haven't used it, but PoShLog uses Serilog and looks really nice. Serilog is the other super-full-featured .NET logging framework that everyone uses. All of them by default let you do things like send only error messages to a specific log and verbose/debug/info to another log file, and they stream writes as the script runs, etc. They also support a million different targets - not just text files but hooks into SQL, the console, and third-party log aggregation stacks like ELK.
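Driving NLog from PowerShell looks roughly like this (API surface from memory, so double-check it against whatever NLog version you actually grab):

```powershell
# Load the NLog assembly shipped alongside the script (path is a placeholder)
Add-Type -Path "$PSScriptRoot\NLog.dll"

# Two file targets: one for everything, one for errors only
$config  = New-Object NLog.Config.LoggingConfiguration
$verbose = New-Object NLog.Targets.FileTarget
$verbose.FileName = 'C:\AppLogs\verbose.log'
$errors  = New-Object NLog.Targets.FileTarget
$errors.FileName  = 'C:\AppLogs\errors.log'

# Route by level: Trace..Fatal to verbose, Error..Fatal to errors
$config.AddRule([NLog.LogLevel]::Trace, [NLog.LogLevel]::Fatal, $verbose)
$config.AddRule([NLog.LogLevel]::Error, [NLog.LogLevel]::Fatal, $errors)
[NLog.LogManager]::Configuration = $config

$log = [NLog.LogManager]::GetLogger('Install')
$log.Info('Starting install')    # verbose.log only
$log.Error('Something broke')    # lands in both files
```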

adaz fucked around with this message at 16:03 on Jun 24, 2021

Toshimo
Aug 23, 2012

I mean, I count myself thankful that we do do a bunch of logging, and QC, and testing, and a number of other helpful things, even if we aren't exactly at Best Practices level, yet. And that the team and management are both pretty receptive to change (although preferably incremental).

And sometimes the stuff catches me doing stuff that I can do better, even if it's Not Wrong. Like, I had a recent script where I was removing some item properties, and I just wanted them gone, didn't really care if the item had the properties in the first place. So, I just Removed them, and let PS catch the exception. Not So Fast. Even though I was catching the exception and even though it didn't matter that I was trying to remove something that didn't exist, it started bloating up the super verbose transcript log with Informative Error Messages. So, I just added a quick check for existence, and now my logs are clean. I guess it's Technically More Correct this way, even if there's no actual practical difference, but I'd rather get nudged towards being a little more meticulous every now and then, if it also catches me when I Legit gently caress Up.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
If by "CM" you mean MEMCM (formerly but not actually SCCM) aka ConfigMgr, then you can also log in a format that CMTrace will understand, because odds are good you'll be comfortable with it, and you know it will be there if you're troubleshooting deployments.

Toshimo
Aug 23, 2012


FISHMANPET posted:

If by "CM" you mean MEMCM (formerly but not actually SCCM) aka ConfigMgr, then you can also log in a format that CMTrace will understand, because odds are good you'll be comfortable with it, and you know it will be there if you're troubleshooting deployments.

I will deliver to you the unfortunate news that writing code that returns meaningful data to SCCM/MEMCM/ConfigMgr has at some point been dismissed as an option because "it was just not actually reading the return codes anyway" or something. It's another thing on the pile for me to investigate.

FISHMANPET
Mar 3, 2007

I'm not talking about returning data to MEMCM, I'm talking about writing logs in a format that CMTrace will understand:
https://janikvonrotz.ch/2017/10/26/powershell-logging-in-cmtrace-format/
If you're not familiar with/aware of CMTrace then look for it in C:\Windows\CCM\CMtrace.exe and use it to open up some logs in C:\Windows\CCM\Logs and be amazed at its capability.
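Rolling your own writer for that format is only a few lines; something like this sketch (field layout written from memory, so compare the output against a real log from C:\Windows\CCM\Logs before trusting it):

```powershell
# Emits one CMTrace-parseable log line; type 1=info, 2=warning, 3=error
function Write-CMTraceLog {
    param(
        [string]$Message,
        [string]$Component = 'Install',
        [int]$Type = 1,
        [string]$Path = 'C:\AppLogs\install.log'   # placeholder path
    )
    $time = Get-Date -Format 'HH:mm:ss.fff'
    $date = Get-Date -Format 'MM-dd-yyyy'
    # "+000" is a placeholder UTC-offset; CMTrace normally gets the real bias
    $line = "<![LOG[$Message]LOG]!><time=`"$time+000`" date=`"$date`" " +
            "component=`"$Component`" context=`"`" type=`"$Type`" " +
            "thread=`"$PID`" file=`"`">"
    Add-Content -Path $Path -Value $line
}
```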

adaz
Mar 7, 2009

CMTrace, y'all are bringing back MEMORIES of years ago debugging CCMExec logs <3

FISHMANPET
Mar 3, 2007

The tail -f of Windows!

PierreTheMime
Dec 9, 2004

Hero of hormagaunts everywhere!
Buglord
I know there's a way to do it but for whatever reason my brain and searching are failing me: how does one succinctly build a list of search results from three or more arrays, looking for a match on an object property?

I have a number of Select-String search results and need to capture the filename, line number, and line content, so obviously just keeping the MatchInfo object on hand is a good idea. The trick is they want to search for multiple patterns that must all match. For doing it with two it's relatively simple, but I'm brainfarting on the best way to do more in a scalable way.

The idea would be you'd hunt for "foo", "bar", and "buzz" and if two different files existed that contained all of them you'd get back 6 results (the MatchInfo values for 3 hits on 2 files).

Any thoughts?

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

PierreTheMime posted:

I know there's a way to do it but for whatever reason my brain and searching are failing me: how does one succinctly build a list of search results from three or more arrays, looking for a match on an object property?

I have a number of Select-String search results and need to capture the filename, line number, and line content, so obviously just keeping the MatchInfo object on hand is a good idea. The trick is they want to search for multiple patterns that must all match. For doing it with two it's relatively simple, but I'm brainfarting on the best way to do more in a scalable way.

The idea would be you'd hunt for "foo", "bar", and "buzz" and if two different files existed that contained all of them you'd get back 6 results (the MatchInfo values for 3 hits on 2 files).

Any thoughts?

Something like this?

code:
$a = @('1', '2', '3')
$b = @('2', '3', '4')
$c = @('3', '4', '5')

$a + $b + $c | Select-Object -Unique

PierreTheMime
Dec 9, 2004


New Yorp New Yorp posted:

Something like this?

code:
$a = @('1', '2', '3')
$b = @('2', '3', '4')
$c = @('3', '4', '5')

$a + $b + $c | Select-Object -Unique

Well, the issue is that I need to match on the filename property specifically; the other values of the object may differ because the same file name might exist in different places across different file systems.

Basically someone made a whole bunch of scripts and never kept track of them and now they want to be able to search which scripts contain shared resources so they know if any changes/migrations might impact others.

Edit: I can probably just use multiple patterns in Select-String now that I think about it, I'll tinker around.
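The multiple-pattern approach could look something like this sketch: one Select-String pass with all the patterns, then group by file and keep only the files where every pattern hit (the search root is a placeholder):

```powershell
$patterns = 'foo', 'bar', 'buzz'

# One pass; each MatchInfo records Filename, LineNumber, Line, and which
# Pattern produced the hit
$hits = Get-ChildItem -Path 'C:\Scripts' -Recurse -Include *.ps1, *.psm1 |
    Select-String -Pattern $patterns

# Keep only files where all patterns matched, then emit their MatchInfos
$hits | Group-Object -Property Path |
    Where-Object { ($_.Group.Pattern | Sort-Object -Unique).Count -eq $patterns.Count } |
    ForEach-Object { $_.Group }
```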

PierreTheMime fucked around with this message at 23:14 on Jul 12, 2021

Luna
May 31, 2001

A hand full of seeds and a mouthful of dirt


Crosspost from the programming thread:

Not sure if PowerShell questions are considered programming related but I'd like to see if anyone can offer advice. This also is in Azure so that may further isolate me on this.


I'm trying to loop through all of my Azure subscriptions and pull automation account expiry date information. I can get this to work with single subscriptions but when I try it with multiple subs, it only returns data from the last sub processed. The issue seems to be keeping the subscription context through the lower loops. I have similar issues if I nest the loops or run them sequentially.

The initial ForEach loop works; it grabs all resource groups from all subscriptions. When I pass the results on to the next ForEach, that is when it loses the context switching.

Any advice is appreciated.

code:
$Subscriptions = @('Sub1','Sub2','Sub3')

ForEach ($Sub in $Subscriptions){
    Set-AzContext $Sub
    $RGs += Get-AzResourceGroup

    ForEach ($RG in $RGs){
        $AAs = Get-AzAutomationAccount
    }
}

Luna fucked around with this message at 20:27 on Jul 16, 2021

Toast Museum
Dec 3, 2005

30% Iron Chef

Luna posted:

Crosspost from the programming thread:

Not sure if PowerShell questions are considered programming related but I'd like to see if anyone can offer advice. This also is in Azure so that may further isolate me on this.


I'm trying to loop through all of my Azure subscriptions and pull automation account expiry date information. I can get this to work with single subscriptions but when I try it with multiple subs, it only returns data from the last sub processed. The issue seems to be keeping the subscription context through the lower loops. I have similar issues if I nest the loops or run them sequentially.

The initial ForEach loop works; it grabs all resource groups from all subscriptions. When I pass the results on to the next ForEach, that is when it loses the context switching.

Any advice is appreciated.

code:
$Subscriptions = @('Sub1','Sub2','Sub3')

ForEach ($Sub in $Subscriptions){
    Set-AzContext $Sub
    $RGs += Get-AzResourceGroup

    ForEach ($RG in $RGs){
        $AAs = Get-AzAutomationAccount
    }
}
    I haven't used that module, but two things jump out at me:

  • Why $RGs += Get-AzResourceGroup rather than $RGs = Get-AzResourceGroup?
  • Don't you need to specify the resource group name for Get-AzAutomationAccount?

Does this work any better?
code:
$AAs = foreach ($Sub in $Subscriptions)
{
    Set-AzContext $Sub
    Get-AzResourceGroup | ForEach-Object {
        Get-AzAutomationAccount -ResourceGroupName $_.ResourceGroupName
    }
}

Mario
Oct 29, 2006
It's-a-me!
$AAs gets overwritten with each loop over $RGs, so it makes sense that only the last retrieved value is available. Also, building arrays and looping with foreach..in feels awkward in PS compared to using the pipeline.

Try something like:
code:
$Subscriptions = @('Sub1','Sub2','Sub3')

$AAs = $Subscriptions | ForEach-Object { Set-AzContext $_ | Out-Null; Get-AzAutomationAccount }
Each invocation of Get-AzAutomationAccount puts its results onto the pipeline because we don't assign it to a variable. These get flattened into a single collection and assigned to $AAs. Set-AzContext has its output piped to Out-Null (much like /dev/null) so it does not pollute the main pipeline result. Get-AzResourceGroup is omitted since it appears you want to get automation accounts in all resource groups anyways.

Luna
May 31, 2001



Toast Museum posted:

    I haven't used that module, but two things jump out at me:

  • Why $RGs += Get-AzResourceGroup rather than $RGs = Get-AzResourceGroup?
  • Don't you need to specify the resource group name for Get-AzAutomationAccount?

Does this work any better?
code:
$AAs = foreach ($Sub in $Subscriptions)
{
    Set-AzContext $Sub
    Get-AzResourceGroup | ForEach-Object {
        Get-AzAutomationAccount -ResourceGroupName $_.ResourceGroupName
    }
}

Thanks Toast, this works better, but I need to pull the certificate info (Get-AzAutomationCertificate) from all the automation accounts in the results.

My initial thought was to whittle down the results from Subscriptions -> ResourceGroups -> AutomationAccounts -> AutomationCertificates. If I get to the automation accounts, then I am back to nesting another loop for the cert info and I've lost context again. I like Mario's idea about keeping everything in the pipeline but I don't think I can stretch it that far.

Luna
May 31, 2001



Mario posted:

$AAs gets overwritten with each loop over $RGs, so it makes sense that only the last retrieved value is available. Also, building arrays and looping with foreach..in feels awkward in PS compared to using the pipeline.

Try something like:
code:
$Subscriptions = @('Sub1','Sub2','Sub3')

$AAs = $Subscriptions | ForEach-Object { Set-AzContext $_ | Out-Null; Get-AzAutomationAccount }
Each invocation of Get-AzAutomationAccount puts its results onto the pipeline because we don't assign it to a variable. These get flattened into a single collection and assigned to $AAs. Set-AzContext has its output piped to Out-Null (much like /dev/null) so it does not pollute the main pipeline result. Get-AzResourceGroup is omitted since it appears you want to get automation accounts in all resource groups anyways.

Thanks Mario. I like this idea but I have to start with the resource group collection because my end goal is to get the automation cert info and Get-AzAutomationCertificate requires -ResourceGroupName.

FISHMANPET
Mar 3, 2007

Why are you adding to the list of Resource Groups and looping through the entire list each time?

Toast Museum
Dec 3, 2005


Luna posted:

Thanks Mario. I like this idea but I have to start with the resource group collection because my end goal is to get the automation cert info and Get-AzAutomationCertificate requires -ResourceGroupName.

Have you tried piping an AutomationAccount to Get-AzAutomationCertificate? The docs say that AutomationAccount objects have AutomationAccountName and ResourceGroupName properties, and that Get-AzAutomationCertificate's -AutomationAccountName and -ResourceGroupName parameters accept pipeline input by property name.
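i.e., something like this sketch (I don't have an Az environment handy to run it, but per the docs the property names should bind across the pipe; subscription names are placeholders):

```powershell
$Subscriptions = @('Sub1','Sub2','Sub3')

$certs = foreach ($Sub in $Subscriptions) {
    Set-AzContext $Sub | Out-Null   # keep the context switch out of the output
    # AutomationAccountName/ResourceGroupName bind by property name downstream
    Get-AzAutomationAccount | Get-AzAutomationCertificate
}
```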

Luna
May 31, 2001



Toast Museum posted:

Have you tried piping an AutomationAccount to Get-AzAutomationCertificate? The docs say that AutomationAccount objects have AutomationAccountName and ResourceGroupName properties, and that Get-AzAutomationCertificate's -AutomationAccountName and -ResourceGroupName parameters accept pipeline input by property name.

Yea, I piped everything together and threw it in a foreach loop and it worked. Thanks for knocking some sense into me. I'm not a natural coder so I take everything step by step and miss the bigger picture sometimes.

Luna
May 31, 2001



FISHMANPET posted:

Why are you adding to the list of Resource Groups and looping through the entire list each time?

Because I am dumb as a bag of hair.

MustardFacial
Jun 20, 2011
George Russel's
Official Something Awful Account
Lifelong Tory Voter
I need powershell help because I am a broke-brain.

Basically what I want it to do is take a pre-compiled list of computers, iterate through that list looking for a certain installed program, and when it finds that program, open a command prompt on the remote machine and:
  • Uninstall it
  • Delete all local GPOs
  • Update GPOs.

I can get it to iterate through the list and find the program installation; however, when it comes time to take actions on those machines, it just blows through all of the commands way too fast and exits. Or it prompts me for another set of credentials (which it shouldn't) and then fails, and now for some reason it seems to just be crashing or exiting without even looking for the program install. My PowerShell skills are amateur at best and I honestly can't figure out why this isn't working, so any ideas or directions I can go in would be great. Here is the code:

code:
$Credential = Get-Credential
$PCName = Get-Content "C:\path\to\file\remotecomputers.txt"

foreach ($name in $PCName) {
    $prog = Get-WmiObject Win32_product -ComputerName $name -Credential $Credential | Where-Object Name -eq "ProgramX"
    if ($prog) {
        Write-Host "Found ProgramX install on $name"
        Invoke-Command -ComputerName $name -Credential $Credential -ScriptBlock {Start-Process cmd.exe -ArgumentList {"/c C:\windows\ProgramX\programx.exe /uninstall" };
            Start-Process -wait cmd.exe -ArgumentList {"/c RD /S /Q `"%WinDir%\System32\GroupPolicyUsers`""};
            Start-Process -wait cmd.exe -ArgumentList {"/c RD /S /Q `"%WinDir%\System32\GroupPolicy`""};
            Start-Process -wait cmd.exe -ArgumentList {"/c gpupdate /force"}
        }
        Write-Host "Finished process on $name"
    }
}

Mario
Oct 29, 2006
Do you actually need to involve cmd.exe? It seems simpler to remove items and run the uninstaller/gpupdate from PS directly.

MustardFacial
Jun 20, 2011
For the program I do. The only way it can be uninstalled is by running the setup exe with the /uninstall argument. There is no uninstall key in the registry. It probably would be easier to just call Remove-Item on the directories though.

Toast Museum
Dec 3, 2005


MustardFacial posted:

For the program I do. The only way it can be uninstalled is by running the setup exe with the /uninstall argument. There is no uninstall key in the registry. It probably would be easier to just call Remove-Item on the directories though.

The question is whether there's anything preventing you from writing your ScriptBlocks like this:
code:
Start-Process "C:\windows\ProgramX\programx.exe" -ArgumentList "/uninstall"
Edit: assuming -Wait makes it necessary to use Start-Process at all.

Toast Museum fucked around with this message at 01:17 on Oct 8, 2021

Toast Museum
Dec 3, 2005

Sorry for the back-to-back posts, but some other thoughts:

I'd use Get-CimInstance within Invoke-Command instead of using Get-WmiObject. That'll reduce the number of remote connections you open by the number of computers with that program. Also, since Get-WmiObject uses a different mechanism than Invoke-Command to reach remote computers, getting rid of it reduces your odds of running into firewall issues and whatnot.

You may want to find an alternative to querying Win32_Product

quote:

Win32_product class isn't query optimized. Queries such as select * from Win32_Product where (name like 'Sniffer%') require WMI to use the MSI provider to enumerate all of the installed products and then parse the full list sequentially to handle the where clause. This process also starts a consistency check of packages installed, verifying, and repairing the install.
(Emphasis added.) If the program always lives in the same location, a simple Test-Path might work, at least as a first approximation.
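Putting those suggestions together, the loop could collapse to something like this sketch (paths and file names copied from the original post; -Wait on each step so nothing "blows through"):

```powershell
$Credential = Get-Credential
$PCName = Get-Content 'C:\path\to\file\remotecomputers.txt'

foreach ($name in $PCName) {
    # One remote session per machine; the existence check runs remotely too
    Invoke-Command -ComputerName $name -Credential $Credential -ScriptBlock {
        if (Test-Path 'C:\Windows\ProgramX\programx.exe') {
            Start-Process 'C:\Windows\ProgramX\programx.exe' -ArgumentList '/uninstall' -Wait
            Remove-Item "$env:WinDir\System32\GroupPolicyUsers", "$env:WinDir\System32\GroupPolicy" `
                -Recurse -Force -ErrorAction SilentlyContinue
            gpupdate.exe /force | Out-Null
        }
    }
}
```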

guppy
Sep 21, 2004

sting like a byob
I am an okay scripter, but I don't do a lot of PowerShell, so I have what are probably some pretty basic questions. I need to add a new DNS server to option 6 in DHCP on all the scopes on a whole mess of servers. I do see the Get- and Set-DHCPServerV4OptionValue cmdlets, but my main questions are:

- Can I add a new DNS server, or do I have to just set the value to a new array -- apparently it takes an array, right? -- that includes all of the DNS servers I want?

- What if I want to add the new one as the first one in the list, and not the last?

- If I have to just get the current list, and create a new array with the new one + the existing ones, how exactly do I do that in PS?

I was able to get the list of current ones with

code:
Get-DHCPServerV4OptionValue -ComputerName dhcpservername -ScopeID 1.2.3.0 -OptionID 6 | Select Value
I'm not sure if I should actually be selecting that property or if I should just get the whole option. If I select the Value property, I get back an object of type PSCustomObject, with IP addresses in {curly, braces}. If I get the whole option the returned object is a CimInstance. I also obviously only need to care about this if I can't just add a new address to the front of the list.

Obviously I've removed my test server name and IP address. I am aware after some research that Set-DHCPServerV4OptionValue has an actual -DnsServer argument and I don't have to use OptionID 6, but I don't know if it matters. I don't really know how to build this new array, and my experimenting hasn't worked. Say my current servers are 5.6.7.8 and 9.10.11.12, and I want to add 1.2.3.4 to the beginning of the list. I tried this:

code:
$old = Get-DHCPServerV4OptionValue -ComputerName dhcpservername -ScopeID 1.2.3.0 -OptionID 6 | Select Value

$new = @("1.2.3.4")

foreach ( $server in $old )
{
  $new += $server
}
This adds the old DNS servers to the array, but for reasons I don't understand, when I enumerate the contents of $new afterward, the original 1.2.3.4 entry is gone, even though, if I enumerate its contents immediately after the second line, it's in there.

I also don't know whether it's correct to put 1.2.3.4 in there as a string! But if I omit the quotes, PS throws an error.

Toast Museum
Dec 3, 2005

30% Iron Chef

guppy posted:

I was able to get the list of current ones with

code:
Get-DHCPServerV4OptionValue -ComputerName dhcpservername -ScopeID 1.2.3.0 -OptionID 6 | Select Value
I'm not sure if I should actually be selecting that property or if I should just get the whole option. If I select the Value property, I get back an object of type PSCustomObject, with IP addresses in {curly, braces}. If I get the whole option the returned object is a CimInstance. I also obviously only need to care about this if I can't just add a new address to the front of the list.

Obviously I've removed my test server name and IP address. I am aware after some research that Set-DHCPServerV4OptionValue has an actual -DnsServer argument and I don't have to use OptionID 6, but I don't know if it matters. I don't really know how to build this new array, and my experimenting hasn't worked. Say my current servers are 5.6.7.8 and 9.10.11.12, and I want to add 1.2.3.4 to the beginning of the list. I tried this:

code:
$old = Get-DHCPServerV4OptionValue -ComputerName dhcpservername -ScopeID 1.2.3.0 -OptionID 6 | Select Value

$new = @("1.2.3.4")

foreach ( $server in $old )
{
  $new += $server
}
This adds the old DNS servers to the array, but for reasons I don't understand, when I enumerate the contents of $new afterward, the original 1.2.3.4 entry is gone, even though, if I enumerate its contents immediately after the second line, it's in there.

I also don't know whether it's correct to put 1.2.3.4 in there as a string! But if I omit the quotes, PS throws an error.

Caveat: I haven't used the DhcpServer module specifically.

It looks like you may be running into issues because $new ends up containing multiple types, which is probably causing something to choke on it later in the script. As written, this line
code:
Get-DHCPServerV4OptionValue -ComputerName dhcpservername -ScopeID 1.2.3.0 -OptionID 6 | Select Value
is going to give you an array of Selected.CimInstance objects. To get just the value of the property named Value, you want one of the following:

code:
Get-DHCPServerV4OptionValue -ComputerName dhcpservername -ScopeID 1.2.3.0 -OptionID 6 | Select -ExpandProperty Value
# or
(Get-DHCPServerV4OptionValue -ComputerName dhcpservername -ScopeID 1.2.3.0 -OptionID 6).Value
If Value is an object with a property named IPAddress or something, then to get just those addresses, you'd want
code:
Get-DHCPServerV4OptionValue -ComputerName dhcpservername -ScopeID 1.2.3.0 -OptionID 6 | Select -ExpandProperty Value | Select -ExpandProperty IPAddress
# or
(Get-DHCPServerV4OptionValue -ComputerName dhcpservername -ScopeID 1.2.3.0 -OptionID 6).Value.IPAddress
Re: the original value of $new going missing, if you're testing this in the terminal, the way the output is formatted can make it easy to overlook the original entry. For example:
code:
PS> $Old = [PSCustomObject]@{IPAddress = @("5.6.7.8")},
[PSCustomObject]@{IPAddress = @("9.10.11.12")},
[PSCustomObject]@{IPAddress = @("13.14.15.16")},
[PSCustomObject]@{IPAddress = @("17.18.19.20")}
PS> $new = @("1.2.3.4")
PS> foreach ( $server in $old )
{
   $new += $server
}
PS> $new
1.2.3.4

IPAddress
---------
{5.6.7.8}
{9.10.11.12}
{13.14.15.16}
{17.18.19.20}
A mixed array like this usually isn't great because, to use the example above, now you've got one IP address at $new[0], and the rest within arrays in each added item's IPAddress property. Whatever the array is being fed to probably won't know what to do with at least some of those items.
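For what it's worth, once you've expanded the values down to plain strings, prepending is just array concatenation (quick sketch with the made-up addresses from your post):

```powershell
# Assuming $old holds the existing server list as plain strings
# (e.g. after Select -ExpandProperty), prepending is just + on arrays:
$old = @("5.6.7.8", "9.10.11.12")
$new = @("1.2.3.4") + $old
$new   # 1.2.3.4, 5.6.7.8, 9.10.11.12
```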

Re: putting the IP address in quotes, for most cmdlets, that's fine; PowerShell is usually pretty good about implicitly converting types. If you're dealing with a cmdlet or .NET method that specifically needs you to feed it an IPAddress object, you can explicitly cast the string:
code:
PS> $foo = [IPAddress]"1.2.3.4"
PS> $foo | Get-Member

   TypeName: System.Net.IPAddress

...

guppy
Sep 21, 2004

sting like a byob
Thanks! That's great, detailed info. I will give this another go.

Toast Museum
Dec 3, 2005

30% Iron Chef
PowerShell 7.2 is out!

Microsoft Update support is a nice quality-of-life improvement. The expanded ANSI support and new $PSStyle automatic variable have got me obsessing over my color theme choices.
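For example (7.2+ only; the table-header tweak is just my own preference, not anything the defaults do):

```powershell
# Requires PowerShell 7.2+. $PSStyle surfaces ANSI escape sequences as
# properties, so you can color output without hand-writing escape codes:
"$($PSStyle.Foreground.Green)success$($PSStyle.Reset)"

# It also controls the colors the formatting system uses:
$PSStyle.Formatting.TableHeader = $PSStyle.Foreground.BrightCyan
```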

Pile Of Garbage
May 28, 2007



Toast Museum posted:

PowerShell 7.2 is out!

Microsoft Update support is a nice quality-of-life improvement. The expanded ANSI support and new $PSStyle automatic variable have got me obsessing over my color theme choices.

Also notable that it's built on .NET 6, which entered GA around the same time, I think (alongside Visual Studio 2022).

Toast Museum
Dec 3, 2005

30% Iron Chef

Pile Of Garbage posted:

Also notable that it's built on .NET 6, which entered GA around the same time, I think (alongside Visual Studio 2022).

Yeah, I should have mentioned that. I'm working on a compiled module as a side project, and C# 10's new syntax has been nice to have. Between file-scoped namespaces and global using directives, I find myself writing a lot less boilerplate. Visual Studio 2022's upgraded IntelliSense is pretty nice to have as well, even if it does sometimes make odd choices. I do wish Visual Studio included a PowerShell Module project type, though.

Of mostly personal significance, a breaking change introduced in .NET 6 got me started with contributing to the PowerShell repository. After seeing how quickly the issue I reported got resolved, I started looking through the open issues for something to help with. It's the first public repository I've contributed to, and it feels pretty cool to know that a few lines of code that I wrote are slated for PowerShell 7.3.

Nth Doctor
Sep 7, 2010

Darkrai used Dream Eater!
It's super effective!


Toast Museum posted:

Of mostly personal significance, a breaking change introduced in .NET 6 got me started with contributing to the PowerShell repository. After seeing how quickly the issue I reported got resolved, I started looking through the open issues for something to help with. It's the first public repository I've contributed to, and it feels pretty cool to know that a few lines of code that I wrote are slated for PowerShell 7.3.

Nice!


Toshimo
Aug 23, 2012

He's outta line...

But he's right!
Hell yeah, my dudes. PowerShell has been added to SA's bbcode: [code=Powershell]

PowerShell code:
$Credential = Get-Credential
$PCName = Get-Content "C:\path\to\file\remotecomputers.txt"

foreach ($name in $PCName) {
    $prog = Get-WmiObject Win32_product -ComputerName $name -Credential $Credential | Where-Object Name -eq "ProgramX"
    if ($prog) {
        Write-Host "Found ProgramX install on $name"
        Invoke-Command -ComputerName $name -Credential $Credential -ScriptBlock {
            Start-Process -Wait cmd.exe -ArgumentList "/c C:\windows\ProgramX\programx.exe /uninstall"
            Start-Process -Wait cmd.exe -ArgumentList "/c RD /S /Q `"%WinDir%\System32\GroupPolicyUsers`""
            Start-Process -Wait cmd.exe -ArgumentList "/c RD /S /Q `"%WinDir%\System32\GroupPolicy`""
            Start-Process -Wait cmd.exe -ArgumentList "/c gpupdate /force"
        }
        Write-Host "Finished process on $name"
    }
}
