xsf421
Feb 17, 2011

Dirt Road Junglist posted:

Also, if you don't have a test environment, using -WhatIf at the end of a command can tell you what it's going to do without it actually committing any actions.

Don't rely on this, unfortunately. It's too random when the command decides to ignore it.


The Fool
Oct 16, 2003


xsf421 posted:

Don't rely on this, unfortunately. It's too random when the command decides to ignore it.

It's not random, but WhatIf support is sporadic, even in first-party modules.

This is a reasonable way to get a list of cmdlets that don't support -WhatIf in a given module:

code:
Get-Command -Module $moduleName | Where-Object { -not $_.Parameters.ContainsKey('WhatIf') } | Select-Object Name

Pile Of Garbage
May 28, 2007



The bottom line is that even if a CmdletBinding declaration includes SupportsShouldProcess you should never assume that it will work in the way you expect it to as it depends entirely on how the cmdlet has been written. Always worth testing behaviour beforehand.
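To illustrate, here's a minimal sketch (a made-up Remove-Widget function, not from any real module) of how -WhatIf is supposed to be wired up. The trap is that an author can declare SupportsShouldProcess and then never call ShouldProcess(), in which case -WhatIf is accepted and silently ignored:

code:
Function Remove-Widget {
	[CmdletBinding(SupportsShouldProcess)]
	Param ([String]$Name)
	# -WhatIf/-Confirm only take effect because of this ShouldProcess call;
	# without it, the switch parses fine but changes nothing
	If ($PSCmdlet.ShouldProcess($Name, "Remove widget")) {
		Write-Host "Removing $Name"
	}
}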

thebigcow
Jan 3, 2001

Bully!
All the AD cmdlets had broken support for -whatif

They might still idk.

mllaneza
Apr 28, 2007

Veteran, Bermuda Triangle Expeditionary Force, 1993-1952




So today I'm updating our new-ish director about the reporting I'm running to find Win7 and XP machines that have their patches for the current massive gaping exploit. He puts in "So you're using PowerShell Sessions to gather the info, right?"

Bitch please. The next machine I find in that domain that allows PS Remoting will be the first one. There's a reason almost every script I write calls psexec at some point.

Speaking of which, does anyone have any handy techniques for getting information back from psexec? I rarely get an error that shows up in a catch{} block, but an accurate count on "Access Denied" would have been just loving super today.
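One sketch of a way to get a usable signal back (the computer name and remote command are placeholders): psexec won't raise PowerShell errors, but it does set $LASTEXITCODE to the remote/psexec exit code, and its own failures land on stderr, which you can merge in and scan. The exact error text may vary by psexec version:

code:
# $computer and the remote command are placeholders
$output = & psexec "\\$computer" cmd /c "ipconfig /all" 2>&1 | Out-String
If ($LASTEXITCODE -ne 0) {
	# psexec's own errors (like access denied) show up in the merged stream
	If ($output -match 'Access is denied') { $deniedCount++ }
}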

PierreTheMime
Dec 9, 2004

Hero of hormagaunts everywhere!
Buglord
So I'm having an odd issue where adding an AD user to an ACL usually works but occasionally only adds them as an unresolved SID, so usage based on the account name fails. Any ideas?

Here's the function I wrote. Like I said, works most of the time, but occasionally generates the user as a S-1-5-21* SID and I'm sure it's something I'm doing.
code:
Function AddADPermission {
	Param ([String]$Name, [String]$Path, [String]$Permission, [String]$Inheritance)
	$Folder = Get-Item -LiteralPath "$Path"
	$ACL = $Folder.GetAccessControl()
			
	ForEach ($User in $ACL.access) {
		#If AD object previously had permissions, clear them to avoid inconsistencies
		If ($User.IdentityReference -Match [regex]::escape($Name)) {
			$ACL.RemoveAccessRule($User) > $null
		}
	}
	#Apply modify permissions to ACL
	Write-Host "Updating permissions for $Name on folder $Path"
	$Rule = New-Object System.Security.AccessControl.FileSystemAccessRule($Name,$Permission,$Inheritance,"None","Allow")
	$ACL.SetAccessRule($Rule)
	$Folder.SetAccessControl($ACL)
}

Submarine Sandpaper
May 27, 2007


You can pass the AD user's SID rather than their sAMAccountName.

Also (if it's working this probably doesn't matter), PowerShell has Get-Acl and Set-Acl so you don't need to use the .GetAccessControl()/.SetAccessControl() methods. Otherwise, have you noticed if your intermittent failures are where the user already has permissions? I had a project shelved just adding permissions since it could potentially be too resource intensive if the stars aligned to gently caress up backups. I may have also been fed a crock of poo poo.
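For what it's worth, a sketch of both ideas combined, reusing the same parameter names as the function above (untested against your setup; assumes the ActiveDirectory module is loaded). FileSystemAccessRule accepts an IdentityReference, so a SecurityIdentifier works directly as the identity:

code:
# Resolving the SID up front sidesteps name resolution at ACL-apply time
$SID  = (Get-ADUser $Name).SID
$ACL  = Get-Acl -LiteralPath $Path
$Rule = New-Object System.Security.AccessControl.FileSystemAccessRule($SID,$Permission,$Inheritance,"None","Allow")
$ACL.SetAccessRule($Rule)
Set-Acl -LiteralPath $Path -AclObject $ACL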

PierreTheMime
Dec 9, 2004

Hero of hormagaunts everywhere!
Buglord
It’s primarily been used as a part of a larger script for creating and applying new users to an SFTP share/service. I can try redoing it using the cmdlets instead of accessing the ACL directly, but like I said it works most of the time.

:iiam:

Submarine Sandpaper
May 27, 2007


Sometimes powershell just fucks up. I have the following:

code:
$Accounts = get-aduser -filter {enabled -eq $true -and name -like "SV*"} -Properties PasswordLastSet,Manager |
    where { $_.PasswordLastSet -lt (get-date).addyears(-1) }
And it occasionally pulls disabled accounts which really raises questions with the report :shrug:

The Fool
Oct 16, 2003


I think -Filter is more reliable if you use the string "True".
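i.e. something like this, with the same property set as above. Anecdotally, the string form gets handed to the DC more predictably than a script-block filter, which has to be serialized first:

code:
Get-ADUser -Filter "Enabled -eq 'True' -and Name -like 'SV*'" -Properties PasswordLastSet,Manager |
    Where-Object { $_.PasswordLastSet -lt (Get-Date).AddYears(-1) }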

Defenestrategy
Oct 24, 2010

My company moved to ~The Cloud~ and I'm learning this powershell thing. I made a pretty neato script to do some standard AAD stuff with, but what I want to know is: is there a way to take arguments into your script?


Consider a really basic script named script.ps1
code:
$name = Read-Host -Prompt "What's the user's principal name?"

$userid = Get-AzureAdUser -ObjectId "$name"

So instead of going through the prompt to generate the user ID, I'd like to call "C:\script.ps1 john.doe@x.com" and have it fill in the $name variable and then generate $userid.

reL
May 20, 2007

Defenestrategy posted:

Consider a really basic script named script.ps1
code:
$name = Read-Host -Prompt "What's the user's principal name?"

$userid = Get-AzureAdUser -ObjectId "$name"

So instead of going through the prompt to generate the user ID, I'd like to call "C:\script.ps1 john.doe@x.com" and have it fill in the $name variable and then generate $userid.

param([string]$Name)

$name is now a usable variable in your script.
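A slightly fuller sketch: making the parameter mandatory means PowerShell prompts for it when it's omitted, which replaces the Read-Host entirely:

code:
# script.ps1
Param (
	[Parameter(Mandatory, Position = 0)]
	[String]$Name
)
$userid = Get-AzureAdUser -ObjectId $Name
Then call it as .\script.ps1 john.doe@x.com or .\script.ps1 -Name john.doe@x.com.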

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

Defenestrategy posted:

My company moved to ~The Cloud~ and I'm learning this powershell thing. I made a pretty neato script to do some standard AAD stuff with, but what I want to know is: is there a way to take arguments into your script?


Consider a really basic script named script.ps1
code:
$name = Read-Host -Prompt "What's the user's principal name?"

$userid = Get-AzureAdUser -ObjectId "$name"

So instead of going through the prompt to generate the user ID, I'd like to call "C:\script.ps1 john.doe@x.com" and have it fill in the $name variable and then generate $userid.

Not trying to be a jerk, but did you google this first? The phrase "powershell script arguments" brings up dozens of results that answer your question.

mllaneza
Apr 28, 2007

Veteran, Bermuda Triangle Expeditionary Force, 1993-1952




You'll also want to look into Read-Host, and the Get-Content/ForEach-Object pattern.

TITTIEKISSER69
Mar 19, 2005

SAVE THE BEES
PLANT MORE TREES
CLEAN THE SEAS
KISS TITTIESS




I need to figure out whose mailboxes are being forwarded to OldGuy, and instead forward them all to NewGuy. On-prem Exchange 2013. Where do I begin?

Potato Salad
Oct 23, 2014

nobody cares


Google "create a forwarding rule" and scroll past the Outlook results.
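Something along these lines for the server side (the cmdlets are the standard Exchange shell ones, but the names are placeholders, and this only catches admin-set forwarding, not inbox rules users made themselves; also check ForwardingSmtpAddress, and test with -WhatIf first for whatever that's worth in this thread):

code:
$old = Get-Mailbox "OldGuy"
Get-Mailbox -ResultSize Unlimited |
    Where-Object { $_.ForwardingAddress -eq $old.DistinguishedName } |
    Set-Mailbox -ForwardingAddress "NewGuy"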

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!
I'm looking for things to poke at with this generic double-hop error message I'm getting while trying to use vagrant to provision a Windows VM:

quote:

Program 'python.exe' failed to run: A specified logon session does not exist. It may already have been terminated.

Everything I see about the nature of the error implies the double hop problem. I don't understand why I would be seeing that since I should just be running a single hop here. Furthermore, it specifically only pukes when running python scripts from the shell. To add even more wrinkles, I'm using Python's pip command to install some commands beforehand. In fact, any other kind of command I run before that python command will be fine. It'll puke on that command wherever it is in my provision steps that Vagrant is running. It's even stranger that I can run "python -m pip" to invoke pip and also be okay.

Vagrant is using WinRM to access the remote VM and there are some specific instructions to use it that disable some strictness:
https://www.vagrantup.com/docs/boxes/base.html#base-winrm-configuration

That doesn't bother me because the VM is isolated to this machine and doesn't, say, provide services to the outside or anything. Still, I am getting problems like this.

Is there anything else to consider beyond classic double hop stuff? For one, I am running vagrant from Linux so some of the procedures--particularly involving running PowerShell commands on the host as if it were a Windows box--don't even apply. I'm also not, say, connecting to this VM just to connect to some other session elsewhere. So I am very confused about it.

It also looks like this error does not happen on the second attempt to provision the same VM instance. So it could very well be some vagrant shenanigans, but I wanted to isolate the possibilities with Windows remoting.

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Powershell is p cool. I haven't messed with it seriously in quite a while but in a couple hours I just set up:

- A scheduled task that runs every 10 mins
- Checks when last run successfully (using Get-ScheduledTask)
- Builds an SQL query string using date stamp acquired from above if greater than 10 minutes
- Connect to logging database and get info (using System.Data.SqlClient.SQLConnection)
- Parse results, build string out of exceptions
- If exception string <> "" log event to EventViewer (using Write-EventLog)
- If exception string <>"" email exceptions (using Net.Mail.SmtpClient and System.Net.Mail.MailMessage)
- If exception string <>"" post exceptions to dedicated Slack channel (using Invoke-RestMethod and hooks.slack.com)

It's real neat!
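The Slack leg of that is pleasantly short; a sketch with a placeholder webhook URL and the exception string assumed to be in $exceptionString:

code:
$payload = @{ text = "Exceptions since last run:`n$exceptionString" } | ConvertTo-Json
Invoke-RestMethod -Uri "https://hooks.slack.com/services/PLACEHOLDER" `
    -Method Post -ContentType 'application/json' -Body $payload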

PierreTheMime
Dec 9, 2004

Hero of hormagaunts everywhere!
Buglord
Is there a better way of parsing AWS S3 metadata values other than using the s3api and playing with the JSON? I wrote this as it works, but it's a bit clunky:

PowerShell code:
$Objects = Get-S3Object -BucketName <INSERTBUCKETHERE>
ForEach($Object in $Objects) {
	If ($Object.Key -Match '.+?/$') {
		Write-Host "Object: $($Object.Key)"
		(aws s3api head-object --bucket $Object.BucketName --key $Object.Key | Out-String | ConvertFrom-Json).metadata | Get-Member -Type NoteProperty | ForEach-Object { Write-Host ("   {0,-30}{1,-45}" -f "Key: $($_.Name)","Value: $($_.Definition.Split("=")[1])") }
	}
}
Edit: The above is specifically to get metadata from folder keys only, as I’m using that as a configuration space for various functions.

PierreTheMime fucked around with this message at 17:14 on Jun 25, 2019

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

So here's a thing. I'm building JSON fulfillment requests from a CSV file. It's not exact, but it's roughly this:
code:
$csvJSONout = '[';
$csvTop = 'order_id','tracking_number','items_sku','items_quantity';
$csvIn = Import-Csv -Path .\test2.csv -Header $csvTop;
foreach ($p in $csvIn)
{
    $csvJSONout += ('{"order_id":"' + $p.'order_id' + '","packages": [{"tracking_number":"' + $p.'tracking_number' + '","items": [{ "sku": "' + $p.'items_sku' + '","quantity": ' + $p.'items_quantity' + '}] }] },');
}
$csvJSONout += ']';
(I know about the bad comma on the last row)

But what I've been told now, is that the data isn't normalized very well. I might have cases where order_id is not unique, so like:
code:
+----------+----------+--------+-----+
| order_id | tracking |  sku   | qty |
+----------+----------+--------+-----+
|     1111 | XYZ      | BLEH-X |   1 |
|     1111 | XYZ      | BLEH-X |   1 |
|     1112 | ZYX      | BLAH11 |   4 |
|     1111 | XYZ      | BLIHZZ |   1 |
+----------+----------+--------+-----+
Ideally I'd like to be able to "roll up" both:
- All SKUs belonging to an order_id and
- Sum up Qty when a SKU is indicated for that order_id over multiple rows (e.g. BLEH-X above should be one entry for Qty 2)

I would in turn use this information to make one JSON request for the order, that keeps adding to tracking_number / items array for that order_id instead of multiple individual requests against the same order_id the current method would do.

I'm pretty sure I'd know how to do this in C#/VB, but am wondering if there's some wow neato thing like | Combine-ByKey "order_id" or something like that in PowerShell I'm not aware of.

If it helps, I can ensure that the CSV file is sorted by order_id, e.g. the next row would always be either the same order or a new one.

EDIT-And if you want to make cool pre/code tables like the above you can go here:
https://github.com/ozh/ascii-tables

EDIT EDIT-Hmm Group-Object -> Select-Object
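That Group-Object hunch, sketched out against the $csvIn from the snippet above (rough shape only, the JSON assembly is left as a comment):

code:
$csvIn | Group-Object order_id | ForEach-Object {
    $orderId = $_.Name
    # second grouping rolls duplicate SKUs within an order into one summed entry
    $items = $_.Group | Group-Object items_sku | ForEach-Object {
        [pscustomobject]@{
            sku      = $_.Name
            quantity = ($_.Group | ForEach-Object { [int]$_.items_quantity } | Measure-Object -Sum).Sum
        }
    }
    # build the one-per-order JSON object from $orderId and $items here
}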

Scaramouche fucked around with this message at 08:38 on Jun 26, 2019

Potato Salad
Oct 23, 2014

nobody cares


Bug #2607 on cyclic dependencies probably isn't going to be fixed in PS 5.1.


It's okay, I love making weird hacky module manifests :smithicide:

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams

PierreTheMime posted:

Is there a better way of parsing AWS S3 metadata values other than using the s3api and playing with the JSON? I wrote this as it works, but it's a bit clunky:

PowerShell code:
$Objects = Get-S3Object -BucketName <INSERTBUCKETHERE>
ForEach($Object in $Objects) {
	If ($Object.Key -Match '.+?/$') {
		Write-Host "Object: $($Object.Key)"
		(aws s3api head-object --bucket $Object.BucketName --key $Object.Key | Out-String | ConvertFrom-Json).metadata | Get-Member -Type NoteProperty | ForEach-Object { Write-Host ("   {0,-30}{1,-45}" -f "Key: $($_.Name)","Value: $($_.Definition.Split("=")[1])") }
	}
}
Edit: The above is specifically to get metadata from folder keys only, as I’m using that as a configuration space for various functions.

New lines and intermediary variables are free

Potato Salad
Oct 23, 2014

nobody cares


fishmanpet are you a pester contributor

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
I did contribute about 4 or 5 lines so technically yes.

12 rats tied together
Sep 7, 2006

PierreTheMime posted:

Is there a better way of parsing AWS S3 metadata values other than using the s3api and playing with the JSON? I wrote this as it works, but it's a bit clunky:

PowerShell code:
$Objects = Get-S3Object -BucketName <INSERTBUCKETHERE>
ForEach($Object in $Objects) {
	If ($Object.Key -Match '.+?/$') {
		Write-Host "Object: $($Object.Key)"
		(aws s3api head-object --bucket $Object.BucketName --key $Object.Key | Out-String | ConvertFrom-Json).metadata | Get-Member -Type NoteProperty | ForEach-Object { Write-Host ("   {0,-30}{1,-45}" -f "Key: $($_.Name)","Value: $($_.Definition.Split("=")[1])") }
	}
}
Edit: The above is specifically to get metadata from folder keys only, as I’m using that as a configuration space for various functions.

It's weird to mix the powershell toolkit and awscli like this IMO. If you're using the powershell toolkit, can you pipe into Get-S3ObjectMetadata? That will come back with something that powershell is more readily equipped to deal with (an instance of Amazon.S3.Model.MetadataCollection), and then you can use normal powershell stuff to grab attributes of the response without having to fight with json.

For an awscli-only approach I would recommend piping into jq.
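e.g. something like this; the jq filter is shown against a canned head-object response shape (the real JSON would come from the commented aws call, with your own bucket and key):

```shell
# In real use the JSON would come from:
#   aws s3api head-object --bucket my-bucket --key some/prefix/
# The filter flattens the user-metadata map to key=value lines.
printf '%s' '{"ContentLength": 0, "Metadata": {"env": "prod", "owner": "pierre"}}' \
  | jq -r '.Metadata | to_entries[] | "\(.key)=\(.value)"'
# prints:
# env=prod
# owner=pierre
```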

PierreTheMime
Dec 9, 2004

Hero of hormagaunts everywhere!
Buglord

12 rats tied together posted:

It's weird to mix the powershell toolkit and awscli like this IMO. If you're using the powershell toolkit, can you pipe into Get-S3ObjectMetadata? That will come back with something that powershell is more readily equipped to deal with (an instance of Amazon.S3.Model.MetadataCollection), and then you can use normal powershell stuff to grab attributes of the response without having to fight with json.

For an awscli-only approach I would recommend piping into jq.

I tried that originally, but the Get-S3ObjectMetadata cmdlet only contains a MetadataCollection, which contains keys, not the values. Frustratingly, MetadataEntry is right below this in the SDK doc and has the obviously-more-useful key/value pair, but it's not used here. This is especially annoying because the Java SDK can get values just fine.

It's always possible I'm missing something (because I often do), but I cannot find a way to get the metadata value from the non-cli cmdlets.

The object event adds the key/value pair in the .Add() method but the collection returned is just the key. :suicide:

PierreTheMime fucked around with this message at 17:41 on Jun 27, 2019

12 rats tied together
Sep 7, 2006

Great point -- looking closer at this cmdlet, it looks like they're bubbling the keys/values up into GetObjectMetadataResponse? For example, if you need ETag, it's right there (key and value) in the response object, not the metadata collection. Same thing with Last-Modified, ContentLength, etc.

It is super confusing that they would not exist, keys and values, in the metadata collection though. I do not like the powershell toolkit and I generally just use awscli/boto3. :kiddo:

PierreTheMime
Dec 9, 2004

Hero of hormagaunts everywhere!
Buglord
Yeah, I need to get around to learning Python. For that reason I'm setting aside time to convert some of my AWS-specific Java SDK stuff to boto3, half for experience and half because others around here are more comfortable with it.

I've come to the conclusion that working with Java and PowerShell primarily has given me a serious case of verbosity-poisoning, as the abbreviated/succinct style of Python just bothers me in a way I can't describe.

Mario
Oct 29, 2006
It's-a-me!
It looks like you can just read values by square-bracket indexing with the key, but this is documented as the Item property since that's the underlying CLR property name:

https://github.com/aws/aws-sdk-net/blob/1a5187e0dcddadba10a6b90fd827f2e485af0f9c/sdk/src/Services/S3/Custom/Model/MetadataCollection.cs#L38

PierreTheMime
Dec 9, 2004

Hero of hormagaunts everywhere!
Buglord

Mario posted:

It looks like you can just read values by square-bracket indexing with the key, but this is documented as the Item property since that's the underlying CLR property name:

https://github.com/aws/aws-sdk-net/blob/1a5187e0dcddadba10a6b90fd827f2e485af0f9c/sdk/src/Services/S3/Custom/Model/MetadataCollection.cs#L38

Ugh, that's so awkward but okay. Thanks for pointing that out.

This works:
PowerShell code:
$Objects = Get-S3Object -BucketName INSERTBUCKETNAMEHERE
ForEach($Object in $Objects) {
	If ($Object.Key -Match '.+?/$') {
		Write-Host "Object: $($Object.Key)"
		$Metadata = (Get-S3ObjectMetadata -Bucket $Object.BucketName -Key $Object.Key).Metadata
		ForEach ($Key in $Metadata.Keys) {
			Write-Host ("   {0,-45}{1,-45}" -f "Key: $Key","Value: $($Metadata[$Key])")
		}
	}
}

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Does AWS Powershell support, you know, actually querying the table in some fashion? Or is it just a meta-organizer tool?

I'm partway through hooking one up (I can't seem to provide a sort key that satisfies "the provided key element does not match the schema") and I saw this on a related StackOverflow:
code:
So I'm running a scan of a DynamoDB table with ~500 records using the AWS CLI through Powershell, because the AWS Powershell Tools don't support DDB query/scan operations
So have I been wasting my time?

Happiness Commando
Feb 1, 2002
$$ joy at gunpoint $$

AWS Powershell is definitely the bastard offspring. IDK if the rest of the API documentation is as terrible as the Powershell one is, but it's frequently wrong. Also, much of their getting-started documentation says "here's how to do it in the AWS CLI" and then ignores Powershell. Furthermore, there's no (official?) API-to-Powershell lookup table; I've had to click on my best guesses for the cmdlets I want and then read the body text to see what API calls each makes.

Also don't get me started on parameters that are documented as optional but are in fact mandatory and only have one allowed value.

PierreTheMime
Dec 9, 2004

Hero of hormagaunts everywhere!
Buglord
Yeah it really is surprising how long AWS has been around and still they’re cleaning up basic things, even in their most popular interfaces.

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

I sort of got it working using aws-cli but as mentioned the documentation is pretty dire for the powershell specific stuff. I got like 90% of the way there and will probably revisit it using native PS objects; the biggest stumbling block to me is that you must use the existing indexes as part of any query object, and it never really specifies where/how you should do that, so some random guess and test was involved. I'd probably have it working in PS, but one of the bright sparks who made the table (and didn't make any useful indexes) made one of the columns a reserved AWS keyword, and normal Powershell escaping doesn't work for getting around that.

I think there might be something to this guy's approach but I gave up fixing it halfway through as the impression I get is that it's no longer current:
https://www.powershellgallery.com/packages/domainAwsPowershellTools/1.0.2/Content/domainAwsPowershellTools.psm1

PierreTheMime
Dec 9, 2004

Hero of hormagaunts everywhere!
Buglord
Anyone have experience working with Google API service accounts? I can read Sheets data easily enough using an API key, but writing needs an OAuth token and it’s a pretty annoying process. I’ve had everything else working just fine using Invoke-RestMethod, but if I need to install the Google module I guess I can.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
Google "umn-Google powershell" and you'll find our module that includes OAuth stuff, authenticating either as you or as a service account token. It's only in a GitHub branch right now; look for something like gshit in the branch drop-down (I made a typo in the branch name and never fixed it!). It includes code that will do a lot of the work for you to create it. You still have to create the "application" or whatever in the Google developer console, but once you've got your client secret and app ID and set up your redirect URI properly, it'll do the rest of the work.

PierreTheMime
Dec 9, 2004

Hero of hormagaunts everywhere!
Buglord

FISHMANPET posted:

Google "umn-Google powershell" and you'll find our module that includes OAuth stuff, authenticating either as you or as a service account token. It's only in a GitHub branch right now; look for something like gshit in the branch drop-down (I made a typo in the branch name and never fixed it!). It includes code that will do a lot of the work for you to create it. You still have to create the "application" or whatever in the Google developer console, but once you've got your client secret and app ID and set up your redirect URI properly, it'll do the rest of the work.

Thanks, got it working from that. I was also trying to use the JSON credential it provided (as it was recommended), but I gave in and have a dependency on the .p12 cert now. Having worked with other oauth functions like Keycloak this still seems overly complicated, but it'll have to do.

LODGE NORTH
Jul 30, 2007

I don't even know if this is the right place to ask this, but I'm dealing with a dilemma.

I have this script to run here:

code:
for file in *; do
if [[ -f "$file" ]]; then
mkdir "${file%.*}"
mv "$file" "${file%.*}"
fi
done
The script essentially takes the file names of the files inside the directory I run the script, makes a folder or folders with the same name as the file or files, and puts those items in their respective folder. If two items share the same name (different extensions) then it puts all of them into one folder with the same name. Easy stuff.

However, I need to get this to run on 100 separate folders. The way I do it now is Folder1/Folder2/Folder3/files where Folder 3 is where I cd to in order to make the split folders with the filenames etc etc.

The branches where things separate are beneath Folder 1. Folder 1 houses 100 folders (collectively referred to as Folder(s) 2), but in each of those folders is a folder (Folder 3) with the files within them that I need to run the command above on.
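Since your snippet is shell: one sketch is to wrap your existing loop in a function and drive it from Folder1, so it visits every Folder1/&lt;Folder2&gt;/&lt;Folder3&gt; without cd-ing around by hand (the subshell keeps each cd from sticking; "Folder1" in the usage line is your top-level folder):

```shell
split_into_folders() {
    root=$1
    for dir in "$root"/*/*/; do
        (
            cd "$dir" || exit
            for file in *; do
                if [ -f "$file" ]; then
                    mkdir -p "${file%.*}"   # -p reuses the folder when names collide
                    mv "$file" "${file%.*}"
                fi
            done
        )
    done
}

# e.g. run from the directory above Folder1:
# split_into_folders Folder1
```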

LODGE NORTH fucked around with this message at 01:04 on Jul 18, 2019

PierreTheMime
Dec 9, 2004

Hero of hormagaunts everywhere!
Buglord
Are you looking to extend your current script in shell or do you want it extended/converted to Powershell?


mystes
May 31, 2006

LODGE NORTH posted:

I don't even know if this is the right place to ask this, but I'm dealing with a dilemma.

I have this script to run here:

code:
for file in *; do
if [[ -f "$file" ]]; then
mkdir "${file%.*}"
mv "$file" "${file%.*}"
fi
done
The script essentially takes the file names of the files inside the directory I run the script, makes a folder or folders with the same name as the file or files, and puts those items in their respective folder. If two items share the same name (different extensions) then it puts all of them into one folder with the same name. Easy stuff.

However, I need to get this to run on 100 separate folders. The way I do it now is Folder1/Folder2/Folder3/files where Folder 3 is where I cd to in order to make the split folders with the filenames etc etc.

The branches where things separate are beneath Folder 1. Folder 1 houses 100 folders (collectively referred to as Folder(s) 2), but in each of those folders is a folder (Folder 3) with the files within them that I need to run the command above on.

Use PowerShell, OP, it's Slightly Less Bad(TM).
