Inspector_666
Oct 7, 2003

benny with the good hair

The Fool posted:

Maybe a calculated property?

E: https://4sysops.com/archives/add-a-calculated-property-with-select-object-in-powershell/

Phone posting, but maybe I’ll write up an example when I get back to my desk.

Yeah, I was poking around in here, but I had trouble getting the actual values to spit out instead of the class, and making the whole collection into a variable is specifically warned against in the PowerShell output, since the Get-Group return is almost 4000 items. I think working with the values in Excel is going to be the best way forward.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
You've got the case where ManagedBy (assume that's what you mean by Owner?) has multiple values, and in that case for each user you'd want them to have their own row, so the end result of a group with 2 owners would be 2 rows with identical data except for ManagedBy?

This could be done on the pipeline but I think it needs nested for loops and that gets ugly fast.
code:
$groups = Get-Group -Identity * -ResultSize Unlimited | select DisplayName,SamAccountName,WindowsEmailAddress,ManagedBy
foreach ($group in $groups) {
    foreach ($owner in $group.ManagedBy) {
        [PSCustomObject]@{
            DisplayName = $group.DisplayName
            SamAccountName = $group.SamAccountName
            WindowsEmailAddress = $group.WindowsEmailAddress
            ManagedBy = $owner
        } | Export-Csv -Path "C:\Users\Documents\PSReporting\AllGroupsManagedBy.csv" -Append
    }
}
Totally written without running. I don't think a calculated property will work because essentially you want one item to come in from the pipeline and for 2 objects to come out, and I don't think calculated properties can do that?
E: There's no problem putting 4000 objects into a variable, especially if you're only selecting a few properties. I've got a regular job where one of the variables takes up 12GB of memory, and other than needing to make sure the system has enough memory, there are no problems.

FISHMANPET fucked around with this message at 22:52 on Feb 10, 2020

Inspector_666
Oct 7, 2003

benny with the good hair
Ideally I would just want 2 cells for the ManagedBy value, one with each name (yes, that's the "Owner" I'm talking about). Pretty much in my perfect world it would just feed the array into the CSV as-is, the comma would split the value into 2 cells, and I could just add another column header in the final output.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
Hmm, that doesn't seem very useful for doing anything but handing to humans to read (and I don't think calculated properties can do a dynamic number of properties anyway) so Excel may be your best bet.

If you knew the max number of owners you'd ever have, you could do something like @{label="Owner 1";expression={$_.ManagedBy[0]}} and so on up to that max...
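Spelled out (totally untested, and assuming you cap it at three owners; missing entries just come out as blank cells), that'd be something like:
code:
Get-Group -Identity * -ResultSize Unlimited |
    Select-Object DisplayName, SamAccountName, WindowsEmailAddress,
        @{label="Owner 1"; expression={$_.ManagedBy[0]}},
        @{label="Owner 2"; expression={$_.ManagedBy[1]}},
        @{label="Owner 3"; expression={$_.ManagedBy[2]}} |
    Export-Csv -Path "C:\Users\Documents\PSReporting\AllGroupsManagedBy.csv" -NoTypeInformation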

Inspector_666
Oct 7, 2003

benny with the good hair

FISHMANPET posted:

Hmm, that doesn't seem very useful for doing anything but handing to humans to read (and I don't think calculated properties can do a dynamic number of properties anyway) so Excel may be your best bet.

If you knew the max number of owners you'd ever have, you could do something like @{label="Owner 1";expression={$_.ManagedBy[0]}} and so on up to that max...

Yeah I may just keep playing with the data to get that max number, I doubt it's very high. The idea here is initially just getting it looked at by humans, but I figured if it was multiple cells on one line I can just reverse whatever split there is to feed it back into the system in the future if need be.

Also, that's good to know about the variable. PowerShell pops up a warning over 1000 objects and I just didn't want to accidentally nuke the remote box I'm working on.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
This is very bad and I feel bad for even writing it, and I'm not sure if it will actually work, but:
code:
Get-Group -Identity * -ResultSize Unlimited | ForEach-Object {
    # build one "Owner N" calculated property per owner, baking the index into the scriptblock text
    $owners = foreach ($i in 1..$($_.ManagedBy.Count)) {
        @{label="Owner $i"; expression=[scriptblock]::Create("`$_.ManagedBy[$($i - 1)]")}
    }
    $_ | Select-Object (@("DisplayName","SamAccountName","WindowsEmailAddress") + $owners)
} | Export-Csv -Path "C:\Users\Documents\PSReporting\AllGroupsManagedBy.csv" # headers come from the first object, so groups with more owners than the first will drop columns
In theory $owners will be an array of calculated property objects (Owner 1, Owner 2, etc.): for every group it builds one per owner, then selects the properties you wanted plus that dynamically built list of calculated properties.

Inspector_666
Oct 7, 2003

benny with the good hair
Ok I checked and at least one group has 31 values in the ManagedBy field so yeah I think I may have gotten out in front of the issue. Thanks for your help though.

Antigravitas
Dec 8, 2019

Die Rettung fuer die Landwirte:
I had this amazing idea: instead of putting the scripts file share (including the PowerShell modules) into the PATH, I'd use the existing functionality for providing a repository and installing/updating packages from it.

It's just a side project and every time I find some time to work on it I find another dumb garbage problem with it. On the menu today: Repositories.

So a user can add a repository using Register-PSRepository. A user can install a script or module system-wide using an elevated prompt.

The PSRepository is a user-level setting. A different user can see installed scripts/modules, but they can't update them because they have no idea which repository they came from.

You cannot define a repository system-wide. (Unless you are MS I guess, since the PSGallery repo appears for everyone).

The Get-PSRepository and Register-PSRepository cmdlets are also incredibly slow, so putting them into the default profile is a no-go. PowerShell is already unfathomably slow to start.


Who is the brain-dead jerk who designed this? Who thought this was good? Who are the assholes who jerk themselves raw over how great Powershell is? This is poo poo. All of it. There is a feature request on Github that is over two years old with no movement whatsofuckingever.

Antigravitas
Dec 8, 2019

Die Rettung fuer die Landwirte:
code:
$repo = @{
    SourceLocation = "\\redacted\PSRepo"
    Name = "PSRepo"
    InstallationPolicy = "Trusted"
}

if (Get-PSRepository $repo.Name -ErrorAction SilentlyContinue) {
    Set-PSRepository @repo
} 
else {
    Register-PSRepository @repo
}
This script runs in over 3 seconds.

Three.

Seconds.

Here is a non-exhaustive list of things that take less than 3 seconds:
  • Linux kernel initialisation from the EFI loading the binary to Linux invoking init, including the initramfs stage
  • Starting postgresql11 with a medium-sized database, from invoking the command to accepting connections
  • Starting apache2 with prefork, from invoking the command to accepting connections
  • Getting to the login screen of my desktop at home, from selecting the kernel to run from the grub menu


What the gently caress is Powershell doing that is more complicated than these things?!

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
The PowerShell Gallery isn't even system-wide; it's just set up to be added to every profile (it can even be removed from your profile). There's a verification step when adding a repository, along with a flag to skip verification; does that save any time?

Antigravitas
Dec 8, 2019

Die Rettung fuer die Landwirte:
I can't find any flag like that. I can specify the provider, but hardcoding "NuGet" doesn't speed it up at all.

Unrelated, while digging I found this file: "C:\Windows\System32\WindowsPowerShell\v1.0\Examples\profile.ps1"

Here's the full file content.

code:
#  Copyright (c) Microsoft Corporation.  All rights reserved.
#  
# THIS SAMPLE CODE AND INFORMATION IS PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND,
# WHETHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED
# WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE.
# IF THIS CODE AND INFORMATION IS MODIFIED, THE ENTIRE RISK OF USE OR RESULTS IN
# CONNECTION WITH THE USE OF THIS CODE AND INFORMATION REMAINS WITH THE USER.
Amazing sample. You fucks.

Pile Of Garbage
May 28, 2007



Antigravitas posted:

code:
$repo = @{
    SourceLocation = "\\redacted\PSRepo"
    Name = "PSRepo"
    InstallationPolicy = "Trusted"
}

if (Get-PSRepository $repo.Name -ErrorAction SilentlyContinue) {
    Set-PSRepository @repo
} 
else {
    Register-PSRepository @repo
}
This script runs in over 3 seconds.

Three.

Seconds.

Here is a non-exhaustive list of things that take less than 3 seconds:
  • Linux kernel initialisation from the EFI loading the binary to Linux invoking init, including the initramfs stage
  • Starting postgresql11 with a medium-sized database, from invoking the command to accepting connections
  • Starting apache2 with prefork, from invoking the command to accepting connections
  • Getting to the login screen of my desktop at home, from selecting the kernel to run from the grub menu


What the gently caress is Powershell doing that is more complicated than these things?!

OK I've done some digging and there are a couple of things going on here. First off, when you run that first *-PSRepository cmdlet, PS automatically imports the PowerShellGet and PackageManagement modules into the session. On my system that takes ~700ms:

code:
PS C:\> (Measure-Command -Expression { Import-Module -Name PowerShellGet,PackageManagement }).TotalMilliseconds
717.8508
Next, the dumbest thing: when you run any of the cmdlets exported by the PowerShellGet module (e.g. Get-PSRepository), an internal method named Check-PSGalleryApiAvailability is called. This method pings www.microsoft.com and then does an HTTP GET to the PSGallery repository to check connectivity. The check is only performed once per session (the status is stored internally in $Script:PSGalleryApiChecked) but adds about 1100ms to the execution time, which is why subsequent cmdlet executions are faster:

code:
PS C:\> Import-Module -Name PowerShellGet,PackageManagement
PS C:\> (Measure-Command -Expression { Get-PSRepository }).TotalMilliseconds
1741.4454
PS C:\> (Measure-Command -Expression { Get-PSRepository }).TotalMilliseconds
564.4145
So yeah, there's your ~3 seconds. Unfortunately I don't have any suggestions to alleviate the situation...

Edit: one idea, not sure how well supported it is but you could try just manipulating the PSRepositories.xml file itself under %LocalAppData%\Microsoft\Windows\PowerShell\PowerShellGet\PSRepositories.xml. It looks fairly straight-forward and it would be a hell of a lot faster to modify that file instead of using the cmdlets.

Pile Of Garbage fucked around with this message at 15:27 on Feb 12, 2020

Antigravitas
Dec 8, 2019

Die Rettung fuer die Landwirte:
Thank you for digging. I hate it.

I wonder how many issues we could solve by just nullrouting all Microsoft IP space at the edge…

Antigravitas
Dec 8, 2019

Die Rettung fuer die Landwirte:

Pile Of Garbage posted:

Edit: one idea, not sure how well supported it is but you could try just manipulating the PSRepositories.xml file itself under %LocalAppData%\Microsoft\Windows\PowerShell\PowerShellGet\PSRepositories.xml. It looks fairly straight-forward and it would be a hell of a lot faster to modify that file instead of using the cmdlets.

Heh, yes, I just found it. I have absolutely zero qualms about editing that myself.


http://schemas.microsoft.com/powershell/2004/04

quote:

The resource you are looking for has been removed, had its name changed, or is temporarily unavailable.



//edit:

For future reference, that's apparently "CLIxml" and it's garbage. As usual. At least there are cmdlets for it…

Antigravitas fucked around with this message at 15:36 on Feb 12, 2020

Pile Of Garbage
May 28, 2007



Antigravitas posted:

Heh, yes, I just found it. I have absolutely zero qualms about editing that myself.


http://schemas.microsoft.com/powershell/2004/04




The file is just standard CLI XML, so you can import it with Import-Clixml, make whatever changes you want to the object, and then export it again with Export-Clixml. For example, changing the SourceLocation of the FileShareRepo repo (note that the changes won't be visible until the PowerShellGet module is reloaded, because it caches the repositories):

code:
$PSRepositoriesPath = Join-Path -Path $env:LOCALAPPDATA -ChildPath 'Microsoft\Windows\PowerShell\PowerShellGet\PSRepositories.xml'
$PSRepositories = Import-Clixml -Path $PSRepositoriesPath
$PSRepositories.FileShareRepo.SourceLocation = '\\example\path'
$PSRepositories | Export-Clixml -Path $PSRepositoriesPath
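To pick up a hand-edit in the same session you'd presumably also have to force the module to reload, since it keeps the repository list in a script-scoped variable. Something like:
code:
Remove-Module -Name PowerShellGet, PackageManagement -Force
Import-Module -Name PowerShellGet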
Edit:

Antigravitas posted:

//edit:

For future reference, that's apparently "CLIxml" and it's garbage. As usual. At least there are cmdlets for it…

:lol: it's fine, just an object serialisation format.

Pile Of Garbage fucked around with this message at 15:46 on Feb 12, 2020

Antigravitas
Dec 8, 2019

Die Rettung fuer die Landwirte:

Pile Of Garbage posted:

:lol: it's fine, just an object serialisation format.

For sanity testing I just round-tripped the serialised objects and the file changed significantly. At least NuGet still seems to be able to digest it.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams

Antigravitas posted:

I can't find any flag like that. I can specify the provider, but hardcoding "NuGet" doesn't speed it up at all.

Unrelated, while digging I found this file: "C:\Windows\System32\WindowsPowerShell\v1.0\Examples\profile.ps1"

Here's the full file content.

code:
#  Copyright (c) Microsoft Corporation.  All rights reserved.
#  
# THIS SAMPLE CODE AND INFORMATION IS PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND,
# WHETHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED
# WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE.
# IF THIS CODE AND INFORMATION IS MODIFIED, THE ENTIRE RISK OF USE OR RESULTS IN
# CONNECTION WITH THE USE OF THIS CODE AND INFORMATION REMAINS WITH THE USER.
Amazing sample. You fucks.

OK, I was thinking of Register-PackageSource, which has a SkipValidate flag. Somehow Package Sources are related to PS Repositories in ways I don't understand, but I think behind the scenes adding a PS Repository first registers a Package Source. But Pile of Garbage did the deep digging and I don't think that would save you anything.
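For reference, that'd be something like this (if I'm reading the docs right, SkipValidate is a dynamic parameter the NuGet provider adds):
code:
Register-PackageSource -Name PSRepo -Location '\\redacted\PSRepo' -ProviderName NuGet -Trusted -SkipValidate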

CzarChasm
Mar 14, 2009

I don't like it when you're watching me eat.
I'm trying to build a program using PowerShell that will track Exchange emails going to and from salesmen and export the details to a text file. So: run the program, open the mailbox, open the Inbox and Sent folders, read the To, From, Date and Body of each message, and write that info to a line in a text file. Read the next message, repeat. I have another program that will read the text file, compare the email address from that list to our customer database, and update the record for that customer with those details.

I have a program right now that does the first half fairly well. Open mailbox, read messages, get info, export to text file. That's fine. The catch with the program is that the user ID and password need to be entered. I can hard-code the credentials into the script, and it runs, but it will only run for that user. I can (in theory) publish this script out to everyone and say "Run this, put in your credentials when prompted" and it will go to the Exchange server and pull the needed data to build an individual text file. Then I could combine all the individual text files and import from there.

However, my boss wants this completely automated. So the program would run from the server, go through every mailbox, build a single text file and then have program 2 scrape and update the customer database. But the problem comes in with the credentials again. Either I'd have to have the user IDs and passwords documented somewhere, and it would have to do a loop based off of a list of users, or I'd need some kind of administrator override that can access every mailbox.

Option one sucks and is the opposite of secure since it would not only require user passwords to be contained in plain text, but as soon as a user's system password was updated (or if anyone is hired or leaves) the list of credentials would need to be updated.

Option 2 is best because if there is an admin account that can access all email accounts, then it doesn't matter what passwords those accounts have or whether anyone is hired or leaves. The program just loops until there are no more mailboxes. But I don't know how to manage that or if it is even a possibility. I am an administrator, and I can (with some manipulation) get into everyone's mailbox on my Exchange server, but I don't see a way to do this programmatically. I just don't even know where to start or if this is even possible.

The Fool
Oct 16, 2003


CzarChasm posted:

Option 2 is best because if there is an admin account that can access all email accounts, then it doesn't matter what passwords those accounts have or whether anyone is hired or leaves. The program just loops until there are no more mailboxes. But I don't know how to manage that or if it is even a possibility. I am an administrator, and I can (with some manipulation) get into everyone's mailbox on my Exchange server, but I don't see a way to do this programmatically. I just don't even know where to start or if this is even possible.

1. Don't store credentials in your script
2. When you load your credentials, use securestring (see the sketch after this list)
3. Make a service account
4. In the exchange admin center, make a role that gives the minimum level of access you need
5. Assign the account that role
(optional) 6. Do your automation in azure devops pipelines so you can store your credential in an azure key vault or as a secure environment variable
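A minimal sketch of points 1-3 (the path and service-account name here are made up; Export-Clixml protects the password with DPAPI, so only the same user on the same machine can read it back):
code:
# one-time setup, run as the account the scheduled task will use
Get-Credential -UserName 'DOMAIN\svc-mailscrape' -Message 'Service account' |
    Export-Clixml -Path 'C:\Scripts\svc-mailscrape.cred'

# in the script itself
$cred = Import-Clixml -Path 'C:\Scripts\svc-mailscrape.cred'  # [pscredential] with a SecureString password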

Submarine Sandpaper
May 27, 2007


That makes me think: how useful is the built-in ConvertFrom-SecureString if you don't have a vault service or something similar to pull central creds from?

Zaepho
Oct 31, 2013

The Fool posted:

1. Don't store credentials in your script

To piggyback on this, the timing is perfect. Check out the new module that's in pre-release: https://www.powershellgallery.com/packages/Microsoft.PowerShell.SecretsManagement/0.2.0-alpha1 It handles a lot of the credentials stuff for you in a secure fashion.

Ghostnuke
Sep 21, 2005

Throw this in a pot, add some broth, a potato? Baby you got a stew going!


I'm new to this whole thing, but I'm working on teaching myself. I'm about 20 lessons into the cbt nuggets program, so I know this should be easily (?) done but I haven't gotten far enough in to figure it out yet. Maybe you guys can help?


I need to back up a local folder to a networked folder, and hopefully only copy over the files that are new/changed. I can set up task scheduler to run it when the user wants. The only hiccup is that it has to run on a windows 7 machine, not sure if that handicaps any of the cmdlets that would be normally used. Any ideas?

The Fool
Oct 16, 2003


Submarine Sandpaper posted:

That makes me think, how useful is the built in convert fromsecurestring if you don't have a vault service or otherwise to pull central creds from?

It works but it's clunky. You can generate and save encrypted credentials to a text file, then reload them. If you don't specify a key, they can only be decrypted by the same user on the same machine they were generated on. If you do manually specify a key, you need to store that somehow too.
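A rough sketch of that pattern (the path and account name are placeholders; without -Key this uses DPAPI, so only the same user on the same machine can decrypt it):
code:
# save once
Read-Host -Prompt 'Password' -AsSecureString |
    ConvertFrom-SecureString | Set-Content -Path 'C:\Scripts\cred.txt'

# load later
$secure = Get-Content -Path 'C:\Scripts\cred.txt' | ConvertTo-SecureString
$cred = [pscredential]::new('DOMAIN\svc-account', $secure)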

Zaepho posted:

To piggyback on this, the timing is perfect. Check out the new module that's in pre-release: https://www.powershellgallery.com/packages/Microsoft.PowerShell.SecretsManagement/0.2.0-alpha1 It handles a lot of the credentials stuff for you in a secure fashion.

Yeah, that's super cool.
I haven't had a chance to play with it, but Snover tweeted a thing about this being used to integrate with KeePass last week.


Ghostnuke posted:

I'm new to this whole thing, but I'm working on teaching myself. I'm about 20 lessons into the cbt nuggets program, so I know this should be easily (?) done but I haven't gotten far enough in to figure it out yet. Maybe you guys can help?


I need to back up a local folder to a networked folder, and hopefully only copy over the files that are new/changed. I can set up task scheduler to run it when the user wants. The only hiccup is that it has to run on a windows 7 machine, not sure if that handicaps any of the cmdlets that would be normally used. Any ideas?

As long as WMF 5.1 is installed, you should be fine. If not, you will need to make sure you are only using cmdlets that are for the version of powershell you are targeting. You can see the requirements here: https://docs.microsoft.com/en-us/powershell/scripting/install/windows-powershell-system-requirements?view=powershell-7

For your specific task, you may be better off just using Robocopy.
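Something along these lines in the scheduled task would cover it (paths are placeholders; by default robocopy only copies files that are new or changed):
code:
robocopy C:\Data \\server\share\Backup /E /R:2 /W:5 /LOG+:C:\Logs\backup.log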

CzarChasm
Mar 14, 2009

I don't like it when you're watching me eat.

The Fool posted:

3. Make a service account
4. In the exchange admin center, make a role that gives the minimum level of access you need
5. Assign the account that role

OK, so build a service (AD?) account, go to the Exchange admin center, make a role that has access to all users' mailboxes, and assign that role to the AD user I created. Then have it run through all the mailboxes on the Exchange server as part of my script?

The Fool
Oct 16, 2003


pretty much, yeah

Zaepho
Oct 31, 2013

The Fool posted:

Yeah, that's super cool.
I haven't had a chance to play with it, but Snover tweeted a thing about this being used to integrate with KeePass last week.

I missed the KeePass tweet. That sounds pretty awesome! I'd love for Thycotic to build a vault extension for it, since we use that internally. Having a common module where we just choose the backend would be perfect for our use case of dragging templates/samples around between customers/projects.

I also haven't had a chance to put it to use yet. It would have been huge a couple projects back to simplify managing all the service accounts we were using for the AD Migration processes we were running. I'm keeping my eye open for opportunities to leverage it in the future though.

CzarChasm
Mar 14, 2009

I don't like it when you're watching me eat.
Working on a different but related script. I'm sure I'm doing this rear end backwards, but I can't seem to google the correct terms to do what I want.

In short, I need a counter that starts at a specific number and increments by 1 each time an email is processed. So, if I start my counter at 100, and process 15 emails, the counter should be at 115 at the end, and then the next time the program is run, it starts at 115. I have to start it at a specific value because this program is going to be a continuation of another program that has been running for almost 2 years now, and the counter is used as an identifier.

My first thought is to start with $counter = 100 at the start of the program, but of course, every time the program is started fresh, that counter is going to go back to 100 again.

The way I've set this up in the program is to have a single-line text file that holds my counter. One of the first things the program does is read the value from this text file and put it into the variable. Then after the processing is done, the last thing it does before ending is take the incremented value and write it back out to the same text file. This works, but my concern is that if my program crashes at any point before the counter is written back to the text file, my count is going to be off, and that could lead to big problems. I could write the value back to the text file immediately after the counter is incremented, and in that case we'd be talking about maybe one duplicate rather than a whole day's worth.

But I still feel that I must be missing something obvious.

slartibartfast
Nov 13, 2002
:toot:
Is there a way you can look up the current value of the other program's counter as the first thing your script does?

Antigravitas
Dec 8, 2019

Die Rettung fuer die Landwirte:
Write a journal. One line per processed mail with some metadata, i.e. serial, message ID, processing start and end time.

You can then find the point where you left off by looking at the last line of the file if your processing queue is sorted the same every time you run. If not, you can identify all the mails that were already processed by reading the entire file.

Make sure the journal is human-readable; it helps with debugging.
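A rough sketch of what each journal append could look like ($mail and $started are placeholders for whatever your processing loop already has, and the message-ID property name depends on your mail API):
code:
# append one journal line per processed mail
[PSCustomObject]@{
    Serial    = $counter
    MessageId = $mail.MessageId             # placeholder property name
    Started   = $started.ToString('o')      # processing start time captured earlier
    Finished  = (Get-Date).ToString('o')
} | Export-Csv -Path 'C:\Scripts\journal.csv' -Append -NoTypeInformation

# on the next run, pick up where you left off
$last = Import-Csv -Path 'C:\Scripts\journal.csv' | Select-Object -Last 1
$counter = [int]$last.Serial + 1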

MJP
Jun 17, 2007

Are you looking at me Senpai?

Grimey Drawer
What's the next actual physical book that covers the next level from Learn Powershell in a Month of Lunches? Any time I've been working with scripts beyond what I could do from that book, I'm confronted with terminology that I don't quite get. It's both Powershell functionality but also programmer/dev/coder terminology that I don't have - my only coding background was BASIC and Logowriter.

I'm the kind of person that learns best in an actual classroom environment, but since nobody sends people for classroom training (or at least nobody in the places I've worked) my next best learning method has been to work off of a book. Is there any book that serves as the next level of Powershell, assuming a no-coder sysadmin background, after the original Powershell in a Month of Lunches?

Toast Museum
Dec 3, 2005

30% Iron Chef

MJP posted:

What's the next actual physical book that covers the next level from Learn Powershell in a Month of Lunches? Any time I've been working with scripts beyond what I could do from that book, I'm confronted with terminology that I don't quite get. It's both Powershell functionality but also programmer/dev/coder terminology that I don't have - my only coding background was BASIC and Logowriter.

I'm the kind of person that learns best in an actual classroom environment, but since nobody sends people for classroom training (or at least nobody in the places I've worked) my next best learning method has been to work off of a book. Is there any book that serves as the next level of Powershell, assuming a no-coder sysadmin background, after the original Powershell in a Month of Lunches?

PowerShell Scripting in a Month of Lunches by the same authors. Formerly called PowerShell Toolmaking in a Month of Lunches. Full disclosure, I got sidetracked by actual scripting projects and never finished this one, but it seems like a good resource.

Edit: also, see if you have access to something like LinkedIn Learning (formerly Lynda) or Pluralsight through your employer, local library, etc. In my county, for instance, anyone with a library card has free access to LinkedIn Learning, and not just on-site.

Toast Museum fucked around with this message at 16:45 on Feb 26, 2020

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
I don't have any specific course recommendations, but udemy.com is an interesting resource. It's pre-recorded lectures and some activities, but you can usually find a sale where you'll get a course for under $15. It's also somewhat crowd-sourced (I'm not exactly sure how they "vet" their instructors), which is why I can't recommend a particular course: I haven't gone through any PowerShell ones, and so much depends on the strength of the instructor. I did a course on Docker/Kubernetes that was good, but that was down to the instructor. And if you like classroom instruction for the ability to interact with classmates and the instructor in real time, you won't get that from something like Udemy.

adaz
Mar 7, 2009

CzarChasm posted:

Working on a different but related script. I'm sure I'm doing this rear end backwards, but I can't seem to google the correct terms to do what I want.

In short, I need a counter that starts at a specific number and increments by 1 each time an email is processed. So, if I start my counter at 100, and process 15 emails, the counter should be at 115 at the end, and then the next time the program is run, it starts at 115. I have to start it at a specific value because this program is going to be a continuation of another program that has been running for almost 2 years now, and the counter is used as an identifier.

My first thought is to start with $counter = 100 at the start of the program, but of course, every time the program is started fresh, that counter is going to go back to 100 again.

The way I've set this up in the program is to have a single-line text file that holds my counter. One of the first things the program does is read the value from this text file and put it into the variable. Then after the processing is done, the last thing it does before ending is take the incremented value and write it back out to the same text file. This works, but my concern is that if my program crashes at any point before the counter is written back to the text file, my count is going to be off, and that could lead to big problems. I could write the value back to the text file immediately after the counter is incremented, and in that case we'd be talking about maybe one duplicate rather than a whole day's worth.

But I still feel that I must be missing something obvious.

As someone else mentioned, a single-line text file is the easiest way to solve this and the most intuitive/easy to debug. You can add gradually increasing layers of complexity if you desire (databases and so forth), but...

As for your worry about the transaction failing, my suggestion is to isolate the email-processing logic inside some error handling. Depending on your business requirements...

pre:
[int]$counter = Get-Content C:\blah\sometextfile.txt

try {
    # get our email here and do some poo poo
}
finally {
    # increment the counter and write it back out, even if processing blew up
    ++$counter
    Set-Content -Path C:\blah\sometextfile.txt -Value $counter
}

If #3 should _always_ increment the counter regardless of whether it succeeds or fails, wrap it in try/finally with the finally block in charge of incrementing the counter. That will mean you won't ever process that email again, of course.

Otherwise you can use date/time shenanigans: just write something like the last processed time to a file and rely on a search for emails sent after that time, instead of a raw counter.
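A quick sketch of that alternative (the file path is made up; the exact date filter depends on whatever mail API you're querying):
code:
# at the end of a successful run
(Get-Date).ToString('o') | Set-Content -Path C:\blah\lastrun.txt

# at the start of the next run
$since = [datetime]::Parse((Get-Content -Path C:\blah\lastrun.txt))
# ...then only fetch messages received after $since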

CzarChasm
Mar 14, 2009

I don't like it when you're watching me eat.
Thanks again for all the help everyone.

Dirt Road Junglist
Oct 8, 2010

We will be cruel
And through our cruelty
They will know who we are

Toast Museum posted:

PowerShell Scripting in a Month of Lunches by the same authors. Formerly called PowerShell Toolmaking in a Month of Lunches. Full disclosure, I got sidetracked by actual scripting projects and never finished this one, but it seems like a good resource.

Edit: also, see if you have access to something like LinkedIn Learning (formerly Lynda) or Pluralsight through your employer, local library, etc. In my county, for instance, anyone with a library card has free access to LinkedIn Learning, and not just on-site.

Seconding. These are my two primary PowerShell guides, aside from SS64 for syntax once in a while.

Djimi
Jan 23, 2004

I like digital data
Hello there group. I am scratching my head on this, and it's probably a dumb assumption on my part, but I don't know.
This is not really my code - I found most of it pretty quickly, and if you search on the variables used in the main loop you'll find it on scripting sites/Scripting Guy etc.

Objective: I was hoping it would pull some information I've been asked to retrieve, specifically metadata from Office Excel documents.

Here's the code and the error I'm getting:
code:
$extx= ".xlsx"
$Dir = (get-childitem -Path "p:\myfolder\" -recurse -force | ? {$_.Extension -eq $ext -or $_.Extension -eq $extx})

write-host "Here's what we're going to inspect: "
$Dir
#save output
$Save = ".\excel_info_" + $time + ".csv"


$AryProperties = "Title","Author","Creation Date","Last Save Time"
$application = New-Object -ComObject excel.application
$application.Visible = $false #to prevent the document you open to show

$binding = "System.Reflection.BindingFlags" -as [type]
[ref]$SaveOption = "microsoft.office.interop.Excel.WdSaveOptions" -as [type]

Foreach($doc in $Dir)
 {
 write-host "Doc.fullname is " $doc.fullname
  $document = $application.documents.open($doc.fullname)
  $BuiltinProperties = $document.BuiltInDocumentProperties
  write-host "Built in properties are: " $BuiltInDocumentProperties 
  $objHash = @{"Path"=$doc.FullName}
   foreach($p in $AryProperties)
    {Try 
     { 
      $pn = [System.__ComObject].invokemember("item",$binding::GetProperty,$null,$BuiltinProperties,$p) 
      $value = [System.__ComObject].invokemember("value",$binding::GetProperty,$null,$pn,$null)
      $objHash.Add($p,$value) }
     Catch [system.exception]
      { write-host -foreground blue "Value not found for $p" } }
   $docProperties = New-Object psobject -Property $objHash
   $docProperties | ft | out-file $save -append
   $docProperties | ft
   $document.close([ref]$saveOption::wdDoNotSaveChanges) 
   [System.Runtime.InteropServices.Marshal]::ReleaseComObject($BuiltinProperties) | Out-Null
   [System.Runtime.InteropServices.Marshal]::ReleaseComObject($document) | Out-Null
   Remove-Variable -Name document, BuiltinProperties
   }

$application.quit()
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($application) | Out-Null
Remove-Variable -Name application
[gc]::collect()
[gc]::WaitForPendingFinalizers()

+++++++++++++++++++++++++
Sample error and output:
+++++++++++++++++++++++++

You cannot call a method on a null-valued expression.
At p:\myfolder\scripts\Get-Info-Excel:18 char:3
+   $document = $application.documents.open($doc.fullname)
+   ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (:) [], RuntimeException
    + FullyQualifiedErrorId : InvokeMethodOnNull

Built in properties are:
Value not found for Title
Value not found for Author
Value not found for Creation Date
Value not found for Last Save Time
I'm fairly certain that my documents all have all of this metadata. I see the properties in the GUI.
System is Windows 10, PowerShell version 5. My hunch is that it's down to the security of the file - it's not opening correctly, or it's opening as 'read-only', which Office does now. Anyway - if somebody knows something, TIA. :tipshat:

Zaepho
Oct 31, 2013

Djimi posted:

I'm fairly certain that my documents all have all of this metadata. I see the properties in the GUI.
System is Windows 10, PowerShell version 5. My hunch is that it's down to the security of the file - it's not opening correctly, or it's opening as 'read-only', which Office does now. Anyway - if somebody knows something, TIA. :tipshat:

First up, add some logging to find out what's going on. Some Write-Host lines with useful information (what file is it trying to open when it spits out the error?) can go a LONG way.
Second, you're failing on line 18 of the script, either because $doc.FullName is null or because the application object is no good. Run some of the relevant bits of code interactively, check out the objects and their members, and see what you can find. Get-Member is definitely your friend here.

Finally, talking to Office via COM sucks. Check out the ImportExcel module and see if it has what you're looking for: https://github.com/dfinke/ImportExcel You may have to go digging around in the objects it returns to find your properties, but I was able to do something similar with Word using a different but similar module.
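For the interactive poking, something like this (just a debugging sketch) would show pretty quickly whether the pieces on the failing line are what you think they are:
code:
# grab the first matching file and check the bits used on line 18
$doc = $Dir | Select-Object -First 1
$doc.FullName                                    # is this actually populated?
$application | Get-Member -MemberType Property   # what collections does the Excel COM object actually expose?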

Djimi
Jan 23, 2004

I like digital data

Zaepho posted:

First up, add some logging to find out what's going on. Some write-host lines with useful information (what file is it trying to open when it spits out the error?) can go a LONG way
Second, You're failing on line 18 of the script. either because $Doc.Fullname is null or that application object is no good. run some of the relevant bits of code interactively and check out the objects and their members and see what you can find. Get-Member is definitely your friend here.

Finally, talking to office via Com sucks. Check out the ImportExcel module and see if it has what you're looking for. https://github.com/dfinke/ImportExcel You may have to go digging around in the objects it returns to find your properties but I was able to do similar with Word using a different but similar module.

The code does output the $doc filename from $Dir in the loop. Is my type wrong?
I do have about three documents in the directory that match, and that part is working. I will use Start-Transcript and add more logging. I'll also grab the GitHub module, go from there, and report back. Thank you! :v:

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

CzarChasm posted:

Working on a different but related script. I'm sure I'm doing this rear end backwards, but I can't seem to google the correct terms to do what I want.

In short, I need a counter that starts at a specific number and increments by 1 each time an email is processed. So, if I start my counter at 100, and process 15 emails, the counter should be at 115 at the end, and then the next time the program is run, it starts at 115. I have to start it at a specific value because this program is going to be a continuation of another program that has been running for almost 2 years now, and the counter is used as an identifier.

My first thought is to start with $counter = 100 at the start of the program, but of course, every time the program is started fresh, that counter is going to go back to 100 again.

The way I've set this up in the program is to have a single-line text file that holds my counter. One of the first things the program does is read the value from this text file and put it into the variable. Then after the processing is done, the last thing it does before ending is take the incremented value and write it back out to the same text file. This works, but my concern is that if my program crashes at any point before the counter is written back to the text file, my count is going to be off, and that could lead to big problems. I could write the value back to the text file immediately after the counter is incremented, and in that case we'd be talking about maybe one duplicate rather than a whole day's worth.

But I still feel that I must be missing something obvious.

Use a registry key?

ChubbyThePhat
Dec 22, 2006

Who nico nico needs anyone else
Ah PowerShell 7 now has a ternary operator.... oh boy.
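In case anyone hasn't run into the syntax yet, it's the usual condition ? if-true : if-false shape ($bytes here is just a placeholder, and it only parses on 7+):
code:
$label = ($bytes -gt 1GB) ? 'big file' : 'small file'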
