BaseballPCHiker
Jan 16, 2006

I've been trying to come up with a way to see a list of installed software from remote users:

code:
Function Get-InstalledSoftware
{
	Param
	(
		[Alias('Computer','ComputerName','HostName')]
		[Parameter(ValueFromPipeline=$True,ValueFromPipelineByPropertyName=$true,Position=1)]
		[string[]]$Name = $env:COMPUTERNAME
	)
	Begin
	{
		$LMkeys = "Software\Microsoft\Windows\CurrentVersion\Uninstall","SOFTWARE\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall"
		$LMtype = [Microsoft.Win32.RegistryHive]::LocalMachine
		$CUkeys = "Software\Microsoft\Windows\CurrentVersion\Uninstall"
		$CUtype = [Microsoft.Win32.RegistryHive]::CurrentUser
		
	}
	Process
	{
		ForEach($Computer in $Name)
		{
			$MasterKeys = @()
			If(!(Test-Connection -ComputerName $Computer -Count 1 -Quiet))
			{
				Write-Error -Message "Unable to contact $Computer. Please verify its network connectivity and try again." -Category ObjectNotFound -TargetObject $Computer
				Continue	# skip this computer rather than aborting the whole loop
			}
			$CURegKey = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey($CUtype,$computer)
			$LMRegKey = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey($LMtype,$computer)
			ForEach($Key in $LMkeys)
			{
				$RegKey = $LMRegKey.OpenSubkey($key)
				If($RegKey -ne $null)
				{
					ForEach($subName in $RegKey.getsubkeynames())
					{
						foreach($sub in $RegKey.opensubkey($subName))
						{
							$MasterKeys += (New-Object PSObject -Property @{
							"ComputerName" = $Computer
							"Name" = $sub.getvalue("displayname")
							"SystemComponent" = $sub.getvalue("systemcomponent")
							"ParentKeyName" = $sub.getvalue("parentkeyname")
							"Version" = $sub.getvalue("DisplayVersion")
							"UninstallCommand" = $sub.getvalue("UninstallString")
							})
						}
					}
				}
			}
			ForEach($Key in $CUKeys)
			{
				$RegKey = $CURegKey.OpenSubkey($Key)
				If($RegKey -ne $null)
				{
					ForEach($subName in $RegKey.getsubkeynames())
					{
						foreach($sub in $RegKey.opensubkey($subName))
						{
							$MasterKeys += (New-Object PSObject -Property @{
							"ComputerName" = $Computer
							"Name" = $sub.getvalue("displayname")
							"SystemComponent" = $sub.getvalue("systemcomponent")
							"ParentKeyName" = $sub.getvalue("parentkeyname")
							"Version" = $sub.getvalue("DisplayVersion")
							"UninstallCommand" = $sub.getvalue("UninstallString")
							})
						}
					}
				}
			}
			$MasterKeys = ($MasterKeys | Where {$_.Name -ne $Null -AND $_.SystemComponent -ne "1" -AND $_.ParentKeyName -eq $Null} | select Name,Version,ComputerName,UninstallCommand | sort Name)
			$MasterKeys
		}
	}
	End
	{
		
	}
}
This is based off of something found on Spiceworks. It works great locally; however, I'm having problems getting it to work remotely. From what I've gathered, I would need to enable the Remote Registry service first on any computer I wanted this to work with. I'm leery of setting a group policy to allow this because my company really likes to keep things locked down and I don't want to make any waves. I know I shouldn't use Win32_Product, but I think that would let me run this remotely without any issues. Is there any other way to go about this, or should I just set a GPO and run this?


Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy

BaseballPCHiker posted:

This is based off of something found on Spiceworks. It works great locally; however, I'm having problems getting it to work remotely. From what I've gathered, I would need to enable the Remote Registry service first on any computer I wanted this to work with. I'm leery of setting a group policy to allow this because my company really likes to keep things locked down and I don't want to make any waves. I know I shouldn't use Win32_Product, but I think that would let me run this remotely without any issues. Is there any other way to go about this, or should I just set a GPO and run this?
http://blogs.technet.com/b/askds/archive/2012/04/19/how-to-not-use-win32-product-in-group-policy-filtering.aspx

Have a look at this link. Look at the MOF file which lets you register a custom WMI class based on the registry provider. If you register that on each machine (once), you can use remote WMI calls to do this quickly and easily.

Of course the problem becomes how to register it on each machine, but it's a worthwhile endeavor.

If you can't do that, you can try to use PowerShell remoting, but that too has to be enabled on every machine, just like remote registry access would.
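
If registering the MOF everywhere turns out to be a hassle, there's a middle ground worth knowing about: the built-in StdRegProv WMI class can read the registry over plain remote WMI/DCOM, without the Remote Registry service. A rough sketch (computer name and key path assumed from the post above):

```powershell
# Sketch: enumerate DisplayName values under the Uninstall key on a remote
# machine via the stock StdRegProv class (no Remote Registry service needed).
$HKLM = [uint32]2147483650  # HKEY_LOCAL_MACHINE
$path = 'SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall'

$reg = Get-WmiObject -List StdRegProv -Namespace root\default -ComputerName $Computer

foreach ($sub in ($reg.EnumKey($HKLM, $path)).sNames) {
    $name = ($reg.GetStringValue($HKLM, "$path\$sub", 'DisplayName')).sValue
    if ($name) { $name }
}
```

Same data as the remote-registry approach, just carried over the WMI channel instead.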

Hadlock
Nov 9, 2004

BaseballPCHiker posted:

I've been trying to come up with a way to see a list of installed software from remote users:

code:
Function Get-InstalledSoftware
{  ..  stuff ... }
This is based off of something found on spiceworks. It works great locally, however I'm having problems with getting it to work remotely. From what I've gathered I would need to enable the remote registry service first on any computer I wanted this to work with. I'm leery to set a group policy to allow this because my company really likes to keep things locked down and I don't want to make any waves. I know I shouldn't use Win32_Product but I think that would let me run this remotely without any issues. Is there any other way to go about this or should I just set a GPO and run this?

Do you have Active Directory? You should have a master admin account of some sort that they use to push out Windows updates. Use those credentials (or a similar admin-level account), make sure PS remoting is turned on (Enable-PSRemoting), then just wrap the whole script/function in a scriptblock and do an

pseudocode
code:
Invoke-Command -ComputerName $servername -ScriptBlock $scriptblock -ArgumentList $a,$b,$c
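
One hedged way to wire that up without retyping anything on the remote end (assuming Get-InstalledSoftware is already defined in your local session) is to pass the function body itself as the scriptblock:

```powershell
# Sketch: run the locally defined function on the remote machine.
# ${function:Name} retrieves a function's body as a [scriptblock].
Invoke-Command -ComputerName $servername -ScriptBlock ${function:Get-InstalledSoftware}
```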

BaseballPCHiker
Jan 16, 2006

Briantist posted:

http://blogs.technet.com/b/askds/archive/2012/04/19/how-to-not-use-win32-product-in-group-policy-filtering.aspx

Have a look at this link. Look at the MOF file which lets you register a custom WMI class based on the registry provider. If you register that on each machine (once), you can use remote WMI calls to do this quickly and easily.

Of course the problem becomes how to register it on each machine, but it's a worthwhile endeavor.

If you can't do that, you can try to use PowerShell remoting, but that too has to be enabled on every machine, just like remote registry access would.

Hadlock posted:

Do you have active directory? You should have a master admin account of some sort that they use to push out windows updates. Use those credentials (or a similar admin-level account) and Make sure ps-remoting is turned on (Enable-PSRemoting) then just wrap the whole script/function in a scriptblock and then do an

pseudocode
code:
Invoke-Command -ComputerName $servername -ScriptBlock $scriptblock -ArgumentList $a,$b,$c

Thanks to both of you. I'm thinking I'm going to enable ps-remoting and then as mentioned run it as a scriptblock. Hopefully I get the OK to do this. We just went through a security audit and the board has a major stick up their butts about pretty much everything and anything.

Hadlock
Nov 9, 2004

Admins need to be able to administer. Tell them you're using it to make sure the security software is installed and correctly patched. If anyone gives you poo poo about using PowerShell, feel free to question their Windows credentials; WS2012 and forward is designed to ONLY be administered via PS remoting. The Raspberry Pi 2, the new Windows Docker Nano serverlets, and the Windows IoT stuff are all designed to be interfaced with only via PS remoting, so it's been built to be quite secure.

Video Nasty
Jun 17, 2003

If I wanted to set up a script to move files and log each moved filename, would I want to use a process block or is there a better alternative?

12 rats tied together
Sep 7, 2006

Probably just a for loop to be honest. I don't see a lot of reason to use a process block for such a simple task, especially if the only thing you want to log is the filename and not like the timestamp, source path, destination path, and the output of any logic that decided whether or not the file was supposed to be moved.

code:
Get-FilesToMove | % {
    Copy-Item $_.FullName -Destination <destination>

    if (Test-Path <destination>) {
        $logline = "$($_.FullName) was copied to <location>"
        $logline | Out-File log.txt -Append
    }
}
Or whatever. But, to be honest, take that with a grain of salt because I read the documentation for begin/process/end blocks and scoffed IRL so I'm probably just a shithead. But, seriously, the Don Jones article that ended up being the first Google search result is like 80% heads-ups about the gotchas involved with the blocks and why they might not behave the way you expect them to. Why bother with that poo poo? Why should I use PROCESS{} and then have to put a loving for loop in it anyway? I'd rather just wipe my own rear end.

12 rats tied together fucked around with this message at 17:38 on Jul 18, 2015

RICHUNCLEPENNYBAGS
Dec 21, 2010
I need to copy a bunch of folders over to a different folder, but only if a folder with the same name doesn't already exist in the target. My idea is to just build the list of folder names in the target into a list and loop over and do .Contains on everything, skipping if the name is already there. The trouble is I can't figure out how to get just the strings. ls | select Name returns a list of objects with a Name property, which is not what I want. I want a list of strings. Is this even the best approach for this problem?

I suppose I could resolve this by using HashSet<string> or something like that, but I was really hoping for something quick and easy in the console, not basically writing C# with PowerShell syntax.

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

RICHUNCLEPENNYBAGS posted:

I need to copy a bunch of folders over to a different folder, but only if a folder with the same name doesn't already exist in the target. My idea is to just build the list of folder names in the target into a list and loop over and do .Contains on everything, skipping if the name is already there. The trouble is I can't figure out how to get just the strings. ls | select Name returns a list of objects with a Name property, which is not what I want. I want a list of strings. Is this even the best approach for this problem?

I suppose I could resolve this by using HashSet<string> or something like that, but I was really hoping for something quick and easy in the console, not basically writing C# with PowerShell syntax.

ls | select-object -ExpandProperty name
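
Applied to the original folder problem, that expands into something like this (the $source/$target paths are placeholders):

```powershell
# Copy only the folders whose names aren't already present in the target.
$existing = Get-ChildItem $target -Directory | Select-Object -ExpandProperty Name

Get-ChildItem $source -Directory |
    Where-Object { $existing -notcontains $_.Name } |
    Copy-Item -Destination $target -Recurse
```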

RICHUNCLEPENNYBAGS
Dec 21, 2010

Ithaqua posted:

ls | select-object -ExpandProperty name

Now we're talking. Thanks.

Briantist
Dec 5, 2003


Reiz posted:

Probably just a for loop to be honest. I don't see a lot of reason to use a process block for such a simple task, especially if the only thing you want to log is the filename and not like the timestamp, source path, destination path, and the output of any logic that decided whether or not the file was supposed to be moved.

code:
Get-FilesToMove | % {
    Copy-Item $_.FullName -Destination <destination>

    if (Test-Path <destination>) {
        $logline = "$($_.FullName) was copied to <location>"
        $logline | Out-File log.txt -Append
    }
}
Or whatever. But, to be honest, take that with a grain of salt because I read the documentation for begin/process/end blocks and scoffed IRL so I'm probably just a shithead. But, seriously, the don jones article that ended up being the first google search result is like 80% heads-ups about the gotchas involved with the blocks and why they might not behave the way you expect them to. Why bother with that poo poo? Why should I use PROCESS{} and then have to put a loving for loop in it anyway? I'd rather just wipe my own rear end.

Why bother? Only if you want to process items from the pipeline.

And just to be pedantic, the code you have there is not using a for loop, it's using ForEach-Object, which as it turns out, is actually using a Process{} block.

In fact this:
code:
Get-Something | % {
    Do-Something $_
}
Is the same as:

code:
Get-Something | ForEach-Object -Process {
    Do-Something $_
}
ForEach-Object also takes parameters for Begin and End blocks. Essentially, it's a way of writing an anonymous advanced function.
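
For example, all three blocks in a single pipeline call:

```powershell
# Begin runs once, Process runs per pipeline object, End runs once at the end.
1..5 | ForEach-Object -Begin { $sum = 0 } -Process { $sum += $_ } -End { "Total: $sum" }
# Total: 15
```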

So if you wanted to go back and do the above with an actual loop, let's say foreach, it might look like this:

code:
$files = Get-FilesToMove
foreach ($file in $files) {
    Copy-Item $file.FullName -Destination <destination>

    if (Test-Path <destination>) {
        $logline = "$($file.FullName) was copied to <location>"
        $logline | Out-File log.txt -Append
    }
}
Both will work, but there's a difference here, because the code you originally posted will start copying files from the first file it finds in Get-FilesToMove, whereas this latter code has to retrieve the entire file list first.

If you're working with a handful of files, then who cares. If you're working with, say, a million, you might not want to walk the entire directory tree before you even begin to copy files.

The pipeline in PowerShell is great in that each function's Process{} block in the pipeline gets access to each individual object as it becomes available (most of the time; sometimes a cmdlet needs to see every item before it can hand them off to the next, like Sort-Object).
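
A quick way to see that streaming behavior for yourself:

```powershell
# Streams: each number shows up as soon as it's produced, one per second.
1..3 | ForEach-Object { Start-Sleep -Seconds 1; $_ }

# Blocks: Sort-Object has to collect everything before it emits anything,
# so nothing appears until all three seconds have elapsed.
1..3 | ForEach-Object { Start-Sleep -Seconds 1; $_ } | Sort-Object -Descending
```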

12 rats tied together
Sep 7, 2006

quote:

Both will work, but there's a difference here, because the code you originally posted will start copying files from the first file it finds in Get-FilesToMove, whereas this latter code has to retrieve the entire file list first.

Here's how I handle this: If I want to walk the entire directory tree I do (Get-Stuff) | % {}, if I don't want to walk the entire directory tree I just do Get-Stuff | % {}. More specifically it is just easier for me to use "%" in all situations and then handle my input outside of the scope of the function.

So, yes, I'm aware of the differences in execution. Was not aware that an "advanced function" is a thing or even that % is shorthand for it but that's pretty cool I guess. I left () vs pipeline up to the end user but you're right, that probably would have been a pretty good thing to mention because...

quote:

If you're working with, say, a million, you might not want to walk the entire directory tree before you even begin to copy files.
You might actually want to walk the entire directory tree before you begin to copy files. It could have a huge impact on performance, so if you're working with a nontrivial number of files, it's probably a good idea to try both and compare.

12 rats tied together fucked around with this message at 04:38 on Jul 20, 2015

Video Nasty
Jun 17, 2003

I really appreciate the responses and the bonus knowledge on Process. My logic was to get a count of files before moving, then tee output to the event log based on a conditional block checking whether any new files exist in the source directory, then tally the filenames of what was moved in a running txt log and do something like this to organize it before logging:
code:
Group-Object Extension |
    Select-Object @{n="Extension";e={$_.Name -replace '^\.'}}, Count,
        @{n="Size (MB)";e={[math]::Round((($_.Group | Measure-Object Length -Sum).Sum / 1MB), 2)}},
        @{n="Newest File";e={$_.Group | Sort-Object LastWriteTime | Select-Object -Last 1}} |
    Sort-Object Extension
Main objective is to move files off an FTP site (networked) using Powershell in a Cron job to move, count, and log files, then determine if an email is needed to notify me of fatal errors.

Briantist
Dec 5, 2003


Reiz posted:

Here's how I handle this: If I want to walk the entire directory tree I do (Get-Stuff) | % {}, if I don't want to walk the entire directory tree I just do Get-Stuff | % {}. More specifically it is just easier for me to use "%" in all situations and then handle my input outside of the scope of the function.

So, yes, I'm aware of the differences in execution. Was not aware that an "advanced function" is a thing or even that % is shorthand for it but that's pretty cool I guess. I left () vs pipeline up to the end user but you're right, that probably would have been a pretty good thing to mention because...

You might actually want to walk the entire directory tree before you begin to copy files. It could have a huge impact on performance so if you are working with a nontrivial amount of files, it's probably a good idea to check both and compare.
Yeah I agree, I meant to edit this in actually. Sure, if you're going to copy all of the files, walking the tree first will probably be faster even if it starts later. But if you're going to have conditions on which files are included or not, like with a Where-Object condition, it could be desirable to take each object as it comes. You could go more advanced with it and background the copy with a PowerShell job or runspace or something, and then effectively you'll be walking the tree and copying concurrently. But this is far from the original question... just trying to illustrate the differences.

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

Briantist posted:

Yeah I agree, I meant to edit this in actually, but sure if you're going to copy all of the files, walking the tree first will probably be faster even if it starts later. But if you're going to have conditions on which files are included or not, like with a Where-Object condition, it could be desirable to take each object as it comes. You could go more advanced with it and background the copy with a powershell job or runspace or something, and then effectively you'll be walking the tree and copying concurrently. But this is far from the original question.. just trying to illustrate the differences.

I'd be more inclined to do any copy as I walk simply because the files may change during the walk and I wouldn't want stale metadata.

Spazz
Nov 17, 2005

You guys do know that you can use the backtick ( ` ) to put stuff on multiple lines, right? Same with bash with \. Makes your poo poo easier to read instead of a heaping one liner.

code:
$Items = $SPList.Items | Where-Object {$_.Properties.Author -eq $Author -AND $_.Created -gt (Get-Date).AddDays(-15) -OR $_.Created -lt (Get-Date)}
Becomes...

code:
$Items = $SPList.Items | Where-Object {$_.Properties.Author -eq $Author `
                                  -AND $_.Created -gt (Get-Date).AddDays(-15) `
                                   -OR $_.Created -lt (Get-Date)}
Then people like me won't hate you when we have to maintain your scripts.

12 rats tied together
Sep 7, 2006

I try to keep things as terse as possible and yeah at times it becomes kind of unbearable. I don't see anything particularly wrong with that example though, except I really try not to use -OR unless I have to. Sometimes you just have to.

This loving got me though:
code:
 @{n="Extension";e={$_.Name -replace '^\.'}}, Count , @{n="Size (MB)";e={[math]::Round((($_.Group | Measure-Object Length -Sum).Sum / 1MB), 2)}}, @{n="Newest File"; e={$_.Group | Sort-object LastWriteTime | Select -last 1 } } | sort Extension
Like, dang dude. Can definitely tell what you're doing but I wouldn't ever want to have to change that.

Briantist posted:

Yeah I agree, I meant to edit this in actually, but sure if you're going to copy all of the files, walking the tree first will probably be faster even if it starts later. But if you're going to have conditions on which files are included or not, like with a Where-Object condition, it could be desirable to take each object as it comes. You could go more advanced with it and background the copy with a powershell job or runspace or something, and then effectively you'll be walking the tree and copying concurrently. But this is far from the original question.. just trying to illustrate the differences.

This actually kind of brings up a concern/question I had a while back -- is it just me or does ForEach -parallel inside workflows run like complete rear end on a standard admin machine? I was doing some (frankly, kind of stupid) poo poo where I was calculating file hashes and stuffing them into databases. My boss' boss who was/is a developer said that I should be doing the calculation and insertion with multiple threads, in parallel. Some quick experimentation showed that -parallel was being outperformed on my work machine by a factor of 3 or 4 compared to just a standard ForEach-Object. This was a Core i7 with 16GB of RAM running on an SSD and using a gigabit line, so I definitely didn't expect the performance hit to be that harsh. I spent some time analyzing it and the queries were sub-10ms from request to response in both cases, so the issue definitely seemed to be the CPU overhead involved in starting and maintaining threads.

Not sure if it's the same for jobs or runspaces though.

quote:

I really appreciate the responses and bonus knowledge on Process because my logic was to get a count of files before moving, then tee output to the event log based on a conditional block of whether any new files exist in the source directory; then tally the filenames of what was moved in a running txt log and do something like this to organize it before logging:
Well, it depends. Depends on how many files you have, how big the files are, and what kind of network speeds you're going to get. If you have, say, under 100 files it doesn't matter and any functionality that works, works. If you're pushing 5,000... you should try both and see what is faster. If you're worried about hundreds of thousands of files it's probably a good idea to use rsync. Consider this a sliding scale I guess based on how important the files are.

Your log...organization kind of weirds me out but I'm assuming that you aren't using the normal fileinfo poo poo out of necessity and also because I don't want to bother figuring out how that actually works. But, if you are accessing the files as System.IO.FileInfo objects and not some weird FTP server text output there might be an easier or more readable way to do that... but if it works then it works. Absolute worst case scenario, if someone needs to re-sort that poo poo in a different way they can just read the output as an array and sort it again, no big deal.

Briantist
Dec 5, 2003


Reiz posted:

This actually kind of brings up a concern/question I had a while back -- is it just me or does ForEach -parallel inside workflows run like complete rear end on a standard admin machine? I was doing some (frankly, kind of stupid) poo poo where I was calculating file hashes and stuffing them into databases. My boss' boss who was/is a developer said that I should be doing the calculation and insertion with multiple threads, in parallel. Some quick experimentation showed that -parallel was being outperformed on my work machine by a factor of 3 or 4 compared to just a standard ForEach-Object. This was a Core i7 with 16GB of RAM running on an SSD and using a gigabit line, so I definitely didn't expect the performance hit to be that harsh. I spent some time analyzing it and the queries were sub-10ms from request to response in both cases, so the issue definitely seemed to be the CPU overhead involved in starting and maintaining threads.

Not sure if it's the same for jobs or runspaces though.
Workflows are an area of powershell I just haven't touched at all, so I can't really say for sure. But the creation and teardowns of runspaces/threads could definitely be a factor. You might be able to outperform both by rolling your own parallelism with runspaces in non-workflow code.
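
A bare-bones version of that hand-rolled approach might look like this ($files and the hashing work are stand-ins for whatever you're actually parallelizing):

```powershell
# Sketch: fan work out across a runspace pool instead of workflow -parallel.
$pool = [runspacefactory]::CreateRunspacePool(1, 4)  # min/max threads
$pool.Open()

$work = foreach ($file in $files) {
    $ps = [powershell]::Create()
    $ps.RunspacePool = $pool
    [void]$ps.AddScript({ param($path) Get-FileHash $path }).AddArgument($file)
    @{ Shell = $ps; Handle = $ps.BeginInvoke() }
}

# Collect results and clean up.
$results = foreach ($w in $work) {
    $w.Shell.EndInvoke($w.Handle)
    $w.Shell.Dispose()
}
$pool.Close()
```

Pool creation still isn't free, so this only pays off when each work item is slow enough (I/O, network) to amortize the thread overhead.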

Video Nasty
Jun 17, 2003

Yeah, I know my code is horrendous to look at, and I'm sorry about breaking tables. I ditched that jumbled mess in favor of just logging filenames and a tallied count at the end of the process.
Results get logged to Event Viewer with a custom Application event, or written as errors on failure. Output gets tee'd to a text log so I can trace back all the image names that were transferred. It's very amateur at best, but it's only for a hundred files at a time, if that, so I'm not too concerned with performance.

Something a little more tidy than the snippet below, but I'm providing it in case it benefits anyone else:
code:
#establish today's date for logging.
$datetime = Get-Date -f yyyy-MM-dd_HH.mm

#set up PowerShell transcript log for internal record-keeping.
$PSlog = "C:\PS\logs\PSTranscript_$datetime.txt"
Start-Transcript $PSlog

#collect file list from the Source directory.
$getFiles = gci -Path $source -File |
    Where-Object { $_.Extension -eq ".png" -or $_.Extension -eq ".jpg" -or $_.Extension -eq ".gif" }

#set up a running tally of files moved for printing in results.
$fileCount = 0

#track new files arriving at the destination in an array.
$newFiles = @()

#move each source file to the destination.
foreach ($file in $getFiles) {
    Move-Item $file.FullName -Destination $dest

    if (Test-Path (Join-Path $dest $file.Name)) {
        #write the filename to the log as it is moved.
        $logline = "$($file.FullName) was transferred.`r`n"

        #build the newFiles array of all moved files during the process.
        $newFiles += $file.FullName

        #update the logs for each filename moved.
        $logline | Tee-Object $logfile -Append

        #increment the file counter.
        $fileCount++
    } else {
        #the file never arrived at the destination - log an error.
        "*** TRANSFER FAILED ***`r`n$($file.FullName) could not be located in the destination directory." |
            Tee-Object $logfile -Append
    }
}

if ($fileCount -gt 0) {
    #write event log of actions taken after completion.
    Write-EventLog -LogName Application -Source "Move_FTP_Files" `
        -EventID 3001 -EntryType Information -Category 1 -RawData 10,20 `
        -Message "Successfully copied $fileCount files from FTP to shared Images directory."
}

#write final output to console and log.
"`r`n *** Successfully copied $fileCount files from FTP to shared Images directory. *** `r`n" |
    Tee-Object $logfile -Append

Stop-Transcript
exit
I'll humbly admit that I stole the process block from Briantist, so many thanks for that help!

Video Nasty fucked around with this message at 01:02 on Jul 21, 2015

Vulture Culture
Jul 14, 2003

I was never enjoying it. I only eat it for the nutrients.
Treat parallelism in PowerShell the same way that you would in most other scripting languages: it will help you with asynchronous I/O-bound operations, but it will almost never improve the performance of a calculation.

myron cope
Apr 21, 2009

I have a possibly dumb, maybe more "sysadmin" than powershell specific question:

Why does enter-pssession <exchangeserver> not give me exchange cmdlets? I stumbled upon
code:
$session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri http://<exchange>/PowerShell
Import-PSSession $session
which does give me the cmdlets.

12 rats tied together
Sep 7, 2006

As far as I'm concerned, "sysadmin" and "powershell" questions are basically the same thing. I'd defer to someone who is more aware of the actual terminology and what happens behind the scenes on this one, but short version: you can PSSession to any machine and it's kind of like just SSHing there. When you create a session and specify the ConfigurationName Microsoft.Exchange, you automatically modify the properties of your session. I'm not sure about the specifics, but Microsoft.Exchange as a ConfigurationName probably attempts to auto-import the Exchange cmdlets for you in addition to doing some other stuff that would be useful on an Exchange server, probably involving Exchange's role-based access controls. The session configuration is, well, session-significant, and AFAIK the path to the ConfigurationName config file is specific to the machine you are running on.

Presumably, you could move Microsoft.Exchange to your personal computer and then Enter-PSSession 127.0.0.1 with the Microsoft.Exchange configuration and you'd have the cmdlets, they probably wouldn't do anything though unless you pointed them at the actual exchange server. I'd also wager that you can just do a standard enter-pssession and then, once you're in, import the modules yourself. But, I recall the exchange-specific stuff is a little funky and that they generally want you to use the Exchange Management Shell whenever possible. I'm pretty sure you can install the exchange tools and it comes with a copy of the EMS and using that would probably be best practice here.
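
If you do go the implicit-remoting route from the earlier post, -Prefix is handy so the imported cmdlets don't shadow anything local (the server name here is hypothetical):

```powershell
$session = New-PSSession -ConfigurationName Microsoft.Exchange `
    -ConnectionUri http://exchange01/PowerShell
Import-PSSession $session -Prefix Ex   # Get-Mailbox becomes Get-ExMailbox

Get-ExMailbox -ResultSize 10
Remove-PSSession $session
```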

BaseballPCHiker
Jan 16, 2006

Just as a follow up to my previous question, I was able to get it working with your help. I had to enable PSRemoting as well as the RemoteRegistry service on the target computers. Now I just have to convince the powers that be that it'll be useful in the future beyond just this script.

Swink
Apr 18, 2006
Left Side <--- Many Whelps
For exchange you need to add-pssnapin <name of the exchange snapin>

I have no idea why it's different to, say, AD, which is a module that can auto-load.

I'll grab the exact snapin name when I'm back at my desk. It'll be different depending on which version you're running. I run 2010 so YMMV for other versions.



Edit: For Exchange 2010 SP3:

code:
$exchange = New-PSSession <servername>
Enter-PSSession $exchange
Add-PSSnapin Microsoft.Exchange.Management.PowerShell.E2010
For the record, it never actually works from my workstation. I get this error:
code:
 get-mailbox
Value cannot be null.
Parameter name: serverSettings
    + CategoryInfo          : NotSpecified: (:) [Get-Mailbox], ArgumentNullException
    + FullyQualifiedErrorId : System.ArgumentNullException,Microsoft.Exchange.Management.RecipientTasks.GetMailbox

Unsure if this is my broken client\server or the broken module.

Swink fucked around with this message at 01:37 on Jul 21, 2015

myron cope
Apr 21, 2009

I installed the Exchange Management Tools (they don't make it obvious that you just run the installer and select only the tools part). It's weird that they still have the "Exchange Management Shell" separate from PowerShell. Eh, still better than remoting to one of the exchange servers! I didn't realize you could install the tools on their own, so thanks!

Swink
Apr 18, 2006
^ Yeah or just do that. :)

Hadlock
Nov 9, 2004

.NET 4.6 got released today

I was expecting PS v5.0 to be released with the next version of .NET (5.0?), but this is just a point release above 4.5, so I went ahead and downloaded the update just to check: PowerShell is still 4.0

code:
PS C:\Users\hadlock> $psversiontable

Name                           Value                                           
----                           -----                                           
PSVersion                      4.0                                             
WSManStackVersion              3.0                                             
SerializationVersion           1.1.0.1                                         
CLRVersion                     4.0.30319.42000                                 
BuildVersion                   6.3.9600.17400                                  
PSCompatibleVersions           {1.0, 2.0, 3.0, 4.0}                            
PSRemotingProtocolVersion      2.2   
Looks like the CLRVersion changed a tiny bit: 4.0.30319.34209 -> 4.0.30319.42000

Presumably Win10 will launch with .NET 4.6 and PS4.0. Or are we still expecting to see PS5.0 with Win10? I wonder if they are going to hold back on PS5.0 until they complete SSH integration. That would be major.

Hadlock fucked around with this message at 01:56 on Jul 21, 2015

Tony Montana
Aug 6, 2005

by FactsAreUseless
What are the major features of 5 that I should be aware of?

SSH? So you can do encrypted sessions? What else?

myron cope
Apr 21, 2009

Tony Montana posted:

What are the major features of 5 that I should be aware of?

SSH? So you can do encrypted sessions? What else?

Here is an MS page: https://technet.microsoft.com/en-us/library/Hh857339.aspx


Hadlock posted:

.NET 4.6 got released today

I was expecting PS v5.0 to be released with the next version of .NET (5.0?) but this is just a point release above 4.5, so I went ahead and downloaded the update just to check, it is still 4.0

code:
PS C:\Users\hadlock> $psversiontable

Name                           Value                                           
----                           -----                                           
PSVersion                      4.0                                             
WSManStackVersion              3.0                                             
SerializationVersion           1.1.0.1                                         
CLRVersion                     4.0.30319.42000                                 
BuildVersion                   6.3.9600.17400                                  
PSCompatibleVersions           {1.0, 2.0, 3.0, 4.0}                            
PSRemotingProtocolVersion      2.2   
Looks like the CLRVersion changed a tiny bit: 4.0.30319.34209 -> 4.0.30319.42000

Presumably Win10 will launch with .NET 4.6 and PS4.0. Or are we still expecting to see PS5.0 with Win10? I wonder if they are going to hold back on PS5.0 until they complete SSH integration. That would be major.

I'm using Windows 10 and it definitely has powershell 5.0.
code:
Name                           Value
----                           -----
PSVersion                      5.0.10240.16384
WSManStackVersion              3.0
SerializationVersion           1.1.0.1
CLRVersion                     4.0.30319.42000
BuildVersion                   10.0.10240.16384
PSCompatibleVersions           {1.0, 2.0, 3.0, 4.0...}
PSRemotingProtocolVersion      2.3
Not sure when it will be available for OSes other than 10, though. Syntax coloring!

Swink
Apr 18, 2006
Left Side <--- Many Whelps

Tony Montana posted:

What are the major features of 5 that I should be aware of?

SSH? So you can do encrypted sessions? What else?

The goddamn package manager!

code:
PS C:\Users\9thg> find-package *notepad*

Name                           Version          Source           Summary
----                           -------          ------           -------
Devbox-Notepad2                4.2.25           chocolatey       A fast and extremly light-weight Notepad-like text ...
notepadplusplus-withuninstall  6.6.2            chocolatey       Notepad++ is a free (as in "free speech" and also a...
notepadplusplus                6.7.9.2          chocolatey       Notepad++ is a free (as in "free speech" and also a...
notepadplusplus.install        6.7.9.2          chocolatey       Notepad++ is a free (as in "free speech" and also a...
notepadplusplus.commandline    6.7.9.2          chocolatey       Notepad++ is a free (as in "free speech" and also a...
Notepadplusplus.Settings       1.0.0.20141029   chocolatey       Allows Notepad++ settings to be installed from a pr...
notepad2                       4.2.25.3         chocolatey       A fast and light-weight Notepad-like text editor wi...
notepad2-mod                   4.2.25.940       chocolatey       A modified version (fork) of Notepad2 based on Kai ...
notepadreplacer                1.1.6            chocolatey       Replace notepad.exe with your favorite editor instead.
ProgrammersNotepad             2.3              chocolatey       Programmers Notepad - Windows programming editor wi...
XmlNotepad                     2007.0.0.0       chocolatey       XmlNotepad




install-package notepad2


Hadlock
Nov 9, 2004

Tony Montana posted:

What are the major features of 5 that I should be aware of?

SSH? So you can do encrypted sessions? What else?

5 is supposed to support Linux-style package management system(s), like apt-get and yum, replacing the need for stuff like Ninite.

Find-Package -Name AdobeReader | Install-Package
Find-Package -Name WinRAR, Skype, Opera | Install-Package

It also supports formal programming classes in addition to functions, etc. Also deeper DSC support, and tighter Hyper-V integration.

Basically if you manage more than 20 servers and have to deploy code to them on a regular basis, it's very helpful. You can define your own software repositories of course. Not as useful if you're a desktop user.

And yeah, syntax highlighting in the ISE :dance:

There are more, but those are the most relevant to my world right now.
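Since classes came up: here's a minimal sketch of the v5 class syntax, just to show the shape of it (the class and property names are made up for illustration):

```powershell
# PowerShell 5.0 class sketch: typed properties, a constructor, and a method
class Server {
    [string]$Name
    [int]$Port

    Server([string]$name, [int]$port) {
        $this.Name = $name
        $this.Port = $port
    }

    [string] Endpoint() {
        return "$($this.Name):$($this.Port)"
    }
}

$s = [Server]::new('web01', 8080)
$s.Endpoint()   # web01:8080
```

Way nicer than faking objects with hashtables or New-Object PSObject.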

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

Hadlock posted:

.NET 4.6 got released today

I was expecting PS v5.0 to be released with the next version of .NET (5.0?) but this is just a point release above 4.5, so I went ahead and downloaded the update just to check, it is still 4.0

code:
PS C:\Users\hadlock> $psversiontable

Name                           Value                                           
----                           -----                                           
PSVersion                      4.0                                             
WSManStackVersion              3.0                                             
SerializationVersion           1.1.0.1                                         
CLRVersion                     4.0.30319.42000                                 
BuildVersion                   6.3.9600.17400                                  
PSCompatibleVersions           {1.0, 2.0, 3.0, 4.0}                            
PSRemotingProtocolVersion      2.2   
Looks like the CLRVersion changed a tiny bit, 4.030319.34209 -> 4.0.30319.42000

Presumably Win10 will launch with .NET 4.6 and PS4.0. Or are we still expecting to see PS5.0 with Win10? I wonder if they are going to hold back on PS5.0 until they complete SSH integration. That would be major.

CLR != framework

There have only been 4 CLR versions, 1.0, 1.1, 2.0 and 4.0.

Briantist
Dec 5, 2003

The Professor does not approve of your post.
Lipstick Apathy
Woah of course this thread blew up on a day when I'm busy with Windows updates (you guys did see this right?)

Nash Regex posted:

Yeah, I know my code is horrendous to look at, and I'm sorry about breaking tables. I ditched that jumbled mess in favor of just logging filenames and a tallied count at the end of the process.
Results get logged to Event Viewer with a custom Application event, or written as errors on failure. Output gets tee'd to a text log so I can trace back all the image names that were transferred. It's amateur at best, but it's only for a hundred files at a time, if that, so I'm not too concerned with performance.

Something a little more tidy than the snippet below, but I'm providing it in case it benefits anyone else:
code:
# snip

$getFiles = gci -path $source -file | where-object { $_.extension -eq ".png" -or $_.extension -eq ".jpg" -or $_.extension -eq ".gif" }
I didn't look at all the code closely, but I like using an array with -contains for this:
code:
$extensions = '.png','.jpg','.gif'
$getFiles =  gci -path $source -file | where-object { $extensions -contains $_.extension }

myron cope posted:

I have a possibly dumb, maybe more "sysadmin" than powershell specific question:

Why does enter-pssession <exchangeserver> not give me exchange cmdlets? I stumbled upon
code:
$session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri http://<exchange>/PowerShell
Import-PSSession $session
which does give me the cmdlets.

myron cope posted:

I installed the Exchange Management Tools (they don't make it necessarily obvious that you just run the installer and only select the tools part. It's weird that they still have the "Exchange Management Shell" separate from Powershell. Eh, still better than remoting to one of the exchange servers! I didn't realize you could install the tools on their own, so thanks!
Actually the way you were doing it at first is the supported way of using the Exchange cmdlets. It's called implicit remoting. You're not supposed to use the snap-in directly in your own scripts.

And note that even when you do install the management tools and run the exchange management shell, implicit remoting is being done behind the scenes.

It should work with Enter-PSSession as long as you also use -ConfigurationName Microsoft.Exchange -ConnectionUri http://<exchange>/PowerShell.
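For example (the URI here is a placeholder; swap in your own CAS server):

```powershell
# Interactive session against the Exchange configuration endpoint
# instead of the default Microsoft.PowerShell endpoint
Enter-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri 'http://exchange01.example.com/PowerShell'
Get-Mailbox -ResultSize 5
Exit-PSSession
```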

Hadlock posted:

.NET 4.6 got released today

I was expecting PS v5.0 to be released with the next version of .NET (5.0?) but this is just a point release above 4.5, so I went ahead and downloaded the update just to check, it is still 4.0

code:
PS C:\Users\hadlock> $psversiontable

Name                           Value                                           
----                           -----                                           
PSVersion                      4.0                                             
WSManStackVersion              3.0                                             
SerializationVersion           1.1.0.1                                         
CLRVersion                     4.0.30319.42000                                 
BuildVersion                   6.3.9600.17400                                  
PSCompatibleVersions           {1.0, 2.0, 3.0, 4.0}                            
PSRemotingProtocolVersion      2.2   
Looks like the CLRVersion changed a tiny bit, 4.030319.34209 -> 4.0.30319.42000

Presumably Win10 will launch with .NET 4.6 and PS4.0. Or are we still expecting to see PS5.0 with Win10? I wonder if they are going to hold back on PS5.0 until they complete SSH integration. That would be major.
I think this has been answered already, but PowerShell ships with the Windows Management Framework (which requires a specific version of the CLR), but it doesn't come with .Net.

I believe (hope) that Win10 is expected to ship with PowerShell 5.0, but it might not be until Server vNext (after which WMF 5 would be available for install on Win10 and some earlier versions presumably).

Vulture Culture posted:

Treat parallelism in PowerShell the same way that you would in most other scripting languages: it will help you with asynchronous I/O-bound operations, but it will almost never improve the performance of a calculation.
Agreeeeeeeed. I'll add that with PowerShell there are a lot of waitable things other than I/O, mostly related to fan-out remoting and such.
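Fan-out remoting being the obvious one: a single Invoke-Command call runs the script block on many machines in parallel over WinRM, so the wall-clock time is roughly the slowest host, not the sum. A quick sketch (server names are placeholders):

```powershell
# One call, many machines, run in parallel by the remoting layer
$servers = 'web01','web02','web03'
Invoke-Command -ComputerName $servers -ScriptBlock {
    # Runs on each remote host; results stream back with PSComputerName attached
    Get-Service -Name 'W3SVC' | Select-Object Status, Name
} -ThrottleLimit 32
```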

Dr. Arbitrary
Mar 15, 2006

Bleak Gremlin
I'm working on a project to verify firewall settings.

The manual strategy is to remote to a host, use telnet to try and connect to various targets and ports and record the results.

Manual sucks.

I tried using
$Connection = New-Object System.Net.Sockets.TcpClient
$Connection.BeginConnect("Host",12345,$null,$null)
Then check the status with $Connection
Then closing with
$Connection.close()

Does this seem like a good place to start with this or am I running the risk of screwing stuff up?

12 rats tied together
Sep 7, 2006

That's probably how I would do it, given the requirements of "you must use powershell and you must check by attempting to connect to $host on $port". I would add that it sucks that you have to do this, though, and that there are probably better ways in general to ensure that firewalls are configured properly.

Also, you're only going to be testing TCP rules on the firewall, and only from one source, but if you're only looking for TCP rules that isn't really a big deal! You may also want to lower the timeout, since any sane TCP endpoint should complete the 3-way handshake in about a second. I'm not sure what the default timeout is on Net.Sockets.TcpClient, but if it's something like 30 seconds you're in for a lot of waiting.
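Something along these lines would keep the wait bounded (the function name and defaults are mine, not anything built in):

```powershell
# Hypothetical helper: TCP connect check with an explicit timeout,
# using the async BeginConnect + WaitOne pattern from the post above
function Test-TcpPort {
    param(
        [string]$TargetHost,
        [int]$Port,
        [int]$TimeoutMs = 1000   # don't sit through the OS-level connect timeout
    )
    $client = New-Object System.Net.Sockets.TcpClient
    try {
        $async = $client.BeginConnect($TargetHost, $Port, $null, $null)
        # True only if the connect completed within the timeout AND succeeded
        $open = $async.AsyncWaitHandle.WaitOne($TimeoutMs) -and $client.Connected
        [pscustomobject]@{ TargetHost = $TargetHost; Port = $Port; Open = $open }
    }
    finally {
        $client.Close()
    }
}

Test-TcpPort -TargetHost 'fileserver01' -Port 445
```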

Tony Montana
Aug 6, 2005

by FactsAreUseless
Haven't you just written a basic port scanner?

I'm all for using PowerShell for EVERYTHING, but there really are quite sophisticated free port scanners you can download that will test all sorts of protocols.

myron cope
Apr 21, 2009

Briantist posted:

Actually the way you were doing it at first is the supported way of using the Exchange cmdlets. It's called Implicit Remoting. You're not supposed to directly use the snap in in your own scripts.

And note that even when you do install the management tools and run the exchange management shell, implicit remoting is being done behind the scenes.

It should work with Enter-PSSession as long as you also use -ConfigurationName Microsoft.Exchange -ConnectionUri http://<exchange>/PowerShell.

Interesting. Thanks for the info! I'll keep the tools around to save on (a tiny bit of) typing, but it's good knowledge to have anyway.

Spazz
Nov 17, 2005

Just write a wrapper for nmap that outputs to xml (-oX) and parse that in PowerShell if you must use it.
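Rough sketch of what I mean (target host and filename are placeholders; assumes nmap is on PATH):

```powershell
# Run nmap with XML output, then parse the report back into objects.
# Note: with multiple target hosts, $scan.nmaprun.host becomes an array.
nmap -p 80,443 -oX scan.xml 192.0.2.10 | Out-Null
[xml]$scan = Get-Content scan.xml
$scan.nmaprun.host.ports.port | ForEach-Object {
    [pscustomobject]@{
        Port     = $_.portid
        Protocol = $_.protocol
        State    = $_.state.state   # open / closed / filtered
    }
}
```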

Dr. Arbitrary
Mar 15, 2006

Bleak Gremlin
I'm in a new job and don't know what software is approved or not.

I've got some shared folders full of tools but I'm not familiar with all the names.
Other than nmap, are there any other common tools that I might want to look for to accomplish this port scanning task?


Tony Montana
Aug 6, 2005

by FactsAreUseless
Just got a vote for nmap from our networking team, if that helps.
