sloshmonger
Mar 21, 2013

Judge Schnoopy posted:

Thank you for clarifying the Measure-Object cmdlet. The idea of getting this info in one go instead of running get-childitem twice is amazing since I'm looking at about a million files total in $root.

I was able to get the rest of the script working besides size, and you helped me clean up the count output as well. However! I'm getting warnings when running this script because empty folders return lengths of "null", not 0. This doesn't stop the script and I added a quick "+0" to help format my output data, so no big deal, but I dislike seeing red blocks of text in my powershell window. Any way around that, maybe an error suppression command, or is it better to just leave it be and ignore?

New script by the way which works, with progress bars / information!

code:
$FS1Log = "c:\software\FS1log.csv"
$root = "\\FileServer\Departments"

$foldertotal = Get-ChildItem $root | Where-Object {$_.PSisContainer -eq $True}
$foldernumber = 1
$subnumber = 1

ForEach ($Folder in $foldertotal){

	Write-progress -id 1 -activity "Running Script" -Status "Overall progress" `
	   -percentComplete ($foldernumber / $foldertotal.count * 100)
	Write-progress -id 2 -parentId 1 -activity "Gathering data" -Status "From $Folder"
	$foldernumber++

		ForEach ($sub in $(Get-ChildItem $folder | Where-Object {$_.PSisContainer -eq $True})){
			$subCount = (Get-ChildItem $sub.Fullname -Recurse | Where-Object {!$_.PSisContainer} | Measure-Object -property length -sum )
			$fileCount = $subCount.Count + 0
			$fileSize = $subCount.Sum + 0
			$subOutInfo = $sub.Fullname + "," + $fileCount + "," + $fileSize 
			Add-Content -Value $subOutInfo -Path $FS1log
			$subnumber++
			Write-progress -id 3 -parentId 2 -activity "Current Sub Folder" -Status $sub
	
		}
	}

Note that this script won't count any files at level 0 or 1, since it only looks for files at the $root/$folder/$sub level. To cover those, you'd need to repeat the Measure-Object block at each level above.

As for the errors, you can add -ErrorAction SilentlyContinue to the offending line, but it'd be better to add Try/Catch with error logging, as you'll want to see why it doesn't work the way you think it should.
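For the Try/Catch route, a minimal sketch — the log path and the deliberately broken folder here are placeholders, not anything from the script above:

```powershell
# Hedged sketch of Try/Catch with error logging; log path and bad folder are placeholders
$ErrorLog = Join-Path $env:TEMP "FS1errors.log"
$badFolder = "C:\definitely\not\a\real\folder"
try {
    # -ErrorAction Stop promotes the non-terminating error so the catch block sees it
    Get-ChildItem $badFolder -Recurse -ErrorAction Stop
}
catch {
    # Log which folder failed and why, then let the loop move on
    Add-Content -Path $ErrorLog -Value "$badFolder : $($_.Exception.Message)"
}
```

Wrapping the Get-ChildItem inside the inner ForEach like that would log the unreadable folders instead of painting the window red.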

sloshmonger
Mar 21, 2013
To check for group membership, you can use something along these lines. It checks if the group is a part of the ADuser.memberof property, rather than looking if the user is a part of the group. Roughly equivalent, but you'll be loading up the user object anyway and it'll usually be slightly quicker to iterate through groups rather than members, though it's fractions of a millisecond.

code:
$ADuser = get-aduser -properties * -filter {name -eq $user}
$ADgroup = get-adgroup "Group Name"
if($ADuser.memberof -contains $ADgroup.distinguishedName){
	Remove-AdgroupMember $AdGroup -members $AdUser -server $WritableADServer -Credential $ADCredential 
	Write-Host "Removed $($ADUser.SamAccountName) from $($ADGroup.DistinguishedName)"
}

sloshmonger
Mar 21, 2013

lol internet. posted:

I didn't bother with ToUpper but I used split and trim. I'm asking the user to enter names of AD group names separated by commas. So in case they put "group1, group2" instead of "group1,group2" so in this case I don't think it matters. I did do trim before split though.

Afterwards, I throw it into a string array then do a foreach to do a get-adgroupmembership.

The one other thing I wanted to do was a check to make sure the user input was not 0 groups and between 1-100 only. I tried doing this with .count but the problem is .count returns a 1 for blank values still.

I am not sure how to go about this, I was thinking a if statement to specifically check for a blank line\empty space, and if that's not there, and elseif to continue the check for 1-100 groups prior to running get-adgroupmembership

I am not sure how to write the initial if statement though, I tried
code:
if($string -eq $null)
if($string -eq "")
if($string -eq $_)
and these don't seem to return true for a blank string value.
You can do what Arbitrary did or work with the $string.Length property

code:
#or to prompt the user until they stop being dumb
while ($string.Length -lt 1){
    $string = Read-Host "Enter a valid string"
    while ((($string.ToCharArray() | Where-Object {$_ -eq ","} | Measure-Object).Count + 1) -gt 100){ #Only need to check if greater than 100 items, as 1 item is assumed
        $string = Read-Host "Enter a valid string again, this time with feeling"
    }
}

sloshmonger
Mar 21, 2013

22 Eargesplitten posted:

Locally, just running it out of the terminal for now.

I figured it out. For some reason I thought I needed gc to copy the files, so it was telling me I couldn't access the location, but it was still trying to get the files. The problem was Copy-Item wasn't given parameters to merge or overwrite folders that already exist. I also wasn't having it copy subfolders.

This is what happens when someone with no understanding of powershell outside of get-help tries to automate what would otherwise be 100 hours of work.

New error.

code:
$nasName = Read-Host "What NAS do you want to use?"

$location = "\\$nasName\folder\folder"

Copy-Item -Path "$location\folder" -Destination "\\computer\C:\dada" -force
Copy-Item : The given path's format is not supported.
At line:3 char:1
+ Copy-Item -Path "$location\folder" -Destination "\\computer\C:\d ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Copy-Item], NotSupportedException
    + FullyQualifiedErrorId : System.NotSupportedException,Microsoft.PowerShell.Commands.CopyItemCommand

Is the destination share "C:"? Maybe it should be \\computer\c$\destinationpath

sloshmonger
Mar 21, 2013

22 Eargesplitten posted:

Hopefully a double post is okay. Yesterday when doing some work I broke my script, now it tries to install on the local computer multiple times. I fixed it this morning, but I guess I didn't commit those changes like I thought, and then I broke it again.

I'm using a foreach to read computer names from a file, and then copy and install files. I've commented out the copying right now because the other computer has the files on it already and that saves time.

$computer = "\\$line"

        $PreReqLoc = "$computer\C$"

        $ScriptLoc = "$computer\C$\users\public\Documents\scripts"

 

        Copy-Item -Path "C:\Scripts" -Destination "$computer\C$\users\public\Documents\" -force -recurse

        Start-Job {

        param($location,$PreReqLoc,$ScriptLoc,$accountUsed)

        #Copy-Item $location -Destination $PreReqLoc -force -recurse

        write-host $scriptloc

        Powershell "$ScriptLoc\install.ps1" -executionPolicy bypass #-credential $accountUsed

        Remove-Item -Path $ScriptLoc -force -recurse} -argumentList $location,$PreReqLoc,$ScriptLoc,$accountUsed



Sorry if it's messy, I had to email it from my computer to my phone.

When I check the script location variable before the powershell launch, it gives the right computer name, but as soon as it goes into the install script, it's on the local computer.

E: well that formatting didn't take, hang on.


First, you may need to escape the $ in the $PreReqLoc and $ScriptLoc by using the backtick (`) character -- so "$computer\C`$"

But primarily, the Job is being started with no information. For example, this script connects to a remote computer and gets WMI info.

code:
Start-Job -ArgumentList ($Address,$ADCredential) -ScriptBlock {
            param($add,$cred)
            $RemoteHost = ([System.Net.DNS]::GetHostByAddress($add)).HostName
            $os = Get-WmiObject -Class win32_OperatingSystem -Credential $cred -ComputerName $RemoteHost
            $sys = Get-WmiObject -Class win32_ComputerSystemProduct -Credential $cred -ComputerName $RemoteHost
            $mem = Get-WmiObject -Class win32_PhysicalMemory -Credential $cred -ComputerName $RemoteHost
            $comp = Get-WmiObject -Class win32_ComputerSystem -Credential $cred -ComputerName $RemoteHost
           #Do a bunch of stuff here
            return $HostInfo
        } 
Edit: Crap, I should really read to the end. You've got -ArgumentList tacked on there. Maybe it's the backtick thing

sloshmonger fucked around with this message at 19:07 on Jul 14, 2016

sloshmonger
Mar 21, 2013

Eschatos posted:

I've started teaching myself powershell this year and so far used it to solve a variety of relatively simple problems - running email traces, querying workstation uptime to see when users are lying about having restarted, and other basic stuff that can be handled in a 10 line script or less. Now I've got a new significant project that will be a massive pain in the rear end to do manually. Naturally I want to automate it, but am not completely sure that what I want to do is reasonably possible via powershell.

Basically the problem is that my company's IT department's asset tracking spreadsheet for workstations, laptops and such is woefully out of date. Updating it manually means walking around to hundreds of different machines and plugging in serial numbers and such into an Excel spreadsheet. If I can automate asset creation I can speed up the manual process to only include laptops that don't connect to the domain often.

Here's the steps I have envisioned to make this happen.

Start off by enabling Powershell remoting on all workstations through Group Policy. - I've already started to roll this out, testing with a single location so far.
Then set up scripts to do the following:
1. Iterate through every computer in AD.
2. Filter out the servers and output the names to csv.
3. Create a function that can query a given computer for as much asset sheet info as possible - brand/model/location(based on ip address)/last logged on user/sn/deploy date/etc.
4. Create a script that can run the above function en masse, taking a csv list of names for input and outputting a csv of results. Bonus points if it automatically removes the names of PCs it successfully retrieved info on from the original csv(or creates a new one with failed queries).

Does this sound like a reasonable way to go about doing this? I've already figured out steps 1 and 2, but am hitting a brick wall in regards to figuring out the rest.

I've pretty much already done 3 & 4 in a script that was supposed to step through all subnet IPs and inventory them.

If you can figure out how to get the info from either the registry or WMI, you can easily extract it. It may require PS v5.

code:
Clear-Host
$ADCredential = Get-Credential -Credential "admin@contoso.com" 
$DumpLocation = "C:\Users\name\Documents\Info Dump"
#1. Get responding computers by IP address, resolve names
$LocationA= "192.168.1"
$LocationB= "192.168.11"
$LocationC= "192.168.168"
#Consolidate addresses
$AllInfo = @()
$Addresses = @()
1..254|Foreach{ #Add addresses at LocationA
    $ip = "$LocationA.$_"
    $Addresses += $ip
    }
1..254|Foreach{ #Add addresses at LocationB
    $ip = "$LocationB.$_"
    $Addresses += $ip
    }
1..254|Foreach{ #Add addresses at LocationC
    $ip = "$LocationC.$_"
    $Addresses += $ip
    }
$HostCount = 1
$HostMax = 20 #How many it will seek info from at one time
foreach($Address in $Addresses) {
    IF(Test-Connection -ComputerName $Address -Count 1 -Quiet){
        Start-Job -ArgumentList ($Address,$ADCredential) -ScriptBlock {
            param($add,$cred)
            $RemoteHost = ([System.Net.DNS]::GetHostByAddress($add)).HostName
            $os = Get-WmiObject -Class win32_OperatingSystem -Credential $cred -ComputerName $RemoteHost
            $sys = Get-WmiObject -Class win32_ComputerSystemProduct -Credential $cred -ComputerName $RemoteHost
            $mem = Get-WmiObject -Class win32_PhysicalMemory -Credential $cred -ComputerName $RemoteHost
            $comp = Get-WmiObject -Class win32_ComputerSystem -Credential $cred -ComputerName $RemoteHost
            $TotalMemory = 0
            $mem.Capacity | foreach {$TotalMemory += $_}
            $HostInfo = New-Object psobject
            $HostInfo | Add-Member -NotePropertyName "HostName" -NotePropertyValue $RemoteHost
            $HostInfo | Add-Member -NotePropertyName "Serial#" -NotePropertyValue $sys.IdentifyingNumber 
            $HostInfo | Add-Member -NotePropertyName "Model#" -NotePropertyValue $sys.Name 
            $HostInfo | Add-Member -NotePropertyName "Version" -NotePropertyValue $sys.Version 
            $HostInfo | Add-Member -NotePropertyName "LoggedIn" -NotePropertyValue $comp.UserName 
            $HostInfo | Add-Member -NotePropertyName "CurrentIP" -NotePropertyValue $add 
            $HostInfo | Add-Member -NotePropertyName "WindowsVersion" -NotePropertyValue $os.Version 
            $HostInfo | Add-Member -NotePropertyName "Architecture" -NotePropertyValue $os.OSArchitecture
            $HostInfo | Add-Member -NotePropertyName "Memory (in GB)" -NotePropertyValue ([System.Math]::Round($TotalMemory/1GB,2))
            return $HostInfo
        } 
        if($HostCount -ge $HostMax){
            sleep 5
            $HostCount = 1
            }
        Else{$HostCount++}
    }
}
Get-Job | Wait-Job
$HostJobs = Get-Job
Write-Host "Adding info to Info Dump folder"
foreach($Hostjob in $HostJobs){
    $htemp = Receive-Job -Id $Hostjob.id -Keep -ErrorAction SilentlyContinue
    Write-Host "Attempting to process job $($HostJob.ID)"
    if($htemp.HostName -eq $null){ Write-Host "No valid connection for $($Htemp.CurrentIP)"}
    ELSE{    
        $outfile = New-Item -ItemType File -Path $DumpLocation -Name "$($htemp.HostName).txt" -Force
        $htemp | out-file $outfile
        $AllInfo += $htemp
    }
}
Get-Job | Stop-Job
Get-Job | Remove-Job

$AllDump = ($DumpLocation + "\out.csv")
$AllInfo | Export-CSV $AllDump -Force -NoTypeInformation
Yes, it's pretty crappy, but it does its job. If I had to change anything, I'd make it so the processing is done on the remote side rather than on the local computer.

sloshmonger
Mar 21, 2013

MC Fruit Stripe posted:

When using ordered in this case, I'd need to specify every service in the script itself and not in a text file, no? I don't see a way to use both, essentially a $services = [ordered]stopservices.txt approach. Placing the services directly into the script, essentially going from

code:
$servers = Get-Content -Path "c:\stripe\servers.txt"
$services = Get-Content -Path "c:\stripe\stopservices.txt"
Get-Service -Name $services -ComputerName $servers | Set-Service -Status Stopped
to

code:
$servers = Get-Content -Path "c:\stripe\servers.txt"
$services = [ordered]@{"DHCP Client"=1;"Task Scheduler"=2;"IPsec Policy Agent"=3;"Base Filtering Engine"=4}
Get-Service -Name $services -ComputerName $servers | Set-Service -Status Stopped
Given these scripts, would

Script 1 stopping in order: Base Filtering Engine, DHCP Client, IPsec Policy Agent, Task Scheduler
Script 2 stopping in order: DHCP Client, Task Scheduler, IPsec Policy Agent, Base Filtering Engine

Would that be right?

Edit: Heh, you know what, this whole thing may be pointless. There is absolutely no reason to do this, and I mean I appreciate a good learning opportunity as much as the next guy, but this is a straight up copy paste situation. Forest for the trees, missed the drat forest for the trees.

code:
$servers = Get-Content -Path "c:\stripe\servers.txt"
Get-Service -Name "Not a Real Service" -ComputerName $servers | Set-Service -Status Stopped (throws an error quickly and moves on to...)
Get-Service -Name "DHCP Client" -ComputerName $servers | Set-Service -Status Stopped
Get-Service -Name "Task Scheduler" -ComputerName $servers | Set-Service -Status Stopped
Tested, worked, probably as complicated as I need to make that one no?

Sounds like you have that one pretty much solved, but could you use Import-Csv instead of Get-Content, since Import-Csv preserves ordering?

sloshmonger
Mar 21, 2013
I used PowerShell to read through a CSV of which location I visited each day to generate formatted expense reports. Makes it a 2-minute thing at the end of each week instead of 10 minutes. I should be breaking even timewise sometime in 2018.

sloshmonger
Mar 21, 2013

Avenging_Mikon posted:

Okay, so I'm trying to do a lab in Powershell in a Month of Lunches, and early in the chapter, I was able to set the prompt to hkcu:\ and work from there, but trying to go to HKEY_CURRENT_USERS gets me a "drive does not exist with that name" error. I looked in the registry, and it is there. I've tried set-location -Path HKEY_Current_User, with and without a colon and/or backslash, and from multiple directories, ranging from C:\ to HKCU:\

Where am I loving up?

You can use the command Get-PSDrive to get a list of all current drives, and use the information in the Name field to switch to another drive. So Set-Location -Path HKCU: should work, and does on my machine. Or cd HKCU: if you want to be oldschool.

sloshmonger
Mar 21, 2013

Newf posted:

I have a powershell script that automates some file backup, but the script has become slow as the number of files has grown - it's currently copying the entire working directory into the backup locations with each use, when it'd be better to just copy the new files.

IE, with the following files, it currently copies all five files, overwriting files 1, 2, and 3 in the backup directory, where it should be copying only file 4 and 5. (The files themselves don't change, so there's no worry that fileX in the backup directory will fall out of date).

code:
Working dir       Backup dir
-----             -----
file1             file1
file2             file2
file3             file3
file4
file5
The current one-liner is

code:
Copy-Item -Recurse -Force .\path\to\working\dir .\path\to\backup\dir
but I can't find any flag on the copy-item command that prevents clobbering existing files.

Suggestions?

If you're adamant about using PowerShell, I think you can take out the -Force flag and instead add -ErrorAction SilentlyContinue

But the real answer is above my post.
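If you do stick with PowerShell, another option is to test each name against the backup first and copy only the missing ones. A sketch, using temp folders to stand in for the working/backup dirs from the file1..file5 example (flat folders assumed — a recursive version would need to mirror relative paths):

```powershell
# Hedged sketch; temp folders recreate the file1..file5 example from the post
$working = Join-Path $env:TEMP "newf_working"
$backup  = Join-Path $env:TEMP "newf_backup"
New-Item -ItemType Directory -Force -Path $working, $backup | Out-Null
1..5 | ForEach-Object { New-Item -ItemType File -Force -Path (Join-Path $working "file$_") | Out-Null }
1..3 | ForEach-Object { New-Item -ItemType File -Force -Path (Join-Path $backup "file$_") | Out-Null }

# Copy only the names not already present in the backup (file4 and file5 here)
Get-ChildItem $working -File | Where-Object {
    -not (Test-Path (Join-Path $backup $_.Name))
} | Copy-Item -Destination $backup
```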

sloshmonger
Mar 21, 2013
This is probably something so simple, but I just found out about #region and #endregion for grouping and I am loving blown away.

sloshmonger
Mar 21, 2013

devmd01 posted:

Needed a way to track Windows Management Framework versions across all my 2012R2 servers as I roll out WMF 5.1...I love -asjob.


code:
$servers=(Get-ADComputer -Filter 'OperatingSystem -like "*Windows Server 2012 R2*"' -Properties * -SearchBase "OU=Servers,DC=corporate,DC=domain,DC=com" | Select-Object -ExpandProperty Name | Sort-Object) | foreach-Object{invoke-command -computername $_ -scriptblock {$PSVersionTable.psversion} -asjob} | get-job | wait-job | receive-job | export-csv .\2012R2PowershellVersion.csv -notype
Any pssession errors are reflected to the console and I just skim them to verify that they're decommissioned machines that haven't been cleaned up yet.

I'm in the middle of this myself. Have you found a way to automate the installation of WMF 5.1 through powershell?

sloshmonger
Mar 21, 2013

PierreTheMime posted:

Sorry this is a fairly basic question but for the life of me I cannot find a good example of how to update individual access rules for an ACL. I'm trying to set up a user with specific read/write/delete access for files only (essentially just removing "Create Folders / append data" and "Delete" from the Advanced permissions area. Making ruleset for basic permissions is simple enough, but what is the command to modify the FileSystemRights for the special permissions?

I'm not quite sure what you're asking, but let's assume you are defining the ACL similar to this
code:
$objACLFullControl = [System.Security.AccessControl.FileSystemRights]::FullControl
$objACL = New-Object System.Security.AccessControl.FileSystemAccessRule ($objAdGroup.SamAccountName,$objACLFullControl,('ContainerInherit, ObjectInherit'),'None','Allow')


Then you just need to define a FileSystemRights object with the permissions you want to give. Check the MSDN below for what is available.
You may need to do this multiple times if you're granting multiple advanced permissions -- I haven't done that myself.

See: FileSystemAccessRule https://msdn.microsoft.com/en-us/library/system.security.accesscontrol.filesystemaccessrule(v=vs.110).aspx
or
FileSystemRights https://msdn.microsoft.com/en-us/library/system.security.accesscontrol.filesystemrights(v=vs.110).aspx

sloshmonger
Mar 21, 2013
Has anyone started using PowerShell Core? I'm wondering which use cases it handles better than Windows PowerShell at this stage.

sloshmonger
Mar 21, 2013
I don't know if it helps this situation at all, but the PowerShell Core 6.x version of Invoke-RestMethod has a -ResponseHeadersVariable parameter that outputs the response headers to a variable.

If it's an only-need-it-once type thing, maybe Fiddler?

Otherwise, you're stuck using Invoke-WebRequest and digging the headers out of the response object yourself.
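For reference, the Core version looks something like this — the URL is a placeholder:

```powershell
# Hedged sketch; needs PowerShell Core 6+, and the URL is a placeholder
$body = Invoke-RestMethod -Uri "https://api.example.com/thing" -ResponseHeadersVariable rh
# The variable name is passed without the $; the headers land in $rh as a dictionary,
# e.g. $rh["Content-Type"], while $body still gets the parsed response
```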

sloshmonger
Mar 21, 2013

FISHMANPET posted:

Do people have thoughts on foreach vs ForEach-Object? Is one or the other more readable? Better performing? More powershelly?

I prefer foreach because it reads better to me but that could be the last vestiges of Python being my native language. My coworker uses ForEach-Object and it looks kinda dirty and messy but piping does seem more in line with doing things the Powershell way.

For some reason, there's a performance difference between those two seemingly identical constructs.

https://blogs.technet.microsoft.com/heyscriptingguy/2014/07/08/getting-to-know-foreach-and-foreach-object/


code:
$time = (Measure-Command {
    1..1E4 | ForEach-Object {
        $_
    }
}).TotalMilliseconds
 [pscustomobject]@{
    Type = 'ForEach-Object'
    Time_ms = $Time
 }

$Time = (Measure-Command {
    ForEach ($i in (1..1E4)) {
        $i
    }
}).TotalMilliseconds
  [pscustomobject]@{
    Type = 'ForEach_Statement'
    Time_ms = $Time
 }
Running it now in PowerShell 5.1.17134 gives ForEach-Object 95ms to count to 10,000, whereas foreach does it in 17ms. Your system may be fast enough to make the difference negligible. I used to exclusively use ForEach-Object, but have converted to foreach since I evaluated that myself.

sloshmonger
Mar 21, 2013

PBS posted:

ForEach-Object: 69.9019716ms
ForEach_Statement: 7.292194ms
ForEach_Method: 39.3125347ms

anthonypants posted:

And the method is supposed to be even faster.

I had no idea the method even existed. Thanks for showing me that.

I've usually gone with functions or statements.

I guess you could say I'm not a... Method Man

sloshmonger
Mar 21, 2013

FISHMANPET posted:

Trying to construct a function with some parameter sets, not sure if what I want to do is actually possible. This is a "get" function. By default I want it to get "all" the parameters, but I want to allow using -AllFields:$false to only get "default" values. I ALSO want to, in a separate parameter set, allow a -Fields parameter where the user can specify which fields they want. It wouldn't make sense to pass AllFields and Fields in the same command so I'd like to use Parameter Sets to prevent that... So I wrote this but it doesn't seem very elegant. Checking if AllFields was specified then acting on it, else checking if fields was specified and acting on that, else building the default URL. And using a DefaultParameterSetName that doesn't exist.

I guess I could also do a switch statement on $PSCmdlet.ParameterSetName but that doesn't change the meat of the problem which is that I'm having to define my "default" twice. I also don't know which is better coding practice, what I've got above or switch on ParameterSetName.

I abandoned the recent opportunity I had to use them, but I believe the search term you're looking for is Dynamic Parameters or dynamicparam. If you find a good example, please report back. Hopefully that'll set you in the right direction so you don't have to double/triple your effort.

Edit: Ignore this. Mario has it below.

Irritated Goat posted:

Help me save my own sanity. I’m trying to scrape for a specific device on specific PCs using WMI wrapped in an invoke-command. If I do it manually PC by PC, it works. It’s only when I add in Import-CSV file.csv | does it return nothing.

The CSV is just:
Name
PC1
PC2
PC3

Ideas?


Can you tell if it's misbehaving on your computer, or on the remote computer?
Are you passing local variables to the remote computer with the $using:variable?
Credentials being passed successfully?
Do you know at what point in your script/import it's failing?

One thing that helps me troubleshoot a new script is putting Write-Host "Some-Command -and $arguments1 -but $argument2 | Do-Something -useful" right before whatever the command is, so I can see what the script is actually running versus what I think it should be running. All of a sudden I'm passing a $null argument? Easy to see.

sloshmonger fucked around with this message at 03:42 on Feb 20, 2019

sloshmonger
Mar 21, 2013

CzarChasm posted:

I'm having a problem with trying to get a split or another way to break up a string to work. Overall, what I'm trying to do is go to my (GMAIL) mailbox and run a report on all the messages that are in there, including the file names of attachments. Here's what I have.
code:
if ($msg2.attachments -NE "") { # If the message has an attachment 
        $ContentType = $msg2.Headers.'Content-Type' #Grab the value in Content-Type from the Header
        $Parts = $ContentType.split("=")        #Split the value based on where "=" appears in the string
        $attachNames = $Parts[1]}   #Grab the right half of the string where the split happens
Note: msg2.attachments does not have a "Name" or other ID value, it basically just shows the value "AE.Net.Mail.Attachment" for any attachment. Basically just a "Yup, this email has an attachment"

Unfortunately, the split method doesn't work on $ContentType - according to powershell, it does not contain that method. I also tried using a substring method, but that is also not present.

I have tried converting the variable to string by using [string]$ContentType, but that didn't really do anything

As an example, this is what $ContentType contains if there is an attachment:
Content-Type: application/octet-stream; name=lead_attachment.xml

I have a second issue where there could be multiple instances of Content-Type in the header, but if I can't get it to pull the string data anyway there's not much point in getting it to narrow down the right value.

You may be able to use the PowerShell -split operation rather than the .NET .split() method
code:
$Parts = $ContentType -split "="
I have a feeling you're going to need to stop worrying and love the regex.
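Since the header in the example has the form name=..., the -match operator with a named capture group does the extraction in one step. The header string below is copied from the post; everything else is illustrative:

```powershell
# The example header value from the post above
$ContentType = "Content-Type: application/octet-stream; name=lead_attachment.xml"
# -match populates the automatic $Matches hashtable on success
if ($ContentType -match 'name=(?<file>[^;\s]+)') {
    $attachName = $Matches['file']
}
$attachName   # lead_attachment.xml
```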

sloshmonger
Mar 21, 2013

PierreTheMime posted:

Here's a quick verbose attempt at what you're asking for:

A lot of this can be reduced and it's mostly like this just for a better visualization of each step, here's the slightly-more-condensed version:
PowerShell code:
Mostly good code
This would depend on the OP's problem, but if Folder3 has subfolders that need to be checked, the files in them would be missed.

PowerShell code:
ForEach ($Folder in Get-ChildItem -Path '.\Folder1') {
	ForEach ($Subfolder in Get-ChildItem -Path $Folder.FullName -Directory) {
		ForEach ($File in Get-ChildItem -Path $Subfolder.FullName -File -Recurse) {
			$FileFolder = "$($Subfolder.FullName)\$($File.Basename)"
			If (!(Test-Path -PathType Container -Path $FileFolder)) { New-Item -ItemType Directory -Force -Path $FileFolder }
			Move-Item $File.FullName -Destination $FileFolder
		}
	}
}

sloshmonger
Mar 21, 2013

Toshimo posted:

So, every 3 weeks I have to change the passwords on a few hundred test accounts and afterwards, the services running as those accounts have to be changed as well.

I wrote a script to go out to the associated servers and change the passwords on the services and it 100% works as-advertised except... the services tend to revert to the old password after reboot.

Is there something I'm missing here to make this persistent?

code:
gwmi -NameSpace "root\CIMV2" -Class "Win32_Service" -ComputerName $Computer_Name -Filter "Name LIKE '0%Controller'" | % { $_.StopService() }
gwmi -NameSpace "root\CIMV2" -Class "Win32_Service" -ComputerName $Computer_Name -Filter "Name LIKE '0%Controller'" | % { $_.change($null,$null,$null,$null,$null,$null, "DOMAIN\$($_.Name -match '_(\d)_' | % { $Matches[1] })", $New_Password, $null, $null, $null ) }
gwmi -NameSpace "root\CIMV2" -Class "Win32_Service" -ComputerName $Computer_Name -Filter "Name LIKE '0%Controller'" | % { $_.startservice() }

Assuming that your AD environment will support it, this is exactly the case a Managed Service Account (or Group Managed Service Account) was made for, unless you're also using those accounts to log in or for other manual entries.

I'm not any help as to the issue you raised, though.
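For reference, standing up a gMSA is only a couple of cmdlets. Everything named below is hypothetical, and it assumes the domain already has a KDS root key:

```powershell
# Hedged sketch of the gMSA route; account and group names are made up
# Requires the ActiveDirectory module and an existing KDS root key in the domain
New-ADServiceAccount -Name "svcController" -DNSHostName "svcController.contoso.com" `
    -PrincipalsAllowedToRetrieveManagedPassword "ControllerServers"
# Then on each member server that runs the services:
Install-ADServiceAccount -Identity "svcController"
# Point the service at DOMAIN\svcController$ with a blank password; AD rotates it for you
```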

sloshmonger
Mar 21, 2013

a hot gujju bhabhi posted:

I'm trying to automate some IIS stuff and I'm getting stuck trying to assign an SSL certificate to an HTTPS binding. I'm using the following command:
code:
(Get-WebBinding -Name $name -Protocol "https" -HostHeader $header).AddSslCertificate($cert.Thumbprint, "WebHosting")
$cert is an X509Certificate2 that I've already imported to the WebHosting store for LocalMachine.

The error I'm getting is: A specified logon session does not exist. It may already have been terminated.


Try this:
code:
netsh http add sslcert hostnameport="${FQDNName}:443" appid="$guid" certhash="$($Cert.Thumbprint)" certstorename=MY
New-WebBinding -name $iisSite -Protocol https  -HostHeader $FQDNName -Port 443 -SslFlags 1 -Force

sloshmonger
Mar 21, 2013
Then expect to learn a lot about runspaces. Which are great for that, if a little hard to get at.
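A minimal runspace-pool sketch in case it helps anyone get started — this just doubles some numbers in parallel and isn't tied to any post above:

```powershell
# Minimal runspace pool example: doubles 1..10 across up to 4 runspaces
$pool = [runspacefactory]::CreateRunspacePool(1, 4)
$pool.Open()
$jobs = foreach ($n in 1..10) {
    $ps = [powershell]::Create()
    $ps.RunspacePool = $pool
    [void]$ps.AddScript({ param($x) $x * 2 }).AddArgument($n)
    [pscustomobject]@{ Shell = $ps; Handle = $ps.BeginInvoke() }
}
# EndInvoke blocks until each runspace finishes and returns its output
$results = foreach ($j in $jobs) { $j.Shell.EndInvoke($j.Handle) }
$jobs | ForEach-Object { $_.Shell.Dispose() }
$pool.Close()
```

The boilerplate-to-work ratio is why most people wrap this in a helper function (or grab a module that does).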

sloshmonger
Mar 21, 2013

Lum posted:

Having PSRemoting fun again...

So I have a script that runs once a day, connects to a server on a different domain using Invoke-Command -Cred $Creds. Once it's connected it runs some SQL backups, passes the details back and then we copy those backups over SMB. It's done this way because inbound connections to your backup server are a bad idea in a post-cryptolocker world.

It's worked for months and then started failing with "The WSMan service could not launch a host process to process the given request". Couldn't find much on Google but I checked the TrustedHosts stuff, restarted the WSMan service, added -SessionOption (New-PSSessionOption -NoMachineProfile), even tried upgrading from PS 5.1 to PS 7.2

Eventually what cured it was a reboot.

Unfortunately that only lasted for 5 days and now the problem is back.

Any idea what it could be? I can't keep rebooting this server as it's not clustered or anything.

How does it close out the connection? Maybe it's keeping ports open and you're seeing port exhaustion?

sloshmonger
Mar 21, 2013

Boywhiz88 posted:

OK, we're making folders how we want them to be named, and where we want them to be!

Hurrah.

Now to get into permissions.

At this time, I'm hoping to have the $username folder inherit permissions, add PCNAME\$username (when introduced into work, it'll be DOMAIN\$username), and provide Modify rights.

Afterwards, it will generate a subfolder called PERSONAL. Turn off inheritance while keeping permissions, and removing a specific SG that we have at work.

I've got it adding the $username to the folder permissions (i setup a dummy user on my PC), but it doesn't successfully add the Modify "Allow" mark. I've tried it as this:

$ACL = Get-Acl $path

$perm = New-Object System.Security.AccessControl.FileSystemAccessRule("PCNAME\$username", "Modify", "Allow")

$ACL.SetAccessRule($perm)

$ACL | Set-ACL $path


Which is a copy&paste from pretty much everything you see on this. I've also tried it where it calls out inheritance/propagation w/ no difference. No fails when testing the script... just doesn't lock in the permissions.

Looks like you've got the right constructor for the $perm variable, but you're using the SetAccessRule method on $ACL. That replaces any existing access rules for that identity with the one you specify (https://learn.microsoft.com/en-us/dotnet/api/system.security.accesscontrol.directorysecurity?view=net-7.0). If you want to add an additional permission on top of the parent permissions, use the AddAccessRule method.


Try this:
$ACL = Get-Acl $path
$perm = New-Object System.Security.AccessControl.FileSystemAccessRule("PCNAME\$username", "Modify", "Allow")
$ACL.AddAccessRule($perm)
$ACL | Set-ACL $path
$Subfolder = new-item -ItemType Directory -Name "PERSONAL" -path $Path #Creates a new subfolder and keeps it as a variable
$SubACL = $ACL.psobject.Copy() #Creates a copy of the $ACL variable while keeping the original
$SubACL.SetAccessRuleProtection($True, $True) #The first part says is this folder protected or not (opposite of inherited). The second is should the current acl be copied.
$BadPerm = New-Object System.Security.AccessControl.FileSystemAccessRule("PCNAME\GroupName", "Modify", "Allow") #Change this to be whatever the group you don't want inherited, and make sure the Permission level matches. There's a way to get this through scripting but if it's all the same this is faster
$SubACL.RemoveAccessRule($BadPerm) #Removes the group permission above
$SubACL | Set-ACL $Subfolder.FullName #And set it on the subfolder

sloshmonger
Mar 21, 2013
Glad to see you got it working!


sloshmonger
Mar 21, 2013

Boywhiz88 posted:

No idea, but when you see it like that, you can't help but think... there's gotta be a way.

I have what I think is an impossible ask:

I'm trying my damndest to find a process where I can import a CSV (preferably) with appointments filled out to an M365 Room Mailbox.

The idea here is that we have our standard holidays, and we want to book out our conference rooms. I'm hoping to automate in some way, vs my boss sending out manual invites/manually logging into each mailbox.

Thoughts or leads? Right now, I'm coming up short but I feel crazy because you think it would be possible.

EDIT: Realizing the EWS API isn't as deprecated as I might have thought... will consider that avenue because there's some stuff available. But I'm just surprised there's no way via EXO PowerShell.

You're going to have to do some work in the Graph API if you want to do that.
https://learn.microsoft.com/en-us/graph/api/calendar-post-events?view=graph-rest-1.0&tabs=http
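A hedged sketch of what that POST looks like from PowerShell. The token acquisition, room address, and event details are all placeholders, and the app registration needs Calendars.ReadWrite granted:

```powershell
# Hypothetical sketch: create an all-day event on a room mailbox via Microsoft Graph
# $token, the room address, and the dates are placeholders
$event = @{
    subject  = "Company Holiday"
    isAllDay = $true
    start    = @{ dateTime = "2024-12-25T00:00:00"; timeZone = "UTC" }
    end      = @{ dateTime = "2024-12-26T00:00:00"; timeZone = "UTC" }
} | ConvertTo-Json -Depth 3
Invoke-RestMethod -Method Post `
    -Uri "https://graph.microsoft.com/v1.0/users/room1@contoso.com/calendar/events" `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType "application/json" `
    -Body $event
```

Looping that over Import-Csv rows of holidays and rooms would cover the whole list without anyone logging into each mailbox.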
