Pile Of Garbage
May 28, 2007



22 Eargesplitten posted:

I wrote a 1-liner yesterday that I had some trouble with.

code:
gci /path . -include "*.auc" /force /recurse 
I also tried without specifying the path. It worked on a few folders, but I got an error saying access was denied on most of them. I ran it as an administrator, so I don't see why that should be. I even got it on some of my own user folders.

Lmao yeah your syntax is all kinds of messed up (Funny how it still works though). This looks a bit nicer:

code:
Get-ChildItem -Path '.\' -Include '*.auc' -Force -Recurse
However, the access denied errors you are seeing are most likely due to it trying to traverse NTFS junction points, of which there are many in user profile folders (e.g. "C:\Users\Default\Application Data", "C:\Users\Default\Start Menu", etc.). These junction points are set to hidden, so if you run Get-ChildItem without the Force switch it won't try to traverse them. However, if the files you are searching for are also hidden then you may just have to catch the exceptions.
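If you do end up having to deal with the exceptions, one approach (my sketch, reusing the '*.auc' pattern from above; the variable name is made up) is to silence them on-screen but collect them with -ErrorVariable for review afterwards:

code:
# Keep the scan going, stash every access-denied error in $denied
Get-ChildItem -Path '.\' -Include '*.auc' -Force -Recurse -ErrorAction SilentlyContinue -ErrorVariable denied
# Each ErrorRecord remembers which path failed
$denied | ForEach-Object { Write-Warning $_.TargetObject }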


Zaepho
Oct 31, 2013

GPF posted:

Looking at it now, I can see a few places where I could have been more efficient or cut 5 lines down to 2 or less, but this runs at an acceptable speed and, most importantly, does what I need it to do.

Excellent script! A couple of minor critiques.

I like to always put things like your $Servers into a Param block so I can feed the script from anything.
$Servers should explicitly be declared as an [array] so that a single item doesn't get handled wrong (this is more about bulletproofing for use by others).
$myJobs should also explicitly be declared as an [array], just in case.

These are really just some quick bulletproofing and "best practice" (yeah yeah, at least what has seemed to keep me out of the most trouble) items. Script looks pretty drat solid and useful. It's a pity that it's even necessary, though. Seems like purging old failed jobs should be something the print spooler is capable of on its own.
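For illustration, the Param-block suggestion sketched out (the server names are made-up placeholders):

code:
Param(
    # Explicit [array] so a single server name still behaves like a one-element array
    [array]$Servers = @('PRINTSRV01', 'PRINTSRV02')
)
[array]$myJobs = @()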

GPF
Jul 20, 2000

Kidney Buddies
Oven Wrangler

Zaepho posted:

Excellent script! A couple of minor critiques.

These are really just some quick bulletproofing and "best practice" (yeah yeah, at least what has seemed to keep me out of the most trouble) items. Script looks pretty drat solid and useful. It's a pity that it's even necessary, though. Seems like purging old failed jobs should be something the print spooler is capable of on its own.

Thanks for the compliments. You're right about what you referenced, but this is currently a script that only I run. When it's just me, I'll tend to make the script a bit 'fragile' so I can make changes quickly depending on the circumstances. When I'm automating or writing stuff for others to use, it's a totally different story. Validate all input! Abort at the first sign of trouble! Be kind, rewind!

And, you're right on that last point as well. Why in the hell is the MS print spooler so damned...SQUIRRELY!!??!!?? Why can't HP/Lexmark/Hitachi/Samsung/Canon/Brother/Epson [gently caress Epson in a business situation] actually write drivers that are print server-friendly? GAHH!!

22 Eargesplitten
Oct 10, 2010



mystes posted:

Why are you using slashes for parameters? Powershell uses dashes.

:doh: My only excuse is that I have also been working in the command line.

The files are in appdata, but once I've gotten into appdata everything is visible. I'll try again with the correct syntax. Since they're all in appdata\local, I'll probably run from Users and do *\appdata\local for the path. It sounds like force should help when using a dash instead of a slash.

E: that worked. Strangely enough, I was still getting a bunch of errors, but not for the folders I needed.

Now my question is how to find registry keys by value. I've got server entries in the registry, so I want to be able to search for a particular value and replace all instances of that value with the new server name. The Google searches I have done are only bringing up searches for the key name, not the value.

22 Eargesplitten fucked around with this message at 17:11 on Sep 9, 2016

nielsm
Jun 1, 2009



22 Eargesplitten posted:

Now my question is how to find registry keys by value. I've got server entries in the registry, so I want to be able to search for a particular value and replace all instances of that value with the new server name. The Google searches I have done are only bringing up searches for the key name, not the value.

First, do you actually mean keys, or do you actually mean values?
Keep in mind that the registry has this odd structure where the "folders" are called keys, and each key has a number of values. A value has a name, a datatype, and data. The values are the "files".
(Historically, there was only one value per key, the one now represented with a blank/null name, shown as (Default) in regedit.)

Anyway, first figure out if you're looking for keys, or for key+value name. And whether you're searching by value data or something else.

When you use Get-ChildItem in a registry provider tree in PowerShell (e.g. HKLM:\ ) then you only get the keys out as objects, the values are accessed roundabout through the keys.

You can do something like this, at least:
code:
Get-ChildItem HKLM:\SOFTWARE\Classes\ |
  where { $_.GetValue("PerceivedType") -eq "video" }
If you don't know the value name it gets much more annoying. Microsoft.Win32.RegistryKey objects in PowerShell just get a NoteProperty called "Property" that contains a string array of value names, so you need to call GetValue for each value name you want to test something against.
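A rough sketch of that roundabout approach, reusing the PerceivedType/video example from above:

code:
Get-ChildItem HKLM:\SOFTWARE\Classes\ | Where-Object {
    $key = $_
    # "Property" holds the value names; call GetValue on each to test the data
    $key.Property | Where-Object { $key.GetValue($_) -eq 'video' }
}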

22 Eargesplitten
Oct 10, 2010



I guess I'm looking for the value data then? My lead admin calls the values keys, if I'm understanding you right. He calls the individual items you edit keys. I'm not surprised, though. Everyone here calls DisplayPort "Dell DVI" and it drives me crazy.

It's 5PM on a Friday, so I'm not quite sure what you're saying at the end there. That stuff you wrote wouldn't find the data (in this case a server name that a program connects to) in a value, right?

nielsm
Jun 1, 2009



22 Eargesplitten posted:

I guess I'm looking for the value data then? My lead admin calls the values keys, if I'm understanding you right. He calls the individual items you edit keys. I'm not surprised, though. Everyone here calls DisplayPort "Dell DVI" and it drives me crazy.

It's 5PM on a Friday, so I'm not quite sure what you're saying at the end there. That stuff you wrote wouldn't find the data (in this case a server name that a program connects to) in a value, right?

My example searches for keys ("folders") that have a value named "PerceivedType" with a data contents of "video".
E.g. for this, it would find the "HKLM:\SOFTWARE\Classes\.MKV" item.

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum
I think what 22 Eargesplitten wants is a method to return all registry values with a content of "video"?

mystes
May 31, 2006

anthonypants posted:

I think what 22 Eargesplitten wants is a method to return all registry values with a content of "video"?
Something like:
code:
function getMatchingValues($key, $data) {
    $properties = Get-Item $key | Select-Object -ExpandProperty property
    return $properties | ? { (Get-ItemProperty $key).psobject.properties[$_].value -eq $data }
}

function findRegistryData($startpath, $data) {
    Get-ChildItem -Recurse $startpath | ? { $_ | getMatchingValues $_.pspath $data }
}

findRegistryData "HKLM:\" "video" | % { $_.name }
This is way more confusing than it should be due to PowerShell being generally a terrible language.

mystes fucked around with this message at 04:28 on Sep 10, 2016

Pile Of Garbage
May 28, 2007



*bursts into thread, panting and out of breath*

I think you'll find that's more just the registry provider being terrible, not PowerShell itself.

Edit: PowerShell is the best because you can do this:

code:
(New-Object Media.SoundPlayer([Text.Encoding]::UTF8.GetString([Convert]::FromBase64String('aHR0cDovL2JpdC5seS8xQnJaQ3Rr')))).PlayLooping()

Pile Of Garbage fucked around with this message at 14:40 on Sep 13, 2016

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum

cheese-cube posted:

*bursts into thread, panting and out of breath*

I think you'll find that's more just the registry provider being terrible, not PowerShell itself.

Edit: PowerShell is the best because you can do this:

code:
(New-Object Media.SoundPlayer([Text.Encoding]::UTF8.GetString([Convert]::FromBase64String('aHR0cDovL2JpdC5seS8xQnJaQ3Rr')))).PlayLooping()
It's true. All of it.

MisterZimbu
Mar 13, 2006
I'm trying to automate some IIS site creation to make my job easier / make us look slightly more professional.

The thing I'm trying to do at this point is remove a duplicate binding from another site if it exists (this would be behind a -Force parameter, of course).

I'm using Get-WebBinding with some filters, but is there a way to pass the -HostHeader parameter so it only returns bindings with an empty host header? If I just omit the parameter or pass it an empty string it just doesn't filter.

Example bindings:
*:80:site1.local
*:80:
*:80:site2.local

code:
Get-WebBinding -IPAddress "*" -Port 80 -HostHeader ""
I want to get back just the "*:80:" binding, but I instead get all three.

code:
Get-WebBinding -IPAddress "*" -Port 80 | Where-Object { $_.bindingInformation.EndsWith(":${HostHeader}") }
Seems to work, but is there a more elegant way?
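One possible variation (just a sketch, not necessarily more elegant): since bindingInformation is always "IP:port:hostheader", splitting on the colons makes the empty-host-header case an explicit test:

code:
Get-WebBinding -IPAddress "*" -Port 80 | Where-Object {
    # bindingInformation looks like "*:80:site1.local"; index 2 is the host header
    ($_.bindingInformation -split ':')[2] -eq ''
}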

Sefal
Nov 8, 2011
Fun Shoe
I spent way too long trying to use robocopy to move directories.

The source dir looked like this: \\path\path2\path3\pat*\path4\p*
Destination: \\path1\path2\path3

Robocopy had issues with the wildcard in the source dir. I spent a lot of time trying to figure it out. In the end all I needed were 3 lines of code.
Felt pretty stupid afterwards, but at least I learned that Move-Item also moves the child directories and files beneath it.



code:
 $path = "\\path\path2\path3\pat*\path4\p*"
 $destination = "\\path1\path2\path3"

 Move-Item -Path $path -Destination $destination

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum

Sefal posted:

I spent way too long trying to use robocopy to move directories.

The source dir looked like this: \\path\path2\path3\pat*\path4\p*
Destination: \\path1\path2\path3

Robocopy had issues with the wildcard in the source dir. I spent a lot of time trying to figure it out. In the end all I needed were 3 lines of code.
Felt pretty stupid afterwards, but at least I learned that Move-Item also moves the child directories and files beneath it.



code:
 $path = "\\path\path2\path3\pat*\path4\p*"
 $destination = "\\path1\path2\path3"

 Move-Item -Path $path -Destination $destination
Robocopy's source and destination parameters are for folder names.

Mo_Steel
Mar 7, 2008

Let's Clock Into The Sunset Together

Fun Shoe

cheese-cube posted:

*bursts into thread, panting and out of breath*

Edit: PowerShell is the best because you can do this:

code:
(New-Object Media.SoundPlayer([Text.Encoding]::UTF8.GetString([Convert]::FromBase64String('aHR0cDovL2JpdC5seS8xQnJaQ3Rr')))).PlayLooping()

code:
Add-Type -AssemblyName System.speech
$message = New-Object System.Speech.Synthesis.SpeechSynthesizer
$message.Speak("This is the FBI, step away from your keyboard. You have 10 seconds to comply.")
Create a scheduled task and have it generate a random delay after running at startup / login, wait for the next time your buddy leaves his machine unlocked.

Pile Of Garbage
May 28, 2007



Better yet: add it to their PowerShell profile so that it runs whenever they launch PowerShell.

adaz
Mar 7, 2009

I've been super lazy about keeping the OP updated, folks; I was going to make a pass at it later today with links to the newer docs, releases, and stuff like the free e-books Don Jones and co. have put out. Anyone else have any recommendations?

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
http://ss64.com/ps/

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum
http://powershell.office.com/

beepsandboops
Jan 28, 2014
I want to shoot off a command to our on-prem Exchange and Lync servers as part of our new user script.

Right now, I'm using Import-PSSession for each server then running the enable user command, but that seems to have a lot of overhead if I just want to run just the one command.

Invoke-Command doesn't seem to be able to use the product-specific commandlets. Is there a better way to run just one Exchange/Lync command remotely?

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum

beepsandboops posted:

I want to shoot off a command to our on-prem Exchange and Lync servers as part of our new user script.

Right now, I'm using Import-PSSession for each server then running the enable user command, but that seems to have a lot of overhead if I just want to run just the one command.

Invoke-Command doesn't seem to be able to use the product-specific commandlets. Is there a better way to run just one Exchange/Lync command remotely?
I think Import-PSSession is basically what the Exchange Management Shell does, so I don't think so.

nielsm
Jun 1, 2009



beepsandboops posted:

I want to shoot off a command to our on-prem Exchange and Lync servers as part of our new user script.

Right now, I'm using Import-PSSession for each server then running the enable user command, but that seems to have a lot of overhead if I just want to run just the one command.

Invoke-Command doesn't seem to be able to use the product-specific commandlets. Is there a better way to run just one Exchange/Lync command remotely?

I can definitely use Invoke-Command to run individual cmdlets on Exchange, without importing the session or making a module from it. I just connect the session with New-PSSession, usually using -Name to set a name for the session so I can easily grab it later with Get-PSSession.
(Actually I have a custom Get-ExchangeSession function that tries to get the named session, and if that fails then it establishes one with that name and returns it.)

Then just store the session reference into a variable and pass that for -Session on Invoke-Command.

I can post an example tomorrow when I'm at work.

nielsm
Jun 1, 2009



Basic example of doing Exchange management via Invoke-Command instead of an imported PSSession:
code:
Function Get-ExchangeSession {
    $exchange = Get-PSSession -Name "Exchange" -ErrorAction SilentlyContinue
    if (-not $?) {
        Write-Host "Establishing Exchange session..."
        $exchange = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "http://exchange.contoso.com/PowerShell" -name "Exchange"
    }
    return $exchange
}

Function Set-MailboxFullAccess($sharedMailbox, $user) {
    $exchange = Get-ExchangeSession
    Invoke-Command -Session $exchange { param($id,$u) Add-MailboxPermission -Identity $id -User $u -AccessRights "FullAccess" } -ArgumentList $sharedMailbox,$user
}

Eschatos
Apr 10, 2013


pictured: Big Cum's Most Monstrous Ambassador
I've started teaching myself powershell this year and so far used it to solve a variety of relatively simple problems - running email traces, querying workstation uptime to see when users are lying about having restarted, and other basic stuff that can be handled in a 10 line script or less. Now I've got a new significant project that will be a massive pain in the rear end to do manually. Naturally I want to automate it, but am not completely sure that what I want to do is reasonably possible via powershell.

Basically the problem is that my company's IT department's asset tracking spreadsheet for workstations, laptops and such is woefully out of date. Updating it manually means walking around to hundreds of different machines and plugging in serial numbers and such into an Excel spreadsheet. If I can automate asset creation I can speed up the manual process to only include laptops that don't connect to the domain often.

Here's the steps I have envisioned to make this happen.

Start off by enabling Powershell remoting on all workstations through Group Policy. - I've already started to roll this out, testing with a single location so far.
Then set up scripts to do the following:
1. Iterate through every computer in AD.
2. Filter out the servers and output the names to csv.
3. Create a function that can query a given computer for as much asset sheet info as possible - brand/model/location(based on ip address)/last logged on user/sn/deploy date/etc.
4. Create a script that can run the above function en masse, taking a csv list of names for input and outputting a csv of results. Bonus points if it automatically removes the names of PCs it successfully retrieved info on from the original csv(or creates a new one with failed queries).

Does this sound like a reasonable way to go about doing this? I've already figured out steps 1 and 2, but am hitting a brick wall in regards to figuring out the rest.

sloshmonger
Mar 21, 2013

Eschatos posted:

I've started teaching myself powershell this year and so far used it to solve a variety of relatively simple problems - running email traces, querying workstation uptime to see when users are lying about having restarted, and other basic stuff that can be handled in a 10 line script or less. Now I've got a new significant project that will be a massive pain in the rear end to do manually. Naturally I want to automate it, but am not completely sure that what I want to do is reasonably possible via powershell.

Basically the problem is that my company's IT department's asset tracking spreadsheet for workstations, laptops and such is woefully out of date. Updating it manually means walking around to hundreds of different machines and plugging in serial numbers and such into an Excel spreadsheet. If I can automate asset creation I can speed up the manual process to only include laptops that don't connect to the domain often.

Here's the steps I have envisioned to make this happen.

Start off by enabling Powershell remoting on all workstations through Group Policy. - I've already started to roll this out, testing with a single location so far.
Then set up scripts to do the following:
1. Iterate through every computer in AD.
2. Filter out the servers and output the names to csv.
3. Create a function that can query a given computer for as much asset sheet info as possible - brand/model/location(based on ip address)/last logged on user/sn/deploy date/etc.
4. Create a script that can run the above function en masse, taking a csv list of names for input and outputting a csv of results. Bonus points if it automatically removes the names of PCs it successfully retrieved info on from the original csv(or creates a new one with failed queries).

Does this sound like a reasonable way to go about doing this? I've already figured out steps 1 and 2, but am hitting a brick wall in regards to figuring out the rest.

I've pretty much already done 3 & 4 in a script that was supposed to step through all subnet IPs and inventory them.

If you can figure out how to get the info from either the registry or WMIC you can easily extract it. It may require PS v5.

code:
Clear-Host
$ADCredential = Get-Credential -Credential "admin@contoso.com" 
$DumpLocation = "C:\Users\name\Documents\Info Dump"
#1. Get responding computers by IP address, resolve names
$LocationA= "192.168.1"
$LocationB= "192.168.11"
$LocationC= "192.168.168"
#Consolidate addresses
$AllInfo = @()
$Addresses = @()
1..254|Foreach{ #Add addresses at LocationA
    $ip = "$LocationA.$_"
    $Addresses += $ip
    }
1..254|Foreach{ #Add addresses at LocationB
    $ip = "$LocationB.$_"
    $Addresses += $ip
    }
1..254|Foreach{ #Add addresses at LocationC
    $ip = "$LocationC.$_"
    $Addresses += $ip
    }
$HostCount = 1
$HostMax = 20 #How many it will seek info from at one time
foreach($Address in $Addresses) {
    IF(Test-Connection -ComputerName $Address -Count 1 -Quiet){
        Start-Job -ArgumentList ($Address,$ADCredential) -ScriptBlock {
            param($add,$cred)
            $RemoteHost = ([System.Net.DNS]::GetHostByAddress($add)).HostName
            $os = Get-WmiObject -Class win32_OperatingSystem -Credential $cred -ComputerName $RemoteHost
            $sys = Get-WmiObject -Class win32_ComputerSystemProduct -Credential $cred -ComputerName $RemoteHost
            $mem = Get-WmiObject -Class win32_PhysicalMemory -Credential $cred -ComputerName $RemoteHost
            $comp = Get-WmiObject -Class win32_ComputerSystem -Credential $cred -ComputerName $RemoteHost
            $TotalMemory = 0
            $mem.Capacity | foreach {$TotalMemory += $_}
            $HostInfo = New-Object psobject
            $HostInfo | Add-Member -NotePropertyName "HostName" -NotePropertyValue $RemoteHost
            $HostInfo | Add-Member -NotePropertyName "Serial#" -NotePropertyValue $sys.IdentifyingNumber 
            $HostInfo | Add-Member -NotePropertyName "Model#" -NotePropertyValue $sys.Name 
            $HostInfo | Add-Member -NotePropertyName "Version" -NotePropertyValue $sys.Version 
            $HostInfo | Add-Member -NotePropertyName "LoggedIn" -NotePropertyValue $comp.UserName 
            $HostInfo | Add-Member -NotePropertyName "CurrentIP" -NotePropertyValue $add 
            $HostInfo | Add-Member -NotePropertyName "WindowsVersion" -NotePropertyValue $os.Version 
            $HostInfo | Add-Member -NotePropertyName "Architecture" -NotePropertyValue $os.OSArchitecture
            $HostInfo | Add-Member -NotePropertyName "Memory (in GB)" -NotePropertyValue ([System.Math]::Round($TotalMemory/1GB,2))
            return $HostInfo
        } 
        if($HostCount -ge $HostMax){
            sleep 5
            $HostCount = 1
            }
        Else{$HostCount++}
    }
}
Get-Job | Wait-Job
$HostJobs = Get-Job
Write-Host "Adding info to Info Dump folder"
foreach($Hostjob in $HostJobs){
    $htemp = Receive-Job -Id $Hostjob.id -Keep -ErrorAction SilentlyContinue
    Write-Host "Attempting to process job $($HostJob.ID)"
    if($htemp.HostName -eq $null){ Write-Host "No valid connection for $($Htemp.CurrentIP)"}
    ELSE{    
        $outfile = New-Item -ItemType File -Path $DumpLocation -Name "$($htemp.HostName).txt" -Force
        $htemp | out-file $outfile
        $AllInfo += $htemp
    }
}
Get-Job | Stop-Job
Get-Job | Remove-Job

$AllDump = ($DumpLocation + "\out.csv")
$AllInfo | Export-CSV $AllDump -Force -NoTypeInformation
Yes, it's pretty crappy, but it does its job. If I had to change anything I'd make it so that the processing is done on the remote side, rather than on the local computer.
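That remote-side idea might look roughly like this (a sketch only; it assumes PS remoting is enabled on the targets and reuses $RemoteHost and $ADCredential from the script above):

code:
# Build the result object on the remote machine and ship it back whole
Invoke-Command -ComputerName $RemoteHost -Credential $ADCredential -ScriptBlock {
    $os  = Get-WmiObject -Class win32_OperatingSystem
    $sys = Get-WmiObject -Class win32_ComputerSystemProduct
    [pscustomobject]@{
        HostName       = $env:COMPUTERNAME
        'Serial#'      = $sys.IdentifyingNumber
        WindowsVersion = $os.Version
    }
}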

Dr. Arbitrary
Mar 15, 2006

Bleak Gremlin
Also, starting with Windows 10, the last boot time is not 100% accurate.
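That's Fast Startup at work: a Windows 10 "shutdown" is partly a hibernate, so the reported boot time can predate the user's last shutdown. To see what Windows claims (only reliable after a true Restart):

code:
# LastBootUpTime survives a Fast Startup shutdown, so it can look stale
(Get-CimInstance -ClassName Win32_OperatingSystem).LastBootUpTime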

Space Whale
Nov 6, 2014
Does findstr (or another utility in powershell) let you do relative paths? For instance, I'm in /AncientSourceControl/. Within that are folders with MONTHYEAR. Within those folders is another level, with DB and DEPLOY:

code:
AncientsourceControl/MONTHYEAR/DB 

AncientsourceControl/MONTHYEAR/DEPLOY
I want to search within all of the .sql files in /DB/. What's the best way to do this?

Also, Serena Dimensions :stonk:

Space Whale fucked around with this message at 18:51 on Sep 29, 2016

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum

Space Whale posted:

Does findstr (or another utility in powershell) let you do relative paths? For instance, I'm in /AncientSourceControl/. Within that are folders with MONTHYEAR. Within those folders is another level, with DB and DEPLOY:

code:
AncientsourceControl/MONTHYEAR/DB 

AncientsourceControl/MONTHYEAR/DEPLOY
I want to search within all of the .sql files in /DB/. What's the best way to do this?

Also, Serena Dimensions :stonk:
dir (yes, and ls) is an alias for Get-ChildItem. Please don't use findstr. You made a mess of your slashes and are confused about what the names of the folders are, so I don't know exactly what you're looking for, but it should look something like this:
code:
Get-ChildItem .\AncientsourceControl\MONTHYEAR\DB\* -include *.sql

Space Whale
Nov 6, 2014

anthonypants posted:

dir (yes, and ls) is an alias for Get-ChildItem. Please don't use findstr. You made a mess of your slashes and are confused about what the names of the folders are, so I don't know exactly what you're looking for, but it should look something like this:
code:
Get-ChildItem .\AncientsourceControl\MONTHYEAR\DB\* -include *.sql

I hit the forward slash since I'm typing this on another computer :v:

Can that do internal searches of those file contents - or could I pipe it?

Basically I want to find "someString" inside of files ending in .sql where I am where it's currentDirectory\Whatever\DB\theFile.sql and it's in FOO\DB\thefile or BAR\DB\thefile

EDIT: The monthyear changes. I want to search for all MONTHYEARS\DB\foo.sql

Space Whale fucked around with this message at 19:10 on Sep 29, 2016

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum

Space Whale posted:

I hit the forward slash since I'm typing this on another computer :v:

Can that do internal searches of those file contents - or could I pipe it?

Basically I want to find "someString" inside of files ending in .sql where I am where it's currentDirectory\Whatever\DB\theFile.sql and it's in FOO\DB\thefile or BAR\DB\thefile

EDIT: The monthyear changes. I want to search for all MONTHYEARS\DB\foo.sql
You could do either Select-String by itself
code:
Select-String -Path .\Whatever\DB\theFile.sql,.\FOO\DB\thefile,.\BAR\DB\thefile,.\MONTHYEARS\DB\foo.sql -Pattern 'someString'
Or if you already have a dir/ls/gci command ready to go you could pipe that into Select-String
code:
Get-ChildItem .\AncientsourceControl\MONTHYEAR\DB\* -include *.sql | Select-String 'someString'

Space Whale
Nov 6, 2014

anthonypants posted:

You could do either Select-String by itself
code:
Select-String -Path .\Whatever\DB\theFile.sql,.\FOO\DB\thefile,.\BAR\DB\thefile,.\MONTHYEARS\DB\foo.sql -Pattern 'someString'
Or if you already have a dir/ls/gci command ready to go you could pipe that into Select-String
code:
Get-ChildItem .\AncientsourceControl\MONTHYEAR\DB\* -include *.sql | Select-String 'someString'

But how can I search recursively through the folder for any MonthYear\HasTobeDB\*.sql ? Like, can I search for "string" in *\DB\*.sql or .\DB\*.sql?

Video Nasty
Jun 17, 2003

Start at the parent directory and use the -recurse flag?

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum

Space Whale posted:

But how can I search recursively through the folder for any MonthYear\HasTobeDB\*.sql ? Like, can I search for "string" in *\DB\*.sql or .\DB\*.sql?
If you want to replace a folder name with a wildcard, you can do that.
code:
Get-ChildItem '.\folder1\*\folder3\' -Recurse -Include *.sql | Select-String 'someString'

Space Whale
Nov 6, 2014

anthonypants posted:

If you want to replace a folder name with a wildcard, you can do that.
code:
Get-ChildItem '.\folder1\*\folder3\' -Recurse -Include *.sql | Select-String 'someString'

Thank you!

Linear Zoetrope
Nov 28, 2011

A hero must cook
How usable is PowerShell just as a terminal? I'm way more used to bash, but now that Ubuntu for Windows is a thing I don't have an overwhelming need to constantly frustrate myself by trying to badly fake a GNU/POSIX environment on Windows with MSYS+MinGW, so I was thinking of moving my command-line programming to the MSVC toolchains and PowerShell as my terminal environment (mostly just opening editors, executing build tools, searching/moving/unzipping files, git, etc.). My first impressions are that it's... a bit verbose for that? I like PowerShell infinitely more for scripting than bash, in that it's actually readable, and the object model is IMO superior to the "everything is plaintext" philosophy of Unix (even if that has some perks wrt tool compatibility), but even when I know the commands it feels kind of cumbersome. Is it mostly intended for scripting and not extended terminal use? Am I just not used to it and being grumpy?

kaynorr
Dec 31, 2003

It's an excellent terminal - there are aliases for the most commonly used commands so you aren't always typing Get-ChildItem, Set-Location, Where-Object, and the like. You can also define your own aliases so you can keep using the muscle memory that works for you.
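Defining your own in $PROFILE might look like this (the shortcut names here are just examples of mine):

code:
# A plain alias can't bake arguments in...
Set-Alias -Name ll -Value Get-ChildItem
# ...so use a function when the shortcut needs parameters included
function gs { git status $args }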

The only caveat I've found is that PowerShell doesn't appear to play nice with older interactive command-line tools (cpdr.exe is my chief annoyance; I'm sure there are more). You may have to go back to a Command Prompt to properly use them. The vast majority of my days are spent in PowerShell from beginning to end and it works great.

kumba
Nov 8, 2003

I posted my food for USPOL Thanksgiving!

enjoy the ride

Lipstick Apathy
First time trying to use powershell and I have what is probably a simple question: I have a folder on a network drive that has a bunch of .pdf files in a complicated hierarchy of subfolders. I want to copy items from these subfolders to a folder with the same hierarchy on my desktop, but I only want to copy items with a certain list of keywords in the file name.

So, for example if I have a folder G:\Docs\2016\9\29\JimBob\ that contains 3 files named Test.pdf, asdf.pdf, and Example.pdf, I want to copy only items that have the word Test or Example in the name.

I'm doing some logic so it knows which folders to copy over (new folders are created for each day new files are generated, so I wrote something to loop through the structure and find the most recent stuff). That part is fine and I can recreate the folder structure locally, but I just can't figure out the syntax to copy the actual .pdfs over - whatever I've tried it seems like I get all or nothing.

After I've recreated the folder structure, I loop through the final subfolders in the hierarchy (that's what $currentEntity is). $ratesToImport is an array containing the words I want to match in the filenames (so for the above example, I would have Test and Example as items in the array). This is what I'm using:

code:
Copy-Item -Path "G:\Docs\Pew Pew Inc\$mostRecentYear\$mostRecentMonth\$mostRecentDay\Archive\$currentEntity\*" -Recurse 
-Destination "C:\Users\$UserName\Desktop\Spanish Docs\"  -Include "*$ratesToImport[0]*", "*$ratesToImport[1]*", "*$ratesToImport[2]*"
And it doesn't seem to actually copy anything over. When I try this:

code:
Copy-Item -Path "G:\Docs\Pew Pew Inc\$mostRecentYear\$mostRecentMonth\$mostRecentDay\Archive\$currentEntity\*" -Recurse 
-Destination "C:\Users\$UserName\Desktop\Spanish Docs\"
                    Where-Object {
                                 $_.Name -like "*$ratesToImport[0]*" -or $_.Name -like "*$ratesToImport[1]*" -or $_.Name -like "*$ratesToImport[2]*"               
                                 }
It seems to copy everything in the folder instead of filtering anything out. What on earth am I doing wrong??

thebigcow
Jan 3, 2001

Bully!

Moundgarden posted:

And it doesn't seem to actually copy anything over. When I try this:

code:
Copy-Item -Path "G:\Docs\Pew Pew Inc\$mostRecentYear\$mostRecentMonth\$mostRecentDay\Archive\$currentEntity\*" -Recurse 
-Destination "C:\Users\$UserName\Desktop\Spanish Docs\"
                    Where-Object {
                                 $_.Name -like "*$ratesToImport[0]*" -or $_.Name -like "*$ratesToImport[1]*" -or $_.Name -like "*$ratesToImport[2]*"               
                                 }
It seems to copy everything in the folder instead of filtering anything out. What on earth am I doing wrong??

Copy-Item is doing what you told it to, piping no objects out, then Where-Object is doing what you told it to with the no objects that were piped into it. You could try something similar with Get-ChildItem at the first step, then use Where-Object to filter just the objects you want, then pipe that into Foreach-Object and in the script block run Copy-Item on $_. But if you need to match directory structures you'll have to figure something else out. Copy-Item has its own filter switch so you might be able to skip the additional cmdlets entirely.

I can't answer your first question, but I do have a suggestion. The last time I looked at doing something fancy with PowerShell to copy directories I ended up just doing it with robocopy.exe. I would rather have done it with PowerShell to learn something, but I had a mountain of things to get done. If you want to use robocopy in PowerShell without the shell trying to do a bunch of fancy stuff with your switches, start the line with an ampersand, which puts it in what I think was called "command mode."
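That Get-ChildItem / Where-Object / ForEach-Object chain, sketched with the example keywords and paths from the post above (all placeholders):

code:
Get-ChildItem "G:\Docs\2016\9\29\JimBob\" -Recurse |
    Where-Object { $_.Name -like "*Test*" -or $_.Name -like "*Example*" } |
    ForEach-Object { Copy-Item $_.FullName -Destination "C:\Users\me\Desktop\Spanish Docs\" }

# And the ampersand form, so PowerShell doesn't mangle robocopy's switches
& robocopy.exe "G:\Docs" "C:\Backup" /E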

kumba
Nov 8, 2003

I posted my food for USPOL Thanksgiving!

enjoy the ride

Lipstick Apathy
I appreciate the nudge. I managed to get it to work like so:

code:
        Get-ChildItem "G:\Docs\Pew Pew Inc\*\*\*\Archive\$currentEntity\*" -Recurse -Include $ratesToImport[$j] | 
        select FullName, LastWriteTime | 
        sort LastWriteTime -Descending | 
        select -First 1 | 
        select -ExpandProperty FullName | 
        Copy-Item -Destination "C:\Users\$UserName\Desktop\Spanish Docs\$currentEntity\"
It performs like absolute garbage, presumably because of the triple wildcard in the filepath and all the sorting I need to do. I couldn't find a way around that, and unfortunately I have no power to modify the folder structure. Any tips on optimizing something like this or am I pretty much SOL?


anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum

Moundgarden posted:

It performs like absolute garbage, presumably because of the triple wildcard in the filepath and all the sorting I need to do. I couldn't find a way around that, and unfortunately I have no power to modify the folder structure. Any tips on optimizing something like this or am I pretty much SOL?
Use robocopy.
