Sefal
Nov 8, 2011
Fun Shoe
I'm writing a script to keep only the latest 3 files of each kind in one directory.

These files are logs which are generated every week. They all start with <name>.conf<Date>
There are 20 different names
I found this script on the internet that is similar to what I want, but the trouble I'm having is that I'm not sure how to define each unique file in this directory.
Currently the script looks like this.
For now I've just copied this script and changed the file mask for each name.


code:
# Defines how many files you want to keep
$Keep = 3

# Specifies file mask
$FileMask = "<name>.conf*"

# Defines base directory path
$Path = "driveletter:\path\"

# Creates a full path plus file mask value
$FullPath = $Path + $FileMask

# Creates an array of all files of a file type within a given folder, reverse sort.
$allFiles = @(Get-ChildItem $FullPath) | sort-object -Property {$_.CreationTime} -Descending 

# Checks to see if there are even $Keep files of the given type in the directory.
If ($allFiles.count -gt $Keep) {

    # Creates a new array of the files to delete: everything after the first $Keep
    $DeleteFiles = $allFiles[$Keep..($allFiles.Count - 1)]

    # ForEach loop that goes through the DeleteFile array
    ForEach ($DeleteFile in $DeleteFiles) {

        # Creates a full path and delete file value
        $dFile = $Path + $DeleteFile.Name

        # Deletes the specified file
        Remove-Item $dFile 
    }
}
But there has to be a way to filter each unique name in the directory?
I'm not sure how to do that


Zaepho
Oct 31, 2013

Sefal posted:

I'm writing a script to keep only the latest 3 files of each kind in one directory.


But there has to be a way to filter each unique name in the directory?
I'm not sure how to do that

Simplify simplify simplify.

Break the task down into 2 parts. Work against only one name. Find all files with that name $name.conf*
Loop over that and parse out the date part and determine if it should stay or go.
Delete what needs deleting.

Now that you have it working for one file, loop over all of the unique file names and execute the above for each one.
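As a sketch, that two-part approach might look like this (the path is an assumption, and -WhatIf stays on until the preview looks right):

```powershell
$Keep = 3
$Path = "D:\logs"   # assumed location

# Part 1: derive the unique names (everything before ".conf")
$names = Get-ChildItem $Path -Filter "*.conf*" |
    ForEach-Object { $_.Name -replace '\.conf.*$', '' } |
    Sort-Object -Unique

# Part 2: per name, sort newest-first and delete everything past the first $Keep
foreach ($name in $names) {
    Get-ChildItem $Path -Filter "$name.conf*" |
        Sort-Object CreationTime -Descending |
        Select-Object -Skip $Keep |
        Remove-Item -WhatIf   # drop -WhatIf once the output looks right
}
```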

Venusy
Feb 21, 2007
Yeah, you can generate a simple array of names:
code:
$names = @("aaa","bbb","ccc","ddd")
Alternatively, since the date is in the file extension, it's also very easy to get PowerShell to tell you all the file prefixes:
code:
# The BaseName is the filename without the extension. This selects just the BaseName, and shows the unique ones.
$names = Get-ChildItem $path | select -ExpandProperty BaseName -Unique
Either way, it's just a case then of changing the $FileMask to be "$name.conf*", and wrapping the entire script like this:
code:
foreach ($name in $names) {
    <script>
}

Roargasm
Oct 21, 2010

Hate to sound sleazy
But tease me
I don't want it if it's that easy
Put everything in different folders to do this in 5 seconds :ssh:

If $_.LastWriteTime lines up with your file creation dates, do

PHP code:
$listOfShit = 'apple','banana','cantelope'

foreach ($fruit in $listOfShit) {
    $fruitFiles = get-childitem D:\Data -filter "$fruit*"
    if ($fruitFiles.Count -gt 3) {
        # guard so -First never gets a negative count when there's nothing to delete
        $fruitFiles | sort {$_.LastWriteTime} | select -First ($fruitFiles.Count - 3) | remove-item -force
    }
}

Toshimo
Aug 23, 2012

He's outta line...

But he's right!

Sefal posted:

I'm writing a script to keep only the latest 3 files of each kind in one directory.

These files are logs which are generated every week. They all start with <name>.conf<Date>
There are 20 different names

But there has to be a way to filter each unique name in the directory?
I'm not sure how to do that

What format is your date in, and would you rather determine age by that datecode instead of the last-written timestamp?

Toshimo
Aug 23, 2012

He's outta line...

But he's right!
GAZE UPON MY WORKS, YE MIGHTY, AND DESPAIR!

Using the assumption that the log files will be in <NAME>.confMMDDYY format, like so:
code:
PS C:\tmp> ls


    Directory: C:\tmp


Mode                LastWriteTime         Length Name                                                                                        
----                -------------         ------ ----                                                                                        
-a----         03/07/16     16:48              0 abc.conf010115                                                                              
-a----         03/07/16     17:07              0 abc.conf020115                                                                              
-a----         03/07/16     17:11              0 abc.conf030115                                                                              
-a----         03/07/16     17:11              0 abc.conf040415                                                                              
-a----         03/07/16     16:48              0 cde.conf010215                                                                              
-a----         03/07/16     17:08              0 cde.conf010515                                                                              
-a----         03/07/16     17:11              0 cde.conf010715                                                                              
-a----         03/07/16     17:11              0 cde.conf011315                                                                              
And assuming we want to use the date codes in the filename, instead of relying on LastWriteTime:


PHP code:
$Logs_To_Keep = 3
$Log_Path = "C:\tmp"

Get-ChildItem -Path $Log_Path "*.conf*" -Force | Where-Object {-not $_.PSIsContainer} | Select Name | `
    % { [regex]::Replace($_.name, "\.conf.*", "") } | Group-Object | % { Get-ChildItem "$($Log_Path)\\$($_.Name).conf*" | `
    Select Name | % { [regex]::Replace($_.Name, "(.*)\.conf(\d{2})(\d{2})(\d{2})", '$4$2$3.conf$1') } | Sort-Object -Descending | `
    Select-Object -Skip $Logs_To_Keep | % { [regex]::Replace($_, "(\d{2})(\d{2})(\d{2})\.conf(.*)", '$4.conf$2$3$1') | `
    % { Write-Output "Pruning Log File $($Log_Path)\\$($_)"; Remove-Item "$($Log_Path)\\$($_)" -WhatIf } } }
Produces proper output of:
code:
Pruning Log File C:\tmp\\abc.conf010115
What if: Performing the operation "Remove File" on target "C:\tmp\abc.conf010115".
Pruning Log File C:\tmp\\cde.conf010215
What if: Performing the operation "Remove File" on target "C:\tmp\cde.conf010215".

beepsandboops
Jan 28, 2014
Trying to wrap my head around when it's appropriate to use parameters and when it's not. I have a script that onboards a new user and takes input like their name, start date, etc. using Read-Host.

For stuff like that, is it better to just wrap it all into a param block? When does it make sense to use Read-Host instead?

Walked
Apr 14, 2003

beepsandboops posted:

Trying to wrap my head around when it's appropriate to use parameters and when it's not. I have a script that onboards a new user and takes input like their name, start date, etc. using Read-Host.

For stuff like that, is it better to just wrap it all into a param block? When does it make sense to use Read-Host instead?

If you ever anticipate another script calling your new-employee script, then definitely parameterize.

Generally, given the choice, parameters will be a bit more extensible than Read-Host.
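For instance, a parameterized sketch of an onboarding script (all names here are made up) can be called from the console or from another script, while Read-Host only works interactively:

```powershell
# new-user.ps1 (hypothetical)
param(
    [Parameter(Mandatory=$true)]
    [string]$Name,

    [datetime]$StartDate = (Get-Date)   # optional, with a sensible default
)

Write-Output "Onboarding $Name, starting $($StartDate.ToShortDateString())"
```

Calling it as `.\new-user.ps1 -Name 'J. Doe' -StartDate '2016-05-02'` works unattended; if -Name is omitted at a console, PowerShell prompts for it.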

Sefal
Nov 8, 2011
Fun Shoe
Wow, thank you guys. Sorry for the late response. I was puzzled about how to do the foreach, but the examples you've given are excellent.
I'm going to finish it today, I hope.

I feel I have so much to learn in PowerShell, so I try to do as much scripting as I can. Your advice helps!

Jethro
Jun 1, 2000

I was raised on the dairy, Bitch!

beepsandboops posted:

Trying to wrap my head around when it's appropriate to use parameters and when it's not. I have a script that onboards a new user and takes input like their name, start date, etc. using Read-Host.

For stuff like that, is it better to just wrap it all into a param block? When does it make sense to use Read-Host instead?

Not only is param more extensible, you can also just create a prompt with Read-Host in the parameter definition.

Roargasm
Oct 21, 2010

Hate to sound sleazy
But tease me
I don't want it if it's that easy

Sefal posted:

Wow, thank you guys. Sorry for the late response. I was puzzled about how to do the foreach, but the examples you've given are excellent.
I'm going to finish it today, I hope.

I feel I have so much to learn in PowerShell, so I try to do as much scripting as I can. Your advice helps!

The big part of foreach is that you pick whatever name you want for the "this specific object" variable. In the pipeline version, ForEach-Object (aliased as "%" in the other script), the current object is $_ instead:

code:
$textFiles = Get-ChildItem d:\textfiles

# pipeline: the current object is always $_
$textFiles | ForEach-Object { echo $_ }

# foreach statement: the current object is whatever you named it
foreach ($hamprince in $textFiles) {
   echo $hamprince
}
Both show the same output for every file.

To see what you can do or read from a variable, pipe it to get-member (e.g. "$textFiles | get-member" would show you that you can read $textFiles.LastWriteTime, convert to string with $textFiles.ToString(), etc)
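For example (using the temp folder so it works anywhere):

```powershell
$files = Get-ChildItem ([IO.Path]::GetTempPath())

# List the properties available on each item
$files | Get-Member -MemberType Properties

# Two of the members Get-Member reveals:
$files[0].LastWriteTime
$files[0].ToString()
```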

beepsandboops
Jan 28, 2014

Jethro posted:

Not only is param more extensible, you can also just create a prompt with Read-Host in the parameter definition.
Oh, interesting. I was setting them up like:
code:
[Parameter(HelpMessage="Please provide the username")]
[string]$username
But you're saying I could also do

code:
[Parameter]
[string]$username= Read-Host "Please provide the username"

Jethro
Jun 1, 2000

I was raised on the dairy, Bitch!

beepsandboops posted:

Oh, interesting. I was setting them up like:
code:
[Parameter(HelpMessage="Please provide the username")]
[string]$username
But you're saying I could also do

code:
[Parameter]
[string]$username= Read-Host "Please provide the username"

Correct*. In essence, you are setting the default to be the input from Read-Host.

*: the syntax is actually [string]$username= $(Read-Host "Please provide the username")

Jethro fucked around with this message at 20:27 on Mar 8, 2016

Swink
Apr 18, 2006
Left Side <--- Many Whelps
I'm creating a new script with parameters but they are not working.

I've saved the file as new-user.ps1 and when I run ./new-user.ps1 from the console I expect to the able to tab through the available parameters.

Am I missing something painfully obvious here?

code:
[CmdletBinding()]
Param(
  [Parameter(Mandatory=$True)]
   [ValidateSet("Blue", "White")]
   [string]$brand,
	
   [Parameter(Mandatory=$True)]
   [ValidateSet("Sydney","Melbourne")]
   [string]$location,
   
   [Parameter(Mandatory=$True)]
   [string]$username= $(Read-Host "Please provide the username")
)
Edit - it was because I had errors elsewhere that halted the script. All good.

Swink fucked around with this message at 08:19 on Mar 10, 2016

Irritated Goat
Mar 12, 2005

This post is pathetic.

beepsandboops posted:

Oh, interesting. I was setting them up like:
code:
[Parameter(HelpMessage="Please provide the username")]
[string]$username
But you're saying I could also do

code:
[Parameter]
[string]$username= Read-Host "Please provide the username"

I thought Read-Host was a bad way to do it. I was told to use something like

code:
Param([parameter(Mandatory=$true,HelpMessage = "message")]$variable)
Both work but the Mandatory=$true method was "best practice". :confused:

Venusy
Feb 21, 2007
Correct - Read-Host assumes there is a host to read from, i.e. that someone will always be running your script interactively from the console. If your script is called from another script but someone hosed up and forgot the parameter, the Read-Host version will hang indefinitely waiting for input that will never come, while the Mandatory=$true version should error out.

Similarly, before PowerShell v5 introduced the Information stream, there was no way to log information displayed to the console with Write-Host, while now you can do that like you can with Verbose and Warning messages (see help about_redirection).
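The redirection itself looks like this (stream 6 is the v5+ Information stream, which Write-Host now writes to; the script name is a placeholder):

```powershell
# Send Write-Host/information output to a file (PowerShell v5+)
.\some-script.ps1 6> info.log

# Or merge it into the normal output stream
.\some-script.ps1 6>&1

# Or capture every stream at once
.\some-script.ps1 *> everything.log
```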

EDIT: Actually running a test:
code:
powershell -command {function ReadHostTest {
    param($variable = (Read-Host -Prompt "PLEASE ENTER A VARIABLE"))
    $PID | Out-File U:\Temp\PS_PID_READHOST.TXT
}
ReadHostTest}

powershell : Windows PowerShell is in NonInteractive mode. Read and Prompt functionality is not available.
code:
powershell -command {function MandatoryTest {
    param([Parameter(Mandatory=$true)]$variable)
    $PID | Out-File U:\Temp\PS_PID_MANDATORY.TXT
}
MandatoryTest}
powershell : Cannot process command because of one or more missing mandatory parameters: variable.
So it errors out either way, but the mandatory error message makes it a lot easier to see where the fault is.

Venusy fucked around with this message at 14:32 on Mar 11, 2016

GPF
Jul 20, 2000

Kidney Buddies
Oven Wrangler
Going back to Gothmog1065 and his scripting...

When it comes to verifying a computer is there, is on, and other things like that, I really try to avoid Test-Connection. I don't trust a ping attempt to either make it or make it back, even if the other side is definitely there.

My reasoning goes like this, at least in my network. Other networks would be different, so you do things that fit your situation:

1. ICMP may be blocked either coming or going for various reasons, so proving that ICMP is valid from one side to the other will probably involve a long chat with Networking.
2. If that machine is Windows and is in the domain, there's certain info I can rely on being able to find. AD Record, DNS record, DHCP record, RADIUS log record, RADIUS connection policy... That's verification outside the machine. Once I have those, then I can start looking at the machine itself.
3. If I'm connecting to a domain joined Windows machine (in my case, that's pretty much true), I can do one of two things: A TCP port test to 135 or a path test to \\compname\Admin$. If I can get a valid connection to either of those, I'm 99% sure.
4. If those port or path tests don't work, I hit it with TCP 9100, TCP 80, TCP 443, and UDP 161. If it comes back saying yes to 9100 and 161, then I hit it with a specific SNMP request so I can tell what kind of printer it is. I can also ignore this line if I don't care, or focus on it if I'm doing printer stuff.

If I do things right, I can get those tests done before your ping times out, especially if you're using Test-Connection with more than a single attempt - and a single attempt to ping a system is no guarantee it'll respond. ICMP is handled at low priority, and an ICMP packet has no guarantee of being forwarded by a router or even a switch, since it's pure layer 3 IP with no mechanics to verify delivery.
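On boxes with PowerShell v4+ (Windows 8.1/2012 R2 and later), the TCP half of those checks is built in; the hostname below is a placeholder, and UDP 161 needs something else since Test-NetConnection only does TCP:

```powershell
$computer = 'SOMEHOST'   # placeholder

# TCP port test to 135 (RPC endpoint mapper); Quiet returns just $true/$false
$rpcOpen = Test-NetConnection -ComputerName $computer -Port 135 -InformationLevel Quiet

# Path test to the admin share
$adminShare = Test-Path "\\$computer\Admin$"

if ($rpcOpen -or $adminShare) {
    "Live, probably domain-joined Windows box"
}
```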

Dr. Arbitrary
Mar 15, 2006

Bleak Gremlin
I'm writing a script that automates an install.
Part of this involves certificates, and it's complicated. I have the exact commands, but they involve a lot of quotation marks and environment variables.

I can get it working with:
pre:
$command = "$env:PROGRAMROOT\sbin\program.exe"
$arguments =@("pki","new-cert","--n","$env:FQDN",...)

& $command $arguments
But that looks crappy.

Any better way?

Venusy
Feb 21, 2007
Start-Process $command -ArgumentList $arguments? If the problem is with how you populate $arguments, you can do:
code:
#region Arguments
$arguments = @()
$arguments += "pki"
...
#endregion
But that may be more annoying to read than $arguments as it is in the original script (assuming each entry is on a new line). Without the region and endregion comments in the example above, you won't be able to collapse that section in the ISE for greater readability.
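A middle ground: a multi-line array literal keeps one argument per line without the += churn (assuming $command is set as in the original post):

```powershell
$arguments = @(
    'pki'
    'new-cert'
    '--n'
    $env:FQDN
)

& $command $arguments
```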

Fiendish Dr. Wu
Nov 11, 2010

You done fucked up now!
If you're up for some DSC:

code:
Package PackageName
{
  Name        = 'PackageName'
  Path        = 'https://s3.amazonaws.com/blahblahblah/blahblah.msi'
  ProductId   = '{numbers}' # the productID from the msi, I think the new package resource doesn't need this possibly
  Ensure      = 'Present' # Present or Absent
  Arguments   = "ALLUSERS=1 APIKEY=`"" + $APIKey + "`" TAGS=`"monitor:$DDTag,vnet:" + $node.Vnet + ",inst:" + $node.Inst + ",role:" + $node.Role + "`"" 
}
This is basically the sanitized code we use for our stuff. I left the mess of arguments in just so you can see the fun with escaped characters.

Irritated Goat
Mar 12, 2005

This post is pathetic.
Is there a way to plug in a variable to Invoke-Webrequest?

Basically, what I want to do is

code:
Invoke-WebRequest [url]http://www.downforeveryoneorjustme.com/[/url]$site

$Site would be pulled from a write-host to get the address needed. I'm just not sure if using $Site at the end of the address is going to complete like I actually typed http://www.downforeveryoneorjustme/mxtoolbox.com or not.

nielsm
Jun 1, 2009



Yes you can just do string interpolation:
PHP code:
$site = read-host "Site to check"
$result = invoke-webrequest "http://downforeveryoneorjustme.com/$site"

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

nielsm posted:

Yes you can just do string interpolation:
PHP code:
$site = read-host "Site to check"
$result = invoke-webrequest "http://downforeveryoneorjustme.com/$site"

Or if it's a property of an object, $result = invoke-webrequest "http://downforeveryoneorjustme.com/$($someObject.site)"

Factor Mystic
Mar 20, 2006

Baby's First Post-Apocalyptic Fiction
Is it possible to use custom colors, i.e. custom RGB, as a value for Write-Host's -ForegroundColor parameter? The docs say it's an enumeration, but perhaps there's a clever trick to write with my own color?

GPF
Jul 20, 2000

Kidney Buddies
Oven Wrangler
Whee! That was hard work (mainly due to our own inability to read MSDN or apply the correct Google-fu), but we finally did it. I now have an incredibly short PS script that can take the place of Trace32/CMTrace to connect to, read, and output lines from open, active log files.

The problem came from this situation: We use NPS as our RADIUS solution for 802.1x. Multiple NPS servers. We've been wanting a way to follow a request from the switch to the first tier, to second tier, to third tier. We tried our damndest to use Get-Content asjldkf -Tail #, but it just wasn't fast enough. I knew if Trace32 could watch an open file and pop lines up as they're written, it could be done.

aaaaand...all the code is at work and I can't get to it. I'll put it up here as a generic bit of PS tomorrow! I found a script that someone wrote to turn a NPS log (xml format) into a custom PSObject, so we modified it for our needs and now we can get a stream of PSObjects from a live log and do all the where | select we want to do!

Hooray for CommandLine.Net!

GPF
Jul 20, 2000

Kidney Buddies
Oven Wrangler
Alrighty, back to business.

This is some weakass code, but it works. There's a ton more work to be done for errors, partial answers, converting between different outputs (did you know that DHCP Audit log lines and NPS log lines are separated by different characters? Well, now you do!)

code:
$TextFilePath = "where the file is"
[int64]$oldposition = 0

$fm = [System.IO.FileMode]::Open
$fr = [System.IO.FileAccess]::Read
$fsh = [System.IO.FileShare]::ReadWrite
$fs = [System.IO.File]::Open($TextFilePath,$fm,$fr,$fsh)

$oldposition = $fs | select -ExpandProperty Length
$fs.Position = $oldposition
$stream = New-Object System.IO.StreamReader($fs)
Then, in a loop (I'm using a do {} while ($true) loop), do the following:

code:
$output = $stream.ReadToEnd()
$oldposition = $fs | select -ExpandProperty Length
$fs.Position = $oldposition
$output = $output.Trim()
if ($output) {Do things and stuff with that output, just don't forget to split on that file's EOL indicator}
Start-Sleep -milliseconds 250 | out-null
The first part sets up the read connection to the open and active file, and the second section reads to the end from the position we set before the loop. I did it this way since I didn't want to see the whole damned file scroll down my screen until the end. After the read and store, get new length of file which should be the same as what you just read, then set the streamreader to that position. Mess with your output if there is any, then sleep a bit and do it all again.

The more you sleep, the bigger the chunk of output you have to deal with, but it seems to affect both machines less to grab every half second or second or two, depending on the speed of the information being written.

Judge Schnoopy
Nov 2, 2005

dont even TRY it, pal
I signed myself up for some PowerShell scripting that I have no idea how to do, and my cursory first hour of blind attempts is not going well.

I'm tasked with auditing our file server, which is split into 2 tasks. One is to get the file count and overall size, and the other is to report on permissions. I've got the basics on how to do both of these with parent level folders.

The trick is that I need to report on folders 1 level deep in each directory. For example, the structure is Departments > Technology > Target Subfolders. I've resigned myself to manually targeting each department's folder instead of aiming at "Departments" and going 2 levels deep.

If "Get-ChildItem" is the right command to use, which I believe best suits the purpose, how do I design "-Recurse" to stop at 1 sublevel?

For reference this is what I have so far and obviously it does not work. The recurse inside the "forEach" recurse hangs after the first folder, and $size is resolved to "Microsoft.PowerShell.Commands.GenericMeasureInfo". I'm not worried so much about fixing this exact script because I know it will need a total overhaul to work, but I'm really interested in an intuitive way to do that 1 sublevel recurse.

code:
$FS1Log = "c:\software\FS1Log.csv"
$root = "\\fileserver\Departments\Technology"

$rootCount = (Get-ChildItem $root -Recurse).Count
$rootSize = Get-ChildItem $root -recurse | Measure-Object -property length -sum
Add-Content $FS1Log $root, $rootCount, $rootSize

ForEach ($Folder in $(Get-ChildItem $root | Where-Object {$_.PSisContainer -eq $True})){

$size = Get-ChildItem $root\$folder -recurse | Measure-Object -property length -sum

$FileCount = (Get-ChildItem $root\$folder -Recurse).Count
Add-Content $FS1Log $Folder, $FileCount, $size
}

Judge Schnoopy fucked around with this message at 14:43 on Apr 13, 2016

Jethro
Jun 1, 2000

I was raised on the dairy, Bitch!
Measure-Object doesn't return the sum, it returns an object that has Count, Sum, Average, etc. properties, and the ones you specify when you call the cmdlet are the ones that get populated (Count is always populated). This means you can get the file size and file count in one pass.
code:
$FS1Log = "c:\software\FS1Log.csv"
$root = "\\fileserver\Departments\Technology"

$FileStats = (Get-ChildItem $root -Recurse | Measure-Object -property length,PSisContainer  -sum)
$rootCount = ($FileStats | Where-Object {$_.Property -eq "PSisContainer" }).Count
$rootSize = ($FileStats | Where-Object {$_.Property -eq "length" }).Sum
Add-Content $FS1Log $root, $rootCount, $rootSize

ForEach ($Folder in $(Get-ChildItem $root | Where-Object {$_.PSisContainer -eq $True})){

    $FileStats = (Get-ChildItem $folder.FullName -Recurse | Measure-Object -property length,PSisContainer  -sum)

    $size = ($FileStats | Where-Object {$_.Property -eq "length" }).Sum

    $FileCount = ($FileStats | Where-Object {$_.Property -eq "PSisContainer" }).Count

    Add-Content $FS1Log $Folder, $FileCount, $size
}
(If it's OK for the folders to not be included in the file count, you can simplify the above a great deal)

I don't think you can recurse a set number of levels, so if you want to do all the department folders at once, you just have to put another ForEach around the existing one.

code:
$FS1Log = "c:\software\FS1Log.csv"
$base = "\\fileserver\Departments"
ForEach ($root in $(Get-ChildItem $base | Where-Object {$_.PSisContainer -eq $True})){

	$FileStats = (Get-ChildItem $root.FullName -Recurse | Measure-Object -property length,PSisContainer  -sum)
	$rootCount = ($FileStats | Where-Object {$_.Property -eq "PSisContainer" }).Count
	$rootSize = ($FileStats | Where-Object {$_.Property -eq "length" }).Sum
	Add-Content $FS1Log $root.FullName, $rootCount, $rootSize

	ForEach ($Folder in $(Get-ChildItem $root.FullName | Where-Object {$_.PSisContainer -eq $True})){

		$FileStats = (Get-ChildItem $folder.FullName -Recurse | Measure-Object -property length,PSisContainer  -sum)

		$size = ($FileStats | Where-Object {$_.Property -eq "length" }).Sum

		$FileCount = ($FileStats | Where-Object {$_.Property -eq "PSisContainer" }).Count

		Add-Content $FS1Log $Folder, $FileCount, $size
	}
}

Venusy
Feb 21, 2007
On Windows 10, Get-ChildItem has a Depth parameter, but I don't think that's available on Windows 7 or Windows 8 even with PSv5 installed.
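Where -Depth is available, one sublevel is just (the path here is the one from the earlier posts):

```powershell
# Immediate child folders plus one sublevel; -Depth 1 stops the recursion there
Get-ChildItem "\\fileserver\Departments" -Directory -Depth 1
```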

Judge Schnoopy
Nov 2, 2005

dont even TRY it, pal

Jethro posted:

Measure-Object doesn't return the sum, it returns an object that has a Count, Sum, Average, etc. properties, and the ones you specify when you call the cmdlet are the ones that are populated (Count is always populated). This means that you can get the file size and file count in one pass.

(If it's OK for the folders to not be included in the file count, you can simplify the above a great deal)

I don't think you can recurse a set number of levels, so if you want to do all the department folders at once, you just have to put another ForEach around the existing one.

Thank you for clarifying the Measure-Object cmdlet. The idea of getting this info in one go instead of running get-childitem twice is amazing since I'm looking at about a million files total in $root.

I was able to get the rest of the script working besides size, and you helped me clean up the count output as well. However! I'm getting warnings when running this script because empty folders return lengths of "null", not 0. This doesn't stop the script and I added a quick "+0" to help format my output data, so no big deal, but I dislike seeing red blocks of text in my powershell window. Any way around that, maybe an error suppression command, or is it better to just leave it be and ignore?

New script by the way which works, with progress bars / information!

code:
$FS1Log = "c:\software\FS1log.csv"
$root = "\\FileServer\Departments"

$foldertotal = Get-ChildItem $root | Where-Object {$_.PSisContainer -eq $True}
$foldernumber = 1
$subnumber = 1

Write-progress -id 1 -activity "Running Script" -Status "Overall progress" `
   -percentComplete ($foldernumber / $foldertotal.count * 100) 

ForEach ($Folder in $(Get-ChildItem $root | Where-Object {$_.PSisContainer -eq $True})){

	Write-progress -id 2 -parentId 1 -activity "Gathering data" -Status "From $Folder" 
	$foldernumber++

		ForEach ($sub in $(Get-ChildItem $folder | Where-Object {$_.PSisContainer -eq $True})){
		$subCount = (Get-ChildItem $sub.Fullname -Recurse | Measure-Object -property length -sum )
		$fileCount = ($subCount | Where-Object {!$_.PSisContainer}).Count + 0
		$fileSize = ($subCount | Where-Object {$_.Property -eq "length"}).Sum + 0
		$subOutInfo = $sub.Fullname + "," + $fileCount + "," + $fileSize 
		Add-Content -Value $subOutInfo -Path $FS1log
		$subnumber++
		Write-progress -id 3 -parentId 2 -activity "Current Sub Folder" -Status $sub
	
		}
	}

Judge Schnoopy fucked around with this message at 17:39 on Apr 13, 2016

sloshmonger
Mar 21, 2013

Judge Schnoopy posted:

Thank you for clarifying the Measure-Object cmdlet. The idea of getting this info in one go instead of running get-childitem twice is amazing since I'm looking at about a million files total in $root.

I was able to get the rest of the script working besides size, and you helped me clean up the count output as well. However! I'm getting warnings when running this script because empty folders return lengths of "null", not 0. This doesn't stop the script and I added a quick "+0" to help format my output data, so no big deal, but I dislike seeing red blocks of text in my powershell window. Any way around that, maybe an error suppression command, or is it better to just leave it be and ignore?

New script by the way which works, with progress bars / information!


This script isn't going to look for any files at level 0 or 1, since it only looks for files at the $root/$folder/$sub level. To cover those you'd need to add the per-$sub logic to each level above.

As for the errors, you can add -ErrorAction SilentlyContinue to the offending line, but it'd be better to add try/catch with error logging, since you'll want to see why it doesn't work the way you think it should.
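The try/catch version might look like this, reusing the variable names from the script above (note Get-ChildItem needs -ErrorAction Stop so its non-terminating errors actually hit the catch block):

```powershell
try {
    $subCount = Get-ChildItem $sub.FullName -Recurse -ErrorAction Stop |
        Measure-Object -Property length -Sum
}
catch {
    # Log the failure instead of splashing red text on the console
    Add-Content -Path $FS1Log -Value "ERROR,$($sub.FullName),$_"
}
```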

PBS
Sep 21, 2015

Venusy posted:

On Windows 10, Get-ChildItem has a Depth parameter, but I don't think that's available on Windows 7 or Windows 8 even with PSv5 installed.

Get-ChildItem -Depth is available on Windows 7, used it the other day to figure out which folder someone had moved their folder into.

Venusy
Feb 21, 2007
So it is, for some reason it wasn't showing up in tab completion for me.

Judge Schnoopy
Nov 2, 2005

dont even TRY it, pal

sloshmonger posted:

This script is going to not look for any files in level 0 or 1, as it just looks for files in the $root/$folder/$sub level. To do that you'll need to add the stuff under sub to each level above.

As for the errors, you can add -erroraction SilentlyContinue to the offending line, but it'd be better to add Try/Catch with error logging as you'll want to see why it doesn't work like you think it should.

This worked fine, level 0 is the root target and level 1 is just a folder directory with $n folders in it, no files.

I originally had the same meat of the script copied in at the level 1 get-childitem and was going to run the script on $n folders individually. This way I was able to point at the root one time and have everything complete for me.

Swink
Apr 18, 2006
Left Side <--- Many Whelps
Some sweet Don Jones action from the PowerShell Summit thing https://www.youtube.com/watch?v=playlist

There's like 30 videos on all kinds of topics.


The forums aren't letting me post the link to the playlist, so here's a single video. You can find your way from there.

https://www.youtube.com/watch?v=KprrLkjPq_c

Swink fucked around with this message at 01:58 on Apr 16, 2016

MF_James
May 8, 2008
I CANNOT HANDLE BEING CALLED OUT ON MY DUMBASS OPINIONS ABOUT ANTI-VIRUS AND SECURITY. I REALLY LIKE TO THINK THAT I KNOW THINGS HERE

INSTEAD I AM GOING TO WHINE ABOUT IT IN OTHER THREADS SO MY OPINION CAN FEEL VALIDATED IN AN ECHO CHAMBER I LIKE

I want to rename a boatload of files with PowerShell because I'm anal about my music/movie/TV show collection. These files have periods instead of spaces between words, like The.Two.Towers etc.

If I wildcard the names with something like *.*.*.* and tell it to replace periods with spaces, will it replace the period before the file extension as well?

I haven't written anything out yet or I'd paste some super butchered code.

Hadlock
Nov 9, 2004

I would do something like

php:
<?
$safety = 0
$JamesRegex = $regex      # your pattern goes here
$files = Get-ChildItem $filepath
$filecount = 0

function RenameFile($file){
  # rename logic goes here
  return $newfilename
  }

foreach($file in $files){
  if($file -match $JamesRegex){
    Write-Host $file
    Write-Host "New file name is" $(RenameFile $file)
    if($safety -ge 1){
      Rename-Item $file $(RenameFile $file)
      }
    $filecount++
    }
  }

Write-Host "$filecount files were renamed."
?>

Dr. Arbitrary
Mar 15, 2006

Bleak Gremlin
Using -whatif is a great idea as well. You can avoid making a bad mistake that way.

Venusy
Feb 21, 2007
When you use Get-Item on a file (or Get-ChildItem, the PS equivalent of dir), the resulting object(s) will have a BaseName property, which is the filename without the extension. You can use that to make sure that you just grab the right part of the name:
code:
$file = Get-Item C:\Temp\the.two.towers.txt
$newname = $file.BaseName.Replace("."," ") + $file.Extension
Rename-Item $file.FullName $newname -WhatIf
code:
What if: Performing the operation "Rename File" on target "Item: C:\Temp\the.two.towers.txt Destination: C:\Temp\the two towers.txt".
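Scaled up to the whole folder, that's one pipeline (the path is an assumption; keep -WhatIf until the preview looks right). Since only the BaseName is touched, the period before the extension survives:

```powershell
Get-ChildItem C:\Temp -File |
    Where-Object { $_.BaseName -match '\.' } |
    Rename-Item -NewName { $_.BaseName.Replace('.', ' ') + $_.Extension } -WhatIf
```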

Adbot
ADBOT LOVES YOU

MF_James
May 8, 2008
I CANNOT HANDLE BEING CALLED OUT ON MY DUMBASS OPINIONS ABOUT ANTI-VIRUS AND SECURITY. I REALLY LIKE TO THINK THAT I KNOW THINGS HERE

INSTEAD I AM GOING TO WHINE ABOUT IT IN OTHER THREADS SO MY OPINION CAN FEEL VALIDATED IN AN ECHO CHAMBER I LIKE

Thanks guys! Yeah I was going to -whatif whatever I came up with, but sometimes -whatif is a bit... unhelpful and/or misleading depending on the command you're using.
