First time trying to use PowerShell, and I have what is probably a simple question: I have a folder on a network drive with a bunch of .pdf files in a complicated hierarchy of subfolders. I want to copy items from these subfolders to a folder with the same hierarchy on my desktop, but only items with certain keywords in the file name. For example, if a folder G:\Docs\2016\9\29\JimBob\ contains three files named Test.pdf, asdf.pdf, and Example.pdf, I want to copy only the ones with the word Test or Example in the name.

I'm doing some logic so it knows which folders to copy over (new folders are created for each day new files are generated, so I wrote something to loop through the structure and find the most recent stuff). That part is fine and I can recreate the folder structure locally, but I just can't figure out the syntax to copy the actual .pdfs over; whatever I try, it seems like I get all or nothing.

After I've recreated the folder structure, I loop through the final subfolders in the hierarchy (that's what $currentEntity is). $ratesToImport is an array containing the words I want to match in the filenames (so for the above example, Test and Example would be items in the array). This is what I'm using:

code:
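(The original code block didn't survive the archive. For reference, one way to get the "only copy files whose names contain a keyword" behavior is to build one regex alternation from the keyword array and filter before copying. This is a sketch only: $currentEntity and $ratesToImport are the poster's variables; $localTarget, standing in for the already-created matching local folder, is assumed.)

```powershell
# Sketch: run inside the loop over final subfolders.
# $currentEntity  - current source subfolder (from the post)
# $ratesToImport  - array of keywords to match (from the post)
# $localTarget    - assumed: the matching local folder already created

# Build a single pattern like "Test|Example", escaping regex metacharacters
$pattern = ($ratesToImport | ForEach-Object { [regex]::Escape($_) }) -join '|'

Get-ChildItem -Path $currentEntity -Filter *.pdf -File |
    Where-Object { $_.BaseName -match $pattern } |
    Copy-Item -Destination $localTarget
```

With the example above, asdf.pdf fails the -match and is skipped, while Test.pdf and Example.pdf are copied.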
# Sep 30, 2016 16:37
I appreciate the nudge. I managed to get it to work like so:

code:
# Sep 30, 2016 18:39
stupid newbie question: I have a summarization process I run on a monthly basis that combines data from a poo poo load of disparate spreadsheets (67 of them, to be exact). Part of this summarization relies on Excel's INDIRECT() function in the main workbook building strings of file paths based on other criteria, yadda yadda. What this means is that for my summarization to work, I have to physically open every single workbook I want to pull data from. I hate doing this manually and would rather just have something do it for me while I walk away for a minute.

I wrote a script to open all the proper workbooks containing the string I need, and that part works just fine. The problem I'm running into is that having 68 spreadsheets open all at once crashes my system, so I'd like to add a pause about halfway through. How do I do that? I found a way to pause in between, but the way variable increments work in PowerShell is apparently not what I would have thought, and the pause fires off in between EVERY workbook instead of at the increment I wanted. For example:

code:
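(The example code was lost. The simplest fix for a launch storm, if the problem is the workbooks all opening at once rather than all being open, is just to sleep between launches. A sketch, with $workbooks assumed to hold the collected file paths:)

```powershell
# Sketch: throttle the launches so Excel isn't hit with 68 files at once.
# $workbooks is an assumed stand-in for the array of matching file paths.
foreach ($workbook in $workbooks) {
    Invoke-Item $workbook   # open the file in its default app (Excel)
    Start-Sleep -Seconds 5  # give Excel time to settle before the next one
}
```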
# Aug 24, 2018 18:42
see, I told you it was something stupid. thanks y'all

FISHMANPET posted:
Is the problem all 68 workbooks opening at the same time, or having all 68 workbooks open at the same time? If it's just that launching them all at once causes your system to freak out, I'd add Start-Sleep -Seconds 5 after every Invoke-Item. This will cause your script to wait 5 seconds between calls of Invoke-Item.

yeah, i tried using that method first and then figured i'd go the old for loop route to see if it worked any differently, when the problem was just the comparison I was using. definitely going back to the foreach method. thanks again everyone

e: after iterating a few times I could not get the if(($array.IndexOf($item) % 20) -eq 0) part to work for some reason; it just refused to trigger. instead i used a foreach loop, incremented a variable anyway, and used that instead. ended up with:

code:

thanks again goons

kumba fucked around with this message at 21:36 on Aug 24, 2018
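(The final code block was lost, but the counter-based foreach described above might look like this sketch; $workbooks is an assumed stand-in for the array of workbook paths.)

```powershell
# Sketch: pause after every 20th workbook instead of after each one.
# $workbooks is an assumed stand-in for the collected file paths.
$i = 0
foreach ($workbook in $workbooks) {
    Invoke-Item $workbook
    $i++
    # $i % 20 is 0 only on every 20th file, so the pause fires in batches
    if ($i % 20 -eq 0) {
        Read-Host 'Batch of 20 open - press Enter to continue'
    }
}
```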
# Aug 24, 2018 19:32
PBS posted:
What's the purpose of leaving them open then?

If you close the workbook, the formulas revert to a #REF! error because the workbook is no longer open and accessible. Unfortunately, my process involves copying and pasting values over the formulas so the results stick. Hence: open 20 workbooks, pause while I copy/paste values, then continue through the rest of them.

We're getting a permanent, actual solution to this problem in a few months; this is just a stopgap that's part of a monthly reporting process until our infrastructure teams get everything in place.

e: for context, I'm a supervisor of an analytics team in a call center that does agent performance grades, and this is part of a monthly summarization of those graded calls
# Aug 25, 2018 16:55
Performance improvement question: I'm a SQL guy starting to dabble in PowerShell, and I have a script that works but is unbelievably slow. I'm hoping for some pointers, because some of this is obviously not well constructed.

Goal: I have a huge slew of .html files on a network drive, buried in subfolders upon subfolders. Each file is the transcript of a chat conversation between an existing/potential customer and the agent handling the chat. I want to extract the contents of these html files, remove all the bullshit/extraneous html nonsense, and be left with a list of English words I can use to generate a simple word cloud, to get an idea of the most common types of questions, objections, etc. that we're facing.

This is what I have so far:

code:

My first immediate thought is doing the regex before comparing to the $exclusions array, so I can at least get rid of the silly duplicates that differ only in formatting, but I'm not sure where to go from there. Any pointers would be super appreciated.

e: i guess most of the delay is probably from it being several gigs of data on a network drive instead of my local machine, so any performance improvement will probably be limited

kumba fucked around with this message at 18:58 on Nov 16, 2022
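(Since the script itself didn't survive the archive, here is one sketch of a shape that tends to run faster: read each file in one go, strip tags with a single regex pass, normalize before the exclusion check, and use a HashSet for the lookups. $chatRoot and $exclusions are assumed stand-ins for the poster's actual values.)

```powershell
# Sketch: tag-strip first, lowercase, then filter against a HashSet.
# $chatRoot and $exclusions are assumed stand-ins, not the poster's values.
$chatRoot   = '\\server\share\chats'
$exclusions = @('the', 'and', 'for', 'with', 'div', 'span', 'http')

# Case-insensitive set lookups are O(1), vs. a linear scan per word
$excludeSet = [System.Collections.Generic.HashSet[string]]::new(
    [string[]]$exclusions,
    [System.StringComparer]::OrdinalIgnoreCase)

$words = Get-ChildItem -Path $chatRoot -Filter *.html -Recurse -File |
    ForEach-Object {
        # -Raw reads the whole file as one string (faster than line-by-line)
        $text = Get-Content -Path $_.FullName -Raw
        # Drop anything that looks like a tag, then keep alphabetic "words"
        $text = $text -replace '<[^>]+>', ' '
        [regex]::Matches($text, '[A-Za-z]+') |
            ForEach-Object { $_.Value.ToLower() } |
            Where-Object { -not $excludeSet.Contains($_) }
    }

# Frequency table for the word cloud
$words | Group-Object | Sort-Object Count -Descending |
    Select-Object -First 50 Name, Count
```

Note this still won't help much with the network-drive bottleneck; copying the files locally first (e.g. with robocopy) is usually the bigger win there.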
# Nov 16, 2022 17:38
Thank you all for the suggestions!! I posted it more as a learning exercise; this is one of those things I only really needed to use once, and while it took an hour to run, my feeble script got the job done. I appreciate the feedback and have learned a bunch already!!
# Nov 17, 2022 16:07