|
adaz posted:You need to set inheritance and propagation flags when you create your new access rule. Right. It's been a while since I did this and wasn't near my domain. Totally forgot.
|
# ? May 3, 2012 08:13 |
|
Before I spend too much time writing a new script, does anyone know of a pre-made Powershell script that I could use to pull a list of user-created scheduled tasks off of a server and tell me what credential and/or username it is set to run as?
|
# ? May 8, 2012 16:09 |
|
Wicaeed posted:Before I spend too much time writing a new script, does anyone know of a pre-made Powershell script that I could use to pull a list of user-created scheduled tasks off of a server and tell me what credential and/or username it is set to run as? I'm pretty sure it's a simple wmi query to win32_scheduledjob... which would be wrong. Win32_scheduledjob for some incredibly stupid reason excludes any jobs created with the UI, good job Microsoft. Check out this article: http://www.windowsitpro.com/article/windows-powershell/how-to-powershell-scheduled-tasks-140978
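The article's approach boils down to shelling out to schtasks.exe and parsing its CSV output. A sketch along those lines — the server name is a placeholder, and the column headers are localized (and repeated on some OS versions), so verify them on your systems:

```powershell
# Sketch: parse schtasks.exe verbose CSV output to list each task and the
# account it runs as. Filters out repeated header rows and built-in
# \Microsoft\ tasks so only user-created ones remain.
$server = 'MYSERVER'
schtasks.exe /query /s $server /v /fo csv |
    ConvertFrom-Csv |
    Where-Object { $_.TaskName -ne 'TaskName' -and
                   $_.TaskName -notmatch '^\\Microsoft\\' } |
    Select-Object TaskName, 'Run As User' |
    Sort-Object TaskName
```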
|
# ? May 8, 2012 17:18 |
|
Here's another diagnostic tool out there for any Exchange Administrators. Sometimes you need to look up Application events across several servers, this will let you determine which server role (including all Exchange Servers), which event logs you wish to parse, the event level, and finally the Event ID. Once all that data is selected through the Powershell menu, it will generate a Text file on your desktop with all the matching results.code:
|
# ? May 14, 2012 19:43 |
|
Korlac posted:Here's another diagnostic tool out there for any Exchange Administrators. Sometimes you need to look up Application events across several servers, this will let you determine which server role (including all Exchange Servers), which event logs you wish to parse, the event level, and finally the Event ID. Once all that data is selected through the Powershell menu, it will generate a Text file on your desktop with all the matching results. That looks really nice, but it doesn't account for fat fingering. You can easily change that by using the default switch and a recursive function: code:
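The default-switch-plus-recursion idea might look something like this — the menu text and options are made up for illustration, not taken from Korlac's script:

```powershell
# A menu function that guards against fat-fingering: anything the switch
# doesn't recognize falls through to default, which calls the function again.
function Read-MenuChoice {
    $choice = Read-Host "Pick an event level: (1) Error (2) Warning (3) Information"
    switch ($choice) {
        1 { 'Error' }
        2 { 'Warning' }
        3 { 'Information' }
        default {
            Write-Host "'$choice' is not a valid option, try again."
            Read-MenuChoice   # recurse until we get valid input
        }
    }
}
```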
|
# ? May 15, 2012 08:36 |
|
Alright, I had fun making it into a real cmdlet. I just wanted to explain autohelp and parameters. I just used your script as an excuse to do it. Feel free to ask about any of it of course. First the code, then I will explain what I did. I kept the functionality of your script the same.code:
code:
Apart from putting in the errorhandling I talked about in the previous post, I added in the following: PARAMETERS To make this script accept parameters, I included a param() block. I will explain using one parameter as an example: code:
code:
code:
You could do more fun stuff with parameters, like setting default values. A fun example for that is: code:
Other fun you could have with parameters would be to make them required, make them positional (so you don't have to specify the parameter name), or make sets, because only certain combinations of parameters do something. There is more fun to be had, of course. HELPFILE And now for the awesome part: get-help works for this script! The only thing you need to get that going is to include that huge comment block at the beginning in that syntax. I always use a base comment block as a start for my scripts, where I just fill in the specifics as I go. I suggest anyone making big scripts do the same! How awesome is it if you can just tell your coworkers to RTFM in a Windows environment! (NB: man is an alias for get-help, for added fun) So, to break it down, this is part of my base script, an adaptation of this example: code:
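As an illustration of the pieces described above — the comment-based help block plus a param() section — here's a minimal skeleton. All names and parameters are placeholders of my own, not the actual script:

```powershell
# Comment-based help: Get-Help (or man) picks this block up automatically.
<#
.SYNOPSIS
    Searches event logs on Exchange servers.
.DESCRIPTION
    Queries the chosen event log on each server for a given event ID.
.PARAMETER EventID
    The event ID to search for.
.EXAMPLE
    .\Search-ExchangeEvents.ps1 -EventID 1001
#>
param(
    # Required parameter with a help message shown when it's omitted
    [Parameter(Mandatory = $true, HelpMessage = 'Event ID to search for')]
    [int]$EventID,

    # Restricted to a fixed set of values, with a default
    [ValidateSet('Application', 'System')]
    [string]$LogName = 'Application'
)
```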
|
# ? May 15, 2012 13:54 |
|
So the decree came down from on high- The CEO wants a quarterly report of every single share on our network, along with their permissions. Now maintaining a list of the shares is easy, but not so much with the permissions. Is there a way to write a Powershell script to have it aggregate all this info? The final product is going to be uploaded to Sharepoint (), but I think for now it just needs to be dumped to Notepad. More of a proof-of-concept thing, I suppose. I've looked at Dumpsec and this tool already, and I'm not sure whether either would work better, or if Powershell would be sufficient.
|
# ? May 15, 2012 20:00 |
|
get-acl will do what you want, but you might have to parse the outcome a bit to make it readable.
|
# ? May 15, 2012 20:07 |
|
Walter_Sobchak does he mean SHARE or "every folder in share"? Does it need to recurse? And do you mean share permissions or are you (please lord) doing the best practice and setting everyone full control on shares and setting actual permissions via NTFS? and Jelmylicious that's some mighty fine function craftin'
|
# ? May 15, 2012 21:30 |
|
I just realized: I did not specify in my helpfile that this script requires the exchange modules. This would be one of the more important things to put in there. Whoops And for Walter_Sobchak: indeed, what is the exact thing you want to do. Do you want to check the share permissions also, in case those aren't good? How many levels deep will the permissions be unique? Do you need to inventory who is a member of the security groups as well? Want to output it in a pretty excel file for management to swoon over? Anyway, a simple option would be: code:
e: Why didn't I look at this earlier. If you convert it to HEX, it gets a lot more readable. 268435456 is GENERIC_ALL. 268435456 in hex is 0x10000000 Jelmylicious fucked around with this message at 11:20 on May 16, 2012 |
# ? May 16, 2012 10:45 |
|
Walter_Sobchak posted:So the decree came down from on high- The CEO wants a quarterly report of every single share on our network, along with their permissions. Now maintaining a list of the shares is easy, but not so much with the permissions. Is there a way to write a Powershell script to have it aggregate all this info? Another big post ahoy! First off, as you can see in the comment notes, there is a lot more you can do with this that isn't implemented yet. I don't filter out admin shares; I do filter out the shares that are unreachable. I chose to output one access right per line, to keep things flat. The conversion table for making shares human-readable is definitely incomplete, but that is easy to append. I know this script might seem big and daunting for a first-timer, but that is because I made it into a full script that includes a helpfile and can be run with parameters. Save it as Get-ShareRights.ps1 and you can run it from the command line, or run it as a scheduled job. Then have something compare previous results and you have quick and dirty rights auditing! But I'm digressing. First the full script, after that, some explanation. code:
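In outline, such a script amounts to enumerating shares with Win32_Share and reading the NTFS ACL of each share path with Get-Acl. This is my own rough sketch of that approach — the property names and output shape are my choices, not the original script's:

```powershell
# Sketch: one access right per output line, unreachable shares skipped.
param([string[]]$ComputerName = @($env:COMPUTERNAME))

foreach ($computer in $ComputerName) {
    Get-WmiObject Win32_Share -ComputerName $computer |
        Where-Object { $_.Path } |            # skip IPC$ and printer shares
        ForEach-Object {
            $share = $_
            $unc = "\\$computer\$($share.Name)"
            if (Test-Path $unc) {             # filter out unreachable shares
                foreach ($ace in (Get-Acl $unc).Access) {
                    New-Object PSObject -Property @{
                        Computer = $computer
                        Share    = $share.Name
                        Identity = $ace.IdentityReference.ToString()
                        Rights   = $ace.FileSystemRights.ToString()
                        Type     = $ace.AccessControlType.ToString()
                    }
                }
            }
        }
}
```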
Let me start with the only function in this script. All it does is take a simple string as input, and either return a different string if it knows the conversion, or return the same string again if it doesn't. The global parameter $dontconvert is first polled to see if any conversion has to be done at all. code:
code:
code:
I also put in a small test, to see if the share is valid, so it wouldn't error out, but give you a small message saying a share doesn't exist: code:
And there you have it. If you need it adjusted, I can do so. I might make this script bigger for auditing purposes in my company. Jelmylicious fucked around with this message at 20:02 on May 18, 2012 |
# ? May 18, 2012 14:15 |
|
This will be an easy one. The following lists all subfolders of all our exchange mailboxes. code:
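For reference, the one-liner version of that idea is roughly this, assuming the Exchange Management Shell is loaded:

```powershell
# List every folder of every mailbox, with item counts and sizes.
Get-Mailbox -ResultSize Unlimited |
    Get-MailboxFolderStatistics |
    Select-Object Identity, FolderPath, ItemsInFolder, FolderSize
```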
|
# ? May 21, 2012 07:36 |
|
This article explains how to format a ByteQuantifiedSize (like FolderSize) in detail. The short version is to replace FolderSize with @{expression={$_.FolderSize.Value.ToMB()}; label="FolderSize (MB)"} in your Select call (you may need to move the sort to before the select).
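Put together, that looks roughly like this. Note the `$_.` in the expression; and depending on the Exchange version and whether you're in a remote session, the property may come back wrapped (needing `.Value`) or not — try both:

```powershell
# The calculated property in context. Sorting happens before the select,
# since the select replaces FolderSize with a formatted number.
Get-MailboxFolderStatistics 'someuser' |
    Sort-Object FolderSize -Descending |
    Select-Object FolderPath,
        @{ label = 'FolderSize (MB)'; expression = { $_.FolderSize.Value.ToMB() } }
```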
|
# ? May 21, 2012 08:17 |
|
Thanks, that sorted me out.
|
# ? May 22, 2012 03:28 |
|
I'm using cwrsync to pull some files from a Unix server, and I've decided to do this little project in Powershell, because why not. To get it to run I have to modify some environment variables, which I can do in a bat script like this: code:
I can't just run it with the full path because it keeps grabbing at binaries in PATH to figure out what to do (insert: I bet if I specified full path for both rsync and ssh it would work, but I'd rather just figure this out). So, what am I missing here? E: Solved my own problem. This is what I was doing: code:
code:
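Presumably the fix was something along these lines — in Powershell, environment variables live on the env: drive rather than being set with SET. The paths here are invented:

```powershell
# Prepend the cwrsync directory to PATH via the env: drive, then call
# rsync by full path so it finds its bundled ssh.
$env:Path = "C:\cwrsync\bin;$env:Path"
$env:HOME = 'C:\Users\me'
& 'C:\cwrsync\bin\rsync.exe' -avz 'user@unixhost:/data/' '/cygdrive/c/backup/'
```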
FISHMANPET fucked around with this message at 23:08 on May 25, 2012 |
# ? May 25, 2012 00:08 |
|
I'm working with Powershell and AD here and I have no idea where to begin. I've got an OU full of machine accounts that I need to move into different ones. I can use a DirectorySearcher to get an array of all the objects in the OU, but when I use code:
edit: holy poo poo I'm dumb. I had set the directorysearcher scope to "subtree", which recurses through all the objects and OUs that it encounters. Using the "onelevel" scope causes it to only search the specified OU. Hope this helps someone in the future! angrytech fucked around with this message at 17:32 on May 30, 2012 |
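For anyone landing here later, the fix described in the edit looks roughly like this — the OU paths are invented:

```powershell
# Restrict the DirectorySearcher to the OU itself with the OneLevel scope,
# then move each computer account to the destination OU.
$ou = [ADSI]'LDAP://OU=Staging,DC=example,DC=com'
$searcher = New-Object System.DirectoryServices.DirectorySearcher($ou)
$searcher.Filter = '(objectClass=computer)'
$searcher.SearchScope = 'OneLevel'   # 'Subtree' would recurse into child OUs

$dest = [ADSI]'LDAP://OU=Workstations,DC=example,DC=com'
$searcher.FindAll() | ForEach-Object {
    $_.GetDirectoryEntry().psbase.MoveTo($dest)
}
```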
# ? May 30, 2012 17:29 |
|
Ugh, this is gonna hurt: 800k+ files in 7 directories; delete all files older than 60 days. I know I saw something about Powershell performance tapering off before, but I can't remember where I saw it. I just hope that my gci -recurse works.
|
# ? Jun 1, 2012 15:02 |
|
Phone posted:Ugh, this is gonna hurt: 800k+ files in 7 directories; delete all files older than 60 days. I know I saw something about Powershell performance tapering off before, but I can't remember where I saw it. I think this is what you are looking for: from: http://blogs.msdn.com/b/powershell/archive/2009/11/04/why-is-get-childitem-so-slow.aspx Since the sweet spot seems to be around 300k files, why not specify the 7 directories and do a simple gci without the recurse on them? I feel dirty for removing some automation, but sometimes doing it yourself really is better. Or, to get all the directories automatically, either: - do a gci -Directory (Powershell 3 option), or - filter with gci -filter *. (to specify a native filesystem filter for files with no extension). The last option assumes that directories have no extension and files do. Or, if you can distinguish by name, you could use a different filter. e: changed image host to imgur, even though msdn.com can probably handle the load from this thread... Jelmylicious fucked around with this message at 16:42 on Jun 1, 2012 |
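Concretely, the hand-rolled version might be (paths invented):

```powershell
# Skip -Recurse entirely: run a plain gci per directory and delete files
# older than 60 days, keeping each call under the performance cliff.
$cutoff = (Get-Date).AddDays(-60)
$folders = 'D:\logs\a', 'D:\logs\b', 'D:\logs\c'   # the directories, listed by hand
foreach ($folder in $folders) {
    Get-ChildItem $folder |
        Where-Object { -not $_.PSIsContainer -and $_.LastWriteTime -lt $cutoff } |
        Remove-Item -WhatIf   # drop -WhatIf once the output looks right
}
```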
# ? Jun 1, 2012 16:04 |
|
Looks like I barely got by, haha. It was 7 folders in a share that has close to 30 folders total, so each folder had its own gci call. It went surprisingly fast and created a 110MB log file (while eating up 2.3GB of RAM). The largest folder had about 315k files in it, so bullet = dodged. Thanks for that link!!
|
# ? Jun 1, 2012 16:34 |
|
This is a theoretical question I guess, but in cases like the above where performance degrades after x number of files, would something DOS based be faster? e.g. http://stackoverflow.com/questions/51054/batch-file-to-delete-files-older-than-n-days I had no idea PowerShell crapped out like that after a certain amount of files, though I'm usually working with at least 300k files when I'm using it.
|
# ? Jun 9, 2012 02:21 |
|
adaz posted:As a note, and I missed this last week, but the beta of powershell 3.0 is out: http://www.microsoft.com/download/en/details.aspx?id=28998 Has the public beta for Powershell 3 been discontinued? This link is dead now. e: n/m, found a new link http://www.microsoft.com/en-us/download/details.aspx?id=29939 stubblyhead fucked around with this message at 04:25 on Jun 9, 2012 |
# ? Jun 9, 2012 04:18 |
|
Scaramouche posted:This is a theoretical question I guess, but in cases like the above where performance degrades after x number of files, would something DOS based be faster? e.g. http://stackoverflow.com/questions/51054/batch-file-to-delete-files-older-than-n-days The answer is to use Powershell w/ the .NET 4 framework, which fixes the lovely performance issue with a ton of files. Currently PS 2.0 uses the 2 framework by default, although it can be changed (warning: causes issues, but if you want to know how, see http://stackoverflow.com/questions/2094694/how-can-i-run-powershell-with-the-net-4-runtime). The issue with file performance is a .NET 2 issue, not an intrinsic Powershell problem. Your other option is to use DOS or something else that uses the Win32 API directly instead of going through .NET. PS 3.0 uses the 4.5 framework I do believe, but I haven't checked that for sure, so don't quote me on that. e: also any PS goons going to be down in Orlando for Tech-Ed next week? I'll be attending some of the Powershell sessions adaz fucked around with this message at 08:25 on Jun 9, 2012 |
# ? Jun 9, 2012 08:22 |
|
Scaramouche posted:This is a theoretical question I guess, but in cases like the above where performance degrades after x number of files, would something DOS based be faster? e.g. http://stackoverflow.com/questions/51054/batch-file-to-delete-files-older-than-n-days I would keep using PowerShell. The advantage of that is that it returns objects, not plain text. So, if you can filter it down with -Filter (a native NTFS filter, like dir uses) or if you can break it up in chunks, I would keep using Get-ChildItem for flexibility's sake. For this example, the batch script would of course work; it's just that you can do so much more with the objects. If you'd ever want to expand on your script, the PowerShell one would be very easy to modify. Granted, the batch script will probably be a lot faster.
|
# ? Jun 9, 2012 08:29 |
|
So I saw this posted on Reddit; it gives a really good overview of Powershell for those trying to learn it, and also a lot of good tips on script creation: https://www.youtube.com/watch?v=-Ya1dQ1Igkc For example, I didn't even know of the show-command cmdlet. Blew my fuckin' MIND, man.
|
# ? Jun 11, 2012 14:50 |
|
Wicaeed posted:So I saw this posted on Reddit, give a really good overview of Powershell for those trying to learn it, and also gives a lot of good tips on script creation: https://www.youtube.com/watch?v=-Ya1dQ1Igkc Just gave the first day of our internal two day powershell course. Most important commands I taught were Get-Command (in conjunction with filters), Get-Help and Get-Member. With those three, you can find out almost all you need to know or at least find specific terms to google. e: Just watched most of that video, it is really good. I think I am going to change the structure of my course a bit, because of this. Jelmylicious fucked around with this message at 23:27 on Jun 11, 2012 |
# ? Jun 11, 2012 16:26 |
|
Saw a good talk at teched on powershell remoting, the lecture guy was awesome and had a TON of tips I had no idea about. He wrote a free book you can check out on the remoting here: http://powershellbooks.com/. I highly recommend it, takes a lot of the pain out of weird remoting options that aren't documented anywhere.
|
# ? Jun 12, 2012 06:37 |
|
The remoting feature looked pretty awesome, but it strikes me that third-party applications that run on the server are going to be negatively impacted by removing the Windows Server gui. Our monitoring system (PRTG) relies on a GUI-based control panel that runs on the server. Is there a way (with ps remoting) to redirect GUI content to another computer?
|
# ? Jun 12, 2012 13:39 |
|
Alright, I found something weird, which is probably just something in the datetime format WMI returns. First, let me lay a little background. To get the install date through WMI, you can ask it like this: code:
Well, that was helpful. I think I see a 2012 at the beginning, but yeah.... How long is that thing anyway? code:
code:
code:
code:
code:
What I would expect to be 1999-12-31 23:50:12.12345678901 yields 1999-12-25 20:29:12. 4 days, 3 hours and 21 minutes difference! Let's timetravel! change the 1999 to 1899, 'cause I'm oldfashioned: monday 25 december 1899 20:29:12. Exact same difference! Anyone know what's up? Or should I ask in the .NET thread, since the datetime class is technically more their domain. I trust the conversion, I am just intrigued by this all.
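For what it's worth, the DMTF format WMI hands back is fixed-width (yyyyMMddHHmmss.ffffff±UUU, with UUU being the UTC offset in minutes), so my guess is that extra fractional digits spill into the offset field and get applied as a timezone shift. The two usual conversion routes:

```powershell
# Converting a DMTF datetime string - either via the helper method
# PowerShell tacks onto WMI objects, or via the .NET converter directly.
$os = Get-WmiObject Win32_OperatingSystem
$os.ConvertToDateTime($os.InstallDate)
[System.Management.ManagementDateTimeConverter]::ToDateTime($os.InstallDate)
```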
|
# ? Jun 13, 2012 16:01 |
|
Wicaeed posted:The remoting feature looked pretty awesome, but it strikes me that third-party applications that run on the server are going to be negatively impacted by removing the Windows Server gui. Our monitoring system (PRTG) relies on a GUI-based control panel that runs on the server. Is there a way (with ps remoting) to redirect GUI content to another computer? No, remoting uses the WSMAN protocol and doesn't really do GUI redirection. For that particular case you'd be better off using something like the Server 2012 minimal install, which is kind of like Core but still installs some GUI for software that needs it. Or, more correctly, it installs the entire GUI except Windows Explorer and Internet Explorer. Also, just spent today ogling the new Powershell 3.0 features. Intellisense in the IDE? Yes. Debug tools that show you loop variables and all the rest as you hover over them? Yes please. A scheduled job cmdlet? Yesssss. Simplified for-each syntax & finally using the 4 framework yessssssssssssss Negatives: Remoting is still balls bad if you need to cross forests that have no trust, as in so complex the lead architect was trying to get it working before their presentation and couldn't. Export-CSV remains dumb But seriously go download the 3.0 beta. adaz fucked around with this message at 00:14 on Jun 14, 2012 |
# ? Jun 14, 2012 00:10 |
|
adaz posted:Negatives: It also inserts spaces instead of tab characters, which bothers me to no end. I really don't want to bring back the Great Indent Wars though, so I'll say no more about it.
|
# ? Jun 14, 2012 07:23 |
|
Hola powershell companeros, another day, another powershell question. I'm making a powershell script to: 1. Log into an ftp (done) 2. Get a directory listing (done) 3. Find the last file in the directory listing, which is randomly named (failing hard) 4. Download said last file (not done but easy) I'm doing all this using native objects because this thing has to be somewhat portable, otherwise I'd be using the great PSFTP client (http://gallery.technet.microsoft.com/scriptcenter/PowerShell-FTP-Client-db6fe0cb). Basically what I've done is returning a stream, and I'm boggled as to how to get the last line of it. Here's what I've got so far, hacked together from a couple guys' FTP examples: code:
1. I got no idea how to do anything with this stream. I've tried my VB tricks with Seek, and split(\n) but I get gobbledygook,null, or errors all the time. I think there's something different with how PowerShell handles them. 2. Even after that I'll still have to isolate the file name itself. The output I'm looking at looks like: code:
|
# ? Jun 16, 2012 03:01 |
|
I think you could simplify this a great deal by using a StreamReader to wrap the response stream. Haven't tested it though. Something like this, picking up partway through your code. It checks against an empty string in case there are trailing newlines. code:
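A guess at what that wrapper looks like, with `$response` standing in for the FtpWebResponse from the earlier snippet:

```powershell
# Wrap the response stream in a StreamReader and keep the last non-empty
# line of the listing, then take the final whitespace-separated token
# (a detailed FTP listing ends each line with the file name).
$reader = New-Object System.IO.StreamReader($response.GetResponseStream())
$lastLine = ''
while (($line = $reader.ReadLine()) -ne $null) {
    if ($line -ne '') { $lastLine = $line }   # skip trailing newlines
}
$reader.Close()
$fileName = ($lastLine -split '\s+')[-1]
```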
|
# ? Jun 16, 2012 03:53 |
|
Thanks mang, sorry for the late reply I took a little 'computer break' this weekend. I'll check that out.
|
# ? Jun 18, 2012 20:39 |
|
So, does anyone know if it's possible to import all of the PowerCLI modules directly into Powershell or Powershell ISE? edit: nm, apparently code:
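My assumption of what the edit meant: PowerCLI registers itself as a snap-in, so any Powershell host (including the ISE) can load it directly:

```powershell
# Load the core PowerCLI snap-in into the current session.
Add-PSSnapin VMware.VimAutomation.Core
```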
Wicaeed fucked around with this message at 18:30 on Jun 21, 2012 |
# ? Jun 21, 2012 17:59 |
|
Any ideas on why the following returns items that are from yesterday instead of just the stuff 5 days old and older? code:
code:
Nebulis01 fucked around with this message at 00:36 on Jun 23, 2012 |
# ? Jun 23, 2012 00:24 |
|
Nebulis01 posted:Any ideas on why the following returns items that are from yesterday instead of just the stuff 5 days old and older? Have you tried with LastWriteTime instead of CreationTime? It may be that the CreationTime does not reflect the correct date for whatever reason, I'd recommend checking the attributes for the incorrectly matched files with code:
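That check could be as simple as listing all three timestamps side by side (path invented):

```powershell
# Compare the timestamps of the files that matched unexpectedly.
Get-ChildItem 'D:\archive' |
    Select-Object Name, CreationTime, LastWriteTime, LastAccessTime
```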
|
# ? Jun 23, 2012 16:23 |
|
LastWriteTime and LastModifiedTime work a lot better. e: i think lastwritetime = modified time
|
# ? Jun 23, 2012 16:28 |
|
How can I validate user input? I have a script that creates AD users, and I need to specify the users' location. There are four options; how can I ensure I don't misspell the city when I'm typing it in? Is there any way I could choose from a list of options rather than having to type?
|
# ? Jun 27, 2012 00:11 |
|
Swink posted:How can I validate user input? I have a script that creates AD users, and I need to specify the users' location. There are four options, how can I ensure I dont misspell the city when I'm typing it in? Is there any way I could choose from a list of options rather than having to type? With parameters, you could use a ValidateSet: code:
code:
code:
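A minimal sketch of that pattern — the city names are examples:

```powershell
# ValidateSet throws before the script body runs if the value isn't in the
# list, so a typo can never reach New-ADUser.
param(
    [Parameter(Mandatory = $true)]
    [ValidateSet('Amsterdam', 'London', 'Paris', 'Berlin')]
    [string]$City
)
"Creating the user in $City"
```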
Jelmylicious fucked around with this message at 06:32 on Jun 27, 2012 |
# ? Jun 27, 2012 06:22 |
|
|
Jelmylicious posted:With parameters, you couldt use a validateset: And for more help and samples with those, check code:
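My guess at the pointer: the built-in about topic covering parameter validation, e.g.:

```powershell
# The about topic documents Mandatory, ValidateSet, positional parameters
# and parameter sets in depth.
Get-Help about_Functions_Advanced_Parameters
```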
|
# ? Jun 27, 2012 06:24 |