|
WMF 5.0 Production Preview is here!
|
# ? Sep 1, 2015 16:58 |
|
|
|
This is a reaaaaally niche request, so I'm not really hoping to strike gold, but you never know. So, we're performing a Domino to Exchange migration and the migration product doesn't migrate permissions on anything (bit of a nightmare, really). As such, we're better off building a script that gets all the permissions on Domino and then another script that puts those permissions onto Exchange automatically. Most of the stuff we'll probably be able to do, but I'm no expert at working with Domino and PowerShell. Do any of you guys have a script that might get a list of the permissions/delegations on Domino?
|
# ? Sep 2, 2015 14:03 |
|
orange sky posted:This is a reaaaaally niche request, so I'm not really hoping to strike gold, but you never know. http://baldwin-ps.blogspot.com/2013/08/lotus-notes-and-powershell-retrieve-acl.html You'll need to use their COM objects for basically all the automation.
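For reference, a minimal sketch of what that blog post's approach boils down to. This assumes the Lotus Notes client (and its COM registration) is installed on the machine running the script; server name, database path, and password below are placeholders:

```powershell
# Sketch only: PowerShell drives the Notes client through the
# Lotus.NotesSession COM object, so the client must be installed locally.
$session = New-Object -ComObject Lotus.NotesSession
$session.Initialize('NotesPasswordHere')                     # placeholder password

$db  = $session.GetDatabase('SERVER/ORG', 'mail\someuser.nsf')  # placeholder server/db
$acl = $db.ACL

# Walk the ACL entries and emit name + access level for each
$entry = $acl.GetFirstEntry()
while ($entry -ne $null) {
    [pscustomobject]@{ Name = $entry.Name; Level = $entry.Level }
    $entry = $acl.GetNextEntry($entry)
}
```

You'd loop that over every mail database to build the export the migration needs.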
|
# ? Sep 2, 2015 15:10 |
Another question. I've again got an array, $computers. It's a list of computers that I copied out from a spreadsheet of possible stale accounts. I need to find out, if possible, who last logged into them. What I have is this: code:
I also tried: code:
I'm not fixed on using an array for this - totally open to ideas to do this better.
|
|
# ? Sep 3, 2015 21:32 |
|
MJP posted:Another question. I've again got an array, $computers. It's a list of computers that I copied out from a spreadsheet of possible stale accounts. Just throwing this out there before I start testing, could you try this syntax: (Get-WMIObject -Computername $computer -Class Win32_computersystem).username Edit: I'm phoneposting so this is hard, but it looks like your structure is sorta weird. I don't think you want to use a pipe after that foreach. Dr. Arbitrary fucked around with this message at 21:53 on Sep 3, 2015 |
# ? Sep 3, 2015 21:50 |
|
$computers | %{$(get-wmiobject -ComputerName $_ -class win32_computersystem).username}
Toshimo fucked around with this message at 22:16 on Sep 3, 2015 |
# ? Sep 3, 2015 21:52 |
|
I hate that % alias.
|
# ? Sep 4, 2015 04:46 |
|
MJP posted:Another question. I've again got an array, $computers. It's a list of computers that I copied out from a spreadsheet of possible stale accounts. To attempt to explain this a little better, you have an array or collection or whatever. A list of computers that you copied from a spreadsheet. You're using % or ForEach-Object and then doing something. It might help if you visualize it like this: code:
The thing you want it to do, what Toshimo posted, is take each object inside your array $computers and "drop" it into the {}s. "$_" is a built-in variable that generally refers to "what am I working on right now?" -- since you are going through a list of computer names, in each pass through the loop "$_" is going to be the first, second, third, etc, computer name. Try this: $computers | % { write-output "Pretend that I got info for $_" } Compare it to: $computers | % { write-output "Pretend that I got info for username" } So, if you rewrite it to be "$computers | % { get-wmiobject -computername $_ win32_computersystem }" it turns into this: code:
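To make the two illustrations above concrete (the computer names are made up), the stripped-out example blocks would have looked roughly like this:

```powershell
$computers = 'PC01','PC02','PC03'   # pretend these came from the spreadsheet

$computers | % { Write-Output "Pretend that I got info for $_" }
# Pretend that I got info for PC01
# Pretend that I got info for PC02
# Pretend that I got info for PC03

# And the real version, once the WMI call is swapped in:
$computers | % { (Get-WmiObject -ComputerName $_ -Class Win32_ComputerSystem).UserName }
```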
12 rats tied together fucked around with this message at 16:56 on Sep 4, 2015 |
# ? Sep 4, 2015 16:43 |
|
I run a mixed environment of 32 and 64 bit machines at work and have to gently caress with the registry periodically. Right now I'm just looking for wow64node or whatever in the registry path to move the project along but there's got to be a better way. What's the best practice for determining the bit count of the machine? Is there a way to dynamically navigate to the 64 or 32 bit version of a registry path? I guess you could do a bit check at the beginning of the script and say if 64 bit, $a="wow64node\" And then in all your paths say, $reg = 'hlkm:software\$a path\to\stuff' ? My syntax is probably off a little (phone posting) but that seems a little obtuse still.
|
# ? Sep 4, 2015 18:09 |
|
Hadlock posted:I run a mixed environment of 32 and 64 bit machines at work and have to gently caress with the registry periodically. Right now I'm just looking for wow64node or whatever in the registry path to move the project along but there's got to be a better way. code:
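The code block above didn't survive the archive, but a sketch of the usual approach (the vendor registry path is a made-up example; `Is64BitOperatingSystem` needs .NET 4, which any PS 3.0+ box has):

```powershell
# On 64-bit Windows, 32-bit software's registry lives under Wow6432Node
# (note: the key is "Wow6432Node", not "wow64node").
if ([Environment]::Is64BitOperatingSystem) {
    $node = 'Wow6432Node\'
} else {
    $node = ''
}

$reg = "HKLM:\SOFTWARE\${node}SomeVendor\SomeApp"   # hypothetical path
Get-ItemProperty $reg
```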
|
# ? Sep 4, 2015 22:58 |
|
What is everyone doing for version/source control? I'm starting to grow in both number of scripts, and people I share them with to a point where I need something in place to manage it a bit better. Looking at git or TFS Online options. Any suggestions on workflows I can move to? Any good git primer you guys can recommend if that's the way to go?
|
# ? Sep 5, 2015 13:04 |
|
Walked posted:What is everyone doing for version/source control? Use VSO + Git. That way, you have a local repo and you can make sure if your PC explodes you're still covered by source control.
|
# ? Sep 5, 2015 14:05 |
|
Ithaqua posted:Use VSO + Git. That way, you have a local repo and you can make sure if your PC explodes you're still covered by source control.
|
# ? Sep 5, 2015 15:53 |
|
Followup on the git question: Spent part of today reading up, and fairly comfortable with the basic concepts. However, my use case is this: I have my local repo, which makes sense, and I'll push to VSO for changes I'm happy with. How would I go about also getting the VSO repo synced up with a DFS namespace/fileshare at work? Basically I'd like to have a share at work that is only (and automatically) updated with changes that are pushed to the server. Any suggestions there? I'd like as much automation as possible so the repo and share are as in-sync as possible without human intervention.
|
# ? Sep 5, 2015 20:48 |
|
Walked posted:Followup on the git question:
|
# ? Sep 5, 2015 20:55 |
|
Vulture Culture posted:Create a repository on a shared drive. Create a scheduled task to automatically git pull your wanted branch(es) from VSO. Ensure it's read-only (i.e. only the user running that pull script can write to it). That's where I'm stumbling; I can't find any documentation on how to cook the authentication for VSO into a scheduled task. Otherwise this is 100% exactly what I'd like to do. Any tips on what to look for on that? I'm normally pretty good with google-fu, but I'm not a developer, I haven't used VSO before, and I don't know git, so I'm struggling. edit: Figured it out; dang. Easy! The documentation on this isn't everywhere, but you can just use a VSO Personal Access Token and code:
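The snippet didn't survive the archive, but the Personal Access Token trick amounts to embedding the token in the remote URL, which a scheduled task can then run unattended. Account name, repo, paths, and the token below are all placeholders:

```powershell
# One-time setup on the file server (names are placeholders):
git clone https://anything:YOUR_PAT_HERE@myaccount.visualstudio.com/DefaultCollection/_git/Scripts D:\Shares\Scripts

# What the scheduled task runs afterwards to keep the share in sync:
git -C D:\Shares\Scripts pull origin master
```

Anything before the `:` in the credential part is ignored when a PAT is used, so `anything:PAT` works.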
Walked fucked around with this message at 21:52 on Sep 5, 2015 |
# ? Sep 5, 2015 21:09 |
|
So I'm running into another permissions issue with my Powershell application deployment script. If I remote (using remote desktop) into the "jump box" where the build is staged, I can run the deployment script just fine, after the changes I made in the discussion around this post. However, I'm trying to remotely invoke the script from the build machine where the build actually takes place. Basically the flow is: do build, copy package and deployment script to jump box, try to use invoke-command to execute script on jump box. The script is being executed, but large portions of it are bombing out, presumably due to the same double-hop issues I had to fix in the script itself. So basically I'm doing this: code:
If I want to maintain the workflow posted above, how would I go about handling this? Can I somehow forward the credential on to the remote execution of the script? That is, would something like this work? code:
Edit: Tried the above, no luck. Same access denied errors from the script, running on the jump box, when it tries to copy files to other machines. RDP-ing into the jump box and running the script causes no issues. bgreman fucked around with this message at 17:46 on Sep 8, 2015 |
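For reference, the failing pattern described above looks something like this (machine names, account, and paths are placeholders). Passing `-Credential` only authenticates the first hop to the jump box; the copy operations *inside* the script are a second hop, which plain WinRM won't delegate:

```powershell
$cred = Get-Credential DOMAIN\deployuser      # placeholder account

# First hop succeeds; the file copies the script itself performs
# against other machines are the second hop that gets access denied.
Invoke-Command -ComputerName JUMPBOX01 -Credential $cred -ScriptBlock {
    & 'C:\Staging\Deploy.ps1'                 # placeholder path
}
```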
# ? Sep 8, 2015 17:21 |
|
bgreman posted:So I'm running into another permissions issue with my Powershell application deployment script. You can forward credentials by enabling, configuring, and using CredSSP, but I find using CredSSP is usually not needed, and other workarounds are cleaner. For example in your case, the entire purpose of remoting is to start a script without giving it any parameters. For that, I would probably create a scheduled task (on-demand, no actual schedule) that runs your script with the specified credentials. Then your invocation becomes: code:
code:
Another thing you can do is to create a new PSSession configuration on the target machine, and give that configuration a RunAs credential. You can finely control who is allowed to connect to this endpoint, but it's a bit more work upfront.
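A sketch of the scheduled-task workaround described above (task name and machine are placeholders; `Start-ScheduledTask` needs Windows 8/Server 2012 or later, while `schtasks` works everywhere):

```powershell
# One-time: register an on-demand task on the jump box that runs the
# deploy script under stored service-account credentials, so no
# credential ever crosses the wire. Kicking off a deploy then becomes:
Invoke-Command -ComputerName JUMPBOX01 -ScriptBlock {
    schtasks /run /tn 'DeployApp'    # or: Start-ScheduledTask -TaskName 'DeployApp'
}
```

Because the task runs with its own stored credentials, the script's copies to other machines are a fresh first hop, sidestepping the double-hop problem entirely.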
|
# ? Sep 8, 2015 19:08 |
|
Briantist posted:You can forward credentials by enabling, configuring, and using CredSSP, but I find using CredSSP is usually not needed, and other workarounds are cleaner. I'll give this a go, thanks for the suggestions! I only need to run the script (for now) on the jump box. If someone unauthorized is able to get on the box and run the script, we've got bigger issues than someone accidentally running a deploy, so I'm not too worried about the security of the scheduled task.
|
# ? Sep 8, 2015 19:28 |
|
I have to dynamically parse an XML file This is what my XML file looks like. Each line looks like <component Registration="None" Name="Filename.dll"/> I can't figure out how to auto-pretty a single line XML so here's a screenshot Basically what I want to do is parse $manifest.manifest.Machines.Machine where Code="BAT" and then write these values $path = $manifest.manifest.Machines.Machine.installpath.Default (should = C:\path\) $file = $manifest.manifest.Machines.Machine.installpath.component.name (should = Filename.dll) So far I have this to parse up to the BAT part but getting the actual data out is stumping me. code:
|
# ? Sep 8, 2015 22:19 |
|
Hadlock posted:I have to dynamically parse an XML file Isn't $component.name supposed to be capitalized [$Component.name], or was that just mis-typed in the example? Powershell is a stickler on case; but everything else checks out and should function properly as long as you have access to that network-shared file through powershell. Video Nasty fucked around with this message at 00:27 on Sep 9, 2015 |
# ? Sep 9, 2015 00:20 |
|
Hadlock posted:I have to dynamically parse an XML file Yeah you were really close, this is the only change you need: code:
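The fixed snippet got eaten by the archive, but based on the thread it presumably amounts to filtering the Machine nodes on their Code attribute before reading the child values. Element and attribute names below follow the post's description; the file path is a placeholder:

```powershell
[xml]$manifest = Get-Content '\\server\share\manifest.xml'   # placeholder path

# Pick the Machine node whose Code attribute is "BAT", then read its children
$machine = $manifest.manifest.Machines.Machine | Where-Object { $_.Code -eq 'BAT' }

$path = $machine.installpath.Default          # should be C:\path\
$file = $machine.installpath.component.Name   # should be Filename.dll
```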
|
# ? Sep 9, 2015 01:28 |
|
Does the PowerShell ISE have a different implementation of Test-Path or something? Top window is a normal Powershell command line where I've populated those values by manually running some lines from my script. Bottom window is from the PowerShell ISE that was paused after running the appropriate lines. The problem I'm having is that Test-Path $BuildZip is returning false if I run it from the command line (or if I just have PowerShell execute my script), but it returns true and the script executes normally if I run this stuff from within the ISE. What gives? (I know I could probably fix this by using Test-Path $BuildZip.FullName, but I want to understand what's happening here if possible). Edit: Right as I clicked post I understood. It returns True within the ISE because the execution directory is the same as the folder containing $BuildZip, but in the standalone command line, it's not. When I changed directory to that dir in the command line and ran Test-Path $BuildZip it worked just fine. Derp.
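A quick illustration of the gotcha (folder and file names are made up): in Windows PowerShell, a FileInfo object returned by Get-ChildItem stringifies to just its file name, so Test-Path resolves it against the current location. Using `.FullName` sidesteps the dependency:

```powershell
$BuildZip = Get-ChildItem C:\Builds -Filter build.zip   # hypothetical artifact

Set-Location C:\Builds
Test-Path $BuildZip            # True  - "build.zip" resolves against C:\Builds

Set-Location C:\
Test-Path $BuildZip            # False - C:\build.zip doesn't exist
Test-Path $BuildZip.FullName   # True  - the full path works from anywhere
```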
|
# ? Sep 9, 2015 23:06 |
|
Is anyone using the http://psappdeploytoolkit.com and is it a worthwhile thing to use in a SME? edit - for context, we install/update software via GPO; however, with more and more mobile staff, fewer people are actually booting up in the office, so I need a method to push out software when the machine is online. Swink fucked around with this message at 08:24 on Sep 15, 2015 |
# ? Sep 15, 2015 08:19 |
|
Swink posted:Is anyone using the http://psappdeploytoolkit.com and is it a worthwhile thing to use in a SME? I can't comment specifically on that, but we use PDQ Deploy and it's been really great. You can automate patches, set heartbeat triggers so installs/updates are pushed as soon as a computer comes online, and it provides a repository for updates so you can restrict update access to only that server. It pairs well with their other product PDQ Inventory but isn't necessary. http://www.adminarsenal.com/pdq-deploy
|
# ? Sep 15, 2015 13:13 |
|
So I'm writing a script to cleanup from an SCCM bug and I'm wondering how to approach the problem. The problem is that SCCM has created multiple folders that have identical contents and I need to find them. The folders have random names, but they're named with a GUID. So I'd probably need a regex to say "is this a GUID of form X" or not. I also don't know how many levels deep the folders go. So it will look something like this code:
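The sketch in the post was stripped, but matching "is this name a GUID" can be done either with a regex or with `[Guid]::TryParse`. The root path here is a placeholder; `-Directory` needs PS 3.0+:

```powershell
$root = 'D:\SCCM\DriverPackages'   # placeholder root folder

# Regex form: 8-4-4-4-12 hex groups, i.e. a bare GUID folder name
$guidFolders = Get-ChildItem $root -Recurse -Directory |
    Where-Object { $_.Name -match '^[0-9a-fA-F]{8}(-[0-9a-fA-F]{4}){3}-[0-9a-fA-F]{12}$' }

$guidFolders | Select-Object FullName
```

`-Recurse` takes care of not knowing how many levels deep the folders go.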
|
# ? Sep 15, 2015 18:15 |
|
Doug posted:I can't comment specifically on that, but we use PDQ Deploy and it's been really great. I'm all about PDQ; the beauty of the powershell toolkit (for me) is the notifications for the users. Allows them to defer etc. Might be a question for the Windows thread.
|
# ? Sep 16, 2015 00:07 |
|
Double post for this: https://twitter.com/Johnmroos/status/643915357384740865 PowershellGet Support in PS 3.0 and 4.0
|
# ? Sep 16, 2015 03:30 |
|
How is that going to work? Some sort of updater package that pushes you from PS 3.0 and 4.0 to 3.1 and 4.1?
|
# ? Sep 16, 2015 04:01 |
|
FISHMANPET posted:So I'm writing a script to cleanup from an SCCM bug and I'm wondering how to approach the problem. So you basically want something like this right? code:
|
# ? Sep 16, 2015 04:18 |
|
Swink posted:Double post for this: https://twitter.com/Johnmroos/status/643915357384740865 adaz posted:So you basically want something like this right?
|
# ? Sep 16, 2015 05:04 |
|
pre:set-strictmode -version 2.0
$folders = Get-ChildItem C:\Windows\System32\catroot -Force -Recurse | Where-Object{($_.PSIsContainer)} | Select Name, FullName
$filesize_list = @()
ForEach($folder in $folders){
    $name = [Guid]::Empty
    if([Guid]::TryParse($folder.Name,[ref]$name)) {
        $filesize_list += Get-Item $folder.FullName | Select -Property FullName,@{Name="Size"; `
            Expression = {(Get-ChildItem $_.FullName | Measure-Object -property length -sum).Sum + 0}}
    }
}
$file_groups = $filesize_list | Select FullName, Size | Group-Object -Property Size | Where-Object {($_.Count -gt 1) -and ($_.Name -gt 0)} `
    | Select Name, @{Name="Group"; Expression = {$_.Group.FullName}}
$file_hashes = @{}
ForEach( $file_group in $file_groups){
    ForEach( $file_to_hash in $file_group.Group){
        $file_hashes.Add( $file_to_hash, (Get-ChildItem $file_to_hash -Force -Recurse | Get-FileHash | Select Hash))
    }
}
$file_hashes.GetEnumerator() | Group-Object -Property Value | Select Count, @{Name="Matches"; Expression = {$_.Group.Name}} `
    | Out-GridView -OutputMode Single | Select -ExpandProperty Group
|
# ? Sep 16, 2015 07:40 |
|
FISHMANPET posted:So I need to find a folder that's full of GUID folders, then iterate over every GUID folder and match identical contents. I'm thinking for that a double for-each loop, so for each GUID folder I compare to every other GUID folder. But I'm just wondering what would be an easy way to compare the folders in Powershell? The files are identical, the same number, the same size, I'd imagine the hashes are the same (but I don't want to hash gigabytes of data for no reason). Any tips? You probably don't need to match or even care about the guid, unless I'm really misunderstanding something here. This is an SCCM bug, yeah? So it probably barfed a bunch of poo poo out into some folder and you need to clean it up? Then everything is going to be a child of c:\SSCM\bullshit\. So, spin through c:\sccm\bullshit recursively, then do something like this: code:
So, if you know that what you're looking for has a hash of "$hash" or a size of "56000 or greater", and then you need to delete these things or move them somewhere, you can just do: code:
But, if you have some kind of "reference object" (or even a small list of them), you can probably do this much more easily with Compare-Object. If you know that SCCM hosed up with some specific set of folders and then just vomited them into subdirectories at random, you can get your "master" and use Compare-Object with a target of "gci /some/path -recurse". No need to reinvent the wheel here. 12 rats tied together fucked around with this message at 17:10 on Sep 16, 2015 |
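A sketch of the size-then-hash grouping described above (the root is a placeholder; `Get-FileHash` needs PowerShell 4.0, otherwise you'd substitute a .NET hasher). Grouping by size first means only candidate duplicates get the expensive hash pass, which addresses the "don't hash gigabytes for no reason" concern:

```powershell
$root = 'D:\SCCM\DriverPackages'   # placeholder root folder

# Group by size first (cheap), hash only files that share a size (expensive)
$dupes = Get-ChildItem $root -Recurse -File |
    Group-Object Length | Where-Object { $_.Count -gt 1 } |
    ForEach-Object { $_.Group } |
    Get-FileHash |
    Group-Object Hash | Where-Object { $_.Count -gt 1 }

# Each group now holds the paths of byte-identical copies
$dupes | ForEach-Object { $_.Group.Path }
```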
# ? Sep 16, 2015 16:56 |
|
The specific bug is that when it imported a driver folder, it made a copy of the folder for every INF file. The worst case was a 500 MB Realtek driver that had 40 INF files, so it made 40 copies of that folder, resulting in a 20 GB driver pack. So yeah, SCCM spewed too many files out, but I don't know which files specifically it spewed; I just know that if something is there twice, I need to mark it as bad (remediation will be done manually because it's kind of involved and scripting in SCCM sucks). But anyway, I've got another thing I have to work on today before I can dig into this, but hopefully I can get to it yet today and figure out what's going on.
|
# ? Sep 16, 2015 17:26 |
|
Are any of you aware of a change in Powershell between Windows 8 and Windows 10 that would lead to a different date format being used for CSV output? Here's an example of the difference I'm talking about: quote:Windows 8 output: This is output from a script I wrote to export Sharepoint List data using CSOM. The issue is that Sharepoint won't recognize the new date format as a valid date, so these records won't import when I run my import script. These scripts worked perfectly on Windows 8 but I haven't been able to track down what's changed in PowerShell on Windows 10 that would cause this oddity. By the way, here's how my export script converts a collection of List data into a CSV file: code:
IAmKale fucked around with this message at 21:20 on Sep 17, 2015 |
# ? Sep 17, 2015 21:18 |
|
If you run this on both machines, what do you get:code:
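The snippet didn't survive the archive, but given the pattern string quoted in the reply, it was presumably something along these lines (guessing at the exact property):

```powershell
# The current culture's full date/time pattern for this machine:
(Get-Culture).DateTimeFormat.FullDateTimePattern
# e.g. "dddd, MMMM d, yyyy h:mm:ss tt"
```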
|
# ? Sep 17, 2015 21:32 |
|
Toshimo posted:If you run this on both machines, what do you get: I don't have access to Windows 8 at the moment but here's what I get back on Windows 10: quote:dddd, MMMM d, yyyy h:mm:ss tt I imagine it'll be different on Windows 8. If this is the case, is there anything I can incorporate into my script to help control date output in my CSV exports?
|
# ? Sep 17, 2015 21:47 |
|
Karthe posted:I don't have access to Windows 8 at the moment but here's what I get back on Windows 10: I'm not entirely sure that's the variable that's causing you problems then, since it doesn't match the output. You can try taking "-NoTypeInformation" off your export and seeing what kind of variable it is kicking out for the date.
|
# ? Sep 17, 2015 22:25 |
|
Karthe posted:I don't have access to Windows 8 at the moment but here's what I get back on Windows 10: use get-date -format and dump it from both machines so you end up with the same format in the CSV file? example: code:
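The example got stripped, but the idea is to stamp dates into one explicit format at export time instead of relying on each machine's culture defaults. Property names here are made up for illustration:

```powershell
# Force a single, culture-independent date format on every machine:
$items | Select-Object Title,
    @{ Name = 'Modified'; Expression = { Get-Date $_.Modified -Format 'yyyy-MM-dd HH:mm:ss' } } |
    Export-Csv 'export.csv' -NoTypeInformation
```

Since both export and import are your scripts, any format SharePoint accepts works as long as both sides agree on it.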
|
# ? Sep 18, 2015 00:08 |
|
|
|
Wow, Exchange has gotten so much easier to use with Powershell. I don't have to do a lot of Exchange administration, but the few times I have, powershell has made things much easier to work with. Seems like Microsoft really meant for Exchange 2013 to be run almost exclusively through it. Here is a little tiny script I wrote that checks messages sent from a specific user during a certain time frame. It's been useful for when someone claims an automated message didn't get sent; code:
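The script itself was lost to the archive, but a message-tracking check like the one described would look roughly like this (sender address and time window are placeholders):

```powershell
# Run from the Exchange Management Shell: find messages a given
# mailbox sent in the last 24 hours (placeholder address).
Get-MessageTrackingLog -Sender 'automation@contoso.com' `
    -Start (Get-Date).AddDays(-1) -End (Get-Date) `
    -EventId SEND -ResultSize Unlimited |
    Select-Object Timestamp, Recipients, MessageSubject
```

Dropping `-EventId SEND` widens it to every tracked event (RECEIVE, DELIVER, FAIL, etc.) if you need to follow a message through the pipeline.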
|
# ? Sep 18, 2015 17:24 |