|
I wish this could be the thread title
|
# ? Feb 26, 2018 21:55 |
|
Less Fat Luke posted:Actually I'm trying to confirm this but I'm almost certain the newest version on Windows does not support a local vault except when importing one to their cloud service. This was apparently a temporary problem (rather, they simply had not implemented it yet, period, for the Windows version, which was a new codebase) that they're working on fixing for 1Password 7. See the last few paragraphs here: https://blog.agilebits.com/2017/08/01/1password-6-7-for-windows-a-feature-buffet/. Rest assured that a poo poo ton of users are keeping a close eye on them re: their promises to continue supporting local vaults for a long time to come. They clearly implement fun new shiny things for the server-based feature set first nowadays, but they haven't quite crossed the line into clear & obvious neglect of the local-only use case. If/when they do, there will probably be a pretty big exodus to...who knows where. (Or a big shitstorm & walking-back, who knows.)
|
# ? Feb 26, 2018 22:47 |
|
EVIL Gibson posted:Unless battlenet changed anything I was able to extract the key and time shift (and something else..) out of the app into a windows application to generate the same codes as the one on my phone. From what I've read, you can actually do this with Twitch's Authy stuff too. I haven't been able to get the secret yet, but you can run it as an 8-digit TOTP and just lop off the first digit to get a working 7-digit code.
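The lop-off-a-digit trick falls straight out of the math: an n-digit TOTP code is just the dynamically truncated HMAC value mod 10^n, so the 7-digit code is always the 8-digit code minus its leading digit. A minimal stdlib sketch of RFC 6238 TOTP (this is the generic algorithm, not Battle.net's or Authy's actual client code; real services vary the digit count, hash, and period):

```python
import hashlib
import hmac
import struct
import time

def totp(secret, digits=6, period=30, t=None):
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter, dynamically truncated."""
    counter = int((time.time() if t is None else t) // period)
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238's published test vector: with this secret, at t=59 the 8-digit code
# is 94287082, and the 7-digit code is the same value without its first digit.
secret = b"12345678901234567890"
print(totp(secret, digits=8, t=59))  # → 94287082
print(totp(secret, digits=7, t=59))  # → 4287082
```

Since 10^7 divides 10^8, `code % 10**7` is exactly the last seven digits of the eight-digit code, which is why chopping the first digit off an 8-digit generator yields a working 7-digit code.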
|
# ? Feb 27, 2018 18:13 |
Why we Don't Deserve the Internet: Memcached Reflected DDoS Attacks is about a factor-100 DDoS amplification attack using a service that's publicly reachable on far too many hosts, despite its default configuration binding to localhost or a Unix socket. Meanwhile, Ars reports on a factor-51k DDoS amplification attack, with a very appropriate image: The image is also a link. BlankSystemDaemon fucked around with this message at 21:41 on Feb 27, 2018 |
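For scale, the amplification factor is just response bytes over request bytes. A toy calculation (the numbers are ballpark figures from the public reports, not measurements):

```python
def amplification_factor(request_bytes, response_bytes):
    """UDP reflection amplification: bytes the victim receives per spoofed byte sent."""
    return response_bytes / request_bytes

# Ballpark: a ~15-byte "stats" query to an exposed memcached instance can
# elicit responses in the hundreds of kilobytes, all sent to the spoofed victim.
print(round(amplification_factor(15, 750_000)))  # → 50000
```

Compare DNS reflection at roughly factor 50 or NTP monlist at a few hundred; an open memcached instance answering over UDP is in a class of its own.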
|
# ? Feb 27, 2018 21:05 |
|
|
# ? Feb 27, 2018 22:01 |
|
What lol
|
# ? Feb 27, 2018 22:03 |
|
bitprophet posted:This was apparently a temporary problem (rather, they simply had not implemented it yet, period, for the Windows version which was a new codebase) that they're working on fixing for 1Password 7. See last few paragraphs here https://blog.agilebits.com/2017/08/01/1password-6-7-for-windows-a-feature-buffet/.
|
# ? Feb 27, 2018 22:09 |
|
Hacking characters in my eight- or nine-digit password?!
|
# ? Feb 27, 2018 22:10 |
|
lol
|
# ? Feb 27, 2018 22:14 |
|
ChubbyThePhat posted:What lol We don’t sanitize our passwords and putting Drop Table; would destroy our site.
|
# ? Feb 27, 2018 23:23 |
|
ratbert90 posted:We don’t sanitize our passwords and putting Drop Table; would destroy our site.
|
# ? Feb 27, 2018 23:55 |
|
anthonypants posted:None of the characters they disallow were semicolons or single quotes. But they did disallow 11 characters. Problem solved
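The joke lands because the boring fix makes the whole character-blocklist question moot: hash the password and bind values as query parameters, and "dangerous" characters are stored as inert data. A toy sketch (sqlite3 for illustration; a real app would use bcrypt or argon2 rather than bare SHA-256):

```python
import hashlib
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, pw_hash TEXT)")

def add_user(name, password):
    # Hash the password (bare SHA-256 here for brevity only) and bind both
    # values as parameters, so a "password" like '; DROP TABLE users; --
    # is just data that gets hashed, never SQL that gets executed.
    pw_hash = hashlib.sha256(password.encode()).hexdigest()
    conn.execute("INSERT INTO users VALUES (?, ?)", (name, pw_hash))

add_user("mallory", "'; DROP TABLE users; --")
print(conn.execute("SELECT COUNT(*) FROM users").fetchone()[0])  # → 1
```

The table survives the "attack" without disallowing a single character, which is rather the point.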
|
# ? Feb 28, 2018 01:25 |
|
Wiggly Wayne DDS posted:do you plan on any auditing process for this auto-signing process at all or is it too much hassle for you? were a malicious executable made from that process how long would it take you to notice

No auditing - that would indeed be too much of a hassle. If someone infiltrated the system and got my build process to sign their malicious code, I doubt I would ever notice (maybe if Windows Defender caught it by coincidence during the signing process). I accept this risk.

apseudonym posted:Signing with your release keys is more than just this came from you, it also implies you're OK with it being installed anywhere and everywhere.

I do not accept this definition. A signature says who the code came from, that is all. What is the logic here? If you draw other implications from this signature, your security model is a bit dubious (though I can accept drawing negative implications from a *lack* of any accepted signature). EssOEss fucked around with this message at 09:26 on Feb 28, 2018 |
# ? Feb 28, 2018 09:21 |
|
It sounds like you're the exact person who should have to use a dongle rather than having free rein over a code signing certificate that chains up to a public root. Use your own self-signed garbage cert for your unvalidated garbage builds.
|
# ? Feb 28, 2018 09:58 |
|
|
# ? Feb 28, 2018 10:22 |
|
Christ. This isn't your kind of thread.
|
# ? Feb 28, 2018 11:30 |
|
the crl in practice on that cert was a treat
|
# ? Feb 28, 2018 11:51 |
|
idgi
|
# ? Feb 28, 2018 13:46 |
|
You have a private key that corresponds to this certificate.
|
# ? Feb 28, 2018 14:00 |
|
I might not be remembering this properly, but I'm pretty sure those are screenshots anthonypants took around the time of the Sony Pictures breach, when private keys for code-signing certs were found amongst the released data and anthonypants used them to sign an executable. The issue here is that the cert was issued by a trusted root CA, so clients would have no problem accepting the signature.
|
# ? Feb 28, 2018 14:02 |
|
EssOEss posted:I do not accept this definition. A signature says who the code came from, that is all. What is the logic here? If you draw other implications from this signature, your security model is a bit dubious (though I can accept drawing negative implications from a *lack* of any accepted signature). E: gently caress it. This has all the harbingers of becoming another "LastPass" conversation in terms of pointlessness. Proteus Jones fucked around with this message at 14:17 on Feb 28, 2018 |
# ? Feb 28, 2018 14:14 |
|
the crl caches for longer than you'd think on windows (and virustotal...)
|
# ? Feb 28, 2018 14:15 |
|
EssOEss posted:No auditing - that would indeed be too much of a hassle. If someone infiltrated the system and got my build process to sign their malicious code, I doubt I would ever notice (maybe if Windows Defender catches it by coincidence during the signing process). I accept this risk. A code signing cert is supposed to say that the signing organization has tested and validated a given release. You might not "accept this definition," but the rest of the industry does. You're turning it into "this was built on a certain build automation server." On that note, does anybody have good resources on securing Jenkins and friends when they're used for deployment automation in web apps? It's always worried me that a lot of these systems get godlike permissions (especially w/r/t AWS/Azure/GCP accounts!) but tend to have lovely security.
|
# ? Feb 28, 2018 16:02 |
|
EssOEss posted:I do not accept this definition. A signature says who the code came from, that is all. What is the logic here? If you draw other implications from this signature, your security model is a bit dubious (though I can accept drawing negative implications from a *lack* of any accepted signature). For internal build/test automation, use an internal CA and roll your own code signing certificate. For distributing to the public at large, re-sign with a public cert as part of your publishing process. This seems like the best compromise position for everyone. (This was basically what we did in a previous life at an ISV.) Think of it as a good way to ensure that test builds never accidentally get shipped to the public at large. In fact, if you don't have the automated test build process use a timestamp server, your test builds are also automagically time-bombed to the expiration of the certificate (definitely do use a timestamp server for your published builds; we fumbled that one on some early releases).
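The timestamp-server point deserves unpacking: a countersigned timestamp lets verifiers judge the certificate as of signing time, while an un-timestamped signature is judged at verification time and so dies with the cert. A toy model of that rule (hypothetical dates, nothing like a real Authenticode verifier):

```python
from datetime import date

def signature_valid(verify_date, cert_expiry, signed_date=None):
    """With a trusted timestamp, judge the cert as of signing time;
    without one, the signature dies with the certificate."""
    effective = signed_date if signed_date is not None else verify_date
    return effective <= cert_expiry

expiry = date(2019, 1, 1)
# Timestamped release build: still verifiable years after the cert expires.
print(signature_valid(date(2024, 1, 1), expiry, signed_date=date(2018, 3, 1)))  # → True
# Un-timestamped test build: automagically "time-bombed" at cert expiry.
print(signature_valid(date(2024, 1, 1), expiry))  # → False
```

So skipping the timestamp server for internal test builds gives them a built-in shelf life for free, while published builds should always be timestamped.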
|
# ? Feb 28, 2018 16:08 |
|
Yesssss
|
# ? Feb 28, 2018 18:45 |
|
Space Gopher posted:On that note, does anybody have good resources on securing Jenkins and friends when they're used for deployment automation in web apps? It's always worried me that a lot of these systems get godlike permissions (especially w/r/t AWS/Azure/GCP accounts!) but tend to have lovely security. Do you mean specifically hardening the Jenkins servers/services themselves, or securing the overall workflow? Your 2nd comment implies you're at least thinking about the latter, in which case you should take a look at secrets management systems like Vault. Having a tool in charge of distributing & rotating secrets, and enforcing that they live on short-lived leases, is a big step up from "meh, I just dropped a long-lived AWS API secret into Jenkins' config, now an attacker gets to be god forever if they break in". Instead, they only get to be god for, say, 15 minutes or an hour, rather than retaining those privileges for weeks or months until they're ready to leverage them. Relatedly, another low-hanging-fruit option (it doesn't require a secrets store, though one often makes the process easier) is to follow the principle of least privilege and give Jenkins only API keys that do exactly and only what it really needs to do. You may think "ugh, my deployment needs instance creation, listing, modification and termination, plus all the same for volumes, plus most of that for AMIs, and ... being explicit is too much work, I'm just gonna give it a full admin role." Resisting that temptation and handing out only what you need means that if Jenkins starts working for the enemy, it doesn't have e.g. the ability to assign admin privileges to other users or destroy backups. An attacker that can nuke instances is one thing; an attacker that can lock you out of the system or create a quietly unnoticed backdoor is much worse.
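To make the least-privilege point concrete, here's the shape of a deliberately narrow policy for a deploy runner, expressed as a Python dict (the action names follow the AWS IAM grammar, but the exact set is illustrative, not a complete deploy policy):

```python
import json

# A narrow policy: the runner can manage EC2 instances, and nothing else.
# No iam:* means no privilege escalation; no backup:* means backups survive.
deploy_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "ec2:RunInstances",
                "ec2:TerminateInstances",
                "ec2:DescribeInstances",
                "ec2:CreateTags",
            ],
            "Resource": "*",
        }
    ],
}

granted = {a for stmt in deploy_policy["Statement"] for a in stmt["Action"]}
# Even a fully compromised runner can't escalate itself or destroy backups.
assert not any(a.startswith(("iam:", "backup:")) for a in granted)
print(json.dumps(deploy_policy, indent=2))
```

The check at the bottom is the kind of guardrail you can run in CI against your real policy documents, so an "I'll just add admin for now" change fails the build instead of landing quietly.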
|
# ? Feb 28, 2018 18:58 |
|
Space Gopher posted:A code signing cert is supposed to say that the signing organization has tested and validated a given release. You might not "accept this definition," but the rest of the industry does. Nope.
|
# ? Feb 28, 2018 19:05 |
|
Slanderer posted:Nope. Citation needed.
|
# ? Feb 28, 2018 19:09 |
|
Welcome 2 discount bob's code signing warehouse. Manage to copy a compiled application or library to our world-writable ingest folder? You better believe we're signing that thing as soon as the bash script fires off again. Approval workflows? Audit trails? Fuhgettaboutit! That would only slow down our deploy stack.
|
# ? Feb 28, 2018 19:21 |
|
A bad code signing workflow makes things like this possible.
|
# ? Feb 28, 2018 19:25 |
|
No, no, you're all missing the point... they've decided, you see. Silly thread, with your facts and industry-standard practices. When will you learn? Cue the strawman-ing to (attempt to) confuse the lines between "The evidence shows..." and "I feel..." statements.
|
# ? Feb 28, 2018 20:03 |
|
The Fool posted:A bad code signing workflow makes things like this possible. I still see people on these forums recommending CCleaner despite the fact that it's a pile of crap.
|
# ? Feb 28, 2018 20:26 |
|
The Fool posted:A bad code signing workflow makes things like this possible. I love that that happened like less than a year after Staples started bundling CCleaner with some of its tech services.
|
# ? Feb 28, 2018 20:29 |
|
Space Gopher posted:A code signing cert is supposed to say that the signing organization has tested and validated a given release. You might not "accept this definition," but the rest of the industry does. Oh, I see what you mean - it is the equivalent of the lock icon on the address bar that tells you the website is trustworthy, right? That makes a lot of sense. Code signing says the exe is known good, just like seeing the lock icon means it is safe to enter my passwords onto that website.
|
# ? Feb 28, 2018 21:17 |
|
This certificate is OK. bitprophet posted:Do you mean specifically hardening the Jenkins servers/services themselves, or securing the overall workflow? Your 2nd comment implies you're at least thinking about the latter, in which case you should take a look at secrets management systems like Vault. Having a tool in charge of distributing & rotating secrets, and enforcing that they are on short-lived leases, is a big step up from "meh I just dropped my, or a similarly long-lived, AWS API secret into Jenkins' config, now an attacker gets to be god forever if they break in". Instead, they only get to be god for, say, 15 minutes, or an hour, instead of retaining those privileges for weeks/months until they're ready to leverage them. Also, use instance roles. No long-lived IAM keys on EC2 instances. That avoids so much hand-wringing.
|
# ? Feb 28, 2018 21:25 |
|
you guys see this? Tavis O tweeted about it a bit ago https://twitter.com/digicert/status/968925980533207040
|
# ? Feb 28, 2018 21:31 |
|
you should at least include their public meltdown over it https://groups.google.com/forum/#!topic/mozilla.dev.security.policy/wxX4Yv0E3Mk it is v standard to revoke a compromised private key, usually someone just submits a signed message as proof but it's great that the ceo was so enthusiastic about proving they had the keys in an easily accessible format
|
# ? Feb 28, 2018 21:39 |
|
EssOEss posted:Oh, I see what you mean - it is the equivalent of the lock icon on the address bar that tells you the website is trustworthy, right? That makes a lot of sense. Code signing says the exe is known good, just like seeing the lock icon means it is safe to enter my passwords onto that website. Yes, there is a basic assumption that the CA issuing the cert did some work to validate that the owner of the cert is in fact the one who owns that domain, and that the site owner then handles their key material appropriately. If the CA issues you a code signing cert because you validated that you are the company you say you are, then don't be surprised when they revoke it if you don't hold up the other half of that arrangement.
|
# ? Feb 28, 2018 22:46 |
|
BangersInMyKnickers posted:Yes, there is a basic assumption that the CA issuing the cert did some work to validate that the owner of the cert is in fact the one who owns that domain and then the site owner needs to handle their key material appropriately. Question about this. If I am doing hosting for extremely non-technical people who still own the domain themselves, what steps would need to be taken to get a CA to issue a cert to me in their name? I don't want to own any of this poo poo so that divesting myself of it is easier, just hand over the keys to the business owners/their new tech people.
|
# ? Feb 28, 2018 23:07 |
|
Getting a signing cert from a public CA for your AD Enterprise root trust is relatively involved. You should take point and ask your clients for info/verification as needed; left on their own, your clients may get it wrong and torch cash on a useless cert. I'm assuming from context clues that you want the root AD CS cert to be publicly recognizable. There's no particular reason not to just make a one-off 4096-bit enterprise root cert if none of this needs to be trusted externally. Potato Salad fucked around with this message at 23:40 on Feb 28, 2018 |
# ? Feb 28, 2018 23:36 |