BlankSystemDaemon
Mar 13, 2009



proprietary software vendors don’t have solutions to the issues exposed by this
since it’s mostly a series of political issues, it’s sad that people are only proposing technical solutions


fresh_cheese
Jul 2, 2014

MY KPI IS HOW MANY VP NUTS I SUCK IN A FISCAL YEAR AND MY LAST THREE OFFICE CHAIRS COMMITTED SUICIDE
mmmmmm yea nah.

the altruistic community has to catch this every single time, and it has been happenstance rather than designed safeguards that found issues so far.

the bad guys only have to win once and it can be real real bad. like collapse of a federal reserve banking system bad.

mystes
May 31, 2006

fresh_cheese posted:

mmmmmm yea nah.

the altruistic community has to catch this every single time, and it has been happenstance rather than designed safeguards that found issues so far.

the bad guys only have to win once and it can be real real bad. like collapse of a federal reserve banking system bad.
I guess I should have joined the evil fandom then

BlankSystemDaemon
Mar 13, 2009



fresh_cheese posted:

mmmmmm yea nah.

the altruistic community has to catch this every single time, and it has been happenstance rather than designed safeguards that found issues so far.

the bad guys only have to win once and it can be real real bad. like collapse of a federal reserve banking system bad.
the opensource movement is about being able to fix software, as opposed to letting proprietary vendors fix them if they think they can profit from it
proprietary vendors are no less subject to the whims of state-sponsored apts

big black turnout
Jan 13, 2009



Fallen Rib
Backdoors at least have to be snuck into open source, compared to a state just going to a company and telling them they'll insert a backdoor

fresh_cheese
Jul 2, 2014

MY KPI IS HOW MANY VP NUTS I SUCK IN A FISCAL YEAR AND MY LAST THREE OFFICE CHAIRS COMMITTED SUICIDE
im not saying proprietary software rulez

im saying the oss community needs to be more skeptical about adding dependencies on rando libraries

mystes
May 31, 2006

fresh_cheese posted:

im not saying proprietary software rulez

im saying the oss community needs to be more skeptical about adding dependencies on rando libraries
What is the other option?

Athas
Aug 6, 2007

fuck that joker

fresh_cheese posted:

im not saying proprietary software rulez

im saying the oss community needs to be more skeptical about adding dependencies on rando libraries

Agreed, but not just (or even mainly) because of security issues.

Also, while I think this is primarily a political and/or social problem, there are also possible technical solutions, like capability systems and poo poo that prevents dependencies from doing completely arbitrary things. (Probably not really practical with our current knowledge of computers.)

fresh_cheese
Jul 2, 2014

MY KPI IS HOW MANY VP NUTS I SUCK IN A FISCAL YEAR AND MY LAST THREE OFFICE CHAIRS COMMITTED SUICIDE
when an oss project adds a dependency on some other oss project, the bullshit and bad practices of the included project rolls up and applies to the including project.

at the moment, it seems like everybody just assumes their dependencies are on the up and up and have no bad practices so hell yea #include fart.h awesome

if you are going to call somebody elses code from your code and their code turns out to be evil im saying thats on you if you did insufficient diligence in vetting that code and that communities practices.

im blaming the sshd and systemd teams here, only slightly less than the actual xz bad actor.

mystes
May 31, 2006

fresh_cheese posted:

when an oss project adds a dependency on some other oss project, the bullshit and bad practices of the included project rolls up and applies to the including project.

at the moment, it seems like everybody just assumes their dependencies are on the up and up and have no bad practices so hell yea #include fart.h awesome

if you are going to call somebody elses code from your code and their code turns out to be evil im saying thats on you if you did insufficient diligence in vetting that code and that communities practices.

im blaming the sshd and systemd teams here, only slightly less than the actual xz bad actor.
So are you going to pay them to reimplement everything themselves or do a security audit of every dependency every month or something?

Athas
Aug 6, 2007

fuck that joker
The sshd developers didn't have anything to do with this. The patch was added by distributions. And the patch itself added a completely unnecessary dependency on libsystemd; the systemd notification protocol just amounts to sending a single datagram to a local socket. You're not expected to include all of libsystemd just to do that. This is simply poor craftsmanship in the sshd patch.

I don't know whether libsystemd has a justified dependency on xz, but I don't really see a problem with an omnibus library needing to decompress poo poo.

Athas fucked around with this message at 15:30 on Mar 31, 2024
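(The notification protocol Athas describes really is just one datagram. A minimal sketch in Python, following the sd_notify(3) convention of reading `$NOTIFY_SOCKET` — an illustration of the protocol, not libsystemd's actual implementation:)

```python
import os
import socket

def sd_notify(state: str = "READY=1") -> bool:
    """Send a service-manager notification without linking libsystemd.

    The protocol: write one datagram containing the state string to the
    AF_UNIX socket named by $NOTIFY_SOCKET. A leading '@' marks an
    abstract-namespace socket. Returns False when no socket is set.
    """
    addr = os.environ.get("NOTIFY_SOCKET")
    if not addr:
        return False
    if addr.startswith("@"):
        addr = "\0" + addr[1:]  # abstract socket namespace
    with socket.socket(socket.AF_UNIX, socket.SOCK_DGRAM) as sock:
        sock.sendto(state.encode(), addr)
    return True
```

That is the whole protocol the sshd patch pulled in all of libsystemd (and transitively liblzma) to speak.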

BlankSystemDaemon
Mar 13, 2009



fresh_cheese posted:

im entitled to have programmers who also practice opensource as a hobby do due diligence that i don’t expect from anyone else, and furthermore

BlankSystemDaemon
Mar 13, 2009



please please please read this post before you post anything else, fresh_cheese
because the above poo poo is loving intolerable

Cybernetic Vermin
Apr 18, 2005

tbf fresh_cheeses take is entirely compatible with the sensible "stop doing that" solution, which is also real good for preventing burnout

BlankSystemDaemon
Mar 13, 2009



Cybernetic Vermin posted:

tbf fresh_cheeses take is entirely compatible with the sensible "stop doing that" solution, which is also real good for preventing burnout
doing the equivalent of a kid kicking up a fuss, grabbing their toys, and going home

Cybernetic Vermin
Apr 18, 2005

BlankSystemDaemon posted:

doing the equivalent of a kid kicking up a fuss, grabbing their toys, and going home

no, im asking others to go home

Athas
Aug 6, 2007

fuck that joker

BlankSystemDaemon posted:

please please please read this post before you post anything else, fresh_cheese
because the above poo poo is loving intolerable

While the point of this post is true & appreciated, there is the twist that the ungrateful mailing list posters cited in that post are likely to be sock puppets of whichever bad actor decided to attack xz-utils, intended to pressure the original developer into adding a new evil maintainer.

mystes
May 31, 2006

Athas posted:

While the point of this post is true & appreciated, there is the twist that the ungrateful mailing list posters cited in that post are likely to be sock puppets of whichever bad actor decided to attack xz-utils, intended to pressure the original developer into adding a new evil maintainer.
this seems like a dumb assumption but otoh it means that it's ok to assume that ungrateful users are bad actors trying to sneak in vulnerabilities so let's go with that

fresh_cheese
Jul 2, 2014

MY KPI IS HOW MANY VP NUTS I SUCK IN A FISCAL YEAR AND MY LAST THREE OFFICE CHAIRS COMMITTED SUICIDE
if im paying money for support and security fixes then the distro better be owning this vetting if nobody else in the community feels like its their problem

Athas
Aug 6, 2007

fuck that joker

fresh_cheese posted:

if im paying money for support and security fixes then the distro better be owning this vetting if nobody else in the community feels like its their problem

if

Captain Foo
May 11, 2004

we vibin'
we slidin'
we breathin'
we dyin'

fresh_cheese posted:

if im paying money for support and security fixes then the distro better be owning this vetting if nobody else in the community feels like its their problem

lol

fresh_cheese
Jul 2, 2014

MY KPI IS HOW MANY VP NUTS I SUCK IN A FISCAL YEAR AND MY LAST THREE OFFICE CHAIRS COMMITTED SUICIDE
a giant glaring security vulnerability appears, again

the community participants collectively put their finger on their nose and say “not it” regarding whos responsibility it was to catch that

all agree that its fine and nothing needs to be done because see we found it!


lol

mystes
May 31, 2006

fresh_cheese posted:

a giant glaring security vulnerability appears, again

the community participants collectively put their finger on their nose and say “not it” regarding whos responsibility it was to catch that

all agree that its fine and nothing needs to be done because see we found it!


lol
Thank you for volunteering

Athas
Aug 6, 2007

fuck that joker

fresh_cheese posted:

a giant glaring security vulnerability appears, again

the community participants collectively put their finger on their nose and say “not it” regarding whos responsibility it was to catch that

all agree that its fine and nothing needs to be done because see we found it!


lol

i agree Someone needs to do Something

Beeftweeter
Jun 28, 2005

OFFICIAL #1 GNOME FAN

Athas posted:

The sshd developers didn't have anything to do with this. The patch was added by distributions. And the patch itself added a completely unnecessary dependency on libsystemd; the systemd notification protocol just amounts to sending a single datagram to a local socket. You're not expected to include all of libsystemd just to do that. This is simply poor craftsmanship in the sshd patch.

I don't know whether libsystemd has a justified dependency on xz, but I don't really see a problem with an omnibus library needing to decompress poo poo.

i'm not sure anything has a justified dependency on liblzma; there's an independent lzma implementation in the kernel already (which is not vulnerable to this exploit, since the exploit executed in the build stage). xz-utils is a bit different since you presumably need some sort of front-end for the archiver, but some cli wrapper around a compression library shouldn't be that complex anyway

sb hermit posted:

why not just stop development on these things unless it’s critically important

like unless you need to forward port yet another piece of code to be compliant with compiler changes or to take advantage of new hardware, just leave it the hell alone

yeah i think this position is where i'm at (with poo poo like liblzma anyway) now. like, was a supposedly slightly faster CRC32 implementation reeeeaaallly worth all this bullshit?

Beeftweeter fucked around with this message at 16:08 on Mar 31, 2024

fresh_cheese
Jul 2, 2014

MY KPI IS HOW MANY VP NUTS I SUCK IN A FISCAL YEAR AND MY LAST THREE OFFICE CHAIRS COMMITTED SUICIDE
mystes and athas raise excellent points

im just an email address on the internet tho, so not to be trusted by anyone whos opinion matters

Ill have a conversation with the security weasel at work to see about what can be done

Beeftweeter
Jun 28, 2005

OFFICIAL #1 GNOME FAN
that's theoretically why oss projects have maintainers and code reviews and poo poo. the process breaks down when it's just one guy suffering extreme burnout though, making the project vulnerable to this kind of "benevolent" maintainer stepping in

idk what a good solution to this might be. identify critical packages and have salaried employees review the code? seems like everyone's definition of critical might vary, and if you're paying someone to review or maintain the code then there's always the "well gently caress everyone else" incentive to just make it a closed source project

but doing that would mean no visibility into the code at all, by anyone, and i think we can all agree that would be worse. so what's your better solution here?

Phobeste
Apr 9, 2006

never, like, count out Touchdown Tom, man
i mean in some respects this is the system working, right? like in any kind of layered action model you can point to the layers that got bypassed before the one that caught it, and the luck of the specific layer that caught it, but the entire point of the many-eyes philosophy is an inversion of the classic "attacker has to be right once, defender has to be right every time": the attacker has to get past all the defenders, and each individual defender only has to get lucky once. this code was not shipped in the lts distros. the distros that had incorporated it in testing-phase and unstable releases have to revert, which they are capable of doing. and the reason that it was caught, and that this is possible and fast, is that somebody could be doing perf testing, could notice something wrong, and could go and look at the source for why.

it is not perfect and it should be improved and step 0 of that improvement should be paying, but i think people leaping to an indictment of the oss model as a whole are kind of missing the forest for the trees

mystes
May 31, 2006

Phobeste posted:

i mean in some respects this is the system working, right? like in any kind of layered action model you can point to the layers that got bypassed before the one that caught it, and the luck of the specific layer that caught it, but the entire point of the many-eyes philosophy is an inversion of the classic "attacker has to be right once, defender has to be right every time": the attacker has to get past all the defenders, and each individual defender only has to get lucky once. this code was not shipped in the lts distros. the distros that had incorporated it in testing-phase and unstable releases have to revert, which they are capable of doing. and the reason that it was caught, and that this is possible and fast, is that somebody could be doing perf testing, could notice something wrong, and could go and look at the source for why.

it is not perfect and it should be improved and step 0 of that improvement should be paying, but i think people leaping to an indictment of the oss model as a whole are kind of missing the forest for the trees
I mean it just barely got caught by chance because someone was profiling sshd or something and it had gotten pretty far so it is a little concerning but I don't know what the solution is

Sapozhnik
Jan 2, 2005

Nap Ghost

mystes posted:

I mean it just barely got caught by chance because someone was profiling sshd or something and it had gotten pretty far so it is a little concerning but I don't know what the solution is

in this particular case, reduce the number of nooks in which some shady poo poo can hide. there was something lurking in the tarball that wasn't in the git repository.

this sort of story is the way in which engineering processes improve in practice, by writing the rules in blood.

Well Played Mauer
Jun 1, 2003

We'll always have Cabo
I mean a dude spent two years of his state sponsored life to infiltrate a zip project only to have his payload get caught by a guy benchmarking software on a bleeding edge distribution that only crazies would use in an environment where they want safe data.

How fast did the high fives stop, do you think?

I’m in no way even reasonably informed how all this poo poo works but the idea that some smugdog using Arch btw interrupted what could have been a heist of the decade is very funny to me. Very 10,000 monkeys with a typewriter energy.

shackleford
Sep 4, 2006

unironically burn autotools to the ground and replace it with meson

leaving aside the "build everything from upstream tarballs" vs. "build everything from git" vs. "build everything from tarballs that are constructed only using the contents of the git repository" debate which is gonna play out in linux distros for the next few years

there is no good reason why the build system needs to be several megabytes of unreadable, un-version-controlled shell scripts so that linux developers can cosplay that they're writing portable code. "portable" now means your platform looks enough like linux that you can run linux code

Sapozhnik
Jan 2, 2005

Nap Ghost
a dude slipped a backdoor into a crack between what's in git and what's in a tarball. the solution to prevent this particular exploit from happening again is to have a stronger chain of custody from code reviews to source control to distribution build scripts, i.e. by tying build scripts to particular source control commit ids instead of a "dude trust me bro" tarball that is just assumed to reflect a particular git tag. this isn't new and we knew that we should be doing that to begin with, but nonetheless a bunch of distros (almost) got caught sleeping. this rule is part of the movement towards reproducible builds in general, but this rule is now written and underlined in blood.

you learn from specific catastrophic or near-catastrophic incidents and devise specific rules to prevent them from happening again, that's how this sort of thing works in practice.
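(The "chain of custody" check Sapozhnik describes can be sketched mechanically: diff the release tarball against an archive generated straight from the tagged commit, e.g. `git archive` output. A hypothetical helper — the file names in the usage are made up, and a real check would also compare metadata, not just file contents:)

```python
import hashlib
import tarfile

def tarball_diff(release_path: str, git_archive_path: str) -> dict:
    """Compare a release tarball against an archive generated from git.

    Returns the names present only in the release tarball, plus the names
    whose contents differ -- exactly the gap the xz backdoor hid in.
    """
    def digest_members(path):
        out = {}
        with tarfile.open(path) as tf:
            for m in tf.getmembers():
                if m.isfile():
                    data = tf.extractfile(m).read()
                    # strip the leading "project-1.0/" prefix so the two
                    # archives compare by relative path
                    name = m.name.split("/", 1)[-1]
                    out[name] = hashlib.sha256(data).hexdigest()
        return out

    rel = digest_members(release_path)
    git = digest_members(git_archive_path)
    return {
        "only_in_release": sorted(set(rel) - set(git)),
        "content_differs": sorted(n for n in rel if n in git and rel[n] != git[n]),
    }
```

Anything in `only_in_release` is the "dude trust me bro" part of the tarball: in the xz case, that's where the malicious build-to-host.m4 lived.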

Sapozhnik
Jan 2, 2005

Nap Ghost
if anything it is a far more specifically teachable moment than something like heartbleed or the log4j catastrofuck. the latter was particularly awful because that library literally had rce as its documented albeit unintended behavior and somehow nobody noticed.

mystes
May 31, 2006

I wonder if there should also be some move to try to somehow simplify build processes in general so it's harder to hide stuff there? I don't know how that would look though

Athas
Aug 6, 2007

fuck that joker
Would the xz attack have been found sooner if it had been present in the commit log? I think it is pretty easy to hide obfuscated things in build scripts.

Sapozhnik
Jan 2, 2005

Nap Ghost
it would have been much harder because then a diff to introduce a backdoor would need to pass code review and a plausible good-faith justification would have to be presented. of course, this assumes that people are actually looking at and participating in a key dependency's development and it isn't just a one-man show that every major cloud company silently mooches off.

the main factor saving us from that aspect of the attack is the fact that the bar for inclusion into the base packages for debian is way higher than it is for publishing something to the npm or pypi trash heap. a change of upstream maintainer for a key dependency like that should be cause for scrutiny. we already follow this rule elsewhere: in the browser add-on world a change in maintainership for a popular add-on is immediately treated with great suspicion because of a repeated history of attacks like this.

so we have one human-factors takeaway and one technological takeaway from this incident.

Beeftweeter
Jun 28, 2005

OFFICIAL #1 GNOME FAN

Athas posted:

Would the xz attack have been found sooner if it had been present in the commit log? I think it is pretty easy to hide obfuscated things in build scripts.

you mean, like, a commit with the comment "adds backdoor exploit to openssh"? yeah probably

mystes
May 31, 2006

Beeftweeter posted:

you mean, like, a commit with the comment "adds backdoor exploit to openssh"? yeah probably
I don't think they necessarily mean the commit message or a single commit that just added the backdoor, just a change in a commit that someone could have looked at and wondered why it was added

so yeah, not using tarballs that can have stuff not tracked in git is definitely a start


shackleford
Sep 4, 2006

most of the components of the attack were present in the git repository, for instance the compiled shell code was disguised as compression test data

to detect that specific thing, you probably have to build some sort of data processing pipeline that emits the net new binary content added to the git repos of some however defined core set of open source repositories and then hand off the output to reviewers. essentially an anti-steganography problem

then the obvious countermeasure would be to hex encode or base64 encode the payload so that it bypasses such a detection pipeline, so now you gotta recognize unusual text content as well
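(A crude first pass at that anti-steganography pipeline is an entropy scan: compressed or encrypted blobs sit near 8 bits per byte, while source text sits around 4-5. A sketch — the 7.5 threshold is made up for illustration, and as noted above, hex/base64-armored payloads would dodge it:)

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits per byte: ~8.0 for compressed/encrypted data, ~4-5 for text."""
    if not data:
        return 0.0
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

def looks_like_hidden_payload(data: bytes, threshold: float = 7.5) -> bool:
    """Flag high-entropy blobs added to a repo. A real pipeline would
    first strip obvious armoring (base64, hex) and whitelist genuine
    test fixtures, since xz's payload was disguised as exactly that."""
    return shannon_entropy(data) > threshold
```

The xz trick of labeling the payload "compression test data" shows the hard part isn't detection, it's deciding which high-entropy files are legitimately supposed to be there.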
