spankmeister
Jun 15, 2008






It should be fairly easy to have the PiHole use DoH as its recursive resolver, no?


Klyith
Aug 3, 2007

GBS Pledge Week

spankmeister posted:

It should be fairly easy to have the PiHole use DoH as its recursive resolver, no?

Depends on your standard for easy? A pihole is often pointed to as a my-first-linux project for non-technical people. The normal setup process is pretty 1-2-3, with a guided script walking you through everything.

Setting up DoH, there are instructions but it's not part of the automatic setup, and you have to know enough to make some decisions. So by the standards of pihole I don't think it's "easy".
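For reference, the commonly documented route (per Pi-hole's own guides) is to run cloudflared as a local DoH proxy and point Pi-hole at it. A rough sketch -- the package name and architecture here are assumptions, adjust for your Pi:

```shell
# Grab cloudflared (this assumes a 64-bit ARM build; pick the right one for your board)
wget https://github.com/cloudflare/cloudflared/releases/latest/download/cloudflared-linux-arm64.deb
sudo dpkg -i cloudflared-linux-arm64.deb

# Run it as a DNS-over-HTTPS proxy on a spare local port
cloudflared proxy-dns --port 5053 --upstream https://1.1.1.1/dns-query &

# Then in the Pi-hole admin UI (Settings > DNS), set the only upstream
# custom DNS server to:  127.0.0.1#5053
```

For an unattended setup you'd want cloudflared running as a systemd service rather than backgrounded like this, which is exactly the "you have to know enough to make some decisions" part.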

BrianRx
Jul 21, 2007
T-Mobile got hit again and, as a customer, it got me thinking about something I've been kicking around in my head for a while. I have an IT background but am currently working in a position that deals more with auditing and compliance for government systems. I'm new in the position and am cramming as much info as possible at the moment, but my current level of experience and understanding of private enterprise systems leaves me unable to take a holistic view of how security measures are implemented in a practical sense and how decisions are made by leadership.

So, as many of you work for companies that handle customer PII, what do you think would reduce the frequency of breaches, particularly among repeat offenders like T-Mobile? I imagine you all deal with the same problem of infrastructure being seen as a cost center leading to a lack of critical/effective systems like the rest of us.

I've always thought that holding parties who suffer breaches of the personal data they hold civilly liable for any damages suffered by customers would be the likely mechanism for forcing them to take data security seriously, but it occurs to me that it would be impossible to prove that any one breach is the source of leaked personal information, and thus also impossible to hold any one organization responsible.

Similarly, I had considered fines to be an option so that it's cheaper to invest in the necessary infrastructure than to pay the fine, but it would be difficult to employ a punishment scheme that would cause large companies to take notice that would also not destroy smaller companies who suffer one-off attacks. Especially so if insurance becomes involved and shields large companies from financial hits while increasing the overhead of smaller companies.

I'm at a loss for further options, save for strict compliance rules and effective auditing, which is extremely unlikely (see the position of the FAA w/r/t aircraft maintenance). What would more knowledgeable people suggest as solutions?

fake edit: the Mike Lindell poo poo is as hilarious as it is scary for people in the elections community. A friend of mine is featured on an infographic of people who are responsible for "subverting democracy" that is circulating online and was personally discussed for an hour during one presentation at the conference.

Cup Runneth Over
Aug 8, 2009

She said life's
Too short to worry
Life's too long to wait
It's too short
Not to love everybody
Life's too long to hate


BrianRx posted:

Similarly, I had considered fines to be an option so that it's cheaper to invest in the necessary infrastructure than to pay the fine, but it would be difficult to employ a punishment scheme that would cause large companies to take notice that would also not destroy smaller companies who suffer one-off attacks. Especially so if insurance becomes involved and shields large companies from financial hits while increasing the overhead of smaller companies.

You could scale the size of the fine to how many customers/how much data is involved in the leak. That might also encourage companies to collect less data in the advertising dollar driven hellworld that is web 3.0.

Thanks Ants
May 21, 2004

#essereFerrari


GDPR fine upper limits are the higher of 4% of turnover or €20m, but I agree with the insurance aspect making people think they will just pay for a policy rather than address the issues. Maybe they think they will need insurance regardless so why bother with the improvements part?
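For concreteness, that "higher of" cap works out like this (a sketch of the headline top-tier number only, ignoring the lower fine tier and all the Art. 83 weighing factors):

```python
def gdpr_fine_cap(annual_turnover_eur: float) -> float:
    """Upper limit for the top GDPR fine tier: whichever is higher of
    4% of worldwide annual turnover or EUR 20 million."""
    return max(0.04 * annual_turnover_eur, 20_000_000.0)

# A EUR 10bn-turnover company can be fined up to EUR 400m,
# while a small firm still faces the flat EUR 20m ceiling --
# which is the "destroys smaller companies" asymmetry in a nutshell.
print(gdpr_fine_cap(10_000_000_000))  # 400000000.0
print(gdpr_fine_cap(1_000_000))       # 20000000.0
```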

BrianRx
Jul 21, 2007

Cup Runneth Over posted:

You could scale the size of the fine to how many customers/how much data is involved in the leak. That might also encourage companies to collect less data in the advertising dollar driven hellworld that is web 3.0.

Many years ago, I had assumed that the cost of storage would be a check on the amount of data collected/retained, but since the data collected amounts to KBs per individual and storage is dirt cheap, I was clearly mistaken. I recently had a surprising amount of personal data exposed by a breach of an app that I used one time to park my car in front of a restaurant that required it nearly ten years ago. Surely there's no business reason to have that data still, aside from fact that it's cheaper to store it than to go through the effort of deleting it. I can't imagine it has any additional market value at this point.

Thanks Ants posted:

GDPR fine upper limits are the higher of 4% of turnover or €20m, but I agree with the insurance aspect making people think they will just pay for a policy rather than address the issues. Maybe they think they will need insurance regardless so why bother with the improvements part?

In the US at least, I have a hard time imagining a similar fine scheme. Cynically, looking at the example of the SEC, I would expect that the fines would either be too low or too inconsistently enforced to be seen as anything other than an occasional additional business cost (like, as you say, insurance premiums).

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

BrianRx posted:

I'm at a loss for further options, save for strict compliance rules and effective auditing, which is extremely unlikely (see the position of the FAA w/r/t aircraft maintenance). What would more knowledgeable people suggest as solutions?

The answer is to switch it from "civil" to "criminal."

Now, I will fully admit this is a bad idea, but it's the least bad idea that also actually fixes the problem. As has been mentioned above, fines are basically useless so long as insurance is an option, and these leaks are common enough that they no longer provide a meaningful PR hit for most companies. So there's really almost no incentive for most companies to give a gently caress these days.

Until the threat of jail time lights a fire under a CISO/CIO's rear end, nothing is going to change.

some kinda jackal
Feb 25, 2003

 
 
I’m having a really hard time posting an opinion on the state of auditing without breaking a handful of separate NDAs. I’ve literally deleted this post like five times before hitting submit. One day I’ll gain some clarity :cool:

The Iron Rose
May 12, 2012

:minnie: Cat Army :minnie:

DrDork posted:

The answer is to switch it from "civil" to "criminal."

Now, I will fully admit this is a bad idea, but it's the least bad idea that also actually fixes the problem. As has been mentioned above, fines are basically useless so long as insurance is an option, and these leaks are common enough that they no longer provide a meaningful PR hit for most companies. So there's really almost no incentive for most companies to give a gently caress these days.

Until the threat of jail time lights a fire under a CISO/CIO's rear end, nothing is going to change.

This is a breathtakingly stupid take especially when the CISO often has far less power, authority, or competence than you might think.

imo, literally nothing will meaningfully reduce the frequency of compromise when defenders need to cover every hole and attackers need to make only one. Penalizing CISOs is just like making ransomware payments illegal. It just drives things underground and further incentivizes companies to not disclose or remediate breaches.

So the solution has to be a little bit different. Embrace radical transparency, flood the zone, stop pretending social security numbers authenticate anything, and if there’s information that needs to be secure, for the love of god don’t put it on anything electronic.

The incentives are whatever. It’s an unsolvable problem to begin with. At most you can deter your everyday cyber criminals, but at the end of the day any sufficiently motivated and funded attacker will be able to compromise you. And with the increasing democratization of malware as a service… ransomware gangs are quickly becoming very well resourced these days.

The Iron Rose fucked around with this message at 23:21 on Aug 15, 2021

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Martytoof posted:

I’m having a really hard time posting an opinion on the state of auditing without breaking a handful of separate NDAs. I’ve literally deleted this post like five times before hitting submit. One day I’ll gain some clarity :cool:

I mean, you don't have to get into NDA-breaking detail to see the broad brush-strokes here: large tech sectors are almost entirely unregulated, many have vague security guidance from places like NIST or CISA (if they have guidance at all), and those that do have guidance/regulations in place are only loosely audited because it would take a mid-sized army to have enough people to functionally audit basically every mid-sized company on up in the US and lol that's just never going to happen.

So, yeah, the tl;dr is that outside absolute critical industries, government-based auditing will never be a functional answer.

"You need to do these security things or insurance companies can reject your claim" would help, but then you're stuck trying to make effective guidance for every single business sector, and that's also an enormous task unless you're just going to keep the guidance so vague and shallow that it doesn't do a whole lot. Which is the tactic that's been taken so far, in a lot of cases.

The super fun part is that, for as bad as the situation is in the US, it's worse in pretty much every other country in the world. Sometimes considerably worse.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

The Iron Rose posted:

This is a breathtakingly stupid take especially when the CISO often has far less power, authority, or competence than you might think.

imo, literally nothing will meaningfully reduce the frequency of compromise when defenders need to cover every hole and attackers need to make only one. Penalizing CISOs is just like making ransomware payments illegal. It just drives things underground and further incentivizes companies to not disclose or remediate breaches.

So the solution has to be a little bit different. Embrace radical transparency, flood the zone, stop pretending social security numbers authenticate anything, and if there’s information that needs to be secure, for the love of god don’t put it on anything electronic.

The argument would be that you'd hold the senior individuals responsible for negligence (CIO if not CISO, whatever, depends on corporate structure), not just every time a new 0-day pops out, because yeah there's poo poo you can really do about that in a lot of cases. Right now negligence is just a fine, if that, ergo no one cares. Getting the exact legal structures tuned would certainly take effort, and probably have some false starts (hence my noting it's a bad idea to begin with), but there are only two things that are ever going to change companies not giving a poo poo: (1) throwing some people in jail, or (2) fines so large they bankrupt some major companies. And (2) seems iffy unless you either ban insurance or do a whole lot of work in terms of providing functional, actionable security guidance, and that would likely take something like increasing CISA's size by an order of magnitude.

I find it kinda funny you start off saying how breathtakingly stupid the idea is, only to then suggest a shift to just brazenly blasting all our PII out for everyone on the open internet. If you think SSNs will stop being used for sensitive transactions within our lifetime, well that is...uh...breathtakingly stupid. Same goes for ignoring the numerous reasons people have for not wanting their address, phone numbers, etc. all readily available to anyone who wants them. There's a reason people want that data to stay private, after all: transparency isn't a viable option for a lot of it, and never will be.

And non-electronic data stores? Like...paper? The business cost and personal hassle of that would be so bad at this point that I think most people would just prefer to have their data leaked every now and then.

Fart Amplifier
Apr 12, 2003

The Iron Rose posted:

So the solution has to be a little bit different. Embrace radical transparency, flood the zone, stop pretending social security numbers authenticate anything, and if there’s information that needs to be secure, for the love of god don’t put it on anything electronic.

"Make medical records public or put them on paper" isn't a solution to anything

Tryzzub
Jan 1, 2007

Mudslide Experiment
decimate the IT staff every time a breach occurs, in the roman sense

BrianRx
Jul 21, 2007

DrDork posted:

Until the threat of jail time lights a fire under a CISO/CIO's rear end, nothing is going to change.

We could also jail CEOs and other C-level leadership just because. I'd support that.

Is it technologically feasible to attach the collecting/storing party's information to PII in a way that it could be read but not altered so that it is possible, at least in some cases, to definitively prove who exposed it? I guess I'm thinking about something like metadata that includes the chain of custody of that particular copy of information. I think the main issue is that data is ephemeral and some means of making it less so might add accountability. I know any measure can be defeated, but car dealerships tend not to mess with odometers or swap VINs despite that being pretty easy to do.

Tryzzub posted:

decimate the IT staff every time a breach occurs, in the roman sense

This too. It's the only way I would've gotten off help desk. In a sys admin role or a body bag.

Martytoof posted:

I’m having a really hard time posting an opinion on the state of auditing without breaking a handful of separate NDAs. I’ve literally deleted this post like five times before hitting submit. One day I’ll gain some clarity :cool:

Blink twice if there are firms that are known to guarantee a pass regardless of actual controls and practices.

BrianRx fucked around with this message at 00:18 on Aug 16, 2021

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

BrianRx posted:

We could also jail CEOs and other C-level leadership just because. I'd support that.

I mean, in a perfect world, right? But yeah, none of that is ever going to actually happen; we'll all collectively just keep shrugging like it's impossible to hold anyone accountable and that we should just accept the state of things and move on, because anything else would be Rather Hard to implement in a functional manner (which, frankly, is true--it'd be hard, but I'm not convinced that's enough to excuse not bothering to put in the work to try to make things better).

BrianRx posted:

Is it technologically feasible to attach the collecting/storing party's information to PII in a way that it could be read but not altered so that it is possible, at least in some cases, to definitively prove who exposed it? I guess I'm thinking about something like metadata that includes the chain of custody of that particular copy of information. I think the main issue is that data is ephemeral and some means of making it less so might add accountability. I know any measure can be defeated, but car dealerships tend not to mess with odometers or swap VINs despite that being pretty easy to do.

You're basically describing a blockchain concept. So yeah, it's possible, at least in theory. But the costs of involving blockchain in many cases end up far outweighing the benefits of it (at least in terms of MBAs' views on cost, where the cost of implementing new tech is $$$ and the 'cost' of your data being lost is basically zero). Plus once some hacker has the data, it's likely they could strip it out of the blockchain and now you have free-floating PII again anyhow.

But even then, "where did this data come from?" often isn't even a real question: the people executing the attacks straight up tell us where they got it, and often provide some measure of proof to back up their claims. The problem is that right now we know when a company made hilariously poor/short-sighted security decisions that resulted in data leaks, but there's minimal consequence for the company--it's the "ok, we suffered a leak, so what?" part that's proving hard to fix.
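For what it's worth, the "readable but not alterable" tag doesn't strictly need a full blockchain; a signed custody record gives you tamper-evidence on its own. A minimal sketch -- it uses an HMAC with a shared key for brevity, where a real scheme would use an asymmetric signature so anyone could verify without holding the signing key, and every name here is made up:

```python
import hashlib
import hmac
import json

SIGNING_KEY = b"custodian-secret"  # hypothetical; real use would be a keypair

def tag_record(record: dict, custodian: str) -> dict:
    """Attach a custody entry whose MAC covers both the data and the custodian ID,
    so the provenance can be read but not silently rewritten."""
    payload = json.dumps(record, sort_keys=True).encode()
    mac = hmac.new(SIGNING_KEY, payload + custodian.encode(), hashlib.sha256).hexdigest()
    return {"data": record, "custodian": custodian, "mac": mac}

def verify(tagged: dict) -> bool:
    payload = json.dumps(tagged["data"], sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload + tagged["custodian"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tagged["mac"])

t = tag_record({"name": "J. Doe", "plate": "ABC123"}, custodian="ParkCo")
assert verify(t)                 # intact: the custody claim checks out
t["custodian"] = "SomeoneElse"   # altering the provenance...
assert not verify(t)             # ...breaks the check
```

Of course this only proves who signed a copy, which is exactly the limitation above: a thief can strip the tag, and leakers usually brag about the source anyway.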

DrDork fucked around with this message at 00:25 on Aug 16, 2021

Thanks Ants
May 21, 2004

#essereFerrari


The best way to deal with PII is to just not store it. If you're making an app so that people can pay for parking in your city using a smartphone then you just need the number plate on the car and to take payment through Apple Pay or Google Pay or whatever which can also handle the receipt for you so you don't really need to store an email address either. There's no need to know someone's name, address, date of birth etc. but people are just used to these services asking for it.

If I'm setting up a smart thermostat then you only need to handle my login which you can do by interfacing with Google, Apple, Microsoft so you don't even need to deal with passwords. My address is only relevant if you have to ship me something, if some geolocation features are being used, or if you're using it to lookup outside weather. If those features aren't being used then you don't need that information either.

I think you almost have to accept that you're not 100% unbreachable, even if you do everything right it just takes an employee to do something maliciously to achieve the same result. If you're responsible for keeping user data safe then you need to start by just not storing it wherever possible. Stuff like social security IDs as primary keys is long past the point where it was acceptable.
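The parking example as a schema sketch (all field names made up): keep a random surrogate key and the minimum the service actually needs, instead of an identity dossier keyed on something sensitive.

```python
import uuid
from dataclasses import dataclass, field

@dataclass
class ParkingSession:
    # Random surrogate key: worthless to an attacker if the table leaks,
    # unlike an SSN-as-primary-key design.
    session_id: str = field(default_factory=lambda: uuid.uuid4().hex)
    plate: str = ""          # needed to enforce the parking
    payment_token: str = ""  # opaque reference from Apple Pay / Google Pay
    # Deliberately absent: name, address, date of birth, email.

s = ParkingSession(plate="ABC123", payment_token="tok_x9f")
```

A breach of this table exposes plates and opaque payment tokens, not the "name, address, date of birth etc." bundle those services habitually ask for.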

Ynglaur
Oct 9, 2013

The Malta Conference, anyone?

DrDork posted:

The argument would be that you'd hold the senior individuals responsible for negligence (CIO if not CISO, whatever, depends on corporate structure), not just every time a new 0-day pops out, because yeah there's poo poo you can really do about that in a lot of cases. Right now negligence is just a fine, if that, ergo no one cares. Getting the exact legal structures tuned would certainly take effort, and probably have some false starts (hence my noting it's a bad idea to begin with), but there are only two things that are ever going to change companies not giving a poo poo: (1) throwing some people in jail, or (2) fines so large the bankrupt some major companies. And (2) seems iffy unless you either ban insurance or do a whole lot of work in terms of providing functional, actionable security guidance, and that would likely take something like increasing CISA's size by an order of magnitude.

I find it kinda funny you start off saying how breathtakingly stupid the idea is, only to then suggest a shift to just brazenly blasting all our PII out for everyone on the open internet. If you think SSNs will stop being used for sensitive transactions within our lifetime, well that is...uh...breathtakingly stupid. Same for missing numerous reasons people have for not wanting their address, phone numbers, etc all readily available to anyone who wants them. There's a reason people want that data to stay private, after all: transparency isn't a viable option for a lot of it, and never will be.

And non-electronic data stores? Like...paper? The business cost and personal hassle of that would be so bad at this point that I think most people would just prefer to have their data leaked every now and then.

If you want to understand where power sits in an organization, look at budgets. In most Fortune 500 companies, CIOs have an annual discretionary budget somewhere around 10% of the CFO's. The CFO, in turn, has around 10% of the discretionary spend of the CMO. How many years has it taken for MFA to finally get traction? And even then it gets disabled for C-suite executives who find it too onerous, even though security research from Microsoft shows that MFA alone would reduce breaches by at least an order of magnitude.

CIOs have very little real power in most large companies, and most large companies have already set up the CIO and/or CISO to take any falls related to data breaches anyways. It sounds good in theory, but has not worked in practice.

some kinda jackal
Feb 25, 2003

 
 

BrianRx posted:

Blink twice if there are firms that are known to guarantee a pass regardless of actual controls and practices.

I mean, I’m pretty sure I wouldn’t be breaking any NDA if I was guessing that literally every organization can and has, in an audit: talked its way out of scope, exaggerated or made up controls, and overwhelmed auditors with jumbles of technical documentation in hope that the gory details are lost in the chaos.

Because that’s how I feel about audits :cool:

BrianRx
Jul 21, 2007

Martytoof posted:

I mean, I’m pretty sure I wouldn’t be breaking any NDA if I was guessing that literally every organization can and has, in an audit: talked its way out of scope, exaggerated or made up controls, and overwhelmed auditors with jumbles of technical documentation in hope that the gory details are lost in the chaos.

Because that’s how I feel about audits :cool:

Yeah, but that's on the side of the organization being audited. I'm talking about like sketchy smog check places where you don't have to pay if you don't pass.

Klyith
Aug 3, 2007

GBS Pledge Week

DrDork posted:

The answer is to switch it from "civil" to "criminal."

Now, I will fully admit this is a bad idea, but it's the least bad idea that also actually fixes the problem. As has been mentioned above, fines are basically useless so long as insurance is an option, and these leaks are common enough that they no longer provide a meaningful PR hit for most companies. So there's really almost no incentive for most companies to give a gently caress these days.

Until the threat of jail time lights a fire under a CISO/CIO's rear end, nothing is going to change.

Yes, that would definitely light a fire under the CISO/CIO's rear end. It would light a fire under their rear end to produce a ton of documentation that their underlings must follow for all the approved procedures that this new criminal code requires, and the CISO/CIO will only sign their names on projects that follow said documentation.

Now it's your job as project manager to build a system with many additional constraints and you have been given zero additional money. Your options:
1. deliver a project late and over budget, get fired
2. skimp on the security i-dots and t-crosses, go to jail if you get hacked
3. work 12 hour days trying to do everything right


I think looking at other traditional industries is instructive. They have fines for violations and criminal liability for criminally negligent actions, but they also have strong unions such that Joe Steelworker can report that the bolts keep shearing off under load and maybe not get fired. Even in industries where people actually die from corporate fuckups, the C-level shitheads only go to jail when they were the ones covering up the reports or bribing the investigators.

Volmarias
Dec 31, 2002

EMAIL... THE INTERNET... SEARCH ENGINES...

BrianRx posted:

Blink twice if there are firms that are known to guarantee a pass regardless of actual controls and practices.

You trying to give someone epilepsy or

BrianRx
Jul 21, 2007

DrDork posted:

You're basically describing a blockchain concept. So yeah, it's possible, at least in theory. But the costs of involving blockchain in many cases end up far outweighing the benefits of it (at least in terms of MBAs' views on cost, where the cost of implementing new tech is $$$ and the 'cost' of your data being lost is basically zero).

I'm skeptical that any impetus to increase data security is going to come from companies themselves, so I think the corporate cost/benefit analysis could be sidestepped. If it comes, I think it will have to be forced on them through regulation. As far as cost goes, I think increasing it would possibly reduce the amount of collection and retention of data and result in fewer potential targets for hackers.

quote:

Plus once some hacker has the data, it's likely they could strip it out of the blockchain and now you have free floating PII again anyhow.

I don't know that they would want to remove that information for the same reason they advertise where they got it now. It would add weight to claims that it's genuine and from a particular source, which is usually part of the value of a dataset.

quote:

But even then, "where did this data come from?" often isn't even a real question: the people executing the attacks straight up tell us where they go it, and often provide some measure of proof to back up their claims. The problem is that right now we know when a company made hilariously poor/short sighted security decisions that resulted in data leaks, but there's minimal consequence for the company--it's the "ok, we suffered a leak, so what?" part that's proving hard to fix.

Based on the comments I've heard from practitioners (including you folks) it seems to me that ultimately the problem to be solved is how to make the relevant people care. That's everyone from CEOs to shareholders to lawmakers. I think more people would have to become aware of what is being collected and how it's used (including illicitly) and start making noise about it to either force action from the government or make breaches embarrassing again. I'm not sure that'll ever happen, though. Even people convinced COVID shots give you 5G use Chrome and carry an Android phone around in their pockets. What would shake them?

droll
Jan 9, 2020

by Azathoth
Constitutional amendment right to privacy?

wargames
Mar 16, 2008

official yospos cat censor

droll posted:

Constitutional amendment right to privacy?

But think of the billionaires.

BonHair
Apr 28, 2007

GDPR seems to be doing a lot of good in terms of information security, at least in Denmark and in companies I hear about (and they have a bias towards wanting better security).

But a fun different vector is the insurance companies: when they realize how much they have to pay for companies with open access to everything, they might require companies to have, say, implemented CIS controls if they want to be insured (or have a non crazy premium).

Volmarias
Dec 31, 2002

EMAIL... THE INTERNET... SEARCH ENGINES...

BonHair posted:

GDPR seems to be doing a lot of good in terms of information security, at least in Denmark and in companies I hear about (and they have a bias towards wanting better security).

But a fun different vector is the insurance companies: when they realize how much they have to pay for companies with open access to everything, they might require companies to have, say, implemented CIS controls if they want to be insured (or have a non crazy premium).

Only if they have to pay out money to make others whole as a result of the insured company's breach. At this point, the only cost of a breach is PR, which I suspect is exactly the calculation any company offering insurance is expecting.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Thanks Ants posted:

The best way to deal with PII is to just not store it. If you're making an app so that people can pay for parking in your city using a smartphone then you just need the number plate on the car and to take payment through Apple Pay or Google Pay or whatever which can also handle the receipt for you so you don't really need to store an email address either. There's no need to know someone's name, address, date of birth etc. but people are just used to these services asking for it.

...

If you're responsible for keeping user data safe then you need to start by just not storing it wherever possible. Stuff like social security IDs as primary keys is long past the point where it was acceptable.

In concept sure, but you're missing some parts here: most data is collected not because it's particularly required, but because it's potentially useful in the future if a company wants to mine it for...whatever. Marketing campaigns can't happen if they don't know where to send the spam, so they collect your email. Maybe they wonder about how many of their customers are from out of town vs locals, so they collect your billing address. Everyone wants to know market demographics of their customers so they collect your age. And so on and so forth. They do it because if it gets leaked, well, basically no cost to them, and if they have the data on hand and it's useful at some point in the future, well hey, great! All it cost them was a couple of hard drives of space.

Another part is refactor costs: while there's no additional fee for using Apple/Google/whatever Pay, not everyone has it. Not everyone wants them. So at minimum you probably have to have a traditional payment processor in addition to *Pay anyhow. And if you're a smaller business who already set up the traditional processor a few years ago, before *Pay was much of a thing, why would you bother with the extra expense of going back and adding in new features like that for a small set of customers in order to better protect their data at no benefit to your company?

The lovely thing is that a lot of companies don't act like they're really responsible for keeping your info safe. At best they look at it as their own proprietary dataset that they'd be grumpy about letting others see because maybe that gives away some details about their business that they'd prefer not to be public.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Volmarias posted:

Only if they have to pay out money to make others whole as a result of the insured company's breach. At this point, the only cost of a beach is PR, which I suspect is exactly the calculation any company offering insurance is expecting.

Bingo. It's a minor PR hit and then maybe they bother to pay for credit monitoring for a year. Not a big cost outlay, especially compared to other aspects of cybersecurity like recovering from ransomware or--heaven forbid--paying for actual security.

PII data leaks are the Trump scandals of the security world: most people at this point are just so tired of hearing of them they've stopped caring about any particular one, and are just generally grumpy about the whole aggregate situation without any idea what to do to change it.

BonHair posted:

GDPR seems to be doing a lot of good in terms of information security, at least in Denmark and in companies I hear about (and they have a bias towards wanting better security).

Yeah, GDPR is certainly a step in the right direction, and is helpfully pulling along a lot of US companies simply because it's easier to apply GDPR requirements to all customers rather than try to split behaviors up by location. The requirements aren't perfect by any means, but they're at least something.

DrDork fucked around with this message at 16:54 on Aug 16, 2021

CommieGIR
Aug 22, 2006

The blue glow is a feature, not a bug


Pillbug
https://twitter.com/Jeremy_Kirk/status/1427144723731402756?s=20

Mopp
Oct 29, 2004

this is the dumbest question but this thread has done wonders for me in the past...

I am taking Offsec's exploit development course but the lab time has run out. They use a trial of software called IBM Tivoli Storage Manager FastBack server v6.1.4 in their lab. I want to find and install this trial, but IBM's website is a loving mess and Google isn't helping. Any wild ideas on how I could find it?

BonHair
Apr 28, 2007

DrDork posted:


Yeah, GDPR is certainly a step in the right direction, and is helpfully pulling along a lot of US companies simply because it's easier to apply GDPR requirements to all customers rather than try to split behaviors up by location. The requirements aren't perfect by any means, but they're at least something.

The fun thing is that it's 95% old rules here, the main addition is the fines. The typical story we see is

  1. Company decides to not get a huge fine.
  2. Company looks up what to do.
  3. Company realizes they're in over their heads.
  4. Company hires consultants to help. In my experience either my company (good) or some lawyers (bad).
  5. We help Company organise their poo poo, which means figuring out what PII they have, what they use it for and what systems they put it in.
  6. We help Company assign some responsible people/roles for their systems.
  7. Company begins actually thinking about their data.

This is very basic and takes a huge amount of time, struggle and effort. But the end result is that Company is a lot more mature in their handling of data, and with some luck (assisted by "you need to tell people what legitimate purpose you are collecting their data for" from GDPR), they will actually only collect and save stuff they need. And maybe even consider protecting it if they identify a risk along the way. Clear responsibility helps, because quite often, the business owner assumes IT takes care of security, and vice versa, leading to no actual security. Hammering out that one or the other is responsible means they can't deflect as easily.
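To make steps 5-7 concrete, here's a minimal sketch of the kind of PII inventory that work produces. All the system names, categories, and owners below are made up for illustration; a real one would follow whatever record-of-processing template your consultants use.

```python
from dataclasses import dataclass, field


@dataclass
class PiiRecord:
    """One row of a hypothetical data inventory: what you hold, where, why, and who owns it."""
    system: str                      # where the data lives
    categories: list = field(default_factory=list)  # what PII it contains
    purpose: str = ""                # the legitimate purpose you'd tell data subjects
    owner: str = ""                  # the responsible role, so nobody can deflect


inventory = [
    PiiRecord("CRM", ["name", "email"], "customer contact", "Head of Sales"),
    PiiRecord("HR system", ["name", "salary"], "payroll", "HR Manager"),
]

# Anything with no stated purpose or no named owner is exactly the kind of
# "we collect it just in case" data that should be flagged for cleanup.
red_flags = [r.system for r in inventory if not r.purpose or not r.owner]
```

The point isn't the tooling, it's that once every system has a purpose and a named owner on record, "I thought IT handled that" stops working as an excuse.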

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

BonHair posted:

The fun thing is that it's 95% old rules here, the main addition is the fines. The typical story we see is

And yet on the other side of the pond we have crap like this:


Pearson, a giant standardized testing company, had a data breach involving 13,000 schools' admin credentials, millions of students' names, DOBs, emails, etc. They then tried to play it a little too aggressively with their PR campaign about it and basically said it wasn't a big deal and that a bunch of the data didn't get taken when they drat well knew it did.

So the SEC fined them $1M, on a net profit of about $400M in 2020.

13,000 schools and universities' admin creds leaked, blatant lying about it, and the sum total cost to the company is 0.25% of their profits this year. No wonder it's hard to get anyone to take this whole thing seriously--they probably paid more on exec expensed lunches for the year.

BonHair
Apr 28, 2007

To be fair, actual fines from GDPR have been pretty much slaps on the wrist over here too. You have to be super blatant to get the full amount.

The fun thing for private citizens is the right to know what information a company has on you. It's usually a huge hassle for them to find it all, but if they don't bother, that's a very clear crime on their part. So if you are really pissed at a company, it's a godsend for pettiness. I made a bank waste a bunch of time and got them a slap on the wrist (no actual fine) from the data protection agency just by sending a couple of short emails.

BrianRx
Jul 21, 2007

DrDork posted:

And yet on the other side of the pond we have crap like this:

Pearson, a giant standardized testing company, had a data breach involving 13,000 schools' admin credentials, millions of students' names, DOBs, emails, etc. They then tried to play it a little too aggressively with their PR campaign about it and basically said it wasn't a big deal and that a bunch of the data didn't get taken when they drat well knew it did.

So the SEC fined them $1M, on a net profit of about $400M in 2020.

13,000 schools and universities' admin creds leaked, blatant lying about it, and the sum total cost to the company is 0.25% of their profits this year. No wonder it's hard to get anyone to take this whole thing seriously--they probably paid more on exec expensed lunches for the year.

You know, the one good thing about neoliberalism (in this very, very specific context) is that companies' fiduciary responsibility to investors requires them to generate the maximum ROI regardless of means. That means crushing labor and destroying the environment, but it also means taking preventative measures to avoid foreseeable penalties that would reduce returns. If breaches could be made to cost even a dollar more than infrastructure investment, investor pressure would force leadership to view security differently. That would still require regulation with consistent enforcement, as well as a commitment not to reduce fines below the cost of maintaining effective infrastructure. As you mentioned, the SEC is a great example of how that would probably go, though, considering the size and resources of companies involved in the data economy. But even low fines might dissuade companies that, as a poster above described, collect data "just in case" as a secondary part of their business.

BonHair posted:

The fun thing for private citizens is the right to know what information a company has on you. It's usually a huge hassle for them to find it all, but if they don't bother, that's a very clear crime on their part. So if you are really pissed at a company, it's a godsend for pettiness. I made a bank waste a bunch of time and got them a slap on the wrist (no actual fine) from the data protection agency just by sending a couple of short emails.

I am curious how easy it is to access information about a specific person by abusing a legitimate mechanism like the one you describe or by searching through public data dumps. I have a coworker who spends a lot of time and money keeping her name off of search engines because of an abusive ex-partner. It'd be pretty hosed up if he could locate her because her local gym or supermarket didn't know what it was doing and exposed a database with customer information to the internet.

BrianRx fucked around with this message at 20:23 on Aug 16, 2021

BonHair
Apr 28, 2007

BrianRx posted:


I am curious how easy it is to access information about a specific person by abusing a legitimate mechanism like the one you describe or by searching through public data dumps. I have a coworker who spends a lot of time and money keeping her name off of search engines because of an abusive ex-partner. It'd be pretty hosed up if he could locate her because her local gym or supermarket didn't know what it was doing and exposed a database with customer information to the internet.

In my experience, depressingly easy. I don't know about dumps, but just writing a mail from weedgoku1488@gmail.com asking for anything they have on me, John Q Uniquename, is likely to succeed. I think the best line of defense is having a common enough name that it would likely need more information to be disambiguated.
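A toy illustration of why the common name helps (all names and emails below are invented): with only a name to go on, the company can't tell which "John Smith" is asking, while a unique name pins down exactly one record.

```python
# Hypothetical customer table a company might search when answering a request.
customers = [
    {"name": "John Smith", "email": "js1@example.com"},
    {"name": "John Smith", "email": "js2@example.com"},
    {"name": "John Q Uniquename", "email": "jqu@example.com"},
]


def lookup(name: str) -> list:
    """Return every record matching the name -- the only identifier in the request."""
    return [c for c in customers if c["name"] == name]

# lookup("John Smith") is ambiguous (two hits), so the company should demand
# more proof of identity; lookup("John Q Uniquename") matches exactly one record.
```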

CommieGIR
Aug 22, 2006

The blue glow is a feature, not a bug


Pillbug
https://twitter.com/RayRedacted/status/1427641178380546049?s=20

New episode of Darknet Diaries is out. It's good.

BrianRx
Jul 21, 2007

This was really cool, thanks. I remember reading the story about this in the NY Times, but they understandably didn't include Igor in it. I love this show, but every few episodes I learn something that makes me extremely paranoid for a few days.

BonHair posted:

In my experience, depressingly easy. I don't know about dumps, but just writing a mail from weedgoku1488@gmail.com asking for anything they have on me, John Q Uniquename, is likely to succeed. I think the best line of defense is having a common enough name that it would likely need more information to be disambiguated.

Yeah, I just submitted a few requests to OneTrust, Palantir, and a few of their clients. OneTrust immediately responded and said they had no data on me, but could not provide any information about what may be held by their customers. I reached out to Moody's, which has some credit and background check affiliates, and they are processing my request. Palantir just had an email address to contact so I don't know what information they will require from me. At no point in any transaction have I been asked for anything but my name and a contact email. If I had a common last name, I really don't know how they would differentiate between individuals.

I also found a data broker that offers trial access to their database through an API. Name and address are fields available to trial users and more sensitive information is available to subscribers. I requested an API key and we'll see how that goes. There at least seems to be a review process with a human involved.

BrianRx fucked around with this message at 19:19 on Aug 17, 2021

CommieGIR
Aug 22, 2006

The blue glow is a feature, not a bug


Pillbug
https://twitter.com/BleepinComputer/status/1428585535312891913?s=20

evil_bunnY
Apr 2, 2003

DrDork posted:

So the SEC fined them $1M, on a net profit of about $400M in 2020.

this is why gdpr max fines are defined as % of turnover.
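For a sense of scale: GDPR's upper tier caps fines at the greater of EUR 20M or 4% of worldwide annual turnover (Article 83(5)). The turnover figure below is invented just to show how that compares to Pearson's $1M.

```python
def gdpr_max_fine(turnover_eur: float) -> float:
    """Upper-tier GDPR cap: the greater of EUR 20M or 4% of annual turnover."""
    return max(20_000_000, 0.04 * turnover_eur)

# A hypothetical Pearson-sized company with EUR 4bn turnover:
fine = gdpr_max_fine(4_000_000_000)  # EUR 160M exposure, vs the SEC's $1M
```

Even a small company still faces the EUR 20M floor, which is why the percentage scheme bites at every size.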


Volmarias
Dec 31, 2002

EMAIL... THE INTERNET... SEARCH ENGINES...

evil_bunnY posted:

this is why gdpr max fines are defined as % of turnover.

And is precisely why large companies actually take this seriously, at least from what I've seen.
