Morbus
May 18, 2004

CNNs (which are the main technique for this kind of computer vision these days) are famous both for replicating any biases in their training data, and for performing very poorly when trained on an insufficiently broad set of examples. They are also trained against very large example sets and usually judged on overall accuracy--outside of that overall success metric, the importance of particular cases mostly comes down to engineer followup and judgement.

So if black faces are insufficiently represented in the training data, and/or you optimize your machine for an overall accuracy metric that can still be very good even when black faces aren't detected as well as white ones, you'll have some problems. Like, a classifier that is 10x better at identifying white male faces, at the expense of a 4x reduction in detection accuracy for everyone else, may still score "higher" than a more neutral one, due to inherent biases in the training data. Additionally, white engineers are less likely to notice or care about the biases in the training data, and even less likely to notice or care about poor performance on black faces, as long as the overall score of the algorithm goes up.
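To make that concrete, here's a toy sketch of how an overall accuracy number can reward a skewed classifier. The group sizes and per-group accuracies are made up purely for illustration:

```python
# How an overall accuracy metric can hide per-group disparity.
# All numbers below are hypothetical.

def overall_accuracy(groups):
    """groups: list of (n_examples, per_group_accuracy) tuples."""
    total = sum(n for n, _ in groups)
    return sum(n * acc for n, acc in groups) / total

# A "neutral" classifier: 90% accuracy for everyone.
neutral = [(8000, 0.90), (2000, 0.90)]

# A skewed classifier: better on the majority group, much worse on the minority.
skewed = [(8000, 0.96), (2000, 0.70)]

print(overall_accuracy(neutral))  # 0.90
print(overall_accuracy(skewed))   # 0.908 -- scores "higher" overall
```

The skewed model wins on the single aggregate number even though it's dramatically worse for a fifth of the population, which is exactly the failure mode described above.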

Another (bigger) problem: can you really say a data set is "biased" if it accurately reflects the real world? If reality itself underrepresents or undervalues black women, for example, the most "accurate" machine will often be similarly biased. If the data is full of racist patterns, then a pattern recognition machine will itself be racist. This was famously the case when companies used machine learning to "predict" the probability of a candidate being hired.

It's actually kind of a beautiful and horrifying illustration of the nature of institutional racism vs. individual bigotry. Even a stone cold dumb machine, with no concept of race, running a totally abstract, platonic ideal of an algorithm that has no human input or guidance and is indeed totally opaque to its human creators (i.e. an artificial neural network), cannot help but replicate the racist nature of the reality it is trained to understand.

You can try to band-aid the problem by going in and tweaking training data or changing metrics--in much the same way that you can try to address institutionalized racism via individual wokeness--but fundamentally the racist structure of society guides its constituents into racist patterns almost irrespective of their individual programming or goals, and without requiring their consent or even awareness.
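The "tweaking training data or changing metrics" band-aids mentioned above usually look something like reweighting underrepresented groups, or scoring a model on its worst group instead of the average. A minimal sketch, with hypothetical numbers, assuming the same `(n_examples, accuracy)` group format as before:

```python
# Two common band-aids, sketched: reweight training examples so every
# group counts equally, and judge models by worst-group accuracy.
# Group sizes and accuracies are hypothetical.

def reweight(groups):
    """Per-example weights giving each group equal total weight."""
    total = sum(n for n, _ in groups)
    k = len(groups)
    return [total / (k * n) for n, _ in groups]

def worst_group_accuracy(groups):
    """Judge a model by its weakest group, not the average."""
    return min(acc for _, acc in groups)

skewed = [(8000, 0.96), (2000, 0.70)]

print(reweight(skewed))              # minority examples weighted 4x heavier
print(worst_group_accuracy(skewed))  # 0.70 -- the skew is no longer hidden
```

These fixes change what the machine is optimized for, but--as the paragraph above says--they patch symptoms in the metric rather than the patterns in the underlying data.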


Morbus
May 18, 2004

YOLOsubmarine posted:

I didn’t say don’t use ML, I said don’t use it when the results are lovely and it literally just reproduces structural inequalities already baked into society. It’s a tool, and a fairly dull one at times. But plenty of companies treat it like the problem is in the tool (we don’t know why the model is racist, it’s unintentional, we’re trying to fix it) as opposed to their choice to use the tool for that particular problem.

But the very definition of a racist society in the first place is that "reproducing structural inequalities" is a feature, not a bug. Just look at how badly people lose their poo poo when you do anything else.

This isn't to say companies can't* or shouldn't do better, but it's not exactly mysterious why they favor a machine that performs well in the ways they care about without being bothered by it performing poorly in ways they don't care about.

*lol capitalism.
