ate shit on live tv
Feb 15, 2004

by Azathoth
oh for sure. i thought this was some kind of "all tools are products of a racist society so we shouldn't use them" take. but in this case the answer is: capitalism will make "number go up" and drat the consequences. neither climate change, racism, loss of biodiversity, nor human lives will ever be heeded.

Morbus
May 18, 2004

YOLOsubmarine posted:

I didn't say don't use ML, I said don't use it when the results are lovely and it literally just reproduces structural inequalities already baked into society. It's a tool, and a fairly dull one at times. But plenty of companies treat it like the problem is in the tool (we don't know why the model is racist, it's unintentional, we're trying to fix it) as opposed to their choice to use the tool for that particular problem.

But the very definition of a racist society in the first place is that "reproducing structural inequalities" is a feature, not a bug. Just look at how badly people lose their poo poo when you do anything else.

This isn't to say companies can't* or shouldn't do better, but it's not exactly mysterious why they favor a machine that performs well in the ways they care about without being bothered by it performing poorly in ways they don't care about

*lol capitalism.

MoaM
Dec 1, 2009

Joyous.

Charles 2 of Spain posted:

Everyone who works in computer vision is a cop

bvj191jgl7bBsqF5m
Apr 16, 2017

Í̝̰ ͓̯̖̫̹̯̤A҉m̺̩͝ ͇̬A̡̮̞̠͚͉̱̫ K̶e͓ǵ.̻̱̪͖̹̟̕

e-dt posted:

The racism computer? I think we should turn it off, OP.

But how will I game

Atrocious Joe
Sep 2, 2011

The solution is ML ML

Machine Learning Marxism Leninism

Gods_Butthole
Aug 9, 2020
Probation
Can't post for 8 years!

StashAugustine posted:

I'm taking a biometrics course to round out a double major and I think it's driving me insane. Lecture this morning was on facial recognition of mugshots (all mugshots were of black people), then later in lab we had a news report about British police employing human "super recognizers" to surveil protests. It's all some loving Voight-Kampff poo poo

lol, what the gently caress is a "super recognizer"? How does someone get a job like that?

Grand Theft Autobot
Feb 28, 2008

I'm something of a fucking idiot myself

YOLOsubmarine posted:

I didn’t say don’t use ML, I said don’t use it when the results are lovely and it literally just reproduces structural inequalities already baked into society. It’s a tool, and a fairly dull one at times. But plenty of companies treat it like the problem is in the tool (we don’t know why the model is racist, it’s unintentional, we’re trying to fix it) as opposed to their choice to use the tool for that particular problem.

Like someone else said up thread, it's about being able to shift blame to the computer.

As an example, I worked for a few years on an ML project for bail setting in criminal courts. Judges routinely gave black defendants higher bail amounts than white defendants charged with the same or similar offenses. Part of the reason was that judges used an evaluation tool that assigned points based on poo poo like stable employment, level of education, marital status, geography, whether the person owned or rented, and so on.

Some consultants came in with a solution: a bail evaluation tool with a machine learning algorithm that would take the racism out of the process with magic, and better yet the judges would no longer have the freedom to set bail amounts outside of the ranges recommended by the algorithm.

As it turns out, not only did the ML tool make zanily racist decisions on our test data, it actually discriminated against black defendants even more so than the judges did. This was because judges often assigned lower bail amounts than the old tool recommended, and sometimes they would let defendants go home without bail to wait for trial, even if the tool said to lock them up.

Because judges, despite many of them having biases against non-white people, can overcome their biases with effort, compassion, and empathy. It's easier, in my view, to make judges better at correcting social injustices than it is to craft the perfect algorithm to do the hard work for us.

In the end, much of the drive toward ML in public service is a bunch of egghead liberals who think we can fix society by adjusting the knobs ever so carefully. As if we can just move the racism slider a little to the left and leave a parasitic and depraved institution in place. Cash bail should be abolished. Full stop.
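To make the proxy problem concrete, here's a toy sketch in Python. Everything in it is synthetic and invented (feature names, weights, rates), not anything from the actual project: the point is just that you can drop race from the inputs entirely and the model still reproduces the gap, because the features it does see encode race anyway.

code:
# Synthetic toy example: a model trained on biased historical bail
# decisions, with race deliberately excluded from the feature matrix.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000
race = rng.integers(0, 2, n)  # 0 = white, 1 = black (synthetic labels)

# Proxies shaped by structural inequality: redlined geography,
# employment gaps, and over-policed neighborhoods inflating priors.
zip_risk = np.clip(0.3 + 0.4 * race + rng.normal(0, 0.15, n), 0, 1)
employed = rng.random(n) < (0.7 - 0.2 * race)
priors = rng.poisson(1 + 0.5 * race)

# Historical "high bail" labels come from the old points system, which
# leaned on exactly these proxies, so the training labels are biased too.
points = 2 * zip_risk + 1.5 * (~employed) + priors
high_bail = points + rng.normal(0, 0.5, n) > 2.5

X = np.column_stack([zip_risk, employed, priors])  # note: no race column
pred = LogisticRegression().fit(X, high_bail).predict(X)

for g, name in [(0, "white"), (1, "black")]:
    rate = pred[race == g].mean()
    print(f"predicted high-bail rate, {name} defendants: {rate:.1%}")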

Grand Theft Autobot has issued a correction as of 02:08 on Sep 25, 2020

Dr Pepper
Feb 4, 2012

Don't like it? well...

So how long until captchas make us click on pictures of black people?

YOLOsubmarine
Oct 19, 2004

When asked which Pokemon he evolved into, Kamara pauses.

"Motherfucking, what's that big dragon shit? That orange motherfucker. Charizard."

Morbus posted:

But the very definition of a racist society in the first place is that "reproducing structural inequalities" is a feature, not a bug. Just look at how badly people lose their poo poo when you do anything else.

This isn't to say companies can't* or shouldn't do better, but it's not exactly mysterious why they favor a machine that performs well in the ways they care about without being bothered by it performing poorly in ways they don't care about

*lol capitalism.

I don't think it's mysterious that companies behave this way, they're just following their prime directive. It's still important to call bullshit on it and point out that it's evil when people try to come up with technocratic explanations for why twitter's algorithm simply must be racist, because mathematics itself demands it.

Grand Theft Autobot posted:

Like someone else said up thread, it's about being able to shift blame to the computer.

....


I agree with all of this, I was merely pointing out that the fact that decision-making in neural net ML models is opaque even to the people developing them is not a valid excuse for a company continuing to use a model that they know is broken in pretty disgusting ways. To say nothing of releasing it publicly in the first place.

StashAugustine
Mar 24, 2013

Do not trust in hope- it will betray you! Only faith and hatred sustain.

Gods_Butthole posted:

lol, what the gently caress is a "super recognizer"? How does someone get a job like that?

In theory some people are abnormally good at spotting facial similarities, like how some people are savants at math or whatever, but I'm pretty much assuming in practice it's an excuse to invent probable cause

A Buttery Pastry
Sep 4, 2011

Delicious and Informative!
:3:

Gods_Butthole posted:

lol, what the gently caress is a "super recognizer"? How does someone get a job like that?
First you must watch a TON of porn.

Random Asshole
Nov 8, 2010

Dr Pepper posted:

So how long until captchas make us click on pictures of black people?

Not likely, the machines don't even realize they exist.

animist
Aug 28, 2018
plugging the yospos computer racism thread where we also talk about computer racism

Charles 2 of Spain
Nov 7, 2017

https://twitter.com/baumard_nicolas/status/1308715606196342784
Lmao what the hell is this

Main Paineframe
Oct 27, 2010

evolutionary psychology for the social sciences, focused on economics

Platystemon
Feb 13, 2012

BREADS

Main Paineframe posted:

evolutionary psychology for the social sciences, focused on economics

powerfully cursed premise

Smythe
Oct 12, 2003

Grand Theft Autobot posted:

Like someone else said up thread, it's about being able to shift blame to the computer.

As an example, I worked for a few years on an ML project for bail setting in criminal courts. Judges routinely gave black defendants higher bail amounts than white defendants charged with the same or similar offenses. Part of the reason was that judges used an evaluation tool that assigned points based on poo poo like stable employment, level of education, marital status, geography, whether the person owned or rented, and so on.

Some consultants came in with a solution: a bail evaluation tool with a machine learning algorithm that would take the racism out of the process with magic, and better yet the judges would no longer have the freedom to set bail amounts outside of the ranges recommended by the algorithm.

As it turns out, not only did the ML tool make zanily racist decisions on our test data, it actually discriminated against black defendants even more so than the judges did. This was because judges often assigned lower bail amounts than the old tool recommended, and sometimes they would let defendants go home without bail to wait for trial, even if the tool said to lock them up.

Because judges, despite many of them having biases against non-white people, can overcome their biases with effort, compassion, and empathy. It's easier, in my view, to make judges better at correcting social injustices than it is to craft the perfect algorithm to do the hard work for us.

In the end, much of the drive toward ML in public service is a bunch of egghead liberals who think we can fix society by adjusting the knobs ever so carefully. As if we can just move the racism slider a little to the left and leave a parasitic and depraved institution in place. Cash bail should be abolished. Full stop.

really cool post. i enjoyed it and will retell it to friends without citing you as anyone other than "some guy online posted." bl*ss

Jewel Repetition
Dec 24, 2012

Ask me about Briar Rose and Chicken Chaser.

I read it and the idea is that you can tell what people find normal/expected/desirable by how artists draw portraits, because it's never photorealistic and they always exaggerate or deemphasize some features. So they took some large random samples of people, surveyed what faces they found trustworthy, trained an AI on that data, and then applied the AI to portraits. The result is that perceived trustworthiness has become more normal as crime has fallen and large-scale social cohesion has gone up, and it's correlated with GDP, probably because people will act more honorably when making one mistake won't lead them to die
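Mechanically, the pipeline being described is something like this toy sketch (pure placeholder noise: the real study used face photos, survey ratings, and historical portraits, none of which is reproduced here):

code:
# Toy version of the described pipeline, with random stand-in data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Step 1: survey data. Feature vectors for rated faces plus mean scores.
face_features = rng.normal(size=(500, 16))   # stand-in for facial measurements
trust_ratings = rng.uniform(1, 7, size=500)  # stand-in for survey responses

# Step 2: learn a mapping from face features to perceived trustworthiness.
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(face_features, trust_ratings)

# Step 3: score portraits from different periods and look for a time trend.
portrait_features = rng.normal(size=(200, 16))  # stand-in for portrait faces
portrait_years = rng.integers(1500, 2000, size=200)
scores = model.predict(portrait_features)

r = np.corrcoef(portrait_years, scores)[0, 1]
print(f"year vs. predicted trustworthiness correlation: {r:+.2f}")
# With random inputs this correlation is pure noise, which is roughly
# the objection raised a few posts down.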

endlessmonotony
Nov 4, 2009

by Fritz the Horse

Gods_Butthole posted:

lol, what the gently caress is a "super recognizer"? How does someone get a job like that?

Racist with an excuse.

Atrocious Joe
Sep 2, 2011

https://twitter.com/adamjohnsonNYC/status/1309126364205850627?s=20

I realize this was actual people at tech companies making the decision, and not the *~algorithm~*, but this is still infuriating.

Hodgepodge
Jan 29, 2006
Probation
Can't post for 233 days!

Jewel Repetition posted:

I read it and the idea is that you can tell what people find normal/expected/desirable by how artists draw portraits, because it's never photorealistic and they always exaggerate or deemphasize some features. So they took some large random samples of people, surveyed what faces they found trustworthy, trained an AI on that data, and then applied the AI to portraits. The result is that perceived trustworthiness has become more normal as crime has fallen and large-scale social cohesion has gone up, and it's correlated with GDP, probably because people will act more honorably when making one mistake won't lead them to die

oh boy, just so stories with graphs, thanks evo psych

Robo Reagan
Feb 12, 2012

by Fluffdaddy

Real Mean Queen posted:

I figured the title was referring to that microsoft (?) chatbot that went from “hello world” to “hitler did nothing wrong” within like a day, but I was also thinking it might refer to the youtube algorithm that says “oh you like any kind of video about any topic? We think you’ll enjoy racism.” The fact that “the racism computer” is vague enough to not refer to any one racist computer makes me think that maybe we’re not doing a great job having computers.


https://www.youtube.com/watch?v=HsLup7yy-6I

probably :nsfw:

Gods_Butthole
Aug 9, 2020
Probation
Can't post for 8 years!

A Buttery Pastry posted:

First you must watch a TON of porn.

Done (with gusto), what's the next step?

YOLOsubmarine
Oct 19, 2004

When asked which Pokemon he evolved into, Kamara pauses.

"Motherfucking, what's that big dragon shit? That orange motherfucker. Charizard."

Smythe posted:

really cool post. i enjoyed it and will retell it to friends without citing you as anyone other than "some guy online posted." bl*ss

"Weapons of Math Destruction" and "Technically Wrong" are both books that cover this same ground and have a lot of similar examples.

There's also this ProPublica article about Compas, which is black box software that generates a recidivism score that judges use during sentencing.

quote:

ON A SPRING AFTERNOON IN 2014, Brisha Borden was running late to pick up her god-sister from school when she spotted an unlocked kid's blue Huffy bicycle and a silver Razor scooter. Borden and a friend grabbed the bike and scooter and tried to ride them down the street in the Fort Lauderdale suburb of Coral Springs.

Just as the 18-year-old girls were realizing they were too big for the tiny conveyances, which belonged to a 6-year-old boy, a woman came running after them saying, "That's my kid's stuff." Borden and her friend immediately dropped the bike and scooter and walked away.

But it was too late: a neighbor who witnessed the heist had already called the police. Borden and her friend were arrested and charged with burglary and petty theft for the items, which were valued at a total of $80.

Compare their crime with a similar one: The previous summer, 41-year-old Vernon Prater was picked up for shoplifting $86.35 worth of tools from a nearby Home Depot store.

Prater was the more seasoned criminal. He had already been convicted of armed robbery and attempted armed robbery, for which he served five years in prison, in addition to another armed robbery charge. Borden had a record, too, but it was for misdemeanors committed when she was a juvenile.

Yet something odd happened when Borden and Prater were booked into jail: A computer program spat out a score predicting the likelihood of each committing a future crime. Borden, who is black, was rated a high risk. Prater, who is white, was rated a low risk.

Two years later, we know the computer algorithm got it exactly backward. Borden has not been charged with any new crimes. Prater is serving an eight-year prison term for subsequently breaking into a warehouse and stealing thousands of dollars' worth of electronics.

https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
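The core move of the ProPublica audit is simple enough to sketch: compare the score's error rates across groups. A minimal version with synthetic numbers (invented here, not the Broward County records they actually analyzed):

code:
# Synthetic illustration of a group-wise error-rate audit.
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
group = rng.integers(0, 2, n)      # 0 = white, 1 = black (synthetic)
reoffended = rng.random(n) < 0.35  # ground truth, equal base rates by design

# A biased scorer: reasonable accuracy overall, but errors skew by group.
score = 1.0 * reoffended + 0.6 * group + rng.normal(0, 1, n)
high_risk = score > 1.0

for g, name in [(0, "white"), (1, "black")]:
    m = group == g
    fpr = (high_risk & ~reoffended & m).sum() / (~reoffended & m).sum()
    fnr = (~high_risk & reoffended & m).sum() / (reoffended & m).sum()
    print(f"{name}: false positive rate {fpr:.1%}, false negative rate {fnr:.1%}")
# This is the shape of the finding: black defendants wrongly flagged as
# high risk far more often, white defendants wrongly rated low risk far
# more often.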

exmarx
Feb 18, 2012


The experience over the years
of nothing getting better
only worse.

i want to die

Platystemon
Feb 13, 2012

BREADS

Jewel Repetition posted:

I read it and the idea is that you can tell what people find normal/expected/desirable by how artists draw portraits, because it's never photorealistic and they always exaggerate or deemphasize some features. So they took some large random samples of people, surveyed what faces they found trustworthy, trained an AI on that data, and then applied the AI to portraits. The result is that perceived trustworthiness has become more normal as crime has fallen and large-scale social cohesion has gone up, and it's correlated with GDP, probably because people will act more honorably when making one mistake won't lead them to die

They trained it on portraits of rich fucks who were never one mistake from death.

StashAugustine
Mar 24, 2013

Do not trust in hope- it will betray you! Only faith and hatred sustain.

i think im now mentally prepared that if i don't get a job when i graduate i can just become the unabomber

blatman
May 10, 2009

14 inc dont mez


StashAugustine posted:

i think im now mentally prepared that if i don't get a job when i graduate i can just become the unabomber

goonabomber

Jewel Repetition
Dec 24, 2012

Ask me about Briar Rose and Chicken Chaser.

Platystemon posted:

They trained it on portraits of rich fucks who were never one mistake from death.

Yeah, but it wasn't measuring whether those people actually had the features, it was measuring whether the portrait artists gave them those features, and therefore the features were expected/valued

spacemang_spliff
Nov 29, 2014

wide pickle
Is blackface okay if you're doing it to hide from a Killbot?

Colonel Cancer
Sep 26, 2015

Tune into the fireplace channel, you absolute buffoon
No, but for other reasons than you think

spacemang_spliff
Nov 29, 2014

wide pickle

lmao

A Buttery Pastry
Sep 4, 2011

Delicious and Informative!
:3:

Gods_Butthole posted:

Done (with gusto), what's the next step?
Congratulations, you might just be the next pussy-recognizer goon. Call your local porn empire and ask them if they need someone to identify all the performers in their archive footage.

spacemang_spliff
Nov 29, 2014

wide pickle

Colonel Cancer posted:

No but for other reasons than you think

It's wrong to hide from your just deserts?

FormaldehydeSon
Oct 1, 2011


lol

Charles 2 of Spain
Nov 7, 2017

https://twitter.com/Simon_Whitten/status/1309382555918049282

Victory Position
Mar 16, 2004

YOLOsubmarine posted:

"Weapons of Math Destruction" and "Technically Wrong" are both books that cover this same ground and have a lot of similar examples.

There's also this ProPublica article about Compas, which is black box software that generates a recidivism score that judges use during sentencing.


https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

given the mentoring required to serve on the highest court in the land and everything on down, it sounds like judges are trained from a very early age to be complete babies

e-dt
Sep 16, 2019

I'm going to become unable to tell the difference between black people so that I can get a job as a super recogniser

Giga Gaia
May 2, 2006

360 kickflip to... Meteo?!

StashAugustine posted:

i think im now mentally prepared that if i don't get a job when i graduate i can just become the unabomber

finally the qcs dream realized
