FlamingLiberal
Jan 18, 2009

Would you like to play a game?



Not satisfied with ruining everything else, SCOTUS appears to be preparing to ruin the Internet

https://twitter.com/scotusblog/status/1576932123801374721?s=46&t=Lu2DkZrjK_LqH81B3t0b6w


bird food bathtub
Aug 9, 2003

College Slice

BIG-DICK-BUTT-gently caress posted:

and yet the DOJ has indicated that they won't prosecute. Maybe they have access to information of which you are unaware.

Patience, young padawan, all will be revealed in time ;)

Of the two available options, my knowledge of recent American history, and the cynicism it has created in me, leads me to give much more credence to the option predicated on a white male politician being given preferential treatment and let go with no punishment. There's just a much, much larger weight of history on that side of the scale.

I AM GRANDO
Aug 20, 2006

FlamingLiberal posted:

Not satisfied with ruining everything else, SCOTUS appears to be preparing to ruin the Internet

https://twitter.com/scotusblog/status/1576932123801374721?s=46&t=Lu2DkZrjK_LqH81B3t0b6w

So is pornography going to be illegal now? What are the specifics of that communications decency case?

haveblue
Aug 15, 2005



Toilet Rascal

I AM GRANDO posted:

So is pornography going to be illegal now? What are the specifics of that communications decency case?

https://www.scotusblog.com/2022/04/...up-section-230/

tldr: The father of one of the victims of the November 2015 Paris terror attacks sued Google for allowing the Youtube algorithm to recommend ISIS recruitment videos. Section 230 is clear that merely hosting content is protected, but the open question is whether algorithmic recommendations are editorial decisions that would cause Youtube to lose protection.

haveblue fucked around with this message at 15:39 on Oct 3, 2022

projecthalaxy
Dec 27, 2008

Yes hello it is I Kurt's Secret Son


I guess this is a stupid naive take, but stopping youtube from recommending isis recruitment videos (or instagram from, by their own admission, trying to give teen girls eating disorders, or facebook from promoting nazi pages "to see what would happen") seems... good to do? Like those things shouldn't get to happen and companies should get obliterated for doing them?

the_steve
Nov 9, 2005

We're always hiring!

projecthalaxy posted:

I guess this is a stupid naive take, but stopping youtube from recommending isis recruitment videos (or instagram from, by their own admission, trying to give teen girls eating disorders, or facebook from promoting nazi pages "to see what would happen") seems... good to do? Like those things shouldn't get to happen and companies should get obliterated for doing them?

I would imagine the argument against it involves the slippery slope.
First it bans things like ISIS recruitment videos, but then further down the line platforms are banned from recommending resources for some kid looking for information on transitioning, because the right people greased the right palms to put that under the ban umbrella, and so on and so on.

Twincityhacker
Feb 18, 2011

I think it's more a case of "we are taking a stand against Nazis!" that then gets abused by people for whom "Nazis" means "people we dislike."

Which almost always turns into "how can we use this tool to hurt queer people?"

EDIT: See also, "how all queer content is porn in some people's hosed up minds."

Twincityhacker fucked around with this message at 15:47 on Oct 3, 2022

projecthalaxy
Dec 27, 2008

Yes hello it is I Kurt's Secret Son


OK, yeah given the court and how America works and so forth that makes sense but maybe also people shouldn't be able to run ISIS tapes and do a 4 billion dollar ad campaign for bulimia and I wish there was a way to make them stop? Is this one of those "paradox of intolerance" things?

Kalit
Nov 6, 2006

The great thing about the thousands of slaughtered Palestinian children is that they can't pull away when you fondle them or sniff their hair.

That's a Biden success story.

projecthalaxy posted:

OK, yeah given the court and how America works and so forth that makes sense but maybe also people shouldn't be able to run ISIS tapes and do a 4 billion dollar ad campaign for bulimia and I wish there was a way to make them stop? Is this one of those "paradox of intolerance" things?

Fighting ISIS recruitment/etc is probably best done at higher levels. Trying to go after individuals posting videos is not very effective. In my naïve view, I'm guessing doing things such as educating and supporting vulnerable communities will go a lot further. Granted, there are issues with those approaches as well.

Issues like these will never be solved with a [metaphorical] silver bullet. Let's just hope that the approaches being taken by organizations such as the Global Coalition against Daesh are much more helpful than harmful.

Kalit fucked around with this message at 16:25 on Oct 3, 2022

bird food bathtub
Aug 9, 2003

College Slice

projecthalaxy posted:

OK, yeah given the court and how America works and so forth that makes sense but maybe also people shouldn't be able to run ISIS tapes and do a 4 billion dollar ad campaign for bulimia and I wish there was a way to make them stop? Is this one of those "paradox of intolerance" things?

Not really the paradox of tolerance. More an understanding that we can't have nice things because any tool created and used for obvious things like industrialised bulimia manufacturing will end up in the hands of hateful bigots looking for ways to harm their preferred target. Nothing can be crafted with the belief that it will be used in good faith, everything needs to be viewed in the hateful bigot light.

Fart Amplifier
Apr 12, 2003

projecthalaxy posted:

I guess this is a stupid naive take, but stopping youtube from recommending isis recruitment videos (or instagram from, by their own admission, trying to give teen girls eating disorders, or facebook from promoting nazi pages "to see what would happen") seems... good to do? Like those things shouldn't get to happen and companies should get obliterated for doing them?

AFAIK the way the cases are potentially heading is that if companies remove certain content, then they lose the protection afforded by section 230. IE, if you remove the ISIS recruitment videos, you are moderating the content and can be held liable for anything else hosted on the platform.

These lawsuits are brought because right wingers want to make it impossible for providers to ban misinformation on things like vaccines.

FlamingLiberal
Jan 18, 2009

Would you like to play a game?



We’ve seen the slippery slope with abortion already. The SC did not ban contraception, but in several states we have already seen a chilling effect on contraception coverage because providers are afraid that they will run afoul of the new abortion restrictions.

What will likely happen if holes are punched in Section 230 is that companies will just restrict a lot of content that is not necessarily offensive but that they consider possibly bad for them in terms of liability. Which then ends the modern internet as we know it.

There are also efforts underway by anti-abortion states and groups to water down Section 230 so that say, you can sue Google for hosting websites that provide information on how to access abortions or get at-home abortifacients if you are in a place like Missouri.

Kalli
Jun 2, 2001



Fart Amplifier posted:

AFAIK the way the cases are potentially heading is that if companies remove certain content, then they lose the protection afforded by section 230. IE, if you remove the ISIS recruitment videos, you are moderating the content and can be held liable for anything else hosted on the platform.

These lawsuits are brought because right wingers want to make it impossible for providers to ban misinformation on things like vaccines.

But they already moderate content, otherwise all those sites would look like pornhub, which also moderates content.

I AM GRANDO
Aug 20, 2006

I always forget that right-wing media is hugely tied up in alt-med scams and selling poo poo to the rubes they cultivate by destroying their minds.

Twincityhacker
Feb 18, 2011

In a nicer world, the FDA would regulate dietary supplements, but I think it's too baked into right-wing grifts to regulate right now. =/

Kalli
Jun 2, 2001



I wouldn't say just right wing. Doesn't Gwyneth Paltrow's Goop company sell most of the same stuff as Alex Jones? I always saw it in the same vein as how anti-vaxx stuff migrated from wealthier white women to the right with the same rebranding.

Looking it up: yes to some; others are sold through another wellness brand she recommends a lot:
https://qz.com/1010684/all-the-wellness-products-american-love-to-buy-are-sold-on-both-infowars-and-goop/

Laughing at:

Goop: Vapour Beauty’s Mesmerize Eye Shimmer

vs

Infowars: Occu Power

Riptor
Apr 13, 2003

here's to feelin' good all the time

Fart Amplifier posted:

AFAIK the way the cases are potentially heading is that if companies remove certain content, then they lose the protection afforded by section 230. IE, if you remove the ISIS recruitment videos, you are moderating the content and can be held liable for anything else hosted on the platform.

These lawsuits are brought because right wingers want to make it impossible for providers to ban misinformation on things like vaccines.

No, the point of contention is whether the site is making editorial decisions, which is where you lose your protections, not whether it moderates content. The argument is that once you, as a site, are making editorial decisions, you are no longer just a bulletin board that other people are making posts/arguments/points on. The allegation is that algorithmic recommendation is, itself, an editorial decision, which is stupid.

I AM GRANDO
Aug 20, 2006

Alt-med has been a right-wing mainstay since the 1970s, growing on the back of Barry Goldwater’s mailing lists. That’s not to deny that there are other avenues for it, or that it hasn’t been historically popular with other populations.

Rick Perlstein wrote the definitive history for The Baffler last decade: https://thebaffler.com/salvos/the-long-con

haveblue
Aug 15, 2005



Toilet Rascal

Riptor posted:

The allegation is that algorithmic recommendation is, itself, an editorial decision, which is stupid

Why is it stupid? Youtube is not a fully autonomous strong AI, every decision the algorithm makes was the result of human input into its priorities. It surfaces and buries videos based on what Youtube's human operators believe optimizes their business, and not based on social good or anything like that because that's how the humans configured it. It may have evolved into a black box that no one person fully understands, and it may have grown far too large to transition to direct human oversight of all content, but that doesn't necessarily protect them from liability or responsibility to fix it

haveblue fucked around with this message at 16:51 on Oct 3, 2022

projecthalaxy
Dec 27, 2008

Yes hello it is I Kurt's Secret Son


Also when they constantly admit to like A/B testing a Facebook algorithm specifically designed to drive people into suicidal despair vs one that doesn't and decide to keep the first one cause it gets more clicks we can't necessarily just say "oops all computers"

Paracaidas
Sep 24, 2016
Consistently Tedious!

Kalli posted:

But they already moderate content, otherwise all those sites would look like pornhub, which also moderates content.

Fart Amplifier's point was that where the lawsuits are heading is that if you do that sort of moderation, then you are liable for everything you haven't moderated.

Techdirt is on the more fearmongery side of things when it comes to internet legislation, but two good summaries are "Why Section 230 Reform Effectively Means Section 230 Repeal" and "Hello! You've Been Referred Here Because You're Wrong About Section 230 Of The Communications Decency Act". If anyone is curious about the topic, I'd recommend giving both a read.

From the former, regarding the risk of algorithmic carve-outs (which is the legislative version of what Gonzalez is asking the court to invent in Section 230):

quote:

Algorithmic display carve-outs. Algorithmic display has become a target for many lawmakers eager to take a run at Section 230. But as with every other proposed reform, changing Section 230 so that it no longer applies to platforms using algorithmic display would end up obliterating the statute for just about everyone. And it’s not quite clear that lawmakers proposing these sorts of changes quite realize this inevitable impact.

And part of the problem seems to be that they don’t really understand what an algorithm is, or how commonly they are used. They seem to regard it as something nefarious, but there’s nothing about an algorithm that inherently is. The reality is that nearly every platform uses software in some way to handle the display of user-provided content, and algorithms are just the programming logic coded into the software giving it the instructions for how to display that content. Moreover, these instructions can even be as simple as telling the software to display the content chronologically, alphabetically, or some other relevant way the platform has decided to render content, which the First Amendment protects. After all, a bookstore can decide to shelve books however it wants, including in whatever order or with whatever prominence it wants. What these algorithms do is implement these sorts of shelving decisions, just as applied to the online content a platform displays.

If algorithms were to end up banned by making the Section 230 protection platforms need to host user-generated content contingent on not using them, it would make it impossible for platforms to actually render any of that content. They either couldn’t do it technically, if they were to abide by this rule withholding their Section 230 protection, or legally if that protection were to be withheld because they used this display. Such a rule would also represent a fairly significant change to Section 230 itself by gutting the protection for moderation decisions, since those decisions are often implemented by an algorithm. In any case, conditioning Section 230 on not using algorithms is not a small change but one that would radically upend the statutory protection and all the online services it enables.

Again, Techdirt is on the radical side of this (in part due to the Thiel-aligned nuisance suit it had to fight from "The Inventor of Email," intended to send it the way of Gawker for pointing out that he's a thin-skinned litigious fraud), so read those takes in context and take them with a grain of salt.
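To make Techdirt's point concrete: the "algorithm" at issue can be as mundane as a sort order. Here's a minimal sketch (the class, field names, and "engagement" scoring are entirely hypothetical, not anything Youtube actually runs):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Video:
    title: str
    uploaded: datetime
    views: int

def display_order(videos, mode="chronological"):
    """Pick a display order. Each branch is an 'algorithm' in
    exactly the sense the Techdirt piece describes: programming
    logic that decides how user content gets shown."""
    if mode == "chronological":
        return sorted(videos, key=lambda v: v.uploaded, reverse=True)
    if mode == "alphabetical":
        return sorted(videos, key=lambda v: v.title.lower())
    # "Engagement" ranking: still just a sort, with a different key
    return sorted(videos, key=lambda v: v.views, reverse=True)
```

Under a broad algorithmic carve-out, all three branches, including plain newest-first, would count as the platform choosing what to surface.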

FlamingLiberal posted:

There are also efforts underway by anti-abortion states and groups to water down Section 230 so that say, you can sue Google for hosting websites that provide information on how to access abortions or get at-home abortifacients if you are in a place like Missouri.

Just picking nits in an otherwise accurate post: it's not even that Google is hosting the content, it's that they're providing links to sites hosted elsewhere.

The oversimplified way to understand Section 230 is that it answers the question "Who is responsible for the content on a site, the uploading user or the site itself?" with "The user. Next question?" That doesn't mean that sites can't be liable (Roommates.com, for instance, wasn't liable for user postings that violated the Fair Housing Act; it was liable for its dropdown menu for racial preferences), it just massively narrows the scope of what they can be sued for.

ETA:

haveblue posted:

Why is it stupid? Youtube is not a fully autonomous strong AI, every decision the algorithm makes was the result of human input into its priorities. It surfaces and buries videos based on what Youtube's human operators believe optimizes their business, and not based on social good or anything like that because that's how the humans configured it. It may have evolved into a black box that no one person fully understands, and it may have grown far too large to transition to direct human oversight of all content, but that doesn't necessarily protect them from liability or responsibility to fix it
It's stupid because "someone uploaded a terrorist video and your algorithm promoted it" is indistinguishable from "someone uploaded a terrorist video and you have an algorithm", and for all the many horrors of youtube and facebook's engagement strategies, "you lose all liability protection for user content if you do any curation, moderation, or serve content in anything but a purely random way" is a godawful and worse solution.

Paracaidas fucked around with this message at 16:58 on Oct 3, 2022

Barrel Cactaur
Oct 6, 2021

Changes to 230 are too complex for the binary of a courtroom. You need actual reforms to the underlying law (you almost certainly don't, but alas)

Ironically, removing these protections would make all that right-wing hate posting, doxing, etc. disappear fast, as it's the most dangerously criminal content to host.

Also internet comment sections will be gone. And of course SA will perish from the earth.

Rappaport
Oct 2, 2013

haveblue posted:

Why is it stupid? Youtube is not a fully autonomous strong AI, every decision the algorithm makes was the result of human input into its priorities. It surfaces and buries videos based on what Youtube's human operators believe optimizes their business, and not based on social good or anything like that because that's how the humans configured it. It may have evolved into a black box that no one person fully understands, and it may have grown far too large to transition to direct human oversight of all content, but that doesn't necessarily protect them from liability or responsibility to fix it

I was going to make a joke about corporations being people, too, my friend, but dang, now I'm half-way convinced The Algorithm may have Dial F for Frankenstein'd itself to an emergent alien intelligence :tinfoil:

Mooseontheloose
May 13, 2003

projecthalaxy posted:

Also when they constantly admit to like A/B testing a Facebook algorithm specifically designed to drive people into suicidal despair vs one that doesn't and decide to keep the first one cause it gets more clicks we can't necessarily just say "oops all computers"

To add, we also know that Facebook is basically allowing right-wing extremist groups to run the site because they pay so much money to manipulate the algorithm.

silence_kit
Jul 14, 2011

by the sex ghost

Kalli posted:

I wouldn't say just right wing. Doesn't Gwyneth Paltrow's Goop company sell most of the same stuff as Alex Jones? I always saw it in the same vein as how anti-vaxx stuff migrated from wealthier white women to the right with the same rebranding.

Yeah, I think it is a little misleading to declare alternative medicine to be a predominantly right wing subject. E.g. alternative medicine is a big part of New Age beliefs and New Age beliefs are more popular among liberals than conservatives.

Baronash
Feb 29, 2012

So what do you want to be called?

Kalit posted:

Fighting ISIS recruitment/etc is probably best done at higher levels. Trying to go after individuals posting videos is not very effective. In my naïve view, I'm guessing doing things such as educating and supporting vulnerable communities will go a lot further. Granted, there are issues with those approaches as well.

Issues like these will never be solved with a [metaphorical] silver bullet. Let's just hope that the approaches being taken by organizations such as the Global Coalition against Daesh are much more helpful than harmful.

Cool, now do white supremacists. Or Q. Or anti-vaxxers.

A huge component of the rise of all of these was Youtube's algorithm that aimed to maximize time watching videos, and did so by feeding you progressively more (I'm struggling to come up with a better term here) *intense* versions of what you already watched. It essentially created a path from alt-lite folks like Pewdiepie or various edgy gamer personas to hardcore "great replacement"-types. Now this wasn't solely for the purpose of pushing extremist content, (if you're watching a video on woodworking, you probably would be interested in a deeper dive video on wood glues or using a new tool) but extremists certainly took advantage of the phenomenon.

Does Youtube bear an editorial responsibility for this? I believe they do. So, at least to an extent, does Youtube itself: they made changes to the algorithm designed to weaken that pipeline in response to intense public pressure.

This particular SC taking up a case related to Section 230 is obviously not being done in good faith, but the Court being your enemy doesn't make Youtube your friend.

VitalSigns
Sep 3, 2011

E: drat it thought this was SCOTUS thread sorry!

VitalSigns fucked around with this message at 19:08 on Oct 3, 2022

VikingofRock
Aug 24, 2008




Twincityhacker posted:

EDIT: See also, "how all queer content is porn in some people's hosed up minds."

This one is wild. I've had two separate conversations about the Don't Say Gay bill where several minutes into the discussion, I realized that the other person thought that talking about LGBTQ people in schools meant that third graders were learning about the mechanics of anal sex. It took several more minutes of explaining to get them to conceive of the idea that being gay is about more than sex, and that you actually don't need to mention sex at all to say "these two women love each other".

Captain_Maclaine
Sep 30, 2001

Every moment that I'm alive, I pray for death!

VikingofRock posted:

This one is wild. I've had two separate conversations about the Don't Say Gay bill where several minutes into the discussion, I realized that the other person thought that talking about LGBTQ people in schools meant that third graders were learning about the mechanics of anal sex. It took several more minutes of explaining to get them to conceive of the idea that being gay is about more than sex, and that you actually don't need to mention sex at all to say "these two women love each other".

Homophobes aren't particularly deep thinkers, generally. To them, any sort of non heteronormative identity is Gross Pervert Sex Thing, without anything beyond that in terms of being or feeling, etc. Not for nothing is a recurring joke in the freeper thread that they obsess about anal more than gay men do.

Main Paineframe
Oct 27, 2010

the_steve posted:

I would imagine the argument against it involves the slippery slope.
First it bans things like ISIS recruitment videos, but then further down the line platforms are banned from recommending resources for some kid looking for information on transitioning, because the right people greased the right palms to put that under the ban umbrella, and so on and so on.

Algorithm-driven sites already tend to penalize stuff like that. They tend to categorize LGBT content and resources as too "political" and "controversial" and penalize them in searches... unlike unhinged right-wing rants about murdering all Jews or about how someone needs to stop "the globalists" at all costs, which are a-OK for these platforms.

Rappaport
Oct 2, 2013

Captain_Maclaine posted:

Homophobes aren't particularly deep thinkers, generally. To them, any sort of non heteronormative identity is Gross Pervert Sex Thing, without anything beyond that in terms of being or feeling, etc. Not for nothing is a recurring joke in the freeper thread that they obsess about anal more than gay men do.



Culture war stuff is hilarious and :sad:

Paracaidas
Sep 24, 2016
Consistently Tedious!

VitalSigns posted:

How is that any different from a newspaper letting people submit editorials and then deliberately promoting extremist ones to sell more papers.
Indeed they are similar, in that both are currently protected speech and any restriction on either would be used to further suppress left-wing, antiracist, and/or antifascist dissent and rhetoric.

VitalSigns posted:

They could just, you know, NOT recommend violent content on purpose to get people hooked. They don't even have to take it down they just have to stop trying on purpose to get people to watch it!
While it's true that they wouldn't have to take it down if Gonzalez wins, they would have to take it down if they want to:
  • Display videos in chronological order
  • Display videos in alphabetical order
  • Group videos into genres or topics
  • Show users the other videos of the creator they just watched
  • Utilize any automated moderation to prevent the uploading of child pornography, revenge porn, bleach-centric mental health cures, or even terrorist recruiting videos
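That last bullet is worth dwelling on: automated moderation is itself an algorithm. A toy filter along these lines (the hash set and keyword list are made up for illustration; real platforms use industry hash-sharing programs like this in spirit, not this code):

```python
import hashlib

# Hypothetical blocklist: known-bad file hashes plus banned title keywords
BLOCKED_HASHES = {"5d41402abc4b2a76b9719d911017c592"}  # md5(b"hello"), placeholder
BANNED_KEYWORDS = {"recruitment-video-xyz"}

def allow_upload(data: bytes, title: str) -> bool:
    """Reject uploads whose content matches a known-bad hash or whose
    title contains a banned keyword. Under a 'no algorithms' reading
    of Section 230, even this trivial check could count as curation."""
    digest = hashlib.md5(data).hexdigest()
    if digest in BLOCKED_HASHES:
        return False
    return not any(kw in title.lower() for kw in BANNED_KEYWORDS)
```

The point being: there's no principled line between "moderation", "curation", and "algorithm" at the level of code, which is why conditioning 230 protection on not using algorithms guts the whole statute.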

Again, FB, Google, and the rest of the socials have a tremendous amount to answer for, but "oops, you curated content in any form and now you're liable for everything on your website" (the world Gonzalez is attempting to create) worsens the situation and further consolidates market and sociological power in the hands of the biggies like Google, Facebook, and their ilk.

Professor Beetus
Apr 12, 2007

They can fight us
But they'll never Beetus

Rappaport posted:



Culture war stuff is hilarious and :sad:

That tweet has to be the absolute apex of conservatives going "is this thing gay?" Fellas, is it gay to find women shaking their asses attractive? The answer may surprise you!

haveblue
Aug 15, 2005



Toilet Rascal

Rappaport posted:



Culture war stuff is hilarious and :sad:

https://www.youtube.com/watch?v=CjGolrmbk9s

Young Freud
Nov 26, 2006

FlamingLiberal posted:

Not satisfied with ruining everything else, SCOTUS appears to be preparing to ruin the Internet

https://twitter.com/scotusblog/status/1576932123801374721?s=46&t=Lu2DkZrjK_LqH81B3t0b6w

For a moment I thought this was NetChoice v. Paxton (which argues that platforms with 50 million users are public entities and cannot moderate content). What effect would the outcome of Gonzalez v. Google have on that case?

Judgy Fucker
Mar 24, 2006

Rappaport posted:



Culture war stuff is hilarious and :sad:

BlueBlazer
Apr 1, 2010

Baronash posted:

Cool, now do white supremacists. Or Q. Or anti-vaxxers.

A huge component of the rise of all of these was Youtube's algorithm that aimed to maximize time watching videos, and did so by feeding you progressively more (I'm struggling to come up with a better term here) *intense* versions of what you already watched. It essentially created a path from alt-lite folks like Pewdiepie or various edgy gamer personas to hardcore "great replacement"-types. Now this wasn't solely for the purpose of pushing extremist content, (if you're watching a video on woodworking, you probably would be interested in a deeper dive video on wood glues or using a new tool) but extremists certainly took advantage of the phenomenon.

Does Youtube bear an editorial responsibility for this? I believe they do. So, at least to an extent, does Youtube itself: they made changes to the algorithm designed to weaken that pipeline in response to intense public pressure.

This particular SC taking up a case related to Section 230 is obviously not being done in good faith, but the Court being your enemy doesn't make Youtube your friend.

The act of recommendation is a form of editorializing that should require human consent. I frankly hope for a Dune-style Jihad that wipes algorithmic decision-making off the map.

Crain
Jun 27, 2007

I had a beer once with Stephen Miller and now I like him.

I also tried to ban someone from a Discord for pointing out what an unrelenting shithead I am! I'm even dumb enough to think it worked!

BlueBlazer posted:

The act of recommendation is a form of editorializing that should require human consent. I frankly hope for a Dune-style Jihad that wipes algorithmic decision-making off the map.

Butlerian Jihad.

Return the internet to the days of Angelfire, Geocities, and Usenet.

Jaxyon
Mar 7, 2016
I’m just saying I would like to see a man beat a woman in a cage. Just to be sure.

Crain posted:

Butlerian Jihad.

Return the internet to the days of Angelfire, Geocities, and Usenet.

Never underestimate the influence of the Webring


Twincityhacker
Feb 18, 2011

...I miss Geocities. It might not have been more than the barebones of a website, but it was mine.
