Shimrra Jamaane
Aug 10, 2007

Obscure to all except those well-versed in Yuuzhan Vong lore.
It’s like Murder on the Orient Express: if the entire rest of the world was bribing Trump, how could they all be punished?


Absurd Alhazred
Mar 27, 2010

by Athanatos
https://twitter.com/joshtpm/status/1075212618476990465

Is this gun smoking, or is it just happy to see me?

Dapper_Swindler
Feb 14, 2012

Im glad my instant dislike in you has been validated again and again.

Tibalt posted:

The scuttlebutt in the other thread is that Rosneft is no longer owned by a country, and that the Middle East expert was spotted at the appeal, so more likely the KSA sovereign fund.

i guess that means the Kush got burned.

HootTheOwl
May 13, 2012

Hootin and shootin

Dapper_Swindler posted:

i guess that means the Kush got burned.

So he's in a sticky situation?
Sticky Kush?

Shimrra Jamaane posted:

It’s like Murder on the Orient Express: if the entire rest of the world was bribing Trump, how could they all be punished?
I think it's more like Monty Burns, where they all tried at once and none could hurt him. But now some are gone, so the stooge logjam is free.

Morbus
May 18, 2004

Kale posted:

...
I'd also argue that I can literally see this algorithm being exploited at play sometimes. Like I can't help but notice that youtube starts recommending me some weird loving poo poo that has little to do with what I'm searching for at times. I haven't gotten any of the Prager U stuff but every so often I'll notice a Jordan Peterson video or "Right Wing Pundit DESTROYS leftist icon" video when simply searching for a tip video on Breath of The Wild or something.

The thing is, companies like youtube know exactly what they are doing here. They'll often deflect criticism by maybe mumbling that their algorithms are simply giving users the content they most want, and then go on to discuss how it's a difficult "content moderation" issue. Like, oh gosh, the problem is there is just so much of this content out there and how are we to police it?

But this is transparently bullshit if you compare, for example, the results returned by a google search (an algorithm optimized for relevance) vs. the results returned by youtube for the same query (an algorithm optimized for clicks and view time).

Like I just tried "Islam and the West" as both a google search and in youtube. On the google search, nothing on the first page is overtly crazy, and most links are from at least ostensibly reputable organizations or books. Run the same poo poo through youtube and like right off the bat there are 3-4 crazy alt-right shitlords in the first dozen results including "Top Illuminati Grand Wizard: “We Control Islam and We'll Use It to Destroy the West - WW3".

You have to try pretty hard to get Illuminati Grand Wizard conspiracy theories in the first page of a google search for any halfway reasonable topic, but if you click on something as mundane as an Overwatch video in youtube, get ready for a gaggle of nazis in your recommendation tab ranting about feminists.

Google knows how to build algorithms which--without much human intervention or moderation--return relevant results and weed out crazy horseshit. Their search engine depends on it. The problem with extremist clickbait on social media is not a challenging engineering problem to be solved, it's a direct result of a conscious decision to optimize advertising revenue no matter what.
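
A toy sketch of the contrast Morbus is drawing here: the same pool of candidate videos ranked once by topical relevance to the query and once by predicted watch time. Nothing in it is Google's or YouTube's actual code; the candidate list, the scores, and the field names are invented for illustration.

```python
# Same candidates, two objectives. The relevance and watch-time numbers are made up.
candidates = [
    # (title, query_relevance, predicted_watch_minutes)
    ("University panel: Islam and the West",          0.92, 4.0),
    ("Documentary excerpt from a news organization",  0.88, 6.0),
    ("'Pundit DESTROYS icon' compilation",            0.30, 22.0),
    ("Illuminati Grand Wizard conspiracy rant",       0.25, 27.0),
]

def rank_by_relevance(items):
    """Search-style ranking: order purely by topical relevance to the query."""
    return sorted(items, key=lambda item: item[1], reverse=True)

def rank_by_predicted_watch_time(items):
    """Engagement-style ranking: order by how long the viewer is predicted to keep watching."""
    return sorted(items, key=lambda item: item[2], reverse=True)

if __name__ == "__main__":
    print("Relevance-optimized: ", [title for title, *_ in rank_by_relevance(candidates)])
    print("Engagement-optimized:", [title for title, *_ in rank_by_predicted_watch_time(candidates)])
```

Same inputs, different objective function, and the conspiracy rant moves from the bottom of the first list to the top of the second.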

insert_funny
Jan 5, 2013

I can never have plastic surgery, because I don't feel like chipping in another five bucks to change the picture.

Absurd Alhazred posted:

Is this gun smoking, or is it just happy to see me?

No, that's just Giuliani and his cheap cigars again.


Kale
May 14, 2010

Peacoffee posted:

It’s probably also fair to say that gamergate was greatly enabled and expanded by these new algorithms and the platforms using them.

A part of me has always known that all of these crazy beliefs taking hold in Western civilization are somehow interconnected and part of a grand influence campaign; I just never realized the sheer scope of it until people like Mueller and actually good journalists started to put it all together for the greater public. It's also somewhat encouraging to see that not all the people who drank the kool-aid have stayed that way, and that some get out and try to prevent the same thing from happening to others.

Then of course you have the 4chan incel type people that I suspect do and think these horrible things and act that way on purpose because they are truly reprehensible human beings at their very core, but the guy mentioned in that article doesn't seem like one of them.

Morbus posted:

trimmed for post length

It doesn't seem like people are buying it as much as they used to, if nothing else; at least not bodies like the European and Canadian parliaments. I'm sure the Republican Congress will just believe whatever as long as it's politically convenient, but I expect Adam Schiff to start looking into it more once the Democrats take over the House Intelligence Committee, and that Mark Zuckerberg and Susan Wojcicki will be seeing a fair bit of D.C. in 2019.

Incidentally when I search "Islam and The West" here in Canada I just seem to get panel lectures by Islamic professors that seem like scholarly videos. There's a few that look like Western crazy types but overall it strikes me as kind of balanced. Then again I also have my watch and search history permanently turned off, never like any videos (it drastically affects your search and recommendation results) and am only subscribed to about 5 channels so there's that too.

Kale fucked around with this message at 08:06 on Dec 19, 2018

alpha_destroy
Mar 23, 2010

Billy Butler: Fat Guy by Day, Doubles Machine by Night

Morbus posted:

Google knows how to build algorithms which--without much human intervention or moderation--return relevant results and weed out crazy horseshit.

Ehh... Better than YouTube? Definitely. But it isn't like Google hasn't had its share of problems. I remember when people were asking google home things like "do women feel love?" and getting MRA stuff about how women are incapable of love.

PhazonLink
Jul 17, 2010
Or when that young-earth creationist group got the first result for "what killed the dinos?"

Shimrra Jamaane
Aug 10, 2007

Obscure to all except those well-versed in Yuuzhan Vong lore.
I remember back in 2009 one of the first results for googling Martin Luther King Jr was a StormFront affiliate. It was probably like that for years.

haveblue
Aug 15, 2005



Toilet Rascal

alpha_destroy posted:

Ehh... Better than YouTube? Definitely. But it isn't like Google hasn't had its share of problems. I remember when people were asking google home things like "do women feel love?" and getting MRA stuff about how women are incapable of love.

There were also some really good autocomplete oopsies regarding Jews and Muslims IIRC

evilweasel
Aug 24, 2002

alpha_destroy posted:

Ehh... Better than YouTube? Definitely. But it isn't like Google hasn't had its share of problems. I remember when people were asking google home things like "do women feel love?" and getting MRA stuff about how women are incapable of love.

Google worked on those problems...for the Google search. Not only is it not doing so for Youtube, but it seems pretty clear that youtube is doing this by design rather than an occasional algorithm quirk.

Yiggy
Sep 12, 2004

"Imagination is not enough. You have to have knowledge too, and an experience of the oddity of life."

evilweasel posted:

Google worked on those problems...for the Google search. Not only is it not doing so for Youtube, but it seems pretty clear that youtube is doing this by design rather than an occasional algorithm quirk.

I know part of the virtue of the algorithm is that it seems to be great at locking children in place, and children are a huge part of the current YouTube userbase. My wife teaches elementary-school-age children and reports the only thing they watch is YouTube. Although I don’t see why they couldn’t have a process in place for their legion of content reviewers to flag content to be skipped over by the algorithm, except that they don’t want to.

Internet Explorer
Jun 1, 2005





Not to defend Google, but it's also a lot easier to index and categorize text than it is video. Especially once you start getting into things like political nuance. They're probably working on it with tools like their Video Intelligence API. But yeah, it's an area that needs to be talked about and looked at.

evilweasel
Aug 24, 2002

Internet Explorer posted:

Not to defend Google, but it's also a lot easier to index and categorize text than it is video. Especially once you start getting into things like political nuance. They're probably working on it with tools like their Video Intelligence API. But yeah, it's an area that needs to be talked about and looked at.

Their algorithms specifically prioritize conspiracy and radicalization content. To fix it they'd need to stop prioritizing videos in the way they do now, which is based on how many of them you're likely to watch at once or something similar iirc. They know their algorithms do this, they know it's a problem, but they're not willing to do fundamental changes because keeping you glued to the screen maximizes ad revenue and if the way to do that is conspiracy/radicalization videos then well those dump trucks don't fill themselves with money.

Internet Explorer
Jun 1, 2005





evilweasel posted:

Their algorithms specifically prioritize conspiracy and radicalization content. To fix it they'd need to stop prioritizing videos in the way they do now, which is based on how many of them you're likely to watch at once or something similar iirc. They know their algorithms do this, they know it's a problem, but they're not willing to do fundamental changes because keeping you glued to the screen maximizes ad revenue and if the way to do that is conspiracy/radicalization videos then well those dump trucks don't fill themselves with money.

That is all almost certainly true but doesn't change, disagree with, or disprove what I said. That's why I said it needs to be talked about and looked into.

Z. Autobahn
Jul 20, 2004

colonel tigh more like colonel high
The thing about Youtube's algorithms is that they're fundamentally susceptible to clickbait and provocation, which in turn lets provocative content rise to the top. There was an article about how going down a Youtube rabbit hole on ANY subject leads to radicalization, like if you start looking up vegetarian recipes you'll end up at furiously political veganism, if you look up running you'll end up at medically-dubious ultra-running, etc etc.

The problem isn't that the algorithms prioritize conspiracy content, it's that people are drawn to shocking, provocative and extreme content, and any model that relies on curating based on clicks is going to push those to the top. You can see this in the children's Youtube debacle of last year, where basically "kids naturally click on hosed up poo poo that makes them scared and uncomfortable", so Youtube kids was full of like CG Pregnant Batman Getting a Root Canal. What you need isn't just algorithms, it's active moderation to suppress that content, i.e. gate-keeping.
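
One minimal sketch of the "moderation layer" idea Z. Autobahn is calling for, under the assumption that human reviewers produce a set of flagged video IDs and the ranker zeroes out (or demotes) anything in that set before ordering by predicted engagement. The Video type, the flag set, and the demotion factor are all made up for the example; this illustrates the gate-keeping concept, not any platform's real pipeline.

```python
from typing import NamedTuple

class Video(NamedTuple):
    video_id: str
    predicted_watch_minutes: float  # stand-in for whatever engagement score the ranker uses

# Hypothetical output of human review: IDs that moderators decided to suppress.
flagged_by_reviewers = {"vid_conspiracy_17"}

def recommend(candidates: list[Video], flagged: set[str], demote_factor: float = 0.0) -> list[Video]:
    """Rank by predicted engagement, but zero out (or demote) anything reviewers flagged."""
    def score(video: Video) -> float:
        if video.video_id in flagged:
            return video.predicted_watch_minutes * demote_factor
        return video.predicted_watch_minutes
    return sorted(candidates, key=score, reverse=True)

candidates = [Video("vid_cooking_02", 5.0), Video("vid_conspiracy_17", 30.0)]
print(recommend(candidates, flagged_by_reviewers))
# Without the flag, the 30-minute conspiracy video wins the slot; with it, it drops to the bottom.
```

As the next few posts point out, this only does anything if someone actually staffs and empowers the review side, and it leaves the underlying engagement objective untouched.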

Quorum
Sep 24, 2014

REMIND ME AGAIN HOW THE LITTLE HORSE-SHAPED ONES MOVE?

Z. Autobahn posted:

What you need isn't just algorithms, it's active moderation to suppress that content, i.e. gate-keeping.

Yep, and silicon valley in general is absolutely allergic to that. The whole mindset that a better algorithm will fix everything means that they view "hiring people to make decisions" as giving up, and refuse to do it.

Z. Autobahn
Jul 20, 2004

colonel tigh more like colonel high

Quorum posted:

Yep, and silicon valley in general is absolutely allergic to that. The whole mindset that a better algorithm will fix everything means that they view "hiring people to make decisions" as giving up, and refuse to do it.

It's not just that (though it's a big part of it); you could absolutely make algorithms that filter out clickbait and bury provocative, controversial content. But that content is an absolutely vital part of the attention economy, so 'fixing it' is directly at odds with their business model. Like, Twitter is a conflict engine; that's what fundamentally drives its engagement, is people arguing, harassing, and attacking each other, and then being angry at things other people say. A "better" Twitter is an unprofitable one.

Basically, asking these companies to fix their algorithms is like asking drug dealers to sell less addictive drugs. There's zero chance of them doing it unless forced by law.

evilweasel
Aug 24, 2002

Z. Autobahn posted:

The thing about Youtube's algorithms is that they're fundamentally susceptible to clickbait and provocation, which in turn lets provocative content rise to the top. There was an article about how going down a Youtube rabbit hole on ANY subject leads to radicalization, like if you start looking up vegetarian recipes you'll end up at furiously political veganism, if you look up running you'll end up at medically-dubious ultra-running, etc etc.

The problem isn't that the algorithms prioritize conspiracy content, it's that people are drawn to shocking, provocative and extreme content, and any model that relies on curating based on clicks is going to push those to the top. You can see this in the children's Youtube debacle of last year, where basically "kids naturally click on hosed up poo poo that makes them scared and uncomfortable", so Youtube kids was full of like CG Pregnant Batman Getting a Root Canal. What you need isn't just algorithms, it's active moderation to suppress that content, i.e. gate-keeping.

You need algorithms that are not biased towards clickbait and provocation. The algorithm prioritizing this stuff is part of the problem; adding in a moderation layer won't fix that.

Z. Autobahn posted:

It's not just that (though it's a big part of it); you could absolutely make algorithms that filter out clickbait and bury provocative, controversial content. But that content is an absolutely vital part of the attention economy, so 'fixing it' is directly at odds with their business model. Like, Twitter is a conflict engine; that's what fundamentally drives its engagement, is people arguing, harassing, and attacking each other, and then being angry at things other people say. A "better" Twitter is an unprofitable one.

Basically, asking these companies to fix their algorithms is like asking drug dealers to sell less addictive drugs. There's zero chance of them doing it unless forced by law.

Yeah, I agree with this.

Kale
May 14, 2010

Z. Autobahn posted:

The thing about Youtube's algorithms is that they're fundamentally susceptible to clickbait and provocation, which in turn lets provocative content rise to the top. There was an article about how going down a Youtube rabbit hole on ANY subject leads to radicalization, like if you start looking up vegetarian recipes you'll end up at furiously political veganism, if you look up running you'll end up at medically-dubious ultra-running, etc etc.

The problem isn't that the algorithms prioritize conspiracy content, it's that people are drawn to shocking, provocative and extreme content, and any model that relies on curating based on clicks is going to push those to the top. You can see this in the children's Youtube debacle of last year, where basically "kids naturally click on hosed up poo poo that makes them scared and uncomfortable", so Youtube kids was full of like CG Pregnant Batman Getting a Root Canal. What you need isn't just algorithms, it's active moderation to suppress that content, i.e. gate-keeping.

I feel like this used to be a basic, understood concept of how the internet is supposed to work that's given way to pure profit motive. Places like Gamefaqs didn't use to be quite the shitholes they are now, that I can recall, but they also didn't use to be owned by literal CBS, who haven't shown they give a solitary gently caress about ensuring proper moderation so the site doesn't have a garbage toxic community. Same with Youtube: I don't remember it having quite the problems it does now before google bought it. Their model for copyright strikes is still the most ridiculous guilty-until-proven-innocent backwards nonsense as well, one that gives all the onus to the user and no real responsibility to media corporations other than to just make the claim.

Apparently the disaster in Myanmar with Facebook launching there and helping the military to disseminate hate against the Muslim population has resulted in them upping their content moderators from like literally one person that barely spoke Burmese to something almost resembling a proper team, but still nowhere near enough. In fact the very first thing that pops up when you search Myanmar on google is stories about how badly Facebook hosed that country and the meager steps they are taking to try to fix it.

Slow News Day
Jul 4, 2007

evilweasel posted:

Their algorithms specifically prioritize conspiracy and radicalization content. To fix it they'd need to stop prioritizing videos in the way they do now, which is based on how many of them you're likely to watch at once or something similar iirc. They know their algorithms do this, they know it's a problem, but they're not willing to do fundamental changes because keeping you glued to the screen maximizes ad revenue and if the way to do that is conspiracy/radicalization videos then well those dump trucks don't fill themselves with money.

I don't know if ad revenue has to enter the picture, necessarily. Even if they were, say, a subscription-based ecosystem, they would behave the way they do right now. Meaning, any content platform wants to prioritize the content that is relevant to the user because that's how they retain users, and the only way they can reliably measure the relevance of a given piece of content is to measure how much the user engages with it. This is fine when content is benign (e.g. if you watch a lot of cat videos then Google will recommend more to you, which is probably a good thing?), but becomes a massive problem when you start to consider content that has a detrimental impact on both the user and on society at large.

Slow News Day fucked around with this message at 20:55 on Dec 19, 2018

haveblue
Aug 15, 2005



Toilet Rascal

Kale posted:

Apparently the disaster in Myanmar with Facebook launching there and helping the military to disseminate hate against the Muslim population has resulted in them upping their content moderators from like literally one person that barely spoke Burmese to something almost resembling a proper team, but still nowhere near enough. In fact the very first thing that pops up when you search Myanmar on google is stories about how badly Facebook hosed that country and the meager steps they are taking to try to fix it.

This is another fundamental problem with the online model- they could not possibly exist at their current scale with comprehensive human moderation. There aren't enough employees on the planet to review every video and curate every list.

I remember an article a few years ago about Facebook's content moderation team; the job destroys you mentally from all the horrifying poo poo you have to investigate.

Jarmak
Jan 24, 2005

evilweasel posted:

You need algorithms that are not biased towards clickbait and provocation. The algorithm prioritizing this stuff is part of the problem; adding in a moderation layer won't fix that.


Yeah, I agree with this.

I don't necessarily disagree but I think it's important to make the distinction that the algorithms aren't necessarily biased towards clickbait and provocation. The algorithms are biased towards what people want to watch... people are biased towards clickbait and provocation.

Taerkar
Dec 7, 2002

kind of into it, really

Also moderating the shitpit that is some of the darker parts of the internet breaks people.

evilweasel
Aug 24, 2002

Jarmak posted:

I don't necessarily disagree but I think it's important to make the distinction that the algorithms aren't necessarily biased towards clickbait and provocation. The algorithms are biased towards what people want to watch... people are biased towards clickbait and provocation.

i don't think that's correct. they're monitoring engagement and other things that bias follow-up videos that people did not search for. If you biased it towards what was relevant to what was initially searched for, you'd potentially fix this issue. Nutjob conspiracy movies coming on after someone searched for a fact-based video is a flawed algorithm that is prioritizing hooking the viewer over giving the viewer what they searched for.
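
A sketch of the fix evilweasel is gesturing at: score follow-up/up-next candidates on a blend of similarity to the original search and predicted engagement, rather than engagement alone. The similarity measure, the 0.7 weight, and the numbers in the example are assumptions for illustration, not anything YouTube has published.

```python
def followup_score(query_similarity: float,
                   predicted_watch_minutes: float,
                   relevance_weight: float = 0.7) -> float:
    """Blend relevance to the original search with predicted engagement.

    A higher relevance_weight keeps autoplay/up-next anchored to what the
    viewer actually searched for instead of whatever hooks them longest.
    """
    engagement = predicted_watch_minutes / 60.0  # rough normalization to roughly [0, 1]
    return relevance_weight * query_similarity + (1 - relevance_weight) * engagement

# Hypothetical numbers: a closely related lecture vs. a barely related conspiracy video.
print(followup_score(query_similarity=0.9, predicted_watch_minutes=6.0))   # ~0.66
print(followup_score(query_similarity=0.2, predicted_watch_minutes=27.0))  # ~0.28
```

With the weight shifted toward relevance, the barely related conspiracy video can no longer win the slot just by being maximally engaging; push the weight toward zero and you are back to the behavior the thread is complaining about.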

Kale
May 14, 2010

Z. Autobahn posted:

Like, Twitter is a conflict engine; that's what fundamentally drives its engagement, is people arguing, harassing, and attacking each other, and then being angry at things other people say. A "better" Twitter is an unprofitable one.

I'm glad I'm not the only one that has determined that this is in fact Twitter's actual business model. Holy gently caress did the level of arguing over the stupidest loving poo poo explode on the internet after twitter took off. Twitter is the social media platform I have the most overall contempt for out of any of them and refuse to even sign up for. I never got a good feeling off the whole "hey let's deliberately limit the number of characters a person can use to communicate a given thought at a time" model, and for me it's the one that ultimately tipped the balance of social media over from possible tool for keeping people in touch to readily exploitable propaganda dissemination platform and corporate method for dividing and conquering people.

haveblue posted:

This is another fundamental problem with the online model- they could not possibly exist at their current scale with comprehensive human moderation. There aren't enough employees on the planet to review every video and curate every list.

I remember an article a few years ago about Facebook's content moderation team; the job destroys you mentally from all the horrifying poo poo you have to investigate.

Yeah, I can only imagine. Still, it's clear from this year alone that they could at least be trying a little harder, and they have started to, at least on the public face of it. It's very likely all just for show though, so they can go back to the various congresses and parliaments and be like "see, we're fixing our poo poo like you asked".

Jarmak
Jan 24, 2005

evilweasel posted:

i don't think that's correct. they're monitoring engagement and other things that bias follow-up videos that people did not search for. If you biased it towards what was relevant to what was initially searched for, you'd potentially fix this issue. Nutjob conspiracy movies coming on after someone searched for a fact-based video is a flawed algorithm that is prioritizing hooking the viewer over giving the viewer what they searched for.

I mean that becomes an almost philosophical argument. Is what the viewer wants what they type in the search box, or what they actually watch?

I guess the more pro-humanity view is it's a trick of psychology that draws attention to the extreme rather than being subconsciously what they really want. I'm just not sure I buy that.

Martian
May 29, 2005

Grimey Drawer
For anyone who is interested, here is a great article from a Dutch newspaper about working as a Facebook moderator (the article is in English): The Misery of Facebook

It's pretty hosed up.

quote:

A subject matter expert teaches Erik what to do: evaluate tickets. The four Dutch people get about 8,000 tickets per day: reported messages about hate, violence, child pornography, self-mutilation. The goal is to assess a minimum of 1,800 per person. Break-time is not work-time: moderators are obliged to log out during breaks. The Dutch are far behind, Erik is told. There are 22,000 tickets on hold, including cries for help from people who are about to commit suicide.

[...]

When Erik has assessed a couple thousand tickets in his third week, he automatically also gets to see videos, without warning. Immediately, he sees a video that will haunt him for a long time. A man in orange overalls hops along on an asphalt road. His legs are cuffed, which is why he moves in small skips. A tank pulls into view and runs over the man. Then the camera zooms in on his flat, bloody remains.

[...]

In the weeks to come, he witnesses every imaginable horror. Torture, executions, stoning. But also challenges. Teenagers who put erasers to their arms until their muscles are visible. Anorexia patients making appeals: for every like they get, they’ll refrain from eating for another hour. ‘And then the likes pour in. From 2 to 5, to 14.’ Videos of Sponge Bob that turn into footage of a decapitation. They’re images that keep lingering in his head. ‘We’re not at all trained to see this kind of misery, day in, day out.’

E: this bit is very relevant for this discussion

quote:

People have to come back as often as possible. That is why Facebook managers go to conferences on influencing behavior. When users open the app, they have to be surprised. The algorithm makes sure of that. Posts that, for whatever reason, stand out, get more attention. Those are the posts that stir emotion and incite response. A nuanced contribution doesn’t score as well as a more explicit op-ed piece. Social media thrive on controversy, as long as the controversy doesn’t center on them.

Martian fucked around with this message at 21:11 on Dec 19, 2018

Z. Autobahn
Jul 20, 2004

colonel tigh more like colonel high

Kale posted:

I'm glad I'm not the only one that has determined that this is in fact Twitter's actual business model. Holy gently caress did the level of arguing over the stupidest loving poo poo explode on the internet after twitter took off. Twitter is the social media platform I have the most overall contempt for out of any of them and refuse to even sign up for. I never got a good feeling off the whole "hey let's deliberately limit the number of characters a person can use to communicate a given thought at a time" model, and for me it's the one that ultimately tipped the balance of social media over from possible tool for keeping people in touch to readily exploitable propaganda dissemination platform and corporate method for dividing and conquering people.

Twitter is uniquely awful both because of the limited communication and because the nature of quote-tweeting means people are constantly amplifying awful takes in order to argue with them, thus spreading them far and wide.

The Glumslinger
Sep 24, 2008

Coach Nagy, you want me to throw to WHAT side of the field?


Hair Elf
https://twitter.com/kylegriffin1/status/1075464561296330752

https://twitter.com/John_Hudson/status/1075460543161475077

JasonV
Dec 8, 2003
https://twitter.com/BrianKarem/status/1075491004252336130

Slow News Day
Jul 4, 2007


"...Perfect." --Trump, probably

Party Plane Jones
Jul 1, 2007

by Reene
Fun Shoe
https://twitter.com/politico/status/1075491725630717952
https://twitter.com/readsludge/status/1075493154177069065
https://twitter.com/nytimesworld/status/1075499277126905860
https://twitter.com/BuzzFeedNews/status/1075499207220314112
https://twitter.com/srl/status/1075503717510717440
https://twitter.com/kylegriffin1/status/1075412944102535168
https://twitter.com/AP/status/1075498247274278913
https://twitter.com/DionNissenbaum/status/1075497100538986496
https://twitter.com/srl/status/1075489659810181120
https://twitter.com/srl/status/1075505606285496320
https://twitter.com/GarrettHaake/status/1075504397671641088
https://twitter.com/bradheath/status/1075502845649137666
https://twitter.com/AP/status/1075503822569648134
https://twitter.com/NYDailyNews/status/1075018854194143233
https://twitter.com/schmangee/status/1075400234027638784
https://twitter.com/CIGIonline/status/1075493513922514944
https://twitter.com/MotherJones/status/1075498023768199168
https://twitter.com/blakehounshell/status/1075495305821470720
https://twitter.com/arappeport/status/1075498839099011073
https://twitter.com/samleecole/status/1075489317546536962
https://i.imgur.com/WXn0b7L.gifv

Bar Ran Dun
Jan 22, 2006




It's probably time to regulate the internet.

awesmoe
Nov 30, 2005

Pillbug

BrandorKP posted:

It's probably time to regulate the internet.
Countries all over the world already do? Oh, you mean regulate it in a way that suits you as a consumer, not them as governments. I've...got some bad news...

Bar Ran Dun
Jan 22, 2006




awesmoe posted:

. I've...got some bad news...

Regulation usually doesn't happen until there is death and blood. We didn't join SOLAS until a goddamn town was exploded.

Herstory Begins Now
Aug 5, 2003
SOME REALLY TEDIOUS DUMB SHIT THAT SUCKS ASS TO READ ->>
I don't even know why anonymity is still a thing on the internet anymore. It hasn't been an actual technical reality for 99% of people for 15 years; instead it's just that internet providers allow a level of privacy for users that wouldn't exist in any other context. Tie people's online conduct to their real-life identities and let them face consequences. Plenty of countries already link online presence to real identity and it doesn't automatically cause a bunch of authoritarian problems.

People don't realize how absolutely unbelievably hostile and cruel the internet is to basically anyone who isn't a white guy, but if you experienced for a day or two the amount of anger and hate directed at any woman or poc who happens to go online you'd think this needs to be dealt with immediately. Ironically there's disproportionately less homophobia now, but, cynically, probably only because white tech dudes can be gay or bi so that one hit close to home enough that they cracked down on it.

Like the idea that you should be able to just log on and send off death and rape threats on twitter to any female journalists with zero fear of any consequence either legally or in your personal life is completely hosed

Herstory Begins Now fucked around with this message at 23:51 on Dec 19, 2018

Silver2195
Apr 4, 2012

Herstory Begins Now posted:

I don't even know why anonymity is still a thing on the internet anymore. It hasn't been an actual technical reality for 99% of people for 15 years; instead it's just that internet providers allow a level of privacy for users that wouldn't exist in any other context. Tie people's online conduct to their real-life identities and let them face consequences. Plenty of countries already link online presence to real identity and it doesn't automatically cause a bunch of authoritarian problems.

People don't realize how absolutely unbelievably hostile and cruel the internet is to basically anyone who isn't a white guy, but if you experienced for a day or two the amount of anger and hate directed at any woman or poc who happens to go online you'd think this needs to be dealt with immediately. Ironically there's disproportionately less homophobia now, but, cynically, probably only because white tech dudes can be gay or bi so that one hit close to home enough that they cracked down on it.

Like the idea that you should be able to just log on and send off death and rape threats on twitter to any female journalists with zero fear of any consequence either legally or in your personal life is completely hosed

This is a terrible idea. Anonymity can provide important protections for women and PoC too.


Shimrra Jamaane
Aug 10, 2007

Obscure to all except those well-versed in Yuuzhan Vong lore.
Remember how for years the adage was that the Internet + anonymity made people into jerks they wouldn’t normally be? Well the last decade of social media has put that idea to rest.
