|
It’s like Murder on the Orient Express, if the entire rest of the world was bribing Trump how could they all be punished?
|
# ? Dec 19, 2018 04:17 |
|
|
https://twitter.com/joshtpm/status/1075212618476990465 Is this gun smoking, or is it just happy to see me?
|
# ? Dec 19, 2018 04:29 |
|
Tibalt posted:The scuttlebutt in the other thread is that Rosneft is no longer owned by a country, and that the Middle East expert was spotted at the appeal, so more likely the KSA sovereign fund. i guess that means the Kush got burned.
|
# ? Dec 19, 2018 04:39 |
|
Dapper_Swindler posted:i guess that means the Kush got burned. So he's in a sticky situation? Sticky Kush? Shimrra Jamaane posted:It’s like Murder on the Orient Express, if the entire rest of the world was bribing Trump how could they all be punished?
|
# ? Dec 19, 2018 05:11 |
|
Kale posted:... The thing is companies like youtube know exactly what they are doing here. They'll often deflect criticism by maybe mumbling that their algorithms are simply giving users the content they most want, and then go on to discuss how it's a difficult "content moderation" issue. Like, oh gosh, the problem is there is just so much of this content out there and how are we to police it? But this is transparently bullshit if you compare, for example, the results returned by a google search (an algorithm optimized for relevance) vs. the results returned by youtube for the same query (an algorithm optimized for clicks and view time). Like I just tried "Islam and the West" as both a google search and in youtube. On the google search, nothing on the first page is overtly crazy, and most links are from at least ostensibly reputable organizations or books. Run the same poo poo through youtube and like right off the bat there are 3-4 crazy alt-right shitlords in the first dozen results including "Top Illuminati Grand Wizard: “We Control Islam and We'll Use It to Destroy the West - WW3". You have to try pretty hard to get Illuminati Grand Wizard conspiracy theories in the first page of a google search for any halfway reasonable topic, but if you click on something as mundane as an Overwatch video in youtube, get ready for a gaggle of nazis in your recommendation tab ranting about feminists. Google knows how to build algorithms which--without much human intervention or moderation--return relevant results and weed out crazy horseshit. Their search engine depends on it. The problem with extremist clickbait on social media is not a challenging engineering problem to be solved, it's a direct result of a conscious decision to optimize advertising revenue no matter what.
|
# ? Dec 19, 2018 07:19 |
|
Absurd Alhazred posted:Is this gun smoking, or is it just happy to see me? No, that's just Giuliani and his cheap cigars again. [your signature is being processed. check the notification bar for updates]
|
# ? Dec 19, 2018 07:31 |
|
Peacoffee posted:It’s probably also fair to say that gamergate was greatly enabled and expanded by these new algorithms and the platforms using them. A part of me has always known all of these crazy beliefs that have been taking hold in Western civilization are somehow interconnected and part of a grand influence campaign, I just never realized the sheer scope of it until people like Mueller and actually good journalists started to put it all together for the greater public. Also somewhat encouraging to see that not all the people that drank the kool-aid have stayed that way and some get out and try to prevent the same thing from happening to others. Then of course you have the 4chan incel type people that I suspect do and think these horrible things and act that way on purpose because they are truly reprehensible human beings at their very core, but the guy mentioned in that article doesn't seem like one of them. Morbus posted:trimmed for post length It doesn't seem like people are buying it as much as they used to if nothing else. At least people like European and Canadian parliaments. I'm sure as long as it's politically convenient the Republican congress will just believe whatever, but I expect Adam Schiff to start looking into it more once the Democrats take over the house intelligence committee and that Mark Zuckerberg and Susan Wojcicki will be seeing a fair bit of D.C. in 2019. Incidentally when I search "Islam and The West" here in Canada I just seem to get panel lectures by Islamic professors that seem like scholarly videos. There's a few that look like Western crazy types but overall it strikes me as kind of balanced. Then again I also have my watch and search history permanently turned off, never like any videos (it drastically affects your search and recommendation results) and am only subscribed to about 5 channels so there's that too. Kale fucked around with this message at 08:06 on Dec 19, 2018 |
# ? Dec 19, 2018 07:53 |
|
Morbus posted:Google knows how to build algorithms which--without much human intervention or moderation--return relevant results and weed out crazy horseshit. Ehh... Better than YouTube? Definitely. But it isn't like Google hasn't had its share of problems. I remember when people were asking google home things like "do women feel love?" and getting MRA stuff about how women are incapable of love.
|
# ? Dec 19, 2018 14:15 |
|
or when that young earth creationist group got the first result for "what killed the dinos?"
|
# ? Dec 19, 2018 17:00 |
|
I remember back in 2009 one of the first results for googling Martin Luther King Jr was a StormFront affiliate. It was probably like that for years.
|
# ? Dec 19, 2018 17:16 |
|
alpha_destroy posted:Ehh... Better than YouTube? Definitely. But it isn't like Google hasn't had its share of problems. I remember when people were asking google home things like "do women feel love?" and getting MRA stuff about how women are incapable of love. There were also some really good autocomplete oopsies regarding Jews and Muslims IIRC
|
# ? Dec 19, 2018 17:21 |
|
alpha_destroy posted:Ehh... Better than YouTube? Definitely. But it isn't like Google hasn't had its share of problems. I remember when people were asking google home things like "do women feel love?" and getting MRA stuff about how women are incapable of love. Google worked on those problems...for the Google search. Not only is it not doing so for Youtube, but it seems pretty clear that youtube is doing this by design rather than an occasional algorithm quirk.
|
# ? Dec 19, 2018 18:11 |
|
evilweasel posted:Google worked on those problems...for the Google search. Not only is it not doing so for Youtube, but it seems pretty clear that youtube is doing this by design rather than an occasional algorithm quirk. I know part of the virtue of the algorithm is that it seems to be great for locking in children, who are a huge part of the current YouTube userbase. My wife teaches elementary school age children and reports the only thing they watch is YouTube. Although I don’t see why they couldn’t have a process in place for their legion of content reviewers to be able to flag content to be skipped over by the algorithm, except that they don’t want to.
|
# ? Dec 19, 2018 19:09 |
|
Not to defend Google, but it's also a lot easier to index and categorize text than it is for video. Especially once you start getting into things like political nuance. They're probably working on it with tools like their Video Intelligence API. But yeah, it's an area that needs to be talked about and looked at.
|
# ? Dec 19, 2018 20:16 |
|
Internet Explorer posted:Not to defend Google, but it's also a lot easier to index and categorize text than it is for video. Especially once you start getting into things like political nuance. They're probably working on it with tools like their Video Intelligence API. But yeah, it's an area that needs to be talking about and needs to be looked at. Their algorithms specifically prioritize conspiracy and radicalization content. To fix it they'd need to stop prioritizing videos in the way they do now, which is based on how many of them you're likely to watch at once or something similar iirc. They know their algorithms do this, they know it's a problem, but they're not willing to do fundamental changes because keeping you glued to the screen maximizes ad revenue and if the way to do that is conspiracy/radicalization videos then well those dump trucks don't fill themselves with money.
|
# ? Dec 19, 2018 20:22 |
|
evilweasel posted:Their algorithms specifically prioritize conspiracy and radicalization content. To fix it they'd need to stop prioritizing videos in the way they do now, which is based on how many of them you're likely to watch at once or something similar iirc. They know their algorithms do this, they know it's a problem, but they're not willing to do fundamental changes because keeping you glued to the screen maximizes ad revenue and if the way to do that is conspiracy/radicalization videos then well those dump trucks don't fill themselves with money. That is all almost certainly true but doesn't change, disagree with, or disprove what I said. That's why I said it needs to be talked about and looked into.
|
# ? Dec 19, 2018 20:29 |
|
The thing about Youtube's algorithms is that they're fundamentally susceptible to clickbait and provocation, which in turn lets provocative content rise to the top. There was an article about how going down a Youtube rabbit hole on ANY subject leads to radicalization, like if you start looking up vegetarian recipes you'll end up at furiously political veganism, if you look up running you'll end up at medically-dubious ultra-running, etc etc. The problem isn't that the algorithms prioritize conspiracy content, it's that people are drawn to shocking, provocative and extreme content, and any model that relies on curating based on clicks is going to push those to the top. You can see this in the children's Youtube debacle of last year, where basically "kids naturally click on hosed up poo poo that makes them scared and uncomfortable", so Youtube kids was full of like CG Pregnant Batman Getting a Root Canal. What you need isn't just algorithms, it's active moderation to suppress that content, i.e. gate-keeping.
|
# ? Dec 19, 2018 20:33 |
|
Z. Autobahn posted:What you need isn't just algorithms, it's active moderation to suppress that content, i.e. gate-keeping. Yep, and silicon valley in general is absolutely allergic to that. The whole mindset that a better algorithm will fix everything means that they view "hiring people to make decisions" as giving up, and refuse to do it.
|
# ? Dec 19, 2018 20:39 |
|
Quorum posted:Yep, and silicon valley in general is absolutely allergic to that. The whole mindset that a better algorithm will fix everything means that they view "hiring people to make decisions" as giving up, and refuse to do it. It's not just that (though it's a big part of it); you could absolutely make algorithms that filter out clickbait and bury provocative, controversial content. But that content is an absolutely vital part of the attention economy, so 'fixing it' is directly at odds with their business model. Like, Twitter is a conflict engine; what fundamentally drives its engagement is people arguing, harassing, and attacking each other, and then being angry at things other people say. A "better" Twitter is an unprofitable one. Basically, asking these companies to fix their algorithms is like asking drug dealers to sell less addictive drugs. There's zero chance of them doing it unless forced by law.
|
# ? Dec 19, 2018 20:44 |
|
Z. Autobahn posted:The thing about Youtube's algorithms is that they're fundamentally susceptible to clickbait and provocation, which in turns lets provocative content rise to the top. There was an article about how going down a Youtube rabbit hole on ANY subject leads to radicalization, like if you start looking up vegetarian recipes you'll end up at furiously political veganism, if you look up running you'll end up at medically-dubious ultra-running, etc etc. You need algorithms that are not biased towards clickbait and provocation. The algorithm prioritizing this stuff is part of the problem, adding in a moderation layer won't fix that. Z. Autobahn posted:It's not just that (though it's a big part of it); you could absolutely make algorithms that filter out clickbait and bury provocative, controversial content. But that content is an absolutely vital part of the attention economy, so 'fixing it' is directly at odds with their business model. Like, Twitter is a conflict engine; that's what fundamentally drives its engagement, is people arguing, harassing, and attacking each other, and then being angry at things other people say. A "better" Twitter is an unprofitable one. Yeah, I agree with this.
|
# ? Dec 19, 2018 20:44 |
|
Z. Autobahn posted:The thing about Youtube's algorithms is that they're fundamentally susceptible to clickbait and provocation, which in turns lets provocative content rise to the top. There was an article about how going down a Youtube rabbit hole on ANY subject leads to radicalization, like if you start looking up vegetarian recipes you'll end up at furiously political veganism, if you look up running you'll end up at medically-dubious ultra-running, etc etc. I feel like this used to be a basic understood concept of how the internet is supposed to work that's given way to just pure profit motive. Places like Gamefaqs didn't use to be quite the shitholes they are now that I can recall, but they didn't use to be owned by literal CBS, who haven't shown they give a solitary gently caress about ensuring proper moderation so the site doesn't have a garbage toxic community. Same with Youtube, I don't remember it having quite the problems it does now before google bought it. Their model for copyright strikes is still the most ridiculous guilty-until-proven-innocent backwards nonsense as well that gives all the onus to the user and no real responsibility to media corporations other than to just make the claim. Apparently the disaster in Myanmar with Facebook launching there and helping the military to disseminate hate against the Muslim population has resulted in them upping their content moderators from literally one person that barely spoke Burmese to something almost resembling a proper team, but still nowhere near enough. In fact the very first things that pop up when you search Myanmar on google are stories about how badly Facebook hosed that country and the meager steps they are taking to try to fix it.
|
# ? Dec 19, 2018 20:46 |
|
evilweasel posted:Their algorithms specifically prioritize conspiracy and radicalization content. To fix it they'd need to stop prioritizing videos in the way they do now, which is based on how many of them you're likely to watch at once or something similar iirc. They know their algorithms do this, they know it's a problem, but they're not willing to do fundamental changes because keeping you glued to the screen maximizes ad revenue and if the way to do that is conspiracy/radicalization videos then well those dump trucks don't fill themselves with money. I don't know if ad revenue has to enter the picture, necessarily. Even if they were, say, a subscription-based ecosystem, they would behave the way they do right now. Meaning, any content platform wants to prioritize the content that is relevant to the user because that's how they retain users, and the only way they can reliably measure the relevance of a given piece of content is to measure how much the user engages with it. This is fine when content is benign (e.g. if you watch a lot of cat videos then Google will recommend more to you, which is probably a good thing?), but becomes a massive problem when you start to consider content that has a detrimental impact on both the user and on society at large. Slow News Day fucked around with this message at 20:55 on Dec 19, 2018 |
# ? Dec 19, 2018 20:48 |
|
Kale posted:Apparently the disaster in Myanmar with Facebook launching there and helping the military to disseminate hate against the Muslim population has resulted in them upping their content moderators from like literally one person that barely spoke Burmese to something almost resembling a proper team but still nowhere near enough. In fact the very first information that pops up when you search Myanmar on google are stories about how badly Facebook hosed that country and the meager steps they are taking to try to fix it. This is another fundamental problem with the online model- they could not possibly exist at their current scale with comprehensive human moderation. There aren't enough employees on the planet to review every video and curate every list. I remember an article a few years ago about Facebook's content moderation team; the job destroys you mentally from all the horrifying poo poo you have to investigate.
|
# ? Dec 19, 2018 20:50 |
|
evilweasel posted:You need algorithms that are not biased towards clickbait and provocation. The algorithm prioritizing this stuff is part of the problem, adding in a moderation layer won't fix that. I don't necessarily disagree but I think it's important to make the distinction that the algorithms aren't necessarily biased towards clickbait and provocation. The algorithms are biased towards what people want to watch... people are biased towards clickbait and provocation.
|
# ? Dec 19, 2018 20:52 |
|
Also moderating the shitpit that is some of the darker parts of the internet breaks people.
|
# ? Dec 19, 2018 20:52 |
|
Jarmak posted:I don't necessarily disagree but I think it's important to make the distinction that the algorithms aren't necessarily biased towards clickbait and provocation. The algorithms are biased towards what people want to watch... people are biased towards clickbait and provocation. i don't think that's correct. they're monitoring engagement and other things that bias followup videos that people did not search for. If you biased it towards what was relevant to what was initially searched for you'd potentially fix this issue. Nutjob conspiracy movies coming on after someone searched for a fact-based video is a flawed algorithm that is prioritizing hooking the viewer over giving the viewer what they searched for.
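The distinction being argued here can be sketched in a few lines: the same pool of candidate videos produces very different top results depending on whether you sort by relevance to the query or by predicted engagement. Everything below (titles, scores) is invented purely for illustration; this is a toy, not how any real recommender is implemented.

```python
# Toy sketch: one candidate pool, two ranking objectives.
# Each candidate: (title, relevance_to_query, predicted_watch_minutes)
# All values are made up for illustration.
candidates = [
    ("Moon landing documentary (archival footage)", 0.95, 6.0),
    ("Moon landing Q&A with a NASA historian",      0.90, 5.0),
    ("THE MOON LANDING WAS FAKED (PROOF!!)",        0.40, 22.0),
    ("Flat Earth vs NASA: the REAL story",          0.20, 18.0),
]

# Rank by how well each video matches what was searched for.
ranked_by_relevance = sorted(candidates, key=lambda v: v[1], reverse=True)

# Rank by how long each video is expected to keep the viewer watching.
ranked_by_engagement = sorted(candidates, key=lambda v: v[2], reverse=True)

print("Relevance-first:", [v[0] for v in ranked_by_relevance[:2]])
print("Engagement-first:", [v[0] for v in ranked_by_engagement[:2]])
```

With the same inputs, the engagement-first sort surfaces the conspiracy titles simply because they hold attention longer, which is the "flawed algorithm" complaint in a nutshell: the objective function, not any explicit preference for conspiracies, does the work.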
|
# ? Dec 19, 2018 20:54 |
|
Z. Autobahn posted:Like, Twitter is a conflict engine; that's what fundamentally drives its engagement, is people arguing, harassing, and attacking each other, and then being angry at things other people say. A "better" Twitter is an unprofitable one. I'm glad I'm not the only one that has determined that this is in fact Twitter's actual business model. Holy gently caress did the level of arguing over the stupidest loving poo poo explode on the internet after twitter took off. Twitter is the social media platform I have the most overall contempt for out of any of them and refuse to even sign up for. I never got a good feeling off the whole "hey let's deliberately limit the number of characters a person can use to communicate a given thought at a time" model, and for me it's the one that ultimately tipped the balance of social media over from possible tool for keeping people in touch to readily exploitable propaganda dissemination platform and corporate method for dividing and conquering people. haveblue posted:This is another fundamental problem with the online model- they could not possibly exist at their current scale with comprehensive human moderation. There aren't enough employees on the planet to review every video and curate every list. Yeah I can only imagine. Still it's clear from this year alone that they can at least be trying a little harder, and they have started to, at least on the public face of it. It's very likely all just for show though so they can go back to various congresses and parliaments and be like "see we're fixing our poo poo like you asked".
|
# ? Dec 19, 2018 21:01 |
|
evilweasel posted:i don't think that's correct. they're monitoring engagement and other things that bias followup videos that people did not search for. If you biased it towards what was relevant to what was initially searched for you'd potentially fix this issue. Nutjob conspiracy movies coming on after someone searched for a factual based video is a flawed algorithm that is prioritizing hooking the viewer over giving the viewer what they searched for. I mean that becomes an almost philosophical argument. Is what the viewer wants what they type in the search box or what they actually watch? I guess the more pro-humanity view is that it's a trick of psychology that draws attention to the extreme rather than being subconsciously what they really want. I'm just not sure I buy that.
|
# ? Dec 19, 2018 21:03 |
|
For anyone interested, here is a great article from a Dutch newspaper about working as a Facebook moderator (the article is in English): The Misery of Facebook It's pretty hosed up. quote:A subject matter expert teaches Erik what to do: evaluate tickets. The four Dutch people get about 8,000 tickets per day: reported messages about hate, violence, child pornography, automutilation. The goal is to assess a minimum of 1,800 per person. Break-time is not work-time: moderators are obliged to log out during breaks. The Dutch are far behind, Erik is told. There are 22,000 tickets on hold, including cries for help from people who are about to commit suicide. E: this bit is very relevant for this discussion quote:People have to come back as often as possible. That is why Facebook managers go to conferences on influencing behavior. When users open the app, they have to be surprised. The algorithm makes sure of that. Posts that, for whatever reason, stand out, get more attention. Those are the posts that stir emotion and incite response. A nuanced contribution doesn’t score as well as a more explicit op-ed piece. Social media thrive on controversy, as long as the controversy doesn’t center on them. Martian fucked around with this message at 21:11 on Dec 19, 2018 |
# ? Dec 19, 2018 21:09 |
|
Kale posted:I'm glad I'm not the only one that has determined that this is in fact twitters actual business model. Holy gently caress did the level of arguing over the stupidest loving poo poo explode on the internet after twitter took off. Twitter is the social media platform I have the most overall contempt for out of any of them and refuse to even sign up for. I never got a good feeling off the whole "hey let's deliberately limit the number of characters a person can use to communicate a given thought at a time" model, and for me it's the one that ultimately tipped the balance of social media over from possible tool for keeping people in touch to readily exploitable propaganda dissemination platform and corporate method for dividing and conquering people. Twitter is uniquely awful both because of the limited communication and because the nature of quote-tweeting means people are constantly amplifying awful takes in order to argue with them, thus spreading them far and wide.
|
# ? Dec 19, 2018 21:13 |
|
https://twitter.com/kylegriffin1/status/1075464561296330752 https://twitter.com/John_Hudson/status/1075460543161475077
|
# ? Dec 19, 2018 21:35 |
|
https://twitter.com/BrianKarem/status/1075491004252336130
|
# ? Dec 19, 2018 21:40 |
|
"...Perfect." --Trump, probably
|
# ? Dec 19, 2018 21:49 |
|
It's probably time to regulate the internet.
|
# ? Dec 19, 2018 23:04 |
|
BrandorKP posted:It's probably time to regulate the internet.
|
# ? Dec 19, 2018 23:22 |
|
awesmoe posted:. I've...got some bad news... Regulation usually doesn't happen until there is death and blood. We didn't join SOLAS until a goddamn town was exploded.
|
# ? Dec 19, 2018 23:30 |
|
I don't even know why anonymity is even a thing on the internet anymore. It hasn't been an actual technical reality for 99% of people for 15 years, instead it's just that internet providers allow a level of privacy for users that wouldn't exist in any other context. Tie peoples' online conduct to their real life identities and let them face consequences. Plenty of countries already link online presence to real identity and it doesn't automatically cause a bunch of authoritarian problems. People don't realize how absolutely unbelievably hostile and cruel the internet is to basically anyone who isn't a white guy, but if you experienced for a day or two the amount of anger and hate directed at any woman or poc who happens to go online you'd think this needs to be dealt with immediately. Ironically there's disproportionately less homophobia now, but, cynically, probably only because white tech dudes can be gay or bi so that one hit close to home enough that they cracked down on it. Like the idea that you should be able to just log on and send off death and rape threats on twitter to any female journalists with zero fear of any consequence either legally or in your personal life is completely hosed Herstory Begins Now fucked around with this message at 23:51 on Dec 19, 2018 |
# ? Dec 19, 2018 23:49 |
|
Herstory Begins Now posted:I don't even know why anonymity is even a thing on the internet anymore. It hasn't been an actual technical reality for 99% of people for 15 years, instead it's just that internet providers allow a level of privacy for users that wouldn't exist in any other context. Tie peoples' online conduct to their real life identities and let them face consequences. Plenty of countries already link online presence to real identity and it doesn't automatically cause a bunch of authoritarian problems. This is a terrible idea. Anonymity can provide important protections for women and PoC too.
|
# ? Dec 19, 2018 23:57 |
|
|
Remember how for years the adage was that the Internet + anonymity made people into jerks they wouldn’t normally be? Well the last decade of social media has put that idea to rest.
|
# ? Dec 20, 2018 00:08 |