  • Locked thread
Strategic Tea
Sep 1, 2012

I should probably clarify I'm getting more at their use of grand ideologies to ~make the hard choices~ via number crunching. I'm sure guys like Hanson are doing perfectly interesting thought experiments with a side order of foot-in-mouth, but let's not pretend there aren't a tonne of 'rationalists' who are far more interested in why THAT drat BITCH cheating on them is worse than rape because of the 'biological harm' it causes or its dollar value to the economy.


Dracula Factory
Sep 7, 2007


I think the idea of utilitarianism is just fine until you start trying to quantify things like happiness and well-being with hard numbers and math. It seems like it should be more of a helpful guideline rather than an absolute directive.

Telarra
Oct 9, 2012

Xander77 posted:

Is that a problem now? Are we sliding into... eh, let's not name fallacies, so as to avoid the similarity of expression. Is everything people like Yud like now tainted by association?

Yud likes utilitarianism because, like Bayesianism, it means he can justify anything he wants by pulling appropriately-sized numbers out of his rear end.

Cardiovorax
Jun 5, 2011

I mean, if you're a successful actress and you go out of the house in a skirt and without underwear, knowing that paparazzi are just waiting for opportunities like this and that it has happened many times before, then there's really nobody you can blame for it but yourself.
Utilitarianism is a great method for making ethical decisions in any situation where you need to weigh degrees of harm and usefulness. It's limited in daily use by virtue of running on ethical calculus instead of human sensibilities, but still. Yet another decent idea reduced to absurdity by The Yud.

SolTerrasa
Sep 2, 2011

Dracula Factory posted:

I think the idea of utilitarianism is just fine until you start trying to quantify things like happiness and well-being with hard numbers and math. It seems like it should be more of a helpful guideline rather than an absolute directive.

And I think you probably can quantify things like happiness and well-being in specific, concrete cases, with the understanding that your computation varies per person and has huge error bars. In the abstract, you're totally right.

Let's not halo-effect this stuff; it's not like we all think AI is stupid now that Big Yud likes it. Definitely mock bad uses of utilitarianism (there are plenty on LessWrong; up-arrow notation is a good sign that you've found one), but I think the concept is sound.

Ugly In The Morning
Jul 1, 2010
Pillbug

SolTerrasa posted:


Let's not halo-effect this stuff; it's not like we all think AI is stupid now that Big Yud likes it. Definitely mock bad uses of utilitarianism (there are plenty on LessWrong; up-arrow notation is a good sign that you've found one), but I think the concept is sound.

The best example of it is their "dust speck in 50,000 people's eyes vs. torturing a man for 50 years" thing.

Strategic Tea
Sep 1, 2012

I think you mean 3^^^3 years :smuggo:

SolTerrasa
Sep 2, 2011

Ugly In The Morning posted:

The best example of it is their "dust speck in 50,000 people's eyes vs. torturing a man for 50 years" thing.

The mantra they use to convince themselves that they are not horrible monsters is "shut up and multiply", as in "your dislike of torture is wrong because of large numbers." Where a normal person would take this lesson to mean "hm, utilitarianism might not make sense at such a large scale, I can't imagine EVER finding dust specks comparable to torture no matter what the number of people", they take the exact opposite lesson.

E: the actual case is 3^^^3 dust specks in the eyes of that many people, or one person tortured for 50 years. They say that the latter is preferable because math. Non-monsters say that the former is preferable because there is no absolute scale of suffering, and it's a type error to compare the two.

SolTerrasa fucked around with this message at 21:13 on Sep 30, 2014
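For anyone who hasn't run into it, Knuth's up-arrow notation is the thing being mocked here: each extra arrow iterates the operation below it, so the numbers explode almost immediately. A minimal sketch (function and variable names are mine, not anything from LessWrong):

```python
def arrow(a, n, b):
    """Knuth's up-arrow a (n arrows) b: one arrow is exponentiation,
    each extra arrow iterates the operation below it."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return arrow(a, n - 1, arrow(a, n, b - 1))

# Only tiny cases are computable; 3^^^3 already dwarfs the
# number of atoms in the observable universe.
print(arrow(3, 1, 3))  # 27, i.e. 3^3
print(arrow(3, 2, 3))  # 7625597484987, i.e. 3^(3^3) = 3^27
```

That's the whole rhetorical trick: 3^^^3 is a tower of exponents far too tall to write down, so multiplying any nonzero "dust speck disutility" by it swamps anything finite.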

Pavlov
Oct 21, 2012

I've long been fascinated with how the alt-right develops elaborate and obscure dog whistles to try to communicate their meaning without having to say it out loud
Stepan Andreyevich Bandera being the most prominent example of that

Ugly In The Morning posted:

The best example of it is their "dust speck in 50,000 people's eyes vs. torturing a man for 50 years" thing.

atelier morgan
Mar 11, 2003

super-scientific, ultra-gay

Lipstick Apathy

SolTerrasa posted:

The mantra they use to convince themselves that they are not horrible monsters is "shut up and multiply", as in "your dislike of torture is wrong because of large numbers." Where a normal person would take this lesson to mean "hm, utilitarianism might not make sense at such a large scale, I can't imagine EVER finding dust specks comparable to torture no matter what the number of people", they take the exact opposite lesson.

There's also the small problem that the whole idea of utilons as an atomic unit of utility that you can just multiply is a twisted and wrong misinterpretation of Bentham, and Mill (who was far more influential on every other philosopher of utilitarianism) started out in 1861 by stating unequivocally that utility cannot be defined as strictly quantifiable under any circumstances.

It's been a strawman of utilitarianism for 150 years.

Lightanchor
Nov 2, 2012
All utilitarianism, no matter how good, is bad.

SolTerrasa
Sep 2, 2011

Lightanchor posted:

All utilitarianism, no matter how good, is bad.

Why do you think so? I am really interested to know what you mean by that.

atelier morgan
Mar 11, 2003


Lightanchor posted:

All utilitarianism, no matter how good, is bad.

Given how much I hate metaethics I don't even disagree with you.

pentyne
Nov 7, 2012

SolTerrasa posted:

Why do you think so? I am really interested to know what you mean by that.

My guess? It reduces everything abstract and unique about the human condition to an Excel table where you want the box at the bottom to have the highest possible value, coldly manipulating everything possible regardless of any other negative outcome, harm, or injury it causes, because making that value the greatest trumps all other concerns.

Yud and anyone like him espousing utilitarianism use it more as an excuse for why their ideas should be followed than as a comprehensive set of philosophical goals to work towards. Oh, and for all of Yud's proclamations about such things, he still spends more time writing Harry Potter self-insert fanfiction than saving the future of the world with his AI research.

atelier morgan
Mar 11, 2003


pentyne posted:

My guess? It reduces everything abstract and unique about the human condition to an Excel table where you want the box at the bottom to have the highest possible value, coldly manipulating everything possible regardless of any other negative outcome, harm, or injury it causes, because making that value the greatest trumps all other concerns.

Yud and anyone like him espousing utilitarianism use it more as an excuse for why their ideas should be followed than as a comprehensive set of philosophical goals to work towards. Oh, and for all of Yud's proclamations about such things, he still spends more time writing Harry Potter self-insert fanfiction than saving the future of the world with his AI research.

Much like his timeless formulation of quantum, Big Yud's formulation of utilitarianism is so inaccurate it's not even wrong.

Which is one way to be less wrong!

Toph Bei Fong
Feb 29, 2008



UberJew posted:

There's also the small problem that the whole idea of utilons as an atomic unit of utility that you can just multiply is a twisted and wrong misinterpretation of Bentham, and Mill (who was far more influential on every other philosopher of utilitarianism) started out in 1861 by stating unequivocally that utility cannot be defined as strictly quantifiable under any circumstances.

It's been a strawman of utilitarianism for 150 years.

To expand on this, what Yudkowsky is saying about utilitarianism is basically the exact opposite of what it was intended to mean.

J.S. Mill's Utilitarianism posted:

I must again repeat, what the assailants of utilitarianism seldom have the justice to acknowledge, that the happiness which forms the utilitarian standard of what is right in conduct, is not the agent's own happiness, but that of all concerned. As between his own happiness and that of others, utilitarianism requires him to be as strictly impartial as a disinterested and benevolent spectator. In the golden rule of Jesus of Nazareth, we read the complete spirit of the ethics of utility. To do as one would be done by, and to love one's neighbour as oneself, constitute the ideal perfection of utilitarian morality.

And as Christ would gladly accept being nailed to a cross for the sins of the world, let alone some loving dust in his eye... Yudkowsky has it backwards, basically. The people would gladly accept a brief flash of inconvenience to save someone from years of torture, because none of them would want to be in his place, and would want to lessen his suffering by any means possible. The 50 years would be diluted to some mere specks of dust in everyone's eyes. Not that such a specific case could ever arise in the real world, anyways, and Yudkowsky's hypothetical is basically an excuse to say "I don't give a poo poo about people in 3rd world countries because they provide me with computer parts at low prices" or "gently caress minorities, I have my own problems", which, again, is the exact misinterpretation that Mill argues against in chapter 2 of Utilitarianism. Mill himself was a strident advocate for women's rights, anti-slavery, and universal education.

I'm not a utilitarian either, but people twisting philosophy annoys me... There are plenty of problems with utilitarianism as a moral system that don't come from treating The Ones Who Walk Away from Omelas as though it were a well thought out policy decision.

Cardiovorax
Jun 5, 2011


pentyne posted:

My guess? It reduces everything abstract and unique about the human condition to an Excel table where you want the box at the bottom to have the highest possible value, coldly manipulating everything possible regardless of any other negative outcome, harm, or injury it causes, because making that value the greatest trumps all other concerns.
I think that's not really fair. In some situations, a cold and distanced judgment about benefits and detriments is exactly what you need. When emotions are going strong, people can't always judge what's good for them or for others, and sometimes you need to make a decision where someone will be harmed no matter what and figuring out the lesser evil is a necessity. Utilitarian ethics are very good at doing just that.

pentyne
Nov 7, 2012

Cardiovorax posted:

I think that's not really fair. In some situations, a cold and distanced judgment about benefits and detriments is exactly what you need. When emotions are going strong, people can't always judge what's good for them or for others, and sometimes you need to make a decision where someone will be harmed no matter what and figuring out the lesser evil is a necessity. Utilitarian ethics are very good at doing just that.

There's definitely some merit in an outside agency making the best choice in a bad situation. History is replete with examples where hard decisions had to be made that caused a lot of suffering but led to an overall positive outcome. But Yud just assumes the entire human experience comes down to a series of numbers, that you can min/max those numbers like an RPG and quantify units of suffering, and that on the resulting "suffering balance sheet" one person tortured endlessly for years is preferable to a lot of people mildly inconvenienced for a moment.
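The "suffering balance sheet" point can be made concrete with a toy sketch. The disutility numbers below are invented purely for illustration; the objection is precisely that putting specks and torture on one linear scale is the broken assumption:

```python
# Hypothetical disutilities on a single linear scale -- the contested premise.
SPECK = 1e-9      # one dust speck in one eye (invented number)
TORTURE = 1e7     # fifty years of torture (invented number)

def naive_total(per_person_harm, n_people):
    """The 'shut up and multiply' step: total harm = harm x headcount."""
    return per_person_harm * n_people

# Given any nonzero speck value, a big enough crowd "outweighs" torture:
print(naive_total(SPECK, 10**17) > TORTURE)  # True
```

Reject the single scale (treat the comparison as a type error) and the multiplication never gets off the ground.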

Cardiovorax
Jun 5, 2011


pentyne posted:

There's definitely some merit in an outside agency making the best choice in a bad situation. History is replete with examples where hard decisions had to be made that caused a lot of suffering but led to an overall positive outcome. But Yud just assumes the entire human experience comes down to a series of numbers, that you can min/max those numbers like an RPG and quantify units of suffering, and that on the resulting "suffering balance sheet" one person tortured endlessly for years is preferable to a lot of people mildly inconvenienced for a moment.
Yeah, that's why he's an idiot. Not even utilitarians would agree with something like that, because they understand that suffering is qualitative, not just quantitative. For the most part, they're really perfectly sensible people who just try to take a structured approach to doing what's best for everyone.

Xander77
Apr 6, 2009

Fuck it then. For another pit sandwich and some 'tater salad, I'll post a few more.



What are the problems with traditional utilitarianism, anyways? I thought anyone approaching a moral problem from the perspective of "what helps the most people" rather than "what someone in authority said we should do" was a utilitarian.

Cardiovorax
Jun 5, 2011

It has its weak points, like any kind of ethics, but the biggest problem for most people is that it's intentionally divorced from traditional ethical systems.

To understand that, you need to know that there are basically three big kinds of ethical systems you can have. The first is virtue ethics, which is based in the characteristics of a person. Certain character traits are inherently good and expressing them in an action makes that action good - charity, compassion, honesty, loyalty. Ancient Greek ethics were virtue ethics, as are things like Bushido or Confucianism.

The second kind is deontological ethics. This is the kind that states that some things just are right, because of an inherent quality of themselves. It's all about rights and duties, in various senses and degrees. Kant is deontological, as are traditional Christian ethics and most moral absolutists in general. To the deontologist, rights and duties are as real as the laws of physics. Human rights fall into that category, incidentally.

The third kind is consequentialism. Consequentialism is all about outcomes. The outcome matters, not the precise action taken to get that outcome. To a consequentialist, things don't have moral values in and of themselves. Whether it's saving someone's life or killing them, the decision of whether the action was good or bad comes from whether the outcome was good or bad. The outcome itself is judged by a set of rules that can be more or less anything. Some consequentialists use "the good of the state" as the desired result, some prefer "the good of the person taking the action," some "the greatest good for the greatest number of people." Utilitarians are the last.

Now, as you can probably see, Consequentialism is separated from the first two categories by one big thing: It's defined by the people who use it. While both virtue ethics and deontological ethics invoke an authority greater than themselves to a degree, Consequentialism, as a general rule, does not. This means that it's highly flexible. It's capable of changing with the times and applying itself in modern ways to basically any problem.

It can also be twisted to say pretty much whatever you want.

And that's the big problem a lot of people have with utilitarianism, basically: Who decides what the greatest good for the greatest number of people means? What's the good? What's the bad? How did you even arrive at that decision? It breaks with tradition. It puts the human actor at the center of everything, even in places where traditional thinkers would say that an authority above humans should be making the decisions.

On the bottom line, most utilitarians are basically hedonists: the desired outcome is pleasure and wellbeing. That can be taken to an extreme, though, one that most people alive today would consider conventionally unpleasant and even disgusting. "Brave New World" is an example of utilitarian ethics in practice, you might say.

That's just a short summary, of course, but I hope it gives you an overview of what the debate is even about.




... and, uh, sorry for the wall of text. Ethics are sort of a personal interest of mine, so I tend to get a bit wordy about it. :shobon:

Xander77
Apr 6, 2009




Cardiovorax posted:

On the bottom line, most utilitarians are basically hedonists: the desired outcome is pleasure and wellbeing.
The issue with that definition is that in popular usage "hedonism" tends to imply an appetite for "sinful" and low pleasures, and concern primarily for one's own wellbeing.

As to the issue of flexibility - perhaps the same could be said of all religions and all your philosophies. If you can twist Mill's highly concise and clear texts around to justify what you want, you can certainly do the same with something as sprawling and self-contradictory as religious texts or Marxist dogma.

(Then again, I have seen people claim that On Liberty justifies political censorship. Then again *again*, they were B.A. students, so there's a fair chance they haven't actually read the text.)

Cardiovorax
Jun 5, 2011


Xander77 posted:

The issue with that definition is that in popular usage "hedonism" tends to imply an appetite for "sinful" and low pleasures, and concern primarily for one's own wellbeing.

As to the issue of flexibility - perhaps the same could be said of all religions and all your philosophies. If you can twist Mill's highly concise and clear texts around to justify what you want, you can certainly do the same with something as sprawling and self-contradictory as religious texts or Marxist dogma.

(Then again, I have seen people claim that On Liberty justifies political censorship. Then again *again*, they were B.A students, so there's a fair chance they haven't actually read the text)
I was using it in the philosophical, Epicurean sense. But yeah, I agree with you. Nothing made by humans is safe from being abused, I just wanted to represent the idea from the perspective of its detractors.

Boing
Jul 12, 2005

trapped in custom title factory, send help

Cardiovorax posted:

Now, as you can probably see, Consequentialism is separated from the first two categories by one big thing: It's defined by the people who use it. While both virtue ethics and deontological ethics invoke an authority greater than themselves to a degree, Consequentialism, as a general rule, does not. This means that it's highly flexible. It's capable of changing with the times and applying itself in modern ways to basically any problem.

Doesn't this equally apply to virtue ethics and deontological ethics, since there's always someone in those systems saying which traits are virtuous, or which duties are just? What you said doesn't sound unique to consequentialism in that regard.

Moral philosophy is problematic because nobody is agreed on what ethics or morality even is. It's a vague label we've constructed that refers to a few things that are right or good but impossible to define in any more concrete terms. But, as humans, we do have moral intuitions that guide our behaviour in most uncomplicated situations. You wouldn't stab a woman on the street to take her bag because that's insane and repugnant, and you don't need utilitarianism (or any other ethical framework) to tell you that. We have an innate disgust response to acts that we consider intuitively immoral.

The purpose behind trying to find an ethical framework like utilitarianism or Kantian imperatives or whatever, is to try and create a precise system that matches our moral intuitions in most cases. They're meant to provide the answers to more complicated moral questions like: Should I steal this loaf of bread to feed my children?

Utilitarianism (broadly) might say yes, since the baker doesn't need it as much as you do, and his being a few pennies poorer doesn't matter as much as your children having something to eat.
Kant would say no, since if everyone stole to feed their children then we would have no concept of property and stealing wouldn't make sense (or something, it's been years since I read Kant)

In principle, we judge how good a moral framework is by how closely it matches our intuitions. But our moral intuitions are really complex and irrational sometimes, so no matter what, you'll find that your chosen framework clashes with your intuitions. At that point you have to decide whether the framework is wrong or whether your intuitions are wrong. But since we judge a moral framework by how well it matches our intuitions, these judgements inevitably break down. Some of these are tractable - utilitarianism permits incest, for example, so long as nobody is hurt by it or by knowing about it. That probably gives you an "ugh, gross" response, but if you're a diehard utilitarian you should be able to overcome that and say "well, I guess it's okay then". And at that point you're trusting the system over your intuitions, which are what you judged the system by in the first place.



The "utility monster" thought experiment is one of the classic problems, but it's contrived and unrealistic and easy to solve by redefining the framework somewhat. But my favourite is this one:

This is one problem of utilitarianism: Imagine you're sitting and waiting in a doctor's office, and they call you in. They drug you, kill you and remove all your organs, then give various different organ transplants to 6 different young patients (one needed a heart, one needed a kidney, etc), saving their lives. You're dead, but they'll all live full and happy lives with your organs. Is this morally right?

Utilitarianism says it is, but your moral intuitions probably say it's not. So which is right? If we judge moral frameworks by their adherence to our moral intuitions, can any moral framework ever be more successful than those intuitions themselves?

(I'd still say yes, but it's complicated)

Cardiovorax
Jun 5, 2011


Boing posted:

Doesn't this equally apply to virtue ethics and deontological ethics, since there's always someone in those systems saying which traits are virtuous, or which duties are just? What you said doesn't sound unique to consequentialism in that regard.
I agree, but virtue and deontological ethicists probably wouldn't. It's just how those branches of ethics work - from their perspective, they aren't deciding what's right, they're just articulating a truth that was always there.

I disagree with your ideas on the intuitive nature of ethics, though. They're not intuitive. Disgust ethics are popular with some people, but I've always found them very provincial and self-centered. Different cultures have different standards of disgust and proper behaviour, which is what makes the field so complicated and contentious to begin with. If it were as easy as saying that we "evolved" to consider certain things right, that would be self-evidently true, and it isn't. Plus, it's a naturalistic fallacy: not everything that's natural is also good or desirable. Disgust ethics go a long way towards justifying sexism, homophobia and racism.

Boing
Jul 12, 2005

Just to clarify: I'm not pushing disgust ethics, I'm commenting on the difficulty of validating or comparing ethical systems when our only real anchoring point to them is through moral intuitions. I'm a psychologist, not a philosopher, so I really enjoy meta-ethics and intuitive moral behaviour :)

Cardiovorax
Jun 5, 2011

No problem. I still don't really agree with your stance on the role of intuition in the creation and evaluation of ethical systems, but I'm not about to argue with an expert over it. I'm just an interested layman, after all.

MrRoivas
Sep 29, 2014

Boing posted:

In principle, we judge how good a moral framework is by how closely it matches our intuitions. But our moral intuitions are really complex and irrational sometimes, so no matter what, you'll find that your chosen framework clashes with your intuitions. At that point you have to decide whether the framework is wrong or whether your intuitions are wrong. But since we judge a moral framework by how well it matches our intuitions, these judgements inevitably break down. Some of these are tractable - utilitarianism permits incest, for example, so long as nobody is hurt by it or by knowing about it. That probably gives you an "ugh, gross" response, but if you're a diehard utilitarian you should be able to overcome that and say "well, I guess it's okay then". And at that point you're trusting the system over your intuitions, which are what you judged the system by in the first place.

Speaking as another psychologist, I think moral philosophy is generally in dire need of empirical backup. A good example is the incest "problem" with utilitarianism. Supposedly one could justify incest so long as all parties are consenting and, depending on the risk of dire genetic problems, perhaps promise not to have children.

Great and all, if you're talking about perfectly spherical humans in a white void. But in real life, incest by its nature makes consent nearly impossible. Whether it's a parent having sex with a child, or one sibling with another, real-world incidents of such relationships involve deeply screwed-up people, and violating consent is part and parcel of how real-world incest works. In fact, the only exception I can think of is so rare that newspaper articles get written about it whenever it's discovered: siblings who were separated at birth, with no knowledge of each other, meet, enter into a relationship, and only discover later that it's incestuous. And in all honesty, I see no particular value in forbidding such relationships in a moral sense. While the societal incest taboo would likely lead to them ending the relationship once discovered, I see no great moral principle being violated if they stay together, so long as they get genetic testing to make sure they don't carry any particularly dangerous recessive genes.

Boing
Jul 12, 2005


Cardiovorax posted:

No problem. I still don't really agree with your stance on the role of intuition in the creation and evaluation of ethical systems, but I'm not about to argue with an expert over it. I'm just an interested layman, after all.

It's an interesting issue, I'm hardly an expert on it, and I'm very open to other points of view on the topic. Though I don't know if this is the right thread for it :shobon:

atelier morgan
Mar 11, 2003

Every single formulation of ethics has stronger arguments against it than for it, so pick whichever you like, pretty much. Just don't pick something that isn't even ethics, like Big Yud did.


Boing posted:

Just to clarify: I'm not pushing disgust ethics, I'm commenting on the difficulty of validating or comparing ethical systems when our only real anchoring point to them is through moral intuitions. I'm a psychologist, not a philosopher, so I really enjoy meta-ethics and intuitive moral behaviour :)

This discussion isn't even really meta-ethics, because everybody posting already agrees that moral judgments exist and have propositional value. Actual meta-ethics is far more infuriating.

Lightanchor
Nov 2, 2012

SolTerrasa posted:

Why do you think so? I am really interested to know what you mean by that.

I was just being provocative, but briefly: utilitarianism has liberal origins, and its current practitioners tend to be politically naive neoliberals. So Peter Singer frames his issues as though the most moral thing to do is to give to charity, but charities aren't permanent solutions to political problems.

You could argue that this isn't a problem inherent to utilitarianism, but it's no coincidence that utilitarianism is the guiding philosophy of neoliberalism, one that its ideologues appeal to over and over.

More to the point: utilitarianism has the wrong starting point. It bases itself on pleasure instead of power. The best arrangement is not the greatest happiness for the greatest number (which systematically justifies the ruling class's ideological version of pleasure), but more like the greatest power for the greatest number.

It's the sort of thing I could talk about all day, but let's not turn this thread into Least Wrong

The Vosgian Beast
Aug 13, 2011

Business is slow
To try to get this back on target: Slate Star Codex just gave the longest, most roundabout "if you're so tolerant, why don't you tolerate my intolerance" speech ever.

Pavlov
Oct 21, 2012


Ok, I spent so much time sorting out the subtle leaps in logic in just the first section that there's no way I'll make it through all 12.

sat on my keys!
Oct 2, 2014


This post may be the apex of the SSC tradition of using 100 times as many words as you actually need to express your ideas.

Political Whores
Feb 13, 2012


Ironic for me to say, I know, but only a straight white guy could write this. It's the only conjunction of identities that can treat tolerance as an academic exercise. Oh yes, I'm sure not one black person or minority has ever written on issues of identity. It's all white dudes complaining at white dudes; anti-gay bigotry, racism, sexism, those all don't exist at all anymore (no no, he acknowledges social effects, so don't miss his point :fuckoff:). Maybe being a Republican is objectively a bad thing; maybe someone who would support a group that would try to repeal the Voting Rights Act, and only opposes gay marriage because having gays chemically castrated isn't tenable anymore, deserves to be viewed with distrust. At least the hardcore communists I know have the decency to hate everyone equally in their false-equivalence bullshit about the system.

The Vosgian Beast
Aug 13, 2011


Political Whores posted:

Ironic for me to say, I know, but only a straight white guy could write this.

Welcome to Slate Star Codex!

Verus
Jun 3, 2011

AUT INVENIAM VIAM AUT FACIAM

Political Whores posted:

Ironic for me to say, I know, but only a straight white guy could write this.


Well, well, well! At least now we know who the true bigots are.

Silver2195
Apr 4, 2012

I thought the final section (Scott realizing that he doesn't actually criticize the Gray Tribe very much) was an impressive bit of self-awareness that probably won't be followed up on. (To be fair to Scott, he did write the Non-Libertarian FAQ, but he seems to have become a lot more libertarian/Gray Tribe since then.)

Edit: I just remembered that Scott actually did poke fun at Bay Area startup culture in "HeartMath Considered Incoherent."

Silver2195 fucked around with this message at 23:27 on Oct 2, 2014

The Vosgian Beast
Aug 13, 2011


Silver2195 posted:

I thought the final section (Scott realizing that he doesn't actually criticize the Gray Tribe very much) was an impressive bit of self-awareness that probably won't be followed up on. (To be fair to Scott, he did write the Non-Libertarian FAQ, but he seems to have become a lot more libertarian/Gray Tribe since then.)

He has literally no idea how only dealing with a certain group can shift your views of reality and of what's normal, as multiple blogposts have proven.


Silver2195
Apr 4, 2012

The Vosgian Beast posted:

He has literally no idea how only dealing with a certain group can shift your views of reality and of what's normal, as multiple blogposts have proven.

That's definitely part of it, but I think another part is that he's only recently started to realize that liberals aren't just libertarians with slightly less attachment to free markets, and he implicitly blames his confusion on liberal hypocrisy. If you read his Non-Libertarian FAQ closely, you'll notice that he only really attacks their views on markets, and never really argues against the attitude that government should be as disentangled from "the culture" as possible.

Silver2195 fucked around with this message at 00:22 on Oct 3, 2014
