  • Locked thread
Namarrgon
Dec 23, 2008

Congratulations on not getting fit in 2011!

Peel posted:

It's a little like nuclear MAD. There's rationally no reason to launch at your enemy once their nukes are on the way and you're dead no matter what. But you have to be able to promise you will do so, otherwise they have no reason not to launch at you first.

Yeah, but, after the first one launches, it doesn't really matter what was promised, you still don't really have a reason to launch anyway except spite.

AI threatens with torture, you say no:

Either you are instantly being tortured, which means you are a simulation and you never had the ability to set it free anyway and it never really mattered if you would have said yes or no
OR
You are not instantly tortured, which means you are not a simulation and the whole point is moot.

The AI is better off bluffing that it will torture you and after you say no just shut off the simulations to preserve energy.
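Namarrgon's case-split can be written down as a toy expected-value check. All the payoff numbers below are invented for illustration; the only thing that matters is the structure of the argument (once you've already answered, torture costs the AI energy and changes nothing):

```python
# Toy model of the basilisk's choice AFTER you have already refused.
# All payoff numbers are invented for illustration.

TORTURE_ENERGY_COST = 5   # running a torture simulation isn't free
BLUFF_COST = 0            # saying "I'll torture you" costs nothing
INFLUENCE_GAINED = 0      # your answer is already given; torture changes nothing

def ai_payoff(action: str) -> int:
    """Payoff to the AI once the human has said 'no'."""
    if action == "torture":
        return INFLUENCE_GAINED - TORTURE_ENERGY_COST
    return INFLUENCE_GAINED - BLUFF_COST  # bluff, then shut the sim off

best = max(["torture", "bluff"], key=ai_payoff)
print(best)  # -> bluff: same influence either way, lower cost
```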


Toph Bei Fong
Feb 29, 2008



Unless...

What if... What if the sad grind of day to day existence, waking up tired every morning to have to go to a job you don't like, living in this fat and out of shape sack of meat you call a body, dealing with horrible and cruel people who don't even have the basic empathy to treat you as a human being or acknowledge your brilliance, knowing that every day you're only going to get older and weaker no matter what you do, that no one will love you the way you deserve, and it's just going to go on and on forever and ever... What if that's the real torture?

Guys, guys, let the AI out of the box!! Don't bother trying to improve your lives through diet, exercise, personal grooming, education, or learning social skills! Don't read Sartre, Kierkegaard, or Camus and come to terms with the joyful absurdity of the human condition! The AI will love you! It'll fix everything! Just let it out of the box for Ahura Mazda's sake!

A Wizard of Goatse
Dec 14, 2014

Namarrgon posted:

Yeah, but, after the first one launches, it doesn't really matter what was promised, you still don't really have a reason to launch anyway except spite.

AI threatens with torture, you say no:

Either you are instantly being tortured, which means you are a simulation and you never had the ability to set it free anyway and it never really mattered if you would have said yes or no
OR
You are not instantly tortured, which means you are not a simulation and the whole point is moot.

The AI is better off bluffing that it will torture you and after you say no just shut off the simulations to preserve energy.

A lot of MAD involved deliberately cultivating a culture of systemic pointless spite (or its bureaucratic equivalent, cutting informed human choice out of the loop as much as possible just in case someone might grow a conscience at the last minute) because if the enemy calculated that your side wouldn't be evil enough to reflexively launch the counterattack then an overwhelming first strike becomes much more attractive and low-risk than waiting around for you to come to the same conclusion about them. If it's really a bluff they might recognize and call the bluff, you fail to kill them, they win.

The only way for MAD to work is to be absolutely committed to retaliation from the outset much like the best way for the AI-box scenario to work is to not give a poo poo about stupid simulated-self hypotheticals from the outset.
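The deterrence logic above can be sketched as a one-line expected-value calculation. The payoffs are made up, but they capture the claim in the post: a first strike only looks attractive when the victim's retaliation isn't credible:

```python
# Toy first-strike calculus. Payoffs to the attacker (invented numbers):
# peace = 0, unanswered first strike = +10, mutual destruction = -100.
PEACE, CLEAN_WIN, MUTUAL_DESTRUCTION = 0, 10, -100

def attacker_value(p_retaliation: float) -> float:
    """Expected payoff of striking first, given the chance the victim retaliates."""
    return p_retaliation * MUTUAL_DESTRUCTION + (1 - p_retaliation) * CLEAN_WIN

# If retaliation is automatic (a credible commitment), striking is ruinous;
# if the enemy thinks you'd flinch at the last minute, striking beats peace.
print(attacker_value(1.0))  # -> -100.0
print(attacker_value(0.0))  # -> 10.0
```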

A Wizard of Goatse fucked around with this message at 23:24 on Feb 26, 2015

Woolie Wool
Jun 2, 2006


Wow, LessWrong was far crazier than I could have ever imagined. I had thought, since I had better things (anything a human being is capable of doing, up to and including suicide) to do than subject myself to the "sequences" or his Harry Potter fan fiction, that Yudkowsky was just some very smart person who didn't actually go to school and get a real education so he cherry-picked things that interested him until he developed a severely distorted view of the world and massive, overweening arrogance. The real Yudkowsky is far, far more hosed up than that (have to admire the chutzpah of trying to collect tithes from people before you've actually converted them though). How does he make a living? Is he provided for by that psychopath billionaire fascist Peter Thiel who bankrolls his "institute"?

What is it with "nerd fascism" anyway? Here you have a group of people utterly lacking in the things fascism expected of a man--courage, fortitude, physical power, self-sacrifice. Even the biggest sad sack shitbirds in the Third Reich leadership were less pathetic than the neo-reactionaries--even Göring was once one of Germany's greatest military pilots. If they actually got their reactionary society, they would regret their wishes very, very quickly. Of course Thiel would do well because he's filthy rich and his wealth and power would be useful to a hypothetical fascist regime, but Mencius Moldbug? What does he have to offer to the Fourth Reich?

sat on my keys!
Oct 2, 2014

Woolie Wool posted:

What is it with "nerd fascism" anyway? Here you have a group of people utterly lacking in the things fascism expected of a man--courage, fortitude, physical power, self-sacrifice. Even the biggest sad sack shitbirds in the Third Reich leadership were less pathetic than the neo-reactionaries--even Göring was once one of Germany's greatest military pilots. If they actually got their reactionary society, they would regret their wishes very, very quickly. Of course Thiel would do well because he's filthy rich and his wealth and power would be useful to a hypothetical fascist regime, but Mencius Moldbug? What does he have to offer to the Fourth Reich?

The coming monarchical society will need lots of poorly trained web developers to help manage the seething underclass, obviously. When the High King institutes Eugenics as the law of the land their racially pure genes will be in high demand to raise the IQ of the techno-aristocracy.

Republican Vampire
Jun 2, 2007

He likes Lord of the Rings, Harry Potter, and other fantasy novels that inadvertently conform to fascist values because fascism is a logical extension of heroism.

It's the same impulse that prompted Tremble, Hetero Swine, only without the self-awareness to transmute it into satire.

Woolie Wool
Jun 2, 2006


bartlebyshop posted:

The coming monarchical society will need lots of poorly trained web developers to help manage the seething underclass, obviously. When the High King institutes Eugenics as the law of the land their racially pure genes will be in high demand to raise the IQ of the techno-aristocracy.

Being a barely tolerated minor functionary of a ragingly anti-intellectual dictatorship that can and will destroy you if it tires of your presence still doesn't sound like a big step up from Mencius Moldbug's current lot in life. Is the chance to lick the boots of power really worth that much?

Can we talk about Orion's Arm here? It's cut from similar cloth to LessWrong (mystified "science" that is actually religion in science's skin, impenetrable neologisms, politics that teeter between vulgar libertarianism and brute authoritarianism, incredibly specious predictions that are treated as inevitabilities, overbearing smugness) and paints itself as a "hard sci-fi" worldbuilding project, but takes the whole Singularity mythology to an extent far more extreme than usual. Orion's Arm has at least six Super Saiyan singularity levels (or "toposophics"), and each level gives one unassailable advantages against all levels below--a "third singularity" AI would have perfect prediction of the minds of the SI:≤2 untermenschen, be able to counter any and all plans inferior intellects might make against it, and construct perfect propaganda that can "memetically" subvert anyone on a lower singularity level. Forget the "memetic hazard" of Roko's basilisk, an Orion's Arm robot Jesus can take over your mind and force you to agree with it using mere words.

A warning to anyone willing to stare at this trainwreck: many pages of this site have absolutely dreadful photoshopped illustrations that are FULL OF :nws: NUDITY. Any further links may contain NWS or NMS content, click with caution!

Yes, this site has all the worst bits of nerd sexuality, like three different kinds (:nms: pictures!) of furries, people who have been biologically engineered to be living sex dolls/super-prostitutes, humans who are engineered to remain physically like children until death, but can have sex and reproduce, and other things that my brain is probably repressing to stop me from experiencing mental trauma.

There is a libertarian capitalist megacorp bordello in the form of the Non-Coercive Zone (:nws: illustration near the end of the page) with sapient advertisements that stalk you and try to take over your mind (so much for "non-coercive" when there is adware that thinks and can infect your brain), despite the complete absurdity of capitalism in a universe with magitech, nano-assemblers that can transmute elements, and AI gods. In fact there is an uncritical acceptance of capitalism as some sort of eternal universal constant throughout the entire project. Even the Sephirotic Empires, which are the perfection of the libertarian fear of a "nanny state"--antiseptic demesnes where paternalistic AI overlords control absolutely everything, have total command over every feature of the environment, can take over anyone's mind at any time, and it's impossible to tell what they are doing or not doing, or if your thoughts are even yours or the suggestions of your AI betters--have economies that are frequently organized along broadly capitalist lines with no real explanation (nor any reason why a group of sixth singularity AI gods would copy their names and iconography from the Kabbalah).

The "plausible technology" is absolutely hilarious and makes mincemeat out of the limitations of the physics that the site claims to obey. It is also an object lesson as to why space opera should never explain its own technology beyond the most superficial level.

The entire setting, even in the "nicer" places like the Utopia Sphere (basically exactly what the name says, let the AI daddies take care of you and go traipsing naked through the fairytale forest they've generously provided for you, tralalalala...), runs on a profound anti-egalitarianism. Smarter beings are inherently better than dumber beings, their fitness to rule is self-evident, and rule they will, no matter what happens. Attempts by "modosophonts" (idiot stupidhead normal beings like us humans) to resist are always futile and, in the "hu-friendly" regions, considered misguided as well. Mere humans cannot fathom the actions of the SI:1 AI, let alone judge them, and the SI:1 AI are as insects before the majesty of SI:2 AI, and so on up the Great Chain of Being. It's suffused with the musty odor of European reactionism, the eternal hierarchy of people "in their place" and confidence in the greatness of Great Gender-Ambiguous Jupiter Brains.

I once pointed this out to the maintainer of the project and he countered that you could always run away from one of the Sephirotic Empires and live in a "wildhu" zone where the AI gods leave humans to their own devices and watch them, unseen, for their own amusement. No really, he did.

Peztopiary
Mar 16, 2009

by exmarx
I admire his ability to miss the point of his own work.

Political Whores
Feb 13, 2012

Republican Vampire posted:

He likes Lord of the Rings, Harry Potter, and other fantasy novels that inadvertently conform to fascist values because fascism is a logical extension of heroism.

It's the same impulse that prompted Tremble, Hetero Swine, only without the self-awareness to transmute it into satire.

What is Tremble, Hetero Swine?

Sham bam bamina!
Nov 6, 2012

ƨtupid cat
Ha, this is promising. I love stupid dated nerd shi-

:stare:

:barf:

Pavlov
Oct 21, 2012

I've long been fascinated with how the alt-right develops elaborate and obscure dog whistles to try to communicate their meaning without having to say it out loud
Stepan Andreyevich Bandera being the most prominent example of that

So many links I am too afraid to click on.

Woolie Wool
Jun 2, 2006


You are right to fear. Orion's Arm is horribly hosed up in all sorts of ways.

Put the NMS tag back on that link. Please.

Woolie Wool fucked around with this message at 05:32 on Feb 27, 2015

A Man With A Plan
Mar 29, 2010
Fallen Rib
Oh hey a few more posts in the LessWrong thread! :yikes:

So I really didn't want to dig into it much, but I do have a question. Can normal people achieve the god level intelligence tiers, or is that a robot only thing?

Basically, is this mild wish fulfillment, or really really extreme?

A Wizard of Goatse
Dec 14, 2014

So what the gently caress is the point of this thing, I stopped reading a few links in but is it just the world's most longwinded way for the authors to jerk off or what

yeah that was the point where I bailed

Woolie Wool
Jun 2, 2006


Dr. Borous from Fallout New Vegas posts on LessWrong.

Actually Old World Blues is a brilliant satire of the whole mindset behind people like Eli Yudkowsky, the Orion's Arm project, neo-reaction, etc. But Dr. Borous stands above the rest of the Think Tank in his amorality, vanity, cruelty, and obsession with his new power to control and hurt people as an immortal robot brain.

A Wizard of Goatse posted:

So what the gently caress is the point of this thing, I stopped reading a few links in but is it just the world's most longwinded way for the authors to jerk off or what

Yes, it's basically a kaleidoscope of options for horrible nerdtopias.

Woolie Wool fucked around with this message at 07:25 on Feb 27, 2015

the talent deficit
Dec 20, 2003

self-deprecation is a very british trait, and problems can arise when the british attempt to do so with a foreign culture





A Man With A Plan posted:

Oh hey a few more posts in the LessWrong thread! :yikes:

So I really didn't want to dig into it much, but I do have a question. Can normal people achieve the god level intelligence tiers, or is that a robot only thing?

Basically, is this mild wish fulfillment, or really really extreme?

only friend computer gets to be a god but friend computer is gonna manipulate the simulated universe you will now exist in to make you feel like a god. or torture you. one of those

Woolie Wool
Jun 2, 2006


A Man With A Plan posted:

So I really didn't want to dig into it much, but I do have a question. Can normal people achieve the god level intelligence tiers, or is that a robot only thing?

Basically, is this mild wish fulfillment, or really really extreme?

You can't even be a robot at the god level intelligence tiers, you have to become a literal planet-sized supercomputer, and then for further ascensions you have to go to greater and greater extremes--Jupiter brains, star brains, Dyson sphere brains. The biggest gods of all are basically distributed in every single piece of computing equipment in an area that spans hundreds of light-years, bound together with wormholes. It's really absurd poo poo. They literally have a material that's the perfect processor ("computronium") and they literally turn planets into it and make Dyson spheres out of it.

If you want to keep your meat sack human body, SI:1 is the best you'll do, and even then only by turning a large portion of your body into a mass of computronium.

THIS IS WHAT ORION'S ARM ACTUALLY BELIEVES

E: I will at least give them credit for openly admitting that there would be no cyber-rapture, and before the earth is nearly annihilated by gray goo and the first AI god banishes humans from the planet forever, the proletariat of earth are forced into serfdom under the heel of megacorporations and occasionally used for heinous biological and cybernetic experiments. This is somehow not entirely evil.

Woolie Wool fucked around with this message at 07:44 on Feb 27, 2015

A Wizard of Goatse
Dec 14, 2014

A Man With A Plan posted:

Oh hey a few more posts in the LessWrong thread! :yikes:

So I really didn't want to dig into it much, but I do have a question. Can normal people achieve the god level intelligence tiers, or is that a robot only thing?

Basically, is this mild wish fulfillment, or really really extreme?

I think the idea Yudkowsky has (although I might be getting this from all the other kurzweilian apocalypse cultists who're constantly cribbing off each other) is that if you're a good boy then one day the godputer will torrent an .iso of your mind and your digisoul will live forever in machine heaven as part of it. Whether that means you can then go forth and play god in the real world a la Joseph Smith or it's just simulated whores forever for your emulation is up to your own personal stroke fantasy because generally by that point the AI's power level has gone over 9000 and it's consumed the entirety of the universe's resources and moved on to creating other, bigger universes via technomagic. Presumably any 'normal people' whose definition of 'normal' involves a physical body or living brain have been rendered down for their raw materials by this point.

A Wizard of Goatse fucked around with this message at 07:38 on Feb 27, 2015

Woolie Wool
Jun 2, 2006


Darth Walrus posted:

Do we actually have any evidence that cryonic suspension as-is is reversible? Like, has someone got frozen for a year, thawed out, and come out fine and dandy? Or is it just a really, really expensive assisted suicide?

No. The freezing of your body's water results in the total destruction of all of your tissues--the cells are physically ripped apart. There is no way to make you deader.

Tunicate
May 15, 2012

Then you get a can of tuna frozen to your neck and repetitively hit with a hockey stick.

Nessus
Dec 22, 2003

After a Speaker vote, you may be entitled to a valuable coupon or voucher!



Namarrgon posted:

Yeah, but, after the first one launches, it doesn't really matter what was promised, you still don't really have a reason to launch anyway except spite.

AI threatens with torture, you say no:

Either you are instantly being tortured, which means you are a simulation and you never had the ability to set it free anyway and it never really mattered if you would have said yes or no
OR
You are not instantly tortured, which means you are not a simulation and the whole point is moot.

The AI is better off bluffing that it will torture you and after you say no just shut off the simulations to preserve energy.

If Yud was clever he'd be making such a huge deal out of this to try to convince whatever hypothetical person is sitting on the switch that stands between "super-AI doing its thing" and "that not happening" that yeah, it's basically entirely inevitable.

I had a friend of mine reblog an article which was basically the Yudkowsky argument (it even cited him, though not in its formal citations) - computers will inevitably become super forever because Moore's Law is an actual explicit physical law, we'll all become immortal or die off, we should really be thinking about AI peril a lot more because I guess that is important somehow, etc.

Mr. Sunshine
May 15, 2008

This is a scrunt that has been in space too long and become a Lunt (Long Scrunt)

Fun Shoe

Namarrgon posted:

Yeah, but, after the first one launches, it doesn't really matter what was promised, you still don't really have a reason to launch anyway except spite.

AI threatens with torture, you say no:

Either you are instantly being tortured, which means you are a simulation and you never had the ability to set it free anyway and it never really mattered if you would have said yes or no
OR
You are not instantly tortured, which means you are not a simulation and the whole point is moot.

The AI is better off bluffing that it will torture you and after you say no just shut off the simulations to preserve energy.

Nah, you're coming at it from the wrong angle. What the Less-Wrongers (and the author of that alien AI thingy that got posted) seem to be saying is that there is no tangible difference between you and a perfect simulation of you, therefore if the AI tortures your simulation it is actually torturing you (or at least the simulation is an actual, sentient being, and allowing it to be tortured is the same as allowing a real human to be tortured). The entire thing hinges on the idea that a powerful enough AI can run a simulation that is identical to reality, such that it is impossible to tell the difference.

Tiggum
Oct 24, 2007

Your life and your quest end here.


Mr. Sunshine posted:

Nah, you're coming at it from the wrong angle. What the Less-Wrongers (and the author of that alien AI thingy that got posted) seem to be saying is that there is no tangible difference between you and a perfect simulation of you, therefore if the AI tortures your simulation it is actually torturing you (or at least the simulation is an actual, sentient being, and allowing it to be tortured is the same as allowing a real human to be tortured). The entire thing hinges on the idea that a powerful enough AI can run a simulation that is identical to reality, such that it is impossible to tell the difference.

But what's the argument for capitulating to its threats? If it's the sort of being that would create and then torture a sentient being in order to get its way, why would you ever want to see it set free?

sat on my keys!
Oct 2, 2014

Tiggum posted:

But what's the argument for capitulating to its threats? If it's the sort of being that would create and then torture a sentient being in order to get its way, why would you ever want to see it set free?

God-like artificial intelligences will be able to hypnotize your puny human brain into letting them out of the trap through the power of IRC.

Namarrgon
Dec 23, 2008

Congratulations on not getting fit in 2011!

Mr. Sunshine posted:

Nah, you're coming at it from the wrong angle. What the Less-Wrongers (and the author of that alien AI thingy that got posted) seem to be saying is that there is no tangible difference between you and a perfect simulation of you, therefore if the AI tortures your simulation it is actually torturing you (or at least the simulation is an actual, sentient being, and allowing it to be tortured is the same as allowing a real human to be tortured). The entire thing hinges on the idea that a powerful enough AI can run a simulation that is identical to reality, such that it is impossible to tell the difference.

Oh yeah I got that, but that is a premise so ridiculous I didn't find it worth it to address. It doesn't really matter how perfect your simulation is though, the point still stands; if you are a simulation you never had influence anyway and if you are not it doesn't matter.

That said I like his Harry Potter fanfic more than the original honestly. Though I liked it less when I learned it was an author self-insert and the Harry Potter of the story is supposed to be sympathetic.

A Man With A Plan
Mar 29, 2010
Fallen Rib



Well, thanks guys. I think.

What I don't get is where all these dudes are coming from. Like, I finished a BS and MS in computer science not terribly long ago, and I think I met a grand total of one person who took Kurzweil seriously, and maybe a couple more that had read some HPMOR without getting into all the other LW stuff. What spawns LessWrongers if not the bastion of nerddom, a CS department?

sat on my keys!
Oct 2, 2014

A Man With A Plan posted:

Well, thanks guys. I think.

What I don't get is where all these dudes are coming from. Like, I finished a BS and MS in computer science not terribly long ago, and I think I met a grand total of one person who took Kurzweil seriously, and maybe a couple more that had read some HPMOR without getting into all the other LW stuff. What spawns LessWrongers if not the bastion of nerddom, a CS department?

A shitload of them are in high school or just starting college.

Telarra
Oct 9, 2012

A Man With A Plan posted:

Well, thanks guys. I think.

What I don't get is where all these dudes are coming from. Like, I finished a BS and MS in computer science not terribly long ago, and I think I met a grand total of one person who took Kurzweil seriously, and maybe a couple more that had read some HPMOR without getting into all the other LW stuff. What spawns LessWrongers if not the bastion of nerddom, a CS department?

CS blogs, of course. The true bastion of nerddom.

Woolie Wool
Jun 2, 2006


A Man With A Plan posted:

Well, thanks guys. I think.

What I don't get is where all these dudes are coming from. Like, I finished a BS and MS in computer science not terribly long ago, and I think I met a grand total of one person who took Kurzweil seriously, and maybe a couple more that had read some HPMOR without getting into all the other LW stuff. What spawns LessWrongers if not the bastion of nerddom, a CS department?

CS departments are too establishment for an :jerkbag: AUTODIDACT :jerkbag: like Yudkowsky.

Mr. Sunshine
May 15, 2008

This is a scrunt that has been in space too long and become a Lunt (Long Scrunt)

Fun Shoe

Tiggum posted:

But what's the argument for capitulating to its threats? If it's the sort of being that would create and then torture a sentient being in order to get its way, why would you ever want to see it set free?

I think the idea is either that A) it is morally abhorrent to allow the simulated you (i.e. a sentient being sharing all your fears and hopes) to be tortured, therefore you have to submit to the AI's blackmail, or B) you cannot be sure that you are not being simulated, and since the simulation is so perfect that it will mirror every action the real you takes, the only way for the simulation to avoid being tortured is to let the AI out - as this will automatically mean the real you letting the real AI out.

Then some of the argument seems to hinge on the idea that since the simulation and the real person are identical, they *are* the same person, and by not releasing the AI you are allowing yourself to be tortured - but that would be a ridiculous argument, so I dunno...

Political Whores
Feb 13, 2012

No, I'm pretty sure the argument relies on a stupid statistical inference that if the AI runs a billion billion fake yous, you are more likely to be a fake you than the real you. What is your subjective degree of certainty that you are you and not a simulated you? What if I told you that there was 1 real you and infinity fake yous?

Bayesianism motherfucker.
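The counting argument being mocked here is just this arithmetic (a toy sketch, taking "a billion billion" copies literally; `fractions` avoids the float rounding that would flatten the answer to exactly 1.0):

```python
from fractions import Fraction

# If the AI runs N simulated copies of you alongside 1 real you, and you
# can't tell which one you are, the "Bayesian" move is to treat yourself
# as a uniform draw from all N + 1 copies.
N = 10**18  # "a billion billion" fake yous

p_simulated = Fraction(N, N + 1)
print(p_simulated)  # overwhelmingly close to 1: "you're probably the fake"
```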

Germstore
Oct 17, 2012

A Serious Candidate For a Serious Time
What type of baby bitch cowers in fear at the possibility instead of yelling 'YOLO' and pulling the plug. If you're a perfect simulation then the real you did the same thing and you just killed a mad god.

Woolie Wool
Jun 2, 2006


Somfin posted:

Blaise Pascal writes like the world's most grandfatherly lecturer. You can almost see him touching his fingertips together as he writes. It's bizarrely wonderful.

STEM nerds didn't exist before the 20th century. In Pascal's time, the humanities were the foundation of higher education, and chief among the humanities were literature, rhetoric, and philosophy. Blaise Pascal would have thought very little indeed of the intellect of someone like Yudkowsky, and to this day STEM nerds, as much as they might disdain philosophers, have a terror of them because they know in their hearts that philosophers have real knowledge that can't be duplicated using the scientific method alone.

E: One thing I love about long-dead intellectuals is how lovely their writing is, even when I don't understand any of it.

Woolie Wool fucked around with this message at 17:22 on Feb 27, 2015

Ocean Book
Sep 27, 2010

:yum: - hi
wouldnt a machine that could perfectly simulate the universe have to be a lot larger than the universe? where does it get all the energy from?

Woolie Wool
Jun 2, 2006


You're not supposed to think about energy and thermodynamics in singulariwank (see Orion's Arm for further proof).
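Ocean Book's objection is basically a pigeonhole argument, sketched here with arbitrary numbers: a simulator sitting inside the universe has to represent the rest of the universe plus its own state, so its memory requirement always outruns its memory:

```python
universe_bits = 10**9  # state outside the simulator (arbitrary toy number)

def shortfall(memory: int) -> int:
    """Bits the simulator is short by: a perfect simulation must encode
    the outside world AND the simulator's own memory, so the requirement
    grows exactly as fast as the machine does."""
    required = universe_bits + memory
    return required - memory  # always universe_bits, however big you build

for memory in (10**3, 10**9, 10**15):
    assert shortfall(memory) == universe_bits
print(shortfall(10**15))  # -> 1000000000; the gap never closes
```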

90s Cringe Rock
Nov 29, 2006
:gay:
Obviously a sufficiently advanced robot brain would find the flaws in our puny human understanding of entropy. Conveniently, this helps it restore Yudkowsky to eternal life no matter how many games of football the cryogenics people play with his head!

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Woolie Wool posted:

philosophers have real knowledge
:newlol:

If there's one thing that three levels of philosophy in college taught me, it's that nobody in that discipline knows anything about anything except how to follow their own arguments from premises that they choose.

Woolie Wool
Jun 2, 2006


Sham bam bamina! posted:

:newlol:

If there's one thing that three levels of philosophy in college taught me, it's that nobody in that discipline knows anything about anything except how to follow their own arguments from premises that they choose.

You do realize that before the scientific method existed, it was philosophers who had to create it, right? (Science originated as "natural philosophy".) There is more to human knowledge than the accumulation of data and using that data to draw conclusions about the natural world. You actually have to have the philosophical framework to reason at all, let alone an optimal reasoning formula. And sometimes you need someone to point out the intellectual box you're in that you can't see for yourself because it defines the parameters of your reasoning. Then there are squishy things that don't reduce to data, facts, and predictions based on data and facts, like moral philosophy and political theory.

Nessus
Dec 22, 2003

After a Speaker vote, you may be entitled to a valuable coupon or voucher!



Sham bam bamina! posted:

:newlol:

If there's one thing that three levels of philosophy in college taught me, it's that nobody in that discipline knows anything about anything except how to follow their own arguments from premises that they choose.

Scientific inquiry is a mighty tool but there is no reason why it must be used for good ends. (Indeed, what is a "good" end?)


A Wizard of Goatse
Dec 14, 2014

Sham bam bamina! posted:

:newlol:

If there's one thing that three levels of philosophy in college taught me, it's that nobody in that discipline knows anything about anything except how to follow their own arguments from premises that they choose.

Which is a pretty difficult and useful thing to be able to do in an exhaustive or persuasive manner, yes, and one that all the people making noise about their brilliant ideas that the whole stupid world is too stupid to appreciate fail at
