  • Locked thread
Cardiovorax
Jun 5, 2011

I mean, if you're a successful actress and you go out of the house in a skirt and without underwear, knowing that paparazzi are just waiting for opportunities like this and that it has happened many times before, then there's really nobody you can blame for it but yourself.
It's a perfectly reasonable concern to have - if you believe that the AI god will both be powerful and intelligent enough to turn the solar system into paperclips atom by atom and, at the same time, incapable of having even a basic sense of ethics or any kind of reflection about whether what it wants to do is a good (or even just minimally useful) idea or not.


Djeser
Mar 22, 2013


it's crow time again

quote:

The PUA Community began decades ago with men that wanted to learn how to get better at seducing women. As I understand it, they simply began posting their (initially) awkward attempts at love online. Over the years, they appear to have amassed a fairly impressive set of practical knowledge and skills in this domain.
Emphasis mine

quote:

I find it alarming that such a valuable resource would be monopolized in pursuit of orgasm; it's rather as if a planet were to burn up its hydrocarbons instead of using them to make useful polymers.
I wonder why this guy doesn't have more friends?

quote:

...pretty much what I was thinking was expressed well by Scott Adams:
lol

quote:

I've met people who were shockingly, seemingly preternaturally adept in social settings. Of course this is not magic. Like anything else, it can be reduced to a set of constituent steps and learned. We just need to figure out how.
Literally :spergin:

Phobophilia
Apr 26, 2008

by Hand Knit
That's been a modern science fiction staple when talking about self-perpetuating corporations that own themselves.

sat on my keys!
Oct 2, 2014

Nothing helps me win friends and influence people like standing there with a desperate look on my face, internally screaming "What was step 51?! ".

Epitope
Nov 27, 2006

Grimey Drawer

Djeser posted:

quote:

...pretty much what I was thinking was expressed well by Scott Adams:

Ugh, I forgot how horrible Scott Adams is. Not surprising these guys love his crap.

Political Whores
Feb 13, 2012

The quote in question:

quote:

...

I think technical people, and engineers in particular, will always have good job prospects. But what if you don't have the aptitude or personality to follow a technical path? How do you prepare for the future?

I'd like to see a college major focusing on the various skills of human persuasion. That's the sort of skillset that the marketplace will always value and the Internet is unlikely to replace. The persuasion coursework might include...

Sales methods
Psychology of persuasion
Human Interface design
How to organize information for influence
Propaganda
Hypnosis
Cults
Art (specifically design)
Debate
Public speaking
Appearance (hair, makeup, clothes)
Negotiations
Managing difficult personalities
Management theory
Voice coaching
Networking
How to entertain
Golf and tennis
Conversation


You can imagine a few more classes that would be relevant. The idea is to create people who can enter any room and make it their bitch. [emphasis added]

Colleges are unlikely to offer this sort of major because society is afraid and appalled by anything that can be labeled "manipulation," which isn't even a real thing.

Manipulation isn't real because almost every human social or business activity has as its major or minor objective the influence of others. You can tell yourself that you dress the way you do because it makes you happy, but the real purpose of managing your appearance is to influence how others view you. 

Humans actively sell themselves every minute they are interacting with anyone else. Selling yourself, which sounds almost noble, is little more than manipulating other people to do what is good for you but might not be so good for others. All I'm suggesting is that people could learn to be more effective at the things they are already trying to do all day long.


I know we joke a lot about :spergin: but this stuff reads less like Asperger's and more like someone aspiring to be a sociopath. They explicitly don't care about connecting with people, just manipulating them.

Epitope
Nov 27, 2006

Grimey Drawer

Political Whores posted:

The quote in question:


I know we joke a lot about :spergin: but this stuff reads less like Asperger's and more like someone aspiring to be a sociopath. They explicitly don't care about connecting with people, just manipulating them.

That, and the oh-so-persecuted "society would never allow this!" Like, has Dilbert never heard of an MBA?

Cardiovorax
Jun 5, 2011

I mean, if you're a successful actress and you go out of the house in a skirt and without underwear, knowing that paparazzi are just waiting for opportunities like this and that it has happened many times before, then there's really nobody you can blame for it but yourself.
Ironically enough, because connecting with people just so happens to be the best way to manipulate them, as anybody with a background in marketing or rhetoric (you know, those fields they study at universities about how to manipulate people) could tell you.

Triskelli
Sep 27, 2011

I AM A SKELETON
WITH VERY HIGH
STANDARDS


All these works of science fiction, and these guys don't think researchers would put some sane contingencies in place like "don't connect the self-improving AI to the Internet", "make sure it has an easily accessible off switch", or "make sure you can unplug it"?

I mean, I get why the Paperclip thing is terrifying and could potentially become world-ending if it became self-sufficient, but if there's one thing we're good at, it's creating in-built dependencies. Who the fuck do they send millions of dollars to, to prevent this from happening?

Telarra
Oct 9, 2012

Triskelli posted:

All these works of science fiction, and these guys don't think researchers would put some sane contingencies in place like "don't connect the self-improving AI to the Internet", "make sure it has an easily accessible off switch", or "make sure you can unplug it"?

To answer this question, Yud roleplayed as an AI-in-a-box, betting $200 to all comers that he could convince them to let him out with only text chat.

'Course, the chat logs are strictly confidential, and he gave up after his success rate plummeted to 50-50.

LazyMaybe
Aug 18, 2013

oouagh
Not that that meant anything, considering the other person knew he wasn't an AI anyway.

Triskelli
Sep 27, 2011

I AM A SKELETON
WITH VERY HIGH
STANDARDS


Well, it's still worth the exercise. You just have to make sure you can provide the AI anything it would want without letting it out of its "box". Most of these people seem to approach the whole thing as a software problem rather than a hardware one.

Like, if you want the AI to be able to access the Internet as an information-building resource, let it ping another computer electronically to print out a random Wikipedia page, scoot it over to the AI terminal, and then run it through a scanner.

E: or y'know, make it run on C batteries. Those things are impossible to find.

Triskelli fucked around with this message at 09:11 on Oct 31, 2014

Lottery of Babylon
Apr 25, 2012

STRAIGHT TROPIN'

Telarra posted:

To answer this question, Yud roleplayed as an AI-in-a-box, betting $200 to all comers that he could convince them to let him out with only text chat.

'Course, the chat logs are strictly confidential, and he gave up after his success rate plummeted to 50-50.

Specifically, he played two games against his cultists and won them, then bragged about his amazing victory. This made people outside his cult notice his challenge; three accepted, and Yudkowsky lost against all three of them. Giving up immediately in the face of adversity, Yudkowsky then wrote a triumphant blog post about how awesome the experiment was and how he's great because he never gives up in the face of adversity (a uniquely Japanese virtue).

What this means is that the only thing Yudkowsky has ever experimentally proven is that his own followers are more likely to choose to unleash an evil AI upon the world than an outsider is. So even if you agree with every single one of Yudkowsky's beliefs and arguments and accept everything he says at face value, you should still conclude that Yudkowsky's writings are harmful and MIRI should be destroyed.

Qwertycoatl
Dec 31, 2008

Lottery of Babylon posted:

Specifically, he played two games against his cultists and won them, then bragged about his amazing victory. This made people outside his cult notice his challenge; three accepted, and Yudkowsky lost against all three of them. Giving up immediately in the face of adversity, Yudkowsky then wrote a triumphant blog post about how awesome the experiment was and how he's great because he never gives up in the face of adversity (a uniquely Japanese virtue).

I'm reminded of this guy, who used some bullshit martial arts skills to knock over his students without touching them, and bet $5000 he could beat a real martial artist, with hilariously predictable results.

RubberJohnny
Apr 22, 2008
That stuff about PUA highlights the main problem with their perspective - there's an old article from Yud where he asks, "if nerds are so smart, why can't they work out how to be popular?" His answer is a lack of rationality: they're not using the right techniques. If only they got smart in logic and statistics, they could boil down all human behaviour to a predictable flowchart to follow all the way to popularity and success.

Except that humans are fundamentally irrational, and applying rational rules to model their behaviour will give you an imperfect model at best. I don't think they've realised in all these years that their premise is flawed.

Effectronica
May 31, 2011
Fallen Rib

Cardiovorax
Jun 5, 2011

I mean, if you're a successful actress and you go out of the house in a skirt and without underwear, knowing that paparazzi are just waiting for opportunities like this and that it has happened many times before, then there's really nobody you can blame for it but yourself.

RubberJohnny posted:

Except that humans are fundamentally irrational, and applying rational rules to model their behaviour will give you an imperfect model at best. I don't think they've realised in all these years that their premise is flawed.
Humans are perfectly rational, most of the time. They're just not logical, is all. Unless they're insane they'll usually do whatever they think is in their best interest - but learning what they consider that to be takes getting to know them and empathizing with their emotional states, which the LessWrong crowd not only don't care to bother with but actively consider beneath them.

Effectronica
May 31, 2011
Fallen Rib

Cardiovorax posted:

Humans are perfectly rational, most of the time. They're just not logical, is all. Unless they're insane they'll usually do whatever they think is in their best interest - but learning what they consider that to be takes getting to know them and empathizing with their emotional states, which the LessWrong crowd not only don't care to bother with but actively consider beneath them.

Bargearse
Nov 27, 2006

🛑 Don't get your pen🖊️, son, you won't be 👌 needing that 😌. My 🥡 order's 💁 simple😉, a shitload 💩 of dim sims 🌯🀄. And I want a bucket 🪣 of soya sauce☕😋.

Triskelli posted:

All these works of science fiction, and these guys don't think researchers would put some sane contingencies in place like "don't connect the self-improving AI to the Internet", "make sure it has an easily accessible off switch", or "make sure you can unplug it"?

I'm pretty sure Arthur C. Clarke saw this one coming. In 2010, the astronauts re-activating HAL fitted him with completely mechanical kill-switches designed to cut all power to the AI at a millisecond's notice.

Effectronica
May 31, 2011
Fallen Rib

Bargearse posted:

I'm pretty sure Arthur C. Clarke saw this one coming. In 2010, the astronauts re-activating HAL fitted him with completely mechanical kill-switches designed to cut all power to the AI at a millisecond's notice.

Triskelli
Sep 27, 2011

I AM A SKELETON
WITH VERY HIGH
STANDARDS


...Can we help you?

Effectronica
May 31, 2011
Fallen Rib

Triskelli posted:

...Can we help you?

Ratoslov
Feb 15, 2012

Now prepare yourselves! You're the guests of honor at the Greatest Kung Fu Cannibal BBQ Ever!

Triskelli posted:

...Can we help you?

Actually, I'm kind of okay with this schtick.

Effectronica
May 31, 2011
Fallen Rib

Ratoslov posted:

Actually, I'm kind of okay with this schtick.

Toph Bei Fong
Feb 29, 2008



Lottery of Babylon posted:

Specifically, he played two games against his cultists and won them, then bragged about his amazing victory. This made people outside his cult notice his challenge; three accepted, and Yudkowsky lost against all three of them. Giving up immediately in the face of adversity, Yudkowsky then wrote a triumphant blog post about how awesome the experiment was and how he's great because he never gives up in the face of adversity (a uniquely Japanese virtue).

What this means is that the only thing Yudkowsky has ever experimentally proven is that his own followers are more likely to choose to unleash an evil AI upon the world than an outsider is. So even if you agree with every single one of Yudkowsky's beliefs and arguments and accept everything he says at face value, you should still conclude that Yudkowsky's writings are harmful and MIRI should be destroyed.

"It is difficult to get a man to understand something, when his salary depends on his not understanding it."

Political Whores
Feb 13, 2012

Either it's a raid by FYAD, or a posting bot has gained sentience and is trying to communicate by posting the content of Effectronica's anime wank folder.

BHB
Aug 28, 2011

Political Whores posted:

Either it's a raid by FYAD, or a posting bot has gained sentience and is trying to communicate by posting the content of Effectronica's anime wank folder.


Political Whores
Feb 13, 2012


Hot

Zoq-Fot-Pik
Jun 27, 2008

Frungy!

Political Whores posted:

Either it's a raid by FYAD, or a posting bot has gained sentience and is trying to communicate by posting the content of Effectronica's anime wank folder.

Triskelli
Sep 27, 2011

I AM A SKELETON
WITH VERY HIGH
STANDARDS


Political Whores posted:

Either it's a raid by FYAD, or a posting bot has gained sentience and is trying to communicate by posting the content of Effectronica's anime wank folder.

Yeah, considering they've hit the PYF cat thread it's probably just a raid. They'll get bored soon enough.

ArfJason
Sep 5, 2011
is this the mock thread? I just want to say, long time reader, first time contributor.
I was playing my videogames the other day, and noticed that Mother 1's (Earthbound Zero if you're a dumbshit asshole who probably gets his clothes from one of those gatherings of Bolivians that put their stands next to each other under a gigantic circus tent) overworld map is enormous, but has pretty much nothing in it. What kind of idiot designer was put in charge? I mean, just look at this

ArfJason fucked around with this message at 16:59 on Oct 31, 2014

Srice
Sep 11, 2011

Triskelli posted:

Yeah, considering they've hit the PYF cat thread it's probably just a raid. They'll get bored soon enough.

Sounds like someone doesn't keep up to date on the rules. Shameful as heck.

Spanish Manlove
Aug 31, 2008

HAILGAYSATAN
Goddamn that's too big for PYF, but I know another thread that may appreciate it

Zoq-Fot-Pik
Jun 27, 2008

Frungy!

Triskelli posted:

Yeah, considering they've hit the PYF cat thread it's probably just a raid. They'll get bored soon enough.

Triskelli
Sep 27, 2011

I AM A SKELETON
WITH VERY HIGH
STANDARDS


Srice posted:

Sounds like someone doesn't keep up to date on the rules. Shameful as heck.

Well shit, I didn't know we were passing out anime like fucking Halloween candy. Egg on my face for thinking it was just one asshole.

BHB
Aug 28, 2011

Triskelli posted:

Well shit, I didn't know we were passing out anime like fucking Halloween candy. Egg on my face for thinking it was just one asshole.



Djeser
Mar 22, 2013


it's crow time again

on a scale of yudkowsky to nite crew, how ironic is this anime we're talking about

i know gentlegoons always temper their anime with irony

Seshoho Cian
Jul 26, 2010

Djeser posted:

on a scale of yudkowsky to nite crew, how ironic is this anime we're talking about

i know gentlegoons always temper their anime with irony

BHB
Aug 28, 2011

Djeser posted:

on a scale of yudkowsky to nite crew, how ironic is this anime we're talking about

i know gentlegoons always temper their anime with irony


Srice
Sep 11, 2011

Djeser posted:

on a scale of yudkowsky to nite crew, how ironic is this anime we're talking about

i know gentlegoons always temper their anime with irony

nite crew is not ironic about enjoying anime, friend
