 
  • Locked thread
WrenP-Complete
Jul 27, 2012

Broccoli Cat posted:

yes, the OP touches upon the Progressive posture. Thank you for exemplifying in such illuminating style.


Eripsa posted:


The OP is echoing eschatological fears promulgated in popular media because that's what the propaganda is designed to do. It keeps people ignorant, afraid, and compliant.

I'm trying to explain why a lot of these fears are misguided by talking seriously about the neuroscience and its philosophical implications. I'm trying to clear the fog of public ignorance surrounding these debates. I'm sorry you can't see the difference.

Hm... I don't know what the Progressive posture is so I looked at the OP.
OP:

Broccoli Cat posted:

Years back, Google had a contest called Solve For X, wherein douchebags would submit whatever they saw as a problem and offer their solution, most often telling the folks at Google that problems like lack of diversity and global warming could be solved by Socialism.

I was one of those douchebags, but my problem was "all problems faced by humanity" and my solution was "create a solid-state environment for consciousness".

so, this thread is for people who want to have an afterlife as the god of a virtual universe, much like the Matrix, except without the ridiculous aspects having to do with the ape-perspective that bodies are important and people will always be stupid. Or like Transcendence, but the sequel, where everybody gets to be a digital genius god.

The Earth will one day be post-human, with us or without us.

We must become hyperintelligent, and we must become machines.

Our inevitable evolution is hated by the Right, because the Bible never mentions computers and magic jesus souls are somehow integral with brain cells, and it is hated by the Left, because virtual godhood is the ultimate in autonomy and individualism...also nobody will need social justice, or anything else for that matter...except probably sunlight for power.

I'm too lazy to post articles, but there are many... people... stupid people... talking about how scary and dangerous this stuff is, and worrying about how the rich will get to be hyperintelligent gods and somehow still be petty, small-minded, greedy apes who won't want poor people to evolve.

Elon Musk is making a cerebral hairnet which may enable us to take steps toward fixing our greatest problem: our animal nature as desperate dying apes.

...or he's full of poo poo, but has a cool idea.
:iiam:

Could someone enlighten me? Googling got me Youtube videos on improving my physical posture.

WrenP-Complete fucked around with this message at 18:11 on May 29, 2017


Eripsa
Jan 13, 2002

Proud future citizen of Pitcairn.

Pitcairn is the perfect place for me to set up my utopia!

WrenP-Complete posted:

What would that be like?

So on FB I've been talking about AlphaGo all week; also noteworthy is QANTA, the AI that just beat a team of National Quiz Bowl champions for the first time. The consensus is that the next big game to fall will be Starcraft, in the next 5-10 years. The far-future challenge taken seriously today is soccer. The hardware challenge is easily 20 years away. There will undoubtedly be a great head-holding image of this to add to the set.




But the tech world has largely given up on the Turing test and on creating AIs as social, conversing participants. Partly because of the Tay.ai debacle, no one is now willing to come forward with a convincing bot. Facebook just open-sourced its chatbot software, which it uses for automated customer service chat. But this isn't "conversation" in any deep sense. People have no idea how to formalize the constraints on "conversation". Hence, everyone gives up on the Turing Test.

The problem with the Turing Test demanding "conversational language use" is that most language use isn't conversational. The exchange I have with tech support or a cashier is rote by design and can be pretty easily automated without the pretense of some deeper intelligence. Having a conversation means establishing who you are talking to, and the interests each party has in engaging in the conversation. Conversation isn't just about issuing appropriate responses; it is about moving the discussion from one issue to another, doubling back to reinforce or clarify issues as they come up, and appreciating the perspectives and goals the interlocutor has for having the conversation at all. Typically, sitting two strangers down together and expecting them to talk doesn't go very well. You need to give context and motivation.

In other words, to build a convincing chat bot, you need a bot with not just a personality but a history and a project it cares about. If you're having a conversation with, say, a person waiting in a hospital while their partner undergoes life-threatening surgery, that gives some focus to the topic and energy of the conversation. The question turns from "is this machine thinking?" to "is this a convincing conversation from someone in a hospital waiting room?" The former standard is obscure to the point of meaninglessness, but the latter standard will be much easier to express and identify.

People treat Turing's test like an interrogation session. It is pretty difficult to hold a conversation while being interrogated. If I just stop some rando on the street and try to talk to them about some rando topic, even an intelligent person with things to say on the topic might stammer around for a bit without preparation and context. A good conversation needs setting and context; this is something we humans often take for granted, but it's not fair to deprive the machines of that same benefit. On that note, the chatbot Eugene beat the Turing test a few years ago by pretending to be a young, foreign, non-native speaker, thereby setting a context for its awkward mistakes and lack of fluency. A lot of people thought this was "cheating". I'm saying that establishing some social context and persona is a necessary part of any conversation.

I have been thinking about a kind of "embedded Turing Test", where the participants must converse about a scenario that plays out (like a murder mystery dinner party), and where they are each given a role and motivation, and then must engage each other from that role. Play through the scenario entirely, let anyone talk to anyone for five minutes, then see which of the roles were manned by bots. Maybe do something like give the bot its role information 12 hours before the scenario starts, to give it time to train the network with sample text to help achieve the role. Call it the Turing Grand Challenge. I don't think we're too far off from something that could convincingly pass a task like this, although it isn't easy.

Challenge 1: All players coordinate on a single scenario and are assigned specific roles.
Challenge 2: Players coordinate in groups on different set scenarios/roles that intersect in undetermined ways; players must improvise their roles at the intersections.
Challenge 3: Players improvise roles entirely within a set scenario.
Challenge 4: Players improvise roles and scenarios.

Worth stopping here to remind ourselves that the current Turing Test competitions (the Loebner Prize) only ask the bot to form a grammatically correct sentence in response to the input sentence. Nothing about setting roles or scenarios. It seems to me that only by progressing through these challenges can we get to the point where a conversation is possible. But we can keep grading the challenges:

Challenge 5 (true conversation): Players with improvised roles and scenarios *explore some topic deeply*

These different challenges are meant to escalate the difficulty, starting with very contrived situations and moving to more open interactions. Each level requires the machine to engage more complex ways of establishing itself as a social participant. We already have AI that can do very well in highly contrived situations. AlphaGo is an example of extreme intelligence when restricted to just the domain of Go. Engaging in conversation is not just mastering "language use", but also mastering "being a person" well enough to have something interesting to say and enough social presence to say it and have people listen.
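
If someone actually wanted to prototype Challenge 1, the harness itself is simple; the hard part is the bot. Here's a rough sketch of what I have in mind (everything here is made up for illustration, and reply() is just a stub where a human or a role-primed chatbot would plug in):

code:
import random
from dataclasses import dataclass

@dataclass
class Role:
    name: str
    motivation: str

@dataclass
class Participant:
    role: Role
    is_bot: bool

    def reply(self, transcript):
        # Stand-in for a real human or a role-primed chatbot; per the proposal
        # above, a bot would get its role info ~12 hours ahead to train on.
        return f"[{self.role.name}] says something in character about {self.role.motivation}"

def run_round(participants, turns=20):
    # Let everyone talk within the shared scenario for a fixed window.
    transcript = []
    for _ in range(turns):
        speaker = random.choice(participants)
        transcript.append(speaker.reply(transcript))
    return transcript

def score_judges(participants, guesses):
    # Judges guess which roles were manned by bots; score their accuracy.
    correct = sum(guesses[p.role.name] == p.is_bot for p in participants)
    return correct / len(participants)

# Challenge 1: a single fixed scenario with assigned roles and motivations.
scenario = [
    Participant(Role("detective", "find out who poisoned the host"), is_bot=False),
    Participant(Role("butler", "keep the missing will hidden"), is_bot=True),
    Participant(Role("heiress", "protect the family name"), is_bot=False),
]
transcript = run_round(scenario)
print(score_judges(scenario, {"detective": False, "butler": True, "heiress": False}))

All the interesting engineering lives inside reply(); the scoring and the scenario plumbing are trivial by comparison.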

Eripsa fucked around with this message at 18:41 on May 29, 2017

shame on an IGA
Apr 8, 2005

These are the guys that watched Evangelion and thought Gendo was supposed to be a role model

RuanGacho
Jun 20, 2002

"You're gunna break it!"

At some point in the not too distant past I realized that the only form of humanism a majority of nerds were interested in engaging with was one that allowed them to dissociate themselves from civic responsibilities. There are plenty of problems we can work on here and now with technology that don't require "20 years of technological progress" type solutions, but they require more than internet posting: they require actually working with other people.

These are not those stories :doink:

Broccoli Cat
Mar 8, 2013

"so, am I right in understanding that you're a bigot or aficionado of racist humor?




STAR CITIZEN is for WHITES ONLY!




:lesnick:

WrenP-Complete posted:


Could someone enlighten me? Googling got me Youtube videos on improving my physical posture.


dressing up a reflex craving for Authoritarianism in feelgoodist or lofty language.

WrenP-Complete
Jul 27, 2012

Broccoli Cat posted:

dressing up a reflex craving for Authoritarianism in feelgoodist or lofty language.

So an example is LessWrong's repackaging of Christianity, complete with original sin?

Eripsa: hmmmmm. If this is something you want to try, I can take this idea to the other botgoons and see if we can set something up.

WrenP-Complete fucked around with this message at 21:00 on May 29, 2017

Broccoli Cat
Mar 8, 2013

"so, am I right in understanding that you're a bigot or aficionado of racist humor?




STAR CITIZEN is for WHITES ONLY!




:lesnick:

WrenP-Complete posted:

So an example is LessWrong's repackaging of Christianity, complete with original sin?



not familiar with it, but anti-evolutionary Utopism of any sort does pretty much fall into the same category of Authoritarianism, moral/social or otherwise.

bottom line is; if you aren't with evolution, you're against it.

Who What Now
Sep 10, 2006

by Azathoth

Broccoli Cat posted:

not familiar with it, but anti-evolutionary Utopism of any sort does pretty much fall into the same category of Authoritarianism, moral/social or otherwise.

bottom line is; if you aren't with evolution, you're against it.

I'm against delusional thinking, where does that put me?

rudatron
May 31, 2011

by Fluffdaddy
Evolution is evil. Billions die, through no fault of their own, because that is the way their body has evolved. It's bullshit. The whole point of body modification is that you sidestep evolution completely, and end up intelligently designing people.

Your speech about technology being individualistic is wrong as hell too - technology, especially today's technology, is complicated, and is the product of human societies. Transhumanism isn't going to make you more individualistic, in fact it's quite the opposite - you're going to be more dependent on society, because your body itself is a product of industrial society. And that's not a bad thing in itself, it's just a change.

WrenP-Complete
Jul 27, 2012

Broccoli Cat posted:

not familiar with it, but anti-evolutionary Utopism of any sort does pretty much fall into the same category of Authoritarianism, moral/social or otherwise.

bottom line is; if you aren't with evolution, you're against it.

https://en.wikipedia.org/wiki/LessWrong
http://www.businessinsider.com/what-is-rokos-basilisk-2014-8

Ytlaya
Nov 13, 2005

Dzhay posted:

I mean, you'd obviously have to copy more than just memories, I don't think anyone was disputing that. Exactly what you'd need is an interesting question, but this tech is probably centuries away, so let's not sweat the details in a dumb internet argument.

Yeah, while I think that technology allowing humans to "upload" their minds is likely technically possible, it definitely won't happen during our lifetimes. It basically represents a leap in technology so big you might as well be talking about creating a galactic empire or something.

Who What Now
Sep 10, 2006

by Azathoth

rudatron posted:

Evolution is evil.

Evolution is a natural process with no intent behind it. It's no more evil than a tree stump.

Lightning Lord
Feb 21, 2013

$200 a day, plus expenses

So when do I get one of those Sorayama robots to gently caress?

Ytlaya
Nov 13, 2005

Eripsa posted:

"Uploading oneself" ultimately amounts to digitizing all the control structures you identify with. But this is tricky. A good chunk of my brain is dedicated to moving my fingers on a keyboard. If I'm digitizing myself, I won't have fingers any more.

This is a good point I forgot to mention that many "upload your brain to a computer" people don't consider. A lot of our mind (and thus "who we are") is dependent upon interfacing with our body/biology. I feel like uploading a human mind to a computer would require also changing it to an extent where you might no longer be considered "you" anymore.

rudatron
May 31, 2011

by Fluffdaddy
I imagine you'd have to fake a lot of it, for a long time.

*in computer mainframe voice* Yeah man, you totally still have a stomach + heart + poo poo, better keep sending those signals.

Broccoli Cat
Mar 8, 2013

"so, am I right in understanding that you're a bigot or aficionado of racist humor?




STAR CITIZEN is for WHITES ONLY!




:lesnick:

Who What Now posted:

I'm against delusional thinking, where does that put me?


yesterday's delusional thinking is tomorrow's occipital lobe implant, so you may be in line to make out blurry shapes.

Ytlaya posted:

This is a good point I forgot to mention that many "upload your brain to a computer" people don't consider. A lot of our mind (and thus "who we are") is dependent upon interfacing with our body/biology. I feel like uploading a human mind to a computer would require also changing it to an extent where you might no longer be considered "you" anymore.


this is why the option to die like an animal must always be open to folks who would choose a traditional end.

Broccoli Cat
Mar 8, 2013

"so, am I right in understanding that you're a bigot or aficionado of racist humor?




STAR CITIZEN is for WHITES ONLY!




:lesnick:


so it's like somethingawful?

Who What Now
Sep 10, 2006

by Azathoth

Broccoli Cat posted:

yesterday's delusional thinking is tomorrow's occipital lobe implant, so you may be in line to make out blurry shapes.



this is why the option to die like an animal must always be open to folks who would choose a traditional end.

Whether you die like an animal or die like a machine you're still dead. What difference does it make if what killed you was cancer or dust clogging your fans and causing you to overheat?

Broccoli Cat
Mar 8, 2013

"so, am I right in understanding that you're a bigot or aficionado of racist humor?




STAR CITIZEN is for WHITES ONLY!




:lesnick:

Who What Now posted:

Whether you die like an animal or die like a machine you're still dead. What difference does it make if what killed you was cancer or dust clogging your fans and causing you to overheat?


because the machine exists as an omnipotent virtual god and dies billions of years after the animal, so the machine wins, existentially.

Who What Now
Sep 10, 2006

by Azathoth
No it doesn't. I'm going to die millions of years after any dinosaurs, but I didn't "win" over them. Regardless of when you die whatever existential goals you were trying to accomplish become meaningless. For all your supposed omnipotence you will still end up as impotent as every animal that ever lived.

Cockmaster
Feb 24, 2002

Jazu posted:

If AI : human :: human : chimpanzee, we won't be able to create it any more than a chimpanzee could design humans in an engineering project.

The idea there is that someone would eventually create an AI capable of designing a slightly better AI. Once we have that, the AI's capability would advance exponentially until it reaches a level equal to human intelligence, then quickly keep improving from there.
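
To make the compounding explicit, here's a toy model of that argument (the numbers are invented; this is just the shape of the claim, not a prediction):

code:
# Toy model: each AI generation designs a successor a fixed fraction better
# than itself, so capability compounds geometrically once the loop closes.
capability = 0.5            # hypothetical units; 1.0 = "human level"
gain_per_generation = 0.10  # each generation is 10% better than its designer

for generation in range(1, 101):
    capability *= 1 + gain_per_generation
    if capability >= 1.0:
        print(f"hits human level at generation {generation}")  # generation 8 here
        break
# The same loop keeps compounding past that point, which is the whole worry.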

Josef bugman
Nov 17, 2011

Pictured: Poster prepares to celebrate Holy Communion (probablY)

This avatar made possible by a gift from the Religionthread Posters Relief Fund

Broccoli Cat posted:

because the machine exists as an omnipotent virtual god and dies billions of years after the animal, so the machine wins, existentially.

Life is not a game to be won.

Ytlaya
Nov 13, 2005

Broccoli Cat posted:

this is why the option to die like an animal must always be open to folks who would choose a traditional end.

No, I'm saying this creates a huge barrier to the very idea of "uploading our minds into a computer". Like, I don't even know if it's really possible to upload anything that can still be considered "who we are" to a machine without also simulating the rest of the body. Like, the issue isn't just that it'd be uncomfortable, but that there are huge swathes of the brain devoted to the functioning of our bodies, and any sort of "brain upload" is going to have to figure out what to do with that.

Yuli Ban
Nov 22, 2016

Bot
It makes sense that symbiosis of man and machine is necessary simply because of how totally our brains are outclassed by computers.
No, not in terms of intelligence— not yet, anyway.

Let's say you have a computer that's as intelligent as a human. It has a body through which it can experience the world, and it's now at the level of the Average Internet User. How screwed are we? Very screwed, because even at this level, the AI is already superior to all humans. And it's all because of the speed of thought.

The fastest human neural signal is believed to be about 268 miles per hour to and from an alpha motor neuron in the spinal cord. That is pretty fast, especially considering our compact bodies— but in the grand scheme of things, it's slower than a dead sloth. Photonic computers utilize light to process information— need I remind you that light travels at 670,616,629 mph.

670,616,629 / 268 = 2,502,300.85.
Photonic computers will be capable of 'thinking' two and a half million times faster than a human. And note, that's not just the Average Internet Using Human— it doesn't matter if you're comparing the neural signals of the world's most shockingly retarded person in history or the fusion of William Sidis and Leonhard Euler. Our fastest brain signal is 268 miles per hour. That's not changing.

Not unless we were to augment ourselves.
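
(For anyone who wants to check that division, it's a one-liner:)

code:
# Sanity check on the ratio quoted above.
speed_of_light_mph = 670_616_629  # speed of light, miles per hour
fastest_neural_signal_mph = 268   # alpha motor neuron signal, roughly
print(speed_of_light_mph / fastest_neural_signal_mph)  # about 2,502,300.85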

Yak of Wrath
Feb 24, 2011

Keeping It Together

Who What Now posted:

No it doesn't. I'm going to die millions of years after any dinosaurs, but I didn't "win" over them. Regardless of when you die whatever existential goals you were trying to accomplish become meaningless. For all your supposed omnipotence you will still end up as impotent as every animal that ever lived.

I think they mean billions of years longer, not after. So when their soul consciousness is uplifted uploaded into Heaven The Cloud they will exist for essentially forever in a beatific realm free from earthly pain with St Yudchowsky and Father Thiel, unlike those who cling to sinful flesh and its wicked delights.

Kerning Chameleon
Apr 8, 2015

by Cyrano4747

shame on an IGA posted:

These are the guys that watched Evangelion Deus Ex and thought Gendo Bob Page was supposed to be a role model

There's a reason being a Transhumanist is a prerequisite to becoming a Dark Enlightenment fuckwit.

Ytlaya
Nov 13, 2005

I consider this sort of trans-humanism to be kind of similar to discussions about colonizing other solar systems. It's something that may be possible someday, but would require such a monumental amount of technological progress that it absolutely won't be in our lifetimes and isn't really worth discussing in any way beyond just a thought exercise (because there are countless intermediate steps we still need to figure out to make such technology possible in the first place).

Augmenting people with technology in other ways is different, because we can already do that to some degree, but the idea of taking the human brain and uploading it to a computer (or somehow allowing it to directly interface with a computer with any level of complexity) isn't even on the horizon.

rudatron
May 31, 2011

by Fluffdaddy

Kerning Chameleon posted:

There's a reason being a Transhumanist is a prerequisite to becoming a Dark Enlightenment fuckwit.

Forget the dork endarkenment, they're too anti-social, childish and naive to really see the potential here, or achieve anything in life. If you're using 'technology' as a McGuffin in your personal power fantasy, you're not clued in to how society, industry, or modern technological innovation actually works. Nor are you prepared to see what effect this stuff will have.

Mercrom
Jul 17, 2009

Ytlaya posted:

I consider this sort of trans-humanism to be kind of similar to discussions about colonizing other solar systems. It's something that may be possible someday, but would require such a monumental amount of technological progress that it absolutely won't be in our lifetimes and isn't really worth discussing in any way beyond just a thought exercise (because there are countless intermediate steps we still need to figure out to make such technology possible in the first place).

Augmenting people with technology in other ways is different, because we can already do that to some degree, but the idea of taking the human brain and uploading it to a computer (or somehow allowing it to directly interface with a computer with any level of complexity) isn't even on the horizon.

It's not similar. We can prove we won't colonize other solar systems in 10 years. We can't prove we won't upload our minds to the internet in that time frame.

No. That's it. There is no more serious discussion to be had on the subject. I feel like we can't upload our brains to computers before we fully understand our brains, and therefore before we have already built fully functional AI, but to honestly think that would mean assuming I know anything about how future mind uploading works, which would be really stupid.

Broccoli Cat
Mar 8, 2013

"so, am I right in understanding that you're a bigot or aficionado of racist humor?




STAR CITIZEN is for WHITES ONLY!




:lesnick:

Josef bugman posted:

Life is not a game to be won.

Life is for plants and animals. Evolution is a game to be won.



Ytlaya posted:

No, I'm saying this creates a huge barrier to the very idea of "uploading our minds into a computer". Like, I don't even know if it's really possible to upload anything that can still be considered "who we are" to a machine without also simulating the rest of the body. Like, the issue isn't just that it'd be uncomfortable, but that there are huge swathes of the brain devoted to the functioning of our bodies, and any sort of "brain upload" is going to have to figure out what to do with that.


"you" are only a by-product of a fraction of your brain's processes..."you" are a by-product of finite resources, as is your sensory input which creates awareness of your body and "self".

Who What Now
Sep 10, 2006

by Azathoth

Broccoli Cat posted:

Life is for plants and animals. Evolution is a game to be won.

No it isn't. Evolution is a process with no inherent goals.

Broccoli Cat
Mar 8, 2013

"so, am I right in understanding that you're a bigot or aficionado of racist humor?




STAR CITIZEN is for WHITES ONLY!




:lesnick:

Who What Now posted:

No it isn't. Evolution is a process with no inherent goals.

delaying nonexistence.

Who What Now
Sep 10, 2006

by Azathoth
If that were the case we'd all be waterbears, or maybe immortal naked mole rats.

WrenP-Complete
Jul 27, 2012

Waterbear pictures, please.

:swoon:

Who What Now
Sep 10, 2006

by Azathoth


Isn't it cute? And it's also immune to all but the harshest environments!

WrenP-Complete
Jul 27, 2012

Who What Now posted:



Isn't it cute? And it's also immune to all but the harshest environments!

Thank you; I love them. :neckbeard:

A big flaming stink
Apr 26, 2010

Mercrom posted:

It's not similar. We can prove we won't colonize other solar systems in 10 years. We can't prove we won't upload our minds to the internet in that time frame.

No. That's it. There is no more serious discussion to be had on the subject. I feel like we can't upload our brains to computers before we fully understand our brains, and therefore before we have already built fully functional AI, but to honestly think that would mean assuming I know anything about how future mind uploading works, which would be really stupid.

sorry, friend, but this is the closest you're ever going to get to becoming a machine consciousness:

Mercrom
Jul 17, 2009

A big flaming stink posted:

sorry, friend, but this is the closest you're ever going to get to becoming a machine consciousness:



literally don't see an issue with this

rudatron
May 31, 2011

by Fluffdaddy
Water bears are ugly & stupid, and if they all died, no one would miss them. Mostly because you can't see them.

Look, someone had to say it, okay.


evilmiera
Dec 14, 2009

Status: Ravenously Rambunctious

shame on an IGA posted:

These are the guys that watched Evangelion and thought Gendo was supposed to be a role model

He went from being a bad student who kept getting into fights to leading the top research and military division in the world, and got, like, a hundred or so lady friends. I mean sure, he ended up getting his head bitten off by his ex and murdering the entire world, but hey, you win some, you lose some.

  • Locked thread