|
Broccoli Cat posted:
yes, the OP touches upon the Progressive posture. Thank you for exemplifying in such illuminating style.

Hm... I don't know what the Progressive posture is so I looked at the OP.

Broccoli Cat posted (the OP):
Years back, Google had a contest called Solve For X, wherein douchebags would submit whatever they saw as a problem and offer their solution, most often telling the folks at Google that problems like lack of diversity and global warming could be solved by Socialism.

Could someone enlighten me? Googling got me Youtube videos on improving my physical posture. WrenP-Complete fucked around with this message at 18:11 on May 29, 2017 |
# ? May 29, 2017 17:49 |
|
|
# ? Jun 10, 2024 16:45 |
|
WrenP-Complete posted:What would that be like?

So on FB I've been talking about AlphaGo all week; also noteworthy is QANTA, the AI that just beat a team of National Quiz Bowl champions for the first time. The consensus is that the next big game to fall will be Starcraft, in the next 5-10 years. The far-future challenge taken seriously today is soccer; the hardware challenge is easily 20 years away. There will undoubtedly be a great head-holding image of this to add to the set.

But the tech world has given up on the Turing test, on creating AI as conversational social participants. Partly because of the Tay.ai debacle, no one is now willing to come forward with a convincing bot. Facebook just open-sourced its chatbot software, which it uses for automated customer service chat. But this isn't "conversation" in any deep sense. People have no idea how to formalize the constraints on "conversation". Hence, everyone gives up on the Turing Test.

The problem with the Turing Test demanding "conversational language use" is that most language use isn't conversational. The exchange I have with tech support or a cashier is rote by design and can be pretty easily automated without the pretense of some deeper intelligence. Having a conversation means establishing who you are talking to, and the interests each party has in engaging in it. Conversation isn't just about issuing appropriate responses; it is about moving the discussion from one issue to another, doubling back to reinforce or clarify issues as they arise, and appreciating the perspectives and goals the interlocutor has for having the conversation at all. Typically, sitting two strangers down together and expecting them to talk doesn't go very well. You need to give context and motivation. In other words, to build a convincing chat bot, you need a bot with not just a personality but a history and a project it cares about. 
If you're having a conversation with, say, a person waiting in a hospital while their partner undergoes life-threatening surgery, then that gives some focus to the topic and energy of the conversation. The question turns from "is this machine thinking?" to "is this a convincing conversation from someone in a hospital waiting room?" The former standards are obscure to the point of meaninglessness, but the latter will be much easier to express and identify.

People treat Turing's test like an interrogation session, and it is pretty difficult to hold a conversation while being interrogated. If I just stop some rando on the street and try to talk to them about some random topic, even an intelligent person with things to say on the topic might stammer around for a bit without preparation and context. A good conversation needs setting and context; this is something we humans often take for granted, but it's not fair to deprive the machines of that same benefit. On that note, the chatbot Eugene beat the Turing test a few years ago by pretending to be a young, foreign, non-native speaker, thereby setting context for its awkward mistakes and lack of fluency. A lot of people thought this was "cheating". I'm saying that establishing some social context and persona is a necessary part of any conversation.

I have been thinking about a kind of "embedded Turing Test", where the participants must converse about a scenario that plays out (like a murder mystery dinner party), and where they are each given a role and motivation, and then must engage each other from that role. Play through the scenario entirely, let anyone talk to anyone for five minutes, then see which of the roles were manned by bots. Maybe give the bot its role information 12 hours before the scenario starts, to give it time to train the network on sample text to help it play the role. 
Call it the Turing Grand Challenge. I don't think we're too far off from something that could convincingly pass a task like this, although it isn't easy.

Challenge 1: All players coordinate on a single scenario and are assigned specific roles.
Challenge 2: Players coordinate in groups on different set scenarios/roles that intersect in undetermined ways; players must improvise their roles at intersections.
Challenge 3: Players improvise roles entirely within a set scenario.
Challenge 4: Players improvise roles and scenarios.

Worth stopping here to remind us that the current Turing Tests (the Loebner Prize) are only about the bot forming a grammatically correct sentence in response to the input sentence. Nothing about setting roles or scenarios. It seems to me that only by progressing through these challenges can we get to the point where a conversation is possible. But we can continue grading the challenges:

Challenge 5 (true conversation): Players with improvised roles and scenarios *explore some topic deeply*.

These different challenges are meant to escalate the difficulty, starting with very contrived situations and moving to more open interactions. Each level requires the machine to engage in more complex ways of establishing itself as a social participant. We already have AI that can do very well in highly contrived situations; AlphaGo is an example of extreme intelligence when restricted to just the domain of Go. Engaging in conversation is not just mastering "language use", but also mastering "being a person" well enough to have something interesting to say and enough social presence to say it and have people listen. Eripsa fucked around with this message at 18:41 on May 29, 2017 |
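For concreteness, here's a rough Python sketch of what the Challenge 1 setup might look like as a data structure. Every class, name, and parameter here is hypothetical, invented only to illustrate the role/scenario/five-minute-round protocol described in the post, not any existing system.

```python
# Hypothetical sketch of the "embedded Turing Test": every participant
# (human or bot) is assigned a role and motivation inside a shared scenario,
# pairs converse in five-minute rounds, and judges then guess who was a bot.
import itertools
import random
from dataclasses import dataclass, field

@dataclass
class Role:
    name: str
    motivation: str          # e.g. "find out who stole the necklace"
    briefing_hours: int = 12 # lead time a bot gets to train on sample text

@dataclass
class Scenario:
    title: str
    roles: list
    participants: dict = field(default_factory=dict)  # role name -> "human"/"bot"

    def assign(self, humans, bots):
        # randomly map humans and bots onto the scenario's roles
        players = [("human", h) for h in humans] + [("bot", b) for b in bots]
        random.shuffle(players)
        for role, (kind, _player) in zip(self.roles, players):
            self.participants[role.name] = kind

    def rounds(self, minutes=5):
        # every pair of roles talks for `minutes`; judges observe each exchange
        return [(a.name, b.name, minutes)
                for a, b in itertools.combinations(self.roles, 2)]

scenario = Scenario("Murder mystery dinner party", [
    Role("Butler", "deflect suspicion from the host"),
    Role("Detective", "identify the culprit"),
    Role("Heiress", "protect the family name"),
])
scenario.assign(humans=["alice", "bob"], bots=["role-trained-bot"])
print(len(scenario.rounds()))  # 3 pairwise five-minute conversations
```

The point of the sketch is just that the judging criterion changes: you score each role's believability within the scenario, not free-form "is it thinking?" interrogation.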
# ? May 29, 2017 18:16 |
|
These are the guys that watched Evangelion and thought Gendo was supposed to be a role model
|
# ? May 29, 2017 18:40 |
|
At some point in the not too distant past I realized that the only form of humanism a majority of nerds were interested in engaging with was one that allowed them to disassociate themselves from civic responsibilities. There are plenty of problems we can work on here and now with technology that don't require "20 years of technological progress" type solutions, but they require more than internet posting: they require actually working with other people. These are not those stories
|
# ? May 29, 2017 18:47 |
|
WrenP-Complete posted:
dressing up a reflex craving for Authoritarianism in feelgoodist or lofty language.
|
# ? May 29, 2017 18:54 |
|
Broccoli Cat posted:dressing up a reflex craving for Authoritarianism in feelgoodist or lofty language. So an example is LessWrong's repackaging of Christianity, complete with original sin? Eripsa: hmmmmm. If this is something you want to try, I can take this idea to the other botgoons and see if we can set something up. WrenP-Complete fucked around with this message at 21:00 on May 29, 2017 |
# ? May 29, 2017 19:22 |
|
WrenP-Complete posted:So an example is LessWrong's repackaging of Christianity, complete with original sin?

not familiar with it, but anti-evolutionary Utopism of any sort does pretty much fall into the same category of Authoritarianism, moral/social or otherwise. bottom line is: if you aren't with evolution, you're against it.
|
# ? May 29, 2017 23:03 |
|
Broccoli Cat posted:not familiar with it, but anti-evolutionary Utopism of any sort does pretty much fall into the same category of Authoritarianism, moral/social or otherwise. I'm against delusional thinking, where does that put me?
|
# ? May 29, 2017 23:25 |
|
Evolution is evil. Billions die, through no fault of their own, because that is the way their body has evolved. It's bullshit. The whole point of body modification is that you sidestep evolution completely, and end up intelligently designing people. Your speech about technology being individualistic is wrong as hell too - technology, especially today's technology, is complicated, and is the product of human societies. Transhumanism isn't going to make you more individualistic; in fact it's quite the opposite - you're going to be more dependent on society, because your body itself is a product of industrial society. And that's not a bad thing, in itself, it's just a change.
|
# ? May 29, 2017 23:41 |
|
Broccoli Cat posted:not familiar with it, but anti-evolutionary Utopism of any sort does pretty much fall into the same category of Authoritarianism, moral/social or otherwise. https://en.wikipedia.org/wiki/LessWrong http://www.businessinsider.com/what-is-rokos-basilisk-2014-8
|
# ? May 29, 2017 23:43 |
|
Dzhay posted:I mean, you'd obviously have to copy more than just memories, I don't think anyone was disputing that. Exactly what you'd need is an interesting question, but this tech is probably centuries away, so let's not sweat the details in a dumb internet argument. Yeah, while I think that technology allowing humans to "upload" their minds is likely technically possible, it definitely won't happen during our lifetimes. It basically represents a leap in technology so big you might as well be talking about creating a galactic empire or something.
|
# ? May 29, 2017 23:46 |
|
rudatron posted:Evolution is evil. Evolution is a natural process with no intent behind it. It's no more evil than a tree stump.
|
# ? May 29, 2017 23:49 |
|
So when do I get one of those Sorayama robots to gently caress?
|
# ? May 29, 2017 23:50 |
|
Eripsa posted:"Uploading oneself" ultimately amounts to digitizing all the control structures you identify with. But this is tricky. A good chunk of my brain is dedicated to moving my fingers on a keyboard. If I'm digitizing myself, I won't have fingers any more.

This is a good point that many "upload your brain to a computer" people don't consider, and one I forgot to mention. A lot of our mind (and thus "who we are") is dependent upon interfacing with our body/biology. I feel like uploading a human mind to a computer would require also changing it to an extent where you might no longer be considered "you" anymore.
|
# ? May 29, 2017 23:53 |
|
I imagine you'd have to fake a lot of it, for a long time. *in computer mainframe voice* Yeah man, you totally still have a stomach + heart + poo poo, better keep sending those signals.
|
# ? May 29, 2017 23:57 |
|
Who What Now posted:I'm against delusional thinking, where does that put me? yesterday's delusional thinking is tomorrow's occipital lobe implant, so you may be in line to make out blurry shapes. Ytlaya posted:This is a good point I forgot to mention that many "upload your brain to a computer" people don't consider. A lot of our mind (and thus "who we are") is dependent upon interfacing with our body/biology. I feel like uploading a human mind to a computer would require also changing it to an extent where you might no longer be considered "you" anymore. this is why the option to die like an animal must always be open to folks who would choose a traditional end.
|
# ? May 30, 2017 00:06 |
|
WrenP-Complete posted:https://en.wikipedia.org/wiki/LessWrong so it's like somethingawful?
|
# ? May 30, 2017 00:10 |
|
Broccoli Cat posted:yesterday's delusional thinking is tomorrow's occipital lobe implant, so you may be in line to make out blurry shapes. Whether you die like an animal or die like a machine you're still dead. What difference does it make if what killed you was cancer or dust clogging your fans and causing you to overheat?
|
# ? May 30, 2017 00:12 |
|
Who What Now posted:Whether you die like an animal or die like a machine you're still dead. What difference does it make if what killed you was cancer or dust clogging your fans and causing you to overheat? because the machine exists as an omnipotent virtual god and dies billions of years after the animal, so the machine wins, existentially.
|
# ? May 30, 2017 00:31 |
|
No it doesn't. I'm going to die millions of years after any dinosaurs, but I didn't "win" over them. Regardless of when you die whatever existential goals you were trying to accomplish become meaningless. For all your supposed omnipotence you will still end up as impotent as every animal that ever lived.
|
# ? May 30, 2017 00:37 |
|
Jazu posted:If AI : human :: human : chimpanzee, we won't be able to create it any more than a chimpanzee could design humans in an engineering project. The idea there is that someone would eventually create an AI capable of designing a slightly better AI. Once we have that, the AI's capability would advance exponentially until it reaches a level equal to human intelligence, then quickly keep improving from there.
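The "AI designing a slightly better AI" loop above can be caricatured in a few lines. To be clear, the improvement factor and the "human-level" threshold below are arbitrary numbers chosen for illustration, not claims about any real system; the sketch only shows why compounding small improvements closes a large gap quickly.

```python
# Toy caricature of recursive self-improvement: each generation designs a
# successor marginally better than itself, so capability compounds like
# interest. The 1.05 factor and 100x "human-level" gap are made-up numbers.
def generations_to_surpass(start=1.0, human_level=100.0, improvement=1.05):
    capability, generations = start, 0
    while capability < human_level:
        capability *= improvement   # each AI builds a slightly better AI
        generations += 1
    return generations

# Even a 5% gain per generation crosses a 100x gap in under a hundred steps.
print(generations_to_surpass())
```

Whether real AI research ever behaves like a fixed multiplicative gain is exactly the part of the argument the sketch assumes rather than demonstrates.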
|
# ? May 30, 2017 00:46 |
|
Broccoli Cat posted:because the machine exists as an omnipotent virtual god and dies billions of years after the animal, so the machine wins, existentially. Life is not a game to be won.
|
# ? May 30, 2017 00:54 |
|
Broccoli Cat posted:this is why the option to die like an animal must always be open to folks who would choose a traditional end. No, I'm saying this creates a huge barrier to the very idea of "uploading our minds into a computer". Like, I don't even know if it's really possible to upload anything that can still be considered "who we are" to a machine without also simulating the rest of the body. Like, the issue isn't just that it'd be uncomfortable, but that there are huge swathes of the brain devoted to the functioning of our bodies, and any sort of "brain upload" is going to have to figure out what to do with that.
|
# ? May 30, 2017 00:55 |
|
It makes sense that symbiosis of man and machine is necessary simply because of how totally our brains are outclassed by computers. No, not in terms of intelligence— not yet, anyway. Let's say you have a computer that's as intelligent as a human. It has a body through which it can experience the world, and it's now at the level of the Average Internet User. How screwed are we? Very screwed, because even at this level, the AI is already superior to all humans. And it's all because of the speed of thought. The fastest human neural signal is believed to be about 268 miles per hour, to and from an alpha motor neuron in the spinal cord. That is pretty fast, especially considering our compact bodies— but in the grand scheme of things, it's slower than a dead sloth. Photonic computers use light to process information— need I remind you that light travels at 670,616,629 mph. 670,616,629 / 268 = 2,502,300.85. Photonic computers will be capable of 'thinking' two and a half million times faster than a human. And note, that's not just the Average Internet Using Human— it doesn't matter if you're comparing the neural signals of the world's most shockingly retarded person in history or the fusion of William Sidis and Leonhard Euler. Our fastest brain signal is 268 miles per hour. That's not changing. Not unless we were to augment ourselves.
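For what it's worth, the arithmetic above checks out. A throwaway sketch verifying the ratio, using the post's own figures (the mph numbers are the ones quoted in the post, not independently sourced):

```python
# Verifying the quoted ratio: the speed of light in mph divided by the
# post's figure for the fastest human neural signal (~268 mph, alpha
# motor neuron). Both constants come from the post above.
light_mph = 670_616_629
neural_mph = 268

ratio = light_mph / neural_mph
print(round(ratio, 2))  # 2502300.85, i.e. ~2.5 million times faster
```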
|
# ? May 30, 2017 00:58 |
|
Who What Now posted:No it doesn't. I'm going to die millions of years after any dinosaurs, but I didn't "win" over them. Regardless of when you die whatever existential goals you were trying to accomplish become meaningless. For all your supposed omnipotence you will still end up as impotent as every animal that ever lived. I think they mean billions of years longer, not after. So when their
|
# ? May 30, 2017 00:59 |
|
shame on an IGA posted:These are the guys that watched There's a reason being a Transhumanist is a prerequisite to becoming a Dark Enlightenment fuckwit.
|
# ? May 30, 2017 01:04 |
|
I consider this sort of trans-humanism to be kind of similar to discussions about colonizing other solar systems. It's something that may be possible someday, but would require such a monumental amount of technological progress that it absolutely won't be in our lifetimes and isn't really worth discussing in any way beyond just a thought exercise (because there are countless intermediate steps we still need to figure out to make such technology possible in the first place). Augmenting people with technology in other ways is different, because we can already do that to some degree, but the idea of taking the human brain and uploading it to a computer (or somehow allowing it to directly interface with a computer with any level of complexity) isn't even on the horizon.
|
# ? May 30, 2017 01:05 |
|
Kerning Chameleon posted:There's a reason being a Transhumanist is a prerequisite to becoming a Dark Enlightenment fuckwit.
|
# ? May 30, 2017 01:50 |
|
Ytlaya posted:I consider this sort of trans-humanism to be kind of similar to discussions about colonizing other solar systems. It's something that may be possible someday, but would require such a monumental amount of technological progress that it absolutely won't be in our lifetimes and isn't really worth discussing in any way beyond just a thought exercise (because there are countless intermediate steps we still need to figure out to make such technology possible in the first place).

It's not similar. We can prove we won't colonize other solar systems in 10 years. We can't prove we won't upload our minds to the internet in that time frame. No. That's it. There is no more serious discussion to be had on the subject. I feel like we can't upload our brains to computers before we fully understand our brains and therefore have already built fully functional AI, but to honestly think that means assuming I know anything about how future mind uploading works, which would be really stupid.
|
# ? May 30, 2017 01:51 |
|
Josef bugman posted:Life is not a game to be won. Life is for plants and animals. Evolution is a game to be won. Ytlaya posted:No, I'm saying this creates a huge barrier to the very idea of "uploading our minds into a computer". Like, I don't even know if it's really possible to upload anything that can still be considered "who we are" to a machine without also simulating the rest of the body. Like, the issue isn't just that it'd be uncomfortable, but that there are huge swathes of the brain devoted to the functioning of our bodies, and any sort of "brain upload" is going to have to figure out what to do with that. "you" are only a by-product of a fraction of your brain's processes..."you" are a by-product of finite resources, as is your sensory input which creates awareness of your body and "self".
|
# ? May 30, 2017 03:23 |
|
Broccoli Cat posted:Life is for plants and animals. Evolution is a game to be won. No it isn't. Evolution is a process with no inherent goals.
|
# ? May 30, 2017 03:39 |
|
Who What Now posted:No it isn't. Evolution is a process with no inherent goals. delaying nonexistence.
|
# ? May 30, 2017 03:46 |
|
If that were the case we'd all be waterbears or maybe immortal naked mole rats.
|
# ? May 30, 2017 03:50 |
|
Waterbears pictures please.
|
# ? May 30, 2017 03:51 |
|
Isn't it cute? And it can survive all but the harshest environments!
|
# ? May 30, 2017 03:54 |
|
Who What Now posted:
Thank you; I love them.
|
# ? May 30, 2017 03:55 |
|
Mercrom posted:It's not similar. We can prove we won't colonize other solar systems in 10 years. We can't prove we won't upload our minds to the internet in that time frame. sorry, friend, but this is the closest you're ever going to get to becoming a machine consciousness:
|
# ? May 30, 2017 04:34 |
|
A big flaming stink posted:sorry, friend, but this is the closest you're ever going to get to becoming a machine consciousness: literally dont see an issue with this
|
# ? May 30, 2017 04:47 |
|
Water bears are ugly & stupid, and if they all died, no one would miss them. Mostly because you can't see them. Look, someone had to say it, okay.
|
# ? May 30, 2017 05:55 |
|
|
|
shame on an IGA posted:These are the guys that watched Evangelion and thought Gendo was supposed to be a role model

He went from being a bad student who kept getting into fights to leading the top research and military division in the world, and got like, a hundred or so lady friends. I mean sure he ended up getting his head bitten off by his ex and murdering the entire world but hey, you win some you lose some.
|
# ? May 30, 2017 07:05 |