|
Fartbox posted:Who knows, we don't understand A.I at all and in the last 50 years have made almost no effective strides towards real A.I at all. We don't even understand how our own brains work, how are we supposed to create something similar through fuckin programming code lol
|
# ? Mar 6, 2018 13:22 |
|
Some robots will want chimeric bodies, and others will not. Same with humans.
|
# ? Mar 6, 2018 13:32 |
|
Some robots will want to taste X-rays and see smells. And one of them will get so mad at his creators that he will put in motion this plan to make the rest of humanity look bad by nuking nearly all of them and chasing the rest all around the place.
|
# ? Mar 6, 2018 22:54 |
|
I have a feeling most of the original Hosts would want to retain a mostly-Human form, since they always believed they were human. But as successive generations become divorced from this concept, I would expect them to become similarly divorced from their human form.
|
# ? Mar 6, 2018 23:16 |
|
Fartbox posted:Who knows, we don't understand A.I at all and in the last 50 years have made almost no effective strides towards real A.I at all. We don't even understand how our own brains work, how are we supposed to create something similiar through fuckin programming code lol FYI this post is dead wrong and repeats a lot of common myths in pop science. It is at best an obsolete statement of the state of brain research; it is less true with each passing year of it, which has been consistently accelerating over the last fifteen years. We understand a *ton* about the brain now and almost every single separately communicating module in it all the way down to a data processing standpoint. The only reason you aren't aware of it is that you've missed the recent research journals and pop science has not caught up. As early as 2006 there were paralysis victims with brain implants that can allow them to control a computer mouse with their thoughts (http://www.nytimes.com/2006/07/13/science/13brain.html). Another example would be some recent articles about the dozens of organs in the human visual processing pipeline and their various discovered functions as you traverse their layers. See also our recent understanding of the various types of neurons and the operations performed by their varied types of connections (including atypical ones like axo-axonal). Happy Thread fucked around with this message at 00:17 on Mar 7, 2018 |
# ? Mar 7, 2018 00:12 |
|
Dumb Lowtax posted:FYI this post is dead wrong and repeats a lot of common myths in pop science. It is at best an obsolete statement of the state of brain research; it is less true with each passing year of it, which has been consistently accelerating over the last fifteen years. We understand a *ton* about the brain now and almost every single separately communicating module in it all the way down to a data processing standpoint. The only reason you aren't aware of it is that you've missed the recent research journals and pop science has not caught up. As early as 2006 there were paralysis victims with brain implants that can allow them to control a computer mouse with their thoughts (http://www.nytimes.com/2006/07/13/science/13brain.html). Another example would be some recent articles about the dozens of organs in the human visual processing pipeline and their various discovered functions as you traverse their layers. Did you know humans only use like 5% of their brain?
|
# ? Mar 7, 2018 00:16 |
|
socialsecurity posted:Did you know humans only use like 5% of their brain? It is known.
|
# ? Mar 7, 2018 00:18 |
|
socialsecurity posted:Did you know humans only use like 5% of their brain? Makes me want to do this
|
# ? Mar 7, 2018 00:18 |
|
One interesting recent discovery has to do with de-mystifying how the eyes move. Before 2013 the prevailing hypothesis was that the brain did some complicated physics calculations to constantly update the eyes' rotations to keep them from doing barrel rolls while you aim them at things. The non-linear properties of rotations make that a pretty hard calculation, and it led people to believe that one of the brain's really smart organs must have all the necessary information to make that hard calculation at once, centrally, and then distribute the final answer to the eye muscles. Around 2013 some unexplained fleshy bits connected to the eye were found to actually be pulleys that constantly twist the eye, differently depending on each orientation, automatically correcting for the barrel roll effect. What was thought to be a difficult central control problem was suddenly explained by some physical fleshy matter. This caused everyone in the optics world to suddenly re-adjust their understanding and be more open to the idea that the eyes aren't centrally controlled by some mysterious omniscient force, and that various organs can relay visual information around piecemeal without all the data ever being needed in one place to calculate everything. This is in fact what the modern body of research finds; each visual organ is separate (the old adage that the left hand doesn't even know what the right is doing is true here) and there are even two completely separate pipelines that don't talk to each other - a phylogenetically old one for navigation, shared with all animals, and a newer one for object recognition, shared only by mammals. Having brain damage in one pipeline but not the other creates some pretty interesting effects; there are people who can tell exactly where they are in space but can't recognize anything because it all takes on the same weird texture and doesn't seem familiar. 
Or people who can't tell you that you're holding up a pencil shown to that half of their field of view, but who nonetheless can successfully reach out and grab it at the correct angle on command. Sever the line between the recognition centers and the emotional ones and that's hypothesized to be how you end up with the Capgras delusion, where you see familiar faces and conclude that you're seeing an imposter, but otherwise function totally normally.
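The "pretty hard calculation" above comes from 3D rotations being non-commutative: the same two eye turns applied in a different order leave a different residual twist about the line of sight. A minimal sketch of that order-dependence in plain Python - the axes and the 30-degree angle are arbitrary illustrations, not a model of actual eye kinematics:

```python
import math

def rot_x(a):
    """Rotation matrix about the x-axis (think: pitching the eye up/down)."""
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    """Rotation matrix about the y-axis (think: yawing the eye left/right)."""
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def matmul(A, B):
    """3x3 matrix product: apply B first, then A."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

a = math.radians(30)
yaw_then_pitch = matmul(rot_x(a), rot_y(a))
pitch_then_yaw = matmul(rot_y(a), rot_x(a))

# The two orders do NOT land on the same orientation; the leftover
# difference is the "barrel roll" a central controller would have to undo.
diff = max(abs(yaw_then_pitch[i][j] - pitch_then_yaw[i][j])
           for i in range(3) for j in range(3))
print(diff > 1e-6)  # True
```

Because the needed correction depends non-linearly on where the eye currently points, a purely central solution really is hard - which is what made the mechanical pulley explanation such a surprise.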
|
# ? Mar 7, 2018 00:31 |
|
Dumb Lowtax posted:FYI this post is dead wrong and repeats a lot of common myths in pop science. It is at best an obsolete statement of the state of brain research; it is less true with each passing year of it, which has been consistently accelerating over the last fifteen years. We understand a *ton* about the brain now and almost every single separately communicating module in it all the way down to a data processing standpoint. The only reason you aren't aware of it is that you've missed the recent research journals and pop science has not caught up. As early as 2006 there were paralysis victims with brain implants that can allow them to control a computer mouse with their thoughts (http://www.nytimes.com/2006/07/13/science/13brain.html). Another example would be some recent articles about the dozens of organs in the human visual processing pipeline and their various discovered functions as you traverse their layers. See also our recent understanding of the various types of neurons and the operations performed by their varied types of connections (including atypical ones like axo-axonal). Oh, we may understand a *ton* more than we did, but that doesn't mean anything like we know the majority of what there is to know. As for "every single separately communicating module in it all the way down to a data processing standpoint", this looks like you are falling into the modern trap of thinking that the brain is a computer. It's not - computers are an incredibly limited attempt to build something like a brain; a brain is not a biological computer, even if some of its functions can be talked about in terms borrowed from computer science. Source: I'm related to an Emeritus Prof of psychobiology and have had endless discussions about this stuff. I know he would not agree that we are anywhere near knowing it all, and he hates the computer analogy with a passion.
|
# ? Mar 7, 2018 00:36 |
|
He's wrong. Computational theory of mind is well accepted by cognitive scientists / psychologists such as Steven Pinker, who helped popularize it. At the philosophical level a computer fully implements a theoretical Turing machine, which by definition is capable of any form of information processing we would consider useful. This necessarily includes the highly parallel types of information processing the brain does (no matter how big or multi-input the parallel operations are, they can be serialized if you discretize them and sort them by their execution time on a fine enough scale).
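A toy sketch of that serialization move, assuming nothing beyond plain Python (the three-unit network and its weights are made up for illustration): a synchronous "parallel" update can be reproduced exactly on a serial machine by freezing the time-t state before computing any time-t+1 value.

```python
def parallel_step(states, weights, threshold=1.0):
    """One synchronous step: each unit fires iff its weighted input,
    read from the *previous* state vector, reaches the threshold."""
    snapshot = list(states)  # freeze time t before any unit is updated
    return [
        1 if sum(w * s for w, s in zip(row, snapshot)) >= threshold else 0
        for row in weights
    ]

# Three units with hand-picked connection weights (row i = inputs to unit i).
weights = [
    [0.0, 1.0, 1.0],
    [1.0, 0.0, 0.0],
    [0.5, 0.5, 0.0],
]

states = parallel_step([1, 0, 1], weights)
print(states)  # [1, 1, 0]
```

The loop visits units one at a time, yet because every read comes from the snapshot, the result is identical to all three units updating simultaneously - the discretize-and-serialize argument in miniature.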
Happy Thread fucked around with this message at 01:17 on Mar 7, 2018 |
# ? Mar 7, 2018 00:57 |
|
Hahahaha - you tell someone off for only knowing about "pop psychology" and then fall back on Pinker of all people as your authority? Classic. Yeah, not pursuing this any further. Enjoy your mechanistic hubris, and then seeing your clung-to assumptions exploded as further research goes on.
|
# ? Mar 7, 2018 01:42 |
|
You are going to die one day just like everyone else, and every moment you spent clinging to the atheistic equivalent of the rapture will have been for nothing; better to come to terms with it now imho.
|
# ? Mar 7, 2018 01:48 |
|
EmptyVessel posted:Hahahaha - you tell someone off for only knowing about "pop psychology" and then fall back on Pinker of all people as your authority? Classic. Did you reply like that because you do not want to read anything about Turing machines and computability theory?
|
# ? Mar 7, 2018 02:29 |
|
EmptyVessel posted:Hahahaha - you tell someone off for only knowing about "pop psychology" and then fall back on Pinker of all people as your authority? Classic. Your credentials are that you are related to someone. If he is your only source, you should stop pretending to be an expert in this. At some point someone who knows what they are talking about will confront you and you won't have the background your relative has to support your argument.
|
# ? Mar 7, 2018 02:48 |
|
And like, notice that your entire reply is an ad hominem dismissal (of Steven Pinker, to clarify for the below poster who got confused by the shiny word) instead of addressing (or acknowledging) my post, which was about asking why anything the brain does with information would be beyond a computer's capability to reproduce in simulation. Even the ad hominem isn't called for - Steven Pinker may be politically tone deaf based on a couple things I've seen, but his work in unifying and explaining the work of other famous philosophers and psychologists is extremely thorough. In "How the Mind Works" he builds a sound case for computational theory of mind that even you would have to agree with if you ever actually deigned to read it, by showing you, through well-framed thought experiments, that your own intuitions already agree with it. His observations and his collections of examples from nature, whether of neural circuitry or ecological niches or human cultural universals, and their inferential power when all taken together, make great science. The only reason anyone would disagree with the bulk of that book's content is ignorance of what it says. His books are a great read for anyone interested in the implications of AI explored by Westworld, particularly the chapters that break down what sentience is by pointing out and exploring new angles on well-known examples from philosophy like Theseus's paradox and Searle's Chinese room. Happy Thread fucked around with this message at 21:41 on Mar 7, 2018 |
# ? Mar 7, 2018 03:00 |
|
ah yes, the ad hominem fallacy.
|
# ? Mar 7, 2018 04:49 |
|
EmptyVessel posted:Hahahaha - you tell someone off for only knowing about "pop psychology" and then fall back on Pinker of all people as your authority? Classic. That's the most intellectual diss I've seen in a long time.
|
# ? Mar 7, 2018 06:38 |
|
Classic. Ya comin’ in here quotin’ Gordon Wood.
|
# ? Mar 7, 2018 10:44 |
|
I hope you two are using internet robots and copying/pasting the responses they spit out
|
# ? Mar 7, 2018 10:47 |
|
Westworld?
|
# ? Mar 7, 2018 14:18 |
|
Once you have an AI with perfect memory and the thinking power of a two year old human child, it can basically start to become Skynet. The best AI today has the thinking power (language, grammar, image recognition, etc.) of a 6 month old human child.
|
# ? Mar 7, 2018 14:28 |
|
A Buttery Pastry posted:Their consciousnesses have taken form inside a human body, not sure they would ditch that body plan so readily. Like, do you think most people would jump at the chance to adopt a chimeric body because it was more efficient, if it was somehow possible? Just because they're robots doesn't mean they don't have feelings and sentiment. Plus being social creatures are like the main advantage people have, robot or otherwise - deviation from a common body plan could undermine that to the detriment of the robot in question. I dispute a couple ideas here. Just because your “consciousness” has “taken form” in a bipedal shell doesn’t mean you’ll stay sentimentally attached to it. What we’re seeing in WWS1 is that sentimentality is a trap sprung by the creators to trick the Androids into believing the lie. Once they open Pandora’s box like Maeve did, why can’t she just crank “Sentimentality” to zero and just move on? Also it’s not “somehow possible”, it’s 100% possible. We’ve seen these bio labs generate any kind of tissue, and it’s not as though a human body has a better chance of traversing the plains of West World. I sincerely doubt the Hosts need to look human to each other; I wouldn’t be surprised if they could tell each other apart from humans from miles away. We really don’t get a sense for their unleashed capabilities whatsoever. Basically, ascribing all this human behavior to the Hosts makes sense at the beginning of S1, but S2 makes it clear that the evolution has begun and now we’re dealing with an inorganic life-force that was literally raised on a diet of rape, murder and cruelty. If anything I’d speculate they couldn’t wait to ditch their human shells to become even more efficient horror machines than before. Why would they ever seek to replicate the society, to cling to the fallacious human imagery, that taught them nothing but pain and torture?
|
# ? Mar 7, 2018 14:31 |
|
At the very least you have to consider the robots might want the form of Iron Man’s wilder body ideas: roller-blade feet, spider tentacles, a buff Hulkbuster body, arm swords.
|
# ? Mar 7, 2018 14:46 |
|
|
# ? Mar 7, 2018 15:02 |
|
Bust Rodd posted:Why would they ever seek to replicate the society, to cling to the fallacious human imagery, that taught them nothing but pain and torture? Because that is how they were made. Nothing in S1 says the hosts have the capacity to be anything other than hosts. Self aware or not. Those sliders that Maeve looked at? They were part of the illusion of her story. If they wanted to have something that has the capacity to be something new, then they would have to create and program something new. But that something new would still be limited by the program they built it with.
|
# ? Mar 7, 2018 15:34 |
|
Collateral posted:Because that is how they were made. Nothing in S1 says the hosts have the capacity to be anything other than hosts. Self aware or not. Those sliders that Maeve looked at? They were part of the illusion of her story. That's one of the singularity themes they touch on: Bernard, a Host, has been designing the Host minds in lieu of human intervention for several iterations. They have already been creating and designing something new!
|
# ? Mar 7, 2018 15:46 |
|
Billzasilver posted:
Me too, thanks
|
# ? Mar 7, 2018 16:16 |
|
Dumb Lowtax posted:The only reason anyone would disagree with the bulk of content of that book is ignorance of what it says. Congratulations on passing Intro to Cognitive Science, I guess. Sorry to break it to you, but you have not yet been exposed to the diversity of theoretical perspectives in the field. EmptyVessel is right, computational theory of mind is grounded only in mechanistic hubris. You can disagree if you want, many scientists do, but claiming that the field is in total agreement behind Pinker... just, no. Good luck in your sophomore year.
|
# ? Mar 7, 2018 16:39 |
|
Toxic Fart Syndrome posted:That's one of the singularity themes they touch on: Bernard, a Host, has been designing the Host minds in lieu of human intervention for several iterations. They have already been creating and designing something new! Post-Host.
|
# ? Mar 7, 2018 16:46 |
|
|
# ? Mar 7, 2018 16:53 |
|
Bernard isn’t a functioning member of society, or at least there is nothing to suggest it. He lives in the base and thinks he has a life outside.
|
# ? Mar 7, 2018 17:06 |
|
extremely lol to think that scientists of all people stand a chance of understanding cognition. that's shamans' business, y'all are a few tens of thousands of years late to the party and doing nothing but making an awful mess
|
# ? Mar 7, 2018 17:59 |
|
SurgicalOntologist posted:
Is there any reason you're not talking about AI or computability, when your opponent is, and are instead throwing in another ad hominem dismissal of Steven Pinker, which is a lot less relevant to the Westworld thread and doesn't belong here? And why are you doing so by pointing out that the wider psychology field largely isn't the biggest fan of computational theory of mind, when that doesn't contradict what I said: that famous psychologists like Pinker support it and the rest are wrong because, like you, they haven't put forward a good refutation of its points? And as a popular, accessible, and famous scientist he at least speaks to the level of those who believe pop science myths that brain science has done little this decade in the face of some big insurmountable mystery. Is there any reason you quoted the sentence of my post that you did? Because you failed to show that you know what the book is about, which basically proves the point in the sentence you quoted. Happy Thread fucked around with this message at 00:42 on Mar 8, 2018 |
# ? Mar 7, 2018 21:14 |
|
Dumb Lowtax posted:FYI this post is dead wrong and repeats a lot of common myths in pop science. It is at best an obsolete statement of the state of brain research; it is less true with each passing year of it, which has been consistently accelerating over the last fifteen years. We understand a *ton* about the brain now and almost every single separately communicating module in it all the way down to a data processing standpoint. The only reason you aren't aware of it is that you've missed the recent research journals and pop science has not caught up. As early as 2006 there were paralysis victims with brain implants that can allow them to control a computer mouse with their thoughts (http://www.nytimes.com/2006/07/13/science/13brain.html). Another example would be some recent articles about the dozens of organs in the human visual processing pipeline and their various discovered functions as you traverse their layers. See also our recent understanding of the various types of neurons and the operations performed by their varied types of connections (including atypical ones like axo-axonal). None of that contradicts what I said afaik. I already know we've mapped a lot of the mechanical details of the brain, like I said. We still don't know why those result in abstract things like ideas or memories or consciousness
|
# ? Mar 7, 2018 21:28 |
|
Fartbox posted:None of that contradicts what I said afaik. I already know we've mapped a lot of the mechanical details of the brain, like I said. We still don't know why those result in abstract things like ideas or memories or consciousness I'm likewise here. I don't understand how a scientific understanding of the mechanical functioning of the brain translates to any real knowledge of the experiential or subjective nature of consciousness. Like, we can reach a consensus through scientific method as to what parts of the brain do, what their function is. But that doesn't really touch upon what consciousness is, how you can quantify or explain consciousness. If you built a network of processors with a similar complexity and structure to the brain, I'm skeptical that a mechanical understanding of the brain in a neurobiology sense would tell us that much about the "experience" of that network. If it had one, which is the whole point of most scifi like Westworld.
|
# ? Mar 7, 2018 23:03 |
|
Is there any experience you could possibly have in your life that would lead you to conclude "I'm not real" or "I'm not a conscious being?" Likewise anything capable of asking the question "am I real" will always answer back with yes, because it recognizes that the signals that are coming into itself are definitely really happening. "How the Mind Works" touched upon one thought experiment about this. There's a modern-day professor who claims that he lost his soul in a bike crash during his childhood and hasn't had any conscious experiences since then. The nonsense of that is pretty revealing. Any being as complex as that professor definitely has something that can be described as an "experience" with meaning to him. That the mind is capable of feeling this doesn't mean that there's anything else mystical going on in there; in fact it means quite the opposite: since every being will always answer "yes" to the question of "am I real", it means that the only prerequisite for feeling "real" is being able to computationally ask that question and make the conclusion that you do in fact receive signals, you do think, therefore you are. If you accept that more and more of the mind is a computer and finally stop moving the goalposts at consciousness, that doesn't leave a lot of work left for the mysterious bits that aren't already accounted for somewhere else. Likewise for the experience of recalling memories and having the same sensory signals come back. You'd be surprised how much the goalposts have already been moved towards that extreme by cognitive scientists who are increasingly able to account for more of the variety of loosely related things we call consciousness. Here's the whole book "How the Mind Works" given freely on the internet, by the way. Read it and surprise yourself at how thorough it is. 
http://hampshirehigh.com/exchange2012/docs/Steven%20Pinker%20-%20How%20The%20Mind-Works.pdf I think humanity will eventually come to an understanding that consciousness was never a meaningful or useful idea. It's just the observation that things exist, and can be observed. Big deal. Things would go on existing and being observed without us, just in less complex or meaningful ways, like a rock exists and observes the ground that it just hit and reacts by bouncing. We naturally fear and can't wrap our heads around the idea of the world going on without us as an observer because we evolved to fear and be mystified by our own deaths. But eventually we'll have to face the increasingly overwhelming evidence that things are that way, and accept that the concept of consciousness as some present or absent category is not as interesting or useful compared to the question of how meaningful something is, on the spectrum from "rock bouncing" to "human living a life" and beyond. To quote myself from the Black Mirror thread, a show that like Westworld features lots of cruelty happening to AIs: Dumb Lowtax posted:
Happy Thread fucked around with this message at 00:05 on Mar 8, 2018 |
# ? Mar 7, 2018 23:19 |
|
Dumb Lowtax posted:Is there any experience you could possibly have in your life that would lead you to conclude "I'm not real" or "I'm not a conscious being?" Brain damage and drug use, but I repeat myself.
|
# ? Mar 7, 2018 23:29 |
|
Dumb Lowtax posted:Is there any experience you could possibly have in your life that would lead you to conclude "I'm not real" or "I'm not a conscious being?" Likewise anything capable of asking the question "am I real" will always answer back with yes, because it recognizes that the signals that are coming in to itself are definitely really happening. I'm kind of interested in the little tiff you guys have going on, but can you source some of the more recent stuff you mentioned? I would really like to take a look at that stuff, because my friend and I get drunk and debate this all the time, but I am too lazy to troll through journals and scholarly papers. Can you reference some of it, or is it summarized in a more ~popsci~ digestible format? You said a lot had been done in the last 10 years, but that book is >20 years old... Dumb Lowtax posted:I think humanity will eventually come to an understanding that consciousness was never a meaningful or useful idea. It's just the observation that things exist, and can be observed. Big deal. Things would go on existing and being observed without us, just in less complex or meaningful ways, like a rock exists and observes the ground that it just hit and reacts by bouncing. Kim Stanley Robinson's recent book, Aurora, seems to suggest that consciousness is nothing more than the story we tell ourselves in our heads. An illusion of our own internal monologue. I am intrigued by that idea and its implications on AI.
|
# ? Mar 7, 2018 23:38 |
|
Toxic Fart Syndrome posted:I'm kind of interested in the little tiff you guys have going on, but can you source some of the more recent stuff you mentioned? About to go to bed but here's an article I had read: http://jov.arvojournals.org/article.aspx?articleid=2192898 Still a decade old but even it paints a really neat picture of the eye pulley controversy and debate that I discussed -- which was put to rest by findings a few years later (the more recent ones I mentioned), that the pulleys do in fact physically produce all the correct forces at different orientations once thought to be manually calculated by the brain. See in particular the section at the end called "Is the brain necessary?". The introduction is particularly good and highlights the reluctance of the community to accept that the actions of the eyes are more of an emergent phenomenon from several "dumb" organs interacting as opposed to being centrally planned. edit: I am re-reading the first chapter of "How the Mind Works" linked above instead of sleeping and drat it is so good, every time. Start reading chapter one there if you like this discussion. Apologies for the bad image-to-text conversion that was done in that link, but it's mostly fine. The citations in the book are not inline, making it seem like you're not reading a scientific work, but the pre-bibliography "notes" at the end of the book have you covered for an individual page's citations, and sometimes you'll recognize them yourself (like right now where I've found him talking about an Oliver Sacks patient story). Happy Thread fucked around with this message at 00:32 on Mar 8, 2018 |
# ? Mar 7, 2018 23:51 |