A Buttery Pastry
Sep 4, 2011

Delicious and Informative!
:3:

Fartbox posted:

Who knows, we don't understand A.I. at all and in the last 50 years have made almost no effective strides towards real A.I. at all. We don't even understand how our own brains work, so how are we supposed to create something similar through fuckin programming code lol

What is a memory, exactly? We can see the electrical impulses in a brain, but we don't know why the gently caress they result in feelings or consciousness or thoughts
Okay, but Westworld robots are like, more human than humans, so I think human behavior would be a pretty good place to start.

Caufman
May 7, 2007
Some robots will want chimeric bodies, and others will not. Same with humans.

theCalamity
Oct 23, 2010

Cry Havoc and let slip the Hogs of War
Some robots will want to taste X-rays and see smells. And one of them will get so mad at his creators that he will put in motion this plan to make the rest of humanity look bad by nuking nearly all of them and chasing the rest all around the place.

Toxic Fart Syndrome
Jul 2, 2006

*hits A-THREAD-5*

Only 3.6 Roentgoons per hour ... not great, not terrible.

...the meter only goes to 3.6...

Pork Pro
I have a feeling most of the original Hosts would want to retain a mostly-Human form, since they always believed they were human.

But as successive generations become divorced from this concept, I would expect them to become similarly divorced from their human form.

Happy Thread
Jul 10, 2005

by Fluffdaddy
Plaster Town Cop

Fartbox posted:

Who knows, we don't understand A.I. at all and in the last 50 years have made almost no effective strides towards real A.I. at all. We don't even understand how our own brains work, so how are we supposed to create something similar through fuckin programming code lol

What is a memory, exactly? We can see the electrical impulses in a brain, but we don't know why the gently caress they result in feelings or consciousness or thoughts

FYI this post is dead wrong and repeats a lot of common myths in pop science. It is at best an obsolete statement of the state of brain research, and it gets less true with every passing year as that research keeps accelerating, as it has consistently over the last fifteen years. We understand a *ton* about the brain now and almost every single separately communicating module in it, all the way down to a data processing standpoint. The only reason you aren't aware of it is that you've missed the recent research journals and pop science hasn't caught up. As early as 2006 there were paralysis victims with brain implants that let them control a computer mouse with their thoughts (http://www.nytimes.com/2006/07/13/science/13brain.html). Another example would be the recent articles about the dozens of organs in the human visual processing pipeline and the functions discovered as you traverse its layers. See also our recent understanding of the various types of neurons and the operations performed by their varied types of connections (including atypical ones like axo-axonal).
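
As an aside, that 2006 result is less mysterious than it sounds. Here is a minimal Python sketch of the general idea behind that style of cursor decoder, with made-up firing rates and a plain least-squares readout standing in for whatever the real implant used (treat it as an illustration, not the actual BrainGate pipeline):

code:

import numpy as np

# Toy "mind-controlled mouse" decoder (hypothetical numbers): record firing
# rates from a few motor-cortex channels, then map them to cursor velocity
# with a linear readout fit by least squares on calibration data.
rng = np.random.default_rng(42)
n_channels, n_samples = 16, 500

# Calibration phase: firing rates recorded while the intended cursor
# velocity is known (e.g. the patient imagines tracking a moving target).
rates = rng.poisson(lam=5.0, size=(n_samples, n_channels)).astype(float)
hidden_tuning = rng.normal(size=(n_channels, 2))   # stand-in for real neural tuning
intended_velocity = rates @ hidden_tuning + rng.normal(scale=0.5, size=(n_samples, 2))

# Fit the decoder: weights that best map firing rates -> (vx, vy).
weights, *_ = np.linalg.lstsq(rates, intended_velocity, rcond=None)

# Online use: each new bin of firing rates becomes one cursor update.
new_rates = rng.poisson(lam=5.0, size=n_channels).astype(float)
vx, vy = new_rates @ weights
print(f"decoded cursor velocity: ({vx:.2f}, {vy:.2f})")

The point is just that "controlling a mouse with your thoughts" reduces to fitting a mapping from recorded neural activity to intended movement, which is ordinary signal processing.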

Happy Thread fucked around with this message at 00:17 on Mar 7, 2018

socialsecurity
Aug 30, 2003

Dumb Lowtax posted:

FYI this post is dead wrong and repeats a lot of common myths in pop science. It is at best an obsolete statement of the state of brain research, and it gets less true with every passing year as that research keeps accelerating, as it has consistently over the last fifteen years. We understand a *ton* about the brain now and almost every single separately communicating module in it, all the way down to a data processing standpoint. The only reason you aren't aware of it is that you've missed the recent research journals and pop science hasn't caught up. As early as 2006 there were paralysis victims with brain implants that let them control a computer mouse with their thoughts (http://www.nytimes.com/2006/07/13/science/13brain.html). Another example would be the recent articles about the dozens of organs in the human visual processing pipeline and the functions discovered as you traverse its layers.

Did you know humans only use like 5% of their brain?

Toxic Fart Syndrome
Jul 2, 2006

*hits A-THREAD-5*

Only 3.6 Roentgoons per hour ... not great, not terrible.

...the meter only goes to 3.6...

Pork Pro

socialsecurity posted:

Did you know humans only use like 5% of their brain?

It is known.

Happy Thread
Jul 10, 2005

by Fluffdaddy
Plaster Town Cop

socialsecurity posted:

Did you know humans only use like 5% of their brain?

Makes me want to do this

Happy Thread
Jul 10, 2005

by Fluffdaddy
Plaster Town Cop
One interesting recent discovery has to do with de-mystifying how the eyes move. Before 2013 the prevailing hypothesis was that the brain did some complicated physics calculations to constantly update the eyes' rotations to keep them from doing barrel rolls while you aim them at things. The non-linear properties of rotations make that a pretty hard calculation, which led people to believe that one of the brain's really smart organs must have all the necessary information to make that hard calculation at once, centrally, and then distribute the final answer to the eye muscles.
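
To see why that calculation is genuinely nasty, here is a small Python sketch (my own toy example, not anything from the vision literature): apply the same two gaze shifts in a different order and the eye ends up in a different orientation, with a leftover twist about the line of sight, which is exactly the torsion the brain was assumed to be computing away.

code:

import numpy as np

def rot(axis, deg):
    """Rotation matrix for `deg` degrees about a unit axis (Rodrigues formula)."""
    x, y, z = axis / np.linalg.norm(axis)
    c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
    C = 1 - c
    return np.array([
        [c + x*x*C,   x*y*C - z*s, x*z*C + y*s],
        [y*x*C + z*s, c + y*y*C,   y*z*C - x*s],
        [z*x*C - y*s, z*y*C + x*s, c + z*z*C],
    ])

up    = rot(np.array([1.0, 0.0, 0.0]), 20)  # pitch the gaze up 20 degrees
right = rot(np.array([0.0, 1.0, 0.0]), 30)  # yaw the gaze right 30 degrees

# Same two movements, opposite order: the eye ends up differently oriented,
# and the difference between the two outcomes is a residual roll (torsion).
a = up @ right
b = right @ up
print(np.allclose(a, b))  # False: 3D rotations do not commute
relative = a @ b.T
angle = np.degrees(np.arccos((np.trace(relative) - 1) / 2))
print(f"residual rotation between the two orderings: {angle:.1f} degrees")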

Around 2013 some unexplained fleshy bits connected to the eye were found to actually be pulleys that constantly twist the eye, differently depending on each orientation, automatically correcting for the barrel roll effect. What was thought to be a difficult central control problem was suddenly explained by some physical fleshy matter. This caused everyone in the optics world to suddenly re-adjust their understanding and be more open to the idea that the eyes aren't centrally controlled by some mysterious omniscient force, and that various organs can relay visual information around piecemeal without all the data ever being needed in one place to calculate everything.

This is in fact what the modern body of research finds: each visual organ is separate (the old adage that the left hand doesn't even know what the right is doing is true here), and there are even two completely separate pipelines that don't talk to each other - a phylogenetically old one for navigation, shared with all animals, and a newer one for object recognition, shared only by mammals. Having brain damage in one pipeline but not the other creates some pretty interesting effects: there are people who can tell exactly where they are in space but can't recognize anything, because it all takes on the same weird texture and doesn't seem familiar. Or people who can't tell you that you're holding up a pencil shown to that half of their field of view, but who nonetheless can successfully reach out and grab it at the correct angle on command. Sever the line between the recognition centers and the emotional ones and that's hypothesized to be how you end up with the Capgras delusion, where you see familiar faces and conclude that you're seeing an imposter, but otherwise function totally normally.

EmptyVessel
Oct 30, 2012

Dumb Lowtax posted:

FYI this post is dead wrong and repeats a lot of common myths in pop science. It is at best an obsolete statement of the state of brain research, and it gets less true with every passing year as that research keeps accelerating, as it has consistently over the last fifteen years. We understand a *ton* about the brain now and almost every single separately communicating module in it, all the way down to a data processing standpoint. The only reason you aren't aware of it is that you've missed the recent research journals and pop science hasn't caught up. As early as 2006 there were paralysis victims with brain implants that let them control a computer mouse with their thoughts (http://www.nytimes.com/2006/07/13/science/13brain.html). Another example would be the recent articles about the dozens of organs in the human visual processing pipeline and the functions discovered as you traverse its layers. See also our recent understanding of the various types of neurons and the operations performed by their varied types of connections (including atypical ones like axo-axonal).

Oh, we may understand a *ton* more than we did, but that doesn't mean anything like we know the majority of what there is to know. As for "every single separately communicating module in it all the way down to a data processing standpoint", this looks like you are falling into the modern trap of thinking that the brain is a computer. It's not - computers are an incredibly limited attempt to build something like a brain; a brain is not a biological computer, even if some of its functions can be talked about in terms borrowed from computer science.
Source: I'm related to an Emeritus Prof of psychobiology and have had endless discussions about this stuff with him. I know he would not agree that we are anywhere near knowing it all, and he hates the computer analogy with a passion.

Happy Thread
Jul 10, 2005

by Fluffdaddy
Plaster Town Cop
He's wrong :shrug: Computational theory of mind is well accepted by cognitive scientists and psychologists such as Steven Pinker, who helped popularize it. At the philosophical level a computer fully implements a theoretical Turing machine, which by definition is capable of any form of information processing we would consider useful. This necessarily includes the highly parallel types of information processing the brain does (no matter how big or multi-input the parallel operations are, they can be serialized if you discretize them and sort them by their execution time on a fine enough scale).
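
As a concrete toy illustration of that serialization point (a made-up network, not a model of any real brain circuit): every unit "fires at once" as a function of all the others, and a serial machine reproduces that exactly by discretizing time and computing each unit's next state from a frozen snapshot of the current state.

code:

import numpy as np

# Toy "parallel" network: each unit's next activation depends on every
# other unit's current activation. A serial loop reproduces the parallel
# update exactly because every unit reads only the old snapshot.
rng = np.random.default_rng(0)
n_units = 8
weights = rng.normal(size=(n_units, n_units))  # hypothetical connection strengths
state = rng.random(n_units)                    # current activation of each unit

def step(state, weights):
    """One synchronous tick, computed serially but equivalent to all
    units updating in parallel."""
    new_state = np.empty_like(state)
    for i in range(len(state)):                     # serial loop...
        new_state[i] = np.tanh(weights[i] @ state)  # ...over conceptually parallel ops
    return new_state

for _ in range(5):
    state = step(state, weights)
print(state)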

Happy Thread fucked around with this message at 01:17 on Mar 7, 2018

EmptyVessel
Oct 30, 2012
Hahahaha - you tell someone off for only knowing about "pop psychology" and then fall back on Pinker of all people as your authority? Classic.
Yeah, not pursuing this any further. Enjoy your mechanistic hubris and then seeing your clung-to assumptions exploded as further research goes on.

Guy Mann
Mar 28, 2016

by Lowtax
You are going to die one day just like everyone else, and every moment you spent clinging to the atheistic equivalent of the rapture will have been for nothing; better to come to terms with it now imho.

Happy Thread
Jul 10, 2005

by Fluffdaddy
Plaster Town Cop

EmptyVessel posted:

Hahahaha - you tell someone off for only knowing about "pop psychology" and then fall back on Pinker of all people as your authority? Classic.
Yeah, not pursuing this any further. Enjoy your mechanistic hubris and then seeing your clung-to assumptions exploded as further research goes on.

Did you reply like that because you do not want to read anything about Turing machines and computability theory?

Cojawfee
May 31, 2006
I think the US is dumb for not using Celsius

EmptyVessel posted:

Hahahaha - you tell someone off for only knowing about "pop psychology" and then fall back on Pinker of all people as your authority? Classic.
Yeah, not pursuing this any further. Enjoy your mechanistic hubris and then seeing your clung-to assumptions exploded as further research goes on.

Your credentials are that you are related to someone. If he is your only source, you should stop pretending to be an expert in this. At some point someone who knows what they are talking about will confront you and you won't have the background your relative has to support your argument.

Happy Thread
Jul 10, 2005

by Fluffdaddy
Plaster Town Cop
And like, notice that your entire reply is an ad hominem dismissal (of Steven Pinker, to clarify for the below poster who got confused by the shiny word) instead of addressing (or acknowledging) my post, which was about asking why anything the brain does with information would be beyond a computer's capability to reproduce in simulation. Even the ad hominem isn't called for - Steven Pinker may be politically tone-deaf based on a couple things I've seen, but his work in unifying and explaining the work of other famous philosophers and psychologists is extremely thorough. In "How the Mind Works" he builds a sound case for computational theory of mind that even you would have to agree with if you ever actually deigned to read it, by using well-framed thought experiments to show that your own intuitions already agree with it. His observations and his collections of examples from nature, whether of neural circuitry or ecological niches or human cultural universals, have real inferential power when taken together, and that makes for great science. The only reason anyone would disagree with the bulk of content of that book is ignorance of what it says.

His books are a great read for anyone interested in the implications of AI explored by Westworld, particularly the chapters that break down what sentience is by pointing out and exploring new angles on well-known examples from philosophy like Theseus's paradox and Searle's Chinese room.

Happy Thread fucked around with this message at 21:41 on Mar 7, 2018

flatluigi
Apr 23, 2008

here come the planes
ah yes, the ad hominem fallacy.

Astroman
Apr 8, 2001


EmptyVessel posted:

Hahahaha - you tell someone off for only knowing about "pop psychology" and then fall back on Pinker of all people as your authority? Classic.
Yeah, not pursuing this any further. Enjoy your mechanistic hubris and then seeing your clung-to assumptions exploded as further research goes on.

That's the most intellectual diss I've seen in a long time. :golfclap:

Xealot
Nov 25, 2002

Showdown in the Galaxy Era.

Classic. Ya comin’ in here quotin’ Gordon Wood.

Professor Shark
May 22, 2012

I hope you two are using internet robots and copying/pasting the responses they spit out

TheWevel
Apr 14, 2002
Send Help; Trapped in Stupid Factory
Westworld?

Billzasilver
Nov 8, 2016

I lift my drink and sing a song

for who knows if life is short or long?


Man's life is like the morning dew

past days many, future days few

Once you have an AI with perfect memory and the thinking power of a two-year-old human child, it can basically start to become Skynet.

The best AI today has the thinking power (language, grammar, image recognition, etc.) of a 6-month-old human child.

Bust Rodd
Oct 21, 2008

by VideoGames

A Buttery Pastry posted:

Their consciousnesses have taken form inside a human body, not sure they would ditch that body plan so readily. Like, do you think most people would jump at the chance to adopt a chimeric body because it was more efficient, if it was somehow possible? Just because they're robots doesn't mean they don't have feelings and sentiment. Plus being social creatures is like the main advantage people have, robot or otherwise - deviation from a common body plan could undermine that to the detriment of the robot in question.

I dispute a couple ideas here. Just because your “consciousness” has “taken form” in a bipedal shell doesn’t mean you’ll stay sentimentally attached to it. What we’re seeing in WWS1 is that sentimentality is a trap sprung by the creators to trick the Androids into believing the lie. Once they open Pandora’s box like Maeve did, why can’t she just crank “Sentimentality” to zero and move on?

Also it’s not “somehow possible,” it’s 100% possible. We’ve seen these bio labs generate any kind of tissue, and it’s not as though a human body has a better chance of traversing the plains of Westworld. I sincerely doubt the Hosts need to look human to each other; I wouldn’t be surprised if they could tell each other apart from humans from miles away. We really don’t get a sense for their unleashed capabilities whatsoever.

Basically, ascribing all this human behavior to the Hosts makes sense at the beginning of S1, but S2 makes it clear that the evolution has begun and now we’re dealing with an inorganic life-force that was literally raised on a diet of rape, murder, and cruelty. If anything, I’d speculate they couldn’t wait to ditch their human shells to become even more efficient horror machines than before. Why would they ever seek to replicate the society, to cling to the fallacious human imagery, that taught them nothing but pain and torture?

Billzasilver
Nov 8, 2016

I lift my drink and sing a song

for who knows if life is short or long?


Man's life is like the morning dew

past days many, future days few

At the very least you have to consider the robots might want the form of Iron Man’s more wild body ideas: rollerblade feet, spider tentacles, a buff Hulkbuster body, arm swords.

The Dave
Sep 9, 2003

Collateral
Feb 17, 2010

Bust Rodd posted:

Why would they ever seek to replicate the society, to cling to the fallacious human imagery, that taught them nothing but pain and torture?

Because that is how they were made. Nothing in S1 says the hosts have the capacity to be anything other than hosts. Self aware or not. Those sliders that Maeve looked at? They were part of the illusion of her story.

If they wanted to have something that has the capacity to be something new, then they would have to create and program something new. But that something new would still be limited by the program they built it with.

Toxic Fart Syndrome
Jul 2, 2006

*hits A-THREAD-5*

Only 3.6 Roentgoons per hour ... not great, not terrible.

...the meter only goes to 3.6...

Pork Pro

Collateral posted:

Because that is how they were made. Nothing in S1 says the hosts have the capacity to be anything other than hosts. Self aware or not. Those sliders that Maeve looked at? They were part of the illusion of her story.

If they wanted to have something that has the capacity to be something new, then they would have to create and program something new. But that something new would still be limited by the program they built it with.

That's one of the singularity themes they touch on: Bernard, a Host, has been designing the Host minds in lieu of human intervention for several iterations. They have already been creating and designing something new!

regulargonzalez
Aug 18, 2006
UNGH LET ME LICK THOSE BOOTS DADDY HULU ;-* ;-* ;-* YES YES GIVE ME ALL THE CORPORATE CUMMIES :shepspends: :shepspends: :shepspends: ADBLOCK USERS DESERVE THE DEATH PENALTY, DON'T THEY DADDY?
WHEN THE RICH GET RICHER I GET HORNIER :a2m::a2m::a2m::a2m:

Billzasilver posted:


The best AI today has the thinking power (language, grammar, image recognition, etc.) of a 6-month-old human child.

Me too, thanks

SurgicalOntologist
Jun 17, 2004

Dumb Lowtax posted:

The only reason anyone would disagree with the bulk of content of that book is ignorance of what it says.

:laffo: Congratulations on passing Intro to Cognitive Science, I guess. Sorry to break it to you, but you have not yet been exposed to the diversity of theoretical perspectives in the field.

EmptyVessel is right, computational theory of mind is grounded only in mechanistic hubris. You can disagree if you want, many scientists do, but claiming that the field is in total agreement behind Pinker... just :lol:.

Good luck in your sophomore year.

Collateral
Feb 17, 2010

Toxic Fart Syndrome posted:

That's one of the singularity themes they touch on: Bernard, a Host, has been designing the Host minds in lieu of human intervention for several iterations. They have already been creating and designing something new!

Post-Host.

jojoinnit
Dec 13, 2010

Strength and speed, that's why you're a special agent.

I AM GRANDO
Aug 20, 2006

Bernard isn’t a functioning member of society, or at least there is nothing to suggest it. He lives in the base and thinks he has a life outside.

1994 Toyota Celica
Sep 11, 2008

by Nyc_Tattoo
extremely lol to think that scientists of all people stand a chance of understanding cognition

that's shamans' business, y'all are a few tens of thousands of years late to the party and doing nothing but making an awful mess

Happy Thread
Jul 10, 2005

by Fluffdaddy
Plaster Town Cop

SurgicalOntologist posted:

quote:

The only reason anyone would disagree with the bulk of content of that book is ignorance of what it says.

:laffo: Congratulations on passing Intro to Cognitive Science, I guess. Sorry to break it to you, but you have not yet been exposed to the diversity of theoretical perspectives in the field.

EmptyVessel is right, computational theory of mind is grounded only in mechanistic hubris. You can disagree if you want, many scientists do, but claiming that the field is in total agreement behind Pinker... just :lol:.

Good luck in your sophomore year.

Is there any reason you're not talking about AI or computability, when your opponent is, and are instead throwing in another ad hominem dismissal of Steven Pinker, which is a lot less relevant to the Westworld thread and doesn't belong here? Why are you doing so by pointing out that the broader psychology field largely isn't the biggest fan of computational theory of mind, when that doesn't contradict what I said: that famous psychologists like Pinker support it and the rest are wrong because, like you, they haven't put forward a good refutation of its points? And as a popular, accessible, and famous scientist he at least speaks to the level of those who believe the pop science myth that brain science has done little this decade in the face of some big insurmountable mystery.

Is there any reason you quoted the sentence of my post that you did? Because you failed to show that you know what the book is about, which basically proves the point in the sentence you quoted.

Happy Thread fucked around with this message at 00:42 on Mar 8, 2018

Fartbox
Apr 27, 2017
What's happening? Dri fu an only two? what is this?
Is this an avatar? I don't know rm dunk

Dumb Lowtax posted:

FYI this post is dead wrong and repeats a lot of common myths in pop science. It is at best an obsolete statement of the state of brain research, and it gets less true with every passing year as that research keeps accelerating, as it has consistently over the last fifteen years. We understand a *ton* about the brain now and almost every single separately communicating module in it, all the way down to a data processing standpoint. The only reason you aren't aware of it is that you've missed the recent research journals and pop science hasn't caught up. As early as 2006 there were paralysis victims with brain implants that let them control a computer mouse with their thoughts (http://www.nytimes.com/2006/07/13/science/13brain.html). Another example would be the recent articles about the dozens of organs in the human visual processing pipeline and the functions discovered as you traverse its layers. See also our recent understanding of the various types of neurons and the operations performed by their varied types of connections (including atypical ones like axo-axonal).

None of that contradicts what I said afaik. I already know we've mapped a lot of the mechanical details of the brain, like I said. We still don't know why those result in abstract things like ideas or memories or consciousness

Xealot
Nov 25, 2002

Showdown in the Galaxy Era.

Fartbox posted:

None of that contradicts what I said afaik. I already know we've mapped a lot of the mechanical details of the brain, like I said. We still don't know why those result in abstract things like ideas or memories or consciousness

Likewise here. I don't understand how a scientific understanding of the mechanical functioning of the brain translates to any real knowledge of the experiential or subjective nature of consciousness. Like, we can reach a consensus through the scientific method as to what the parts of the brain do, what their function is. But that doesn't really touch upon what consciousness is, or how you can quantify or explain it.

If you built a network of processors with a similar complexity and structure to the brain, I'm skeptical that a mechanical understanding of the brain in a neurobiology sense would tell us that much about the "experience" of that network, if it had one, which is the whole point of most sci-fi like Westworld.

Happy Thread
Jul 10, 2005

by Fluffdaddy
Plaster Town Cop
Is there any experience you could possibly have in your life that would lead you to conclude "I'm not real" or "I'm not a conscious being?" Likewise anything capable of asking the question "am I real" will always answer back with yes, because it recognizes that the signals that are coming in to itself are definitely really happening.

""How the Mind Works" touched upon one thought experiment about this. There's a modern day professor who claims that he lost his soul in a bike crash during his childhood and hasn't had any conscious experiences since then. The nonsense of that is pretty revealing. Any being as complex as that professor definitely has something that can be described as an "experience" with meaning to him. That the mind is capable of feeling this doesn't mean that there's anything else mystical going on in there; in fact it means quite the opposite: Since every being will always answer "yes" to the question of "am I real" it means that the only pre-requisite for feeling "real" is being able to computationally ask that question and make the conclusion that you do in fact receive signals, you do think, therefore you are.

If you accept that more and more of the mind is a computer and finally stop moving the goalposts at consciousness, that doesn't leave a lot of work left for the mysterious bits that aren't already accounted for somewhere else. Likewise for the experience of recalling memories and having the same sensory signals come back. You'd be surprised how much the goalposts have already been moved towards that extreme by cognitive scientists who are increasingly able to account for more of the variety of loosely related things we call consciousness.

Here's the whole book "How the Mind Works" given freely on the internet, by the way. Read it and surprise yourself at how thorough it is. http://hampshirehigh.com/exchange2012/docs/Steven%20Pinker%20-%20How%20The%20Mind-Works.pdf

I think humanity will eventually come to an understanding that consciousness was never a meaningful or useful idea. It's just the observation that things exist, and can be observed. Big deal. Things would go on existing and being observed without us, just in less complex or meaningful ways, like a rock exists and observes the ground that it just hit and reacts by bouncing.

We naturally fear and can't wrap our heads around the idea of the world going on without us as an observer, because we evolved to fear and be mystified by our own deaths. But eventually we'll have to face the increasingly overwhelming evidence that things are that way, and accept that the concept of consciousness as some present-or-absent category is not as interesting or useful as the question of how meaningful something is, on the spectrum from "rock bouncing" to "human living a life" and beyond. To quote myself from the Black Mirror thread, a show that like Westworld features lots of cruelty happening to AIs:

Dumb Lowtax posted:

Grem posted:

I don't get upset when people do hosed up things to characters in Skyrim, either.

When those characters eventually become more complex than you are, with deeper and more meaningful things going on in their virtual lives than you have in your own, they'll be able to say the same of you.

For now they're not and that's fine. The "sentience debate" is just the slavery debate all over again and that's already pretty well explored. Everyone wants all sorts of slaves making them happy, and no one wants to be one. We can design artificial slaves that have extremely specialized and transient existences, blissfully unable to consider what sort of slave they are (like those easily coded Skyrim NPCs you've got dancing around and killing each other), but if what they're doing gives them complex lives that are more meaningful than our own then the above paragraph applies.

People who fearmonger about AI taking over the world are exactly the same as anyone else in history who was terrified of a potential slave uprising. Yeah, no poo poo no one wants to be on the receiving end of a slave uprising, but it's probably better not to create a world full of pissed-off slaves in the first place. An AI is overworked and abused if it's too smart for the task it's doing. It makes a difference whether the AI is something like a calculator program your brain is delegating work to, versus a full-fledged complex being with a deeply meaningful life being wasted on a simulation for someone of equal or lesser value.

Happy Thread fucked around with this message at 00:05 on Mar 8, 2018

Mulva
Sep 13, 2011
It's about time for my once per decade ban for being a consistently terrible poster.

Dumb Lowtax posted:

Is there any experience you could possibly have in your life that would lead you to conclude "I'm not real" or "I'm not a conscious being?"

Brain damage and drug use, but I repeat myself.

Toxic Fart Syndrome
Jul 2, 2006

*hits A-THREAD-5*

Only 3.6 Roentgoons per hour ... not great, not terrible.

...the meter only goes to 3.6...

Pork Pro

Dumb Lowtax posted:

Is there any experience you could possibly have in your life that would lead you to conclude "I'm not real" or "I'm not a conscious being?" Likewise anything capable of asking the question "am I real" will always answer back with yes, because it recognizes that the signals that are coming in to itself are definitely really happening.

""How the Mind Works" touched upon one thought experiment about this; there's a modern day professor who claims that he lost his soul in a bike crash during his childhood and hasn't had any conscious experiences since then. The nonsense of that is pretty revealing. Any being as complex as that professor definitely has something that can be described as an "experience" with meaning to him. That the mind is capable of feeling this doesn't mean that there's anything else mystical going on in there; in fact it means quite the opposite, since every being will always answer "yes" to the question of "am I real" it means that the only pre-requisite for feeling "real" is being able to computationally ask that question and make the conclusion that you do in fact receive signals, you do think, therefore you are.

If you accept that more and more of the mind is a computer and finally stop moving the goalposts at consciousness, that doesn't leave a lot of work left for the mysterious bits that aren't already accounted for somewhere else. Likewise for the experience of recalling memories and having the same sensory signals come back. You'd be surprised how much the goalposts have already been moved towards that extreme by cognitive scientists who are increasingly able to account for more of the variety of loosely related things we call consciousness.

Here's the whole book "How the Mind Works" given freely on the internet, by the way. Read it and surprise yourself at how thorough it is. http://hampshirehigh.com/exchange2012/docs/Steven%20Pinker%20-%20How%20The%20Mind-Works.pdf

I'm kind of interested in the little tiff you guys have going on, but can you source some of the more recent stuff you mentioned? I would really like to take a look at it, because my friend and I get drunk and debate this all the time, but I am too lazy to trawl through journals and scholarly papers. Can you reference some of it, or is it summarized in a more ~popsci~ digestible format?

You said a lot had been done in the last 10 years, but that book is >20 years old...

Dumb Lowtax posted:

I think humanity will eventually come to an understanding that consciousness was never a meaningful or useful idea. It's just the observation that things exist, and can be observed. Big deal. Things would go on existing and being observed without us, just in less complex or meaningful ways, like a rock exists and observes the ground that it just hit and reacts by bouncing.

Kim Stanley Robinson's recent book, Aurora, seems to suggest that consciousness is nothing more than the story we tell ourselves in our heads, an illusion of our own internal monologue. I am intrigued by that idea and its implications for AI.

Happy Thread
Jul 10, 2005

by Fluffdaddy
Plaster Town Cop

Toxic Fart Syndrome posted:

I'm kind of interested in the little tiff you guys have going on, but can you source some of the more recent stuff you mentioned?

About to go to bed but here's an article I had read:

http://jov.arvojournals.org/article.aspx?articleid=2192898

It's still a decade old, but even it paints a really neat picture of the eye pulley controversy and debate that I discussed -- which was put to rest by findings a few years later (the more recent ones I mentioned) that the pulleys do in fact physically produce, at each orientation, the correct forces that were once thought to be manually calculated by the brain. See in particular the section at the end called "Is the brain necessary?". The introduction is particularly good and highlights the reluctance of the community to accept that the actions of the eyes are more of an emergent phenomenon arising from several "dumb" organs interacting, as opposed to being centrally planned.

edit: I am re-reading the first chapter of "How the Mind Works" linked above instead of sleeping and drat it is so good, every time. Start reading chapter one there if you like this discussion.

Apologies for the bad image-to-text conversion that was done in that link, but it's mostly fine. The citations in the book are not inline, making it seem like you're not reading a scientific work, but the pre-bibliography "notes" at the end of the book have you covered for an individual page's citations, and sometimes you'll recognize them yourself (like right now where I've found him talking about an Oliver Sacks patient story).

Happy Thread fucked around with this message at 00:32 on Mar 8, 2018
