duck trucker
Oct 14, 2017

YOSPOS

This is probably stupid semantics, but Manhattan doesn't see the future; he is just actively living every moment of his life at all times. From his perspective he is getting shot, ending the Vietnam War, creating life, having sex, being a little voyeur in the closet, etc., all at once.

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.

duck trucker posted:

This is probably stupid semantics, but Manhattan doesn't see the future; he is just actively living every moment of his life at all times. From his perspective he is getting shot, ending the Vietnam War, creating life, having sex, being a little voyeur in the closet, etc., all at once.

Yeah exactly. Thinking about everything happening all at the same instant is the only way to try to understand Manhattan's perspective.

WHY BONER NOW
Mar 6, 2016

Pillbug

Zaphod42 posted:


I think that Manhattan doing things that we perceive as being incredibly "dumb" is actually kinda the point (I know, bear with me), he shouldn't behave like we'd want him to. He's that fundamentally different than us. He's both smarter than humans but also... clueless in a way? That's kinda the fun paradox of him.

But I can totally understand the angle you're saying here, that even though it worked for his character it still felt like an action scene trope, and I agree that probably could have gone down better somehow. It does feel like lazy writing even if I also think it is justified, and that in itself can kinda put people off.

Yeah, it just makes it hard to understand his motivations. I mean, we're all in agreement that Manhattan will act in self preservation...right?

duck trucker posted:

This is probably stupid semantics, but Manhattan doesn't see the future; he is just actively living every moment of his life at all times. From his perspective he is getting shot, ending the Vietnam War, creating life, having sex, being a little voyeur in the closet, etc., all at once.

No, this isn't semantics, it's a critical distinction, and I need to be more careful how I word things, because the way I described it sometimes can contribute to the confusion. Good call.

Weedle
May 31, 2006




Zaphod42 posted:

Of course, you're somehow missing the entire context of this conversation.

How do you think dendrites in your brain work? They don't make choices, they just act on electrical impulses. But the ability to choose arises from a complex enough system.

https://en.wikipedia.org/wiki/I_Am_a_Strange_Loop

Is the context of this conversation that you think artificial neural networks work the same way and have the same capabilities at scale as the human brain?

AccountSupervisor
Aug 3, 2004

I am grateful for my loop pedal
I have nothing to contribute to this philosophical discussion, but I can say I appreciate that the thread has gone from incessantly whining about mystery boxes and constantly reminding people this is a single-season, nine-episode arc to pages of debating the concepts of free will, agency, determinism, and metaphysics.

gg

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.

Weedle posted:

Is the context of this conversation that you think artificial neural networks work the same way and have the same capabilities at scale as the human brain?

Quote the post where I said that, because I definitely didn't lol

moving goalposts!

All I said was they can learn and choose. And you and another goon went "LOL NO THEY DON'T" and ... they definitely, unarguably, provably do, so you're just kinda making yourself look silly if you can't put together a better argument for why you think that.

"at scale" lmao, nice of you to invent that suddenly.

Weedle
May 31, 2006




Zaphod42 posted:

Quote the post where I said that, because I definitely didn't lol

moving goalposts!

What point were you trying to make with the dendrites, then?

Legin Noslen
Sep 9, 2004
Fortified with Rhiboflavin
Dr. Manhattan might not be able to change the pages but you know what, I can!

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.

Weedle posted:

What point were you trying to make with the dendrites, then?

You were making the argument that a machine CANNOT choose.

Humans choose, yes? Human brains are made of dendrites, yes?

Therefore, things that act only on electrical impulses, which are themselves simple machines, can combine in complexity to build something that is capable of thought.

One thing building up to complexity as a proof of concept does not mean every other instance of a thinking machine is on exactly the same level and scale. Monkeys can think, dolphins can think; does that mean monkeys, humans, and dolphins all have the exact same scale of brain calculation? No, of course not. That's a wild extrapolation that makes no sense and was never posited.

Earthworm brains also have dendrites. Computers choose, Earthworms choose, Humans choose. Computers, Earthworms and Humans are each on vastly different scales of intelligence and awareness, but they all have choice through the complexity of the network of their brain functions.

precision
May 7, 2006

by VideoGames
It's kind of funny that a lot of people don't understand that what you think the character whose face is a literal Rorschach test represents says more about you than him.

Edit: I worded this poorly but I think you get it

precision fucked around with this message at 23:30 on Dec 12, 2019

precision
May 7, 2006

by VideoGames

Zaphod42 posted:

You were making the argument that a machine CANNOT choose.

Humans choose, yes?

I think what keeps muddying your message is that your definition of "choice" is basically the exact opposite of most other people in here

Weedle
May 31, 2006




Zaphod42 posted:

Weedle posted:

Is the context of this conversation that you think artificial neural networks work the same way and have the same capabilities at scale as the human brain?

Quote the post where I said that, because I definitely didn't lol

Zaphod42 posted:

Humans choose, yes? Human brains are made of dendrites, yes?

Therefore, things that act only on electrical impulses, which are themselves simple machines, can combine in complexity to build something that is capable of thought.

Am I missing something?

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.

precision posted:

I think what keeps muddying your message is that your definition of "choice" is basically the exact opposite of most other people in here

Okay, as I described in a conversation with a goon previously, what do we call it otherwise? "Picking" ? "Making a selection?" I'll use whatever term you want but whatever term you want is going to be equally ambiguous if you don't accept the concept that agency != free will.

You could just as easily say "making a selection is the exact opposite of most people's definition", that's the thing about complex philosophical concepts.

If we just invent a new word nobody is gonna know what it means. Choice in philosophy does not represent free will. But if we want to say choice does demand free will, then what word should we use for making a selection irrespective of free will (maybe having free will, maybe not)?

When you take a test, and pick from A,B,C,D, what are you doing if not "making a choice" ?

Weedle posted:

Am I missing something?

One thing that is capable of thought is not necessarily as capable as another thing that is capable of thought.

Cars can move, airplanes can move, but airplanes can move faster and through the air. Yeah??

Feel like I'm taking crazy pills. How can you not see that doesn't mean the scale is the same?

A baseball and a basketball are both balls. But a baseball is not a type of basketball. A basketball is also larger than a baseball, even though you can throw a baseball or a basketball.

Weedle
May 31, 2006




Zaphod42 posted:

One thing that is capable of thought is not necessarily as capable as another thing that is capable of thought.

A computer isn't capable of thought at all.

Azathoth Prime
Feb 20, 2004

Free 2nd day shipping on all eldritch horrors.


Can we please, please stop with the philosophy derail? You all have valid points, and you have made them, often several times. At this point, it just looks like arguing for the sake of arguing.

Please, just let it go. Or take it to the philosophy thread (I assume we have one of those).

Azathoth Prime fucked around with this message at 23:43 on Dec 12, 2019

Nieuw Amsterdam
Dec 1, 2006

Dignité. Toujours, dignité.

precision posted:

The whole Nostalgia thing felt "off" to me, like why they bothered to come up with that specific solution (there's a pill that literally makes you relive memories, even if they're not yours)

At the very least, it would imply you could just mix up whatever chemicals you wanted to create specific memories.

Or the pills were magic, like Jon's consciousness. I mean, we're already in pretty fruity territory if Jon can have memories without synapses.

He had consciousness without a body. We saw that in 1959 and 1986.

He re-assembled his body from scratch.

Adrian asked him if he had a brain. Sure, sure Adrian. Whatever you say.

I don’t think he does unless he wants to. Was he breathing on Europa? Surely not. He doesn’t respond to temperature or pressure.

Zaphod42 posted:

Right, it's a much more visceral version of what Manhattan is always doing. The way he has conversations with people and is so extremely blunt to the point of parody is because of his perception of time being so different.

I do not like the idea that Jon could change things but just like, doesn't say anything. I'm on the side of "he can't" for sure.

But I do think in addition to the fact that he can't change things, he knows he can't change things. So that's why he isn't really trying to, there's no struggle with Manhattan because he knows with absolute confidence it is going to happen and how. He knows exactly how futile it would be to try any different. Which, again, doesn't mean he isn't making the choice in the moment, which I know is confusing.

Like, look at it this way. I've been saying all this stuff about how I don't believe in free will. You could say "Well, if there's no free will, then I'm going to stop living, stop trying, and just do drugs all day"

But the thing is, nothing changed. It's not like you had free will and I took your free will away and so now you're going to do drugs all day. You always had choice, you always could do drugs all day, and you always chose not to do drugs all day. If you let the knowledge that you don't have free will change your choices to doing drugs all day, it isn't that you had free will and lost it; it's that you always had agency, but now knowing you don't have free will is changing which choice you make under agency. It's influencing you.

Manhattan doesn't go "I can see the future so there's no point" and kill himself. He still goes through the motions. And he has to, because if he didn't go through the motions then the future wouldn't be the future. He's still making choices, he's not letting his knowledge of the future change which choices he makes or would have made.

Does that make sense? lol.

I CANT tell you, Angela

I WONT tell you, Angela

Huge difference between those two.

Dr Strange- there’s only one outcome where we win. If I tell you what happens it won’t happen.

Angela already hosed up and used the past-o-phone and broke causality.

Maybe don’t let her do that twice.

Maybe blowing up the teleporter results in something worse. Maybe it kills Angela from the tachyon release.

“Thanos only loses if Jon gets captured and destroyed” is the tone of the episode.

After XX pages of discussion about free will, I think we can all agree that becoming Dr Manhattan does not turn you into the ultimate curbstomper, and it is a poor solution for those seeking power.

GoGoGadgetChris
Mar 18, 2010

i powder a
granite monument
in a soundless flash

showering the grass
with molten drops of
its gold inlay

sending smoking
chips of stone
skipping into the fog

Bushido Brown posted:

Is this (1) an unfunny joke, (2) are you generalizing about open domain AI or (3) are you that clueless?

Deadly serious, my friend. There are many fruitful and beneficial offshoots of it, but anyone who says they work on "AI" is exactly like someone who identifies as a "video game designer" (inadvertently revealing that they are not an actual programmer).


Zaphod42 posted:

I think it's mostly a joke, but that said, while the foremost engineers have hardly given up on it, there is a strangely high suicide rate for cutting-edge AI researchers, which is kinda wild.

The reason should be pretty obvious!


Weedle posted:

A computer isn't capable of thought at all.

Is there a difference between thought and decision-making?


Azathoth Prime posted:

Can we please, please stop with the philosophy derail? You all have valid points, and you have made them, often several times. At this point, it just looks like arguing for the sake of arguing.

Please, just let it go. Or take it to the philosophy thread (I assume we have one of those).

I don't see how philosophy could possibly be a derail in a Watchmen thread

Bushido Brown
Mar 30, 2011

I think you're saying (2), but poorly.

precision
May 7, 2006

by VideoGames

Zaphod42 posted:

Okay, as I described in a conversation with a goon previously, what do we call it otherwise? "Picking" ? "Making a selection?"

Those still imply actual choice. You keep saying that choice is literally impossible (you believe that every choice is determined by a sum of all that has come before rather than any level of selection).

For example, that I'm not posting this right now because I want to, I'm posting this because all the chemical and physical reactions started by the Big Bang have inevitably led to this one shitpost.

What you're describing is just "things happening" or "people doing things". A leaf does not choose to fall. A river does not choose to flow.

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.

Nieuw Amsterdam posted:

He had consciousness without a body. We saw that in 1959 and 1986.

He re-assembled his body from scratch.

Adrian asked him if he had a brain. Sure, sure Adrian. Whatever you say.

I don’t think he does unless he wants to. Was he breathing on Europa? Surely not. He doesn’t respond to temperature or pressure.


I CANT tell you, Angela

I WONT tell you, Angela

Huge difference between those two.

Dr Strange- there’s only one outcome where we win. If I tell you what happens it won’t happen.

Angela already hosed up and used the past-o-phone and broke causality.

Maybe don’t let her do that twice.

Maybe blowing up the teleporter results in something worse. Maybe it kills Angela from the tachyon release.

“Thanos only loses if Jon gets captured and destroyed” is the tone of the episode.

After XX pages of discussion about free will, I think we can all agree that becoming Dr Manhattan does not turn you into the ultimate curbstomper, and it is a poor solution for those seeking power.

That's a totally plausible and consistent view of things, but I don't think that's what Watchmen is doing.

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.

precision posted:

Those still imply actual choice. You keep saying that choice is literally impossible (you believe that every choice is determined by a sum of all that has come before rather than any level of selection).

For example, that I'm not posting this right now because I want to, I'm posting this because all the chemical and physical reactions started by the Big Bang have inevitably led to this one shitpost.

What you're describing is just "things happening" or "people doing things". A leaf does not choose to fall. A river does not choose to flow.

But if you don't make the choice that seems right to you, what are you doing? Picking things at random? What you do is based on your own logical thinking, isn't it?

You have a quiz. You look at the answers. You consider your confidence that each choice is correct

A - 80%
B - 20%
C - 0%

You answer A, the most likely choice. How is that not simply a process where the sum of all that has come before it determines what you do? What else is there? That's the thing. Even if you do have a soul, what does it do? Your brain makes a logical calculation that A is the best choice, and then your soul says "Nah, let's be crazy, WILD CARD" and goes with B anyways?

That would mean there's no point in having a brain, because your soul is just gonna overwrite it. But we know for a fact that your brain is integral to how you think.
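To make that concrete, here's a toy sketch (made-up confidence numbers, purely illustrative, not a model of anyone's actual brain) of a "chooser" whose pick is entirely determined by the state it already holds:

code:
# Toy sketch: the "choice" is fully determined by the confidences the
# chooser already holds (made-up numbers, purely illustrative).
confidences = {"A": 0.80, "B": 0.20, "C": 0.00}

def answer(conf):
    # Pick whichever option the prior state scores highest.
    return max(conf, key=conf.get)

print(answer(confidences))  # always "A" given this state

Given the same confidences it always answers A; change what it has learned and the answer changes. That's the sense of "choice" being argued about here.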

precision
May 7, 2006

by VideoGames

Zaphod42 posted:

But if you don't make the choice that seems right to you, what are you doing?

You're the one who keeps saying that people will always make the same choice.

If someone will always make the same choice, it's not a choice. It's just things happening. Again, it's a river. You wouldn't say that water "chooses" to flow toward the sea, would you?

quote:

Even if you do have a soul, what does it do? Your brain makes a logical calculation that A is the best choice, and then your soul says "Nah lets be crazy, WILD CARD" and goes with B anyways?

I don't think it's particularly relevant here but yes. Sometimes I can feel what the chemicals in my brain want me to do and I choose not to - for example, when someone cuts me off in traffic, I want to flip them off, and when I was younger, I did. But now I deliberately choose things which may make me feel bad but are more positive - like when I feel compelled to yell at someone, etc. It almost hurts to resist those chemical impulses, but I do it anyway.

Am I just tricking myself? Maybe. But it's a trick I can live with.

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.

precision posted:

You're the one who keeps saying that people will always make the same choice.

If someone will always make the same choice, it's not a choice. It's just things happening. Again, it's a river. You wouldn't say that water "chooses" to flow toward the sea, would you?

I would if the river was a machine capable of learning, but it isn't. You only make the same choice under the same circumstances. The whole point of living and thinking is that you can adjust your behavior based on past experience.

You only make the same choice if all the circumstances are exactly the same, which will never be true, partially because of how complex life is, but partially because experiencing something changes you and is a new factor the next time you would otherwise be making the same choice.

precision posted:

I don't think it's particularly relevant here but yes. Sometimes I can feel what the chemicals in my brain want me to do and I choose not to - for example, when someone cuts me off in traffic, I want to flip them off, and when I was younger, I did. But now I deliberately choose things which may make me feel bad but are more positive - like when I feel compelled to yell at someone, etc. It almost hurts to resist those chemical impulses, but I do it anyway.

Am I just tricking myself? Maybe. But it's a trick I can live with.

How does the soul choose? I guess "it just does". That's my problem. I don't know how you can live with something just magically doing things. It has to be some process, it has to have a reason, a logic, or else it's just nonsensical.

If the soul has a brain made of dendrites then it makes sense, but then you've just abstracted the same problem outside of our reality, but it still exists wherever the soul does. If the soul doesn't have dendrites, then it has to be magic, and I can't live with that.

TheCenturion
May 3, 2013
HI I LIKE TO GIVE ADVICE ON RELATIONSHIPS

Zaphod42 posted:

But if you don't make the choice that seems right to you, what are you doing? Picking things at random? What you do is based on your own logical thinking, isn't it?

You have a quiz. You look at the answers. You consider your confidence that each choice is correct

A - 80%
B - 20%
C - 0%

You answer A, the most likely choice. How is that not simply a process where the sum of all that has come before it determines what you do? What else is there? That's the thing. Even if you do have a soul, what does it do? Your brain makes a logical calculation that A is the best choice, and then your soul says "Nah, let's be crazy, WILD CARD" and goes with B anyways?

That would mean there's no point in having a brain, because your soul is just gonna overwrite it. But we know for a fact that your brain is integral to how you think.

To me, having “choice” means you have choice. Maybe you refuse to answer. Maybe you cheat. Maybe you purposefully answer wrong for whatever reason. Maybe you get up and leave. Maybe you attack the test taker.

But you’re not constrained to the specific text. Train a neural net to answer quizzes, and it’s not even going to answer quizzes until it’s told to.

I guess what I’m saying is a man chooses, a slave obeys. Down with the AI overlords.

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.

TheCenturion posted:

To me, having “choice” means you have choice. Maybe you refuse to answer. Maybe you cheat. Maybe you purposefully answer wrong for whatever reason. Maybe you get up and leave. Maybe you attack the test taker.

But you’re not constrained to the specific text. Train a neural net to answer quizzes, and it’s not even going to answer quizzes until it’s told to.

I guess what I’m saying is a man chooses, a slave obeys. Down with the AI overlords.

Training AIs on videogames has absolutely shown that if they have the servos, they will find a way to assault the test taker and optimize their reward function in a way the test taker never anticipated :)

Humans can be slaves too, and humans think and make choices. But I'm with you on needing to make choices for oneself. But AI are gonna get there. It's just a matter of degrees.

I'm actually playing through Detroit: Become Human right now and it's funny how applicable it is to this conversation, even if David Cage kinda sucks at game design. Westworld Season 1 is also the same thing, and better, but season 2 is ehhh.
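On the reward-function point, here's a toy sketch (invented numbers and reward, not any real RL setup) of how a misspecified objective makes the "assault the test taker" policy come out on top:

code:
# Toy sketch (invented reward, not any real RL setup): a reward that was
# meant to encourage "finish the quiz well" but mostly pays for speed.
def reward(answers_correct, seconds_taken):
    return answers_correct * 10 - seconds_taken

honest = reward(answers_correct=8, seconds_taken=300)      # sits the quiz properly
degenerate = reward(answers_correct=0, seconds_taken=1)    # knocks the quiz over and leaves
print(honest, degenerate)  # -220 vs -1: the degenerate policy scores higher

The optimizer isn't being malicious; the objective just never said what the designer actually wanted.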

Zaphod42 fucked around with this message at 00:11 on Dec 13, 2019

precision
May 7, 2006

by VideoGames
Zaphod.

You're saying that, given perfect information, it would be possible to always see what someone will do, as their actions are completely dependent on physical and chemical laws.

That is not what everyone defines as choice.

It's really very simple and it's a bit frustrating that you're not getting it.

Xealot
Nov 25, 2002

Showdown in the Galaxy Era.

Weedle posted:

A computer isn't capable of thought at all.

As happened in the Westworld thread: this is a fairly useless area of discussion, because it boils down to an Other Minds problem. Consciousness is a phenomenon that exists beyond science, because it isn't testable by any physical metric. There's no material of consciousness, no way to distinguish mere behavioral complexity from the lived experience of being conscious. You can't even "prove" that any given human is conscious, merely that they exhibit behavior that implies consciousness.

A computer and a brain are complex electrical systems that process data. The question of a sufficiently-complex computer "becoming conscious" is kind of like asking "does a dog have a soul?" I mean, do we have souls? Are there Souls, as a general concept? How do you know?

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.

precision posted:

Zaphod.

You're saying that, given perfect information, it would be possible to always see what someone will do, as their actions are completely dependent on physical and chemical laws.

That is not what everyone defines as choice.

It's really very simple and it's a bit frustrating that you're not getting it.

Precision, I have agreed with you before and asked you to suggest another term to use, and you've given me nothing.

Again, tell me what word you want me to say instead and I'll stop saying choice. It's that simple. But no other word is going to unambiguously mean such a complicated subject. There's no simple way to talk about complex philosophy. We just have to be willing to accept new definitions for terms in different contexts and strive to understand what the other person is saying.

This is very simple and it's a bit frustrating you haven't responded and you're not getting it.

I even went so far as to suggest examples. What do you want from me?

You're effectively arguing that because the word "choice" in its common usage doesn't mean that, the idea must not exist. That's a huge logical fallacy. Give me a word and I'll use it happily instead. The idea exists regardless of what you want to call it.

I've said "the words don't matter, the ideas behind them is what matters" like 3 times now, talking to both you and other goons ITT. You're the only one having such a huge issue with it. Give me a word dude. Suggest anything.

Xealot posted:

As happened in the Westworld thread: this is a fairly useless area of discussion, because it boils down to an Other Minds problem. Consciousness is a phenomenon that exists beyond science, because it isn't testable by any physical metric. There's no material of consciousness, no way to distinguish mere behavioral complexity from the lived experience of being conscious. You can't even "prove" that any given human is conscious, merely that they exhibit behavior that implies consciousness.

A computer and a brain are complex electrical systems that process data. The question of a sufficiently-complex computer "becoming conscious" is kind of like asking "does a dog have a soul?" I mean, do we have souls? Are there Souls, as a general concept? How do you know?

Fair, but I think it's fun to talk about, and clearly some other people do too. And we've got time to kill until Sunday.

Zaphod42 fucked around with this message at 00:18 on Dec 13, 2019

precision
May 7, 2006

by VideoGames
I'm not saying that there's a better synonym for choice because what you're describing is not choice. You're describing things happening, not choices.

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.

precision posted:

I'm not saying that there's a better synonym for choice because what you're describing is not choice. You're describing things happening, not choices.

If you pick a leaf up and drop it, it'll always fall down. The same way every time. The leaf will never decide that where it landed wasn't a good spot and do something different next time. The leaf does not act.

If I give you a quiz and you answer it, you'll get a certain grade. If I give you the test again after showing you which questions you got wrong, you won't get the same answers. You'll change your behavior to get a more desired result.

There is something different about the latter scenario vs the former scenario, even if we don't have free will in the picture. What do you want to call that? You can pick any word and I'll use it from now on.
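One way to make the leaf-versus-quiz difference concrete is a toy sketch (hypothetical quiz data, nothing more) of a fully deterministic "test taker" that still answers differently the second time, purely because being graded changed its internal state:

code:
# Toy sketch (hypothetical quiz data): deterministic, but it adapts.
answer_key = {"Q1": "A", "Q2": "C"}

class TestTaker:
    def __init__(self):
        # Initial, partly wrong, beliefs about the answers.
        self.beliefs = {"Q1": "A", "Q2": "B"}

    def take_quiz(self):
        return dict(self.beliefs)

    def review(self, corrections):
        # Deterministic update: adopt the correct answer for anything marked wrong.
        self.beliefs.update(corrections)

taker = TestTaker()
first = taker.take_quiz()   # {"Q1": "A", "Q2": "B"}
taker.review({q: a for q, a in answer_key.items() if first[q] != a})
second = taker.take_quiz()  # {"Q1": "A", "Q2": "C"}
print(first != second)      # True: same rules, different behavior after feedback

A dropped leaf has no state to update; that gap is the thing being pointed at, whatever word you want to hang on it.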

precision
May 7, 2006

by VideoGames
Okay, you're moving the goalposts now though. You're explicitly describing two completely different situations, the latter of which depends on the former having happened.

Do you believe that I'm posting this because my consciousness decided to, or because of chemical and physical law playing out?

When you have a thought, do you think that thought is only a result of explainable chemical reactions?

You can use the WORD "choice" all you want because I know what you mean. I don't mind that you're using that word because I still understand you.

All I'm saying is that the process you're describing is not what I think of as a process involving choice. You have explicitly said you don't believe true choice is possible.

Bushido Brown
Mar 30, 2011

Can y'all just end this?

It's clear that you, Zaphod, think humans are deterministic decision engines and that you, precision, don't think the mechanistic decisions made by such beings are "choices" exactly because they're deterministic.

Your quarrel is mostly semantic, outside the difference flagged above.

Better question: why do we think Angela won't forgive Will? He must know that she's going blue, no?

precision
May 7, 2006

by VideoGames

Bushido Brown posted:

Can y'all just end this?

It's clear that you, Zaphod, think humans are deterministic decision engines and that you, precision, don't think the mechanistic decisions made by such beings are "choices" exactly because they're deterministic.

Your quarrel is mostly semantic, outside the difference flagged above.

Yes. You're right. I feel like he is the one dragging it out. I'm not enjoying this.

GoGoGadgetChris
Mar 18, 2010

i powder a
granite monument
in a soundless flash

showering the grass
with molten drops of
its gold inlay

sending smoking
chips of stone
skipping into the fog
Now we see why AI researchers kill themselves

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.

precision posted:

Okay, you're moving the goalposts now though. You're explicitly describing two completely different situations, the latter of which depends on the former having happened.

Do you believe that I'm posting this because my consciousness decided to, or because of chemical and physical law playing out?

When you have a thought, do you think that thought is only a result of explainable chemical reactions?

You can use the WORD "choice" all you want because I know what you mean. I don't mind that you're using that word because I still understand you.

All I'm saying is that the process you're describing is not what I think of as a process involving choice. You have explicitly said you don't believe true choice is possible.

The physical law is what builds your consciousness. I'm not doubting consciousness exists, that'd be silly.

Yes, your thoughts come from physics. Where else? My thoughts are mostly logical, because they're the result of a process of building dendrite connections based on experience. But I also occasionally make mistakes or have irrational choices based on mistakes of learning or forgetting through dendrites.

I don't believe free will exists; if you want to call that "true choice", okay, fine. But that said, the concepts of agency and choice exist regardless of free will, like I was saying. So we can call those "choice" and "true choice", then. Choice must exist for "true choice". So we agree choice exists, and computers make choices. But you feel humans have "true choice" and computers do not. Right?

As Xealot said, and as I said before, proving or disproving souls is impossible. We can't. It's the realm of religion. So at that point we're just talking about what each of us believes. But I think it's important to recognize that all human behavior could simply be the result of choice and not true choice. But if you believe in true choice that's cool, I just don't see... where that could even come from. If it's a soul, how do souls work? I can't imagine anything that isn't just taking the problem of humans thinking with physics and moving it somewhere else. You still need reasons, or else you have chaos.

Bushido Brown posted:

Can y'all just end this?

No, because we're having fun and being respectful to each other. A discussion does not mean it's a slapfight.

You can ignore us both if you aren't interested. Scrolling a website isn't all that hard, man, come on.

precision posted:

Yes. You're right. I feel like he is the one dragging it out. I'm not enjoying this.

I stopped until you started commenting again, lmao. If you're not enjoying it then you should stop. I'm having fun though, that's why I'm here.

precision
May 7, 2006

by VideoGames
I believe that I have true choice and true free will

I also believe that said belief is probably irrational and possibly the result of drugs and books

I couldn't possibly presume to speak for anyone else; I've never been them.

Bushido Brown
Mar 30, 2011

Zaphod, you're free to continue it here and I'm free to comment that the discussion is stultifying and that you seem more interested in typing out long Philosophy 101 posts than engaging your interlocutor earnestly.

You know precision doesn't agree with you.

You also know precision's viewpoint isn't uncommon.

I'd say you're free to stop, Zaphod, but...


ETA: And yes, I feel compelled to interject partially because I hate hamfisted philosophy discussion, as a former philosophy student.

Bushido Brown fucked around with this message at 00:41 on Dec 13, 2019

Xanderkish
Aug 10, 2011

Hello!
Maybe this will be the point, after 3,000 years, where we finally resolve the free will debate. Here, in the Something Awful forums. In the Watchmen TV thread.

Weedle
May 31, 2006




GoGoGadgetChris posted:

Now we see why AI researchers kill themselves

lol

duck trucker
Oct 14, 2017

YOSPOS

Bushido Brown posted:

Better question: why do we think Angela won't forgive Will? He must know that she's going blue, no?

At the very least he might think she probably won't forgive him for killing Judd even though she's the one who gave him the idea.
