  • Locked thread
Sham bam bamina!
Nov 6, 2012

ƨtupid cat

PresidentBeard posted:

I think fade was joking.
So was Lottery of Babylon.


bewilderment
Nov 22, 2007
man what



The idea of bringing someone back from the dead just as they were is a bit silly, as is the prediction that immortality will happen just around whenever the predictor feels their old age. But I don't think the pursuit of immortality is an inherently terrible or unworkable ideal.

Whatever the 'human condition' is changes from generation to generation (at least in developed nations). I don't think it's meaningful to say "this will always be part of humanity, we will never get rid of it" unless you'd consider anything lacking that trait to be no longer human.

Nessus
Dec 22, 2003

After a Speaker vote, you may be entitled to a valuable coupon or voucher!



bewilderment posted:

The idea of bringing someone back from the dead just as they were is a bit silly, as is the prediction that immortality will happen just around whenever the predictor feels their old age. But I don't think the pursuit of immortality is an inherently terrible or unworkable ideal.

Whatever the 'human condition' is changes from generation to generation (at least in developed nations). I don't think it's meaningful to say "this will always be part of humanity, we will never get rid of it" unless you'd consider anything lacking that trait to be no longer human.
I think there's a big difference between 'hypothetically, we'll eventually be able to use some combination of genetics, drugs and cybernetic bullshit to make it so that people don't die due to having too many birthdays' and 'we will abolish death and computer-resurrect everyone who has ever lived, in a manner curiously similar to Judeo-Christian images of the afterlife.'

I think the former is theoretically possible, though there may be some limits on what can be done to humans in a single generation (you might have to engineer their brains to release longevium or something, and children born from THAT POINT FORWARDS could be clinically immortal, for instance) and it's possible that some things may be "doable" but so damnably expensive that only the very rich can eke out a few extra decades. I don't think a guy who lived to be 140 because of brain gene drugs and a cybernetic heart is somehow fundamentally not human.

However, I don't think the digital millennium is gonna happen, ever. But you often get the claim (I don't remember what the fallacy is called) that because you think the digital millennium is a bunch of crackpot horseshit, you're somehow against people getting cyberhearts and so forth.

e: Like even with an indefinitely prolonged lifespan with no bugs in it - itself pretty goddamn unlikely - people would still die from infectious diseases, injuries, self-harm, etc. and would still possibly have greatly decreased quality of life if they had brain damage or something.

Iunnrais
Jul 25, 2007

It's gaelic.
I read somewhere, and I'd love it if I could find it again, some actuary stating that if old age and disease were no longer valid causes of death, people would tend to live an average of somewhere in the 900-1100 year range due to injury from accidents, murder, war, etc. Anyone know who said that, and where?

RPATDO_LAMD
Mar 22, 2013

🐘🪠🍆

Iunnrais posted:

I read somewhere, and I'd love it if I could find it again, some actuary stating that if old age and disease were no longer valid causes of death, people would tend to live an average of somewhere in the 900-1100 year range due to injury from accidents, murder, war, etc. Anyone know who said that, and where?

Well the CDC releases info on death rates. According to this pdf from 2011, the death rate for 25-29 year olds is about 100 per 100,000, so if you sprinkled fairy dust on them and kept them from ever aging, they'd live an average of about 1000 years.
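The arithmetic behind that ~1000-year figure is just the mean of a constant-hazard model; here's a back-of-the-envelope sketch, assuming (as the post does) that the 100-per-100,000 annual rate for 25-29 year olds simply holds forever:

```python
# Back-of-the-envelope actuarial sketch (assumption: the quoted
# CDC rate of 100 annual deaths per 100,000 people stays constant
# forever, i.e. aging and disease never catch up with you).
# With a constant annual death probability p, years-until-death is
# geometrically distributed, so the mean lifespan is 1/p.

deaths_per_100k = 100                  # annual deaths per 100,000 (ages 25-29)
p = deaths_per_100k / 100_000          # annual probability of dying = 0.001

mean_years = 1 / p
print(mean_years)  # 1000.0
```

The 900-1100 range in the earlier post falls out of the same model if the assumed flat death rate varies between roughly 90 and 110 per 100,000.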

Lottery of Babylon
Apr 25, 2012

STRAIGHT TROPIN'

bewilderment posted:

But I don't think the pursuit of immortality is an inherently terrible or unworkable ideal.

But these crackpots' particular visions of immortality are inherently terrible and unworkable. One wants to rebuild his dead father out of old Facebook posts and lego bricks, and the other wants to sit around waiting for the laws of physics to magically change in a way that suits him. Immortality would be nice, but the fact of the matter is that you almost certainly won't live forever, and deluding yourself about that is just stupid and unhealthy.

e: I mean, I think time travel would be cool to have, but if Big Yud started raving about how we should dedicate every resource in the world solely to inventing time travel because something something Pascal's Wager shut up and multiply eight lives per dollar I want to rewind to meet my dead dad, then that would be dumb and mockworthy and I wouldn't feel any need to include a disclaimer of "oh but if time travel were real then that would be pretty neat" in every mocking post.

Lottery of Babylon fucked around with this message at 10:47 on Dec 8, 2014

SubG
Aug 19, 2004

It's a hard world for little things.

Nessus posted:

I think one of the key Yudkowsky Thought ideas is that at a certain level of resolution, a simulation of you is you, which is why the prospect of a future super-computer that will torture a million copies of you is terrifying. You don't know that you yourself are not a simulation, subjectively - perhaps everything happening NOW, apparently, is actually just a memory that the AI gives you so you are established as "you" in its simulation, and any minute now it's going to wheel in the Excruciator.
I would decline to be shot in the stomach even if you convinced me that you had a perfect copy of me in the next room. Because that copy is not me in the sense that it will not have to put up with being shot in the stomach if I am. If we wish to stipulate that both the copy and I are simulations or pod people or whatever the gently caress that doesn't really change the argument.

If somehow or other I was simultaneously experiencing being both here and in the other room, then perhaps you'd have an argument, but that doesn't appear to be what's happening.

Triple Elation
Feb 24, 2012

1 + 2 + 4 + 8 + ... = -1
Thought ethics was a tough nut? Just wait until you get your hands on the new and improved Ethics in Situations Where Your Naive Concept of Self No Longer Applies! BTW we have the answers to all the resulting conundrums already, we got them using Bayes' theorem

Chamale
Jul 11, 2010

I'm helping!



SubG posted:

I would decline to be shot in the stomach even if you convinced me that you had a perfect copy of me in the next room. Because that copy is not me in the sense that it will not have to put up with being shot in the stomach if I am. If we wish to stipulate that both the copy and I are simulations or pod people or whatever the gently caress that doesn't really change the argument.

If somehow or other I was simultaneously experiencing being both here and in the other room, then perhaps you'd have an argument, but that doesn't appear to be what's happening.

Yeah, I don't understand the logic where having a perfect copy of me means I don't die. If I could step into a cloning machine and get an immortal copy of myself, my self does not become immortal, only a copy does. I see the same problem with any kind of consciousness-uploading immortality idea, unless my self somehow can travel from my brain into a computer. It's the same problem as the teleporter from Star Trek, only even nerdier.

Tiggum
Oct 24, 2007

Your life and your quest end here.


SubG posted:

I would decline to be shot in the stomach even if you convinced me that you had a perfect copy of me in the next room. Because that copy is not me in the sense that it will not have to put up with being shot in the stomach if I am. If we wish to stipulate that both the copy and I are simulations or pod people or whatever the gently caress that doesn't really change the argument.

If somehow or other I was simultaneously experiencing being both here and in the other room, then perhaps you'd have an argument, but that doesn't appear to be what's happening.
The point of the simulated copies being threatened is that you don't know if you are one. The real you and the fake you are both threatened. The real you obviously won't be punished in any way for refusing, but the fake you will be. If you can't be sure that you're not the fake, you can't be sure it's an empty threat.

Chamale posted:

Yeah, I don't understand the logic where having a perfect copy of me means I don't die. If I could step into a cloning machine and get an immortal copy of myself, my self does not become immortal, only a copy does. I see the same problem with any kind of consciousness-uploading immortality idea, unless my self somehow can travel from my brain into a computer. It's the same problem as the teleporter from Star Trek, only even nerdier.
It's the same problem in that it isn't a problem, unless you believe you have a soul. But if you have a soul then you don't need any technology to make you immortal because you already are. The Star Trek transporters just do more rapidly what happens anyway. All the bits of you are constantly being replaced and your mind and personality are constantly changing. The "you" that continues to exist from year to year or even one moment to the next is just an idea, not something you can actually identify or isolate. If somehow a perfect copy of you is created and the original is destroyed, the copy is just as much you as the original was.

mediocre dad okay
Jan 9, 2007

The fascist don't like life then he break other's
BEAT BEAT THE FASCIST

Tiggum posted:

It's the same problem in that it isn't a problem, unless you believe you have a soul. But if you have a soul then you don't need any technology to make you immortal because you already are. The Star Trek transporters just do more rapidly what happens anyway. All the bits of you are constantly being replaced and your mind and personality are constantly changing. The "you" that continues to exist from year to year or even one moment to the next is just an idea, not something you can actually identify or isolate. If somehow a perfect copy of you is created and the original is destroyed, the copy is just as much you as the original was.

The problem isn't whether the thing is "you", though, the problem is consciousness. We have no empirical evidence of what happens to your consciousness when you die (how could we, anyway?). The assumption is that it simply stops existing, and so if you destroy the body of (and so effectively kill) an individual only to recreate it somewhere else, then your consciousness ends there. It may be that by doing this you are creating a consciousness in the new body/program/robot/whatever, and if so, from that entity's point of view consciousness will never have been lost at all, merely transferred. From the original consciousness' point of view, however, it kept going until the body was destroyed at which point it disappeared.

Of course, this line of thinking makes many assumptions about consciousness. The thing is, consciousness is extremely hard to understand and analyse, and philosophers have been struggling with it literally since the dawn of time. Sure, you may not believe in souls (I don't) but consciousness is a thing, and unless we can manage to understand what it is we can't really talk about "consciousness uploading" or anything like that in any way other than pure science-fictional speculation.

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Tiggum posted:

It's the same problem in that it isn't a problem, unless you believe you have a soul. But if you have a soul then you don't need any technology to make you immortal because you already are. The Star Trek transporters just do more rapidly what happens anyway. All the bits of you are constantly being replaced and your mind and personality are constantly changing. The "you" that continues to exist from year to year or even one moment to the next is just an idea, not something you can actually identify or isolate. If somehow a perfect copy of you is created and the original is destroyed, the copy is just as much you as the original was.
Bullshit. Of course I'm changing; I'm changing in a continuity. Don't pretend that it's meaningless to terminate that continuity just because you can hypothetically start a new instance.

If I kill you, the you that I kill will not, in fact, magically resume life like before if I make a copy. It's a loving copy. It may be equally valid as a Tiggum, but you are still dead and aren't coming back.

Sham bam bamina! fucked around with this message at 15:03 on Dec 8, 2014

Bunni-kat
May 25, 2010

Service Desk B-b-bunny...
How can-ca-caaaaan I
help-p-p-p you?
You guys are bad at philosophy. If a perfect copy of you is brought into existence the same femtosecond you are annihilated, and no one ever knows about the change, again, because it's a perfect copy, is it just as good as being you? If you don't believe in the essential essence of a soul, why isn't it just as good as being you? You can talk about termination of consciousness, but if the copy believes itself to be you, and didn't experience the disruption, why is it not you?

No. Really. If a perfect copy of you exists and never knew of an interruption of consciousness, why is it not you?

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Avenging_Mikon posted:

No. Really. If a perfect copy of you exists and never knew of an interruption of consciousness, why is it not you?
It is me. But I'm also me. And now I'm dead.

pentyne
Nov 7, 2012
All this AI and uploading stuff has annoyed me to the point where I really want to know what the gently caress happened in Permutation City.


The climax of the 'real' world is when nerdy mc-sperger hooks up with that computer simulation woman, and when he gets access to the world computing grid for a few seconds before it ends, they celebrate by having terrible sex and then he kills himself. The point seemed to be that in those few seconds the simulation of the world elite operates at an insanely low efficiency, so everyone in the simulation will technically experience an eternity of existence despite the simulation ending in reality shortly after it started.


Is that really the future people like LW want? Some weirdo techno-mystery where actual reality takes second place to what can be accomplished by changing the subjective human experience?

CheesyDog
Jul 4, 2007

by FactsAreUseless

Sham bam bamina! posted:

It is me. But I'm also me. And now I'm dead.

Anyone grappling with this issue should consider reading Metzinger's "Being No One".

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

CheesyDog posted:

Anyone grappling with this issue should consider reading Metzinger's "Being No One".
There is no issue. When you die, you're dead; it's true by definition and self-evident.

Terrible Opinions
Oct 18, 2013



Avenging_Mikon posted:

You guys are bad at philosophy. If a perfect copy of you is brought into existence the same femtosecond you are annihilated, and no one ever knows about the change, again, because it's a perfect copy, is it just as good as being you? If you don't believe in the essential essence of a soul, why isn't it just as good as being you? You can talk about termination of consciousness, but if the copy believes itself to be you, and didn't experience the disruption, why is it not you?

No. Really. If a perfect copy of you exists and never knew of an interruption of consciousness, why is it not you?

My continued existence is important because of the world's effects on my specific first-person perspective. The perfect copy has a different perspective that I cannot reach as the first person. Thus it is another person to me. Though my effects on the world may remain the same, the world's effects on me no longer exist if the copy replaces me. I cannot feel or experience what the copy does, and thus it has no value to me.

VictualSquid
Feb 29, 2012

Gently enveloping the target with indiscriminate love.

Sham bam bamina! posted:

There is no issue. When you die, you're dead; it's true by definition and self-evident.
That means the uploading branch of immortalism is not concerned with avoiding death?

Bunni-kat
May 25, 2010

Service Desk B-b-bunny...
How can-ca-caaaaan I
help-p-p-p you?

PresidentBeard posted:

My continued existence is important because of the world's effects on my specific first-person perspective. The perfect copy has a different perspective that I cannot reach as the first person. Thus it is another person to me. Though my effects on the world may remain the same, the world's effects on me no longer exist if the copy replaces me. I cannot feel or experience what the copy does, and thus it has no value to me.

Failure. I'm not talking about you being the copy, but the copy being you. Not engaging with the topic, rewrite and resubmit.

Sham bam bamina! posted:

It is me. But I'm also me. And now I'm dead.

Ibid.

Terrible Opinions
Oct 18, 2013



Avenging_Mikon posted:

Failure. I'm not talking about you being the copy, but the copy being you. Not engaging with the topic, rewrite and resubmit.

I'm not arguing that you're wrong. I'm just stating that there is no purpose to perfect-copy immortality. Why should I care about continuing to impact others once my consciousness is gone? It would take an enormous amount of hubris to believe that my existence inherently benefits the world at large, and by the time I die most of the individuals who care about me will also be dying. The only real use I could see would be for the benefit of family members of those who die from accidents or violence, and even that seems like something that would be massively complicated by the concreteness of the original's death.

CheesyDog
Jul 4, 2007

by FactsAreUseless

Sham bam bamina! posted:

There is no issue. When you die, you're dead; it's true by definition and self-evident.

The cessation of consciousness (or the cessation of self-perception of consciousness) does not equal death.

Triple Elation
Feb 24, 2012

1 + 2 + 4 + 8 + ... = -1
The concept of self, as we know it, is not capable of dealing with this sort of situation

A person has my memories, my thoughts, my hopes and fears, he looks exactly like me, he is identical to me in every way. But if you hurt him I will not bleed, if he learns something I will stay ignorant of it, if you kill him I will live on. Is he me? Is he not me? What is a "me"?

Arguments about semantics are the worst

mediocre dad okay
Jan 9, 2007

The fascist don't like life then he break other's
BEAT BEAT THE FASCIST

Triple Elation posted:

The concept of self, as we know it, is not capable of dealing with this sort of situation

A person has my memories, my thoughts, my hopes and fears, he looks exactly like me, he is identical to me in every way. But if you hurt him I will not bleed, if he learns something I will stay ignorant of it, if you kill him I will live on. Is he me? Is he not me? What is a "me"?

Arguments about semantics are the worst

My thoughts exactly. Before we can even begin to address these issues we need to have a grasp on what consciousness and the self actually are - which we don't.

CheesyDog
Jul 4, 2007

by FactsAreUseless

Kulebri posted:

My thoughts exactly. Before we can even begin to address these issues we need to have a grasp on what consciousness and the self actually are - which we don't.

Seriously, read one of Metzinger's books on the subject. We have a pretty good idea.

Triple Elation
Feb 24, 2012

1 + 2 + 4 + 8 + ... = -1
Do we have a definite answer to "somebody shoots a perfect copy of you, do you shrug or recoil in abject horror"?

e: Personally, I cringe

Serious Cephalopod
Jul 1, 2007

This is a Serious post for a Serious thread.

Bloop Bloop Bloop
Pillbug

CheesyDog posted:

The cessation of consciousness (or the cessation of self-perception of consciousness) does not equal death.

It sure as hell is what makes people afraid of death.

If I am copied and not destroyed in a teleporter accident, and my exact copy starts loving my boyfriend, I'm not getting laid, even though the experience is the same for him, and even if a resulting child would be genetically mine. In my subjective experience, I wouldn't even be pregnant.

A copy is me, and technically extends my life and actions for all purposes except continued consciousness for me prime- no problem at all for my living copy, friends, or relatives (if they don't know or subscribe to your idea of death), but me prime experiences permanent loss of consciousness, and the mind fades to nonexistence, which is what scares people about death in the first place.

Death is the termination of all biological functions that sustain a living organism. Atomizing me prime would result in me prime's death. Me prime's ability to continue experiencing reality would cease. That last part is what is worrisome.

Serious Cephalopod fucked around with this message at 17:07 on Dec 8, 2014

Duckbill
Nov 7, 2008

Nice weather for it.
Grimey Drawer
As I understand it it's much simpler than 'is the copy me?' Yud's actual argument is closer to a computer saying "I have someone just like you in the next room. One of these two people is a simulation/clone/whatever, and doesn't know it. Give me all your money or I'll shoot the fake."

Yud then expands it to infinitely many fakes and an eternity of torture, but he always does that.

CheesyDog
Jul 4, 2007

by FactsAreUseless
Is someone suffering from Cotard's syndrome dead because they no longer self-perceive consciousness?

Consciousness is an emergent self-model built through experience and neurological development shaping the brain. An exact copy of you would manifest your exact consciousness/self-model up until the point where a deviation in your experiences occurs. It would be you up until that point.

There is limited continuity of consciousness: it's essentially constructed as-needed from your neurological structures. Someone knocked out for surgery wakes up with the same "consciousness"/self-model as before, but that consciousness ceased when they were put under and re-emerged from existing neurological structures when they came to. There was no "I" during that time.

The problem isn't what would happen after the exact copy exists (the emergent self-model would be you by any definition); it's that you can't make an exact copy of something/someone, and that any emulation of the human brain would need to emulate the body and the environment for that body at a level of detail that becomes staggeringly complex. Even assuming you could host and emulate a copy of the human brain, it's still running all the unconscious stuff while you're knocked out, so how do you pick it apart at a molecular level non-destructively and without altering its performance while measuring it?

HMS Boromir
Jul 16, 2011

by Lowtax
Personally, if you killed all my friends and replaced them with immortal copies I would probably learn to live with it fairly quickly, whereas if you told me you were going to kill me and replace me with an immortal copy of me I would be less than enthused. The distinction is pretty obvious and I don't think there's any contradiction in wanting Actual You to live on in AI upload land while accepting Fake New Dad as being just as good as Dead Dad.

Serious Cephalopod
Jul 1, 2007

This is a Serious post for a Serious thread.

Bloop Bloop Bloop
Pillbug

CheesyDog posted:

Is someone suffering from Cotard's syndrome dead because they no longer self-perceive consciousness?


As someone who has experienced the feeling of self death, while dead I did not feel unconscious. I did not feel unaware. Perception of being alive may be a part of our concept of self, but it is not the whole of consciousness. Perception of being and belonging to your body is a different part. Perception of self identity is still another. Some combo of all of these makes me feel that an identical copy is not me, and destruction of this body or mind would result in my death, even if another identical me existed. Because I could never perceive my twin's experiences first hand, I cannot identify my twin as me. Things that happen to myself do not happen to my twin. Things that happen to my twin do not happen to me. So, if one of us were to die, my (and her) own consciousness would perceive another person dying, and perceive our own death still awaiting. The other would be dead.

If I were to be copied, both of me would have an existential meltdown, anyway.

Also, humans have an authenticity dilemma. If a loved one of mine were to die, and be replaced with an identical copy , and I knew, I would not perceive the copy as the authentic person. Maybe with time, but not right away.

Piell
Sep 3, 2006

Grey Worm's Ken doll-like groin throbbed with the anticipatory pleasure that only a slightly warm and moist piece of lemoncake could offer


Young Orc

CheesyDog
Jul 4, 2007

by FactsAreUseless
At the point where a copy has a divergent experience it ceases to be a copy, because that will result in neurological shaping that leads to a different consciousness. The point is that it's the construction of the copy that's impossible, and that even such a copy being constructed would at best result in a temporary split-consciousness that would diverge promptly into two unique individuals. I'm not disagreeing and saying the copy would Be You, but I am taking the position that this is only because the copying process is unlikely to be perfect and because differing experience leads to neurological restructuring. Yes, you would not experience what it experienced, but the act of experiencing something is itself inherently part of shaping consciousness.

I bring up Cotard's to demonstrate that consciousness is emergent from structure, and that consciousness exists along a range and is frequently impaired or disrupted temporarily. Consciousness only maintains continuity in the sense that it emerges from structures that continue to exist, and if those structures could be copied at sufficient resolution, the emergent consciousness would be "you" in the same sense that "you" wake up after surgery.

To be clear, I believe the Singulatarians are wrong, because I do not believe that a person could be emulated at a sufficient level of detail for processing reasons, and because I don't believe that you could non-destructively examine the human body at a level sufficient to create a copy.

Nessus
Dec 22, 2003

After a Speaker vote, you may be entitled to a valuable coupon or voucher!



I do think that continuity matters, even if it seems as if this could be generally disproven. I went under general anesthetic once, which is probably the closest practical approach to 'dying,' and I don't feel like there was a discontinuity even if I suppose in a sense my brain stopped working for a while there. However, if I was put under general anesthetic, thrown through the YUDKOWSKY PROCESSOR, and then my perfect duplicate was brought up into consciousness, I don't feel as though there would be continuity.

Similarly, destructive uploading as some kind of "now you become the computer" thing would seem like it creates a duplicate of me, it doesn't upload "me, my subjective experiences." Some kind of brain implant that gradually interleaved my brain's activities with my iPhone might do it because it would be like a transition.

Anyway all of this is interesting, but I think Yudkowsky deals with this by saying that, first, the AI God of the future can just make a copy of you, yes YOU, in all meaningful ways, purely from historical records, internet posts, and so on, and what is more, it inevitably WILL do that, at least if it is a MIRI Approved Friendly AI.

I think the basilisk concept got put together as a sort of mockery of the absurdity of this, his Timeless Decision Theory horseshit, and the 'shut up and multiply' crap.

Telarra
Oct 9, 2012

If there's one genuinely scary thing about a future that has brain uploading, it's that the only person who actually cares about your continuity is yourself.

Munin
Nov 14, 2004


CheesyDog posted:

The cessation of consciousness (or the cessation of self-perception of consciousness) does not equal death.

Strange psychological disorders aside, sleep is an everyday cessation of consciousness. I presume you were also including that in your statement.

The whole continuity-of-consciousness thing as a differentiation between two selves opens up other cans of worms. You then get into the continuity of the body, which itself has issues.

I remember reading this little article where they go through the various ways in which people define death and the various ways one could (hypothetically) break one whilst not "killing" someone according to the other definitions. It closed on the hypothetical where they put someone through each of these in turn and then asked what you would consider the relationship between the resultant person and the original to be.

Serious Cephalopod
Jul 1, 2007

This is a Serious post for a Serious thread.

Bloop Bloop Bloop
Pillbug
Wouldn't brain death be a significant divergent experience?

Triple Elation
Feb 24, 2012

1 + 2 + 4 + 8 + ... = -1

quote:

it inevitably WILL do that, at least if it is a MIRI Approved Friendly AI.

Wait, Yudkowsky actually thinks that a "friendly" AI will do such a thing? I get that this Roko person brought up the idea, but did Yudkowsky actually concur? This is his idea of 'friendly'?

A true friendly AI would obviously first engage in acausal conversation with you and convince you to submit to the torture willingly.

Wandering Knitter
Feb 5, 2006

Meow
I kind of want to see the "takes 30 supplements a day" guy team up with Soylent's "I no longer poop" guy. Imagine what they could create! :allears:


Chamale
Jul 11, 2010

I'm helping!



A copy of myself is not me, it's another person with a different consciousness. Continuity of the brain is crucial to having the same consciousness, and the brain gradually replacing itself is different from building a new brain. If I create a perfect copy of myself, that copy is like my child - I'd care about him more than a random stranger because he's related to me, but I don't get to experience what the copy experiences.
