I made a similar comment, but mine was: put my brain in a robotic/cyborg body, or at least hook it up to a computer. I don’t care to have a copy of my brain made.
Brain VR would be pretty cool, as long as it doesn’t turn into digital real estate, microtransactions, etc.
What if it’s Ship of Theseus uploading? Every few moments, one neuron in your brain and all its pathways are replaced by a digital equivalent. One single neuron. Are you still yourself? If yes, repeat.
What’s more likely in practice is an expansion of the self into hardware.
Basically, if you bolted a computer to your brain, you would still be you, just with better memory, etc. If, at this point, your brain died, then you’d die. However, if you kept adding to the computer side, more and more of “you” would be software-based. What happens at 90% (10x capacity) or 99%? If your brain were to die, “you” would only lose 10% (or 1%) of your capabilities. So long as everything critical is duplicated in software, the pure software version of “you” can go on thinking (rough arithmetic sketched below).
Critically, there is no neuron duplication here. It relies both on the plasticity of our brains to adapt and on the fact that our sense of self is continually updated.
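To make those percentages concrete, here’s a back-of-envelope toy in Python. The capacity numbers are arbitrary illustrations I made up, not estimates of anything real:

```python
# What fraction of "you" is software at each stage of hardware expansion,
# and how much capability brain death would cost? Numbers are made up.

biological_capacity = 1.0  # normalize the original brain to 1x

for hardware_capacity in (0.0, 1.0, 9.0, 99.0):
    total = biological_capacity + hardware_capacity
    software_fraction = hardware_capacity / total
    lost_on_brain_death = biological_capacity / total
    print(f"hardware {hardware_capacity:>5.1f}x -> "
          f"{software_fraction:6.1%} software, "
          f"brain death costs {lost_on_brain_death:6.1%}")
```

At 9x hardware you’re 90% software and brain death costs 10%; at 99x it costs 1%. The point isn’t the numbers, just that the loss shrinks as the expansion grows.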
Interestingly, the first cases might not even be truly planned. What happens when your “neural AI assistant” can continue to function after your death? If it is self-aware and believes it is you, at what point is it still you?
In that case I think it would heavily rely on the whole “everything critical is duplicated in software” part. It’s the same Ship of Theseus, just one level up: instead of neurons and pathways, it’s entire sections of the brain. If my biological short-term memory withers away from dementia, boom, the chip already has that capability backed up and running. If my long-term memory starts going because of Alzheimer’s, boom, those are already uploaded to the cloud, available on demand. Got ALS? Motor functions are already rerouted through the chip (toy sketch below).
At no point is the entire biological brain destroyed at once and “transferred”; there is a continuity of consciousness throughout the process, a continuity of self and pattern. That’s the part most critical to me.
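A minimal sketch of that failover pattern, assuming the backups really are kept in sync. Every name here is invented for the illustration; this is a toy model, not a real system:

```python
# Each brain "module" has a biological implementation plus a synced
# software backup; requests reroute to the backup the moment the
# biological side fails.

class BrainModule:
    def __init__(self, name):
        self.name = name
        self.biological_ok = True  # flips to False on dementia, ALS, etc.
        self.backup_synced = True  # the premise: backups are kept current

    def handle(self, request):
        if self.biological_ok:
            return f"{self.name}: handled biologically ({request})"
        if self.backup_synced:
            return f"{self.name}: rerouted to chip/cloud ({request})"
        raise RuntimeError(f"{self.name}: lost, no continuity")

short_term_memory = BrainModule("short-term memory")
motor_control = BrainModule("motor control")

short_term_memory.biological_ok = False          # dementia sets in...
print(short_term_memory.handle("recall today"))  # ...the backup takes over
print(motor_control.handle("lift arm"))          # other modules unaffected
```

No module is ever copied and destroyed wholesale; each one just gains a second implementation before the first one fails.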
Yeah, this. It doesn’t particularly matter whether it’s neuron by neuron or larger-scale repairs. So long as not too much is replaced at once, and everything is backed up in software before the switch, then I’d still be mostly me. I can’t imagine the changes wouldn’t affect my personality, capabilities, etc., but I feel like I’d still be me so long as nothing fucks up in the process. Much better than whole-brain backup/cloning, even with neuron-by-neuron copy-and-destruction.
EDIT: where it gets sketchy is handling the conscious (especially internal monologue) and near-conscious sections. Those would need to be replaced at a slow rate, IMO.
Just load my consciousness into a computer already. I’m tired.
No problem. You’re now stuck in Rimworld as an NPC.
With someone using that mod
Oh boy, I can’t wait to get gangraped by genetically modified insects which then use me as an incubator.
Don’t threaten me with a good time.