Consciousness and conscience are not the same thing; this naming is horrible.
What if you do it in a Ship of Theseus type of way? Like, swapping each part of the brain for an electronic one slowly until there is no organic brain left.
Wonder if that will work.
Right? Like, what if, as cells die or degrade, instead of being replaced by the body naturally they were replaced by nanites/cybernetics/tech magic? If the process of fully converting took place over the course of 10 years, then I don’t see how the subject would even notice.
It’s an interesting thing to ponder.
The subject doesn’t notice if you end their consciousness, either.
Even if it were possible to scan the contents of your brain and reproduce them in a digital form, there’s no reason that scan would be anything more than bits of data on the digital system. You could have a database of your brain… but it wouldn’t be conscious.
No one has any idea how to replicate the activity of the brain. As far as I know there aren’t any practical proposals in this area. All we have are vague theories about what might be going on, and a limited grasp of neurochemistry. It will be a very long time before reproducing the functions of a conscious mind is anything more than fantasy.
You could have a database of your brain… but it wouldn’t be conscious.
Where is the proof of your statement?
Well, there’s no proof; it’s all speculative, and even the concept of scanning all the information in a human brain is fantasy, so there isn’t going to be a real answer for a while.
But just as a conceptual argument, how do you figure that a one-time brain scan would be able to replicate active processes that occur over time? Or would you expect the brain scan to be done over the course of a year or something like that?
You make a functional model of a neuron that can behave over time the way real neurons do. Then you map all the synapses and their weights. The synapses and their weights are the starting point, and your neuron model is the function that produces subsequent states.
Problem is, brains don’t have “clock cycles”, at least not as strictly as artificial neural networks do.
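To make the idea concrete, here’s a minimal sketch of that “state plus update rule” picture, assuming a leaky integrate-and-fire neuron model and a tiny made-up weight matrix; every name and constant below is an illustrative assumption, not anything a real scan would give you, and the fixed time step dt is exactly the artificial “clock cycle” being pointed at.

# Minimal sketch: a few leaky integrate-and-fire neurons driven by a synaptic
# weight matrix. The weights are the "scanned" starting point; the update rule
# is the functional neuron model that produces subsequent states.
# All constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N = 5                                   # number of neurons (a human brain has ~86 billion)
W = rng.normal(0.0, 0.5, (N, N))        # "scanned" synaptic weights (made up here)
np.fill_diagonal(W, 0.0)                # no self-connections in this toy

tau = 0.02                              # membrane time constant (s)
v_rest, v_thresh, v_reset = -65.0, -50.0, -70.0   # mV, textbook-ish values
dt = 0.001                              # fixed time step: the artificial "clock cycle"

v = np.full(N, v_rest)                  # membrane potentials: the evolving state
spiked = np.zeros(N, dtype=bool)

for step in range(1000):                # simulate one second of activity
    external = rng.normal(20.0, 5.0, N)            # stand-in for sensory drive
    synaptic = W @ spiked.astype(float) * 10.0     # input from last step's spikes
    dv = (-(v - v_rest) + external + synaptic) / tau
    v = v + dv * dt                                # Euler step: crude and discrete
    spiked = v >= v_thresh
    v[spiked] = v_reset                            # fire and reset

The only point of the sketch is that a snapshot of connections plus an update rule is, in principle, enough to roll the system forward in time. Whether that can be done at brain scale, and whether experience comes along for the ride, is the whole debate above.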
Consciousness might not even be “attached” to the brain. We think with our brains but being conscious could be a separate function or even non-local.
Thank you for this. That was a fantastic survey of some non-materialistic perspectives on consciousness. I have no idea what future research might reveal, but it’s refreshing to see that there are people who are both very interested in the questions and also committed to the scientific method.
I read that and the summary is, “Here are current physical models that don’t explain everything. Therefore, because science doesn’t have an answer, it could be magic.”
We know consciousness is attached to the brain because physical changes in the brain cause changes in consciousness. Physical damage can cause complete personality changes. We also have a whole spectrum of observed consciousness, from the flatworm with 300 neurons to the chimpanzee with 28 billion. Chimps have emotions, self-reflection, and everything but full language. We can step backwards from chimps to simpler animals and it’s a continuous spectrum of consciousness. There isn’t a hard divide; it’s only less. Humans aren’t magical.
And we know the flatworm and chimp don’t have non-local brains because?
I’m just saying, it didn’t seem like anyone was arguing that humans were special, just that consciousness may be non-local. Many quantum processes are, and we still haven’t ruled out the possibility of quantum phenomena happening in the brain.
Because flatworm neurons can be exactly modeled without adding anything extra.
It’s like if you said, “And we know a falling ball isn’t caused by radiation because?” If you can model a ball dropping in a vacuum without adding any extra variables to your equations, why claim something extra? It doesn’t mean radiation couldn’t affect a falling ball. But adding radiation isn’t needed to explain a falling ball.
The neurons in a flatworm can be modeled without adding quantum effects. So why bother adding in other effects?
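To put the ball analogy in numbers: a model with gravity alone already reproduces what you’d measure, so an extra “radiation” term has nothing left to explain. (The heights and “measured” times below are invented for illustration.)

# The ball-drop analogy in code: gravity alone already matches the
# "measurements", so an extra term has nothing left to explain.
# Heights and measured times are made up for illustration.
import math

g = 9.81  # m/s^2

drops = [            # (height in metres, "measured" fall time in seconds)
    (1.0, 0.45),
    (5.0, 1.01),
    (20.0, 2.02),
]

for height, measured in drops:
    predicted = math.sqrt(2 * height / g)   # t = sqrt(2h/g), gravity only
    print(f"h={height:5.1f} m  predicted={predicted:.2f} s  measured={measured:.2f} s")

Same logic for the flatworm: if a plain electrochemical model of its neurons reproduces the observed behaviour, there’s no explanatory gap for quantum effects to fill.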
And a minor correction: “non-local” means faster than light. Quantum effects do not allow faster-than-light information transfer. Consciousness is, by definition, information. So even if quantum processes affected neurons macroscopically, there still couldn’t be non-local consciousness.
We already have seen “non-local” quantum effects, though - https://today.ucsd.edu/story/quantum-material-mimics-non-local-brain-function
“that electrical stimuli passed between neighboring electrodes can also affect non-neighboring electrodes. Known as non-locality, this discovery is a crucial milestone”
That’s not quantum non-locality. The journalist didn’t know how to interpret the actual data.
"Quantum nonlocality does not allow for faster-than-light communication,[6] "
https://en.m.wikipedia.org/wiki/Quantum_nonlocality
Quantum non-locality is like taking two playing cards, sealing them in envelopes, mailing one to your friend across the country, and then opening yours. The instant you see your card, you know which card is in his envelope, faster than any light signal could have travelled between you. But that doesn’t allow information transfer.
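Here’s that analogy as a tiny simulation, just to show that the correlation is fixed when the cards are sealed, so opening your envelope only updates what you know. (This is of course a classical correlation; genuine quantum non-locality differs in ways Bell tests can detect, but the “no signalling” part is the same. Everything below is illustrative.)

# The sealed-envelope analogy: the correlation is fixed at dealing time.
# Opening your envelope tells you the other card instantly, but nothing
# travels to your friend and nothing you do changes what they see.
import random

def deal():
    # Seal one red and one black card into two envelopes at random.
    cards = ["red card", "black card"]
    random.shuffle(cards)
    return cards[0], cards[1]          # (your envelope, friend's envelope)

trials = 100_000
friend_sees_red = 0
for _ in range(trials):
    mine, friends = deal()
    # You open yours and immediately "know" the other card, only because the
    # pair was correlated from the start, not because a signal was sent.
    inferred = "black card" if mine == "red card" else "red card"
    assert inferred == friends
    if friends == "red card":
        friend_sees_red += 1

# Your friend's card is 50/50 regardless of anything you do on your end,
# which is why no information is transferred.
print(f"friend sees the red card in {friend_sees_red / trials:.1%} of trials")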
I understand your point. But science has also shown us over time that things we thought were magic were actually things we can figure out. Consciousness is definitely up there in that category of us not fully understanding it. So what might seem like magic now, might be well-understood science later.
Not able to provide links at the moment, but there are also examples on the other side of the argument that lead us to think that maybe consciousness isn’t fully tied to physical components. Sure, the brain might interface with senses, consciousness, and other parts to give us the whole experience as a human. But does all of that equate to consciousness? Is the UI of a system the same thing as the user?
I think we’re going to learn how to mimic a transfer of consciousness before we learn how to actually do one. Basically we’ll figure out how to boot up a new brain with all of your memories intact. But that’s not actually a transfer, that’s a clone. How many millions of people will we murder before we find out the Zombie Zuckerberg Corp was lying about it being a transfer?
What’s the difference between the two?
A. You die and a copy exists
B. You move into a new body
Right, how is moving into a new body not dying?
In one scenario you continue. In the other you die but observers think you continue because it’s a copy of you.
We don’t even know what consciousness is, let alone if it’s technically “real” (as in, physical in any way). It’s perfectly possible an uploaded brain would be just as conscious as a real brain, because there was no physical thing making us conscious, and rather it was just a result of our ability to think at all.
Similarly, I’ve heard people argue a machine couldn’t feel emotions because it doesn’t have the physical parts of the brain that allow that, so it could only ever simulate them. That argument has the same hole in that we don’t actually know that we need those to feel emotions, or if the final result is all that matters. If we replaced the whole “this happens, release this hormone to cause these changes in behavior and physical function” with a simple statement that said “this happened, change behavior and function,” maybe there isn’t really enough of a difference to call one simulated and the other real. Just different ways of achieving the same result.

My point is, we treat all these things (consciousness, emotions, etc.) like they’re special things that can’t be replicated, but we have no evidence to suggest this. It’s basically the scientific equivalent of mysticism, like the insistence that free will must exist even though all evidence points to the contrary.
Also, some of what happens in the brain is just storytelling. Like, when the doctor hits your patellar tendon, just under your knee, with a reflex hammer. Your knee jerks, but the signals telling it to do that don’t even make it to the brain. Instead the signal gets to your spinal cord and it “instructs” your knee muscles.
But, they’ve studied similar things and have found out that in many cases where the brain isn’t involved in making a decision, the brain does make up a story that explains why you did something, to make it seem like it was a decision, not merely a reaction to stimulus.
That seems like a lot of wasted energy, to produce that illusion. Doesn’t nature select out wasteful designs ruthlessly?
TLDR:
Nature can’t simply select out consciousness because it emerges from hardware that is useful in other ways. The brain doesn’t waste energy on consciousness; it uses energy for computation, which is useful in myriad ways.

The usefulness of consciousness from an evolutionary fitness perspective is a tricky question to answer in general terms. An easy intuition might be to look at the utility of pain for the survival of an individual.
I personally think that, ultimately, consciousness is a byproduct of a complex brain. The evolutionary advantage is mainly given by other features enabled by said complexity (generally more sophisticated and adaptable behavior, social interactions, memory, communication, intentional environment manipulation, etc.) and consciousness basically gets a free ride on that already-useful brain.
Species with more complex brains have an easier time adapting to changes in their environment because their brains allow them to change their behavior much faster than random genetic mutations would. This opens up many new ecological niches that simpler organisms wouldn’t be able to fill.

I don’t think nature selects out waste. As long as a species is able to proliferate its genes, it can be as wasteful as it “wants”. It only has to be fit enough, not as fit as possible. E.g. if there’s enough energy available to sustain a complex brain, there’s no pressure to make it more economical by simplifying its function. (And there are many pressures that can be reacted to without mutation when you have a complex brain, so I would guess that, on the whole, evolution in the direction of simpler brains requires stronger pressures than other adaptations.)
Yeah. This is related to supernatural beliefs. If the grass moves it might just be a gust of wind, or it might be a snake. Even if snakes are rare, it’s better to be safe than sorry. But, that eventually leads to assuming that the drought is the result of an angry god, and not just some random natural phenomenon.
So, brains are hard-wired to look for causes, even inventing supernatural causes, because it helps avoid snakes.
let alone if it’s technically “real” (as in, physical in any way).
This right here might already be a flaw in your argument. Something doesn’t need to be physical to be real. In fact, there’s scientific evidence that physical reality itself is an illusion created through observation. That implies (although it cannot prove) that consciousness may be a higher construct that exists outside of physical reality itself.
If you’re interested in the philosophical questions this raises, there’s a great summary article that was published in Nature: https://www.nature.com/articles/436029a
On the contrary, it’s not a flaw in my argument, it is my argument. I’m saying we can’t be sure a machine could not be conscious because we don’t know that our brain is what makes us conscious. Nor do we know where the threshold is where consciousness arises. It’s perfectly possible all we need is to upload an exact copy of our brain into a machine, and it’d be conscious by default.
I see, that’s certainly a different way of looking at it :) Of course I can’t say with any authority that it must be wrong, but I think it’s a flaw because you seem to be presuming that consciousness arises from physical properties. If the physical act of copying a brain’s data were to give rise to consciousness, that would imply consciousness is a product of physical reality. But my position (and that of the paper I linked) is that physical reality is a product of consciousness.
It’s not a flaw to not be batshit like you.
Do elaborate on the batshit part :) It’s a scientific fact that physical matter does not exist in its physical form when unobserved. This may not prove the existence of consciousness, but it certainly makes it plausible. It certainly invalidates physical reality as the “source of truth”, so to say. Which makes the explanation that physical reality is a product of consciousness not just plausible, but more likely than the other way around. Again, not a proof, but far from batshit.
It’s a scientific fact that physical matter does not exist in its physical form when unobserved.
No, it’s not. The quantum field and the quantum wave exist whether or not you observe them; only the particle behavior changes based on interaction. Note how I specifically used the word “interaction”, not “observation”, because that’s what a quantum physicist means when they say wave-particle duality depends on the observer. They mean that a wave function collapses once it definitively interacts with something, not only when a person looks at it.
It certainly invalidates physical reality as the “source of truth”, so to say
How so, when the interpretation you’re citing is specifically dependent on the mechanics of quantum field fluctuation? How can physical reality not exist when it is physical reality that gives you the means to (badly) justify your hypothesis?
I think you’re a little confused about what observed means and what it does.
When unobserved, elementary particles behave like a wave, but they do not stop existing. A wave is still a physical thing. Additionally, observation does not require consciousness. For instance, a building, such as a house, does not begin to behave like a wave when nobody is looking at it. It’s still a physical building. Therefore, “observation” is a bit of a misnomer. It really means that some complex interaction we don’t fully understand causes particles to behave like particles and not waves. It just happens that human observation is one of the possible ways this interaction can take place.
An unobserved black hole will still feed, an unobserved house is still a house.
To be clear, I’m not insulting you or your idea like the other dude, but I wanted to clear that up.