A computer may or may not be a good analogy-model-replication of the human mind, but it is only an analogy-model-replication.
To actually improve on its rules would require a sort of magic: a set of rules that get better when you apply them to themselves.
For those who don't want to read the whole thing, I think this is the most interesting and challenging part:
But what if the neuron is not a black box? ...
Consciousness is irrelevant to a discussion of brains? Consciousness is not an irrelevant "feature" of brain activity: it is brain activity.
You don't actually believe other people are unconscious, do you, and that you are the only person with consciousness? To draw such a conclusion, one would have to disregard Occam's Razor and believe that whenever two or more people witnessed a shared event, they were actually victims of a persistent collective group hallucination or illusion.
That other humans have consciousness seems an uncontroversial position, p-zombies notwithstanding.
Nope, certainly not the magic that "strong AI" proponents are talking about. A genetic algorithm would be able to do exactly what I said a rational agent program could conceivably do: debug itself. With a sufficiently complicated rule set, this could be quite valuable. The agent could determine that certain sets of rules are contradictory, or it could discover other, hidden rules that exist in the rule set. What it can't do is magically transcend its input, which is what people are implying when they say "strong AI." Any problem the computer solves would already have been latent in the algorithm's design; it just needed computation and comparison to determine which route to take.
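The "debug itself" point in the comment above needs no magic: a program can mechanically compare its own rules and flag contradictions without ever transcending its input. A minimal sketch (the rule encoding and all names here are hypothetical, chosen purely for illustration):

```python
# Toy rule-set debugger: rules are (premise, conclusion) pairs over string
# atoms; a conclusion prefixed with "not " negates its atom.

def are_negations(a, b):
    """True if the two literals directly negate each other."""
    return a == "not " + b or b == "not " + a

def find_contradictions(rules):
    """Return pairs of rules sharing a premise but concluding opposites."""
    clashes = []
    for i, (p1, c1) in enumerate(rules):
        for p2, c2 in rules[i + 1:]:
            if p1 == p2 and are_negations(c1, c2):
                clashes.append(((p1, c1), (p2, c2)))
    return clashes

rules = [
    ("raining", "wet"),
    ("raining", "not wet"),
    ("sunny", "warm"),
]
print(find_contradictions(rules))
# The clash between the first two rules is reported.
```

Nothing here improves the rules themselves; the contradiction was always latent in the rule set, and the program merely does the computation and comparison needed to surface it.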
Consciousness is just the state of being sentient. It certainly is biological, unless you think non-sentient beings have it. That it's not a behavior or cannot be measured does not make it any less real.
It's not a terribly complicated concept: consciousness is the embodied sentience experienced and exhibited (through behavior, yes) by living beings with brains.
I'm not in vehement denial; I'm just pointing out that being like something (and I would say only partially like, at best) is not the same as being something.
If you're going to retreat to a position of hard solipsism, at least do so without being condescending to those who find your position counter-intuitive.
Now, I think that what people mean when they say that a brain is a computer is not that brains are the same thing as, say, what we're typing on now.
Finally, if you don't consider yourself a p-zombie, and you accept that qualia are real, yet you posit a p-zombie that is physically identical, lacking only qualia, then you're arguing that qualia aren't material, ergo you're a dualist.
not an irrelevant "feature" of brain activity: it is brain activity
I'm not sure I am following your argument, but as someone who believes that what makes the brain have a mind is its information-processing abilities, I think that this also implies that p-zombies cannot exist.
I don't think it is a derail,
The point of p-zombies is that they're physically identical to conscious beings yet lack consciousness. Which means that consciousness would not be a purely physical fact, which means that physicality wouldn't account for all facts, which requires a non-physical explanation, which requires dualism.
the way computers actually work and what they're made to do