Zombies on the web
June 21, 2005 8:20 PM   Subscribe

There are actually three different kinds of zombies. All of them are like humans in some ways, and all of them are lacking something crucial (something different in each case).

Hollywood zombies. These are found in zombie B-movies...
Haitian zombies. These are found in the voodoo (or vodou) tradition in Haiti...
Philosophical zombies. These are found in philosophical articles on consciousness...


Zombies on the web
posted by y2karl (127 comments total)
 
Makes me hungry for.... BRAINS!
posted by Balisong at 8:34 PM on June 21, 2005


Oh if there's an original thought out there, I could use it right now.

Brownsville Girl, by Bob Dylan and Sam Shepard
posted by y2karl at 8:39 PM on June 21, 2005


Academics with too much time on their hands; but I repeat myself.
posted by warbaby at 8:39 PM on June 21, 2005


Frickin. Not all philosophers are after your children's brains. You alls some creeps.
posted by nervousfritz at 8:49 PM on June 21, 2005


Umm . . . brains.
posted by yhbc at 8:51 PM on June 21, 2005


It has recently been discovered that phenomenological zombies do have one difference: their philosophy. Judging by Dan Dennett's writings, for instance, he's clearly out for brains.
posted by abcde at 8:56 PM on June 21, 2005


A standard lookup is the Stanford Encyclopedia of Philosophy entry. There was a recent thread about a consciometer - a detector of consciousness. In that thread, painquale argued that consciousness was public and hence detectable, thus rendering most speculation about zombies moot. I argued against. Anyone want to continue?
posted by Gyan at 9:01 PM on June 21, 2005


What's in your heeeeeeey-uuuudd,
In your heeeeeyyuuuuuud,
ZOMbeh, ZOMbeh, ZOMbay-aY-AY?
posted by keswick at 9:09 PM on June 21, 2005


Unnngaahh ...Gaaahhh ...Ummannaaa ...Bbbbrringg mee baaack... For 08!!
posted by Balisong at 9:17 PM on June 21, 2005


there was a nice write-up on MR recently :D

cheers!
posted by kliuless at 9:18 PM on June 21, 2005


more zombies
carson's lame-ass joke of the year: what did Jesus want for breakfast on Easter?


"Braaaiiinnnsss."

posted by carsonb at 9:20 PM on June 21, 2005


having actually followed the link for this thread now....David Chalmers rocks. he was my best prof in a 7-year run of bad educational experiences. the hair, the wild-eyed look, the sense of eagerness that emanated from him like rayguns; a great philosopher and a better teacher.

this is basically a lecture he gave in my freshman-year philosophy 101-type course. good times!
posted by carsonb at 9:27 PM on June 21, 2005


Did anyone here read the thread on another site where they discussed (in a pretty serious tone) the best strategy for surviving a zombie outbreak? I really want to find that discussion again.
posted by SkinnerSan at 9:34 PM on June 21, 2005


A good way to start would be defining our terms. A ton of debates about consciousness are wrecked by semantics. Consciousness is the existence of subjective, internal sensations associated with the stimuli and operation of the brain. These are called "qualia," and exist, no matter what painquale says. For example, "red," "pain" and "A sharp" are all qualia.

Less fundamental, many argue, is self-consciousness: the awareness of yourself as an entity, just as the other creatures around you are.

We dualists argue that consciousness constitutes a much harder problem than the more contingent aspects of cognitive science, for several reasons. Let's take the zombie argument, since it's on topic and it's my favorite one anyway. Imagine two guys. They both have the same physical characteristics, but try to envision one of them not having an internal life and sensation - faking it, as it were.

While everyone agrees these zombies are highly unlikely, if not impossible, to actually occur, I'd argue that, since you probably had no problem imagining that, experience isn't a physical quality. Think about it - you can envision all the atoms in someone being the same but they're not aware, so that must mean that while obviously it's caused by the physical, you're thinking of something else than the physical when you think of consciousness. Hence, it's a different type of thing.

This creates some problems. For instance, if experience is a basic part of the universe, why is it apparently unique to life on Earth? One answer is that it's a natural effect of anything with information-processing capacity (so even a thermostat, or an expanding and contracting rock, has it in infinitesimal amounts), and animals, especially humans, happen to be the ones that do the most of it.
posted by abcde at 9:39 PM on June 21, 2005


haven't read it, SkinnerSan, but here's something that might help:
The Zombie Survival Guide
posted by carsonb at 9:39 PM on June 21, 2005


Chalmers' blog is really good readin' even if, like me, you are not a philosopher.
posted by mokujin at 9:42 PM on June 21, 2005


carsonb: The Survival Guide is excellent. Funny as hell, perhaps because it has no jokes at all. It takes its conceit completely seriously and gives some very good advice (assuming the eventuality of the undead rising).
posted by brundlefly at 9:53 PM on June 21, 2005


abcde: Consciousness is the existence of subjective, internal sensations associated with the stimuli and operation of the brain.

This is unnecessarily qualified. Consciousness is the state/realm within which phenomenal events occur. Maybe there's a world where conscious events aren't private, subjective, or brain-based.
posted by Gyan at 9:59 PM on June 21, 2005


abcde : "These are called 'qualia,' and exist, no matter what painquale says. For example, 'red,' 'pain' and 'A sharp' are all qualia."

I agree, but the above is an assertion, even if self-evidently obvious to us. Let painquale, or whomsoever else, defend the opposite.
posted by Gyan at 10:02 PM on June 21, 2005


Think about it - you can envision all the atoms in someone being the same but they're not aware, so that must mean that while obviously it's caused by the physical, you're thinking of something else than the physical when you think of consciousness. Hence, it's a different type of thing.

Replace "consciousness" with "soul" and you'll get an idea of how silly this sounds.

Seriously, I think most people just read the words "imagine a person who acts like a person but isn't a person," flash to a quick mental shot of Reagan, and go "Sure." I'd say this isn't actually equivalent to truly imagining such a thing. The mental shorthand, in other words, is a person who acts like a person, but wearing a t-shirt that says "not a person."

I'd like to think that imagination and repeating word-sounds are two different things. I can nod my head when you tell me to think of both a square circle and a zombie, and I can, with sufficient prodding, burp out the sounds "I aym thing king af uh zawm bee rye ding ay skwayr sir kull" but since heterophenomenology doesn't allow you (we're assuming) to figure out even if I have an internal state, let alone what that internal state is, how can you tell I'm thinking of a zombie and not a square circle?

This kind of conversational topic is what led me to drink and then transfer out of the Philosophy department.
posted by Coda at 10:03 PM on June 21, 2005


Gyan: Yeah, that was qualified, but I wanted to be a little down to earth - though in the process I did give a skewed definition. I was kidding about painquale, though note MonkishVirtue is equally brash ;)

Coda, I can't find out for sure, but it's a safe assumption. I'm not pandering, as far as I know, to people's superficial reading of the zombie argument; I certainly hope they get it. If you say my phenomenological take makes it impossible to predict your sensations, that's an interesting idea, but there are already a lot of things we have to assume in life, such as the falsity of solipsism.
posted by abcde at 10:14 PM on June 21, 2005


of course there's the kind where you fork() but don't wait()
posted by polyglot at 10:36 PM on June 21, 2005
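
(An aside for readers who don't speak Unix: polyglot means zombie processes - a child that exits while its parent never calls wait() lingers as a "defunct" entry in the process table. A minimal, purely illustrative sketch in Python on a Unix-like system:)

    import os
    import time

    pid = os.fork()            # create a child process
    if pid == 0:
        os._exit(0)            # child exits immediately
    else:
        # the parent never calls os.wait(), so the dead child lingers
        # as a <defunct> zombie in the process table
        time.sleep(30)         # run `ps -ef | grep defunct` during this window
        os.wait()              # reaping the child finally lays the zombie to rest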


Coda : "I'd like to think that imagination and repeating word-sounds are two different things."

That's just what dualists say.
posted by Gyan at 10:38 PM on June 21, 2005


I imagine that nature is consciousness, and we each experience it in our own subjective way, an individual's awareness, tuned to that aspect of nature that we are interested in or attracted to. By nature I mean the entire unbounded physical universe.
posted by hortense at 10:52 PM on June 21, 2005


Who would win this: Zombie VS Pirate
posted by ackeber at 1:27 AM on June 22, 2005


is the pirate riding a flaming shark? I think zombies are flammable.
posted by Balisong at 1:39 AM on June 22, 2005


Bookmarked--thanks, y2karl.
posted by Prospero at 6:12 AM on June 22, 2005


Who would win this: Zombie VS Pirate

Well, a pirate by himself is just a man. Normally you need a whole ship full of them to do anything, with cannons.

Now a Ninja vs. a Zombie...

Also, I agree with Coda, just because you can "imagine" something doesn't mean it really can exist. I do think that if you could build a fully formed golem, with a brain that was exactly the same as a human brain, then it would be just as conscious.
posted by delmoi at 7:22 AM on June 22, 2005


abcde : "Think about it - you can envision all the atoms in someone being the same but they're not aware, so that must mean that while obviously it's caused by the physical, you're thinking of something else than the physical when you think of consciousness. Hence, it's a different type of thing."

I'm with you up to your conclusion, "Hence, it's a different type of thing". Shouldn't that just be "Hence, you think it's a different type of thing"? You may be wrong.

Also, for reference sake, I can't imagine a zombie containing all the atoms arranged in the same way as a regular person, but not having consciousness. The whole thing seems circular: if you think consciousness is separate from physical construction, then you can imagine the zombie, hence you're thinking of something other than physical when you think of consciousness, hence it's a different thing. But if you think consciousness is not separate from physical construction, then you can't imagine the zombie, hence you're thinking the same as physical when you think of consciousness, hence it's the same thing.
posted by Bugbread at 8:06 AM on June 22, 2005


I like bugbread's point. I also would like to cast some doubt on the use of 'imagination' in philosophical argument. We tend to define what's logically possible by what we can possibly imagine. We cannot imagine round-squares, married bachelors or objects that are both red and green all over; therefore, it is said, they are logically impossible. But when we attempt to imagine these things, we are imagining them as concepts and strings of words, not as objects in the world itself. So in that sense we can imagine round-squares. So the response is 'we can't imagine these things as objects,' but can we imagine anything as an object? Or do we imagine all things as concepts and strings of words? What is the connection between a thought and an object, logical or otherwise? It seems a rather vague trick to depend on human imagination (a priori) to make any sort of (empirical) claim about the world, and then act as if that claim has any necessity at all.

Oh, and more pressing: Pirate.
posted by elwoodwiles at 9:45 AM on June 22, 2005


What I hate about this branch of philosophy (as a philosophy major) is that it assumes that the consciousness and the acts of perception and reaction are somehow separate from the brain and body, when all the cognitive sciences have said that consciousness (and any higher consciousness that humans might have over other animals) is still purely a product of electrochemical stimuli in the brain. I suppose it's another sign of the failing of science education in America, though. I really wish more philosophers would just concentrate on things that are more useful, such as the ramifications of scientific discoveries and/or ethics. Arguing the possibility of philosophical zombies and whether or not the divide between water and air is meaningful (my greatest pet peeve) is just pointless, and is what makes philosophy seem like the ultimate ivory tower to everyone else in the world.
Satyagraha
posted by thebestsophist at 11:12 AM on June 22, 2005


Well, a pirate by himself is just a man. Normally you need a whole ship full of them to do anything, with cannons.

Dude, cutlasses? One pirate should be able to hack a few zombies into pieces.
posted by me & my monkey at 11:56 AM on June 22, 2005


So if you engage a zombie in a conversation about the nature of consciousness, what would they say?

Assuming they hadn't studied philosophy, I can't see how they could both lack conscious experience and behave indistinguishably from a conscious human.
posted by bjrubble at 12:50 PM on June 22, 2005


thebestsophist : "What I hate about this branch of philosophy (as a philosophy major) is that it assumes that the consciousness and the acts of perception and reaction are somehow separate from the brain and body"

CogSci recognises that consciousness is a private phenomenal entity. Separation is not the key claim here. Lack of identity, is.
posted by Gyan at 12:53 PM on June 22, 2005


All good points. To bugbread: well, then you don't share the intuition that most people (3 out of 4 or so), scientists included, have. The zombie argument is a form of preaching to the choir, but it's still useful, as it shows those who do have the intuition how it favors dualism. Incidentally, there is an argument that the problem is caused by the fact that we have different ways of thinking about physical and phenomenal ideas, and that these are incompatible - Chalmers wrote about it recently.

elwoodwiles: The argument assumes that conceivability entails logical possibility; that's an argument perhaps just as big, but most would grant that it does.

thebestsophist: The topic does have practical applications; the identification of experience as a fundamental type could have implications for ethics, since it influences the decision as to whether computers are conscious, or, more adventurously, for quantum physics, which is stuck on why observation appears to influence the world. Crick and Koch acknowledge that even when we have the entire brain more or less pinned down on a physical level, the problem of experience will remain a challenge, so they're sympathetic with Chalmers to that extent. You may feel that consciousness is as easy a problem as other brain characteristics, but many don't.
posted by abcde at 1:01 PM on June 22, 2005


bjrubble: That's another challenge - it's called the Paradox of Phenomenal Judgment, which asks: if consciousness isn't physical and doesn't influence the physical, why can we talk about it? The answer is complicated, but it comes down to either that this other consciousness has some means to influence the brain (interactionism), which would likely ruin the zombie argument anyway, or that more or less by coincidence our judgements about our own experience coincide closely with our actual experience, so a zombie would say the same thing but he'd be wrong. This sounds bogus but Chalmers defends it plausibly in his book. It's probably the biggest challenge to dualism, IMO.
posted by abcde at 1:29 PM on June 22, 2005


Zombies can be used to illustrate the non-physicality of consciousness, but the distinction is evident from the nature of experience itself and doesn't really require the Z device.
posted by Gyan at 1:36 PM on June 22, 2005


Actually, to be fair, at first I thought I had imagined it, but after reading more, I realized it was along the lines of the "regular guy with a t-shirt saying 'I am a zombie'" situation. When I actually tried to imagine it, instead of just saying "physically identical, check. zombie, check. Ok, next step", it became impossible to imagine.
posted by Bugbread at 1:54 PM on June 22, 2005


Gyan: I always assumed that identity was required for perception, linguistically we can't just say "moves" we say "this moves" or "i move." Identity is fundamentally required for coherent perceptions and therefore reactions. Therefore, humans, and animals are conscious because they take perceptions and make decisions in relation to themselves, while computers just take information and process it (unless it's programmed to discern relations, then I suppose you could say it's somewhat conscious), but zombies would still need to have a sense of identity to be able to interact with the physical world.
abcde: I didn't think about it that way, I suppose it makes more sense. I'd still argue that experience is generally physical, with hardening of neural pathways et cetera; I always thought that experience was generally understood (or at least better understood since the discovery of neural networks).
Satyagraha
posted by thebestsophist at 7:17 PM on June 22, 2005


thebestsophist : "Therefore, humans, and animals are conscious because they take perceptions and make decisions in relation to themselves, while computers just take information and process"

a)this is an ipso facto rationalization. Physically, a human body is like any other physical object. An entity, recognised as temporally coherent and spatially bounded, only by someone conscious. An ATP molecule in one of your cells, presumably, doesn't "know" it's part of someone's body, or serving any role. It's completely a product and process of physical laws. Same goes for each and every other constituent of your body. So photons hit your ocular surface, initiate some electrochemical reactions, which cascade across networks, in accordance with physics. Don't see where the 'consciousness' has to come in.

b)"Therefore, humans, and animals are conscious because they take perceptions" - perceptions are objects of consciousness. Your statement is like saying, "the canvas exists because the painting has to be embodied on some surface".
posted by Gyan at 7:55 PM on June 22, 2005


Oh man, let's talk about zombies! They're the coolest by far! PHILOSOPHICAL zombies, that is!
posted by rafter at 8:12 PM on June 22, 2005


thebestsophist : "I always assumed that identity was required for perception, linguistically we can't just say 'moves' we say 'this moves' or 'i move.' "

I'm not sure your example matches your statement. First, you can say just "moves" in some languages (but the thing which moves is understood from context). Second, if the argument is that it's still there, just unstated, then I'd point out that "moves" means "changes position over time", and there has to be a subject for change in position to occur. I just don't see the relation to identity and perception.
posted by Bugbread at 8:12 PM on June 22, 2005


Gyan: I've considered consciousness as the cumulative effect of all the electrochemical stimuli, for instance pain (physical or emotional, which research has shown go to many of the same areas), and therefore the only difference between computer consciousness, animal consciousness, and human consciousness is the size/complexity of the network, and therefore to act as a normal human is to be conscious, no room for an incorporeal "consciousness."

bugbread: You are right, "move" is "change of position over time." However, there is an implied object that has moved; in fact there are three things implied: 1) the perceiver (the one that sees the thing being moved) 2) the perceived (whatever is moving) 3) the medium (space/time). Without any of the three, movement isn't possible. Movement is always in relation to other objects, therefore there is an implicit separation of objects, therefore identity.
Satyagraha
posted by thebestsophist at 9:37 PM on June 22, 2005


thebestsophist : "therefore to act as a normal human is to be conscious, no room for an incorporeal 'consciousness.'"

You are redefining consciousness as behaviour. Consciousness is a private empirical phenomenon, not a label given to certain types of behaviour.
posted by Gyan at 9:47 PM on June 22, 2005


Correction on that sentence: it should be "...to act as a normal human is to have consciousness..."
I'm not trying to define it as behaviour, but as having behaviour, sentience, and self-awareness.
Satyagraha
posted by thebestsophist at 10:50 PM on June 22, 2005


Any evidence for that proposition?
posted by Gyan at 11:15 PM on June 22, 2005


I go on vacation without checking Metafilter for a few days, and when I get back I find that I've been set up as a public enemy! Time to defend myself.

Arguing against qualia is not easy, but the reason for the difficulty is that qualia-theorists tend to shift what they mean by the term at the drop of a hat. Some characterizations are perfectly unobjectionable, but when most people talk about qualia, they want to talk about things that don't exist (that couldn't possibly exist!).

Much of my position stems from a cardinal insight dating back to Kant: that the fundamental unit of awareness - the minimum graspable - is the judgment. (From the Critique of Pure Reason: "The only use which the understanding can make of concepts is to form judgments by them.") The idea here is that all concepts are irreducibly bound to our judgments of those concepts, like enjoying the taste of key lime pie or desiring to get away from pain. We have excellent physicalist explanations of judgment-making. I'm partial to Dennett's Intentional Systems theory, but any functionalist explanation will do. Qualiaphiles think that you can take an experience, winnow off all of the judgments from it, and be left with an irreducibly private residue that forms the quale. This is NOT OBVIOUS.

Try referring to your conscious experience. Try talking about it. You might be tempted to say that you can't say anything about it, because it's ineffably private, but I'll think you're just being coy. In common parlance, in psychological tests, in the world outside of too-clever philosophers, no one actually says that. The normal thing for people to do is to describe their consciousness in intentional, judgment-driven terms. Pain is unpleasant. Light blue is soothing. Et cetera. Imagine that you liked intense pain. Can you imagine genuinely believing that and pain feeling exactly as it does? Can you imagine waking up one day to find the color red hideous and abhorrent to you, but you experience it as exactly the same? I can't.

Some philosophers dig in their heels on this point and claim that they think it's possible. But this is an explosive bullet to bite. I think it's a failure of imagination that leads people to make this claim - they're not fully imagining what it would really be like to have different judgments. (This isn't really an insult - it's incredibly easy to think you can imagine things that are actually unimaginable when you consider all the details.) The above argument is made in Dennett's Quining Qualia. He is more eloquent than me, so you should read his paper if you haven't. Note that Dennett doesn't want to get rid of conscious experience (he believes in it), he just finds qualia to be a muddled and confused concept.

In my previous thread with Gyan, I made a different argument (sorry I cut off the conversation, Gyan; I was a bit pressed for time). I'll make it again, but from a different angle.

Let's assume consciousness is epiphenomenal. By definition, this means that it can't cause any of our physical behaviors. This includes all our claims about being conscious. Suppose we have a really good evolutionary and physical story about concept formation (I mean 'concept' here to be physically characterized - a chess playing machine can have a concept of a rook), and we can explain how we have come to say that we're conscious. Meaning is not private - the meaning of a word is determined by its use in a language community. If epiphenomena are causally cut off from our use of the word, we must have found other ways (behavioral, functional, or physical) to successfully apply that word. That determines what the word means. So whenever you write anything about consciousness, say anything about it, or make any sort of judgment about it, you are not talking about consciousness at all - the concept derives its meaning from publicly accessible features of the world. Saying that we have epiphenomena floating above our behaviors is in the exact same standing as saying that my bicycle is accompanied by fifteen epiphenomenal propulsion gremlins, or saying to a biologist that they're missing that special elan vital that makes an alive thing truly alive. It's not just unverifiable; it's an unnecessary addition to the meaning of our concepts.

The real challenge now is to explain consciousness in terms of judgements. This is not an impossible task! Psychologists are hard at work on the subject right now. I have no doubt that in the future we'll be familiar enough with neurology and psychology that qualia will look as antiquated and misguided as elan vital.
posted by painquale at 11:17 PM on June 22, 2005


painquale : "Pain is unpleasant. Light blue is soothing. Et cetera. Imagine that you liked intense pain. Can you imagine genuinely believing that and pain feeling exactly as it does? Can you imagine waking up one day to find the color red hideous and abhorrent to you, but you experience it as exactly the same? I can't. ... Some philosophers dig in their heels on this point and claim that they think it's possible. But this is an explosive bullet to bite. I think it's a failure of imagination that leads people to make this claim"

Hardly an argument, just an assertion that any claim to the contrary just can't be true.

painquale : "a chess playing machine can have a concept of a rook"

Explain this. A non-conscious machine enjoys no 'concepts'. It doesn't even exist as a coherent entity, except to a conscious observer.

painquale : "Try referring to your conscious experience. Try talking about it. You might be tempted to say that you can't say anything about it, because it's ineffably private, but I'll think you're just being coy."

Again, just an insinuation.

painquale : "Meaning is not private - the meaning of a word is determined by its use in a language community."

Meaning is completely private, in an epiphenomenal consciousness. So this doesn't help, either way.

painquale : "saying to a biologist that they're missing that special elan vital that makes an alive thing truly alive. It's not just unverifiable; it's an unnecessary addition to the meaning of our concepts."

If it's not verifiable, how do you know it's unnecessary? The difference between elan vital and consciousness is that I know consciousness exists, because I exist solely in that mode (conscious).

painquale : "you are not talking about consciousness at all - the concept derives its meaning from publicly accessible features of the world"

Here's the key disconnect. Consciousness is accessible, but to each, his/her own. If I am speaking with you about a table, I'm referring to a perceived object, not the perception. I assume the object 'exists', that you 'really exist' (with a mind) and that my label 'table', when heard by you, maps onto the same referent. Furthermore, I hope that our representations (perception) of that referent are the same. I don't know whether any of that is true.
posted by Gyan at 12:07 AM on June 23, 2005


I think the crux of our disagreement is that we're working with differing premises, so I'll start from a more fundamental beginning. I believe the problem here is that people and many philosophers, and people in general, want to separate human consciousness from animal and computer consciousness. And rightfully so: we're different, we've invented the wheel, we've done all these amazing things that other animals haven't, we're smarter, better than monkeys. I know I'm a little off topic; it's for good reason.

I'm a hard determinist. I argue that we just happened to be born of the right species to have evolved the most complex brain this planet has ever seen, but all brains are the same; ours are just bigger, which allows us to think with more complexity and (most importantly) abstractly, and this is the only thing that makes us different from animals. Neuroscience, biology, and medicine are showing this to be basically true. This is based on cognitivism. All internal mental states, experience, sense perception, et cetera are neurochemical impulses that can and have been described by rules and even mathematical algorithms (which is how programs with neural nets operate), and so the more complex the neural net, the more complex the brain. Therefore, the more complex the brain, the more cognitive ability. This runs from basic sense perception (and therefore identity, as I explained earlier), to perception of pain and pleasure (which is how all neural nets operate: good stimuli strengthen a link, bad stimuli weaken it), to abstract thought and higher consciousness and sentience. And so the difference between human consciousness and animal consciousness is proportionally related to the amount of brain matter, and therefore is a difference of degrees.

Evidence for this is actually really easy: trauma victims, people with seizures, strokes, those in persistent vegetative states, and finally people that suffer from brain death. I put these in increasing order of severity for a reason; in order, the brain suffers more and more trauma, increasingly larger parts of the brain are damaged or unused, and more and more consciousness is lost (albeit temporarily for lighter trauma victims and strokes). This suggests that any consciousness is fundamentally tied to brain functions - brain functions such as having behaviors, sentience, and self-awareness; the more brain power, the higher the level of consciousness.

And I suppose you're right, arguments like this do have good applications; there just needs to be more explanation.

For everyone else: pirate. Duh.
Satyagraha
posted by thebestsophist at 1:39 AM on June 23, 2005
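
(A toy illustration of the "good stimuli strengthen a link, bad stimuli weaken it" rule thebestsophist describes - an editor's sketch with made-up numbers, not anything from the cognitive-science literature:)

    import random

    weight = 0.5              # strength of a single "link"
    learning_rate = 0.1

    def respond(stimulus):
        """The unit's response is just the stimulus scaled by the link strength."""
        return stimulus * weight

    for _ in range(20):
        stimulus = random.random()
        activity = respond(stimulus)
        reward = 1 if activity > 0.25 else -1    # arbitrary "good"/"bad" outcome
        # good outcomes strengthen the link, bad outcomes weaken it
        weight += learning_rate * reward * stimulus
        weight = max(0.0, min(1.0, weight))      # keep the strength bounded

    print("final link strength:", round(weight, 3))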


thebestsophist : "I believe the problem here is that people and many philosophers, and people in general, want to separate human consciousness from animal and computer consciousness."

Not at all. In fact, this does not even enter the picture.
posted by Gyan at 1:54 AM on June 23, 2005


Gyan : "A non-conscious machine enjoys no 'concepts'."

Evidence?
posted by Bugbread at 5:29 AM on June 23, 2005


It's a fundamental position that many people have taken, as have many modern philosophers, and therefore the contemporary philosophers who build their arguments from there. However, to return to the original question of identity and consciousness, my argument still stands, self-awareness and therefore identity is inseparable from consciousness.

Let me put it a different way that attempts to stay on topic. Consciousness, as we think of it, is the sum total of the ability to perceive, react to stimuli (internal and external) and therefore think, feel emotions, etc. All of these things are expressed by electrochemical impulses in the brain, and therefore all consciousness is expressed by electrochemical impulses, and so any separation between brain functions and consciousness is purely conceptual.

I apologize for not having any examples, I haven't read as much on cognition as I'd like because my concentration is actually in political philosophy, and so my knowledge is only just enough to get me into trouble when I attempt to discuss it.
Satyagraha
posted by thebestsophist at 8:51 AM on June 23, 2005


I believe in animal consciousness and if I subscribed to Chalmers' sort of thermodynamical view I'd be happy to grant computer- and even shoe-consciousness already exists. It's not a means for me to pedestal humans, though I think humans do have something special anyway (albeit a functional trait), which is self-consciousness, introspection etc.

To Gyan, I do think painquale has a good point by raising the paradox of phenomenal judgment again - an epiphenomenalist worldview means any thoughts you have about consciousness (not about the table, necessarily, but more importantly along the lines of "wow, consciousness is interesting") wouldn't be about the real thing, since it couldn't be influencing your thoughts to know about it. Your assertion that the sensation of your experiences is what allows you to talk about them indicates you're an interactionist like me.

Just to make clear how dualists justify things physically: epiphenomenalism usually relies on the information-based approach, I'd venture, and interactionism uses the fact that research is starting to show that the brain has electrical activity on a level low enough to be impacted by quantum uncertainty, which would mean the mind could interact with the brain by collapsing waveforms; it would then theorize that awareness is a product of complex entities employing quantum forces (the parallels this has with the fact that observation already has a role in quantum physics are interesting, but let's not push it). Though that's the sort of wild speculation that leads one to the accusation of pseudoscience, we're making no claims, and if I recall, relatively respected physicists have written about this idea.
posted by abcde at 12:05 PM on June 23, 2005


(Pirate)
posted by abcde at 12:07 PM on June 23, 2005


bugbread : "Evidence?"

A concept is a mental entity. At the very least, you would need "someone" holding the concept. If no consciousness, then it's just atoms all around. The fact that they are arranged as an object is observable and remarkable only to those who can observe i.e. conscious entities.

thebestsophist : "my argument still stands, self-awareness and therefore identity is inseparable from consciousness."

That's not what's under dispute.

"Consciousness, as we think of it, is the sum total of the ability to perceive, react to stimuli (internal and external) and therefore think, feel emotions, etc. All of these things are expressed by electrochemical impulses in the brain, and therefore all consciousness is expressed by electrochemical impulses, and so any separation between brain functions and consciousness is purely conceptual."

You have already a priori bridged the gap by basing consciousness on electrochemical impulses. Begs the question.

abcde : "an epiphenomenalist worldview means any thoughts you have about consciousness (not about the table, necessarily, but more importantly along the lines of 'wow, consciousness is interesting') wouldn't be about the real thing, since it couldn't be influencing your thoughts to know about it. Your assertion that the sensation of your experiences is what allows you to talk about them indicates you're an interactionist like me."

Not at all. Epiphenomenalism doesn't mean that there's no interaction, just that the interaction is one-way (brain -> mind). As for talking about the 'real thing', we don't know if we are. The obstacle here seems to be: how does the brain convert certain neuronal activity to thoughts about consciousness, since it can't know about it? a) The same trouble occurs for all conscious activity; your neurons don't "know" about chairs or rainbows either. All they have is photons hitting the eye..etc. b) There's plenty of loops and spontaneous thalamocortical activity occurring. Maybe your brain's representation of such activity is 'consciousness'.

The biggest trouble for non-epiphenomenalists is that since the physical world is held to be causally closed, where's the place for interaction?
posted by Gyan at 2:48 PM on June 23, 2005


It doesn't beg the question; there is no gap to bridge. Consciousness is electrochemical impulses; they are one and the same, unless you're defining consciousness as something other than the ability to perceive and react.
Satyagraha
posted by thebestsophist at 3:23 PM on June 23, 2005


thebestsophist : "unless you're defining consciousness as something other than the ability to perceive and react."

Nope, your definitions match. Which is why consciousness is not the same as electrochemical activity. In theory, I can slice open your head, set up a microscope, and observe your electrochemical activity. However, I'm not experiencing what you are.
posted by Gyan at 3:42 PM on June 23, 2005


Correction: our definitions match.
posted by Gyan at 3:43 PM on June 23, 2005


thebestsophist : "there is no gap to bridge"

The gap is: why do electrochemical impulses generate any phenomenal activity? If consciousness is physical, what are the values of its physical attributes?
posted by Gyan at 3:50 PM on June 23, 2005


Okay, now what about bizzaro-zombies?

These are self-conscious beings which, despite having an atom-by-atom similarity to non-conscious objects, are aware. An inverse zombie, if you will.

For example - my pants, as we speak, are composing a sonata. It's fairly derivative stuff, sure, but still - it's more than I've done. They won't let on about this, as they're still pretending to be pants, but they're aware of everything that happens. I can imagine this (mental image: pants with thought-bubble with music notes, word "loins" frantically crossed out) about as well as I can imagine a zombie.

Now why, as grown-ups, should we be using random brain-farts as counterfactuals when speaking about the real world? If, as the repetition of the point seems to indicate, consciousness is a private experience, how the fuck could we possibly say anything meaningful about it? It's a one-person show, standing room only, regardless of how much we talk about it. So what's the point? Why talk about the ineffable?
posted by Coda at 4:30 PM on June 23, 2005


Coda : "Why talk about the ineffable?"

The metaphysical itch.
posted by Gyan at 4:36 PM on June 23, 2005


Electrochemical impulses stimulate hormone production, which affects emotions and reactions. As certain pathways are used more, the link is strengthened by repetitious use (like strengthening a muscle). That's why we are able to observe when specific parts of the brain are being used, and guess what is being done (for instance, Josh Greene is doing philosophical research in moral psychology and brain activity when making moral decisions).

While you cannot experience what I'm experiencing (please give me some anesthesia before you cut me open), you can observe and (given enough information about the brain) know what thoughts and emotions I'm having. The electrochemical impulses don't generate phenomenal activity, they are phenomenal activity. Consciousness is the secretion and reception of hormones and chemicals (which is how stimulants and depressants work).
Satyagraha
posted by thebestsophist at 5:23 PM on June 23, 2005


thebestsophist : "The electrochemical impulses don't generate phenomenal activity, they are phenomenal activity"

Umm, even per cognitive scientists, not all electrochemical activity gives rise to phenomenal activity. As per the global neuronal workspace theory, only certain electrochemical activity "is" conscious. Even accepting your linguistic conflation, does electrical activity in rocks result in phenomenal experience for the rock?
posted by Gyan at 5:52 PM on June 23, 2005


It comes down to whether you have the intuition or not about the mind seeming aphysical. Those who don't have it are not easily convinced, but luckily most do. It makes debates about this topic interesting but totally gridlocked, as are all debates with an irreconcilable difference.
posted by abcde at 6:38 PM on June 23, 2005


No, because electrical activity in rocks does not cause rocks to perceive or react. I suppose this is a lot easier for me to accept because I don't actually believe in free will; I believe everything is determined by the causal effects of everything before it, and we don't actually freely choose anything: all our choices are decided by environmental factors, our histories, and probabilities affected by such. To the question of why it seems we make conscious choices and do have free will, I can only answer little more than "it just seems that way."
When it comes down to it, if we are able to design a computer neural net at the complexity of the human brain, and it acts and thinks with all the functionality of a human --or say a zombie, seeing as that's what the topic was originally about-- why postulate that there is a consciousness in there? Occam's razor says the argument with the fewest assumptions is the one favored to be correct, so why assume that there is an incorporeal "inside" that is conscious?
Satyagraha
posted by thebestsophist at 8:34 PM on June 23, 2005


thebestsophist : "No, because electrical activity in rocks does not cause rocks to perceive or react."

Again!!!!!!!!

This is just begging the question. If rocks are conscious, then they perceive. Which is what I'm asking in the first place, are rocks conscious? Does every conscious entity have to exhibit reaction i.e. motor activity?
posted by Gyan at 8:40 PM on June 23, 2005


Those who don't have it are not easily convinced, but luckily most do.
I apologize if I took that the wrong way, but I find that very condescending. I have yet to see any strong evidence to show that the mind is something beyond the physical; especially with modern computer science, there are fewer and fewer things that we're finding that can't be replicated by programs and algorithms.
Satyagraha
posted by thebestsophist at 8:42 PM on June 23, 2005


Not just motor activity: any activity, cognitive or motor.
Satyagraha
posted by thebestsophist at 8:43 PM on June 23, 2005


thebestsophist : "any activity"

So, are rocks conscious? There's plenty of activity going on, even in rocks.
posted by Gyan at 8:46 PM on June 23, 2005


I specifically said cognitive or motor activity, and by cognitive activity, I specifically mean electrochemical stimuli that can be considered thought, which generally (as far as we know) requires some sort of neural network. If a rock somehow shows evidence of all those factors, then I would say that rock is in fact sentient and conscious, and immediately take it to a lab for study. Please, you're unnecessarily picking at words when the meaning is clear; I don't like arguing that way, it generally means someone is unnecessarily angry and never produces productive discussion.
If you have a strong argument saying that consciousness is metaphysical, I would really like to hear it. I haven't heard of many arguments for a dualist view that haven't pointed to a god, or suggested that humans are superanimals, and I would actually like to learn how contemporaries are arguing a dualist approach.
Satyagraha
posted by thebestsophist at 9:33 PM on June 23, 2005


thebestsophist : "I specifically said cognitive or motor activity, and by cognitive activity, I specifically mean electrochemical stimuli that can be considered thought, which generally (as far as we know) requires some sort of neural network"

You are just arguing in circles. So consciousness requires "electrochemical stimuli that can be considered thought". How do you know which activity can be considered 'thought', and which not?

BTW, dualism, in the sense argued here, hasn't anything to do with a god or superiority.
posted by Gyan at 10:02 PM on June 23, 2005


I know it seems as if I am arguing in circles, I promise I am not (or at least I'm not trying to); physically there is no difference between what we think of as thought, and what is not. The difference is in the organization, and this is very important. Electrochemical stimulus in the brain is very structured: specific pathways are used to produce specific reactions, and it is done repeatedly and deliberately. Conversely, electrical activity in rocks is mostly unstructured; it may follow paths of least resistance where there are higher concentrations of metals, but that is not a product of organic reproduction governed by genes, just previous geologic activity. Where is the line between the amount of organization that is thought and what isn't? That I don't know, hopefully more studies in biology, neurology, and cognition will produce better answers.

I know dualism --in the sense we're talking about-- has nothing to do with god or superiority; that is why I want to know more about it. I've never heard it argued without arguing such. The only arguments for dualism I know use god to justify the dualist view; I want to know justifications for dualism that don't point to deities as proof of the nonphysical.
Satyagraha
posted by thebestsophist at 10:42 PM on June 23, 2005


Chalmers' original paper probably serves as the classic argument, and he fleshes it out in his other papers, particularly Moving Forward.
posted by abcde at 12:58 AM on June 24, 2005


thebestsophist : "physically there is no difference between what we think of as thought, and what is not. The difference is in the organization, and this is very important ... Where is the line between the amount of organization that is thought and what isn't? That I don't know, hopefully more studies in biology, neurology, and cognition will produce better answers."


They can't, without a priori assumptions. Let's say, you decide that entities A, B, C are conscious whereas entities D & E aren't. Then you will look for commonalities between A, B and C and define that as the threshold. But if D & E are conscious, then your answer's wrong. So, first, you have to know all that is conscious, and all that isn't. Since consciousness is private, all categorization will be arbitrary. The intuitive criteria we use, is behaviour. Everything that looks like me, moves like me...etc is assumed to have the rest i.e. a consciousness of its own. Even granting this assumption, all it establishes is that things that behave similarly are conscious, but not that things which don't behave similarly, aren't. Hence the a priori assumptions.

As for a defence of dualism: consciousness is private. All other empirical phenomena aren't. Consciousness is the only empirical phenomenon that's privileged to its Self. Assume we're both looking at a table: I see the table and I see You, but I don't see what you're seeing. Let's get to the physicalist's rebuttal: "But I could see what you're seeing. In theory, I could conduct enough imaging and neuron recording experiments and map out your neural structure and algorithms. I'll show you pictures and movies and see what everything from your retina to LGN to visual cortex is up to, in enough detail. Having collected that data and analysed it successfully, I'll have a cognition and perception schema at hand. Then I can predict successfully what you're experiencing just by looking at the activity in your brain." Here's why this doesn't work: all you now have is an internally consistent model. So, if the fusiform gyrus lights up, all you surmise is that the perception and processing occurring is what occurred earlier when the FFG lit up in a similar manner. You still don't know what the mind is experiencing. You can only relate to it in terms of your own perceptions. "Yesterday, you were shown stimulus A. The scanner showed activity X. I see activity ~X now, hence you're thinking of stimulus A." It's still You correlating your observations of one activity (imaging) to your observations of corresponding stimuli. What the other mind sees, about that, you are still in the dark. You might assume that the resulting experience is similar. But you don't know.
posted by Gyan at 3:32 AM on June 24, 2005
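
(The correlation procedure Gyan sketches - record which activity pattern went with which stimulus, then label new activity by its nearest recorded pattern - looks roughly like this in code. An editor's sketch with invented numbers, offered only to make the "You correlating your observations" point concrete:)

    # Imaging "activity patterns" previously recorded while known stimuli were shown.
    recorded = {
        "stimulus A": [0.9, 0.1, 0.4],
        "stimulus B": [0.2, 0.8, 0.5],
    }

    def squared_distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    def closest_stimulus(new_activity):
        """Label new activity by whichever recorded pattern it most resembles."""
        return min(recorded, key=lambda s: squared_distance(recorded[s], new_activity))

    # "I see activity ~X now, hence you're thinking of stimulus A."
    print(closest_stimulus([0.85, 0.15, 0.45]))   # prints: stimulus A

Note that everything in the sketch is a correlation between the experimenter's own observations; nothing in it touches what the scanned mind is experiencing, which is Gyan's point.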


Gyan: Okay, that makes sense, but say we were able to design a computer (or computer program, more likely) that is able to have all the same cognitive abilities as a human. Given time this does seem more and more likely; we already have sensors and algorithms that are able to replicate most of the basic sensory perceptions that animals have. Furthermore, better and better learning algorithms are being developed. So, hypothetically speaking, we invent this machine: we'd know everything about how it thinks, sees, and feels. No longer is it correlating observations; we'd have actual knowledge of how it works, and what it sees. Would we still be able to postulate that the consciousness is separate?

abcde: thanks I'll have to read that later.
Satyagraha
posted by thebestsophist at 10:56 AM on June 24, 2005


thebestsophist : "we already have sensors and algorithms that are able to replicate most of the basic sensory perceptions that animals have."

You mean behaviour. The essence of the argument to this illustration is the same as that above. There's no indication of the phenomenal activity, if any. Only the schema of electrochemical activity.
posted by Gyan at 11:17 AM on June 24, 2005


Almost all dualists would say that the behavioral and physical correlates cause consciousness. Epiphenomenalists go so far as to say it has no practical role, and is hence invisible. In any case, yes, a computer that accurately simulates human intelligence could be conscious, depending on the theory you're using, but those performing attributes wouldn't be consciousness, they'd just be what caused it. (This, again, is different from something like vitalism in that life is obviously a set of impressive physical characteristics - supervenient on the physical - whereas qualia, for most, don't seem to be, for instance in that they're totally private.)
posted by abcde at 12:21 PM on June 24, 2005


Gyan: I'm not only talking about behavior, I'm also talking about all sensory perceptions like shape and object recognition, taste/smell, touch, etc. All that can be artificially reproduced as well, and I'm guessing that's what you're referring to as phenomenal activity.

abcde: But then what is consciousness other than the sum of its parts? What makes consciousness more than all the things that make it?

What both of you are saying makes sense; I'm just not sure why we should be leaping to think that there is something more if there's no evidence for it, rather than saying we just don't have enough information to understand how it could all be physical. Though I do find it a very intriguing way to look at thought.
Satyagraha
posted by thebestsophist at 5:07 PM on June 24, 2005


thebestsophist : "I'm also talking about all sensory perceptions like shape and object recognition, taste/smell, touch, etc. All that can be artificially reproduced as well, and I'm guessing that's what you're referring to as phenomenal activity."

You're overreaching. Does a thermometer need to "feel" the temperature, in order to give out a reading? We normally think, No. It's just a plain simple reaction of the agent (say, mercury) to the input. All those sensors are doing is signal processing. Let's take the example of a digital camera. Are you suggesting that the camera "sees" the pictures? It's matter being manipulated, based on input and output. We don't know if any phenomenal activity is occurring. I think what's confusing is that since our self-narrative tends to be in terms of consciousness ("I saw the car, and quickly got out of the way"), you're ascribing phenomenal consciousness to devices that exhibit humanlike behaviour. In essence, this comes down to that earlier argument I made: "The intuitive criteria we use, is behaviour. Everything that looks like me, moves like me...etc is assumed to have the rest i.e. a consciousness of its own." Maybe they do, but we don't know.
posted by Gyan at 5:28 PM on June 24, 2005


thebestsophist: (to repeat,) It's possible to imagine someone being dead inside and still operating physically, which means the physical components aren't consciousness, they're at best just its means of action.

Paul Ganssle says hi.
posted by abcde at 5:58 PM on June 24, 2005


Though a thermometer does not feel temperature, machines can be programmed to note temperatures and react to them accordingly. Robotic arms in assembly plants have been programmed to notice the hardness of the objects that they touch, and if an object is too soft (and therefore possibly human and not metal) they are programmed to stop before harm is caused, just as animals do. The Mars rovers have been programmed to "feel" the consistency of the medium they are traveling on and adjust accordingly. Can you say what we do is something more? Animals take information that they've collected and act according to how they perceive it; I don't see how any of that can't be replicated with complex enough programming.

I understand that we can't be completely sure of anything other than ourselves, and that argument is irrefutable; we will never be sure. However, it is generally a safe assumption to make and is fundamentally required if we want to be able to operate in this universe (or else we can just sit somewhere and contemplate whether or not our feet are actually there for the rest of our lives...which would be rather short without any nutrition).
Satyagraha
posted by thebestsophist at 6:04 PM on June 24, 2005
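
(A minimal sense-and-react loop of the kind thebestsophist describes - an editor's sketch with a hypothetical sensor and a made-up threshold, not any actual robot's code:)

    HARDNESS_THRESHOLD = 40.0       # below this, the object might be flesh rather than metal

    def read_hardness_sensor():
        """Stand-in for a real pressure/hardness sensor reading."""
        return 12.5                 # pretend measurement

    def grip_step():
        hardness = read_hardness_sensor()
        if hardness < HARDNESS_THRESHOLD:
            return "stop"           # too soft: halt before harm is caused
        return "continue"           # hard enough: keep gripping

    print(grip_step())              # prints: stop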


abcde: But I can imagine a unicorn; that doesn't mean that unicorns are real. I actually can't imagine someone being dead inside because I don't think there's an "inside" to be dead. I'll believe it when someone proves otherwise.

and umm...hi?
Satyagraha
posted by thebestsophist at 6:06 PM on June 24, 2005


thebestsophist : "Can you say what we do is something more?"

We experience. If a thermometer does the same, then No. Else Yes.
posted by Gyan at 6:09 PM on June 24, 2005


I'm not talking about the thermometer, I'm talking about a programmed machine (say a robot with an AI): it detects the temperature and reacts accordingly. Isn't that experiencing? What more do we do?
Satyagraha
posted by thebestsophist at 6:17 PM on June 24, 2005


What is experience other than the detection of sensation and reacting to it (mentally and physically, all of which can be programmed)?
Satyagraha
posted by thebestsophist at 6:22 PM on June 24, 2005


thebestsophist : "What is experience other than the detection of sensation and reacting to it."

What is it like to be a bat?
posted by Gyan at 6:32 PM on June 24, 2005


You believe it isn't physically possible, as do I, but if it's logically possible (a somewhat more debatable topic) that would entail dualism - anyway, just read those papers; I don't think I've said anything here that Chalmers hasn't already (he has all day to think about it ;) )
posted by abcde at 6:50 PM on June 24, 2005


Quick note before I go to bed: ...if it's logically possible (a somewhat more debatable topic) that would entail dualism...
Tautological validity does not imply first-order validity (nor logical validity, but I understood what you meant). Therefore, it would only allow for conceptual dualism, and not actual dualism. I am reading the papers (including the one you just linked, Gyan) but it's going to take me a while to respond; I have to pack up my dorm room to move home for the rest of the summer, and am still working on two servers. I promise I will respond sometime Saturday when I am better rested; I haven't ditched the discussion.
Satyagraha
posted by thebestsophist at 10:59 PM on June 24, 2005


Ugh, I almost fell asleep and then went "wait, that's not the right phrase." So scratch that; it should say "logical validity does not imply logical necessity." Very sorry, long day.
Satyagraha
posted by thebestsophist at 11:14 PM on June 24, 2005


Sorry, I said I was going to present a reply today, but I just have way too much to do. I'm reading through the papers, and I'll have a flight on Monday on which to write my response. If you'd like, I can email it to both of you instead of posting it here; it'll actually be more of a full response than a post.

Btw, I spend my whole day thinking about these things as well, the difference between him and me...other than the obvious difference in experience due to age, and the opposing points of view, is that he gets paid to think about it while I'm paying to think about it, hehe.
Satyagraha
posted by thebestsophist at 10:35 PM on June 25, 2005


Post it here.
posted by Gyan at 10:49 PM on June 25, 2005


I haven't forgotten about this topic; I'm still writing, and it's going to take me longer than a few days. As a preliminary, I want to give a quick response to Chalmers: in doing research into what other philosophers say, I'd point to Daniel Dennett's essay Are we explaining Consciousness yet?.

Furthermore, as a response to Nagel, I would like to say that his argument is too solipsistic. It is not falsifiable, and therefore is not for the fields of science or contemporary philosophy, it is more akin to theology. I know what I say is harsh, but it renders the entire argument unscientific and therefore just bad. If something interacts with the world, then its effects are observable, and therefore information about the object must be derivable. However, I will present a more comprehensive argument against it.
Satyagraha
posted by thebestsophist at 10:33 PM on June 28, 2005


thebestsophist : "If something interacts with the world, then its effects are observable, and therefore information about the object must be derivable."

If consciousness is epiphenomenal, it does NOT interact. If it isn't, you still have to show you can derive ALL information that there is.

"It is not falsifiable, and therefore is not for the fields of science or contemporary philosophy"

Does it matter? Must all truths of the world be available in a Popperian form? Isn't that like putting on blinders? Note that I'm not asserting that there ARE truths not amenable to science, though I think so; just wondering about the constraints.
posted by Gyan at 11:40 PM on June 28, 2005


If consciousness doesn't interact with the physical world at all, then we wouldn't know about it; it has to interact with the physical world to gather information about it so it can react to it.

Also, it does matter. Philosophy and science are about learning the truth; if we have no way of determining its validity, then we can't know anything about it, and it is therefore, to put it bluntly, in the realm of theology and not philosophy or science.
Satyagraha
posted by thebestsophist at 12:15 AM on June 29, 2005


thebestsophist : "If consciousness doesn't interact with the physical world at all, then we wouldn't know about it, it has to interact with the physical world to gather information about it so it can react to it."

I assumed 'interact' meant two-way interaction. Even in epiphenomenalism, there is a {brain -> mind} function, just nothing the other way around. Also, you don't know about my consciousness, you know about yours. I thought we covered this already.

thebestsophist : "Also, it does matter, philosophy and science are about learning the truth, if we have no way of determining it's validity, then we can't know anything about it, and therefore...to put it bluntly, is in the realm of theology and not philosophy nor science."

Philosophy is about truth, not science. And who said that divinity, i.e. theology, comes into the picture? Some true things are not provable.
posted by Gyan at 12:37 AM on June 29, 2005


I assumed 'interact' meant two-way interaction. Even in epiphenomenalism, there is a {brain -> mind} function, just nothing the other way around.

If interaction is one way, then decisions would not be able to change according to situation. For instance, imagine it's a hot day (not that hard during the summer), and you just finished a 5 mile jog. Someone hands you a cool cup of lemonade. Your body and mind go "Sweet, fluids!" and your consciousness reacts "Ice cold amazingness!" Now imagine this is repeated once a day for 1,000 days. Your mind and body will want the fluids every time; however, your consciousness will start getting tired of it and go "Oh... not again, this stuff is disgusting." Most likely, given enough time, you'll ask "may I have something different?" How could that happen if the consciousness isn't communicating with the mind? If communication were one-way, the mind and body would continue to ask for the lemonade without change, but there was a change: you asked for something different.

Also, you don't know about my consciousness, you know about yours. I thought we covered this already.

I'm covering this very extensively in my paper. Give me time; I'm going at about a page (single spaced) a day, and I promise I'll have something. Because I just flew back home, I don't have any of my philosophy books except Bertrand Russell's History of Western Philosophy.

Philosophy is about truth, not science.

I didn't say philosophy is about science; I said philosophy is about finding the truth, and to be able to find the truth you have to be able to test it, otherwise it's unprovable. That's why the study of logic came about. And historically science has been about truth; it's about the discovery of truth, and science has its roots in philosophy.
I say that unprovable things are in the realm of theology and not philosophy because they are unprovable. Mere speculation with no ability to discern its validity, I want nothing to do with it; ultimate existence and discovery are pointless if we cannot understand them. Why try if we can't grasp it? I have yet to come across anything without a truth value.

And who said that divinity, i.e. theology, comes into the picture? Some true things are not provable.
Only a theologian would say that. No actual philosopher I know would dare argue that. Philosophy is about taking a possible truth, taking it to its end, and finding its truth value.
Satyagraha
posted by thebestsophist at 1:21 AM on June 29, 2005


thebestsophist : "If interaction is one way, then decisions would not be able to change according to situation."

You are thinking in simple stereotypes. If (A), then (P). If (A)(B), then (P)(Q). If (A)(C) then (R). That seems to prevent you from realising that there are a gazillion components. Actual function would be like If (A1)(A2)(A3).........(A10000000000000035)(A10000000000000036)....etc then (S). Such behaviour would appear to have elements of both stereotyping and randomness. Stereotyping, because changes are incremental. Random, because after a certain threshold, changes are major. Think of complexity and chaos theory. After all, you said you did NOT believe in free will, meaning the appearance of a conscious will is just an illusion.

Your mind and body will want the fluids every time; however, your consciousness will start getting tired of it and go "Oh... not again, this stuff is disgusting." Most likely, given enough time, you'll ask "may I have something different?" How could that happen if the consciousness isn't communicating with the mind?

Eh, what's hard about this? Your brain is adapting, habituating and desensitizing to a certain stimulus. That's just reflected in your consciousness. As an analogy, the cycle of day/night is constant within our lives, however there was a time when there wasn't a Sun to beam light onto a planet, and there will be a time when there won't be one. You, as a physicalist, certainly don't believe that the Sun will literally "feel tired" of beaming light. Similarly, you wanting a change is just a psychological narrative of physical events.

"I didn't say philosophy is about science"

Neither did I. You initially said that philosophy & science are about truth. I replied, "Philosophy is about truth, not science." 'science' contrasts with 'Philosophy'; it doesn't substitute for 'truth'. I should have phrased that more clearly.

"I said philosophy is about finding the truth, and to be able to find the truth you have to be able to test it, otherwise it's unprovable."

This presupposes that truth must be provable. Prove that I did not sneeze 8 minutes ago.

"Mere speculation with no ability to decern it's validity, I want nothing to do with it, ultimate existance and discovery is pointless is we cannot understand it"

That's tough luck. You are not an idealist, which means that you don't believe that Truth is contingent on what you *want* it to be. So, if the Truth is unprovable, then it's unprovable. Whether the Truth is amenable to discursive reasoning or not, it remains the Truth.

"Only a theologian would say that."

I don't believe in a god, so stop bringing up theology.
posted by Gyan at 2:24 AM on June 29, 2005


Think of complexity and chaos theory. After all, you said you did NOT believe in free will, meaning the appearance of a conscious will is just an illusion.

Not exactly: free will is an illusion; consciousness itself I see as a complex system that processes perceptions and reacts (including mental reactions like "red kool-aid tastes red"). I'll explain why I can break consciousness down into these two distinct parts (and no others) in my paper.

Your brain is adapting, habituating and desensitizing to a certain stimulus. That's just reflected in your consciousness. As an analogy, the cycle of day/night is constant within our lives, however there was a time when there wasn't a Sun to beam light onto a planet, and there will be a time when there won't be one. You, as a physicalist, certainly don't believe that the Sun will literally "feel tired" of beaming light. Similarly, you wanting a change is just a psychological narrative of physical events.


Then what does the consciousness do? If all functions, including experience, are handled by the mind (to avoid confusion, we'll say the mind does all the physical processing), then what is left for the consciousness to do?

Eh, what's hard about this? Your brain is adapting, habituating and desensitizing to a certain stimulus. That's just reflected in your consciousness.

To reiterate and further my last argument: if the consciousness only observes the mind and does not provide any feedback to it, then our minds would not be able to react to it; therefore our minds in and of themselves must be complete systems that are able to handle everything from continually beating our hearts to deciding that green jello tastes green.

Neither did I. You initially said that philosophy & science are about truth. I replied, "Philosophy is about truth, not science." 'science' contrasts with 'Philosophy'; it doesn't substitute for 'truth'. I should have phrased that more clearly.

Haha! We confuse each other and ourselves. If you don't mind, I'll turn to lessons from symbolic logic to break down my original statement of "Philosophy and Science are about finding the truth." The statement can be broken down into two sentences: "Philosophy is about finding the truth" and "Science is about finding the truth." To put it into crude symbolic logic:

(Philosophy -> Truth Finding) & (Science -> Truth Finding)
which in turn becomes
(Philosophy v Science) -> Truth Finding
where "v" means "or"

So a clearer way of saying it would be "If we are carrying on Philosophy or Science, then we are trying to find the truth." But due to a twist in English, the "or" becomes an "and" in everyday language.
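
If anyone wants to double-check that equivalence, here's a throwaway brute-force truth table in Python (purely illustrative; the variable names are just stand-ins for the sentences above):

from itertools import product

def implies(a, b):
    return (not a) or b

# P = "we are doing philosophy", Q = "we are doing science",
# R = "we are trying to find the truth".
for p, q, r in product([True, False], repeat=3):
    left = implies(p, r) and implies(q, r)    # (Philosophy -> Truth) & (Science -> Truth)
    right = implies(p or q, r)                # (Philosophy v Science) -> Truth
    assert left == right
print("equivalent on all 8 rows")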



This presupposes that truth must be provable. Prove that I did not sneeze 8 minutes ago.

I can't; I wasn't there to observe it. If I was, or if I had a reliable source watching you all the time (I need to get into the Big Brother business), then I could prove whether or not you did. Therefore, it is provable with enough information and observation. That's the point of philosophy and science.


That's tough luck. You are not an idealist, which means that you don't believe that Truth is contingent on what you *want* it to be. So, if the Truth is unprovable, then it's unprovable. Whether the Truth is amenable to discursive reasoning or not, it remains the Truth.

Touché; however, my point still stands: I have yet to see proof that there are Truths that are unobtainable and unprovable.

I don't believe in a god, so stop bringing up theology.
Gomen, I'm not talking about god. I'm talking about a system of beliefs that cannot be proven; theology and religion aren't only about gods. Several eastern religions specifically don't have gods; they're about seeing the world in a specific way without needing it to be proven. Philosophy is specifically about taking an argument and proving its validity (or invalidity).
Satyagraha
posted by thebestsophist at 11:08 AM on June 29, 2005


No account of qualia is falsifiable (or confirmable), so any study of consciousness has to have an aspect of taking things for granted based on your own sensations - at least that's what dualists say. By the way, whatever crushing refutation you'll post is going to be dependent on your materialist intuition - you're satisfied by the idea that (roughly) since every material cause or manifestation of experience is visible, they constitute experience itself, and I can't see that. So don't anticipate me giving an equally exhaustive rejoinder since the disagreement isn't even something that can really be discussed.
posted by abcde at 11:19 AM on June 29, 2005


abcde: What if my argument shows that other people's experience can be observed? I wouldn't even dare think about presenting an argument without attempting to show that the discussion is one that can be had.
Satyagraha
posted by thebestsophist at 12:21 PM on June 29, 2005


Let me get the terms straight, so we aren't talking past each other.

Brain == physical matter & structure.

Consciousness == phenomenal canvas

Mind == the contents of consciousness i.e. the "painting" on the canvas

thebestsophist : "Then what does the consciousness do? If all functions are including experience is handled by the mind (for the sake of unconfusion, we'll say the mind does all the physical processing), then what is left for the consciousness to do? "

Nothing. The mind simply represents the brain. Let me pick a simple illustration. You feel hungry, so you head to the pantry, pick out a cookie, and eat it. According to you, you consciously reacted to the hunger and satisfied it. But as per physicalism, the brain is ruled by physical laws. Which means you felt hunger because of physical conditions and processing. Your heading to the pantry, picking up the cookie, placing it in your mouth, etc. were all the strict causal consequences of physical determinism. Your brain simply translated all that activity in an internally consistent manner. So that the phenomenal correlate of physical activity A is hunger; the correlate of physical activity B is experiencing the walk to the kitchen; the correlate of physical activity C is locating the cookie; and so on. The progression from A to B to C... is purely physical determinism. The "potency" of consciousness as an affective agent is an apparition.
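
As a crude code analogy (and only an analogy; I'm not claiming brains are lookup tables), here is what that picture looks like: a deterministic process steps from state to state on its own, and a "phenomenal log" records a label for each step but is never read back by the process.

# All names here are made up. The point is that the log mirrors every
# physical step but never influences which step comes next.
transitions = {"hungry": "walk_to_pantry",
               "walk_to_pantry": "pick_cookie",
               "pick_cookie": "eat_cookie",
               "eat_cookie": "sated"}
labels = {"hungry": "feeling of hunger",
          "walk_to_pantry": "experience of walking to the kitchen",
          "pick_cookie": "experience of locating the cookie",
          "eat_cookie": "taste of the cookie",
          "sated": "feeling of satisfaction"}

state = "hungry"
phenomenal_log = []                       # written to, never read from
while state in transitions:
    phenomenal_log.append(labels[state])  # the phenomenal correlate of this step
    state = transitions[state]            # the next step depends only on "physics"
phenomenal_log.append(labels[state])
print(phenomenal_log)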

thebestsophist : "I can't, I wasn't there to observe it, if I was, or if I had a reliable source watching you all the time (I need to get in the big brother business), then I could prove whether or not you did. Therefore, it is provable with enough information and observation. That's the point of philosophy and science."

That's no different from religious arguments. If I was Jesus in Palestine 2,000 years ago, I would have the inside scoop, as well (either way). Either I sneezed or I didn't. There is a truth value to that statement. And you are just saying, if conditions were different, I would know. But that's not an answer. The question is can you prove that I didn't sneeze, not whether you could if conditions were different. The obvious answer to that is No, barring some refutation from you.

thebestsophist : "I have yet to see proof that there are Truths that are unobtainable and unprovable."

What would constitute proof and demonstration?

I'm talking about a system of beliefs that cannot be proven; theology and religion aren't only about gods

I'm just saying that theology refers to the study of the workings of divinity. And only Christian studies are classified as theology. It's the Western category error to classify other religious apologetics as theology.

several eastern religions specifically don't have gods; they're about seeing the world in a specific way without needing it to be proven.

Actually, Eastern traditions claim that via luck or effort, it is possible to access higher states of consciousness, where metaphysical truths become known, just as sure you know you are conscious. Proof does not enter the picture, because these modes of consciousness are anterior to discursive reasoning, and not bound by it.
posted by Gyan at 1:49 PM on June 29, 2005


Your brain simply translated all that activity in an internally consistent manner. So that the phenomenal correlate of physical activity A is hunger; the correlate of physical activity B is experiencing the walk to the kitchen; the correlate of physical activity C is locating the cookie; and so on. The progression from A to B to C... is purely physical determinism. The "potency" of consciousness as an affective agent is an apparition.

Then my question still stands: if the consciousness does nothing, and there is no evidence for its existence, then why should it even exist? We should not assume the existence of something that we do not need.

The question is can you prove that I didn't sneeze, not whether you could if conditions were different. The obvious answer to that is No, barring some refutation from you.

Actually, the answer is yes. I could fly to where you were, analyze your hands and all other physical evidence for trace amounts of mucus, see whether or not there is a heightened amount of moisture, etc., and estimate the dispersion of air caused by a possible sneeze (or 8 possible sneezes) and the time elapsed. Of course this would require sensitive measuring equipment that has not been invented yet, as well as extremely complicated calculations. In other words, given the information, I can prove that you did/did not sneeze, which is the same thing with consciousness: given enough information we can prove whether or not consciousness is physical.

What would constitute proof and demonstration?
An irrefutable logical gap. Mind, I take Hume's critique of causal deduction very seriously, and if you make an argument against materialism from that point of view, I admit I cannot refute it; however, given probability, things do tend to proceed in a causal manner, and therefore can be (for all intents and purposes) counted as such, otherwise no scientific study or learning can take place. Furthermore, before you say that there is a logical gap between consciousness and cognitive functions, I yet again say that I will attempt to bridge the gap in my paper.

I'm just saying that theology refers to the study of the workings of divinity. And only Christian studies are classified as theology. It's the Western category error to classify other religious apologetics as theology.

Theology isn't only the study of Western religion; it is "the study of religious faith, practice, and experience." While it is true that it generally refers to Christian studies, it is not limited to such, and I use it in the broader sense to describe all religious studies.

Actually, Eastern traditions claim that via luck or effort, it is possible to access higher states of consciousness, where metaphysical truths become known, just as sure you know you are conscious. Proof does not enter the picture, because these modes of consciousness are anterior to discursive reasoning, and not bound by it

Off topic? I used it as an example to show that not all religion is centered around a belief in a deity, and only as an example of such.
Satyagraha
posted by thebestsophist at 2:43 PM on June 29, 2005


To clarify: you're arguing that consciousness is a medium in which cognition (activities of the mind) operates, yes?
Satyagraha
posted by thebestsophist at 3:37 PM on June 29, 2005


thebestsophist : "if the consciousness does nothing, and there is no evidence for it's existence, then why should it even exist?"

Good question. Assuming physicalism, it shouldn't.

"We should not assume the existance of something that we do not need."

I assume your consciousness exists, and I know mine does.

"I could fly to where you were, analyze your hands, and all other physical evidence for trace amounts of mucus, see whether or not there is a highetened amount of moisture, etc. Estimate the disperation of air caused by a possible sneeze (or 8 possible sneezes) and time elapsed. Of course this would require sensitive measuring equipment that has not been invented yet, as well as extremely complicated calcuations, in other words given information I can prove that you did/did not sneeze"

Even granting this far-fetched scenario, how would you distinguish a sneeze 8 minutes prior to that earlier comment from one 7 minutes prior?

"While true it generally refers to christian studies, it is not limited to such, and I use it in the broader term to describe all religious studies."

My point is, not all metaphysical speculation is theology. None of my statements have a religious color to them, so invoking the domain of theology is misplaced.
posted by Gyan at 5:37 PM on June 29, 2005


Good question. Assuming physicalism, it shouldn't.

Are we both arguing for physicalism? I'm confused, I'm arguing against dualism here...

I assume your consciousness exists, and I know mine does.

I'm not arguing the existence of the consciousness; I'm arguing against the existence of a consciousness that is apart from the physical mind.

Even granting this far-fetched scenario, how would you distinguish a sneeze 8 minutes prior to that earlier comment from one 7 minutes prior?

Calculating time is part of the equation; the dispersion of moisture over 60 seconds is very apparent given small amounts. I admit fully that this is completely impractical, given the massive amount of calculating power that would be needed; such a project would take more processing power than the entire world's computers combined would allow.


My point is, not all metaphysical speculation is theology. None of my statements have a religious color to them, so invoking the domain of theology is misplaced.


You are correct: not all metaphysical speculation is in the domain of theology; in fact, a lot isn't. What I'm arguing is that metaphysical speculation with no way of determining validity, based only on belief, is. I know it's a harsh and abnormal point of view, and somewhat irrational, but it's due to a whole lot of bitterness from dealing with a lot of people calling unjustified speculation philosophy.
Satyagraha
posted by thebestsophist at 7:13 PM on June 29, 2005


thebestsophist : "Are we both arguing for physicalism? I'm confused, I'm arguing against dualism here..."

I'm arguing against physicalism. Assuming physicalism is true, there is no good reason for consciousness to exist.

"I'm arguing against the existence of a consciousness that is apart from the physical mind."

Well, consciousness exists, and if consciousness can affect things, then physicalism is not true, unless you bizarrely extend the term 'physical' to mean objects of consciousness.

"the dispersion of moisture over 60 seconds is very apparent given small amounts"

My point is, in a chaotic system, which the world can be (i.e. weather), you would need to measure to infinite accuracy to get a confident answer. Suppose you had the measuring equipment and processing power you want, you fly here, and you get some numbers, say, 4.77874643 & 4324.24443334. From this, your result is that I sneezed at 7:48 prior to the comment. But in a chaotic system, if you had more precise measurements like 4.7787464273 & 4324.244433341042, the difference in the result might be completely disproportionate to the difference of the input, so your answer might turn out to be 6:12 prior to the comment. In other words, you would need to measure with infinite accuracy, hence being unable to ascertain whether I sneezed 8 minutes prior to the comment.
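
To give a feel for that sensitivity, here's a toy illustration in Python using the logistic map (it has nothing to do with weather or sneezes in particular; it's just a standard example of a chaotic system): two starting values agreeing to eight decimal places end up nowhere near each other after a few dozen iterations.

# Sensitive dependence on initial conditions: the logistic map x -> 4x(1-x).
def iterate(x, steps=40):
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
    return x

a = iterate(0.37787464)
b = iterate(0.37787465)     # differs only in the 8th decimal place
print(a, b, abs(a - b))     # after 40 steps the two results differ wildly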

You know what, let me pick a better, unambiguous example. Prove that at 15:00 GMT yesterday, I wasn't thinking of, say, a friend.

"What I'm arguing is metaphysical speculation with no ability way of determining validity, and based on only belief is."

Most of Western ontology would then be classified as theology. I have no objection to you calling such speculation useless, but theology has a specific meaning, and you're perverting it.
posted by Gyan at 6:45 AM on June 30, 2005


I'm arguing against physicalism. Assuming physicalism is true, there is no good reason for consciousness to exist... ...Well, consciousness exists, and if consciousness can affect things, then physicalism is not true, unless you bizarrely extend the term 'physical' to mean objects of consciousness.

In other words, you would need to measure with infinite accuracy, hence being unable to ascertain whether I sneezed 8 minutes prior to the comment.

As I said, theoretically possible, but practically not. :-)

No good reason for consciousness as a metaphysical medium. I repeat: I am arguing about the nature of what consciousness is, not whether or not it exists. I'll explain myself more in depth (yet again) in the paper.


Most of Western ontology would then be classified as theology. I have no objection to you calling such speculation useless, but theology has a specific meaning, and you're perverting it.

I am; consider it a failing of a bitter philosophy student.

Satyagraha
posted by thebestsophist at 10:29 AM on June 30, 2005


thebestsophist : "As I said, theoretically possible, but practically not. :-)"

Measuring to infinite accuracy is not a theoretical possibility either, unless you are omniscient.

Any idea when your paper will be ready?
posted by Gyan at 11:17 AM on June 30, 2005


I'll have it posted by Monday (thus making it a two-week-long paper... like class). I'd have it posted by Saturday or Sunday, except I'm getting dragged out of town, so I need to finish up a website design by tomorrow and therefore have less time to work on the paper. That's a promise.
Satyagraha
posted by thebestsophist at 2:45 PM on June 30, 2005


I apologize profusely for being late; we ended up staying a day longer than anticipated, so I got home Monday and have spent the entire time since finishing up the draft. It's really rough, and I haven't had time to put in proper citations yet. Though the base argument is there, please note that any inconsistencies are (hopefully) not due to lack of thought, but to inexperienced rhetoric, late-night writing sessions, and not enough proofreading. Enjoy (pdf)!
Satyagraha
posted by thebestsophist at 3:57 AM on July 5, 2005


I'll read it and comment on it here, with annotations.
posted by Gyan at 6:47 AM on July 5, 2005


(All quotations from this PDF, unless noted otherwise)

"We obviously have a consciousness, however the nature of our consciousness seems to be questionable."

Nothing obvious about 'we' having a consciousness. I, certainly. You? I assume so.

"While we understand much of how the brain learns remembers, and even emotions"

Not true. Our corpus of data includes ample correlations. There isn't yet a successful model that has proven predictive power. So, "understand" is a misplaced term.

"all events(perceived or not) either happen internally or externally. To clarify what I mean, I'm talking about everything, be it a ball rolling down a hill, a star going nova, or the idle thought of Socrates, it either happens within ourselves, or outside of it: in the case of these three, all happen outside."

Does not follow in the last example, depending on what you meant. I assume 'idle thought of Socrates' refers to Socrates having a thought, rather than I having a thought about Socrates. If so, we do not know if this really happens, at all. If Socrates had consciousness, then true, else false.

"I need to make another division, there are external perceptions (sense perception), external reactions (physical reactions), internal perceptions (memories, etc) and internal reactions (emotional reactions, etc)"

This division is assumed, on the assumption of solipsism and epiphenomenalism being false.

"The interesting part about internal reactions is they are often outputted back in, thus we have streams of thought."

This is confusingly written. Are you saying that thoughts are threaded?

"Mind: The collective of current explainable cognitive processes,"

So, unexplained processes aren't part of the mind??

"Chalmers' question of how we can physically represent the conscious perception of the quality of colour (specifically deep blue) is an intriguing one. In order to answer this question, we have to ask what do we mean by quality, in this case, we can talk about it's hue, what makes the blue deep? Then ask the question what is deep blue? Answer: a dark shade of blue. Now we have information to work with. Why do we translate dark blue into deep blue? What makes the dark deep? From here we are in familiar territory, we tend to experience deep places to be dark, therefore deepness is correlated with darkness(like say, the ocean or a lake), again back to the familiar realms of memory and sensory perception."

You've simply correlated one judgement of a perception to a judgement of another one. I don't see Chalmers' concern of 'physical representation' being tackled.

"Nagel makes note that we cannot actually know whether or not any other being perceives the same way we do. He is correct and this claim is irrefutable, however, to follow the course of this claim to the fullest we then have to question whether or not any other being exist outside of ourselves, and to follow Hume's arguments, we must also refute knowledge of all causal effects, we only perceive and assume that the perceptions are caused by something is out there. In response to this, I also take note from Hume's book and say: yes, we can never be sure, we only know that the things are probably out there, and likelihood is that other people perceive the same way. Therefore, it is an assumption that we have to make if we want to make any progress in understanding anything about the world around us."

This is the meat of the matter, and you've as good as admitted defeat.

Two points:

1) If I'm the only one conscious, then "an assumption that we have to make" is wrong and leading me down the wrong path towards understanding the world. How can I tell?

2) If epiphenomenalism is wrong, and consciousness interacts with the physical, then there is no need for an assumption in the first place. It should be possible, in principle, to conclusively demonstrate that consciousness is physical. The very fact that one needs to make an assumption, indicates that physicalism is not the right track.

"Furthermore, in direct response to his claim that we cannot every truly understand how bats perceive through hearing, we can conceptually understand it, human hearing is stereo, we do have a basic sense of directional hearing, and taking the assumptions from the last sentence, we can (rather safely) assume that bats just have a heightened sense of directional hearing."

It's begging the question to say, we can "safely" assume. If it's an assumption, how do you know it's a safe one?

"Nagel comments on the inability to communicate what a thing looks like to someone that has been blind since childbirth, recent developments in technology have allowed blind people to see with high resolution video cameras and neural implants, though this does not mean we cannot explain the act of “seeing” beyond the conceptual, it does show that seeing and (partial) knowledge of how others visually perceive things is possible: they do it the exact same way we do"

It can't be partial knowledge if they do it the exact same way we do, though I suspect this is just a linguistic goofup. This again comes down to the earlier explanation I gave about the physicalist's problem.

"Consciousness, is no longer a medium upon which all our memories, thoughts, and perceptions are developed and imprinted. Instead, I suggest we use it as an overarching term for all cognition,"

You have just changed the terminology. Why does cognition give rise to any perception? Or in a looser sense, why does there exist any perception to be correlated with the cognitive processes?

"In order to have an advantage an animal needs to be able to gather all information and react to it as quickly, efficiently, and correctly as possible, conscious experience is a very important factor because it allows for efficient catagorization and reaction to stimuli"

But why is consciousness required at all? Consciousness, as per physicalism, is totally physically-based. The laws of physics will take care of everything. BTW, evolution isn't goal-driven. That's the narrative we lay atop past events. It's a tautology, those who could survive, survived.
posted by Gyan at 7:41 AM on July 5, 2005


Preliminary question: what do you mean by perception? I'll post a response later tonight.
Satyagraha
posted by thebestsophist at 1:40 PM on July 5, 2005


Perception ==> any 'feeling' or 'experience'.
posted by Gyan at 1:44 PM on July 5, 2005


First, I'd like to apologize: I obviously didn't explain things clearly enough. I tend to make leaps that work perfectly in my head without fully explaining them; furthermore, I tend to make references to things I think are obvious, but which aren't, because people don't read the exact same things that I do. Also, I am a very literal person, and therefore everything I write and say is exactly what it seems. Also, I won't be responding to you in the order you presented; there are many issues you bring up that I want to address together.

Not true. Our corpus of data includes ample correlations. There isn't yet a successful model that has proven predictive power. So, "understand" is a misplaced term.

We understand how certain thoughts and mental activity coincide; furthermore, we understand in what regions of the brain different types of decisions are made. That's what I mean by understand, nothing more. Furthermore, we do have a sort of predictive power: we can predict how certain chemicals affect mood and consciousness, and we know for certain that certain stimuli will result in certain types of behaviors.

Does not follow in the last example, depending on what you meant. I assume 'idle thought of Socrates' refers to Socrates having a thought, rather than I having a thought about Socrates. If so, we do not know if this really happens, at all. If Socrates had consciousness, then true, else false.

It is exactly as it seems, nothing more; don't nitpick the obvious, it's not philosophically or academically productive.

This is confusingly written. Are you saying that thoughts are threaded?

I apologize, I should have written it more clearly; I wish I could find the exact quote, but at the moment I only remember the idea of it.

So, unexplained processes aren't part of the mind??

According to the dualist definition? No, anything unexplained is consciousness.

You've simply correlated one judgement of a perception to a judgement of another one. I don't see Chalmers' concern of 'physical representation' being tackled.

We have to go back to child psychology for this: all conscious perception is based upon previous experience. Perceptions build upon one another in memory, back to the initial base conscious perceptions we have when we are children and toddlers; these have no experience to give them any meaning, and therefore are only processed without any previous experience (only sensory input) and put away into memory, both of which are basically understood (as Chalmers said).

It can't be partial knowledge if they do it the exact same way we do, though I suspect this is just a linguistic goofup. This again comes down to the earlier explanation I gave about the physicalist's problem.

I only presented it as a side note, as evidence for a physical view; take it as such.

But why is consciousness required at all? Consciousness, as per physicalism, is totally physically-based. The laws of physics will take care of everything. BTW, evolution isn't goal-driven. That's the narrative we lay atop past events. It's a tautology, those who could survive, survived.

Read the paragraphs before it; I suggested that consciousness isn't an actual separate entity, but only a conceptual distinction. Furthermore, I know evolution isn't goal-driven; I never presented it as such. I said that those that were the most conscious were the ones that were most likely to survive because they were better able to analyze and react.

Nothing obvious about 'we' having a consciousness. I, certainly. You? I assume so.... ....This division is assumed, on the assumption of solipsism and epiphenomenalism being false.... ....This is the meat of the matter, and you've as good as admitted defeat.....is physical. The very fact that one needs to make an assumption, indicates that physicalism is not the right track.

I lump these all together because they're based on the same principle: Nagel's critique is irrefutable. We will never be able to know whether or not there is anything beyond ourselves; Descartes made that point already: the only thing that is definite for me is “I am, I exist.” However, this is a very limiting stance to take; we can only go around saying “Cogito ergo sum” for so long before we start wondering if there is anything else we can know. As Hume has shown, we cannot actually prove that causal relationships are true; we only know that things tend to happen one after another. However, given tendencies, we do have probabilities of things happening in succession, and these probabilities are close enough that we can consider them causal relationships. Without this, we would not be able to make any scientific advances, or even any experiential claims, such as Hume's own example: bread nourishes me.
Satyagraha
posted by thebestsophist at 2:25 AM on July 6, 2005


thebestsophist : "Furthermore, we do have a sort of predictive power, we can predict how certain chemicals affect mood and consciousness, we know for certain that certain stimuli will result in certain types of behaviors."

Both have nothing to do with brain sciences; they come from firsthand and secondhand observations (i.e. watching other people and talking to them).

"According to the dualist definition? No, anything unexplained is consciousness."

Oh, dear. Mind is the sum total of all objects of consciousness, not distinct from it.

thebestsophist : "back to the initial base conscious perceptions we have when we are children and toddlers"

These are not understood, w.r.t their 'physical grounding'.

thebestsophist : "We will never be able to know whether or not there is anything beyond ourselves, Descartes made that point already, the only thing that is definite for me is “I am, I exist.” However, this is a very limiting stance to take, we can only go around saying “Cogito ergo sum” for so long before we start wondering if there is anything else we can know."

Tough luck. You still haven't shown consciousness to be physical, despite agreeing with Nagel. You only have an assertion of a 'conceptual distinction' whatever that means.

"I said that those that were the most conscious were the ones that were most likely to survive because they were better able to analyze and react."

And my reply was that consciousness is irrelevant here, since in physicalism, physics underlies everything. Consciousness has to proceed according to physical determinism, so it is not a free, potent agent, thus rendering it superfluous.
posted by Gyan at 8:17 AM on July 6, 2005


Both have nothing to do with brain sciences; they come from firsthand and secondhand observations (i.e. watching other people and talking to them).

Both have everything to do with neuroscience; the brain works via chemicals.

Oh, dear. Mind is the sum total of all objects of consciousness, not distinct from it.

Sorry, I had used that as a response to a comment you (or maybe it was abcde) had made much earlier. In that case, I ask again my question: how can brain and mind interaction be one-way? Even if the body is desensitized by stimuli (in reference to this), then again I ask how the mind can not interact with the physical body/brain: not only would you ask for something different, you'd say why. Furthermore, as in Chalmers' example of experiencing a deep blue, we not only experience it, we can talk about our experience of it.


These are not understood, w.r.t their 'physical grounding'.

Sorry, correction, I meant to use 'sense perception' not 'conscious perception.' Then to return to my argument with the darkness of a blue: as Chalmers likes to point out, this is a subjective observation; we categorize it as "dark" because, when compared to other shades of blue, it is darker.

Let me use another example that has constantly been popping into my head for the last few weeks. I remember seeing this in an episode of Scientific American about 6-8 years ago. (I've been slowly going through their archives for the last week to find it.) Psychologists were doing tests with babies about 4 months to a year old. One of the experiments was that they first let babies play with a small stage that had a block on a table; the block was completely free moving. Then the baby would watch as a finger slowly pushed the block off the table, where it would stay in the air. When they showed this to babies that were very young, they didn't react. However, when older babies watched, they would react (i.e., they cried).

What's the difference? Obviously the older babies had the experience to know that the block should have fallen, and therefore reacted to the oddity. But what makes it an 'oddity'? Chalmers says that this is exactly where consciousness is needed, because if everything was physically determined, all that would happen would be the processing of information, in other words "100 fall, 1 float." It's the experience that makes it "odd." This is a very compelling argument; however, we need to break down what makes something "odd." As I see it, this is the crux of our discussion. What makes this "odd" instead of just information? To address this, we can look at learning, because learning is what makes experience. First is the information, noting that something did not fall; second (and the important part of this conversation) is categorization (which I also consider solved, because we've got pretty good categorization algorithms): when the information is processed (by comparing it to expected results based upon previous observation), it is categorized as different (odd), and a reaction is triggered. Everything that is happening here can be simulated artificially, and therefore I consider these to be problems that can be physically grounded. In this case, the reaction is an emotional one of confusion, and yet again emotion is also basically in the realm of the understood (I say this because it has been replicated).

Why would the note of oddity trigger a reaction? Because, as I stated earlier, the ability to react quickly and correctly is a trait that would be evolutionarily desirable; not that it is a goal, but those that are able to react correctly are the ones most likely to survive, and therefore to reproduce, spread genetic material, and teach.


And my reply was that consciousness is irrelevant here, since in physicalism, physics underlies everything. Consciousness has to proceed according to physical determinism, so it is not a free, potent agent, thus rendering it superfluous.

It isn't a free agent (no free will), but that does not mean it isn't a product of physical determinism, which is what I've been arguing. It is not superfluous, because it (conscious experience and reaction) causes change, and is therefore just as much an agent for change as a thunderstorm or gravity.
Satyagraha
posted by thebestsophist at 2:39 AM on July 7, 2005


thebestsophist : "Both have everything to do with neuroscience, the brain works via chemicals."

Not the point; you mentioned that "we can predict how certain chemicals affect mood and consciousness". On the basis of neuroscience, we can't. It's by trial & error. You think that one test agent will be an analgesic because its functional group is similar to, say, endorphins. But you still have to pop it in and find out. And most of the time this doesn't work. Alexander Shulgin derived hundreds of tryptamines and phenethylamines. Not all of them were psychedelic. If the predictive power existed, you could look at a molecular structure and predict, but you can't. Not yet. Otherwise, they wouldn't waste billions in drug development. They would just figure out the structure, by deduction. But they can't.

thebestsophist : "I ask again my question of how can brain and mind interaction be one-way?"

How can it be two-way, in physicalism?

thebestsophist : "Chalmers says that this is exactly where consciousness is needed, because if everything was physically determined, all that would happen would be the processing of information in other words '100 fall, 1 float.'"

The '100 fall, 1 float' makes it odd. When it happens 100 times, the brain impulses cascade one way; when it doesn't, it cascades another. You should read On Intelligence. The theory described therein deals with exactly such types of scenarios.

thebestsophist : "It is not superfluous, because it (conscious experience and reaction) causes change, and is therefore just as much an agent for change as a thunderstorm or gravity."

Let's work using a comparison. Take two cases: one, a zombie, and another, a conscious human. In the zombie's brain, the motion of every organelle can be modeled by describing all the forces and the position/velocity vectors and mass. In short, we can predict the activity of an organelle by simply looking at the physical features and using appropriate dynamical physics. We aren't at QM scales, so those don't enter the picture. The same is true of the conscious brain. The physical world is causally closed. All there is, is mass, space, time, energy, gravity, EM, weak nuclear, strong nuclear, velocity, etc. in the classical realm, where the brain operates. Where does consciousness enter the picture?
posted by Gyan at 8:17 AM on July 7, 2005


thebestsophist : "Both have everything to do with neuroscience, the brain works via chemicals."

Not the point; you mentioned that "we can predict how certain chemicals affect mood and consciousness". On the basis of neuroscience, we can't. It's by trial & error. You think that one test agent will be an analgesic, because its functional group is similar to, say, endorphins. But you still have to pop it in, and find out. And most of the time this doesn't work. Alexander Shulgin derived hundreds of tryptamines and phenethyamines. Not all of them were psychedelic. If the predictive power existed, you could look at a molecular structure and predict, but you can't. Not yet. Otherwise, they wouldn't waste billions in drug development. They would just figure out the structure, by deduction. But they can't.

thebestsophist : "I ask again my question of how can brain and mind interaction be one-way?"

How can it be two-way, in physicalism?

thebestsophist : "Chalmers says that this is exactly where consciousness is needed, because if everything was physically determined, all that would happen would be the processing of information in other words '100 fall, 1 float.'"

The '100 fall, 1 float' makes it odd. When it happens 100 times, the brain impulses cascade one way; when it doesn't, it cascades another. You should read On Intelligence. The theory described therein, deals with exactly with such type of scenarios.

thebestsophist : "It is not superfluous, because it (conscious experience and reaction) causes change, and is therefore just as much an agent for change as a thunderstorm or gravity."

Let's work using a comparision. Take two cases: one, a zombie, and another, a conscious human. In the zombie's brain, motion of every organelle can be modeled by describing all the forces and the position/velocity vectors and mass. In short, we can predict the activity of an organelle, by simply looking at the physical features and using appropriate dynamical physics. We aren't at QM scales, so those don't enter the picture. The same is true of the conscious brain. The physical world is causally closed. All there is, is mass, space, time, energy, gravity, EM, weak nuclear, strong nuclear, velocity..etc in the classical realm, where the brain operates. Where does consciousness enter the picture?
posted by Gyan at 8:20 AM on July 7, 2005


Not the point; you mentioned....structure, by deduction. But they can't.

Point taken; however, I think that given enough time and research we will understand enough to have some sort of predictive power.

The '100 fall, 1 float' makes it odd. When it happens 100 times, the brain impulses cascade one way; when it doesn't, it cascades another. You should read On Intelligence. The theory described therein deals with exactly such types of scenarios.

...That's what I said...


The statement "100 fall, 1 float" is just information, but then the 1 float is categorized as odd, which results in a proper reaction.

Let me put it in the form of a program. Say we have a computer program that systematically analyzes the same type of event over and over (a block going off a table). Every time it sees a block, it records whether it falls or floats. It is programmed to learn that the events that happen more often are the ones most likely to happen. If something abnormal happens (something different from what has generally happened previously), it is to send a signal (we don't care where).

The first time a block falls, the program didn't know what would happen, so it reacts by sending a message (we can happily ignore this message; babies have firsts for everything, and generally are really happy about them and go "AGAIN!"). The second time, the block falls again and the information is put away, and again, and again, till 100 blocks have fallen; the program puts the information away, and nothing has gone against the expected.

When the 101st block comes, it floats. The program notes this, compares it to previous data, and notes that this is unexpected; it is different. This is where it is noted as "odd," and a message is sent off.

Now imagine that instead of sending a message only when something is different, a message is sent every time, saying whether or not the event was expected (odd or not). With this message also comes information on how expected the event was (which means the effect of an oddity is exponentially more unexpected). The message is sent to an emotional program that reacts, so when the blocks fall repeatedly it reacts less and less (it becomes desensitized to the continual stimuli). But when the 101st block floats, it gets a message that a very unexpected oddity has happened, and therefore reacts accordingly (in surprise), just as the baby did. I ask you, where does a metaphysical consciousness fit here? Everything in this situation can be programmed, and therefore has a physical explanation (in the form of electrical currents).
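
Here's a minimal sketch of that program in Python (the numbers, names, and thresholds are invented; the only point is that nothing non-physical is needed for the mechanism):

from collections import Counter

class Watcher:
    def __init__(self):
        self.history = Counter()              # how often each outcome has been seen

    def observe(self, outcome):
        total = sum(self.history.values())
        seen = self.history[outcome]
        expected = seen / total if total else 0.0   # fraction of past trials like this one
        surprise = 1.0 - expected                   # the "odd" signal
        self.history[outcome] += 1
        return surprise

def emotional_reaction(surprise):
    # crude stand-in for the "emotional program": bigger surprise, bigger reaction
    if surprise > 0.9:
        return "startled, cries"
    if surprise > 0.5:
        return "interested"
    return "bored"

baby = Watcher()
for _ in range(100):                          # 100 blocks fall
    s = baby.observe("falls")
print("100th fall:", emotional_reaction(s))   # desensitized by now
s = baby.observe("floats")                    # the 101st block floats
print("101st block:", emotional_reaction(s))  # very unexpected, so surprise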

How can it be two-way, in physicalism?

Let's work using a comparison.....Where does consciousness enter the picture?


Conscious thought is the sum total of all the parts of the mind that make a being react, all of which are physical. It isn't an entity 'in and of itself'; it is the sum total of all the processes that go on in the mind, and therefore the separation is conceptual, not actual. That means we can think of it separately (note I did not use the word separate) from the rest of the body, because it is easier to think of people as personalities, as opposed to bodies. However, when we talk about individuals, we still have to understand that the mind is a part of the physical body. To repeat, consciousness is the sum total of all the brain's functions, from sense perception to learning to emotions and reactions. We have to stop treating consciousness as a separate process where you can point at a specific part of the brain and say "Consciousness is right here." Because conscious thought and action can be considered the sum total of everything that we do (mentally, mind the laxity in my use of terms), different parts of consciousness must reside in different parts of the brain.

Satyagraha
posted by thebestsophist at 11:35 PM on July 7, 2005


Everything in this situation can be programmed, and therefore has a physical explanation (in the form of electrical currents).

Exactly. So, why any consciousness?

thebestsophist : "consciousness is the sum total of all the brain's functions, from sense perception to learning to emotions and reactions."

You keep on going back to square one. Why does any of that processing have to result in feelings and perception??? Like you said, the entire thing is physically-driven. So, if I place a red ball in front of you, ask you to pick it up, and examine your brain perfectly and simultaneously, why must you see the red ball and see yourself picking it up (or not)? Don't bring up the 'react' thing again. The photons from the red ball hit your eyes. The pressure waves of my voice impact your ears. Both of these start a train of activity conditioned by past experience and result in activating your motor neurons such that your hand reaches out and picks it up. If you have it, your consciousness simply reflects all these changes. When the signal from the photons is being processed, you see the red ball. Similarly, you hear my voice. Based on past conditioning, you feel your "decision" to lift or not, depending on how the neural network acts out, and if you lift, you feel and see your hand moving out to pick it up. So, why have these sensations, corresponding to physical brain processors, in the first place?
posted by Gyan at 8:21 AM on July 8, 2005


Exactly. So, why any consciousness?

As I said before, it is a conceptual distinction.

You keep on going back to square one. Why does any of that processing have to result in feelings and...... ...that your hand reaches out and picks it up. If you have it, your consciousness simply reflects all these changes. When the signal from the photons is being processed, you see the red ball. Similarly, you hear my voice. Based on past conditioning, you feel your "decision" to lift or not, depending on how the neural network acts out, and if you lift, you feel and see your hand moving out to pick it up. So, why have these sensations, corresponding to physical brain processors, in the first place?

From the eye to the brain, it works exactly like a camera; ditto for the ear to the brain, like a microphone. However, that is not enough; those are just unprocessed pixels and sound waves. Within the brain, complex algorithms (some maybe learned, some perhaps genetically pre-programmed, we don't know) are able to distinguish shapes and sounds. Shape/object recognition and sound distinction have both been physically duplicated.
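
To give a concrete feel for "algorithms turn raw pixels into shape," here's a toy edge detector in Python (purely illustrative; the image and threshold are invented, and real vision is of course vastly more complicated):

# Mark pixels where brightness changes sharply to the right or below;
# the "#" output shows where the edges of the bright square are.
image = [
    [0, 0, 0, 0, 0, 0],
    [0, 9, 9, 9, 9, 0],
    [0, 9, 0, 0, 9, 0],
    [0, 9, 0, 0, 9, 0],
    [0, 9, 9, 9, 9, 0],
    [0, 0, 0, 0, 0, 0],
]

def edges(img, threshold=5):
    h, w = len(img), len(img[0])
    out = [[" "] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            right = abs(img[y][x] - img[y][x + 1]) if x + 1 < w else 0
            down = abs(img[y][x] - img[y + 1][x]) if y + 1 < h else 0
            if max(right, down) >= threshold:   # a sharp change counts as an edge
                out[y][x] = "#"
    return out

for row in edges(image):
    print("".join(row))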

Satyagraha
posted by thebestsophist at 12:30 AM on July 9, 2005


thebestsophist : "As I said before, it is a conceptual distinction."

It's a tangible thing!! You see, smell, etc. I don't think you believe that a watch experiences the passage of time or that a fridge feels cold. The "conceptual" distinction refers to a tangible empirical phenomenon. In order to experience it, you have to be that brain. That's not true for anything else. You can't be a lamp because, presumably, there's nothing experienced by the lamp. Whereas I don't have to be that lamp to see that lamp, and neither do you. Consciousness is unique within the class of empirical phenomena.

thebestsophist : "From the eye to the brain, it works exactly like a camera, ditto for the ear to the brain like a microphone. However, that is not enough, those are just unprocessed pixels and sound waves."

It's all physical. Those 'complex algorithms' are also all physical. Nothing needs to be different, in essence, from a fridge or the chip in my watch. So, why produce consciousness?

thebestsophist : "Shape/object recognition and sound distinction have both been physically duplicated. "

I think I've tackled this a few times before. This is behaviour duplication, based on algorithms. We don't know if these technologies see or hear anything.
posted by Gyan at 10:00 AM on July 9, 2005


It's a tangible thing.... ....empirical phenomena.

I'm not refuting any of that; I'm just saying it can all take place within the physical brain. It doesn't have to be metaphysical.

It's all physical. Those 'complex algorithms' are also all physical. Nothing needs to be different, in essence, from a fridge or the chip in my watch. So, why produce consciousness?

Physically, it isn't different; what produces "consciousness" is the total cumulative effect of all the algorithms put together.

I think I've tackled this a few times before. This is behaviour duplication, based on algorithms. We don't know if these technologies see or hear anything.

Which is why I'm confused why you brought it back up. Furthermore, it is behavior duplication, and if it can be duplicated physically, that means it does not need a metaphysical explanation. You're obviously not completely following my entire argument, so let me start over from the beginning, from sense perception to complete thought. I obviously haven't been clear enough:

Sense data enters the eye and goes to the brain. With me so far? All physical, nothing controversial yet. Algorithms process the data and turn it into objects. Also physical.

From there the information is now of objects (or collections of chemicals in the air that will become what we know of as smells, etc). It is sent to the rest of the brain for processing. This is where things get tricky, so I'll slow down.

What do we have right now? We have sense data that has been interpreted into an object (let's use the example of a red playground ball, so a round object). From here several things happen: the round thing is compared to memory and recognized as a ball; the smell is compared to previous sense data and recognized as rubber; it's a red playground ball. Still physical.

However seeing and recognizing it as a red playground ball isn't all that happens. Once we see it, we immediately have a feeling about it, we see it as a red playground ball, and we think something about the ball, we experience the ball.

What is my experience of red playground balls? This draws upon previous experience with balls: perhaps I got hit in the face multiple times, in which case I wouldn't like them; perhaps I remember playing with one fondly on the playground with my parents, in which case it would be a good one.

From here we ask what experience is, and we can break it down into two things: the current sense perception, and the analysis of that perception against previous experience. That means experience is recursive, so we need to find a beginning. This isn't hard, however: humans have a definite start; we don't experience anything until we are conceived, so experience has a definite beginning. Because the first experience had nothing previous to build upon, it must consist only of sense perception and the analysis of it by whatever is inborn. Note that when I say inborn, I mean whatever has been genetically preprogrammed, nothing metaphysical. This means our ability to learn is based upon the neural net, which is also physical.
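
Here is a toy sketch of that recursion, assuming nothing beyond plain Python: each new percept is interpreted against everything interpreted so far, with a made-up "inborn" table standing in for the genetically preprogrammed base case.

# Toy sketch of "experience is recursive": each new percept is interpreted
# against everything interpreted so far, starting from an inborn base case.
# All names and values are invented for illustration.

INBORN = {"loud noise": "startle"}   # stand-in for genetic preprogramming

def interpret(percept, prior):
    # Interpretation = the current percept analyzed against prior experience.
    return prior.get(percept, "novel: " + percept)

def live_through(percepts):
    prior = dict(INBORN)             # base case: no learned history yet
    history = []
    for p in percepts:
        meaning = interpret(p, prior)
        history.append((p, meaning))
        prior[p] = meaning           # today's interpretation feeds tomorrow's
    return history

for percept, meaning in live_through(["loud noise", "red ball", "red ball"]):
    print(percept, "->", meaning)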

I believe I missed something, however. We know that when I see this ball I think of it as fun because of previous experience, but what makes it fun and enjoyable? This can also be explained in the neural net. The way the brain learns (and therefore the way "we" learn) is by positive and negative feedback. Obviously fun experiences are positive, but why? To answer this, we must ask what makes feedback positive or negative. Negative feedback is easier to answer: pain, anything that causes or may cause harm, is negative (with a caveat I will explain later). Positive feedback is harder; obviously it cannot include negative feedback. Furthermore, it has to be more than just the avoidance of negative feedback, or does it? You aptly said that evolution is not goal driven, and neural networks are a product of evolution; however, this does not mean that organisms themselves (which are, for the most part, subject to evolution) cannot be goal driven. With goals (and success at them), we have what can constitute positive feedback. Two questions arise from goals, however. The first is what these goals are, and I point to Maslow's hierarchy of needs for a quick all-encompassing overview. The second (and more subtle) question is how goal-driven organisms could come out of an evolutionary process which is specifically not goal driven. Where do these needs and wants come from?
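
Before returning to the question of where goals come from, here is a rough, purely illustrative sketch of the feedback-driven learning just mentioned; the update rule and the numbers are made up and are not a model of real neurons.

# Purely illustrative sketch of learning by feedback: a single number
# ("how much I like playground balls") nudged toward each reward signal.
# The update rule and the numbers are made up, not a model of real neurons.

def update(preference, feedback, rate=0.5):
    # Move the preference a fraction of the way toward the feedback signal.
    return preference + rate * (feedback - preference)

preference = 0.0                      # no opinion yet
for feedback in [+1, +1, -1, +1]:     # fun, fun, hit in the face, fun
    preference = update(preference, feedback)
    print(round(preference, 3))       # drifts up with play, drops after the hit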

To look at this, we can look at the most fundamental nature of organisms along with their most fundamental needs: to survive and reproduce. Why do almost all organisms need and want to do this? (I say almost, because many humans do not want to have children; their cells still replicate constantly, however.) Even the simplest single-celled organism wants to reproduce; its entire life is living and reproducing. Having spent the better part of the day thinking about this problem (not long in terms of philosophy or science), I still haven't come to an answer beyond this: in the primordial goo that generated the first living organism, the first self-reproducing organism did not only have to be genetically programmed to be able to reproduce, it also had to be programmed to want to reproduce. This is all physically possible, which is the important part (for this discussion).

To return to the problem of why the ball is fun: with the brain's ability to correlate (physically possible), the ball is seen as something to play with. Previous experiences include other toys that have also been "fun." From here, we can understand why it may be a positive goal. Why do we give babies toys to play with? Other than the obvious "to distract them, so we can do things," we have to look at why it distracts them. Toys give children --toddlers in particular-- a source of stimuli which allows them to learn, which for babies and toddlers is very important; the predisposition therefore quite possibly carried over and continued. This is one of many possible scenarios. By the way, learning can also be physically explained.

From here, we have a completely physical explanation of an experience of seeing an object, from primary sense perception to completed thought. Therefore, to answer your comment about whether algorithms can "see or hear" (please be more specific in the language you use): one algorithm can see and hear to the point where there is object recognition, while the next is able to take the object data and turn it into comprehension. To summarize, we can break down experiences into two things: the initial sense perception, and the analysis of that perception through previous experience. Consciousness is made of the integration of all the brain's functions. Because all these functions can be physically represented, there is no need for a metaphysical entity. I cannot stress enough that I am not saying consciousness doesn't exist; I am saying that consciousness is the term we use to conceptualize the sum total of all these processes.

Satyagraha
posted by thebestsophist at 2:40 AM on July 10, 2005


You have totally missed the point.

thebestsophist : "Sense data enters the eye. goes to the brain."

Incorrect. Photons hit the eye. Sound waves hit the ear. The particles they are incident on don't know of them as "sense data".

thebestsophist : "Algorithms process data and turn it into objects. Also physical."

What???? Objects only exist within consciousness, or sense perception. There are no 'objects' among neurons and neurotransmitters. The neuron in layer IV cortex doesn't "know" of the thalamus or the raphe nuclei. It's just receiving sodium and pumping out potassium...etc

thebestsophist : "We have sense data that has been interpreted into an object"

Nope. You still haven't explained how.

thebestsophist : "However seeing and recognizing it as a red playground ball isn't all that happens. Once we see it, we immediately have a feeling about it, we see it as a red playground ball, and we think something about the ball, we experience the ball."

Crucial confusion here. Seeing the ball is feeling it. Your neurons don't have a collective communal identity. They are unconscious. They don't know they're part of a single brain. They don't communicate to each other, "hey, you remember the smell, I'll remember the sight of the chocolate". All your latter paragraphs are useless, because they just try to explain formation of objects and relations within consciousness. You still haven't explained consciousness itself. The patterns and macroactivity that you are appealing to, are only visible to & interpreted as such, within a consciousness. The constituents (neurons) themselves don't have any such notions (of being a "processor").

thebestsophist : "Once we see it, we immediately have a feeling about it, we see it as a red playground ball, and we think something about the ball, we experience the ball."

Even within this confused description, I can ask, why must we 'experience' the ball?
posted by Gyan at 10:38 AM on July 10, 2005


Incorrect. Photons hit the eye. Sound waves hit the ear. The particles they are incident on don't know of them as "sense data".

Yet again, stop arguing minutiae; it detracts from your argument. I know how particles hit the eye.

What???? Objects only exist within consciousness, or sense perception. There are no 'objects' among neurons and neurotransmitters. The neuron in layer IV cortex doesn't "know" of the thalamus or the raphe nuclei. It's just receiving sodium and pumping out potassium...etc

Yet again, stop arguing the menial; you know what I meant, and that's good enough. Furthermore, I cannot answer how chemical reactions become object data, but then I also cannot explain how bits make up the machine code that is interpreted into the programming languages that I use, yet it works. If a 32-bit processor can make machine code into perl, python, and ruby, why can't a mass of neurons operating in parallel make electro-chemical impulses into a language that we are all so familiar with in our own personal way: thought?
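
As a toy illustration of that layering (a made-up stack machine, not a real processor or language implementation): each low-level operation knows nothing about the program's overall meaning, yet together the operations evaluate an expression.

# Toy stack machine, invented for illustration: each low-level operation
# knows nothing about the program's overall meaning, yet together they
# evaluate (2 + 3) * 4.

def run(program):
    stack = []
    for op, arg in program:
        if op == "push":
            stack.append(arg)
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

program = [("push", 2), ("push", 3), ("add", None), ("push", 4), ("mul", None)]
print(run(program))  # -> 20, though no single instruction "knows" that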

Nope. You still haven't explained how.

Most likely in a complex way similar to (though more subtle than) the way computers do. Before you ask why would it, I ask you: why wouldn't it? All the processes of conscious experience can either be physically explained directly (sense perception in the way of seeing light and detecting sound vibration), or duplicated with algorithms (or we have good ideas of how to do so), this means that all the processes that make up consciousness are (possibly) physically explainable. What else is there? In direct response to you stating that we do not know whether or not algorithms for object recognition can "see" I ask you what else is there to "seeing" that hasn't been explained? What makes a conscious being "seeing" different from something that has been programmed to "see." Ex nihilo nihil fit; therefore, until you can prove that there is a metaphysical medium from which everything works, don't attempt to impose it upon the physical realm, especially in relation to fields of science. We are constantly getting closer to robots (not androids, mind you) that act more and more human, all algorithmically explainable. And again, before you say we don't know whether or not it actually "sees" or "feels," ask yourself what else is a part of our "seeing" and "hearing" and "feeling." Break it down to its most fundamental parts.

Crucial confusion here. Seeing the ball is feeling it. Your neurons don't have a collective communal identity. They are unconscious. They don't know they're part of a single brain. They don't communicate to each other, "hey, you remember the smell, I'll remember the sight of the chocolate". All your latter paragraphs are useless, because they just try to explain formation of objects and relations within consciousness. You still haven't explained consciousness itself. The patterns and macroactivity that you are appealing to, are only visible to & interpreted as such, within a consciousness. The constituents (neurons) themselves don't have any such notions (of being a "processor").

Nor do they need to; consciousness is the sum total of what makes it up. Just as the hard drive does not know how the RAM operates, and the processor does not know how the front-side bus does, individual neurons do not need to know how the entire system works; it just does. Why? Millions upon millions of years of evolution.

Even within this confused description, I can ask, why must we 'experience' the ball?

What do you mean, why? As in how it was formed, or whether there is a reason experience should exist at all? Experience, like consciousness, is the sum total of all its parts, nothing more, and its parts are the analysis of current sensory data with respect to previous data; experience is seeing the ball and thinking about the ball (not thinking in the contemplative sense, but having a mental response to it). If you are asking how it could grow out of evolution, we can look at what experience is: analysis. Animals that are most able to analyze and adapt are those that are most successful; therefore, those that could experience best were the ones that survived and prospered.

I apologize, but this is going to be my last response; the deadlines for three of my projects have been cut tremendously short because of people going that way, and administration being, well... administration. In other words, a leisurely three months has been turned into what hopefully will not be a stressful one month (though it undoubtedly will be). I really enjoyed the discussion. Thank you.
Satyagraha
posted by thebestsophist at 10:53 PM on July 11, 2005


thebestsophist : "If a processor at 32-bits can make machine code into perl, python, and ruby

But it doesn't. You're the one who perceives the structure and function at the micro and macro level. You're the one who can consciously see the display and make sense out of what's displayed. That 'meaning' does not exist for the computer. It's just electrons moving around, according to physical constraints.

thebestsophist : "Before you ask why would it, I ask you: why wouldn't it? All the processes of conscious experience can either be physically explained directly (sense perception in the way of seeing light and detecting sound vibration), or duplicated with algorithms (or we have good ideas of how to do so), this means that all the processes that make up consciousness are (possibly) physically explainable"

Because consciousness itself hasn't been physically explained.

thebestsophist : "In direct response to you stating that we do not know whether or not algorithms for object recognition can 'see' I ask you what else is there to 'seeing' that hasn't been explained? What makes a conscious being 'seeing' different from something that has been programmed to 'see."

The seeing itself!!! When we say something is programmed to 'see', it's a figure of speech. We mean the thing is acting and behaving as if it could actually see. That's all. For all I know, my lamp can see, but it's inanimate and there are no structural isomorphisms (re: the eyes), so I find it bizarre to claim that it sees. It's simply the behaviour that we correlate. But behaviour is not consciousness.

Really, the point we've been discussing is an elementary one. I suggest that sometime in the future, you reread this debate.
posted by Gyan at 11:22 AM on July 12, 2005




This thread has been archived and is closed to new comments