Questioning Consciousness
January 30, 2008 10:00 AM

Questioning Consciousness. "To understand consciousness and its evolution, we need to ask the right questions." By Nicholas Humphrey, who was previously discussed here. [Via Disinformation.]
posted by homunculus (51 comments total) 29 users marked this as a favorite
 
I love the idea of exploring consciousness in an objective, observant manner (though my interests fall somewhere within the parameters of studying our consciousness of social phenomena, to which Humphrey only vaguely alludes). This is a fascinating article! Thank you for posting it.

At any rate, all of this consciousness-speak incites my yearning to apply to UCSC's exclusive History of Consciousness program. Did you see their faculty list?! SO AWESOME.

Also, my word of the day is "qualia." What a fantastic term!
posted by numinous at 10:16 AM on January 30, 2008 [1 favorite]


The way I explain consciousness to my 9-year-old daughter is that a cat could never be licensed to drive a car. For this reason: to drive a car requires you to look in the rear-view mirror. We interpret what we see in the mirror as part of the three-dimensional reality of the surrounding world we manufacture in our minds. A cat would look in the rear-view mirror and assume the ambulance was maintaining, paradoxically, a constant distance in front of us.
posted by Turtles all the way down at 10:28 AM on January 30, 2008


...and at the risk of repeating others' recommendations as regards the study of consciousness, Julian Jaynes' The Origin of Consciousness in the Breakdown of the Bicameral Mind, as well as the work of Daniel Dennett (The Mind's I, with Hofstadter, is phenomenal), is highly recommended.
posted by Turtles all the way down at 10:36 AM on January 30, 2008 [1 favorite]


Angry mysterians in 3... 2... 1...
posted by fleetmouse at 10:39 AM on January 30, 2008


Seconding Turtle's recommendation of Hofstadter's work. Fantastic stuff.
posted by Baby_Balrog at 10:47 AM on January 30, 2008


I have a question. To what extent do qualia account for experience? Take sight for instance. Is it the colour? Is it the luminosity as well? Is it the sensation that there is any light there at all? See, from my understanding of what qualia are, all of those things qualify. Surely then the answer to a question akin to "what possible purpose can they serve?" is that they stop you from going arse over tit off the edge of a cliff.
posted by vbfg at 10:49 AM on January 30, 2008


I don't know about the cat driving analogy... just being nit-picky. Crows and AI have predictive abilities, but does that mean that they have consciousness?

Maybe it's different degrees...

My neuro-professor said he likes to think that cows are like us when we are driving automatically. I mean, like when you are driving and you suddenly realize that you are where you wanted to get to, but don't remember the in between? Yeah, cows are like that all the time.

Or so he says; there's no way to ever prove it.
posted by countzen at 10:54 AM on January 30, 2008


Yes, good article. I think the author's final conclusion is the only one that can be drawn at this stage of our understanding and evolution.

Whether or not a cat (or a turtle) can get a driver's licence has nothing to do with consciousness. For the record, my cat happens to be a lot more conscious (or at least considerate) than many drivers I've observed.

I will concede that there may well be differing quality and depth of consciousness between different species (and even between individuals of the same species). I also wonder whether the consciousness of some species, like cetaceans, isn't so much inferior, as different, and they just don't get as worked up as we do about working with abstractions. OK Im

I also suspect that we do not currently represent the apex of consciousness, though we haven't yet met anything with an apparently higher level.

I think the author's final conclusion is the only one that can be drawn at this stage of our understanding and evolution. I can also tell you that I just finished a delicious tempura lunch and most of you will be able to understand what I meant, and some may even be able to conjure up a recollection of a similar experience. Like the author, it's enough explanation for me.
posted by Artful Codger at 10:58 AM on January 30, 2008


If he wants to know how all those things he describes with four-syllable words arise, he might want to read something by Erich Harth. Then, instead of making up a lot of new words, he might be able to explain something.

What Humphrey is calling "sentition" is the application of an iterative image recognition and sharpening algorithm to sensory data; Harth suggests that the RL algorithm will be similar to one he wrote called alopex, though with improved self-calibration features. (Harth's simple algorithm makes many of the same perceptual mistakes humans do.) It is the success of this algorithm at recognizing and sharpening features that we call "perception." You don't need a new word for it. Incidentally, this occurs in the thalamus, not the cerebral cortex, although the cortex supplies the patterns that are being recognized.

Going beyond Harth, when repeating data that aren't recognized as stored patterns keep popping up in the thalamus, the cortex records and strengthens them. We call this process "learning" and "memory." There is a distinct positive feedback associated with it, which we experience as the euphoria of an epiphany when we suddenly make and strengthen many new connections at once.

In addition to incoming patterns there are outgoing ones, of what muscle controls have produced what results in the past. As the back of our brain is measuring the situation, the front is applying the situation to possible motor controls that could modify it. If the situation applies to a known action, we get a remembered evaluation of whether that action made things better or worse. If the answer is "better," the action is broken up into smaller actions, compared to the current situation at lower levels of abstraction; often we notice that a corrective action that would be nice is not in fact possible because the lower-level abstraction inputs aren't there. But if they are, and the evaluation continues to be positive, it will trickle down to the lowest levels of abstraction at the motor homunculus and we will take action.

Everything else -- the inner landscape which Humphrey is trying so heroically to describe -- is the process of weighing what is happening against what we can do to optimize it against several different scales of measurement which we call "feelings" and "emotions."
posted by localroger at 10:59 AM on January 30, 2008 [2 favorites]
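
A minimal, hypothetical sketch (in Python) of the kind of correlation-plus-noise feedback loop described in the comment above. This is a toy illustration of an ALOPEX-style scheme, not Harth's actual algorithm; every name, shape, and parameter here is invented for the example. The point it illustrates is that only a single scalar "response" is fed back (no per-pixel gradient is ever computed), yet the noisy percept still drifts toward the stored pattern.

```python
# Toy ALOPEX-style loop: nudge each pixel in the direction that correlated
# with an improvement in one global response signal, plus exploratory noise.
import numpy as np

rng = np.random.default_rng(0)

template = (rng.random((8, 8)) > 0.5).astype(float)  # stored pattern ("memory")
percept = rng.random((8, 8))                          # initial noisy sensory guess

def response(img):
    """Scalar feedback: how well the current percept matches the stored pattern."""
    return -np.sum((img - template) ** 2)

step_size, noise_scale = 0.01, 0.01
prev_percept, prev_resp = percept.copy(), response(percept)
init_resp = prev_resp

for _ in range(3000):
    curr_resp = response(percept)
    delta_percept = percept - prev_percept        # what changed last step
    delta_resp = curr_resp - prev_resp            # did the match get better?
    prev_percept, prev_resp = percept.copy(), curr_resp

    # Reinforce pixel changes that coincided with improvement, reverse the rest,
    # and always add noise so the search keeps exploring.
    percept += step_size * np.sign(delta_percept * delta_resp)
    percept += rng.normal(0.0, noise_scale, percept.shape)
    percept = np.clip(percept, 0.0, 1.0)

# The match typically improves a great deal over the run, though noisily.
print(f"match went from {init_resp:.1f} to {response(percept):.1f} (0 is perfect)")
```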


Bonus points for finding the lapse of consciousness in my post above. Or lack of conscientiousness, perhaps.
posted by Artful Codger at 10:59 AM on January 30, 2008


In the case of consciousness, we cannot simply change our perspective to see the solution. We are all stuck with the first-person point of view. So the result is that we persist in questing for the qualia as such.

There are many ways to become unstuck, as it were. Kind of sloppy.
posted by prostyle at 11:02 AM on January 30, 2008


It seems dangerously fanciful to suggest that the value of consciousness from a natural competition perspective is "enjoyment".

More likely is that tool use is improved by the ability to assess the effectiveness of the tool from a mental distance.
posted by ewkpates at 11:05 AM on January 30, 2008


I think that we should be free to think about 'degrees' of consciousness. But I also think that's a red herring. We, as humans, are blessed with phenomenal intellectual ability: witness our ability not only to survive on the savanna but to send spaceships to the Moon, or to destroy our planet, for that matter. There is something fundamentally different—orders of magnitude—between the way we think and the way our 99% cousins, chimpanzees, think. This is the miracle of consciousness, probably related to our facility with language, which allows us to hold a model of reality in our 'mind', independent of what our senses are reporting to us at the time.

I can imagine a colony on Mars. I can imagine "Mars", and the spaceship that might take us there. I don't think any other animal can.
posted by Turtles all the way down at 11:06 AM on January 30, 2008


> There is something fundamentally different—orders of magnitude—between the way we think and the way our 99% cousins, chimpanzees, think. This is the miracle of consciousness

That is the miracle of hubris. I don't think that in order to be classified as conscious, a species must demonstrate the ability to post about it on Metafilter.
posted by Artful Codger at 11:12 AM on January 30, 2008 [3 favorites]


So, suppose you were looking at a ripe tomato: What might you want to explain about the qualia-rich red sensation that you are experiencing?

Wait... I always understood "qualia" as a hypothesis to help explain the binding problem, sort of like how dark matter is a hypothesis to explain gravitational forces observed in space. As in... you've got all these properties of a tomato coming in from different sensory channels, so how do they all come together to form the impression of a tomato? Do they come together in one cell, like the "grandmother cells" of vision, which is then linked to the word, "tomato"? However it happens, let's call it qualia and keep an eye out for it. Is this inaccurate?

This guy is using the term in a different way... he seems to use it to mean the subjective experience of a sensation. I guess I sort of understand now why Dennett doesn't like the term. It's somewhat vague.
posted by Laugh_track at 11:17 AM on January 30, 2008


The Origin of Consciousness in the Breakdown of the Bicameral Mind! That takes me back. The last time I saw that was when I was at school. It was on the shelf of a young teacher who was also into Camille Paglia - this was the early 1990s - and used to give my friend blowjobs. She also smoked weed with me.
posted by Mocata at 11:30 AM on January 30, 2008


Thanks for sharing.
posted by Snyder at 11:36 AM on January 30, 2008


Turtles all the way down, I disagree that the high-level analytical, modelling thinking that you're describing has much to do with consciousness, at a necessary level. At the learning stage, perhaps. But I think we fall into a habit of ascribing a lot of our talents to our consciousness purely because we have, at one point, been very aware in our conscious minds of the thought processes associated with those talents.

I don't need a concept of an "I", nor an internal monologue, nor a self-awareness of my thoughts, to process what's going on when I look in a mirror. On a more complex level, when was the last time you thought consciously about the mechanics of your driving? Maybe a conscious awareness of our thought processes and our ability to be purposefully goal-focused makes us able to learn skills in a way and with a proclivity that animals without consciousness can't and don't. But the majority of the stuff your brain does during the day -- even high-level, skilled, learned tasks and processing -- is done unaccompanied by conscious awareness.
posted by chrismear at 11:36 AM on January 30, 2008


In my dreamscape, time and death mean nothing. Seeing a dead friend in the dream doesn't surprise me at all. I accept that he is talking to me and we interact. If I were totally awake, my response would be one of shock and disbelief. I've lived days in my dreams only to wake and find that it has only been a few hours or minutes. Dual consciousness at its best.
posted by doctorschlock at 11:55 AM on January 30, 2008


I don't think that in order to be classified as conscious, a species must demonstrate the ability to post about it on Metafilter.

Yep, I find "I think, therefore I am" to be functionally useless. Prove to me that you think. What is implied is actually, "I communicate thought, therefore I am". (I prefer simply, "I communicate, therefore I am" -- the "thought" is implied.)

Is my cat conscious? Most people would say "no". My cat asks for food, recognizes me, plays, gets angry, etc., but all she can say is "Meow, meow, meow!" What if my cat could actually say "LordSludge, I'm hungry. Feed me?" or "I would like to be petted now." or "I feel happy." People would freak out. Not just because of the madd kitteh language skillz, but also because OMG ITS CONSCIOUS!!!

There's definitely a continuum. Definitely. Is a dead person conscious? Nah. What about a person in a persistent vegetative state? Maybe a little. What about a profoundly retarded person on heavy medication? What about a fairly stupid person who was raised without language skills and can only grunt n point? What about an embryo? Newborn? Infant? Toddler? Profoundly retarded child?

Sadly, and importantly, I think this accounts for a good portion of xenophobia -- that is, if we can't understand what other people are saying (damn japs with their ching-chang-chung, towelheads go lalalalalalalala!!!LOL!!1!, etc.) then they must be less than human, less than people we CAN communicate with.

Human nature is pretty fucked up. Now I'm depressed. Thanks, Metafilter!
posted by LordSludge at 12:20 PM on January 30, 2008


The only way to find out, I'd say, would be to take seriously the idea that consciousness is a trick, and think through what further questions would follow at a scientific level.
...
In response to sensory stimulation, we react with an evolutionarily ancient form of internalized bodily expression (something like an inner grimace or smile). We then experience this as sensation when we form an inner picture—by monitoring the command signals—of just what we are doing.


How is this "inner picture" formed? By monitoring. And what is "monitoring"? I guess, forming an inner picture.

And why is there any monitoring? (in the descriptive sense, not the teleological answer he provides at the end)

He's going around in circles.
posted by Gyan at 12:22 PM on January 30, 2008


It's like he's saying what we're all thinking.

But yeah, seriously, I like what this guy has to say. Trying to explain green to a blind man is a puzzle that's harder to understand when you acknowledge green doesn't exist beyond your own mind.

But I think he misses an obvious answer here:
"What mental processes can be performed only because the mind is conscious, and what does consciousness contribute to their performance?"
Modelling. The ability to create mental phantasms from sensory data allows us to model situations mentally, rather than by trial and error. This allows, among other things, tool use and engineering. It allows me to see a fallen tree as a bridge, and then to adapt that model to create a bridge from a sturdy branch, a sharp rock, some binding twine/vines, and a live tree.
If sentition appears phenomenal only when observed from the specific first-person viewpoint, this is bound to create major difficulties for those neuroscientists who hope to find the neural correlate of consciousness (the NCC) by studying the brain from the outside.
I'm curious what effect this would have on my attempt to find the NCC by lucid dreaming inside an MRI/PET scanner. Though with lucidity, you could argue I'm not working totally from the 'outside'. I'm still about 80-85% sure that if you compare the lucid and non-lucid images of the same subject's brain within REM sleep, the difference in activation will, if it doesn't isolate the center of consciousness and/or conation, at least shed some light on the processes active in it. There's still that nagging possibility that there will be no difference, which is (to me) even more interesting. I'm not sure if he's agreeing with the latter possibility, in effect saying that consciousness is all in our mind (and not our brain).

Every now and then I get these gentle tugs pulling me back towards graduate school. Dammit.
posted by Eideteker at 12:47 PM on January 30, 2008
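
For concreteness, here is a minimal, hypothetical sketch of the comparison described above: a voxel-wise contrast between scans from lucid and non-lucid REM in the same subject. The array names, shapes, and the crude t-like score are all invented stand-ins; a real fMRI/PET analysis would add registration, smoothing, and proper statistics (a GLM with multiple-comparison correction, for instance).

```python
# Toy voxel-wise lucid-minus-non-lucid contrast on made-up activation maps.
import numpy as np

rng = np.random.default_rng(1)

# Stand-in data: (volumes, x, y, z) activation maps for each condition.
lucid = rng.normal(0.0, 1.0, size=(20, 16, 16, 8))
nonlucid = rng.normal(0.0, 1.0, size=(20, 16, 16, 8))

# Mean activation per voxel in each condition, and the difference map.
diff = lucid.mean(axis=0) - nonlucid.mean(axis=0)

# Crude per-voxel t-like score to flag voxels that differ most between states.
pooled_se = np.sqrt(lucid.var(axis=0, ddof=1) / 20 + nonlucid.var(axis=0, ddof=1) / 20)
t_map = diff / pooled_se

# Report the voxels with the strongest lucid-minus-non-lucid difference.
top = np.argsort(np.abs(t_map), axis=None)[-5:]
print("top candidate voxels:", list(zip(*np.unravel_index(top, t_map.shape))))
```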


Oops: "Trying to explain green to a blind man is a puzzle that's easier to understand when you acknowledge green doesn't exist beyond your own mind."
posted by Eideteker at 12:48 PM on January 30, 2008


Shouldn't we be able to replicate "consciousness" in robots if this guy is right? Can the computer on my desk fall in love if it has the right combination of circuitry and hardware? Assuming the answer is no, then neither can we explain with science or techno jargon why the lump of cottage cheesy, red-jellowish, macaroni mass between my ears can decide to fall in love. I love because I love.

Exit sappy mysticist.
posted by boots77 at 12:51 PM on January 30, 2008


I'm also of the mind (ha!) that our concept of consciousness is a largely human conceit. Consciousness could be described as the phenomenological sum of our sense data and our responses to it, both potential and actual. So I'd consider consciousness to be present in every lifeform that interacts with the outside world. The idea of "degrees" or "levels" of consciousness is every bit as outdated as the "evolutionary ladder". Each being develops its own consciousness best suited to its environment.
posted by Eideteker at 12:56 PM on January 30, 2008


Laugh_track:

I had the same impression. Qualia are theoretical entities used to explain certain aspects of consciousness. When we experience a color, what are we experiencing? Qualia, as an answer, is a stand-in for some future answer. In short, the suggestion of qualia is a methodological assertion (and a good one at that), not a metaphysical/existential one.

Another point I'd like to bring up is the suggestion that consciousness is somehow "unemployed." It seems fairly clear that in terms of evolutionary competition, consciousness is a huge advantage. Without some level of consciousness, there would be very little in the way of abstract thinking. Without abstract thinking human beings would be poorly equipped to deal with many (most) environmental threats.
posted by elwoodwiles at 12:57 PM on January 30, 2008


"I may shock you by what may seem the naivety of my conclusion (I've shocked myself): I think the plain and simple fact is that consciousness—on various levels—makes life more worth living.

We like being phenomenally conscious. We like the world in which we're phenomenally conscious. We like ourselves for being phenomenally conscious. And the resulting joie de vivre, the enchantment with the world we live in, and the enhanced sense of our own metaphysical importance have, in the course of evolutionary history, turned our lives around."
It amazes me that so many people who should know how evolution works get it backwards. The chemical rewards we get for having sex, being conscious, etc., like the chemical punishments we get for not having sex, touching fire, etc., are not "reasons" for these things. They are biological carrots on sticks that have been locked into our genes by ancestors who happened to exhibit behaviors that enhanced their chances of survival. Sure, we have sex because it gives us pleasure, but that is not the "reason" for sex, any more than enjoying a drink of water when we are thirsty is the "reason" for water. And the "reason" for consciousness may turn out to be not much different from the "reason" for water.
posted by weapons-grade pandemonium at 12:58 PM on January 30, 2008 [2 favorites]


Each being develops its own consciousness best suited to its environment.

Just wanted to agree with that.
posted by elwoodwiles at 1:04 PM on January 30, 2008


Consciousness enables us to wonder about ourselves; thus, this engaging thread; and the article in Seed; and so on.

Consciousness is an adaptation fed by the resonance between us and "what we are" - consciousness invents "what we are", too.

Frankly, I don't think we'll ever know what consciousness is, but will evolve to ever more sophisticated ways of using ourselves (because "consciousness is us"). Like Wittgenstein said - something like "wonder THAT you are", because trying to get to the ground of *exactly* who, what, and why we are is something we'll probably never get to the ultimate bottom of.

That's where I am so far, but who knows where consciousness will take me. Sit back, and enjoy the wonder. Wonder is the gift of consciousness.
posted by MetaMan at 1:13 PM on January 30, 2008


> Oops: "Trying to explain green to a blind man is a puzzle that's easier to understand when
> you acknowledge green doesn't exist beyond your own mind."

Q: If a tree falls in the forest...?
A: No.

That puzzles many and makes some people angry. On the other hand it's never been clear to me why it has to be "no one there to hear it," rather than nothing. Any creature with ears should be able to turn those vibrations into a heard sound, surely. If Schrodinger's cat is walking in the forest...
posted by jfuller at 1:21 PM on January 30, 2008


Careful jfuller. You're implying that there are "unheard" sounds.
posted by weapons-grade pandemonium at 1:23 PM on January 30, 2008


Eh. English thinkers will never let go of consciousness because it's the last best hope for legitimizing the 'self'. Even if qualia were the product of some mysterious Sentition Algorithm in the brain it would not explain awareness or even understanding of the algorithm's output. That is, even if the brain "enhances" experiences by adding in phenomenal hints like Space and Time there's still the little question of why/how/where we understand Space and Time. This is just a repackaging of the Chinese Room Argument. (Unless you're a mystic and believe consciousness is an accident and the site of the accident happens to be the brain. Then you could get away with saying that the input of the algorithm is 'The Universe' and the output of the algorithm is 'The Universe' and then take some drugs and everything would be alright.)
posted by nixerman at 1:35 PM on January 30, 2008


What continues to fascinate is that we use consciousness to create questions about consciousness, and then, in turn, to answer those questions, which lead to more questions.

Getting to the "biological bottom" of consciousness will only take us so far, because that "biological bottom" itself is generated from the subatomic dynamics present in our universe - ironically, a "universe" that is mirrored by every one of our respective consciousnesses *differently*.

Thus, it occurs to me that even if we "figure out" how consciousness works, we'll never quite *understand* consciousness.

In a very real way, as others have already proposed, I don't think we have the capacity to ask the right questions (as if positing a question that requires an answer should even be considered "end game" in this quest), much less be able to "figure out" something (consciousness) that we literally change the definition of every time we use it, or attempt to define it.

Somehow, we have to get beyond thinking of consciousness as a "thing". It's more than that; it's a dynamic, evolving entity that *we evolve*, even as we try to figure it out.
posted by MetaMan at 1:36 PM on January 30, 2008


Decent Wikipedia article on qualia.

The problem as always is with definition. What is consciousness, and how is it measured? Why do we assume that all people's consciousness is equal if we readily acknowledge that some people are more intelligent or perceptive than others? I can communicate green, but can I deconstruct green objectively and communicate that objective information to you and call it 'green'?

I have always thought of consciousness with the following analogy. Consider the possibility that there is a pattern that you can see, i.e., a visual pattern which reflects light that hits your eye, but because of the particular arrangement/complexity of the pattern, you cannot comprehend it or remember it. Phrased differently, does there exist a pattern which can be apprehended by our visual senses, but not comprehended by our minds regardless of the time and attention focused on that pattern? Perhaps it cannot be comprehended because our minds are physically and chemically structured in a way that renders processing of this pattern impossible. Or perhaps the way our minds adapt to process the world around us when we are very young makes it impossible to process this completely artificial and wholly unnatural pattern. The reason is irrelevant to the example. All that is needed is to assume that such a pattern can exist.

If such a pattern exists, then it follows that the human mind which cannot comprehend it also cannot create it deliberately (it could be created accidentally).

But now suppose some other entity is capable of comprehending this pattern and creates one. In fact, suppose that this is but one of a class of patterns that are similar in that they cannot be comprehended by us but can be by this other entity.

These patterns would appear to us as unintelligible noise, but to the entity as something meaningful. Just as our creations and communications are meaningless gibberish to animals, so would these patterns be to us. Is the entity a higher consciousness than us? Or is it better to say that what we perceive and discard as meaningless noise is in fact at the bounds of our consciousness?
posted by Pastabagel at 1:42 PM on January 30, 2008


Isn't consciousness a social thing?
Something that exists only when it is shared?

For example, the concept of qualia doesn't seem to work alone for a color: "red" is first a name for a shared perception; it becomes an individual perception only in reference to the previously agreed consensus.
posted by bru at 1:56 PM on January 30, 2008


Or is it better to say that what we perceive and discard as meaningless noise is in fact at the bounds of our consciousness?

Well this is the problem. But it's only a problem if you're a Platonist (and most everybody is because of Rousseau) and you believe that ideas really exist and the mind perceives ideas. But there's no reason to assume that the mind perceives at all. The statement 'I see a red tomato' makes sense but it doesn't stand up under the most rigorous materialist analysis. To make it make sense you have to wave your hands really vigorously and introduce the equivalent of synthetic a priori judgments.

The alternative to this is to imagine consciousness as an extension of the purely physical ie motion. Not like, but strongly equivalent to motion. The brain is taken to be like a bouncy red ball that has special properties like texture. When forces act on the ball and it moves in just such and such manner this is taken to be consciousness. If this is the case the brain never actually 'comprehends' anything in the Kantian sense, instead it just exhibits a strong tendency to move in certain recognizable patterns under the influence of certain forces. But nobody really likes this argument because it opens the door to arguing that everything and anything that moves -- rocks, electrons, and teenagers -- is 'conscious' insofar as it moves in patterns or, alternatively, it suggests that we're all philosophical zombies (one of my favorite terms ever). So everybody insists that the ball must somehow be aware of its motion and the forces acting upon it.
posted by nixerman at 2:16 PM on January 30, 2008


Nice one, PastaBagel. I think I agree.

MetaMan > Consciousness enables us to wonder about ourselves.
Bru > Something that exists only when it is shared?

A certain type and level of intelligence, and an ability to think and communicate using abstract symbols, enables us TO FORMULATE AND COMMUNICATE ideas and propositions about ourselves with other members of the same species who have learned the same communication system and local syntax (aka language). A certain amount of smugness often causes us to mistake the ability to discuss this miracle, as the miracle itself.

But you can't tell me that my cat isn't self-aware, or doesn't perceive me as another self-aware creature. Cats are also acutely aware that they are cats; ever watch one react when he/she sees another cat across the street? And doesn't react when there's a cat on the TV?

I can't say for certain that my cat does or doesn't wonder about herself. I suspect that's mainly because she can't or won't discuss it with me. I do know that she gets concerned if we don't feed her at the appointed hour. She's visibly pleased when my wife comes home, even if I'm already there and have fed her. We know that she dreams. You HAVE to be self-aware to dream, don't you?

So, yes I'm convinced that my cat is conscious. Does she contemplate the fact that she's conscious? I dunno. Does she make plans and formulate thoughts about the future? Probably not, though she's pretty quick at lunging through the front door if my wife leaves it open a moment too long. And not when I open the door, cos I suspect she recalls that I'm faster with the foot. And not just any door, either. All of which might suggest she's contemplated the act at some point beforehand.

So I think we have to be less self-absorbed and more broadminded about consciousness. As I said before and others have underscored, I don't think we are the last word in consciousness, either.

(nixerman - this red ball isn't so much worried that it's been called a red ball, or that it might actually just be a red ball. Because that isn't going to reduce by one tittle the anticipation I have for that cold beer in the fridge when I get home, and the statistically significant likelihood that the actual event will occur, and will be perceived as occurring more or less as anticipated. Unless some rotter already took the last beer)
posted by Artful Codger at 2:23 PM on January 30, 2008


The alternative to this is to imagine consciousness as an extension of the purely physical ie motion.

There is another alternative, which is to define the mental processing of the brain with precision down to the neuron level, so that the propagation of thoughts and sensory input can be modelled down to the neuron level. In other words, an MRI machine that could resolve every neuron and synapse.

Then you have a precise, rigorous model that can be compared to other brains (animals) and could make predictions about hypothetical brain structures. Obviously this is a bit off.

But I don't think philosophy is going to illuminate consciousness well because it is very much a product of that consciousness and little else (as opposed to the aforementioned MRI machine which generates data). To understand something it is necessary to understand what it is not and where its boundaries are. But how can a consciousness (or any process) generate a thought about thoughts it cannot generate?
posted by Pastabagel at 2:27 PM on January 30, 2008


(as opposed to the aforementioned MRI machine which generates data).

data which can only be apprehended as a product of consciousness and little else. your criticism of conscious reflection on consciousness can be applied to empirical approaches to the problem of consciousness, too. (can we apprehend a data point that can't be consciously apprehended? no, so our apprehension of data is limited by consciousness from the get-go.)
posted by saulgoodman at 3:08 PM on January 30, 2008


Smart rock to the stupid rock: you're dumb as a box of humans.
posted by valentinepig at 3:26 PM on January 30, 2008


So it's such a hard problem how a lump of matter could have "qualia" like the sensation of seeing red, but then why should it be much less of a problem for a lump of matter to be able to form the "myth" that it in fact has those qualia and is conscious?

Views of consciousness have been prodded and somewhat distorted by the hopes of "strong AI". If what the brain does to produce thought and consciousness is like what computer hardware does when it runs software, then "red" or "blue" are placeholders like "0" and "1". They only obtain meaning from how they are used, but many people intuitively resist this because there seems some special quality of redness above and beyond the information that it's not blue.

Years and years ago I was with Hofstadter and Dennett on strong AI, but now, while I still view some of Searle's views as absurd, I'm tending to disbelieve the idea that one can abstract such descriptions as "thought" and "consciousness" and "intelligence" from their biological basis and view them as resulting from algorithms. An algorithm that provided a high-fidelity simulation of a neural network, down to subcellular processes, would still be a simulation of a brain and not a brain itself.
posted by Schmucko at 4:04 PM on January 30, 2008


For I will consider my Cat Jeoffrey...
posted by Crabby Appleton at 4:13 PM on January 30, 2008 [1 favorite]


turtles all the way down: Julian Jaynes' The Origin of Consciousness in the Breakdown of the Bicameral Mind

No offense, but I'd like to point out, as someone who's put a lot of time and effort into trying to understand both Homer and consciousness, that Julian Jaynes lacks rigor as a psychologist, a classicist, and a biologist, and that none of his theories (for example, the bicameral mind) have much grounding in reality.

nixerman: But it's only a problem if you're a Platonist (and most everybody is because of Rousseau) and you believe that ideas really exist and the mind perceives ideas. But there's no reason to assume that the mind perceives at all. The statement 'I see a red tomato' makes sense but it doesn't stand up under the most rigorous materialist analysis. To make it make sense you have to wave your hands really vigorously and introduce the equivalent of synthetic a priori judgments.

There's no way you can make a "rigorous materialist analysis" of whether the mind perceives. If it does, it does. If it doesn't, then you won't know anything because of the results of that analysis, nor of any analysis. And this is aside from the fact that the notions of "extension" and "material" might be up-to-date by Cartesian terms, but are certainly not in line with where science is now. We're at the point now where picturing "matter" as "extended" is clearly scientifically imprecise. Picturing much of anything at all is imprecise, in fact.

Pastabagel: There is another alternative, which is to define the mental processing of the brain with precision down to the neuron level, so that the propagation of thoughts and sensory input can be modelled down to the neuron level. In other words, an MRI machine that could resolve every neuron and synapse. Then you have a precise, rigorous model that can be compared to other brains (animals) and could make predictions about hypothetical brain structures. Obviously this is a bit off.

I agree that it's 'a bit off.' The notion of mapping the motion of every neuron, especially when they approach the molecular level (as they sometimes do), is astronomically difficult. At the atomic level, it's been proven to be impossible. In fact, in general, this prompts a pretty thorough rethinking of what we mean by "matter," since "physical" doesn't really have the precise definition it did two hundred years ago.

But I don't think philosophy is going to illuminate consciousness well because it is very much a product of that consciousness and little else (as opposed to the aforementioned MRI machine which generates data). To understand something it is necessary to understand what it is not and where its boundaries are. But how can a consciousness (or any process) generate a thought about thoughts it cannot generate?

I've said here before that science is a branch of philosophy. But you describe the problem well when you say that we have to find boundaries to define the thing. The difficulty is that it's almost impossible to define the limits of consciousness, as saulgoodman points out above. You cannot point to a brain and say "that is consciousness" and turn to a rock and say "that is not consciousness" unless you've exhausted the definition of what consciousness is, and it's very hard even to point to the boundary between the two in the first place. I think the only thing you can do is argue that thinking about things in terms of "material" is somewhat wrong, and that consciousness is a thing that arises in certain conditions, is dependent upon those conditions, and maintains itself by continuing to exist, and that in this way perception of the world is possible.

Or else abandon science, which is feasible I suppose, and turn to religion.
posted by koeselitz at 4:32 PM on January 30, 2008 [1 favorite]


I am trying to get my pet Caenorhabditis elegans to understand Newton's laws of motion, at least well enough to predict where a ball will land, but its 300 or so neurons don't seem to be physically capable of computing this problem.

Maybe I should teach it to count to 100 instead?
posted by Dr. Curare at 5:07 PM on January 30, 2008


If cats aren't conscious, then EXPLAIN THIS!
posted by LordSludge at 5:36 PM on January 30, 2008


Dr Curare: it's a poor teacher who blames the pupil.

My cat can predict where a ball will land. However she steadfastly refuses to supply sufficient proof that she was able to calculate this from first principles.

(And yes the beer was very nice, thanks for asking)
posted by Artful Codger at 5:51 PM on January 30, 2008


I heartily recommend The Unity of Body and Mind by Lothar Bickel. It proceeds from the Spinozist viewpoint that thought is the body's awareness of its own motion.
posted by No Robots at 8:38 PM on January 30, 2008


It seems a lot of the consternation around consciousness is the usual 'kicking up dust and complaining one cannot see.' Consciousness is simply the awareness of one's self and one's environment and having a picture that describes the two working in tandem. "Each being develops its own consciousness best suited to its environment" seems to sum things up quite nicely.

In short, in very short, consciousness is most likely epiphenomenal - it's an outgrowth of a series of cognitive abilities that enable the human animal to best deal with its environment. That being the case, human consciousness is necessarily limited. We can only understand the things we need to understand in order to respond to and act on our biological imperatives.

I had a tighter ending, but it's time to feed my body some food and wine.....
posted by elwoodwiles at 8:50 PM on January 30, 2008


"I may shock you by what may seem the naivety of my conclusion (I've shocked myself): I think the plain and simple fact is that consciousness—on various levels—makes life more worth living.

We like being phenomenally conscious. We like the world in which we're phenomenally conscious. We like ourselves for being phenomenally conscious. And the resulting joie de vivre, the enchantment with the world we live in, and the enhanced sense of our own metaphysical importance have, in the course of evolutionary history, turned our lives around."


I suppose it's merely one of those wonderfully irresolvable and involuted, Klein bottle-like self-references I've come to expect from MetaFilter that he seems to commit the homunculus fallacy here by failing to notice that the 'we' he refers to as enjoying "being phenomenally conscious" can scarcely be anything other than the conscious selves he is trying to explain by reference to that 'we'.
posted by jamjam at 1:17 AM on January 31, 2008


koeselitz: "Julian Jaynes lacks rigor as a psychologist, a classicist, and a biologist, and that none of his theories (for example, the bicameral mind) have much grounding in reality.

There was a lot of rather uninformed initial criticism of Bicameral Mind when it first came out. Have you seen:

Dennett: Julian Jaynes’s Software Archeology
Julian Jaynes Revisited
Reflections on the Dawn of Consciousness: Julian Jaynes's Bicameral Mind Theory Revisited

And since Jaynes, we have the new fields of archeo-psychology and cognitive archaeology:

Psychology of Mental Fossils, toward an Archeo-Psychology
Cognitive Archaeology


On the general topic: self link.
posted by psyche7 at 10:45 AM on January 31, 2008 [2 favorites]


I'm with nixerman and boots77; I don't think this Humphrey guy is "Questioning Consciousness": I think he's misunderstanding the quandary of consciousness in the first place. From the article:

We like being phenomenally conscious. We like the world in which we're phenomenally conscious. We like ourselves for being phenomenally conscious.

This seems to me, at worst, to be a set of incoherent statements and, at best, a circular definition; isn't to like something, in his context, an operation of consciousness itself as much as experiencing the qualia of a red tomato is?

And nixerman, thanks for mentioning p-zombies, that's a handy term I hadn't come across before.
posted by XMLicious at 4:02 PM on January 31, 2008




This thread has been archived and is closed to new comments