What's it like to be Peter Hacker?
October 25, 2010 4:30 AM   Subscribe

 
Yeah, I wonder what it's like for him to be him.
posted by goodnewsfortheinsane at 5:03 AM on October 25, 2010 [11 favorites]


“The main barrier is the scientism that pervades our mentality and our culture. We are prone to think that if there’s a serious problem, science will find the answer. If science cannot find the answer, then it cannot be a serious problem at all. That seems to me altogether wrong. It goes hand in hand with the thought that philosophy is in the same business as science, as either a handmaiden or as the vanguard of science. This prevailing scientism is manifest in the infatuation of the mass media with cognitive neuroscience. The associated misconceptions have started to filter down into the ordinary discourse of educated people. You just have to listen to the BBC to hear people nattering on about their brains and what their brains do or don’t do, what their brains make them do and tell them to do. I think this is pretty pernicious – anything but trivial.”

If one replaces the word "scientism" with "liberalism", this would sound like it came from Rush Limbo.

The voices in my head command me to ignore it.
posted by three blind mice at 5:13 AM on October 25, 2010


In my book, it's the mark of a clever but journeyman philosopher to spend more time explaining why everyone else is wrong than making clear what you're proposing.

From what I can read, his take on consciousness isn't that groundbreaking - it's an evolutionary trait. OK. But saying that it comes down to evolved brain structure doesn't advance philosophy of the mind an awful lot; the why/how of perceiving stuff just pushes the mystery somewhere else.

It's throwing up one's hands and saying that an undefined scientific endeavor will explain consciousness in evolutionary terms. In short, he talks about philosophical frameworks and the need to rebuild, but he hasn't proposed any meaningful idea of what he hopes to build in the place of what he sees as the gap between dualists and monists.
posted by MuffinMan at 5:14 AM on October 25, 2010 [2 favorites]


Yes, this does seem to come off as more of a "nobody is paying attention to the Very Important Thing that I am saying" than anything else. That being said, I'm going to have to side with the neuroscientists on this one. I haven't seen any philosophical treatment for brain damage.
posted by Old'n'Busted at 5:19 AM on October 25, 2010 [2 favorites]


Philosophy does not contribute to our knowledge of the world we live in after the manner of any of the natural sciences. You can ask any scientist to show you the achievements of science over the past millennium, and they have much to show: libraries full of well-established facts and well-confirmed theories. If you ask a philosopher to produce a handbook of well-established and unchallengeable philosophical truths, there’s nothing to show.

He doesn't know many logicians, ethicists or mathematicians.

Likewise, he doesn't know many string theorists or sociologists or mathematicians, if he's looking for those lovely volumes of "facts."

Come to think of it, he understands neither philosophy nor science clearly enough to champion or challenge either without sounding like a total crank.

“But if anyone thinks that I am completely mistaken, I’d like them to explain to me why. If they cannot show that my arguments are wrong, they should admit the errors of their ways and withdraw from the field! That’s the challenge.”

Mmmmyep.

Don't get me wrong, I think that consciousness studies are barking up the wrong tree as well, albeit for different reasons, but I don't think this is the right guy to challenge anyone except the darn kids standing on his lawn.
posted by Slap*Happy at 5:21 AM on October 25, 2010 [1 favorite]


If one replaces the word "scientism" with "liberalism", this would sound like it came from Rush Limbo.

The voices in my head command me to ignore it.


What an odd form of argument.
posted by OmieWise at 5:28 AM on October 25, 2010 [7 favorites]


The whole monist/dualist thing goes away (one might say it dissolves) if you know a little about information theory, and it's clear from the essay that neither Hacker nor the people he's criticizing do.

Information is a real thing that can be stored, retrieved, and measured, and it is not itself made of matter or energy; but in the universe as it seems to work, information does depend on matter and energy in order to be transmitted and stored. Consciousness is complicated because it consists of a large amount of information being processed by a complicated piece of matter (and interacting with a large amount of environmental information in the bargain). The brain without the information would be vegetative; the information without the brain cannot interact or display the qualities of consciousness.

Philosophy goes nowhere against questions like this for the same reason Leonardo da Vinci probably wouldn't get very far if you dropped him in the bowels of a nuclear power plant and asked him to explain where the power comes from.
posted by localroger at 5:32 AM on October 25, 2010 [4 favorites]
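
A quick aside on the "measured" part of that claim: Shannon entropy is the standard way information content is quantified in bits, and it depends only on the statistics of the symbols, not on the physical medium carrying them. A minimal Python sketch (the toy messages below are made up purely for illustration):

    import math
    from collections import Counter

    def entropy_bits(message):
        """Shannon entropy of the symbol distribution in `message`, in bits per symbol."""
        counts = Counter(message)
        total = len(message)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    print(entropy_bits("aaaaaaaa"))  # 0.0 -- one symbol, no uncertainty, no information per symbol
    print(entropy_bits("abababab"))  # 1.0 -- two equally likely symbols, one bit each
    print(entropy_bits("abcdabcd"))  # 2.0 -- four equally likely symbols, two bits each

The same number comes out whether the symbols live in ink, neurons, or RAM, which is the sense in which information can be measured without being made of any particular stuff.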


I haven't seen any philosophical treatment for brain damage.

You've never seen a psychology textbook?

Science shows that the "talking cure" works better than drugs to cure depression - does talking alter the head-meat? Or the construct inhabiting the head-meat? (Cue long, heated arguments ranging oh, so far afield!)
posted by Slap*Happy at 5:32 AM on October 25, 2010


He doesn't know many logicians, ethicists or mathematicians.

That's a bit unfair. There may well be heaps of ethical or mathematical truths being produced by philosophers, but they aren't (in the sense that Hacker would use the term) philosophical truths.

Likewise, he doesn't know many string theorists or sociologists or mathematicians, if he's looking for those lovely volumes of "facts."

Sociologists and mathematicians aren't natural scientists by anyone's reckoning.

he understands neither philosophy nor science clearly enough to champion or challenge either without sounding like a total crank

He's the Hacker part of Baker and Hacker. I think he knows a bit about philosophy.

Seriously, give him a chance. Read his book, not the interview.
posted by GeckoDundee at 5:32 AM on October 25, 2010 [5 favorites]


I haven't seen any philosophical treatment for brain damage.

I have, and I have a damaged brain. A while back I had a bleed in my right occipital lobe which required brain surgery. My vision has been permanently altered by the event. Neurosurgery was able to stem the bleed and (with luck) prevent it from recurring, but it was not able to do anything about the distorted vision or the occasional visual migraines that appear to be a side effect of either the bleed or the surgery. Zen, on the other hand, has been very useful in helping me to function with what I have now -- dealing with the distortion and a variety of "side effects" (depression, coping, that sort of thing). You can argue that these are two totally different things, but, for me, they are all part of the same event. The scientific elements helped save my vision and (possibly) my life, but the philosophical elements helped build a life that remains worth living. They aren't really in competition.

And having a seizure gave me some empirical proof that our "selves" are, indeed, constructs that we build out of stimulus and memory, moment after moment. Zen has held to that idea for a while, neuroscience seems to be getting there in its own time.
posted by GenjiandProust at 5:47 AM on October 25, 2010 [26 favorites]


“Merely replacing Cartesian ethereal stuff with glutinous grey matter and leaving everything else the same will not solve any problems. On the current neuroscientist’s view, it’s the brain that thinks and reasons and calculates and believes and fears and hopes. In fact, it’s human beings who do all these things, not their brains and not their minds. I don’t think it makes any sense to talk about the brain engaging in psychological or mental operations.”

After reading this article, I think that this guy may be some sort of a Christian apologist, because the whole essay seems to say that neuroscience findings take the magic away from consciousness. Or something like that. Why doesn't mainstream philosophy value clarity in writing?

The other thing I'd like to say is that this poor gentleman probably isn't as smart as he thinks he is, because mainstream philosophy is principally based on the writings of exclusively white, European men, and therefore the "understanding" (to take Hacker's term) is incomplete because it does not take into account the true range of human experience. His epistemological tradition is inherently flawed.

Or, maybe I'm just reading this essay the wrong way... where's my bong?
posted by fuq at 5:48 AM on October 25, 2010 [1 favorite]


Wittgensteinian badman.
posted by Not Supplied at 5:56 AM on October 25, 2010


I'd second GeckoDundee and suggest reading Hacker's own writing rather than that of an interviewer. There's really not enough in that interview to use as the basis for any useful criticism.
posted by le morte de bea arthur at 5:59 AM on October 25, 2010


This thread is kind of the worst of Metafilter: A bunch of people who don't know what they're talking about dismissing the work of an expert in his field because they disagree with the gloss of his work they haven't really read. This seems to happen most often (on Metafilter) with people and views that challenge the idea that natural science/Richard Feynman can explain all that needs to be explained about human existence. (Which is not at all the same thing as suggesting that religion is necessary to explain the human experience.)

Check out a list of Hacker's books. Scroll to his recent books at the bottom, and you'll see that his examinations of the relationship between neuroscience and philosophy are published by reputable presses (Blackwell, Columbia), and that his arguments are taken seriously enough that Dennett and Searle feel the need to reply.
posted by OmieWise at 6:00 AM on October 25, 2010 [42 favorites]


"You can ask any scientist to show you the achievements of science over the past millennium, and they have much to show: libraries full of well-established facts and well-confirmed theories. If you ask a philosopher to produce a handbook of well-established and unchallengeable philosophical truths, there’s nothing to show.

He doesn't know many logicians, ethicists or mathematicians.
"

Philosophy of language has made some huge strides, and for that matter so has metaphysics (do you know anyone who still argues for Scholastic forms?).

Gecko, reading the article I see that he just thinks of philosophy as a clarifying discipline, but doesn't that just arbitrarily cut off well-established fields of philosophy to suit his preferred definition?


"if you know a little about information theory, and it's clear from the essay that neither Hacker nor the people he's criticizing do."

The world as interpreted information is not a new position to philosophers. David Chalmers has ideas in that direction.
posted by oddman at 6:00 AM on October 25, 2010 [1 favorite]


The whole monist/dualist thing goes away (one might say it dissolves) if you know a little about information theory, and it's clear from the essay that neither Hacker nor the people he's criticizing do. [...] Information is a real thing that can be stored, retrieved, and measured that is not itself made of matter or energy, but in the universe as it seems to work information does depend on matter and energy in order to be transmitted and stored.

So assuming that information is some kind of fundamental constituent part of reality, you've now got a tripartite theory of consciousness. That's one way around the monist/dualist division I suppose, but not a very satisfactory one. Information as stuff also does nothing to help with the "aboutness" of consciousness, if you're one of the people who are bothered by that.
posted by iivix at 6:05 AM on October 25, 2010


It's not called the "hard problem" for nothing you know.
posted by iivix at 6:06 AM on October 25, 2010 [4 favorites]


Well, some argue that information is the only constituent of reality and that physical and mental events/things are nothing more than different ways that information interprets/interacts with other information. Which, of course, just makes them monists of a different flavor. Also, it still maintains mind-body dualism. It's just a different kind of dualism. And, as iivix points out, it doesn't do much to explain the phenomenology of consciousness.
posted by oddman at 6:09 AM on October 25, 2010


OmieWise: you'll see that his examinations of the relationship between neuroscience and philosophy are published by reputable presses (Blackwell, Columbia), and that his arguments are taken seriously enough that Dennett and Searle feel the need to reply.

But by his own admission, "philosophers of mind and cognitive neuroscientists are talking sheer nonsense", and yet those folks also get their books published by Blackwell et al. and have back-and-forths in journals. So, if Hacker were being consistent, I'd think he would agree that the extent of his scholarship and pedigree doesn't imply anything about the competence of his work.
posted by Gyan at 6:10 AM on October 25, 2010 [3 favorites]


I think the fact that so many Mefites with no apparent qualifications in philosophy think they can criticize Peter Hacker for not really knowing anything about philosophy actually goes to the heart of some of the things he is saying about the nature of philosophy.
posted by game warden to the events rhino at 6:10 AM on October 25, 2010 [6 favorites]


he would agree that the extent of his scholarship and pedigree don't imply anything about the competence of his work

Ha. Good point. But the dismissiveness of various commenters here isn't based, as far as I can tell, on a well-worked-out argument for why Peter Hacker's reputation in the world of philosophy shouldn't be relevant to his argument. Rather, it's based on an ignorance of Peter Hacker's reputation in the world of philosophy.
posted by game warden to the events rhino at 6:17 AM on October 25, 2010 [2 favorites]


The above quote is by the article author, a Hacker quote is below with my emphasis:

"I’m not accusing paid-up members of the so-called consciousness studies community of bad faith – I’m sure they are just deluded – but the result of their confusion is that we’re bringing up a whole generation of people to think in a thoroughly muddled way, to have hopes and expectations which are totally absurd, and to concentrate on things which are just incoherent. It’s literally a total waste of time."
posted by Gyan at 6:18 AM on October 25, 2010


No, it's a feminist criticism of the world of philosophy as a whole.
posted by fuq at 6:21 AM on October 25, 2010


Rather, it's based on an ignorance of Peter Hacker's reputation in the world of philosophy.

We know. The philosophy club is restricted. Members only.

The quote put up by Gyan is a perfect example of why I dismiss this article. "I'm not accusing anyone who disagrees with me of bad faith, I'm sure they're just deluded..." is precisely the sort of deep philosophical argument that confuses my tiny, pedestrian brain.
posted by three blind mice at 6:28 AM on October 25, 2010 [2 favorites]


having a seizure gave me some empirical proof that our "selves" are, indeed, constructs that we build out of stimulus and memory, moment after moment

You and Jill Bolte Taylor both.
posted by flabdablet at 6:32 AM on October 25, 2010 [2 favorites]


oddman: doesn't that just arbitrarily cut-off well established fields of philosophy to suit his preferred definition?

Well, we're talking about someone who reckons analytic philosophy was ended by Quine (see W's Place in C20th Analytic Philosophy), so he has some fairly narrow definitions. But he's really just following the Wittgensteinian view that philosophy is "therapeutic". (I can't remember who came up with that label, Michael Williams maybe?). We don't really have problems like scientists do, we just have mistakes in talking about things. Philosophy's job is to clarify our meanings and then the "problems" will dissolve. That view, in broad brush-strokes at least, comes over well in the linked interview.

Many philosophers are working in ways that this view wouldn't approve of. Many of them are working in cognitive science. So, given that Hacker seems to think that analytic philosophy ended over half a century ago, it might look like he just needs to widen his definition.

It does in fact look a lot like the "get off my lawn" position Slap*Happy ascribes to him above.

But he'd probably say that he's not defining philosophy narrowly, just pointing out Wittgenstein's great insight: philosophical problems will keep piling up if we don't see them for what they are, namely confusions about meanings. So contemporary philosophy is, on his view, headed down a dead end. It's probably still philosophy, just not "analytic philosophy", properly understood.
posted by GeckoDundee at 6:33 AM on October 25, 2010


I want to make sure I understand this:

We judge scientists on whether they produce theories which predict testable repeatable experimental results.

We judge philosophers based on how much they've been published and how popular they are at dinner parties.
posted by empath at 6:38 AM on October 25, 2010


We know. The philosophy club is restricted. Members only.

This is a ridiculous, bitter, cynical, and self-serving statement. It may make you feel better about your own lack of desire to seriously engage these topics, but it isn't in the least true, or, rather, is only true to the extent that all specialties are restricted clubs. One cannot perform surgery, lawyering, social work, or bridge design without a license. The debates of serious practitioners in almost all disciplines are restricted to (and perhaps nonsensical to anyone but) those who first put in the groundwork to understand the terms and the history. This is the price of specialization. One might decide that because philosophers are "just" thinking, their "exclusion" of everyone with an internet connection and a copy of The Matrix from serious consideration is elitist, but one would be wrong to do so.

I assume that you don't feel excluded because you don't have unfettered access to a scanning electron microscope (if you don't).
posted by OmieWise at 6:43 AM on October 25, 2010 [12 favorites]


The world divides into two kinds of people: those who divide others into two kinds of people and those who don't.

Mutatis mutandis, the same could be said of all modern theories of Mind.

Everything that makes us human is the consequence of a hardwired faculty or capacity subject to empirical discovery and description. We're just chemicals and meat. Who write books and songs and make war and fall in love and post comments to websites full of erudite musings, which means any scientific problem is a problem of consciousness at some level.

Dreams of a unified science tell us much about the dreamer, and nothing about science.
posted by fourcheesemac at 6:50 AM on October 25, 2010


Chimeras are right dangerous, with their 9 hit dice, multiple attacks, and fire breathing ability.
posted by Mister_A at 6:53 AM on October 25, 2010 [8 favorites]


I found the article unreadable and unintelligible. Can someone tell me what the argument is?
posted by humanfont at 6:55 AM on October 25, 2010


Yeah, OmieWise, but if philosophy is done properly it should be accessible to everyone. (Following Plato's wide view of what philosophy is rather than, say, Hacker's much narrower one).

Maybe this is part of the reason we don't do philosophy so well? The site creates an expectation of some level of knowledge. Most of us wouldn't jump into a thread on, I dunno, kabuki theatre, because we don't have enough knowledge. But we happily comment on philosophy threads (as "Socrates" would want), and once we're here we feel we should sound like experts?
posted by GeckoDundee at 6:55 AM on October 25, 2010


Check any AskMe thread on computers or language translation or emergency brain surgery; no one fears to jump in with whatever they know.
posted by fourcheesemac at 6:57 AM on October 25, 2010


The point is not that it's a "restricted" club. The point is, don't base your argument on suggesting that people like Peter Hacker are ignorant of philosophy, as a couple of people did upthread. That's all. It makes you look stupid.

His credentials aren't substantively relevant to the validity of his argument, of course, but if you're going to comment as if he didn't have any credentials, don't be too surprised when others point out that he does.
posted by game warden to the events rhino at 7:13 AM on October 25, 2010 [2 favorites]


>Yeah, OmieWise, but if philosophy [theoretical physics?] is done properly it should be accessible to everyone.

Wow, MeFi showing its blind spots today.
posted by Joseph Gurl at 7:17 AM on October 25, 2010


What's it like to think in a thoroughly muddled way?
posted by Obscure Reference at 7:17 AM on October 25, 2010 [1 favorite]


Yeah, OmieWise, but if philosophy is done properly it should be accessible to everyone.

GeckoDundee, I don't think you and Omiewise really disagree.

I'd suggest that a well-written philosophical text ought to be accessible to anyone with the proper background. In some cases that background can be provided by the text itself, but not all texts do so (and we shouldn't think that a failure to do so is necessarily a serious flaw). For example, an essay on some issue in meta-ethics can't really be expected to explain every metaphysical and ethical concept it mentions in a way that an intelligent layman could understand (if it did, it would quickly become a rather large series of books), but it should be written in a way that any philosopher with a halfway decent education can make sense of the general argument (at least).


But then, I think the same could be said for a physics text. There is no reason, in principle, that a text on string theory, legal issues, or sociology cannot be accessible.
posted by oddman at 7:19 AM on October 25, 2010 [1 favorite]


Science shows that the "talking cure" works better than drugs to cure depression - does talking alter the head-meat? Or the construct inhabiting the head-meat? (Cue long, heated arguments ranging oh, so far afield!)

You didn't get the long, heated arguments, but your comment did remind me of this.
posted by rocket88 at 7:20 AM on October 25, 2010


"I don't get philosophy. It's the study of Deep Thought, right?.....Studying Deep Thoughts people other than you came up with is like studying their stool to figure out how to cook what they ate. It may be possible, but it's not worth what you have to wade though."

-Tim Sandlin
posted by jonmc at 7:21 AM on October 25, 2010 [3 favorites]


It seems to be a rule that when anyone denounces philosophy they are about to offer us some.

Hacker's main point (according to the interview) seems to be a robust scepticism about qualia (the redness of red). That is a legitimate position within current philosophy of mind, which Dennett and others would share.

Here we don't get much more than assertion. He mentions Nagel, but wilfully misunderstands the phrase 'something it is like' as literally invoking a comparison. This is legitimate as rhetoric (nothing wrong with a bit of rhetoric), but it's not really an argument. There are several other well-known arguments for the existence of qualia - inverted spectrum, Mary the color scientist, and so on; perhaps he addresses these in his book. The real claim in respect of qualia is that seeing a red dot is something more than just acquiring the information that the dot is red; there's also the actual experience of redness.

This qualia business, the 'hard problem' does get a lot of attention these days, but it's not the whole problem of consciousness. There is obviously the 'easy' problem, which isn't easy, and several others, notably that of intentionality. Again, I dare say he covers these in his books.

On the basis of the interview I cannot quite make out what his issue with brain versus mind is. He seems almost to be making the same kind of vacuous point as someone who insisted you didn't pay your fare with a dollar bill, you paid it with money: hey, bub, if that piece of paper was a dollar bill without being money, see how far it'd get ya! But that's probably quite unfair. It's true that you do see idiotic things said in the press along the lines of 'scientists show that learning causes changes in the brain!' - um, yeah - but that sort of thing doesn't trouble serious philosophers so far as I can tell.

Again, in the books, no doubt, of which I had never heard until now - so thanks, gyan.
posted by Segundus at 7:21 AM on October 25, 2010 [5 favorites]




By doing philosophy you come to realise things about the structure of our conceptual scheme that you would never have realised otherwise .... in doing philosophy, we come to realise the character of the grammatical and linguistic scaffolding from which we describe the world, not the scaffolding of the world.

I couldn't agree more, and it's good to hear a philosopher say this. Too often when philosophers say that science is wrong, they mean that it contradicts common sense or their preconceived prejudices, which they can explain to you are true by definition, e.g. 'brains can't think because, by definition mental is different from physical.'* It doesn't seem like he's saying anything that simplistic, though, from the article, it's not clear what he's saying. Maybe his books are worth reading.

If he can understand neuropsych research and critique the conceptual vocabulary researchers are using to think about what they're doing, that would actually be useful.

I agree with his criticism of popular reporting on psychological and neurological research. (At least, I agree that it contains a lot of nonsense.) I hope he understands that cognitive psychologists and neurologists understand their field better than most reporters, and philosophers. (The criticism of lay people like me trying to talk about philosophy also applies to philosophers trying to talk about science.)

I'll have to add that the idea of consciousness in neuropsychology isn't bunk. It basically means being aware of something. There is a difference between being aware of something and not being aware of something. What has to happen for a person to be aware of a stimulus isn't a meaningless question in neurology.

* Sorry, philosophers, I just mean that, in my limited exposure to philosophy, it usually comes across this way.
posted by nangar at 7:40 AM on October 25, 2010 [1 favorite]


To all those asking for Hacker to be given the benefit of the doubt, this is what Dennett has to say in his rebuttal, linked above by bhnyc,

"Their Appendix devoted to attacking my views is one long sneer, a collection of silly misreadings, ending with the following: “If our arguments hold, then Dennett’s theories of intentionality and of consciousness make no contribution to the philosophical clarification of intentionality or of consciousness. Nor do they provide guidelines for neuroscientific research or neuroscientific understanding.” (p435) But there are no arguments, only declarations of “incoherence”. At the APA meeting at which this essay was presented, Hacker responded with more of the same. It used to be, in the Oxford of the 60's, that a delicate shudder of incomprehension stood in for an argument. Those days have passed. My advice to Hacker: If you find these issues incomprehensible, try harder. You’ve hardly begun your education in cognitive science."
posted by Gyan at 7:45 AM on October 25, 2010 [5 favorites]


"You overplay your hand – you make things clearer than they actually are. I constantly try to keep aware of, and beware of, that. I think it’s correct to compare our conceptual scheme to a scaffolding from which we describe things, but by George it’s a pretty messy scaffolding. If it starts looking too tidy and neat that’s a sure sign you’re misdescribing things.”

Metafilter: you make things clearer than they actually are
posted by Obscure Reference at 7:45 AM on October 25, 2010


I'm not an expert. I just don't agree with much of Wittgenstein, and don't really care for those following in his wake. What's more, the article gives the impression of a hyper-specialist trying to turn every problem in every field into a nail for his hammer. I could read the book, but since I don't really agree with Wittgensteinian thought, and he's come across as a crank, why invest that kind of time?
posted by Slap*Happy at 7:54 AM on October 25, 2010


We judge scientists on whether they produce theories which predict testable repeatable experimental results.

We judge philosophers based on how much they've been published and how popular they are at dinner parties.


Yes I also think all things should be judged by the same criteria. Also, this soup isn't as good as Citizen Kane!
posted by shakespeherian at 8:01 AM on October 25, 2010 [10 favorites]


If you ask a philosopher to produce a handbook of well-established and unchallengeable philosophical truths, there’s nothing to show.

He doesn't know many logicians, ethicists or mathematicians.


Are there results/findings in the field of ethics that are considered well-established by a consensus of ethicists? On the level of, say, the Appel and Haken proof of the four-color theorem in mathematics?
posted by straight at 8:27 AM on October 25, 2010


We judge philosophers based on how much they've been published and how popular they are at dinner parties.

Philosophy is what we use to determine everything from the strategy of detente to who gets a new liver. It's been a powerful, pervasive influence in many aspects of our lives. Don't be silly.
posted by mobunited at 8:36 AM on October 25, 2010


If it starts looking too tidy and neat that’s a sure sign you’re misdescribing things.

That's a scientist's slag at the whole foundations of philosophical inquiry, really. If you can't tell us how uncertain you are, if you are definite, you're not even worth talking to. That's a brutally dismissive comment to make on a piece of scientific thought, essentially calling it garbage. It's as bad as any of the slurs against "scientism" in Hacker's piece.

I don't think there's a lot of productive thought to be drawn out of a mud-slinging name-calling debate, but maybe that's just me.
posted by bonehead at 8:43 AM on October 25, 2010


We have actually made a lot of progress in working out how the brain works, and anyone who does not think there is a clear relationship between the physical matter of the brain and the stuff of consciousness is clearly deluding themselves. On the other hand, there is obviously something special about that lump of matter and what it does.

As I've already said, consciousness is (because it must be) composed of a combination of information and processing. A somewhat subtle addition to this is that part of that information is an algorithm by which the information is processed. That algorithm is what these guys seem to be calling the "hard problem."

I think a lot of the reason we haven't made much progress on the consciousness algorithm is that it is actually going to be much simpler than anybody expects and it's not implemented at the top of the brain in our wonderfully human-exceptional cerebral cortex, but in the brain stem in some very old structures we share with nearly every other multicellular animal. And yes, when we get it right it's going to become obvious that most animals have a very similar type of experience to what we do (as anyone who has ever interacted with animals for any amount of time will already know).

All of this is very obvious to me because instead of studying philosophy for the last 25 years I've been designing systems, and consciousness is very obviously a system. It is senseless to draw distinctions like "monist" and "dualist" when talking about a system because the system is a single thing that doesn't exist until all its parts are assembled.

By analogy, this comment is coming together on a similar (but different) system. It consists of hardware made by Asus, software written by Microsoft and Mozilla, and words added to the mix (what most would consider the real purpose of the activity) by localroger. The output of this system, a transmission of information to another system called "Metafilter" is a result of all those parts working together. Try to isolate any of those parts away from the others and what you get is no system, and you are not reading this comment.

So it is not dualist or trialist or whatever to say that consciousness is a sum of brain structure, brain processing algorithms, and experiential information; all of those things are ultimately expressed in the matter of the brain but one involves rules coded into the brain and another involves cumulative changes induced by experience. If you go into the brain and start removing bits you will get malfunctions, and if you go into certain parts of the brain, particularly in the brain stem, you will get permanent unconsciousness -- the very definition of removing the brain structure which is responsible for implementing consciousness.
posted by localroger at 8:43 AM on October 25, 2010 [10 favorites]


I love this thread. I'm not a neuroscientist or a philosopher but I do know this: it's possible to be respected, decorated, revered and eminently qualified and still be stupidly wrong.
posted by Summer at 8:58 AM on October 25, 2010 [1 favorite]


Scientism, schmientism. How's Hacker's kicking foot?
posted by Pirate-Bartender-Zombie-Monkey at 9:06 AM on October 25, 2010


"Are there results/findings in the field of ethics that are considered well-established by a consensus of ethicists? On the level of, say, the Appel and Haken proof of the four-color theorem in mathematics?"

There are many: slavery is bad, defending autonomy is good, genocide is bad, informed consent is good, and more than a few others. As recently as one or two generations ago many people weren't sure about many of these. Now they are so well established as to be taken as givens.

localroger, I can't see exactly what you think the algorithm is doing in your description, but it seems to me that you are actually thinking of how to respond to the "easy problem" of consciousness, not the hard problem. (Using those terms as Chalmers does.) Also, whether or not we need a systemic treatment of consciousness, it would still be fruitful to establish whether that system is dualistic, monistic or something else. How the system works is one concern, the ontology of the elements of the system is a different concern.
posted by oddman at 9:19 AM on October 25, 2010


Hi, guys, what's up?
posted by wittgenstein at 9:24 AM on October 25, 2010 [6 favorites]


Whereof we know not, thereof we're loudly sounding off.
posted by TheophileEscargot at 9:29 AM on October 25, 2010 [5 favorites]


oddman, the best stab at what the algorithm does was made IME by the neuroscientist Erich Harth. It sharpens incoming data against an array of stored patterns, some of which may be tagged with associated emotional responses. It creates and strengthens and tags patterns based on ongoing experience. At any given time it is maintaining a situational awareness based on pre-existing patterns whose detectors are firing, and is also checking against other patterns encountered in the past to see if the situation can be improved by moving about in ways which have shifted similar patterns in the past to ones with better tags. It is this constant shifting and jockeying to apply old patterns against the existing situation to improve our present situation which we experience as "thought."

Harth makes a convincing argument that the pattern matching algorithm uses thermal noise in order to implement a suitably favorable hill climbing algorithm, and he even patented such an algorithm, called alopex, in the 1990's. It has the interesting characteristic of making many of the same perceptual errors in visual input that humans do.

The patterns are stored by electrical nerve firing in the short term, the growing of new synapses mostly in the cerebral cortex in the long term. The alopex-like algorithm is implemented in the thalamus. It is likely that there are some simpler, older layers with similar functionality deeper in the brain stem, but this is where Harth was doing his work.

While it's by no means a complete solution, it suggests the likely direction in which the solution will be found, and it's what convinced me, after a decade or so of having given up on the idea, that strong AI will probably be built one day. (Oddly, Harth himself disbelieves this, but he's a neuroscientist and not a programmer and probably doesn't see some of the implications I do.) I see no reason why the solution has to be very complicated; the apparent complexity of human thought arises because we have a ridiculous amount of pattern storage and detection processing available.

And if "this is how it works, if you build it it will act conscious" is only the answer to the easy question, then I'd posit that the answer to the hard question is going to be "the answer to the easy question is also the answer to the hard question."
posted by localroger at 9:40 AM on October 25, 2010
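
For the curious, the flavor of an Alopex-style update can be sketched in a few lines: every parameter takes a small step each iteration, and the probability of stepping the same way again rises when the previous step correlated positively with the change in the objective, with injected noise (the "temperature") keeping the search from stalling. This is only a toy sketch in that spirit, not Harth's patented formulation; the objective function and constants here are made up for illustration.

    import math
    import random

    def alopex_maximize(f, x, steps=3000, delta=0.1, temperature=0.002):
        """Correlation-guided stochastic hill climbing in the spirit of Alopex.

        Each parameter moves by +/- delta. A move is repeated in the same direction
        with a probability that grows when the previous move and the previous change
        in f(x) were positively correlated; the temperature is the injected noise.
        """
        prev_f = f(x)
        dx = [random.choice([-delta, delta]) for _ in x]
        for _ in range(steps):
            x = [xi + di for xi, di in zip(x, dx)]
            cur_f = f(x)
            change = cur_f - prev_f
            # Positive correlation (di * change > 0) biases the next step the same way.
            dx = [di if random.random() < 1.0 / (1.0 + math.exp(-(di * change) / temperature)) else -di
                  for di in dx]
            prev_f = cur_f
        return x, prev_f

    # Toy objective: a smooth bump peaked at (1, -2).
    bump = lambda p: math.exp(-((p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2))
    print(alopex_maximize(bump, [0.0, 0.0]))

Published Alopex variants typically adapt the temperature as they run over many parameters at once; the fixed constants above just keep the sketch short.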


We have actually made a lot of progress in working out how the brain works, and anyone who does not think there is a clear relationship between the physical matter of the brain and the stuff of consciousness is clearly deluding themselves.

Yup, absolutely. There is no other place for consciousness to reasonably reside (without assuming some kind of soul which is beyond the purview of science). On the other hand, there is a difference between what makes up consciousness on the electrochemical level and what consciousness "feels" like to the given consciousness. And I am not really sure how much examination of the physical structure of the brain tells us about that experience (I think this is the "easy problem/hard problem" split mentioned by oddman, above). Heck, I am not sure what my experience of my own consciousness tells me about anyone else's experience, especially considering my hardware is non-standard, and I have some grave doubts about whether there is, properly, a "me" to have "my experiences." And Philosophy, while not particularly useful (except maybe as a guide for further exploration) on the physical end of things, is probably a more useful tool in organizing one's thoughts about one's thoughts about one's self than knowledge about what specific physical processes are operating when one, say, thinks about thinking about thinking.
posted by GenjiandProust at 9:43 AM on October 25, 2010


Is it just me, or has philosophy essentially gotten to the point of being two parts masturbation, one part self-justification... and, if you develop a following, one part mutual masturbation?!

Why don't we cut to the chase and pass laws allowing for mutual masturbation in public, instead? Wouldn't it be more productive?!
posted by markkraft at 9:45 AM on October 25, 2010


I did not read the article and know nothing about philosophy but I DO have a BRAIN and I think duelling is a brutal, antiquated way to settle differences and should be discouraged.
posted by everichon at 9:53 AM on October 25, 2010


(Note to self: If you're talking about Ayn Rand / Objectivism, add an additional heaping scoop of self-justification. Stir until frothy and cult-like.)
posted by markkraft at 9:54 AM on October 25, 2010


That algorithm is what these guys seem to be calling the "hard problem."

Unsurprisingly, anything you can think of, a philosopher has already thought of... What you're suggesting sounds to me like Functionalism, which really does nothing to tackle the hard problem of "aboutness".
posted by iivix at 9:57 AM on October 25, 2010


This thread is kind of the worst of Metafilter: A bunch of people who don't know what they're talking about dismissing the work of an expert in his field

Look, I love philosophy as much as the next man. I studied philosophy of the mind at Oxford. But philosophy is full of clever people splitting hairs and going on to propose nothing substantially useful in their place.

Very rarely, someone comes along and says something interesting. Normally in not a lot of words. Tom Nagel's book, referenced in the article, is just such a piece.

Hacker has no problem accusing many of his peers - and bear in mind he's not a philosopher of the mind by trade - of being delusional, following the wrong paths, misunderstanding the question.

In short, it's not at all unreasonable to accuse Hacker of doing just what you say people in this thread are doing.
posted by MuffinMan at 10:01 AM on October 25, 2010 [2 favorites]


Philosophy of language has made some huge strides, and for that matter so has metaphysics (do you know anyone who still argues for Scholastic forms?).

There are plenty of people who still argue for universals, and others who argue that there are no universals. Metaphysics sure has advanced! Metaphysics has changed, anyway.

I love all the people early in the thread (I haven't read much of it, I confess) who asserted that this Hacker fellow must not know much of anything.
posted by kenko at 10:09 AM on October 25, 2010


In short, it's not at all unreasonable to accuse Hacker of doing just what you say people in this thread are doing.

Nonsense. There's a world of difference between advancing (or even just stating) a legitimate argument against a given program of research, even dismissively, and the kind of knee-jerk dismissals, from positions of almost pure ignorance, that litter this thread. Not knowing the first thing about what you're talking about is not Hacker's problem, despite the customarily pissy retort from Dennett that Gyan quoted above, but it is most of Metafilter's this morning.
posted by RogerB at 10:14 AM on October 25, 2010 [2 favorites]


(Following Plato's wide view of what philosophy is rather than, say, Hacker's much narrower one).

I believe that in Plato's day, philosophy and science were identical; there was no separate field of philosophy that didn't comprise mathematics and natural science.
posted by Mental Wimp at 10:15 AM on October 25, 2010


"I'm not accusing anyone who disagrees with me of bad faith, I'm sure they're just deluded..." is precisely the sort of deep philosophical argument that confuses my tiny, pedestrian brain.

In all seriousness, what else would you have him say? If he believes his position to be right, doesn't that entail thinking that those who disagree are wrong? You might prefer him to adopt a less bristly rhetoric, but that has little to do with the content of his claims; and at least he's being honest and explicit about it. Of course they're famously quarrelsome, but there are also real benefits to this (Anglo-analytic) style of forthright disagreement.
posted by RogerB at 10:24 AM on October 25, 2010


me: Are there results/findings in the field of ethics that are considered well-established by a consensus of ethicists?

oddman: There are many: slavery is bad, defending autonomy is good, genocide is bad, informed consent is good, and more than a few others. As recently as one or two generations ago many people weren't sure about many of these. Now they are so well established as to be taken as givens.

Seems hard to argue that any of those are "findings made by the field of ethics" instead of just general changes in our society. Are you claiming that ethicists have made significant, generally accepted contributions to our understanding of any of those things?

Is there someone of whom ethicists could say, "Dr. Soandso demonstrated in her groundbreaking work Somethingsomething that informed consent is in fact good"?
posted by straight at 10:30 AM on October 25, 2010


In all seriousness, what else would you have him say?

At least part of this is culture clash. Credible scientists are careful to hedge their language. Boundaries and limits are things to take careful note of. Cranks are also famously quarrelsome. Hacker's forthright style of argumentation has the cultural markers of falsehood to someone used to falsifiable hypotheses and statistical claims.
posted by bonehead at 10:32 AM on October 25, 2010


He is wrong.
posted by Postroad at 10:40 AM on October 25, 2010


Is it just me, or has philosophy essentially gotten to the point of being two parts masturbation, one part self-justification... and, if you develop a following, one part mutual masturbation?!

There's a joke about Diogenes in there somewhere, I just know it.

posted by twirlip at 10:48 AM on October 25, 2010


oddman: ... whether or not we need a systemic treatment of consciousness, it would still be fruitful to establish whether that system is dualistic, monistic or something else. How the system works is one concern, the ontology of the elements of the system is a different concern.

Research in cognitive science assumes a form of neutral monism, that neural systems can think, feel and remember things, and that it's possible through research to gain some understanding of how they're able to do that. Hacker thinks this is wrong, but it's not clear from the article why he thinks it's wrong.

It is possible for the assumptions of cognitive science or neuropsychology to be wrong; however, they seem to be making fairly decent progress, given the complexity of the systems they're trying to understand. Empirical research is in fact really hard. I wish Hacker and other philosophers would try to understand empirical research and show a little less of a contemptuous attitude towards it.

(I'm aware that other philosophers have tried to understand research in this area, but those seem to be the ones he doesn't like.)

How the system works and the ontology of the elements of the system are not different concerns. If we can understand how people think and feel in terms of how neural systems work, then the assumption is right. It's an empirical question.
posted by nangar at 10:53 AM on October 25, 2010 [1 favorite]


“You can ask any human being having an experience ‘What was it like for you to have that experience?’ Most commonly the answer is: ‘Nothing in particular.’ What was it like to see the lamp post? What was it like to see your shoes?’ – ‘The experience was quite indifferent!’ Sometimes the answer would be, ‘It was wonderful, marvellous, joyful, jolly good or revolting, disgusting, awful’–and so on. If you want to generalise over that, engage as Nagel does in second-level quantification, the result is not ‘There is something which it is like to experience such and such’, but ‘There is something which it is to experience such and such, namely wonderful, awful, exciting, boring’. Why? Because the answer to ‘What was it like for you to do it?’ isn’t ‘It was like wonderful’ – unless we’re in California – but rather ‘It was wonderful’. So it is a plain confusion to think that for any given experience of a conscious creature, there is something that it is like for the creature to have that experience. Sometimes there is something that it is to experience this-or-that – most of the time there isn’t. That’s one pair of mistakes.”

It's been a long time since I read anything serious about philosophy, but I don't get this. This seems like an argument that qualia are illusory because

1. people have difficulty expressing them in words, and
2. people don't have emotions or values attached to them

I don't think much about my experience of seeing a light post, and it doesn't make me feel anything in particular, but there's certainly something that it's "like" for me to experience seeing a light post.

I'm not dismissing Hacker's ideas; I'm mostly hoping that someone who knows more about them can explain what he's talking about here.
posted by bjrubble at 10:58 AM on October 25, 2010 [1 favorite]


iivix, if you look at my answer to oddman you'll see that my ideas really don't align with functionalism. The problem is that functionalists seem to think that we will have a machine which demonstrably creates consciousness when run but it won't be too obvious why, except that when the machine is run you get consciousness.

What I am saying is that when we understand the machine, it will be obvious why it creates the kind of states we subjectively feel, and the only reason it isn't obvious today is that we don't know how the algorithm works.
posted by localroger at 10:58 AM on October 25, 2010


The whole monist/dualist thing goes away (one might say it dissolves) if you know a little about information theory, and it's clear from the essay that neither Hacker nor the people he's criticizing do.

Philosophy shouldn't try to exist without the knowledge provided by other disciplines, but why do you have to bring them up with the assumption that working philosophers won't have, can't possibly have, already considered them? Philosophers flourish on pwning one another, and if a discipline as large as information theory were to trot along with a superpower-sized weapon that would dissolve hundreds of years of debate, do you think nobody would fire it?

They couldn't help themselves, even if it was called the WillDestroyAllPhilosophy weapon. In fact, especially then. Why would you think otherwise?

Philosophy goes nowhere against questions like this for the same reason Leonardo da Vinci probably wouldn't get very far if you dropped him in the bowels of a nuclear power plant and asked him to explain where the power comes from.

Ah, right, because you're patronising. Carry on.

we experience as "thought."

Who's this we, and by what means is it experiencing the thought? You'll get an infinite regress if you try and explain it in the same terms you explained how the thought worked.

And if "this is how it works, if you build it it will act conscious" is only the answer to the easy question, then I'd posit that the answer to the hard question is going to be "the answer to the easy question is also the answer to the hard question."

So, materialism then, or monism. What happened to the dissolution? And we're assuming that acting conscious is the same as being conscious, then?
posted by bonaldi at 10:58 AM on October 25, 2010


This thread is kind of the worst of Metafilter: A bunch of people who don't know what they're talking about dismissing the work of an expert in his field because they disagree with the gloss of his work they haven't really read. This seems to happen most often (on Metafilter) with people and views that challenge the idea that natural science/Richard Feynman can explain all that needs to be explained about human existence. (Which is not at all the same thing as suggesting that religion is necessary to explain the human experience.)

Check out a list of Hacker's books. Scroll to his recent books at the bottom, and you'll see that his examinations of the relationship between neuroscience and philosophy are published by reputable presses (Blackwell, Columbia), and that his arguments are taken seriously enough that Dennett and Searle feel the need to reply.
posted by OmieWise at 6:00 AM on October 25 [33 favorites +]


This is such a brilliant comment, I just have to quote it. I have said before but will reiterate that Metafilter is often at its worst when discussing academic research. It's the worst kind of "everyone is a bleeding idiot but me, which I can plainly demonstrate through my pithy comments" that you see in beginning graduate student seminars. It is the worst kind of disingenuous intellectual grandstanding of a literate, above-average-intelligence group that assumes it can easily discount or point out gaping logical flaws in complex subjects merely through the perusal of a summary or interview. Believe me, MeFites, your off-the-cuff dismissal of tremendously complicated topics says much more about your learnedness than you realize, but not in the expected direction.
posted by proj at 10:58 AM on October 25, 2010 [6 favorites]


Consciousness is complicated because it consists of a large amount of information being processed by a complicated piece of matter (and interacting with a large amount of environmental information in the bargain).

I really really hate it when people say things like this. How the hell do you know that the brain "processes" information in the way that the word "process" is conventionally understood? It is frustrating not only because it presumes the brain is nothing more than information-processing equipment similar to a computer, but also because it presumes that the only information-processing equipment possible is the relatively crude binary computers we have today, which do not yet have the "information processing" capabilities of a duck, let alone a human.

What information is processed to result in joy? What information is processed to lead to anxiety? How is it that two people standing in the same place at the same time receiving the same inputs from their five senses can respond to them in these two different ways? Stop thinking of the brain as the thing that does math more poorly than computers, and start thinking of it as the thing that reads and writes poetry.

Science shows that the "talking cure" works better than drugs to cure depression - does talking alter the head-meat? Or the construct inhabiting the head-meat? (Cue long, heated arguments ranging oh, so far afield!)
posted by Slap*Happy at 8:32 AM on October 25


First, the science does not actually show that, unless you are limiting "talking cure" to CBT. But yes, talking to a shrink would alter the "head meat" just as talking to anyone else would. It is still an experience, right?
posted by Pastabagel at 11:01 AM on October 25, 2010 [1 favorite]


I really really hate it when people say things like this. How the hell do you know that the brain "processes" information in the way that the word "process" is conventionally understood?

Because Alan Turing and Claude Shannon say that's the only way information can be handled at all. Information theory is actually much more basic than the laws of physics. If it can be encoded, manipulating it requires a certain amount of bandwidth; if it can't be encoded, it doesn't exist.

Nobody is saying the brain handles information "in the way a computer does," if by that you mean herding binary values through a central processing unit based on instructions coded by more binary values. What we are saying is that like a computer a brain is a piece of matter which receives informational input, this input makes changes in its state, and the combination of input and previous-input state changes causes it to issue output.

As for what information is processed to result in joy or anxiety or whatever, that would depend on the brain's past inputs and its state just as the information necessary to get a computer to display the message "Hello, world" is going to depend on what program is running. However, you can be fairly sure that you've gotten joy when the limbic system starts firing certain neurons and flooding certain neurotransmitters which are the same in all brains. The reason two people receiving the same inputs don't react the same is that their history is different, and history is a much bigger and more variable part of the system than the mechanics of neural processing or the algorithm that it implements, although without those things you cannot demonstrate the phenomenon of consciousness even if you have all the information represented by the brain's developed state.

And I am quite confident that when we get a handle on the algorithm, building a machine that reads and writes poetry will turn out to be straightforward.
posted by localroger at 11:37 AM on October 25, 2010
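
The "same inputs, different history, different reaction" point is easy to make concrete with any stateful system, which is all that is being claimed here. A toy sketch; the agents, stimuli and responses are obviously made up:

    class Agent:
        """Toy stateful responder: the reply depends on the input *and* on stored history."""
        def __init__(self):
            self.history = []

        def react(self, stimulus):
            # Identical stimulus, but the accumulated history decides the response.
            response = ("startled by " if "loud noise" in self.history else "curious about ") + stimulus
            self.history.append(stimulus)
            return response

    jumpy, calm = Agent(), Agent()
    jumpy.react("loud noise")       # the only difference between the two agents: past experience
    print(jumpy.react("red dot"))   # startled by red dot
    print(calm.react("red dot"))    # curious about red dot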


Stop thinking of the brain as the thing that does math more poorly than computers, and start thinking of it as the thing that reads and writes poetry.

Well, you are at least half wrong. Not that it is very good poetry....
posted by GenjiandProust at 11:43 AM on October 25, 2010


Oh, there's been considerably better poetry than that applet's (which I can't get to load right now, anyway) composed with or "by" computers. Arranging words interestingly is not a strong-AI-complete problem. But that's not what Pastabagel meant by "writing poetry" anyway — the point there was to rectify a seemingly utterly impoverished idea of what consciousness is.
posted by RogerB at 12:11 PM on October 25, 2010


The real claim in respect of qualia is that seeing a red dot is something more than just acquiring the information that the dot is red; there's also the actual experience of redness.

Yes exactly, and even Dawkins is puzzled by this phenomenon.

I don't understand Hacker's argument. He says we are thinking about it all wrong and we need to recalibrate the language we use to describe consciousness.

From the article:

Light sensitive cells develop into eyes. Eyes give a creature the ability to see. Creatures that can see can be conscious of something moving in the underbrush over there, and so on.

He’s trying to take the mystery out of the equation. He bangs a fist on a table, “How could this stuff be conscious? – It couldn’t! How could consciousness arise from mere matter? – It can’t. Consciousness ‘arises’ from the evolution of living organisms.”


So is he saying that it is stupid to even ask what qualia are or how they arise? Is he saying that qualia are a fundamental property of the universe? Is asking why the red dot is red akin to asking why the charge of an electron is −1.602176487(40)×10^−19 C? If so, he seems to be making an argument similar to David Chalmers's - that consciousness is a property of information-processing systems.

Perhaps I'm totally misunderstanding his argument. If someone could clarify it for me, I would really appreciate it. I've been fascinated by this topic for quite some time.
posted by Acromion at 12:20 PM on October 25, 2010 [2 favorites]


What information is processed to result in joy? What information is processed to lead to anxiety? How is it that two people standing in the same place at the same time receiving the same inputs from their five senses can respond to them in these two different ways? Stop thinking of the brain as the thing that does math more poorly than computers, and start thinking of it as the thing that reads and writes poetry.
That's easy. Joy, fear, love, anxiety: that's just hormonal changes, like adrenaline, endorphins and oxytocin, generated as a response to previous experience. Of course two people have different reactions; they are reacting to the sum of their previous memories and experiences. The stimulus triggers memory, which triggers emotions and senses. These may be transcendent, sublime, etc., but they are just illusions, like the binaural beats you can play on your headphones.
posted by humanfont at 12:51 PM on October 25, 2010


Hacker's stance seems to be that consciousness is a fundamentally bad concept, just as how 'idea' and 'sense data' were bad in the history of philosophy. His gripe is that there are whole programmes of serious scientific/philosophical research based on this poorly-founded, ill-defined notion.

One line of his argument is in showing that Nagel's definition of consciousness is rhetorically faulty. He gives 3 reasons for this; I'm not sure I agree completely with the first two but I do agree with the conclusion. Surely there are other directions from which to deconstruct the term "consciousness". I think an analogy from physics would be the 1900s usage of "ether", another bad model. And in Wikipedia, there's a subsection briefly noting some of the philosophical criticisms of consciousness.

In the article he also talks about the role of philosophy, i.e. its primary task is to contribute to knowledge/understanding through criticism and introspection, as opposed to rigorous experiments or formalisms as done in the sciences/maths. So the stuff about consciousness is just one small part demonstrating this approach.

Some famous researcher in the area of cogsci was quoted as saying something like "Consciousness is 100 different things", and that's why we still have a long way to go in understanding it, applying it to AI, etc. But I think Hacker's view would be: no, you're looking at the problem the wrong way, in assuming the very existence of consciousness.
posted by polymodus at 1:04 PM on October 25, 2010


The problem is that functionalists seem to think that we will have a machine which demonstrably creates consciousness when run but it won't be too obvious why, except that when the machine is run you get consciousness.

We already have machines that create consciousness without us understanding how. There are 6 and a half billion of them on the planet right now.
posted by empath at 1:07 PM on October 25, 2010


What information is processed to result in joy? What information is processed to lead to anxiety? How is it that two people standing in the same place at the same time receiving the same inputs from their five senses can respond to them in these two different ways?

No two people have the same input, nor do they have the same initial states.
posted by empath at 1:09 PM on October 25, 2010


neural systems can think, feel and remember things, and that it's possible through research to gain some understanding of how they're able to do that. Hacker thinks this is wrong

But he didn't say that. Consciousness is not the same as thinking/feeling/remembering; the latter are a lot more concrete and his counterargument examples indeed rely on the ability to think, etc.

Hacker has no problem accusing many of his peers - and bear in mind he's not a philosopher of the mind by trade - of being delusional, following the wrong paths, misunderstanding the question.

He presents reasoned criticism of a branch of thought. That is valuable, even if the way he says it in the interview comes off as a bit ranty.
posted by polymodus at 1:21 PM on October 25, 2010


We already have machines that create consciousness without us understanding how

Hacker doesn't disagree; "The question should be about living organisms and how they became sentient, not about how the stuff that makes up tables might be conscious". I.e., the first part suggests investigating evolution/biology; the second is parodying how the philosophical community actually approaches it today.
posted by polymodus at 1:33 PM on October 25, 2010


But I think Hacker's view would be: no, you're looking at the problem the wrong way, in assuming the very existence of consciousness.

Maybe I'm missing something, because this argument seems to me the equivalent to closing your eyes and plugging your ears and shouting "NANANANANA I DON'T UNDERSTAND YOU SO YOU DON'T EXIST."

How can you doubt consciousness exists? When I cut an onion, receptors on my olfactory nerves bind to the dimethyl sulfide and transmit a signal to my brain, which recognizes this chemical as belonging to an onion. The propanethial S-oxide combines with water on my eyes to produce sulfuric acid, which stimulates my lacrimal glands and makes me tear up.

But there is more to it than that. I experience the unique sensation of smelling the onion. The sulfuric acid stings my eyes. How do these sensations arise from the interaction of molecules in our bodies? Does Hacker deny these sensations exist? And if so, why? Because we have a hard time describing them?
posted by Acromion at 1:38 PM on October 25, 2010 [1 favorite]


How can you doubt consciousness exists?

In the same way that free will (or lack thereof) is an illusion. He's saying there's a big problem in how we are using these words.

Personally, it's easier for me re. the term "consciousness"; I find it to be a very Western notion. I guess that's what it's like for me…
posted by polymodus at 1:47 PM on October 25, 2010


polymodus -

I can understand how free will is an illusion, but please explain how the stinging in my eyes from cutting an onion is an illusion. In fact, it is possibly the exact opposite of an illusion, since the sensations we have are the barest, most immediate, undeniable facts of our existence. For all I know, I could be a brain in a vat, and everything around me could be an illusion, but I do know when I see the light turn red, I have a sensation of redness.
posted by Acromion at 1:52 PM on October 25, 2010 [1 favorite]


I am reminded of a quote about how all the studying of mechanics won't give a great indication of where I will be tomorrow, but, surprisingly, the best method would be to ask me.

Personally, whenever someone argues from the perspective of strong AI I zone out, because if that is the universe we live in then pretty soon, at the rate technology is moving, none of this will matter much. That, and those who argue for strong AI still tend to get upset when I treat them like the robots they believe they are.
posted by Shit Parade at 2:09 PM on October 25, 2010


since the sensations we have are the barest, most immediate, undeniable facts of our existence.

Note that "sensation" =/= "consciousness"
posted by Mental Wimp at 2:17 PM on October 25, 2010 [1 favorite]


All of this is very obvious to me because instead of studying philosophy for the last 25 years I've been designing systems, and consciousness is very obviously a system.

"Phwoar! Check out that nail," said Mr Hammer. "I'd hit it."

...Not that I disagree with you, localroger, but it's common for people to interpret things in the manner of the most prevalent metaphor in their lives - hence the clockwork universe for Newton, the excess of engineers who believe in intelligent design, and the prevalence of computational models today. Just a caveat.
posted by Sparx at 2:29 PM on October 25, 2010 [8 favorites]


It's clear from the article that he has a positive agenda, albeit a strongly Wittgensteinian one that may be distasteful to some. This project has to do with how there can be something like a meaningful fact, i.e. how there can be meaning scaffolded over facts, as I think someone put it nicely here. A critique of reductive scientistic tropes is part of that project, and you might get a feel for the real force behind that critique in the following example of Wittgenstein's:

"What is it that is so frightful about fear? The trembling, the quick breathing, the feeling in the facial muscles? – When you say: 'This fear, this uncertainty, is frightful!” – might you go on “If only I didn’t have this fear in my stomach!' (....) The expression 'This anxiety is frightful!' is like a groan, a cry. Asked 'Why do you cry out?', however – we wouldn’t point to the stomach or the chest."

L. Wittgenstein, Remarks on the Philosophy of Psychology. Vol. I. G. E. M. Anscombe, trans., G. E. M. Anscombe and G. H. von Wright, ed. Chicago, University of Chicago Press, 1980. pp. 132-133. (Remarks 728-729)

Those physiological disturbances or 'patterned changes in the body' [James/Lange] are the facts of the emotion of fear, but there is a meaning to them as well. Can we get clear on how that meaning is learned, shared, generated? Is there something like experience or consciousness or understanding that allows for it, and if not, then what? Does language on its own have the capacity to generate such meaning? These are indeed tough questions, to which Hacker may not yet have an answer, but a useful contribution would nonetheless be one that helps us rule out candidates for where such meaning stems from.
posted by rudster at 4:07 PM on October 25, 2010


I think the problem with this thread lies with a certain kind of computer programming-influenced metaphysical naturalism, which is the default theology (and yes, I do mean theology) of a sizable percentage of Metafilter commenters on these sorts of threads.

South Park's underpants gnomes would put it this way: 1) Daniel Dennett rocks because he's a philosopher and he says things I agree with about religion 2) ????????? 3) Whatever Daniel Dennett says about any philosophical matter must be true! Or to put it even more simply - I think Dennett and his buddies are getting way more credit here than they deserve, and I think the reason has nothing to do with the strength of their arguments about consciousness.

Now, there's no reason why the underpants gnomes' logic should be true. None at all. It's sloppy, hubristic, irrational thinking. But it's all over this thread. The dismissals of non-Dennettian philosophy on this thread read very much like the thinking behind PZ Myers' masterpiece of proud anti-intellectualism, the Courtier's Reply argument - "I already know a priori that all philosophy that doesn't reinforce my beliefs is completely worthless, so why should I even bother trying to understand it?"

You don't need to drag God into this to see that when people yammer on about how philosophy is worthless - except Daniel Dennett or any other philosophy that, gee whiz, reinforces their previously-held worldviews to a T - they usually have in mind some cartoon version of Derrida they heard about on a blog somewhere or mental images of guys who look like Dieter, the host of Sprockets from an early '90s Saturday Night Live sketch. But I do think that God has something to do with this. There seems to be such a deep fear and loathing of anything that calls into question the premises of Metafilter computational metaphysical naturalism for dummies - the idea that everything in existence is just like a bunch of C++ code that couldn't possibly require any deeper thought than some really kewl programming skills - that, no matter how nontheistic the arguments actually are, they have to be derided and guarded against.
posted by jhandey at 5:24 PM on October 25, 2010 [6 favorites]


Sparx, there have been two great innovations in the twentieth century that render whole fields of what was once philosophical fodder irrelevant. The first was information theory, the most important works of which are by Claude Shannon and Alan Turing, which completely destroys the whole debate about mono/duo/whateverist notions of things like consciousness and will eventually replace them with the kind of solid understanding we have of gravity.

The second was chaos theory, pioneered by the recently deceased Benoit Mandelbrot, which will do the same for notions of God and similar forces that might be thought to be necessary to make the universe look the way it does.

These things were completely unavailable and would have been incomprehensible to even the most brilliant people who came along too early to know about them. Trying to tackle the problem of consciousness without an understanding of information theory puts you in the place of Socrates trying to tackle matters like gravity and inertia without the insights of Newton. You can be very smart, and still totally wrong because you simply don't have the tools to face the question.

I don't make my living facing mysteries, I make it solving problems. This looks to me like a problem with definite potential solutions - if not at present, then certainly on the horizon given what we know about the math of it. Sure, it could be that I'm fitting the nail to my hammer, but it could also be that this long-standing nail has never met a hammer that could hit it before.
posted by localroger at 5:24 PM on October 25, 2010


jhandey, it is not "fear and loathing" that cause some of us to reject metaphysical solutions, it is that there is an extremely long history of metaphysical solutions being trumped by and subsumed into the body of scientific explanation which works so much better. The smart money is on that trend continuing. We no longer think that arguments between the gods cause lightning or that witches cause sickness, and in the future we will think it equally foolish that at some time in the past our ancestors thought there was something magical and inaccessible about consciousness or complexity.
posted by localroger at 5:32 PM on October 25, 2010


As an unqualified outsider, having read the article, I completely agree with Hacker.
posted by General Tonic at 5:48 PM on October 25, 2010


"So long as people read Wittgenstein, people will read Peter Hacker."

Uh...

I could say that as long as people read Nietzsche, people will read Kaufmann.

A critter teasing me says: "Andy Kaufman?"
posted by ovvl at 5:53 PM on October 25, 2010


localroger, the point, you are missing it. Scientific explanations and information theory are concepts no less metaphysical than qualia, consciousness, or meaning. And it is very hard to take seriously the promise of information theory that you are proselytizing while seeming so indifferent to the problems those concepts name, especially when you facilely liken other thinkers' rich and nuanced efforts to superstitious views on lightning or magic.
posted by rudster at 6:00 PM on October 25, 2010 [1 favorite]


Check out a list of Hacker's books.

I'll get right on that... sometime soon...

Sorry, this argument seems to be a bit of a loop. Hacker dismisses an entire avenue of inquiry, but I seem to be having a bit of difficulty discerning what his alternate view is.
posted by ovvl at 6:21 PM on October 25, 2010


anyone who does not think there is a clear relationship between the physical matter of the brain and the stuff of consciousness is clearly deluding themselves

yup, absolutely. But, having spent the past 3 years studying neuroaesthetics for a PhD, I must say that consciousness works much better if you think of it as a verb rather than as a noun. What kind of substance was your bike ride? It was made up of machines and weather and thoughts and geography and interactions with motorists and other cyclists, etc. None of that amounts to the totality of the bike ride, but we can start itemizing because the bike ride exists in the past, as an event that is now over. Understanding consciousness is very much like this, except it is much harder to bracket out as a discrete event because it is so clearly an ongoing, temporal condition. Consciousness is not a thing. But it is a material, physiological experience. There are some kickass brilliant neuroscientists contributing to the understanding of consciousness as an interactive property that involves the brain but also the body and also all the things the brain/body organism does and has done in the world. Antonio Damasio, Warren Brown. Oh, and Merlin Donald from cogsci. Neuroscience and philosophy are very much converging on these issues. It's fantastic.
posted by aunt_winnifred at 6:29 PM on October 25, 2010 [2 favorites]


Hacker dismisses an entire avenue of inquiry

He's only criticising one of their core assumptions (the notion of consciousness); probably their work can still be salvaged with a change of course. So it's not that bad.
posted by polymodus at 6:35 PM on October 25, 2010


Consciousness is not a thing. But it is a material, physiological experience.

How can you name a thing without defining it? In Hacker's view, this is totally nonsensical.
posted by polymodus at 6:37 PM on October 25, 2010


Sparx, there have been two great innovations in the twentieth century that render whole fields of what was once philosophical fodder irrelevant. The first was information theory, the most important works of which are by Claude Shannon and Alan Turing, which completely destroys the whole debate about mono/duo/whateverist notions of things like consciousness and will eventually replace them with the kind of solid understanding we have of gravity.
...
I don't make my living facing mysteries, I make it solving problems.


Let me guess. You're a computer programmer of some kind? And you've never read any major contemporary philosophical work evaluating or arguing for some form of physicalism or functionalism?

Because either of those could fairly describe what you're talking about, yet you're presenting it as if no one had ever thought of it before.
posted by Marty Marx at 6:52 PM on October 25, 2010 [1 favorite]


But there is more to it than that. I experience the unique sensation of smelling the onion. The sulfuric acid stings my eyes. How do these sensations arise from the interaction of molecules in our bodies?

Well, that's also straightforward. You have a sophisticated sensory system that drives various automatic responses based on an evolutionary algorithm. Thus you get tears to protect your eyes from the sulphurous chemicals, and a scent in your nose to warn you. What's conscious about it? You can't decide not to smell it or have your eyes tear up. I don't see what consciousness has to do with it. My refrigerator light comes on when the door opens, that doesn't make it self aware, no matter how old that Chinese takeout is in the back.
Just accept it. You are a biological robot whose behaviors, emotions and thoughts are beyond your control. You are just a souped-up Roomba bouncing through life like the robot vac in my living room. Good news, though: the Apple is still out there. We are still in the garden. You only thought you'd fallen.
posted by humanfont at 6:53 PM on October 25, 2010


rudster: Scientific explanations and information theory are concepts no less metaphysical than qualia, consciousness, or meaning.

Actually there is a very large difference, because there are no hard mathematical descriptions of qualia, consciousness, or meaning, while there are hard mathematical definitions of information and chaos - definitions which turn out to be useful in situations where it would not be obvious that they apply unless you have followed the math.

Basically, if you can't describe it with math, you aren't describing it at all. The really nifty thing computers do is describe, with math, very complicated things we would normally regard as aesthetic and too complicated to mathify.
posted by localroger at 7:17 PM on October 25, 2010
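
(For reference, the "hard mathematical definition" of information being invoked here is Shannon's: the entropy H = -Σ p·log2(p) of a probability distribution, measured in bits. A minimal sketch, with toy distributions chosen purely for illustration:)

    import math

    def shannon_entropy(probs):
        """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin carries one bit per toss; a heavily biased coin carries much
    # less; a certain outcome carries none.
    print(shannon_entropy([0.5, 0.5]))    # 1.0
    print(shannon_entropy([0.99, 0.01]))  # roughly 0.08
    print(shannon_entropy([1.0]))         # 0.0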


You've missed the point, humanfont. There's no theology here, just a difficult problem of relating the experience of onion-smell to the physical mechanisms that must be in place for us to experience onion-smell. You're saying, I think, that any mental event (like experiencing a smell) reduces to/is determined by/is necessitated by a physical event, which is fine, but not a new or unproblematic position within philosophy of mind.
posted by Marty Marx at 7:17 PM on October 25, 2010


"Basically, if you can't describe it with math, you aren't describing it at all."

[citation needed]
posted by Marty Marx at 7:21 PM on October 25, 2010


marty marx: The difference between me and the physicalists and functionalists is that they have pretty much thrown their hands in the air and said "when the explanation comes, it will be the explanation even though we won't understand it."

I think we will understand it, just as when I read Erich Harth's very tentative description of how the thalamus works it was suddenly very obvious why we experience certain misperceptions and fascinations. I think we haven't hit on the algorithm that creates new feature extractors yet but when we do, it will reveal further insight into why we remember certain things and not other things and forget almost everything. And I think when we have it all tied together, we will look at it and it will be kind of obvious that anything implementing that algorithm with enough storage for patterns and the right emotion tagging will act the way we do because it will, subjectively, feel as we do. It will no longer be a mystery, just as gravity and lightning are no longer mysteries.
posted by localroger at 7:24 PM on October 25, 2010


Marty Marx: [citation needed]

Google Claude Shannon.
posted by localroger at 7:25 PM on October 25, 2010


SHANNON/TURING 2010 WAKE UP PHILOSOSHEEPLE

I think we're done here.
posted by RogerB at 7:26 PM on October 25, 2010 [3 favorites]


Looking at the objections to physicalism, I note a thought experiment where we imagine Mary, who has been confined to a black and white world her whole life and is suddenly allowed out into a color world. To quote:
For, upon being released into the world of color, it will become obvious that, inside her room, she did not know what it is like for both herself and others to see colors — that is, she did not know about the qualia instantiated by particular experiences of seeing colors

The Pirahã people are not colorblind, but they have no words for color. They don't experience it. The assumed qualia of seeing color assume an awakening that rarely happens. The deaf who receive a cochlear implant at an older age and hear a voice for the first time were already aware of sound. They just couldn't know it directly.
posted by humanfont at 7:32 PM on October 25, 2010


Well RogerB (HI OTHER ROGER) when you've got the beginnings of the periodic table going it is kind of hard to be patient with people who keep bringing up earth, air, fire, and water.
posted by localroger at 7:32 PM on October 25, 2010


That's fine (and hi other Roger), but there seems to be a bit of a disagreement about who's the underinformed one here. I mean, you're talking with people who appear to have seriously studied these philosophical problems and read fairly extensively about them, yet you seem to be inventing/discovering your own position almost ex nihilo and then expecting everyone to be immediately cowed by your appeals to SCIENCE. And you're doing it without a shred of real argument, just repeated reassertions of faith (in strong AI, in Claude Shannon, whatever). For instance, when Marty Marx asked for a citation above, he was talking about your question-begging assertion about the ontologically fundamental status of math, but you waved him off with a vague "Google Claude Shannon" (hence my weak Ron Paul joke). This is not really how philosophy works; in fact, it's exactly what Hacker is cautioning against when he talks about "scientism." If you want to persuade people to share your transhumanist-or-whatever position, you're either going to need to do some serious homework so you can talk about philosophy more convincingly, or else provide an existence proof of the strong AI on which you're resting your claims that computer science has already superseded philosophy.
posted by RogerB at 7:59 PM on October 25, 2010 [2 favorites]


Basically, if you can't describe it with math, you aren't describing it at all.

Numbers don't exist, and math is just a fairy-tale we tell each other.

Seriously - Fictionalism shows that physics can be performed without using any numbers or mathematics at all - and therefore mathematics is simply a useful fiction, employed to create human-comprehensible metaphors that scientists, mathematicians and philosophers can identify and understand.

Now, there are counter-arguments to Fictionalism, and I don't necessarily subscribe to this view myself, not entirely, but it does raise the question: unless you know what math is, how can you claim it's the only way to describe anything? Hume and Gödel are going to mug you in a dark alley.
posted by Slap*Happy at 8:04 PM on October 25, 2010 [1 favorite]


Well, that's also straightforward. You have a sophisticated sensory system that drives various automatic responses based on an evolutionary algorithm. Thus you get tears to protect your eyes from the sulphurous chemicals, and a scent in your nose to warn you. What's conscious about it? You can't decide not to smell it or have your eyes tear up.

Sorry, you are missing the point. There is a distinct sensation of the smell of an onion that doesn't need to be there for the algorithm to work. There is a particular sensation of stinging pain as opposed to, say, a dull persistent ache. No, you can't just wave a magic wand and make consciousness disappear because it doesn't fit into your algorithm.

I don't see what consciousness has to do with it. My refrigerator light comes on when the door opens, that doesn't make it self aware, no matter how old that Chinese takeout is in the back.

I don't see what consciousness has to do with it either. It is unnecessary for an organism to function. Why do these subjective sensations happen when our brain processes information? And how do you know your refrigerator light isn't self aware on some very limited level? If qualia or consciousness is a byproduct of information processing then it could be that your thermostat has some level of consciousness.

Just accept it. You are a biological robot whose behaviors, emotions and thoughts are beyond your control. You are just a souped-up Roomba bouncing through life like the robot vac in my living room. Good news, though: the Apple is still out there. We are still in the garden. You only thought you'd fallen.


Wait... where did this come from? Are you one of those proselytizing internet atheists? If so, take it to a thread where we are talking about God.
posted by Acromion at 8:05 PM on October 25, 2010


I'm sorry, the imputation of naivete there came out stronger than I meant it to; what I should have focused on instead is the scientism, the idea that you can argue from your faith in future pragmatic developments in neuro-computation to a coherent philosophical position rather than providing a ground for it with a properly philosophical argument.
posted by RogerB at 8:09 PM on October 25, 2010


There is a distinct sensation of the smell of an onion that doesn't need to be there for the algorithm to work

I think localroger's point is that a system implementing the algorithm he's talking about would report just such a sensation.

Why do these subjective sensations happen when our brain processes information?

Because that's what this particular information processing system does.

And how do you know your refrigerator light isn't self aware on some very limited level?

My refrigerator light is not analogous to my brain in any important way.
posted by flabdablet at 8:40 PM on October 25, 2010


I'm not there yet: Aside from not liking the consciousness studies community, what is Hacker's view of consciousness?

"Consciousness ‘arises’ from the evolution of living organisms.”

I don't quite get it.
posted by ovvl at 9:02 PM on October 25, 2010


There is a distinct sensation of the smell of an onion that doesn't need to be there for the algorithm to work.

Not true. You wouldn't smell it if it wasn't necessary from an evolutionary perspective. Natural selection would have obliterated it. Why is the smell pungent to you, and less so to me? That's just natural selection and environment. For example, my eyes process some green shades as red. I don't see them as the majority of the world sees them. There is nothing conscious about it, it's just the way the sensors are set. Then you have your memories and distinct experiences creating some additional responses. The sensation your brain knows as smell is just a highly efficient way of computing the result of the stimuli and passing it back to the job scheduler and task manager. Onions for you may mean a meal is soon to follow; for me, serious heartburn and diarrhea as my body purges the foul root from my system.

I'm sure you have much to say and I only mean to try to understand here. Perhaps these are philosophy 101 arguments.

Eating the apple was a symbol of how human kind acquired self awareness and separated from the animals. I was just making a slight joke.
posted by humanfont at 10:01 PM on October 25, 2010


I read some reviews of Bennett and Hacker's Philosophical Foundations of Neuroscience, including Dennett's review that bhnyc linked to.

In the article Hacker was quoted as saying,

"On the current neuroscientist’s view, it’s the brain that thinks and reasons and calculates and believes and fears and hopes. In fact, it’s human beings who do all these things, not their brains and not their minds. I don’t think it makes any sense to talk about the brain engaging in psychological or mental operations.”

That makes sense as far as it goes. Given the way we ordinarily use pronouns, we would say "I pick up things," or "I pick things up with my hand," not "My hand picks up things." By analogy, you could argue we should say "I think with my brain" or "I use my brain to think," not "my brain thinks." But I didn't think that could possibly be his whole point.

It turns out that is actually pretty much his whole point. He objects to identifying mental activity like thinking, believing, having emotions with brain activity, not because he subscribes to some sort of dualism, or eliminative materialism, but because actions belong to the whole person not their parts.

One reviewer quotes Hacker and Bennett explaining it this way:

If believing, hoping, wanting, etc., were identical with certain neural states, then the location of the mental state M or of the mental event E that consists in the agent’s becoming M would be the location of the corresponding neural states or events (whatever they might be). But the question ‘Where do you believe that it will rain?’ is answered by giving the location of the predicted rain, not by indicating a location in one’s skull ... Similarly, ‘Where did you acquire the belief that p?’ can be answered by ‘In the library’ or ‘When I was walking on the heath with Jack,’ but not by ‘In my brain, of course.’

I'm hoping that was intended as a facetious introduction, and that a serious explanation followed. But I'm not sure.

Evidently a large chunk of the book consists of Bennett and Hacker quoting various neuroscientists and pointing out their misattributions of mental processes to the brain. Dennett quotes them going after Crick here:

When Crick asserts that “what you see is not what is really there; it is what your brain believes is there,” it is important that he takes “believes” to have its normal connotations – that it does not mean the same as some novel term “believes*”. For it is part of Crick’s tale that the belief is the outcome of an interpretation based on previous experience or information ...

Uhm. No. I think it's fair to say that even when the "brain believes" it sees something, in this sense - i.e. what the person is aware of seeing as the outcome of visual processing - the person may still be aware that it's an optical illusion, though Crick would still identify 'believing it's an optical illusion' with brain activity, just at a different level of processing.

Sorry. I'm not taking this guy seriously. He's got a point, kind of, if I interpret what he's saying charitably - you can't separate the brain from the body and the whole of a person's experiences. But his arguments as I've seen them presented aren't valid (though I know the quotations were selected). His criticisms are not a good reason to throw out an entire body of research because "The conception of consciousness which they have is incoherent. The questions they are asking don’t make sense. They have to go back to the drawing board and start all over again.”

polymodus, I know he also has an objection to the term consciousness, but I don't know what it is, just that it has to do with something he sees as a hold-over from the Cartesian concept of mind. If he objects to the identification of consciousness with self, which would be consistent with his main criticism of neuropsych, I'd go along with that.
posted by nangar at 10:04 PM on October 25, 2010 [1 favorite]


You wouldn't smell it if it wasn't necessary from an evolutionary perspective. Natural selection would have obliterated it.

This is not at all true. Evolution is not a scalpel wielded with precision, it's a blunt knife that does enough to provide advantage and leaves behind all kinds of stuff in its wake. Survival of the fittest is not anything Darwin said. Survival of the good enough at the time is good enough for Darwin.
posted by Sparx at 11:54 PM on October 25, 2010 [2 favorites]


Arguments about how our ability to smell or otherwise sense things evolved are really beside the point. The issue at hand is how it is that we have any kind of subjective experience at all.

People on Peter Hacker's side of the fence are saying that no description of how brains work could possibly explain or account for that subjective experience, since the experience itself is different from its explanation. For starters, any such explanation is necessarily verbal while subjective experiences such as smells are clearly not; secondly, brain processes happen inside our skulls while our experiences of such things as smells and colours obviously come from things outside our skulls.

People on Daniel Dennett's side argue that any description of brain processes accurate and precise enough to allow corresponding processes to be implemented on non-human hardware probably would account for subjective experiences, because any object so describable would in fact have such subjective experiences and would be capable of telling us so. In this view, subjective experience is not something conceptually separable from brain processes, but is an inevitable consequence of them: consciousness is not so much a quality of human beings per se as an emergent property of systems organized in a person-like way.

Personally I'm more convinced by Dennett's reasoning. Hacker says that Dennett's position is incoherent; personally I have no difficulty with it, and instead find the standard philosophical zombie construct incoherent. But what would I know? I have never published a paper in a learned journal, so I have no philosophical credibility whatsoever.
posted by flabdablet at 5:00 AM on October 26, 2010 [3 favorites]


People on Peter Hacker's side of the fence are saying that no description of how brains work could possibly explain or account for that subjective experience, since the experience itself is different from its explanation.

This is true for any explanation which doesn't just kick the can down the street. Explaining that leaves are green by saying they are made of green particles doesn't explain ANYTHING. It's not until you get down to the activity of electromagnetic waves, photons and electrons, and the light-detecting cells in the eyes, etc., that green is explained -- and none of those things are in any real sense 'green'.

Why is ice slippery? It's not because ice is made up of slippery bits, it has to do with the properties of hydrogen and oxygen atoms at cold temperatures. Why does a hot thing feel hot? Well, it has to do with the activity of particles with a certain amount of kinetic energy, etc.

Any explanation of anything must be described in terms of things which are unlike that which is being explained.

Explaining human consciousness and subjective experience by saying that we're made of 'thinking stuff' is absurd. At some point you have to get down to 'stuff that doesn't think', and explain how the collective activity of non-thinking things gives rise to the subjective experience of thought. It happens. Anything which happens can be explained.
posted by empath at 5:20 AM on October 26, 2010


Well, slap*happy, it's funny you bring up Godel in particular because I was getting ready to bring him up myself. Godel is famous because in his time there was a belief that you could use a mathematical system to prove its own consistency and completeness, and Godel's big contribution to human knowledge was proving that this is impossible.

The thing is, when I first read of Godel I had a hard time understanding just what he'd done. Having grown up with computers and multiple examples of formal systems which could be used to do complicated things automatically I could see intuitively that this was nonsense; it would never have occurred to me to try to design a system like a math proof or computer language that could prove its own correctness because that was obviously a stupid idea. When I found out actual people had tried creating the Principia Mathematica I was dumbfounded.

Fictionalism is a word game, like the ontological argument; at the end of the day you are still either able to measure something or you're not, and establish relationships based on those measurements. Traditionally we do this with numbers, and any silly number-free system that works will end up mapping back to numbers, just as all computers map back to the Turing Machine. One can argue about which implementation is more useful but at the end of the day they're doing the same thing.

Anyway this whole argument arises largely for the same reason that classical arguments for the existence of God arose, which is that we are faced with an important complicated inexplicable thing. When you are trying to explain a massive information processing system that assembles itself based on fractal geometry and you try to do it without benefit of either information theory or chaos theory, you're not going to get very far. The person who understands those tools is going to think problems that look very hard to you are not in fact all that challenging, just as the person who understands Newton and Kepler will not be all that mystified by the motions of the planets.
posted by localroger at 6:00 AM on October 26, 2010


and any silly number-free system that works will end up mapping back to numbers

You actually have it entirely reversed - numbers are mapped artificially and arbitrarily to reality.

Consider this: imprecision is a fundamental part of empirical science and engineering - it's why statistical analysis is so helpful. Reality is not determined by numbers, only described by them in a story-telling language we call math, and even then, only imperfectly at best.
posted by Slap*Happy at 6:35 AM on October 26, 2010


You actually have it entirely reversed - numbers are mapped artificially and arbitrarily to reality.

What you are missing is that if you can map them in either direction then the numbers and not-numbers are exactly the same thing. You have physical objects which can be measured, and you establish relationships between them. You can use numbers to describe those relationships, or you can use some other notational system that avoids the language of mathematics, but if this other system accurately describes the measured relationships then it is essentially the same using different symbols and words.

This is true in exactly the same way that a pure Turing machine and a Turing-complete computer may work in completely different ways, but they can be shown to have identical capabilities.

And the imprecision of measurement does not imply that reality is not determined by numbers; what it implies is that we cannot know for sure whether reality is determined by numbers. Of course this is true in the same sense that we cannot know for sure whether reality consists of little particles zinging around or a computer simulation of such particles in some architecture which is Turing-equivalent to the zinging particles. To assert that either idea is more "real" than the other is absurd.

However, to back up to my original statement, if you cannot describe something using math or some other similar formal system for recording and establishing the relationships between things, then you are not describing anything at all.
posted by localroger at 7:32 AM on October 26, 2010
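
(One half of that equivalence claim is easy to demonstrate directly: any ordinary programming language can simulate a Turing machine step by step. A minimal sketch - the transition-table format and the bit-flipping toy machine are just illustrative choices, not anything canonical:)

    def run_turing_machine(tape, transitions, state="start", blank="_"):
        """Simulate a one-tape Turing machine until it enters the 'halt' state."""
        cells = dict(enumerate(tape))  # sparse tape: position -> symbol
        head = 0
        while state != "halt":
            symbol = cells.get(head, blank)
            state, write, move = transitions[(state, symbol)]
            cells[head] = write
            head += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells) if cells[i] != blank)

    # Toy machine: walk right over the input, flipping 0 <-> 1, halt at the blank.
    flipper = {
        ("start", "0"): ("start", "1", "R"),
        ("start", "1"): ("start", "0", "R"),
        ("start", "_"): ("halt", "_", "R"),
    }

    print(run_turing_machine("10110", flipper))  # prints 01001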


Consider this: imprecision is a fundamental part of empirical science and engineering - it's why statistical analysis is so helpful. Reality is not determined by numbers, only described by them in a story-telling language we call math, and even then, only imperfectly at best.

Uncertainty in quantum physics is a direct consequence of the math, and has nothing to do with imprecise equipment.
posted by empath at 7:34 AM on October 26, 2010
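
(A standard numerical illustration of that point, sketched here with hbar = 1 and arbitrary grid parameters: a perfectly specified Gaussian wavefunction already has position and momentum spreads whose product sits at the hbar/2 Heisenberg minimum. No measuring apparatus appears anywhere in the calculation.)

    import numpy as np

    # Discretize a Gaussian wavefunction psi(x) with position spread sigma (hbar = 1).
    N, L, sigma = 4096, 80.0, 1.0
    x = np.linspace(-L / 2, L / 2, N, endpoint=False)
    dx = x[1] - x[0]
    psi = (2 * np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (4 * sigma**2))

    # Position spread from |psi(x)|^2.
    prob_x = np.abs(psi) ** 2 * dx
    sigma_x = np.sqrt(np.sum(x**2 * prob_x) - np.sum(x * prob_x) ** 2)

    # Momentum spread from the Fourier transform, p = hbar * k with hbar = 1.
    p = 2 * np.pi * np.fft.fftfreq(N, d=dx)
    prob_p = np.abs(np.fft.fft(psi)) ** 2
    prob_p /= prob_p.sum()
    sigma_p = np.sqrt(np.sum(p**2 * prob_p) - np.sum(p * prob_p) ** 2)

    print(sigma_x * sigma_p)  # ~0.5, i.e. hbar/2, straight out of the math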


As far as we know, quantum uncertainty reflects the fact that nature, on the smallest scales, is simply not uniform and repeatable; such regularity as exists on this scale is only discernible across a population of events. So any fictional non-mathematical system with comparable explanatory and predictive power to QM as expressed in mathematics must surely also deal with the same uncertainties.
posted by flabdablet at 8:17 AM on October 26, 2010


People on Peter Hacker's side of the fence are saying that no description of how brains work could possibly explain or account for that subjective experience, since the experience itself is different from its explanation.

This is true for any explanation which doesn't just kick the can down the street. Explaining that leaves are green by saying they are made of green particles doesn't explain ANYTHING. It's not until you get down to the activity of electromagnetic waves, photons and electrons, and the light-detecting cells in the eyes, etc., that green is explained -- and none of those things are in any real sense 'green'.

There are no essential explanations for anything in science, simply increasingly sophisticated descriptions. Science never explains why, it describes how (and what and when).

Why is grass green? Because the electronic transition in the chlorophyll chromophore doesn't absorb in the 530nm wavelength range, and that light gets reflected to our eyes when we look at the leaves of grass. Why does chlorophyll do that? Because the arrangement of atoms in chlorophyll produces electronic structures which interact with light in those frequencies, which we can describe using the Born-Oppenheimer approximation. Why do atoms produce those structures? This follows from the properties of the basic particles of photons, electrons and atomic nuclei, which are described by the "laws" of mostly-classical quantum mechanics. Why do atoms structure themselves in such a way? Because of the properties of the strong force on stable collections of protons and neutrons, which can be described by the "cloudy bag" model of quantum chromodynamics. Why do protons and neutrons have those properties? Because... ...and so it goes on and on.

There is a bottom in the sense that there is a point at which no description currently exists---the "Theory of Everything"---but that's not to say there is a true bottom. Even if there is a basic set of "rules" from which the universe appears to be organized, there's no philosophical importance to them. No reason why.

At its base, there's no more understanding than the answer "because it is" (or better, "because that's what we can infer"). The problem with searching for meaning in observations and hypotheses is that there inherently is none to be found.

This is the same error (some) religious types make. If Hacker's argument can be reduced to some sort of religious point, then it's not accessible by scientific argumentation at all.
posted by bonehead at 9:50 AM on October 26, 2010


that render whole fields of what was once philosophical fodder irrelevant. The first was information theory, the most important works of which are by Claude Shannon and Alan Turing, which completely destroys the whole debate about mono/duo/whateverist notions of things like consciousness

... yet which result in a mono position, as mentioned above. Winning a debate is not destroying it. The momentum of your bluster is carrying you far here, and you have points that I think a lot of people would agree with you on, but attempting to flatly assert that Shannon/Turing have done to philosophy what chemistry did to alchemy is simply ignorant in the very literal sense.

However, to back up to my original statement, if you cannot describe something using math or some other similar formal system for recording and establishing the relationships between things, then you are not describing anything at all.

Care to describe that using math, please?

While we're at it, Godel created a version of the ontological argument, too. How very strange that he didn't just stride in, say that math has made religion irrelevant and add that God was obviously a stupid idea.
posted by bonaldi at 10:52 AM on October 26, 2010


There is a bottom in the sense that there is a point at which no description currently exists---the "Theory of Everything"---but that's not to say there is a true bottom.

To my mind, an explanation is reached when you have to ask a different question to go further.

Why does my hand not go through solid objects?

"Because they're made of solid stuff" isn't an explanation, because you're still stuck asking the same question.

"Because of electromagnetic repulsion" IS an explanation because you've moved onto an entirely new phenomenon which needs to be explained if you want to go further.
posted by empath at 11:02 AM on October 26, 2010


It is ironic that Hacker accuses his opponents of scientism, when he himself appeals to occult “evolutionary mechanisms” as generating consciousness.

It is also telling that he nowhere mentions Spinoza, who is gaining recognition (see Damasio, for example) as providing the philosophical ground for a rational neuroscience.
posted by No Robots at 11:03 AM on October 26, 2010


Science never explains why, it describes how (and what and when).

No, no, no, no, no, no. Science is not a descriptive activity, it is a predictive activity. A scientific theory (or model, if you will) is only validated by its ability to predict not-yet-observed events. If it can't do that, then you need another model. The careful observation and description of events is prelude to building models that predict. Broadly construed, you could call these models descriptions, but they are more than just descriptive, they are prescriptive. They tell you what will happen causally.

And, as I said upthread, all models are wrong; some are useful. Useful here means they can predict accurately enough for your purpose.
posted by Mental Wimp at 11:25 AM on October 26, 2010


Oops. I said it here. And it paraphrases the statistician G.E.P. Box.
posted by Mental Wimp at 11:27 AM on October 26, 2010


There's nothing remotely religious about Hacker's argument, and he's not a dualist. I think his argument is stupid, but it's possible to be wrong without having religious reasons for being wrong.

I think jhandey's probably right. Hacker's disagreeing with Dennett about something. Dennett wrote a book about religion called Breaking the Spell. So Hacker must be defending religion or something, right? There may also be some suspicion of any form of physicalism that's not completely eliminative. (See the argument that you're wrong for claiming you're aware of smelling onions in the thread.)

I'd like to point out that an insistence on strictly eliminative materialism would also involve rejecting most research in the area of neuropsychology and cognitive science, but for a different reason than Hacker's. The underlying assumption of most research in this area is that processes like thinking, feeling and reasoning can be understood in terms of brain processes, not that thinking, feeling, etc. aren't real or don't exist - however much proponents of eliminative materialism like philosophical behaviorists claimed or still claim that such an approach is unscientific or superstitious.
posted by nangar at 11:29 AM on October 26, 2010


Mental Wimp, we're making exactly the same point. I'm using the word descriptive in your broader sense. I can't know exactly why a particular blade of grass is green without putting it into my spectrometer. I'm using a series of theoretical constructs, well-tested hypotheses, to do that. And yes, they're successively approximative, but generally fit for purpose. That's a very well acknowledged feature of theoretical biology, chemistry and physics.

There's nothing remotely religious about Hacker's argument, and he's not a dualist. I think his argument is stupid, but it's possible to be wrong without having religious reasons for being wrong.

The interesting question to me is, is Hacker making a claim that is falsifiable? If he isn't, his argument is "religious" in the sense that it can neither be proved nor disproved by reference to experiment. If there is no such claim, then arguing with a neuro-biologist is rather pointless. I'm not saying that his points may not be philosophically interesting, but don't pick fights with the scientific community if you aren't arguing about science. I haven't yet seen anything in his statements that could be accessed by experiment, though I have not (yet) read his book.
posted by bonehead at 12:11 PM on October 26, 2010


Why does my hand not go through solid objects? "Because they're made of solid stuff" isn't an explanation, because you're still stuck asking the same question. "Because of electromagnetic repulsion"...

I'm slicing the bread thick, you're slicing it thin. I do think it's the same problem, just rephrased in a more detailed frame of reference.

The "answer", btw, is the Pauli exclusion principle, not electromag repulsion. A purely em system would produce fuzzy edges to everyday objects even with classical "atoms". This was the problem that Rutherford spent a huge amount of time grappling with. The fact that everything is not made of jelly is a quantum effect.

So, the question "why does the Pauli exclusion principle work that way?" ("it simply does" is the current best answer) is simply a more general version of the question "Why don't things fall through the floor?", in my view. And yes, I view these answers as essentially descriptive (and predictive) but not explanatory. I don't know why QM works, but I can tell you how (to the limits of my capabilities).
posted by bonehead at 12:21 PM on October 26, 2010
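
(A tiny sketch of how the exclusion principle does that work - toy one-dimensional "orbitals", with nothing physical assumed beyond antisymmetry itself: the two-fermion amplitude is built as a 2x2 Slater determinant, and it vanishes identically as soon as both particles are put into the same state.)

    import numpy as np

    x1, x2 = np.meshgrid(np.linspace(-4, 4, 101), np.linspace(-4, 4, 101))

    def orbital(n):
        # Toy 1-D orbitals (Gaussian times a polynomial); only the antisymmetry matters here.
        return lambda x: x**n * np.exp(-x**2 / 2)

    def two_fermion_amplitude(phi_a, phi_b, x1, x2):
        # 2x2 Slater determinant: antisymmetric under swapping the two particles.
        return phi_a(x1) * phi_b(x2) - phi_a(x2) * phi_b(x1)

    different = two_fermion_amplitude(orbital(0), orbital(1), x1, x2)
    same = two_fermion_amplitude(orbital(1), orbital(1), x1, x2)

    print(np.abs(different).max())  # nonzero: two fermions in different states is allowed
    print(np.abs(same).max())       # 0.0: both in the same state is forbidden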


The interesting question to me is, is Hacker making a claim that is falsifiable? If he isn't, his argument is "religious" in the sense that it can neither be proved nor disproved by reference to experiment.

Well, there are empirical claims which need to be falsifiable to be meaningful and logical claims which only depend on the truth of their initial premises to be meaningful.
posted by empath at 12:26 PM on October 26, 2010


And yes, I view these answers as essentially descriptive (and predictive) but not explanatory.

The fundamental problem with this position is that if you don't make the model itself the explanation, then you are essentially defining away any explanation, because that's all you will ever have. That's why I was so careful to delineate mere description and prediction. "The sky is blue" is fundamentally different than "The sky is blue because the atmosphere scatters blue light differently than it does other wavelengths." If you don't consider the latter more explanatory than the former, it leaves wide open the question of just what constitutes an "explanation." You're facing infinite regress with no way to stop.
posted by Mental Wimp at 12:29 PM on October 26, 2010


it leaves wide open the question of just what constitutes an "explanation." You're facing infinite regress with no way to stop.

Sure, but as I said above, if the more general context doesn't have an "answer" then where do you go? What's the "explanation" for QM or GR? We can verify that the theory describes and, sure, predicts (within experimental uncertainty), but I can't tell you "why" it works. It simply seems that that's the way the universe works. Existence is its own "explanation".

It's shaky ground to stand on anyway. Just as Newton was succeeded by Einstein, I have no confidence that QM and/or GR are the root descriptions, or "explanations", for matter and energy. It's a matter of faith, but there's almost certainly a more general theory of which QM and GR are approximations. Are there more levels still? It seems possible, if not likely. The pit could indeed be bottomless for all I know.
posted by bonehead at 1:00 PM on October 26, 2010


Sure, but as I said above, if the more general context doesn't have an "answer" then where do you go?

If you've recontextualized a specific phenomenon in terms of a more general one, then I'd suggest that is an explanation. It may not be the 'ultimate explanation', whatever that means, but it's still an explanation.
posted by empath at 1:07 PM on October 26, 2010


jhandey: South Park's underpants gnomes would put it this way: 1) Daniel Dennett rocks because he's a philosopher and he says things I agree with about religion 2) ????????? 3) Whatever Daniel Dennett says about any philosophical matter must be true! Or to put it even more simply - I think Dennett and his buddies are getting way more credit here than they deserve, and I think the reason has nothing to do with the strength of their arguments about consciousness.

nangar: I think jhandey's probably right. Hacker's disagreeing with Dennett about something. Dennett wrote a book about religion called Breaking the Spell. So Hacker must be defending religion or something, right?

I'm sorry, but I'm a bit embarrassed for you. Dennett's arguments in the philosophy of mind have hardly anything to do with his role in the debate about religion, and I don't see why anyone's opinions here about religion should favor Dennett, Hacker or Nagel.
posted by Anything at 5:53 AM on October 30, 2010



