

Extending the Mind
January 15, 2009 10:50 PM   Subscribe

How Google Is Making Us Smarter: Humans are "natural-born cyborgs," and the Internet is our giant "extended mind."
posted by homunculus (50 comments total) 23 users marked this as a favorite

 
Related post: Is Google Making Us Stupid?
posted by homunculus at 10:51 PM on January 15, 2009 [3 favorites]


No, but the search engine does make our stupidity harder to hide.
posted by ryanrs at 11:02 PM on January 15, 2009 [1 favorite]


Here you go, homunculus.
posted by niles at 11:04 PM on January 15, 2009 [2 favorites]


Great article, homunculus. Surprisingly good, in fact, for Discover, which I usually associate in my mind with fluff.

Incidentally, Chalmers's name sounded familiar. I read him in my Philosophy of Consciousness course years ago. He's most famous for: "...forcefully and cogently argues that all forms of physicalism (whether reductive or non-reductive) that have dominated philosophy and the sciences in modern times fail to account for the most essential aspects of consciousness. He proposes an alternative dualistic view that has come to be called property dualism."
posted by wastelands at 11:06 PM on January 15, 2009


Longtree. Longtree! If you keep using the tablets you'll lose your memory! Like First Flower using the ropes to count the hunt and count the fruits! She can't do anything without her little rope knots telling her if the apples were better this year. Remember when she thought the Eastern forest was ripe? How bad that was? You'd better learn from her, stop this tablet nonsense. It can only end in trouble.
posted by The Whelk at 11:25 PM on January 15, 2009 [5 favorites]


Dualism is silly. Sorry. Well, okay. Maybe not silly. But certainly not scientific.

More importantly, am I the only one who thinks of Wikipedia every time he watches Ghost in the Shell?
posted by cthuljew at 11:41 PM on January 15, 2009


One (arbitrary, dark) point at which I'd accept an equality between external and internal memory is the state at which disease is transmittable from one to the other. On that matter, I usually screen my Mefi comments for language capable of planting a forest of alien tissue in your mind, but sometimes Videodrome wins. I'd recommend not reading any comments from May-August 2008. (The rest of you, you know your orders.)
posted by kid ichorous at 11:54 PM on January 15, 2009 [3 favorites]


Aw crap. Does this mean I have to start assimilating people? I hate giving them bad news about resistance and its inevitable futility.
posted by spiderskull at 11:58 PM on January 15, 2009 [2 favorites]


My dishwasher is making louder noises. I suppose I should expect this.
posted by queensissy at 12:36 AM on January 16, 2009


Although Clark & Chalmers may be introducing the topic into philosophical discourse, our computer systems were designed with this goal in mind.
posted by honest knave at 12:45 AM on January 16, 2009 [1 favorite]


cthuljew, care to elaborate? I too started off as someone who thought dualism was silly. But I just don't see a way for first person experience to be reductively explained. It seems to be a truly unique thing in the universe. Believe me, I'd like there to be, because consciousness is a hell of a hard problem in philosophy. But I just don't see a way.
posted by wastelands at 12:45 AM on January 16, 2009


Clark and Chalmers coauthored that seminal paper, but Chalmers doesn't actually subscribe to the extended mind hypothesis (note the caveat at the beginning: "Authors are listed in order of degree of belief in the central thesis."). Chalmers isn't a functionalist, so that isn't surprising. Andy Clark is the real spokesman for the extended mind movement. He brought out a new book on the subject a couple of months ago. I haven't looked at it yet, but it's buried somewhere in my to-read pile.

For those interested in the extended mind hypothesis, Rob Wilson's Boundaries of the Self is one of the more sober things that I've read. He has some interesting papers on the subject on his website.
posted by painquale at 12:54 AM on January 16, 2009 [1 favorite]


Socrates: At the Egyptian city of Naucratis, there was a famous old god, whose name was Theuth... he was the inventor of many arts, such as arithmetic and calculation and geometry and astronomy and draughts and dice, but his great discovery was the use of letters. ...To him came Theuth and showed his inventions... when they came to letters, This, said Theuth, will make the Egyptians wiser and give them better memories; it is a specific both for the memory and for the wit. Thamus replied: O most ingenious Theuth, the parent or inventor of an art is not always the best judge of the utility or inutility of his own inventions to the users of them ...this discovery of yours will create forgetfulness in the learners’ souls ...the specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.

'n stuff.
posted by OrangeDrink at 12:55 AM on January 16, 2009 [10 favorites]


But I just don't see a way for first person experience to be reductively explained.
Certain Buddhists had a fair crack at that, identifying the impersonal dharmas that accounted for the supposed contents of their minds.
posted by Abiezer at 1:04 AM on January 16, 2009


Google books link explaining that (I think; I'm no expert and may have it all round me neck).
posted by Abiezer at 1:06 AM on January 16, 2009


I think a lot of the discussion about the 'extended mind' puts the 'boundary' at the wrong place. They start by assuming that whatever is within the skin is 'inside', and whatever is not is 'outside'. But a more interesting place to put the interface is at the edge of the brain itself.

Everything in the brain is 'inside', everything not ... including our physical limbs, etc. ... is 'outside'. This way, you can look at our present condition as already being 'cybernetic' - a lump of very special matter, acting as a control agent over the physical appendages attached to it.

So, as we start to add more appendages of our own design and construction (artificial body parts, sensors to allow 'mind control', etc. etc.), nothing essentially changes at all. This easily explains why/how we are so easily able to 'merge with our tools', as is discussed in the linked story.

I'm a cyborg!
posted by woodblock100 at 2:04 AM on January 16, 2009 [3 favorites]


Abiezer, Buddhists argue against a permanent self, but I've never met a Buddhist who argues against first person experience existing, i.e. consciousness itself. In fact, felt experience is just about the only thing some of them believe exists. :-)
posted by wastelands at 2:20 AM on January 16, 2009


I may well have misunderstood the question, but recalled reading the kinds of arguments summarised in the book I linked to at greater length - that's what I take that "Dharmas are momentary elements of experience, flashing in and out of existence and directly knowable to the cognizing mind, which is itself one of these momentary dharmas." (emph. as original).
posted by Abiezer at 2:40 AM on January 16, 2009


We, Borg is another great essay from a few years back about how we'll welcome becoming borg, although remembering how to run may also be useful if/when this happens.
posted by davemee at 2:46 AM on January 16, 2009


Everything in the brain is 'inside', everything not ... including our physical limbs, etc. ... is 'outside'. This way, you can look at our present condition as already being 'cybernetic' - a lump of very special matter, acting as a control agent over the physical appendages attached to it.

Rather than say "controls physical appendages", I might say the brain interacts with the electrical signals it receives, and the interactions don't just go one way. The interesting question to me is what motivates the brain exactly? Or motivates the mind, if that's a coherent question. Even a dog playing fetch seems quite distant from a pure survival instinct.
posted by crayz at 2:55 AM on January 16, 2009


I've been working on an algorithm-heavy Python program for the past couple of weeks, and it just struck me how much harder it would be to work on this without Google.

I have a thick book on Python, but often it's just faster to hit CTRL-T, click the textbox, type "Python list search", hit ENTER, and click the top link. And that's if I'm using (paradoxically) Chrome; in Firefox, a search box is right there on the toolbar. All the documentation, programming examples, specialized modules, and so forth I could want are within ten seconds of my eyes.

In my Commie-Door days I'd have to thumb through my ragged Programmer's Reference Guide or Mapping the C64, and it'd take me much longer to program a much simpler system. The constant exposure to the pages did mean the knowledge stuck in my head longer; I find I had a better attention span for tackling a difficult problem than I do now. But then, I don't have to actually solve so many problems, especially since Python has built-in functions that do so many things. The effort has shifted from crunching through dumb procedural stuff to mining Google results.

But then, would we even have languages like Python without the internet? Python, and most successful open source software really, would not have been possible without it, and once you have the internet, the emergence of Google, or something with a similar flavor, is basically inevitable.
posted by JHarris at 3:36 AM on January 16, 2009 [1 favorite]
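As an aside, the built-in list searching JHarris alludes to really is a one-expression affair in modern Python. A minimal sketch (not from the original comment; the word list is invented for illustration):

```python
import bisect

# The sort of "Python list search" the comment above Googles for:
# built-ins and the standard library replace the hand-rolled loops
# of the Commodore 64 era.
words = ["apple", "banana", "cherry"]  # kept sorted on purpose

# Membership test: a single expression, no explicit loop.
print("banana" in words)          # True

# Position of an element; raises ValueError if it's absent.
print(words.index("cherry"))      # 2

# For sorted data, the stdlib's bisect gives O(log n) binary search.
print(bisect.bisect_left(words, "banana"))  # 1
```

The shift JHarris describes is visible even here: the effort moves from writing the search routine to knowing (or looking up) which built-in already does it.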


The interesting question to me is what motivates the brain exactly? Or motivates the mind, if that's a coherent question.

I think that might be kind of a tautology; once the brain exists, it simply continues to function. Motivation is no consideration. It has to do something so it does ... something.

It seems to have no 'off' switch. When there is plenty of external input coming in, it keeps busy processing it; when there is less (during sleep, etc.), it replays, reruns, and re-organizes its own materials.
posted by woodblock100 at 4:15 AM on January 16, 2009


Sadly, Google have already performed their first lobotomy - they've killed Notebook.
posted by specialbrew at 4:42 AM on January 16, 2009


Eh, this argument is always stupid to me. I don't really think google use makes you smarter or stupider. Intelligence is not 'having access to a shitload of information' but 'how one interprets that information'. Information is useless if it cannot be interpreted. For example, I can go on arxiv.org and download a bunch of mathematics papers, maybe even some that are relevant to my own problems, but this does not matter, as I cannot read (i.e. interpret) them. I am not familiar with the terminology, symbology, or concepts the authors assume I know. Simply 'extending your mind' through the use of google, i.e. increasing the range of information, does not increase your intelligence in this sense.

The second problem is the tacit assumption in the articles that the mind is nothing more than a mapping function between input and behavior (i.e. flying a chopper, moving colored blocks around). However, if I wanted to do something that is more generative in nature, like say compose a song or write a short story, it is not clear an extended mind is going to have any bearing on the quality of said output. Yes, I can access others' works, but the process of experiencing others' works, extracting patterns according to my own aesthetic judgment, and reformulating those patterns into a unique composition is again agnostic to having any sort of 'extended mind'.

So, google and the like (i.e. public libraries) allow us to be better informed and not have to reinvent the wheel, but if you think said institutions are doing more, you are probably the type of person who thinks everything the mind does is analogous to moving colored blocks around on a screen.
posted by norabarnacl3 at 5:18 AM on January 16, 2009 [4 favorites]


The most interesting place for the extended mind theory to go vis-à-vis the internet seems to me to be the fact that the most useful extension of my brain is often resources created in part or whole by other people. Or even other people themselves. Solipsism grows more and more attractive every day.

What do you think about that?
posted by Potomac Avenue at 5:22 AM on January 16, 2009 [1 favorite]


The most interesting place for the extended mind theory to go vis-à-vis the internet seems to me to be the fact that the most useful extension of my brain is often resources created in part or whole by other people. Or even other people themselves. Solipsism grows more and more attractive every day.

What do you think about that?


I think you're full of shit.
posted by Potomac Avenue at 5:23 AM on January 16, 2009 [3 favorites]


*moves norabarnacl3 coloured block off screen, gives screen a little rub down*

Just kidding, I totally agree nora

These mechanistic models of the mind are a joke; even the authors of the article have to remind themselves in the main page blurb, "don't listen to the cynics", as if optimism guarantees reality. This is some 36 years after Dreyfus's critique of artificial intelligence and the general assumptions about intelligence it relied on. It's proof, if nothing else, that institutional authority, blindspots, confirmation bias and feedback loops have no difficulty replicating themselves on the internet.
posted by doobiedoo at 5:53 AM on January 16, 2009


All tools make humans smarter.
Google is a tool.
Therefore: Google is making humans smarter.
posted by DU at 5:55 AM on January 16, 2009


Computers and the internet certainly do extend the mind but not as much as I think they could. Brains record associations of sensory inputs. If we want computers to help us better learn new things and remember old we have to create better sensory experiences with them. We're only starting to get going on this and we've only recently been building computers that have the processing power to do it.
posted by wobh at 5:56 AM on January 16, 2009


Spot on. Humans aren't natural born cyborgs. If so, UNIX would be the most popular operating system in the world, and the web would have been perfect in 1998, when these philosophers started philosophising.

Potomac, you might like reading Hutchins.
posted by anthill at 5:59 AM on January 16, 2009 [1 favorite]


It'd be funnier if you linked extended mind to Google seppuku. :)
posted by jeffburdges at 6:19 AM on January 16, 2009


Sounds like someone's been reading Charles Stross.
posted by Mister_A at 7:04 AM on January 16, 2009


Intelligence is a wobbly concept -- teh webs makes access to facts fabulously easy, so you can grab whatever date or statistic you might want in seconds, so that lets us all behave as if we have better memories than we do, and a good memory is surely a kind of smarts. But there are people with lousy memories for facts who are great with concepts and problem solving... and then there are ideas like emotional intelligence.
I love that the web is still fundamentally a print medium -- yes, we read the web differently than a book or a newspaper, but I still think that fundamentally the web drives us to read and write more than we used to. How many handwritten letters did you write in 1990? How many emails did you write last week? S-M-R-T!
posted by hephaist0s at 7:05 AM on January 16, 2009


Can mind-matter dualism be squared with evolutionary biology? If the mind really is distinct from the matter that enables it, how/why would such an arrangement have come into being?

Dualistic worldviews all seem to regress to the belief that the mind is somehow "special." Ultimately - and I know it's never stated this way, but - dualism implies that human consciousness is a culmination, an end-point. Which is nonsense, yes?

Good links, all the same.
posted by ParsonWreck at 7:36 AM on January 16, 2009 [1 favorite]


This is no surprise. I'm only an eideteker when there is a computer with internet access within reach. That's why I only use it as an online handle.
posted by Eideteker at 7:55 AM on January 16, 2009


Can mind-matter dualism be squared with evolutionary biology? If the mind really is distinct from the matter that enables it, how/why would such an arrangement have come into being?

The whole point of property dualism is to do an end-run around this problem. Rather than multiplying entities, we multiply properties. Though I am 'just' a body, which is the product of evolutionary events, that body has distinct attributes: height, weight, color, texture, and, oh yes, consciousness. In the process, our understanding of consciousness is radically altered, but if we're honest with ourselves we must admit that we still don't have an adequate understanding of what bodies themselves are, so this confusion isn't unique to consciousness. The end point of this argument is that since all physical bodies have size and weight, though those attributes vary in magnitude, it's possible that all physical entities have varying degrees of consciousness. As an alternative to the panpsychist hypothesis, we might say that, like sub-atomic particles or cosmic strings which require cohesion and the right arrangement for an attribute like color to emerge, these physical entities (rocks, grass, or simply proteins) are merely as-yet unconscious objects who await only the appropriate arrangement for consciousness to emerge.

And, just so we're clear, I think the timeline suggests that Stross was reading Chalmers, not vice versa.
posted by anotherpanacea at 8:18 AM on January 16, 2009 [2 favorites]


(forgot to add: in this sense, it's probably best to refer to the view as 'property pluralism.')
posted by anotherpanacea at 8:21 AM on January 16, 2009


Google doesn't create more information; it just facilitates access to what already exists.
posted by Pollomacho at 8:28 AM on January 16, 2009


The Internets is making us different.

And better. Screw you disconnected monkey people! I may not have attentions spans or spelling skills but I have a world of semi-authenticated information at my fingertips!
posted by Artw at 9:51 AM on January 16, 2009 [1 favorite]


I love that the web is still fundamentally a print medium -- yes, we read the web differently than a book or a newspaper, but I still think that fundamentally the web drives us to read and write more than we used to. How many handwritten letters did you write in 1990? How many emails did you write last week? S-M-R-T!

Hmm. I feel like I thought letters through more than I think through many of the emails that I send. Also, lost is the illustrated (doodled) letter!
posted by aniola at 10:32 AM on January 16, 2009


A friend of mine was waxing rhapsodic on how wearable internet interfaces will eventually make it so that at any time people could instantly check facts or get definitions. My reaction was

"Hi! My name is..."*checks Google*"...Tom! Nice to meet you!"
posted by happyroach at 11:50 AM on January 16, 2009


Hi Tom I'm A Train Station!
posted by Potomac Avenue at 1:19 PM on January 16, 2009


You're the dude googling the dude who used to remember he was the dude without googling the dude.
posted by Artw at 1:34 PM on January 16, 2009 [1 favorite]


I've come around on dualism, largely by analogy to software.

Windows may be running in RAM, but Windows is not a property of RAM, and it's not dependent on RAM.

Now, I don't think consciousness is entirely analogous to a computer program; I think a large part of what's happening in the brain is simply an emergent property of complex networks of neurons. But I think focusing merely on the physical interactions of the neurons would be like trying to figure out how an Xbox game works by examining the transistors.

There is 'something' there that isn't happening on the physical layer of the brain. To use another computer analogy, it's like the OSI model. Obviously it's encoded in the physical structure of the brain, but there is something being encoded there that's not 'just' the physical activity of the neurons.

I'm not saying it's a soul; I don't believe in souls. But at the same time, I don't think the individual consciousness is entirely identical to the physical body it inhabits. I could probably explain this better, but I haven't thought as deeply about it as I should.
posted by empath at 6:06 PM on January 16, 2009


I don't believe in a unitary consciousness, either, though. I think the concept of identity is somewhat fluid, and may change as circumstances change (consciousness expanding to encompass a car you're driving, or shrinking to the confines of your own brain when you're dreaming).
posted by empath at 6:09 PM on January 16, 2009 [1 favorite]


I've come around on dualism, largely by analogy to software.
That's not dualism, unless you think that booting Windows is creating a whole new separate plane of existence. It could be property dualism, though.

The Extended Mind can be an easier swallow if you think about it in terms of couples. We know that long-term couples begin to rely on each other for tasks. For instance, one partner might know that the other knows how to cook a certain dish, or how to fix the heating. When those tasks need doing, they are automatically delegated to that person.

Now extend that to information -- one partner will remember the names of friends' children and spouses, the other will remember the passwords to the online banking. Etc. This is something that each can rely upon pretty much as if they had it in their own heads. When our couple meet a friend's new partner, the remembering-one will store the information in their head, and the other will store it in their partner's head. When they need to retrieve it, they'll ask for it. This shared storage, each partner focusing on what they do and remember best, makes each one smarter, in aggregate.

(If the partner's not around, they'll feel like a part of themself is missing, at that point, and will be correspondingly dumber. Facetious, slightly: this also helps explain why a really bad break-up makes you feel like you're losing your mind. In one sense, you are.)

Google, here, acts in a similar way. I don't need to remember how to do this or that esoteric task any longer, as I trust it's in Google, and always have my phone with me. I'm smarter, as long as those around me don't mind a minute-long pause while I access my smarts.
posted by bonaldi at 7:48 PM on January 16, 2009 [2 favorites]


wastelands: I subscribe largely to Hofstadter's view on consciousness. First-person subjective experience is a largely linguistic convention developed by consciousness (namely, use of the word “I”). Its situation in a physical body creates the illusion that it is a singular thing, separate from the world around it. Consciousness itself is just the combination of all the aspects of human intelligence in a sufficiently complex network, namely the human brain. Other animals have many or all the same faculties as humans, but do not have powerful enough hardware for it all to coalesce into complete, conscious self-awareness.

Now, keep in mind, I'm no expert on anything, and claim nothing authoritative, but that's the only view I've come across that makes any real sense to me.

I say dualism is silly (with more or less polite spin on that phrase) for the same reason that saying that a computer running software is dualism is wrong. The software is not a thing, it's a pattern. It's just such a complex pattern that its behavior can be described as completely distinct from the behavior of its substrate, or hardware. We look at the operating system running a computer (and vice versa) and call it “Linux” (or what have you), and in just the same way we look at the consciousness running a brain (and vice versa) and call it “I”. Dualism is like saying, roughly, “Here's a computer. Now, SOMETHING is making these programs on it run, although I can't for the life of me say what.”

To make it perfectly clear, most of what I'm saying is parroting Hofstadter's GEB and I am a Strange Loop, although the first discusses it in a rather long-winded way, and the second provides little context.
posted by cthuljew at 10:53 PM on January 16, 2009 [1 favorite]


However, if I wanted to do something that is more generative in nature, like say compose a song or write a short story, it is not clear an extended mind is going to have any bearing on the quality of said output. Yes, I can access others' works, but the process of experiencing others' works, extracting patterns according to my own aesthetic judgment, and reformulating those patterns into a unique composition is again agnostic to having any sort of 'extended mind'. So, google and the like (i.e. public libraries) allow us to be better informed and not have to reinvent the wheel, but if you think said institutions are doing more, you are probably the type of person who thinks everything the mind does is analogous to moving colored blocks around on a screen.

Actually, I am going to disagree with you here. Extended mind is an important component of generating creative output -- but this doesn't really apply in your example. Here is my go at it.

Sometimes I have an idea for a drawing in my head -- and although it can feel fairly concrete and comprehensive, it really is not. In fact, what I need to do is start sketching it out on a piece of paper, working out the details as I draw. There is flow of information here, both outwards (my ideas being put down on paper) and inwards (the newly drawn components observed and assessed by me). The final drawing is not a product of my own skill, but rather, a result of the feedback loop between me and the sketchpad. I have extended my mind through pencil and paper.

Having said that, I do agree with your bigger point: information should be interpretable for it to be considered useful. Mere data mean nothing. I think that Clark and Chalmers would agree with you on this, though.

Bonaldi: That's a great example; comment favourited. Being surrounded by active externalists all the time, I often forget that it is not a view typically accepted by the general public.

Personally, I love the Clark and Chalmers essay, and think that their discussion on notebooks as external memory storage is great. For those of us that store everything on cellphones and palm pilots -- actually, it's Gmail and Google Calendar in my case -- I think that their point especially holds true. I mean, I know that I am dumber not having access to my daily schedule.
posted by tickingclock at 11:16 PM on January 16, 2009 [1 favorite]


A pattern is not material, though. You're talking about something that exists and has properties whether or not it's expressed in a physical medium.
posted by empath at 8:41 PM on January 17, 2009


Jerry Fodor reviews Clark's book: Where is my mind?
posted by homunculus at 8:34 PM on February 6, 2009




This thread has been archived and is closed to new comments