your mind
May 25, 2005 8:05 AM

download your mind 'Realistically by 2050 we would expect to be able to download your mind into a machine, so when you die it's not a major career problem,' Pearson told The Observer. 'If you're rich enough then by 2050 it's feasible. If you're poor you'll probably have to wait until 2075 or 2080 when it's routine. We are very serious about it. That's how fast this technology is moving.'
posted by robbyrobs (81 comments total)
 
To paraphrase the article: "Look at me! I make wild predictions to get my name in the news!!!"
posted by Capn at 8:10 AM on May 25, 2005


I bet that, say, Paris Hilton's mind could fit onto a single CD right now.
posted by Faint of Butt at 8:14 AM on May 25, 2005


Aeroplanes? Yoghurts? What kind of creamy, space-age, wackily-spelled world lies ahead?
posted by santiagogo at 8:17 AM on May 25, 2005


I predict that by 2050, we can expect Ian Pearson to be predicting mind downloading around 2100.
posted by Plutor at 8:19 AM on May 25, 2005


This sounds like yet another technologist afraid of dying. Aubrey de Grey comes to mind as someone else whose ego won't let them believe that there could be a world without their consciousness.
posted by splatta at 8:22 AM on May 25, 2005


Nah, I'm thinking Paris Hilton's mind could fit on a 256mb USB keychain. And I bet you it would be jewel-studded.
posted by icontemplate at 8:23 AM on May 25, 2005


This is just a natural step on the path to manic-depressive robots.
posted by Jart at 8:24 AM on May 25, 2005


The problem with mind downloading as a form of immortality is that just because some machine gets my memories, that doesn't stop me dying.

Actually, it kind of makes it worse...
posted by cleardawn at 8:24 AM on May 25, 2005


"If I'm on an aeroplane I want the computer to be more terrified of crashing than I am so it does everything to stay in the air until it's supposed to be on the ground."

I can see it now:
"Ladies and gentlemen, this is your computer pilot speaking. I've been informed that we are number one for takeoff; however, I can't seem to get my techno-therapist on the phone to talk me through this..." In a Woody Allen-type voice, of course.
posted by splatta at 8:26 AM on May 25, 2005


What, prithee, is the benefit to you of having a "mirrored drive" of yourself? When you die, you will still be dead, even if others can hardly tell the difference.

Reminds me of that Bobcat Goldthwait joke...
"I lost my wife. No, I didn't really lose her. I know where she is, but when I go there, there's this new guy doing it. "

As for future applications of artificial intelligence:
"It's my conclusion that it is possible to make a conscious computer with superhuman levels of intelligence before 2020." ... "It would definitely have emotions - that's one of the primary reasons for doing it. If I'm on an aeroplane I want the computer to be more terrified of crashing than I am so it does everything to stay in the air until it's supposed to be on the ground."

In other words: How to design a plane that doesn't want to fly.
posted by insomnia_lj at 8:29 AM on May 25, 2005


'We can already use DNA, for example, to make electronic circuits so it's possible to think of a smart yogurt some time after 2020 or 2025, where the yogurt has got a whole stack of electronics in every single bacterium. You could have a conversation with your strawberry yogurt before you eat it.'

YOU: So, uh... how's it goin'?
YOGURT: Oh, you know... can't complain.
YOU: Um... did you watch Survivor last night?
YOGURT: I don't have eyes.
YOU: Oh, right. Sorry.

(uncomfortable silence)

YOU: Listen, I'm going to eat you now.
YOGURT: I understand.
YOU: It's nothing personal.
YOGURT: Hey, that's Life.

(Yogurt starts humming 'That's Life' by Frank Sinatra. You eat quickly. You never did like that song.)
posted by Fuzzy Monster at 8:30 AM on May 25, 2005


Downloading your mind would be akin to creating a clone of yourself with all your memories intact, albeit a silicon-based one. I guess my point is, it wouldn't be "you".

i.e. ditto cleardawn
posted by malaprohibita at 8:32 AM on May 25, 2005


Hmm. What would need to happen is for bits of the brain to be slowly replaced by electronics, so that there was no "break" in consciousness, and you were still you. Eventually the entire brain would be bio-electronics, and could be transplanted into a new home. Immortality, yey.
posted by bonaldi at 8:33 AM on May 25, 2005


Putting aside philosophy, I'm not even sure that a mental transfer would be technically possible in 50 years. Remember, it's not just a matter of having the processing power necessary to house thoughts. You have to get data from the brain to the machine. And while you're doing this, the brain is still working, sending electrical signals to and fro. You have to record these "thoughts in motion" too. Without interfering with them! (And presumably without killing the patient)
posted by unreason at 8:44 AM on May 25, 2005


This was a favorite dorm-room bull session scenario lo, these 20-some years ago.

bonaldi, if your brain were entirely replaced by man-made components, to what extent would that actually be "you"? Cf. malaprohibita and cleardawn, I'd argue it's not you at all.

If 49% of your brain mass had been replaced, with 51% intact, that's arguably still you. If 51% has been replaced, maybe you're now basically just carrion being devoured by another "life form."

Or maybe if certain brain functions are still organic (whichever ones were deemed Most Important) and the others were mechanical, you'd still be you. Or maybe not.

In other words, at what point does "enhancement" make you non-human?

Pour me another schnapps.
posted by GrammarMoses at 8:44 AM on May 25, 2005


A thousand miles an hour going nowhere fast
clinging to the details of your past
talking 'bout your damage and your wasting my time
wanna be the king of pain, stand in line
all the numbers and the colours and the facts
backed by the rumours and the figures and the stats
I think I'm gonna download my mind
posted by drakepool at 8:47 AM on May 25, 2005


Fuzzy Monster :

YOU: Um... did you watch Survivor last night?
YOGURT: I don't have eyes.
etc
Classic
posted by kenaman at 8:50 AM on May 25, 2005


Cerebral sprawl
posted by weapons-grade pandemonium at 8:51 AM on May 25, 2005


So it'll be like Down and Out in the Magic Kingdom? Sure, sign me up, but I believe it's just wishful thinking.
posted by cameleon at 8:53 AM on May 25, 2005


If memory serves, Daniel Dennett had a lot of fun with this in "The Mind's I". One of the essays started with the idea of severing the corpus callosum and replacing it with a data link, so the two hemispheres could be physically separated. It's still entirely your brain, so it's still you, right? He then proceeded to continue the "dissection" until he had individual neurons, connected by high-speed links, scattered all over. This type of reductionist argument indicates that the physical brain is never "you". What if you downloaded your brain into a RAID array mirrored across a WAN? Where are you?
posted by georgeTirebiter at 9:08 AM on May 25, 2005


To paraphrase Wittgenstein: If you could download a lion's mind you still wouldn't be able to understand it.
posted by StickyCarpet at 9:12 AM on May 25, 2005


MetaFilter: Your favorite Slashdot recap site
posted by nkyad at 9:14 AM on May 25, 2005


'You can also start automating an awful lot of jobs. Instead of phoning up a call centre and getting a machine that says, "Type 1 for this and 2 for that and 3 for the other," if you had machine personalities you could have any number of call staff, so you can be dealt with without ever waiting in a queue at a call centre again.'

Of course, once you realize that the machine is conscious, you cannot force it into 24/7 slave labor and you're right back where you started.
posted by Bort at 9:22 AM on May 25, 2005


What movie was it where Emilio Estevez played a race car driver? Freejack? Oh yeah. This won't end well, but at least Anthony Hopkins dies and Mick Jagger gets a lot of cool lines.

I bet that, say, Paris Hilton's mind could fit onto a single CD right now.

Actually, I've got a spare 8 meg compact flash card that would hold everything easily with room for about 7 megs of porn too.
posted by fenriq at 9:25 AM on May 25, 2005


I remember when Dolly the Sheep was the hot topic, and we were trying to think of people worth cloning. Other than my father, my uncle, and Mozart, we didn't really come up with much.

Perhaps they should try looking for cancer cures?

(fuzzy monster-

That's really quite good. You do this for money?)
posted by IndigoJones at 9:27 AM on May 25, 2005


Actually, I've got a spare 8 meg compact flash card that would hold everything easily with room for about 7 megs of porn too

How would you tell the difference?

Personally, if this happens, the human mind could search the entirety of human knowledge in a few weeks. Then, well, infinite boredom, then insanity. Bad idea.

Our brains are slow for a reason.

And, hell, it'll take the Blue Screen Of Death to a whole new level.
posted by eriko at 9:30 AM on May 25, 2005


'You can also start automating an awful lot of jobs. Instead of phoning up a call centre and getting a machine that says, "Type 1 for this and 2 for that and 3 for the other," if you had machine personalities you could have any number of call staff, so you can be dealt with without ever waiting in a queue at a call centre again.'

Well, assuming that whole peak oil thing is bogus, there are sure going to be a lot of pissed-off, laid-off Indians and Chinese.
posted by c13 at 9:32 AM on May 25, 2005


So you can regurgitate ideas from thirty year old science fiction stories and get hailed as "Britain's leading thinker"?
posted by octothorpe at 9:38 AM on May 25, 2005


"What if you downloaded your brain into a RAID array mirrored across a WAN? Where are you?"

Nerdsville?
posted by splatta at 9:40 AM on May 25, 2005


Fuzzy Monster, you just made my day.

If 49% of your brain mass had been replaced, with 51% intact, that's arguably still you. If 51% has been replaced, maybe you're now basically just carrion being devoured by another "life form."

Well, I guess you could argue that some parts of the brain are more equal than others. For instance, I'm very attached to my frontal lobes (planning, judgement, language, memory, problem solving and apparently sarcasm detection) and hippocampi (forming new memories and storing them). If you could manufacture a visual cortex, well, if it works who cares, right?
posted by goodnewsfortheinsane at 9:42 AM on May 25, 2005


If memory serves, Daniel Dennett had a lot of fun with this in "The Mind's I".

I think that book came out in 1981 and yet the puzzles it poses are as fresh and unsolved and frustrating as ever.

And yet, somehow all this will be solved in time to satisfy the predictions of the latest round of immortality-seekers.
posted by vacapinta at 9:53 AM on May 25, 2005


Why would it only be available for rich folk in its early stages? If the technology hits the point at which it's safe to use on humans, then anyone could afford it. If you live forever, chances are that you'll be able to repay the debts at some point. And once everyone's immortal, credit can be extended indefinitely to everyone. It'll be the monetary equivalent of the Borges story about a community of immortals, who do things like sit in one spot for twenty years, because hey - they've got time to kill.

Anyway, I have trouble with these kinds of predictions, assuming as they do a seamless extrapolation of current trends. The underlying but somehow rarely stated condition for all these fantasies goes: If nothing really changes in our habits or economy or supply of ludicrously cheap oil, then immortality via machine awaits.

And too often the response to such criticisms amounts to, "Um, you haven't taken into account all the great... technology... that we're gonna invent to take care of all that stuff. Lotsa... great... techno... modules".

In conclusion, fuzzy monster is funny.
posted by palinode at 10:02 AM on May 25, 2005


There's something weirdly inconsistent about this guy's thinking. He talks breathlessly about transforming human minds into superintelligent conscious software -- I'd always thought of that as 'uploading', personally, but whatever -- but the big benefit he sees as a result of that is the ability to automate call centers. Eh?

If it's a human (or even nonhuman) consciousness in the machine, it isn't exactly "automated" anymore, then, is it? A software version of me is going to be just as unlikely to want to work tech support as the meat-based version, thanks very much.
posted by ook at 10:02 AM on May 25, 2005


else whose ego won't let them believe that there could be a world without their consciousness.

Come on, now. Death sucks -- for you, yes, but mostly for people who know you. The only times it doesn't suck are when your life is no longer worth living, which in itself sucks just as much. I'm all for immortality.
If I'm on an aeroplane I want the computer to be more terrified of crashing than I am so it does everything to stay in the air until it's supposed to be on the ground."

In other words: How to design a plane that doesn't want to fly.
"Terrified" was a bad choice of words: the human experience of terror evolved for situations in which you have to run fast, not ones in which you have to do detailed fine motor work and make lots of judgements in front of an instrument panel. In that sense, Pearson's thinking seems a little naive (or maybe he was just trying to be pithy), but that doesn't mean it's totally wrong.

If 49% of your brain mass had been replaced, with 51% intact, that's arguably still you. If 51% has been replaced, maybe you're now basically just carrion being devoured by another "life form."

Greg Egan's 1990 short story "Learning to be me" addresses pretty much exactly this issue. In fact, anyone who's interested in this stuff should just buy the Axiomatic collection right now.
He then proceeded to continue the "dissection" until he had individual neurons, connected by high-speed links, scattered all over. This type of reductionist argument indicates that the physical brain is never "you".

What if you downloaded your brain into a RAID array mirrored across a WAN? Where are you?
Ah. You'll want to read Egan's Permutation City, which takes the thought experiment to its breaking point, stretching it from the nature of consciousness to the nature of reality in general.
posted by Tlogmer at 10:18 AM on May 25, 2005


And once everyone's immortal, credit can be extended indefinitely to everyone.

Yeah, but what would YOU do to make money to repay this credit? I mean, if someone's computer is running your "mind", presumably anything new you think of is theirs anyway.
posted by c13 at 10:18 AM on May 25, 2005


"...so when you die it's not a major career problem"

Considering they won't let me telecommute while I'm still breathing, I doubt they will when my heart's not beating either.
posted by alumshubby at 10:19 AM on May 25, 2005


I honestly don't see the benefit of having an airplane that experiences a "fear" of crashing over one that is programmed to interpret and attempt to correct dangerous situations. Or is that just a less-cool way of saying the same thing?
posted by 4easypayments at 10:21 AM on May 25, 2005


Once you're uploaded, you could "clone" your consciousness just by copying bits. If all the clones are networked, it would be like having multiple minds, all working at once. I'm not sure what that would feel like, but I imagine it would be something like being a freaking GOD!!! It would be really good for doing crosswords. Or running call centers, whatever.
posted by mr_roboto at 10:22 AM on May 25, 2005


To clarify: the article makes Pearson seem like a bit of a tool. But it's good to see this type of thinking at least getting mainstream press, even if the memetic vector isn't the best.

I thought about trying to make that sentence not rhyme, but it's stupid-sounding enough that it kind of works.
posted by Tlogmer at 10:22 AM on May 25, 2005


I can just see the longterm ramifications...

"You had an accident, but I restored you from backup... you should be fine now."

"Thanks, doc. Hey... wait! That wasn't one of my memories before. HEY! What did you do to me! I'M A FREAK!"

Huh? Let me check to find out what the problem is. Oh, oopsie! Somehow, you were backed up onto a drive that contained a clown-sex VR program....


posted by insomnia_lj at 10:23 AM on May 25, 2005


"We're already looking at how you might structure a computer that could possibly become conscious. There are quite a lot of us now who believe it's entirely feasible."

Hasn't this guy seen The Terminator or The Matrix? Don't go there. Set up an international treaty (similar to the NPT) if you have to.

Vernor Vinge: "Good has captured the essence of the runaway, but does not pursue its most disturbing consequences. Any intelligent machine of the sort he describes would not be humankind's 'tool' -- any more than humans are the tools of rabbits or robins or chimpanzees."

"If it's a human (or even nonhuman) consciousness in the machine, it isn't exactly "automated" anymore, then, is it? A software version of me is going to be just as unlikely to want to work tech support as the meat-based version, thanks very much."

[SPOILER for Vinge's story "The Cookie Monster" follows]

Yeah, but the software version of you could be reset over and over again, so that you always think you're on your first day of a temporary job. Maybe you wouldn't even know that you're running on computer hardware instead of meat, if there was a good enough simulation of your cubicle.
posted by russilwvong at 10:23 AM on May 25, 2005


that's arguably still you
What about if my arm was replaced with a bionic one? Surely it's still "my" arm? If the "me" software gets a hardware upgrade, it's still running "me". Only the upgrade must be done bit-by-bit to avoid interruption of service.

Tech metaphors. Now we really are in /., Toto.
posted by bonaldi at 10:27 AM on May 25, 2005


This would raise identity theft to a whole new level.
posted by Cyrano at 10:50 AM on May 25, 2005


I'm only going for this if we can save the yogurt, too.
posted by effwerd at 10:55 AM on May 25, 2005


I thought that's what flickr, friendster, delicious, audioscrobbler et al. were for.
posted by muckster at 11:01 AM on May 25, 2005


GrammarMoses: "if your brain were entirely replaced by man-made components, to what extent would that actually be "you"? ... In other words, at what point does "enhancement" make you non-human?"

Also known as the Ship of Theseus paradox.
posted by Plutor at 11:04 AM on May 25, 2005


The interesting fact about this is not when this is going to happen, but the fact that it is possible in the first place. The concept itself is what I find fascinating about the idea of the singularity - the abrupt moral questions about the nature of self and the distinction between "I" and "You" that it transforms into a blurry grey area.

Really, there is no difference between an implantable hippocampus and a PDA, an artificial exoskeleton and a car, or "merging two minds" and a collaborative novel. The devil is in the details, and in drawing these questions from the easily avoidable background noise of philosophical ponderings into the realm of pragmatic concerns.
posted by iamck at 11:15 AM on May 25, 2005


Weren't we, um, supposed to have colonized the moon by now?
posted by Specklet at 11:17 AM on May 25, 2005


Wow, elevating human consciousness to the next level, a higher form. The transmutation of humanity into immortality.

The benefit: more efficient call centers!

Seriously, this guy is a dumbass. Not very smart at all. You don't need a real human consciousness to run a call center. Already I've called automated CAs that can tell what I say. So instead of:

"What state are you in, for Iowa, press 1, for Nebraska, press 2 for New York.."

I get:

"What state are you in, for Iowa, say 'Iowa' for Nebraska, say 'Nebraska' for New York.."

But that extra text is mostly there to help people through the process of talking to a computer.

An airplane computer that's well programmed is going to be able to make rational choices that will maximize the probability of the plane not crashing just as well as (or better than) a "scared" person.

It's an interesting idea. It's something I thought about a lot as a kid. If you duplicated your mind, the new copy would still feel like you, but you'd still be just as afraid of death as before. I mean, death ultimately is the end of consciousness, right?

On the other hand, if I could do a 'fade-over' where my brain was copied neuron by neuron, then I might go for it.
posted by delmoi at 11:19 AM on May 25, 2005




Now, if I download my brain into a working robot and then have the robot do my job, will I have to pay taxes twice?
posted by effwerd at 11:36 AM on May 25, 2005


i've wondered for a while,

with all the "next-gen" video game consoles that keep becoming more and more powerful, might something like this shape the future of video games? staring at a screen and manipulating polygons with a controller is fun, but we've been doing it for years.

what if the NEXT generation of gaming encompasses out-of-body experiences (or recreational drug use lol) where you truly DO up-/download your mind into the game, and see, touch, taste, feel and hear everything around you in a virtual world? heh just like the matrix i suppose.

i ask because this doesn't seem terribly plausible as a means of achieving immortality. perhaps with this technology we might be able to "re-format" our minds or delete particular memories we do not wish to keep (abuse, bad relationships, etc), or install things in our minds (learning stuff, tetris) but i don't see how a piece of hardware could ever hope to emulate distinctly human aspects like spirituality, or creativity for starters.
posted by Ziggy Zaga at 11:38 AM on May 25, 2005


What about if my arm was replaced with a bionic one? Surely it's still "my" arm? If the "me" software gets a hardware upgrade, it's still running "me".

The bionic arm is pure hardware; peripherals are made to be plugged in to any machine. If we were in a Terminator movie together, I could unscrew my bionic arm and screw it onto your body and theoretically it would be the same arm, an arm that I would argue is not "you" or "me" or anybody.

If the brain is hardware, it's hardware that's modified every moment by new data, differently in every machine; in some cases creating new software functionality for itself to process all the data. And not all brains have the same composition to begin with (neurochemicals, etc.). Result: identity, personality, idiosyncrasy.

I don't want to belabor the metaphor (too late!) but surely there's a difference between something that automatically upgrades itself (the actual brain) and a collection of spare parts that gets swapped in for the original...?
posted by GrammarMoses at 11:49 AM on May 25, 2005


Sorry, meant to include link to this entry on identity and change in conscious beings.
posted by GrammarMoses at 11:54 AM on May 25, 2005


I'm wondering (while we're in this bong-fueled dorm session) what would happen if, after my mind had been downloaded (uploaded) into this...ummm robot computer thing....maybe put a mask of me on it, y'know...

I'm wondering: what if all the stuff that I don't consciously remember gets uploaded as well, and if so, what would that feel like? I would definitely not be "me" if that were to happen.
posted by kozad at 12:34 PM on May 25, 2005


russilwvong, you might be further intrigued to see the extent to which these ideas have penetrated even pop/hip hop culture.

"You know what we're doing right now? You and I, we're interfacing."
posted by kimota at 12:42 PM on May 25, 2005


These fantastic claims are not made by a science fiction writer or a crystal ball-gazing lunatic. They are the deadly earnest predictions of Ian Pearson, head of the futurology unit at BT.

The whole idea that "crystal ball-gazing lunatic" and "head of the futurology unit" are somehow distinct concepts is quite amusing. Why no, I'm not a quack... I'm a FUTUROLOGIST!
posted by casu marzu at 1:11 PM on May 25, 2005


Special, special thanks to all you up thread who refused to take this seriously. FuzzyMonster, especially: Damn straight. Or bent. As the case may be.

But as long as at least some folks are taking this seriously....

First, I won't do more than mention the fact that you are not the copy -- other folks pointed that out.

Second, and a bit less obvious: If the machine doesn't have the same mode of experiencing the world that you have, it won't be you. If it doesn't have sensory experiences that can at least map to ours, its experience of the world will be sufficiently different that the new-existence "you" won't be recognizable as you, even if you were alive to see it.

This is, in fact, the most basic problem with olde skoole hard-AI: It's careless with its thought experiments. For example, Ned Block's "Chinese Nation Mind" argument is a stupid pointless waste of time, until you realize that Block et al never realized that "functional isomorphism" requires that, like, functions be isomorphic. Which is to say, if you are going to substitute chinese guys with string for neurons, then you'd better make sure that chinese guys with string can actually model the functions of neurons. If they can't, then no "chinese nation mind" can actually have anything relevant to say about whether you can make human minds out of guys jerking string. And if they can, you'd better make sure you can actually organize them into a functionally isomorphic whole. If you can't, then your "experiment" is invalid, and demonstrates nothing.

"Right stuff" my ass....
posted by lodurr at 2:48 PM on May 25, 2005


lodurr, while it's a brilliant attempt at quickly putting the problem to rest, you really haven't "solved" anything with your observation until you've defined "you."

your body is in a constant state of flux. do you define yourself as a particular point in time? of course not - your cells are constantly dying and regenerating and changing. "You" are not the same "you" you were 2 minutes ago, let alone 2 years ago.

i once heard an analogy of the self to the pattern a river makes when it encounters a rock - you are not the rock, but the water. therefore, "you" exist as a pattern, which makes it somewhat harder to prove that there is actually ever a "you" existing at all...
posted by iamck at 3:01 PM on May 25, 2005


GrammarMoses: the brain is not 'upgrading', it's merely changing - just like a computer hooked up to a digital video camera isn't upgrading as it records what it sees, only changing the configuration of bits and bytes. You might argue that the brain is 'improving' over time with all this 'stimulation', but that's not necessarily the case (e.g., alcoholics and accountants).

The whole idea of whether personality is transferable comes down to whether or not you believe Turing's proposition that (to abstract it) if there's no possible perceivable difference, there is no difference. Most objections I've seen to this have been special pleading or arguments from magic - that there's either something intrinsic to the human mind that makes it unmechanical or that demands unbroken existence.

I find this wholly unconvincing - and suspect it derives primarily from the arbitrary demands of ego, much like concepts of the afterlife. Evolutionarily speaking, the drive to enjoy continuous consciousness makes sense. Rationally, however, it gets in the way and makes one say silly things because of personal discomfort with the alternative.

On preview: iamck: Bang on. Ego tells us there is a 'me' but fails to tell us what it is, except for some magical 'sense of self' that can be rationally explained out of relevance.
posted by Sparx at 3:08 PM on May 25, 2005


Sparx: Ego tells us there is a 'me' but fails to tell us what it is, except for some magical 'sense of self' that can be rationally explained out of relevance.

Please do so.

All you have are two realms. The mental, where there's the sense of a singular self, and the physical, where we observe the putative substrate of the mental, to be in constant flux with no permanent component. In accord with the materialistic dogma, the mental unity is declared an 'illusion' without any explanation of 'how' apart from the insinuation that since the physical realm is primary, it must somehow be so.
posted by Gyan at 4:19 PM on May 25, 2005


The mental is abstracted out because it has to be. As long as we have the Problem of Other Minds, and are giving the benefit of the doubt to the people around us (that is, as long as you're assuming, sans evidence, that everyone around you isn't an automaton "pretending" to be conscious), it makes sense to treat nonbiological systems that act like they're conscious as conscious.

you are not the copy

Thought experiment: you're in a coma for seven years. During that time there's no higher brain function, and your constituent atoms replace themselves (as happens normally over a seven-year span). Then you wake up. Different atoms, separated from you by time, but the same neural patterns: equivalent to machine-uploading or what have you.
posted by Tlogmer at 5:28 PM on May 25, 2005


c13 writes "Yeah, but what would YOU do to make money to repay this credit?"

Presumably, you'd be stuck working in a brain powered call center.
posted by Orb at 6:10 PM on May 25, 2005


what nonsense
posted by BlackLeotardFront at 6:40 PM on May 25, 2005


... "downloading your mind"? Wouldn't it be more of an upload? I hope they can outfit my brain with some hardware that can accommodate multiple minds. But what if I switched over to my pirated copy of Bill Gates' mind, to do some devious business dealing, and then it took over and wouldn't let me switch back? What then, futurologist?

If the machine doesn't have the same mode of experiencing the world that you have, it won't be you. If it doesn't have sensory experiences that can at least map to ours, its experience of the world will be sufficiently different that the new-existence "you" won't be recognizable as you, even if you were alive to see it.

So if you lose your eyesight in a laser pointer accident, you will no longer be you? What if I get my sense of smell upgraded with some canine-derived biotech? Would I still be recognizable as "you"? If I take some hallucinogenic drugs, do I become someone else? Who? Where does he go when I come back? What's going on? Was I the same person twenty years ago? Where is the face I had before I was born? If I replace five-eighths of my brain with a high-power computer rig, giving me super-human banjo-playing ability, would the old "me" be gone forever? What if I replace my feet with roller-blades? Same deal? I look forward to donating my personality to science when I die.
posted by sfenders at 6:44 PM on May 25, 2005


StickyCarpet: "To paraphrase Wittgenstein: If you could download a lion's mind you still wouldn't be able to understand it."

This is the most important point mentioned yet, although it might be subtle. Daniel Dennett, for example, misses this, although it's no wonder; the philosophy he's working with isn't particularly complex, and consists mostly of Descartes and the 20th-century angloamericans, the weakest philosophers when it comes to thinking about the mind.

In short: if it can be 'downloaded' into a machine, it's not a mind. An arrangement of material is not understanding. The inner workings of a playstation are still equivalent to rearranging several rocks, as anyone who's ever looked at how a computer actually works knows; the act of thinking means more than that. It's somehow fitting that Mr. Pearson says that now "when you die it's not a major career problem." If the mind is really something so simple as an arrangement of matter, then death is hardly as frightening as getting a new job. And thought means nothing.

Sparx: "Ego tells us there is a 'me' but fails to tell us what it is, except for some magical 'sense of self' that can be rationally explained out of relevance."

Disregarding everything else you'd have to throw away if that were true, your statement is self-contradictory. If minds are merely arrangements, then rationality means nothing. Look around you: true materialism isn't scientific. 'Rationalism' is just a 'magical' illusion by the dictates of materialism, and 'science,' the hope of understanding things, is just a superstition; there's no way for matter to comprehend matter without being matter, and our brains obviously can't be everything.
posted by koeselitz at 6:56 PM on May 25, 2005


In short: if it can be 'downloaded' into a machine, it's not a mind. An arrangement of material is not understanding. The inner workings of a playstation are still equivalent to rearranging several rocks

Shorter: Arranging "several" rocks in the form of a small pile of rocks is not functionally equivalent to arranging "several" rocks in the form of the Edinburgh Castle. A computer model of the castle is not a castle. But a robot-built copy based on that model... that you could make a decent fortress out of.
posted by sfenders at 7:28 PM on May 25, 2005


you are not the copy

What does the copy have to say about that?
posted by vacapinta at 8:00 PM on May 25, 2005


An uploaded copy of my brain would be as much me (and as much not-me) as that old man 40 years from now with my name who is gonna be swearing at me for not saving more for retirement.
posted by straight at 9:52 PM on May 25, 2005


Thanks, kimota. That was awesome. Blade Runner influence there, too.
posted by russilwvong at 10:04 PM on May 25, 2005


Koeselitz: you make the traditional mistake of thinking that without mind there's nothing. Of course there's something - the matter that has, via the means of its replication, inherited the survival trait of the self-concept.

Just because that matter doesn't operate under the dictates of folk-psychology, doesn't mean it doesn't operate in some other sense - it merely has a tendency to delude itself as to how it operates. Rationalism and science are the best means available to determine how it actually works because it is not tied to specific assumptions about the self. And, increasingly - though I'll admit not yet overwhelmingly - the evidence is that what we call the mind (or more specifically, consciousness) is a mechanistic side effect of how we work.

Just because it's counter-intuitive does not indicate it's incorrect. cf. Newton vs Einstein. You don't have to throw anything away that you didn't already have except a few misconceptions of your own importance and perhaps a few value judgements.

People who make the claim that the mind is something special, something immaterial and extra, without being able to demonstrate what, are up the same philosophical creek as the theologians - the evidence is against you, Occam is against you and your anthropocentrism is showing.
posted by Sparx at 10:14 PM on May 25, 2005


Sparx: A few posts earlier, I asked you to posit how the self can be explained away. I'm still waiting.

People who make the claim that the mind is something special, something immaterial and extra, without being able to demonstrate what, are up the same philosophical creek as the theologians - the evidence is against you, Occam is against you and your anthropocentrism is showing.

Nothing anthropocentric in observing that the mental does not supervene on the physical, hence the mind is immaterial even if it is caused completely by the brain. Occam, taken to its extreme, leads to solipsism. Occam is just a pragmatic trimming dictate, not a rule.
posted by Gyan at 10:27 PM on May 25, 2005


When will these scientists quit focusing on things we DON'T want and pool their resources to a single-minded focus on flying cars!?? IT'S BEEN THE TWENTY-FIRST CENTURY FOR FIVE YEARS! WHERE'S MY DAMN FLYING CAR!

Any kind of effort in duplicating human memories and thought without duplicating the carbon-based brain that houses it is going to be more for the benefit of people who survive you. Your brain gets duplicated, you die, they can still access your accumulated knowledge. You won't be able to learn anything new. There won't be a way to interact or communicate with your memories or thoughts. If they manage to find a way where a duplicated brain can continue to learn and change organically, then we're talking about crap more frightening than mere cloning. I doubt it'll ever come to that.

Imagine being able to study the raw thought processes of minds like Asimov, Einstein, or Twain, rather than just reading their words. It'd change the whole functionality of any library. Perhaps some would find this unethical or whatever, but the actual brains would be dead, so they wouldn't complain, and as long as the duplicated brain remained static and unable to organically evolve, you're not creating new life.
posted by ZachsMind at 1:33 AM on May 26, 2005


When will these scientists quit focusing on things we DON'T want and pool their resources to a single-minded focus on flying cars!?? IT'S BEEN THE TWENTY-FIRST CENTURY FOR FIVE YEARS! WHERE'S MY DAMN FLYING CAR!

ZachsMind, you've just given me the platform for my presidential candidacy in 2008: Fuck stem cells. Arts funding? Who needs art when you've got flying cars?
posted by IshmaelGraves at 6:39 AM on May 26, 2005


Tlogmer - I loved "Axiomatic" (just returned it to the library today). Egan is the best author nobody has ever heard of, IMHO. Quite possibly the best science fiction writer living today.
posted by spazzm at 7:21 AM on May 26, 2005


mental does not supervene on the physical

Gyan, what's a placebo?
posted by Sparx at 2:24 PM on May 26, 2005


Sparx, completely psychological; does nothing to prove your point. Come out with it directly.
posted by Gyan at 2:29 PM on May 26, 2005


Ok - that's a little glib. I can't rationalise the mind away without a several page essay that I'm unprepared to go into right now. Read as much neuroscience and philosophy of mind as you can - the neuroscience being particularly important. Don't stick with the layman's guides either as there's way too much subtlety in what's actually going on. Then, once so informed of the current state of play (and I stated the evidence was considerable, but not overwhelming - I just choose to bet with the considerable), ask what would actually be lost if such mechanistic determinism were the case. Now that's some interesting philosophy.

On preview: what's unpsychological about the mental? Make sense, man!
posted by Sparx at 2:53 PM on May 26, 2005


Sparx: Don't stick with the layman's guides either as there's way too much subtlety in what's actually going on.

I have, on my shelf, from the uni library, the latest definitive reference on cognitive neuroscience. You might want to read the section on consciousness; it clearly highlights the problems with a materialistic approach to consciousness. In any case, you're deferring to an appeal to authority. Some arguments, please.
posted by Gyan at 3:03 PM on May 26, 2005


I was attempting to avoid appealing to authority (I never said you'd reach the same conclusions - I just wanted to make sure we were 'on the same page', as it were).

Possibly your immediate sources are more up to date than mine at present (I metafilt at non-science based work, so it's not unlikely). The best you'd be getting is my impressions, and if you've got that book beside you, you can selectively quote me into a million quivering pieces.

I'm happy to take it to email for a more considered exchange (in fact, it could be fun). Just post a reply (I'm sure everyone else has gone home by now) and I'll put my home email on my profile for an hour or so.

If not - never let it be said that I would not relinquish my opinions in the face of greater evidence. I'll have a hunt and see if the local university has a copy. Thanks for the heads up.
posted by Sparx at 3:24 PM on May 26, 2005


An email exchange would be perfect. Post here when you put your email.
posted by Gyan at 6:54 PM on May 26, 2005


This thread has been archived and is closed to new comments