

What does it feel like to be someone else?
June 23, 2013 8:03 AM   Subscribe

On being an octopus

via
posted by latkes (47 comments total) 28 users marked this as a favorite

 
Background:

What is it like to be a bat?

What is it like to be Bat Boy?
posted by Navelgazer at 8:19 AM on June 23, 2013 [6 favorites]


what is it like to be yourself?
posted by Postroad at 8:22 AM on June 23, 2013 [2 favorites]


Well before Nagel, Jacob von Uexküll tried a similar exercise for a range of animals. His "A stroll through the worlds of animals and men" (pdf) makes for entertaining and thought-provoking reading as he tries, on the basis of reason, to determine qualitative characteristics of the worlds of paramecia, ticks, snails, fish, flies, dogs, and even other humans.
posted by stonepharisee at 8:34 AM on June 23, 2013 [4 favorites]


The first book of The Once and Future King has a number of charming short sections on this too, imagining what it's like as the young protagonist is magically transformed into a fish, an ant, an owl, etc.
posted by LobsterMitten at 8:37 AM on June 23, 2013 [3 favorites]


Ding!

In the first corner we have the agents of science and logic who will argue that, no matter how often we observe animals of whatever simplicity or distance of relation from us exhibiting such andronormative behaviors as affection, curiosity, problem-solving, and grief, that said animals are in no way comparable to humans and that making such comparisons is unreasonable and foolish.

Ding!

And in the other corner we have the agents of woo who will argue that, despite the existence of chaos theory, strange attractors, fractals, and information theory, that the very expression of traits like affection, curiosity, problem-solving, and grief demonstrates that mere matter cannot be responsible for these complex and irreducible behaviors and that the reason pots and calculators do not also display such behaviors must be due to an absence of woo.

Let the popcorn munching commence!
posted by localroger at 8:42 AM on June 23, 2013 [4 favorites]


I can only imagine.
posted by The Whelk at 8:52 AM on June 23, 2013


I know of another guy who's done a lot of thinking about being an octopus: virtual reality pioneer Jaron Lanier.
posted by scalefree at 9:01 AM on June 23, 2013


Thank you so much for posting this! It is great! As I have mentioned before I am in fact actually an octopus so this continued support and recognition is absolutely delightful. Our continued attempts to understand each other can only lead to great things now and in the future and I would like to extend the tentacle of friendship to each of you. Thank you all for colonizing the land and continuing to grow and explore with us as a species. We enjoy vacationing in your aquariums and communing with your scientists and look forward to a great partnership for the benefit of all cephalopodkind.
posted by Mrs. Pterodactyl at 10:12 AM on June 23, 2013 [10 favorites]


...who will argue that, no matter how often we observe ~~animals~~ people of whatever simplicity or distance of relation from us exhibiting such behaviors as affection, curiosity, problem-solving, and grief, that said ~~animals~~ people are in no way comparable to ~~humans~~ you and that making such comparisons is unreasonable and foolish.
posted by oneswellfoop at 10:23 AM on June 23, 2013 [2 favorites]


I'm not really sure how to interpret localroger's comment. The "despite" in the "woo" sentence implies that the listed things support the "science" side's argument, except that they have absolutely nothing to do with it. And the "science" argument is an extreme position that very few scientists defend these days. I guess the comment works as a parody of almost entirely uninformed partisans taking extreme opposing positions. But in this (and many) contexts, such people are relatively rare on MeFi.

Anyway, the extreme behaviorist anti-anthropomorphic position was the product of a time and place that combined a nineteenth-century deterministic view of biology with traditional dualist anthropocentrism — basically a Judeo-Christian view of man's dominion over nature, that only humans have souls, and that animals are basically automatons. Like all dualist human exceptionalism, it's actually profoundly anti-scientific. It's ironic that it considers itself scientifically rigorous.

The linked article discusses the problems with this sort of thing. My view is that insofar as there's an essential difficulty, that difficulty exists with regard to all other minds, including the minds of other people. We are intuitively aware of this in many different respects, but one example of this is the phrase, "if I were you". The construction itself acknowledges that there's an inherent contradiction — to experience someone else's self-experience would require that we have their perspective on their self-experience, not our own. Even if we were able to experience someone else's memory, we'd still be experiencing the memory of their experiences from the context of our own self-perception and our own memories. We can't truly and completely "be" someone else while also being ourselves so we can't truly and completely know what it's like to be someone else.

"If I were in your shoes" is a phrase that is more realistic about what's possible. This construction recognizes that only a hybrid is possible, an intermediate perspective that combines self with a limited perspective of other. And that's pretty much what our operative theory of mind does. It's what we do with fictional narratives, what we do every day with other people.

And it works. Mostly. Sometimes it badly fails.

It works insofar as the distance between the context of self and other is small. The larger the gap, the more inaccurate is the model and consequent comprehension.

Many argue that the gap between human and non-human is too large for any meaningful or useful result. But both evolutionary science and the recent history of the science of animal behaviorism indicate that many animals are far more like humans than we previously thought they were. And of course we previously thought animals were utterly unlike us — our science, our culture, began from the presumption that humans and animals are qualitatively different, created as qualitatively different.

The truth is far from that extreme and yet clearly non-human cognition must be much more alien to us than, say, any two (healthy) extremes within the range of human cognition.

So, as this piece touches upon, the rule of thumb should really be based in the biology — that is to say, the more similar the brain functions and anatomy, in conjunction with considerations of evolutionary relatedness, and also in consideration of convergent evolution with regard to function, the more solid our footing when we imagine (a given aspect of) an animal's cognition in the shape of our own. That may not be very solid ground; it's usually going to be uncertain and treacherous. But it's no more an essential mistake than it is to do this with other people. It's a quantitative, not a qualitative, difference.
posted by Ivan Fyodorovich at 10:47 AM on June 23, 2013 [9 favorites]


Please support my new web comic, Nick Fury: Agent of Woo.
posted by Joey Michaels at 10:56 AM on June 23, 2013 [2 favorites]


The discussion of octopodes is interesting because of the distributed nature of their nervous system, with apparently heavy processing power in their extremities and relatively slender connections between those and their brains. It suggests the possibility that they are distributed systems and the extremities have a lot of local autonomy. Given that they exhibit remarkably intelligent behavior, the question of how they experience their world subjectively is one of several absolutely fascinating ones with immense implications in a number of disciplines.
posted by George_Spiggott at 11:17 AM on June 23, 2013 [2 favorites]


To see the opening salvo from the agents of woo look no further than the first comment from the article:
Many years back I read one book of most humbug writer Richard Dawkins who showly wrote that animals have no consciousness not seen them weeping just like man.
Maybe the fight seems a bit unfair when that's the start point, but I'm game.


grabs popcorn
posted by Ned G at 11:29 AM on June 23, 2013


Nick Fury: Agent of Woo

You mean Nick Fury will be working for Jimmy Woo? Because if Jimmy's usual team is involved we get to ask "What is it like to be Gorilla Man?"
posted by justsomebodythatyouusedtoknow at 11:39 AM on June 23, 2013 [1 favorite]


Not my first time addressing this topic on the Blue.
posted by localroger at 11:41 AM on June 23, 2013


[Couple comments deleted; we can have this discussion without namecalling.]
posted by LobsterMitten at 11:41 AM on June 23, 2013


stonepharisee, for just that reason, there was a period where I briefly confused Nagel and von Uexküll.
I know a lot of people who respect PGS, but I found this piece completely unconvincing.
Descriptions are not completely powerless, though, in helping us get a grip on what the experience of another might be like. What a description can do, often very effectively, is prompt memories and guide the imagination—it can elicit memories of experiences that one has actually had and guide the construction of variations on these memories.
Right: descriptions can evoke imagination of what it might be like to be a man, a woman, a bat, an octopus, whatever. If you want to claim, however, that the subjectivity of another—especially a really foreign other kind of creature entirely—is actually not beyond our ken, you need to do better than "I can imagine what it might be like to be an X". No one will deny that you can imagine that—the imagination isn't utterly baffled by the request to imagine what it's like to be an octopus; it's not like the request to imagine what it's like to be an ashrop. But if your imagination doesn't reach some point beyond which it just stipulates "and this is all experienced octopusly", then, well, I think you've got kind of a weak imagination.* And if you think your imagination of the subjective experience of whatever is accurate and not just compelling fabulation, then all I can say is that I don't know what it's like to be so sure of oneself.

INCIDENTALLY, and possibly TMI-ly, it was no accident that I mentioned men and women before bats and octopodes above! Isn't it interesting that the reason Tiresias knows what sex is like for both men and women is that he was both a man and a woman, owing to the intervention of the gods? I—a dude—don't have to go to the weirder parts of the animal kingdom to wonder, fruitlessly, what a certain kind of subjective experience is like; I think that I will never be able to understand what vaginal intercourse is like. I can imagine something being inside me (I can have something inside me, though not the same part of me), and all that, but I just don't think that would really get it.

* Compare the problem of imagining a chiliagon, and distinguishing in imagination a chiliagon from a 999-gon. You can imaginatively represent to yourself two near-circles and in imagination stipulate that the one you imagine to be on the left is the 999-gon and the one you imagine to be on the right is the chiliagon. But you're not, I think, fulfilling the request in the right way, if that's what you do.
posted by kenko at 12:05 PM on June 23, 2013 [4 favorites]


localroger: "And in the other corner we have the agents of woo who will argue that, despite the existence of chaos theory, strange attractors, fractals, and information theory..."

How, precisely, do these branches of mathematics relate to subjective consciousness? Lay invocations of chaos theory and fractals set my woo detector beeping.

Subjective consciousness is a philosophical problem before it's a scientific one, because the subjective experiences of another conscious entity (assuming one even exists!) are inherently unobservable. Since science depends on objective observations, that's a huge difficulty. The most parsimonious assumption is that similar biological structures give rise to similar subjective experiences, but I know of no way to empirically verify such a claim.
posted by Wemmick at 12:28 PM on June 23, 2013


I—a dude—don't have to go to the weirder parts of the animal kingdom to wonder, fruitlessly, what a certain kind of subjective experience is like; I think that I will never be able to understand what vaginal intercourse is like.

This seems to me a rather startling failure of imagination.

As it happens my personal kink involves a lot of vicarious experience and most of the time I spend with my partner I'm much more concerned with her feelings than my own. Over the years my inner mental model of her -- which has to include not only her femaleness but her masochism -- has gotten quite good at predicting how she will react when I do something to her.

It probably helps that just about every bit of flesh in my bits has an analogue in hers, even if they're doing much different things. It does not seem like such a stretch to me to invert the idea of penetration into envelopment, of being filled and gripping instead of filling and being gripped, and the rather obvious thing (which was probably Tiresias' real clue) that women are not arranged so that they need to ejaculate a limited bodily resource for maximum sensation.

Anyway, if you follow my "not my first time" link you'll see that it was nonsocial wasps, let alone octopi, that convinced me of the commonality of consciousness among sufficiently complex living things. I was also heavily influenced by Erich Harth's Creative Loop theory, which shows computer simulations making the same kinds of perceptual mistakes that living things do, not because they were programmed to but because of underlying algorithmic misdirection.

Harth paints an interesting picture of the living organism as a control panel and the mind as a very sophisticated hill-climbing algorithm, which uses the organism's inputs -- senses -- to build a model of its state, including its environment, and then attempts to optimize its situation according to control variables such as feelings and emotions. I would propose that evolution found an elegant and very successful solution to that problem sometime around the Cambrian Explosion, and that the differences which have evolved since then are more of magnitude than of kind.
posted by localroger at 12:29 PM on June 23, 2013 [1 favorite]
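[Ed.: Harth's control-panel picture, as localroger summarizes it, is easy to caricature in code. Below is a minimal, purely illustrative hill climber: one sense reading, one control variable, and a "comfort" function standing in for the feelings and emotions acting as the objective. All names and numbers here are invented for the sketch, not taken from Harth.]

```python
# A toy version of the loop described above: the organism reads its
# current state, tries a small adjustment to its one control variable,
# keeps the change if "comfort" improved, and reverses direction if not.
def comfort(position):
    # Made-up objective: comfort peaks when the organism is at 3.0.
    return -(position - 3.0) ** 2

def hill_climb(position, step=0.1, iters=500):
    direction = 1.0
    for _ in range(iters):
        trial = position + direction * step
        if comfort(trial) > comfort(position):
            position = trial          # improvement: keep moving this way
        else:
            direction = -direction    # worse: reverse direction
    return position

print(round(hill_climb(-5.0), 1))     # prints 3.0
```

Even this bare-bones version shows the shape of the idea: the "mind" never sees the comfort landscape as a whole, it only compares adjacent states and climbs.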


How, precisely, do these branches of mathematics relate to subjective consciousness?

They explain how complex structure and behavior emerge from simple inputs and rules in the presence of information storage.

Subjective consciousness is a philosophical problem before it's a scientific one

Nice land grab, but I'm not letting it go unchallenged. The first question is, how do you know that you have a subjective consciousness? Well, you just do -- some fellow named Descartes knocked that one out of the park.

So, how do I know you have a subjective consciousness? Well, I know I do, and through a long period of observation I've noticed that other people tend to act the way I would act in similar situations, suggesting that they are either putting on a very convincing act or they really have an inner experience similar to my own. Considering how we all got here, the second supposition is much more convincing.

And right here consciousness just became a thing for science to study, because we have good reason to suspect it is a thing that actually exists and has certain properties, which might be malleable (either through different programming experience or physical storage layout) without altering the fundamental algorithm.
posted by localroger at 12:37 PM on June 23, 2013
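[Ed.: The "complex behavior from simple rules plus storage" claim has a standard textbook illustration from chaos theory, the logistic map: a single multiply-and-subtract rule that settles to a fixed point at one parameter value and becomes chaotic, with sensitive dependence on initial conditions, at another. A minimal sketch (parameter and seed values chosen only for the demonstration):]

```python
# Logistic map x_{n+1} = r * x * (1 - x): one line of arithmetic.
def orbit(r, x0, n):
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

stable = orbit(2.5, 0.2, 200)          # settles to fixed point 1 - 1/r = 0.6
a = orbit(4.0, 0.2, 200)               # chaotic regime
b = orbit(4.0, 0.2000001, 200)         # same rule, tiny perturbation

print(abs(stable[-1] - 0.6) < 1e-9)                    # True: orderly
print(max(abs(x - y) for x, y in zip(a, b)) > 0.1)     # True: divergence
```

The same rule, depending on one number, produces either total predictability or orbits whose tiny initial differences get amplified beyond recognition, which is the usual precise sense of "sensitive dependence."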


localroger: "They explain how complex structure and behavior emerge from simple inputs and rules in the presence of information storage."

That's a sufficient explanation for complex behavior. To go from complex behavior to subjective experience assumes that the former implies the latter.

Nice land grab, but I'm not letting it go unchallenged.

Science doesn't rest on philosophy? I'm generally of the opinion that those who claim science has dominion over philosophy are the ones making the land grab.
posted by Wemmick at 12:49 PM on June 23, 2013


To go from complex behavior to subjective experience assumes that the former implies the latter.

You seem to have this strange idea -- at least it's strange to me -- that "subjective experience" is something other than the functioning of an algorithm. I believe this is the basic mistake of the camp I called "agents of woo" in my first post; because our own personal subjective experiences seem so profound, we find it hard to accept that they are just emergent properties which result from the firing of neurons and induced synaptic growth. (And if we read Harth, most likely a lot of thermal noise too.)

Anyway, human behavior is hard to model not because it is "subjective" but because it is complex, and much of the woo-waving derives from a disbelief that simple processes like evolution and neural firing could result in something as awesome as ourselves. Yet that is exactly the chasm which is spanned by the bridge of chaos theory.

I'm generally of the opinion that those who claim science has dominion over philosophy are the ones making the land grab.

The last four hundred years have seen a regular parade of concepts picking up stakes and moving from philosophy to science to science's autistic child, engineering. One day consciousness will make that move, and it will be understood as a thing we can duplicate, modify, copy, and extend. I am as confident of this as I am that I know what is wrong with the drumfiller I'm supposed to work on tomorrow in a town 90 miles from here.
posted by localroger at 1:00 PM on June 23, 2013


"Yet that is exactly the chasm which is spanned by the bridge of chaos theory."

I'm trying to stay out of this because I agree with you that consciousness is (mostly) an emergent property of a complex system. But you're oversimplifying things, you're overconfident in your assertions, and you're mistaken about the science supporting the idea that all examples of consciousness are essentially similar.

I say this as someone who's read many books on this topic since the mid-eighties and almost did an internship at SFI.

You don't need chaos theory to make the basic argument you're making — which is that consciousness is a valid object of scientific study and that we have numerous good reasons to believe that it's not unique to humans. You certainly don't need fractals and information theory for this. Invoking chaos theory, given the vast reams of horrid pop-science written about it, just reduces your credibility.

I find Wemmick's line in the sand at subjectivity to be ... curious. If that's where he's going to draw the line, then the reasons he's drawing the line there create epistemological problems in general and the validity of a particular topic within science is among the least of concerns.

But that said, you err in the other direction, implying (if I understand you correctly) that there's some kind of universality in the nature of consciousness and thus conscious experience, on the basis of a false idea that there's a single kind of consciousness that arises as an emergent property of a sufficiently complex information processing system. And that's just wrong. Nothing in any of the topics/fields you mention supports that claim.
posted by Ivan Fyodorovich at 1:42 PM on June 23, 2013 [1 favorite]


on the basis of a false idea that there's a single kind of consciousness that arises as an emergent property of a sufficiently complex information processing system.

No, no, no, that's not what I'm saying at all. I am saying that it is an emergent property of a particular information processing system. It is a particular algorithm which we are busy not discovering because we are mistakenly convinced that it is far more complicated than it really is, because we think it is unique to very complex animals like ourselves rather than being, like sugar metabolism and photosynthesis, something that was invented very early in the evolutionary tree and maintained and extended because it works well.

I agree that chaos theory isn't necessary to make that point, but where it is useful is explaining how relatively simple processes result in both complex design and complex behavior, which is a sticking point for many people. Chaos theory explains why you do not need God for the universe to look like it does, or woo for humans to act as we do.

It is, in this particular argument, a common, deep, and persistent belief that only woo can explain the fabulousness of our intricate inner experience which drives much of the opposition.
posted by localroger at 1:55 PM on June 23, 2013


I think the chapter on Octopus (full text) from The Book of Barely Imagined Beings gives a more vivid, and I hope equally accurate, picture of the sensory inputs that an octopus has, even if it doesn't reach into Searle's Chinese Room argument quite as much as this discussion does.
posted by ambrosen at 2:00 PM on June 23, 2013 [3 favorites]


One man's chaos theory is another man's woo, localroger. If you're going to bash straw men as not understanding things as well as you do, you have an immense burden to show that you actually do understand them. Otherwise you're using science terms in a cargo cult fashion: mysticism in acquired science drag.
posted by George_Spiggott at 2:00 PM on June 23, 2013 [1 favorite]


"It is, in this particular argument, a common, deep, and persistent belief that only woo can explain the fabulousness of our intricate inner experience which drives much of the opposition."

Sure, I agree with that. Like you, I have very little patience with anthropocentric exceptionalism.
posted by Ivan Fyodorovich at 2:14 PM on June 23, 2013


George, I am a programmer. I spent several years of my spare time writing programs to explore fractal mathematics. I can assure you I am not just repeating names of cool things like a cargo cultist.

It is not woo that two lines of code draw the Mandelbrot Set. Neither is it woo that our genetic code, weighing in at around 7 gigabytes, manages to build a brain five orders of magnitude more complicated than that. The latter phenomenon (stated various ways) is frequently given as a justification for belief in God.

It isn't yet accepted, largely because of the "chaos as woo" chorus, but in time chaos theory will be seen as the most important discovery in mathematics since calculus. Calculus is what made it possible for us to explain the workings of the physical world and to perform a wave of engineering feats unparalleled in human history. Chaos is what will make it possible to explain how life came to exist without divine creation and to understand how simple and duplicable biological processes create the consciousness we use to explore that world.
posted by localroger at 2:24 PM on June 23, 2013


Mysticism: "The divine spirit is immanent in god's handiwork".
Scientism: "Consciousness is an emergent property of certain complex systems."

In either case, if you don't show your work, it's not science, it's imaginative myth-building in one vocabulary or the other. Science is about showing your work. "Emergent property" is not a free-standing concept, it's a placeholder for the math that you've done elsewhere, which you can point to.
posted by George_Spiggott at 2:24 PM on June 23, 2013


"Emergent property" is not a free-standing concept, it's a placeholder for the math that you've done elsewhere, which you can point to.

You obviously don't understand what an emergent property is. You can't show the math for an emergent property because the whole reason it's called emergent is that it emerges from a seemingly inadequate and unrelated method.

Or, to put it another way, can you "show the math" where the Mandelbrot Set comes from, starting with the algorithm that generates it?
posted by localroger at 2:28 PM on June 23, 2013 [1 favorite]
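[Ed.: For readers following along, the escape-time algorithm behind the Mandelbrot set really is this small; the intricate boundary is the emergent part. A minimal sketch (the rendering bounds and resolution are arbitrary framing choices):]

```python
# Escape-time membership test: iterate z -> z*z + c and watch whether
# the orbit stays bounded. This iteration is the entire generating rule.
def in_mandelbrot(c, max_iter=100):
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:       # once |z| > 2 the orbit provably escapes
            return False
    return True

# Coarse ASCII rendering of the set.
for im in range(-10, 11):
    print("".join(
        "#" if in_mandelbrot(complex(re / 30, im / 15)) else " "
        for re in range(-60, 21)
    ))
```

Nothing in those few lines hints at the infinitely detailed boundary the rendering produces, which is exactly the point being argued: the description of the rule and the description of what it generates live at very different levels.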


"the math" in this case was a figure of speech, a synonym -- yes, inexact -- for the "show your work" in the previous sentence. You're making a claim: that there is a category of people who do not understand the origins of consciousness the way you do. "Emergent property" is a high-level characterization that still needs to be backed up if everyone's going to agree on the dichotomy of people, and where you sit among them, that is your initial thesis in this thread.
posted by George_Spiggott at 2:33 PM on June 23, 2013 [1 favorite]


(To be honest, I have no expectation whatsoever that this thread will produce useful insights into what generates what we regard as consciousness -- such reading as I've done suggests to me that to even approach the larger issue from that direction sort of puts the cart before the horse, in part because attempting to say exact things about a very imperfectly defined term like "consciousness" is a bit silly. I'm more interested in the "here come the dumb people, who will not invoke the undeniably correct concepts I now do" kickoff, which I find hugely entertaining.)
posted by George_Spiggott at 2:46 PM on June 23, 2013


Speaking as someone who spent a couple of years writing actual fractal generators (after casino gaming simulation, this sort of thing is top of the list for "don't believe the results unless you wrote the code yourself") I can say that "emergent property" is a fairly precise thing. Having, in my case, seen lots of such properties emerge from code that had no obvious business creating them.

Now, just as I believe you probably have an inner experience similar to mine because you show other characteristics shared by myself and other humans, having seen these simple algorithms create improbably organized and complex results, it seems to me that this phenomenon which I have observed myself is a far more likely explanation for how life and consciousness work than claims that these processes are somehow exceptional and beyond the means of science to resolve.

I have also seen -- and coded my own test versions of -- Erich Harth's thalamus model which I think perfectly explains the phenomenon of "perception." I actually think the living algorithm has some self-calibrating properties which Harth's version doesn't, but it's also been tweaking itself for 500 million years. "Perception" is a substantial and very important part of the larger algorithm called "consciousness," which I think is subject to being similarly understood.
posted by localroger at 2:56 PM on June 23, 2013


Well, I'm sorry to seem rude and I don't doubt your credentials in the areas you describe in the slightest. (I'm a SW engineer of very long standing myself.) I do wonder if there isn't a tool == hammer; problem == nail thing going on here but that's not important.

I do think it's essential to understand what we mean by consciousness, and insofar as my reading has led me in the direction of any conclusion, I am absolutely in agreement with you and Ivan that anthropic exceptionalism is pure bollocks. I strongly suspect we do not differ qualitatively from other critters, but only quantitatively, in terms of the breadth and complexity of inputs and heavy use of symbolic abstraction, the conceptual and physical remoteness of many of the objects our mental models must interact with, and the significance of detailed memory to our motives. And all that is more or less an evolved system of behaviors, not divine woo as I'm sure you agree.
posted by George_Spiggott at 3:06 PM on June 23, 2013


consciousness is (mostly) an emergent property of a complex system.

Isn't this just another way of saying consciousness is magic? Seems like a massive category error to me.

the idea that consciousness is identical to (or emerged from) unconscious physical events is, I would argue, impossible to properly conceive—which is to say that we can think we are thinking it, but we are mistaken. We can say the right words, of course—“consciousness emerges from unconscious information processing.” We can also say “Some squares are as round as circles” and “2 plus 2 equals 7.” But are we really thinking these things all the way through? I don’t think so. --Sam Harris
posted by Golden Eternity at 3:09 PM on June 23, 2013


Isn't this just another way of saying consciousness is magic?

No. Within chaos theory emergence is pretty tightly defined and there are sharp limits to the types of system which can produce it. Life just happens to be such a system.
posted by localroger at 3:25 PM on June 23, 2013


The role of consciousness as a reductive symbolic interface to the body and external world is why I find the octopus's subprocessors so interesting. The octopus's consciousness just says to the arm "grab that thing" and the arm works out the details. The octopus's consciousness says "tell me about it", and the arm sends back only the interesting, distinctive aspects of the thing.
posted by George_Spiggott at 3:27 PM on June 23, 2013


(Sorry, mods, I accidentally posted [stupid badly-placed trackpad] and had to fix up a partial post in edit mode.) What the octopus does is not qualitatively different from what we do -- a tennis player is long past having to think about what his feet are doing 99+% of the time. He has non-cognitive neurological matter that has long since learned to manage those details, and only needs broad instructions. Our optic nerve and vision centers do something similar. But the physical model of the octopus provides an interesting and relatively extreme example that's very thought provoking.
posted by George_Spiggott at 3:32 PM on June 23, 2013
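[Ed.: The delegation George_Spiggott describes maps loosely onto a familiar software structure: a controller that issues only high-level goals while a subunit owns the low-level detail and reports back a compressed summary. A toy sketch, with every class, method, and number invented purely for illustration:]

```python
# "Brain" sends a goal; each "Arm" works out the motion details locally
# and returns only a compressed report, never the step-by-step detail.
class Arm:
    def grab(self, target):
        # Local, autonomous detail the brain never sees.
        steps = [f"flex segment {i} toward {target}" for i in range(8)]
        # Only the interesting, distinctive result goes back upstream.
        return {"target": target, "succeeded": True, "detail_steps": len(steps)}

class Brain:
    def __init__(self):
        self.arms = [Arm() for _ in range(8)]

    def command(self, target):
        report = self.arms[0].grab(target)   # "grab that thing"
        return f"grabbed {report['target']} ({report['detail_steps']} local steps)"

print(Brain().command("crab"))   # prints: grabbed crab (8 local steps)
```

The analogy is loose, but it captures the interface idea: the controller's experience of the action is the summary, not the execution.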


"What the octopus does is not qualitatively different from what we do -- a tennis player is long past having to think about what his feet are doing 99+% of the time."

Yeah. The enteric nervous system is a really good example for your argument. It's closely related to the argument that localroger is making because consciousness is almost certainly much more distributed than we think it is. The notion that there's some locus of consciousness situated somewhere in the brain, that consciousness itself is necessarily an integral functional unit like, say, short-term memory, that's something that people like Dennett were rightly criticizing twenty years ago.

It's tempting but probably quite wrong to imagine that the more distributed nervous function of the octopus you describe implies an alienation of the octopus's "self" from what its arm is doing. As you point out, muscle memory allows us to have reaction times and purposeful motions that happen too quickly for the brain to be involved, yet we aren't alienated from those motions. We (usually) don't think to ourselves, wow, my hand just caught that ball all on its own, I had nothing to do with it. That action and experience is integrated into our sense of self, even our sense of agency.

There's no reason that this wouldn't be as true for the octopus. Even with highly complex activity that is distributed away from the brain, there's no reason why proprioception and whatever else won't just go ahead and integrate that activity into the octopus's sense of self.

Conversely, we can see this integration broken with various medical pathologies. Which shows that the function and the experience are not neurologically isomorphic.

But that said, this isn't exactly an argument that an octopus would have a sense of integral self like we do, either. I don't think that consciousness as we understand it is a spandrel, but I do think that it's just one among many possible adaptations to similar environments.
posted by Ivan Fyodorovich at 4:29 PM on June 23, 2013 [1 favorite]


It's tempting but probably quite wrong to imagine that the more distributed nervous function of the octopus you describe implies an alienation of the octopus's "self" from what its arm is doing.

Very well put, and it was lazy framing on my part to imply anything else.

One thing I find as a programmer, and I'm guessing it's the same for you, localroger, is that this ability to offload rote tasks to non-cognitive subprocesses can extend with sufficient practice to some incredibly complex and seemingly cognitively intensive tasks: notably, making use of technical docs -- even completely new and unfamiliar ones -- with little or no unproductive conscious reading of the surrounding material. One way I find I differ from my non-programmer friends is how frequently I can just about instantly find and comprehend the contextually significant details in a dense page of text, or the significant page in a book full of material, far more rapidly than anything you might call "reading" could do it. It's like years and years of practice have taught my visual cortex how to accept broad hints about the general category of phrases and text structure to look for, and then recognize the tiny grains of useful information in a sea of textual chaff. The only people I know who can do this to anything like the same degree are corporate lawyers. I'm sure there are a lot of other fields that indirectly train people to do similar things.
posted by George_Spiggott at 5:40 PM on June 23, 2013 [2 favorites]


muscle memory allows us to have reaction times and purposeful motions that happen too quickly for the brain to be involved

Do you mean this literally?
posted by Steely-eyed Missile Man at 6:09 PM on June 23, 2013


notably, making use of technical docs -- even completely new and unfamiliar ones -- with little or no unproductive conscious reading of the surrounding material

Oh yes, I totally do this. Right before WordStar was ditched, it had a feature that let you see your whole document at an illegible level of super-demagnification, so you could only see things like chapter and paragraph structure. It was an amazing way to view a novel. You could actually see plot arcs and character development expressed in the way paragraph usage varied. We need to look at ourselves like that. What we learn when we do will change everything.
posted by localroger at 6:28 PM on June 23, 2013 [1 favorite]


The only people I know who can do this to anything like the same degree are corporate lawyers.

Well that's fucking depressing.
posted by localroger at 6:29 PM on June 23, 2013


"Do you mean this literally?"

Only partly. I thought that some very fine motor control for very familiar motions is the result of calibrations, so to speak, integrated peripherally. I can't find any references for that, however. There are reflex motions that are all local, but that seems trivial.

I have a strong memory of a discussion of some very fast adjustments in certain tasks that are quicker than round-trips to the brain. But thinking about it now that you've challenged it, I realize that I can't see how that would work.
posted by Ivan Fyodorovich at 6:40 PM on June 23, 2013


Related post.
posted by homunculus at 4:22 PM on June 24, 2013 [1 favorite]


Unusual Offshore Octopods: Lady Octopus Attracts Mates with a Glowing Kisser
posted by homunculus at 4:23 PM on June 24, 2013 [1 favorite]


Being a sandpiper: Animals have thoughts, feelings and personality, so why has science taken so long to catch up with animal consciousness?
posted by homunculus at 12:12 PM on July 4, 2013




This thread has been archived and is closed to new comments