Your Cortex Contains 17 Billion Computers
February 19, 2018 7:08 PM

It's a neural networks of neural networks up in your noggin. (Dr. Mark Humphries for Medium)
Your cortex contains 17 billion neurons. To understand what they do, we often make analogies with computers. Some use these analogies as cornerstones of their arguments. Some consider them to be deeply misguided. Our analogies often look to artificial neural networks: for neural networks compute, and they are made up of neuron-like things; and so, therefore, should brains compute. But if we think the brain is a computer, because it is like a neural network, then now we must admit that individual neurons are computers too. All 17 billion of them in your cortex; perhaps all 86 billion in your brain.

And so it means your cortex is not a neural network. Your cortex is a neural network of neural networks.
posted by filthy light thief (40 comments total) 26 users marked this as a favorite
 
I'm afraid this whole cortex thing may become confusing...
posted by elsietheeel at 7:20 PM on February 19, 2018 [9 favorites]


What does my jessamyn contain?
posted by Chrysostom at 7:34 PM on February 19, 2018 [40 favorites]


So this is basically saying that if you want to approximate a cortex with a deep network, you need to have alternating HUGE-small-HUGE-small layers, where the HUGE layers (the dendrites) have 10000 times the size of the small layers (the hillocks).
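A minimal sketch of what such an alternating-width stack might look like (the layer sizes here are made up for illustration; only the roughly 10000:1 ratio comes from the comment above):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (made-up) alternating widths: small "hillock" layers
# feeding HUGE "dendrite" layers roughly 10000x wider.
widths = [10, 100_000, 10, 100_000, 10]

# Random weight matrices connecting consecutive layers.
weights = [rng.normal(0, 0.01, size=(m, n)) for m, n in zip(widths, widths[1:])]

def forward(x):
    """Pass input through the alternating small/HUGE stack with a ReLU."""
    for w in weights:
        x = np.maximum(x @ w, 0.0)
    return x

out = forward(rng.normal(size=(1, widths[0])))
print(out.shape)  # (1, 10)
```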
posted by a snickering nuthatch at 7:38 PM on February 19, 2018 [1 favorite]


My jessamyn is large and contains multitudes and looks up in perfect wonder at the stars.
posted by cgc373 at 7:39 PM on February 19, 2018 [14 favorites]


Mines a lobstermitten
posted by Annika Cicada at 8:01 PM on February 19, 2018 [12 favorites]


yo dawg i heard you like neural networks
posted by Halloween Jack at 8:05 PM on February 19, 2018 [16 favorites]


Maybe one day we can figure out how to take advantage of the non-linearity inherent in the transistors we deposit on silicon. We'll probably need to get good at deeper 3D ICs, too. (The state of the art neural chip, the Google TPU, is essentially 8-bit)

Meanwhile we'll just hum the "I Dream of Jeannie" theme to ourselves over and over, with our 86 billion neurons.
posted by RobotVoodooPower at 8:09 PM on February 19, 2018 [4 favorites]


Can I use them to mine Bitcoins?
posted by Brocktoon at 8:28 PM on February 19, 2018 [8 favorites]


Meanwhile we'll just hum the "I Dream of Jeannie" theme to ourselves over and over, with our 86 billion neurons.

Speak for yourself - I'll be using my neurons to their fullest potential.
♪... that is, of course, unless the horse is the famous Mr. Ed! ♪
posted by sysinfo at 8:34 PM on February 19, 2018 [5 favorites]


Damn it. I just spent the last 15 minutes doing exactly that. Can I get a firmware update?
posted by sysinfo at 8:49 PM on February 19, 2018 [1 favorite]


Can I get a firmware update?

Certainly! It also comes with U2's latest album pre-installed.
posted by slater at 9:06 PM on February 19, 2018 [12 favorites]


Ehh... I've listened to Songs of Experience. I'm good with the Mr. Ed brainloop.
posted by sysinfo at 9:25 PM on February 19, 2018


I'm still running on mathowie 1.0... I guess I'm really overdue for an upgrade.
posted by oneswellfoop at 9:36 PM on February 19, 2018 [2 favorites]


What does my jessamyn contain?

Multitudes!
posted by Meatbomb at 9:49 PM on February 19, 2018 [2 favorites]


It gets worse! In many of the parts of the brain that aren't cortex (like the spinal cord, retina, or thalamus), those dendrites that can do their own local computations sometimes have their own synaptic outputs that don't need the whole neuron to spike. In fact, it's pretty common in the nervous system of invertebrates that dendrites have their own outputs, and right now we have no good experimental understanding of the details of when these are active. And those axons that produce most of the output? They actually have synaptic inputs sometimes, usually inhibition that can suppress the activity of a given part of the axon. So not only is one neuron more like a neuronal network in its integration, but it has qualitatively different modes of talking to the rest of the brain. Interestingly, the cortex doesn't seem to have these complications as much. I suspect this difference reflects something profound, since evolutionarily older parts of the vertebrate brain have them, but I have no idea what that would be.
posted by Schismatic at 10:05 PM on February 19, 2018 [11 favorites]


That's a lot but ya' still only use 10% ... Well one point seven billion computers is still enough for government work.
posted by sammyo at 10:25 PM on February 19, 2018


What a coincidence, I was just learning about this anime.
posted by Apocryphon at 11:41 PM on February 19, 2018 [1 favorite]


My cortex once had a mathowie but it got better.
posted by loquacious at 12:53 AM on February 20, 2018 [7 favorites]


My jessamyn is large and contains multitudes and looks up in perfect wonder at the stars.

She also looks up things on the Internet, if you ask her nicely.
posted by tommasz at 4:11 AM on February 20, 2018 [2 favorites]


Your Cortex Contains 17 Billion Computers

Is this the new "things ER docs removed from people's rectums last year" thread?
posted by Jacqueline at 4:31 AM on February 20, 2018 [4 favorites]


sammyo: “That's a lot but ya' still only use 10% ...”

Mine goes to eleven.
posted by Construction Concern at 5:24 AM on February 20, 2018


What does my jessamyn contain?

Just Cortex it.
posted by Fizz at 6:22 AM on February 20, 2018


This article isn't making sense because

- The Universal approximation theorem applies to multilayer perceptrons
- Kurt Hornik showed in 1991 that the activation function need only be non-constant, not nonlinear
- A computer is a Turing machine, no ifs/ands/buts.

But if we think the brain is a computer, because it is like a neural network, then now we must admit that individual neurons are computers too

Bullshit. We think the brain is a computer because *handwave* something something Church-Turing hypothesis. That had nothing to do with neural network theory.

Sufficiently complex neural networks of certain types are computers. A recurrent neural network with rational weights, saturated-linear activation, and finitely many nodes is Turing complete. This was also shown in the '90s.

So whether individual biological neurons are computers doesn't follow from whether a network of them makes a computer. That's just illogical reasoning. I could use the same syllogism to claim that since a whole brain is conscious, therefore individual neurons are conscious too.

It would be convenient and evocative if it turns out single neurons also satisfy some variant of the CT hypothesis, that they have the full power of computation. Maybe it happens at the molecular level where genetic processes do cool protein packaging and therefore information encoding stuff. But the article isn't about that, it's about showing that a neuron can be modeled by 2-level neural networks with nonlinear activation. That's not what a computer is.

It's fine to use neural network computational models in neuroscience. But in its effort to dispel the "transistors picture" of brain science, the author is tellingly making the same class of mistake: appeal to neurobiological structure.

And finally, by not referencing the prior body of knowledge in basic computer science, it's easy to see why this would happen. The author labors under the impression that computation is what makes a computer. It's not. It has to be arbitrary computation. An instance of a computer can represent an arbitrary class of functions; an instance of a neuron can only represent one function. That's a world of difference.
posted by polymodus at 7:11 AM on February 20, 2018 [13 favorites]


Can I use them to mine Bitcoins?

Sure! Here's your starter guide.
posted by JoeZydeco at 7:20 AM on February 20, 2018 [2 favorites]


sorry i'm late, i was something something plunkbat something markov chains something something generating fractal cryptocurrency
posted by cortex at 8:02 AM on February 20, 2018 [6 favorites]


I could use the same syllogism to claim that since a whole brain is conscious, therefore individual neurons are conscious too.

HEADLINE UPDATE: Your Brain Contains 17 Billion Brains
posted by AlSweigart at 8:31 AM on February 20, 2018


"sorry i'm late, i was ... generating fractal cryptocurrency"

I'm sitting here giggling while thinking up activities that could be a euphemism for.
posted by Jacqueline at 9:31 AM on February 20, 2018 [1 favorite]


So the territory is not the map...but instead it is lots of maps!
posted by srboisvert at 9:56 AM on February 20, 2018


Our brain evolved by layering programming on top of programming, so yeah.

That's always been the core problem with thinking of brains as our kind of computer or program. Clearly there's a physical layer (though where exactly it ends and how exactly it works still remain unsolved problems), but from a programming standpoint the simple fact is it doesn't work like any normal computer program.

We're comparing brains to neural networks (the computer kind) these days largely because that's the model that at least has some vague resemblance. But that isn't very encouraging from an understanding the brain standpoint because we don't really know how neural networks (the computer kind) work.

How does the Google ranking algorithm work? How does the Amazon recommendation algorithm work? Answer: no one knows. They're technically trade secrets, but the real truth is that they're evolved programs and their inner workings are a complete mystery even to the programmers at Google and Amazon. Google can't tell you how the ranking algorithm actually works because they don't know.

What they mean when they say it's a trade secret is that the conditions they've trained their neural network for are a trade secret. The algorithm itself is just incomprehensible.

And, if something as relatively simple as a ranking or recommendation algorithm (or a simple facial recognition algorithm, or whatever) is essentially working from emergent properties of code that is fundamentally incomprehensible then the odds of us being able to figure out the details of how the human brain works and do even simple editing anytime soon are pretty low.

Which, in a way, is good news. You don't have to worry about the big bad whoever nefariously editing your brain in the near future. Or the midrange future for that matter.

But it's bad news too. If you're, for example, an addict and you'd love to have your brain edited to make you less addictive then you're out of luck for the near future.

The sheer complexity and possibly impossible nature of figuring out how to program a brain is why so many of the people who want Robot Jesus to save them are counting on a singularity. Because they're educated enough about computers to recognize that brain programming is not merely a difficult problem, but perhaps even an impossible (for our level of sapience anyway) problem, so they're pinning their hopes on a hypothetically infinitely intelligent machine who will hand us the solutions out of the goodness of its code. I'm not even slightly convinced that's going to happen anytime soon, or possibly ever.
posted by sotonohito at 10:24 AM on February 20, 2018 [6 favorites]


I don't think that's what 'singularity' means; it's not a term implying a god computer, just a point where predicting the future becomes impossible because innovation is happening so quickly and widely that no one can keep up.

And are Google and Amazon rankings and recommendations really generated by neural networks now? I thought PageRank and product clustering were just sort of fairly basic linear algebra ideas, sped up massively and parallelized by cheap computing hardware.
posted by ver at 1:11 PM on February 20, 2018 [1 favorite]


A computer is a Turing machine, no ifs/ands/buts.

I disagree. Analog computers are computers, but are not Turing machines, as they are subject to stochastic phenomena. And arguably the sense of computation for the brain is better captured by analog, rather than digital, computation. Certainly individual neurons are much more like analog computers than digital computers.

So whether individual biological neurons are computers doesn't follow from whether a network of them makes a computer. That's just illogical reasoning. I could use the same syllogism to claim that since a whole brain is conscious, therefore individual neurons are conscious too.

That's not really the argument he's making though. His argument is: "It is generally accepted that (neural networks are computers, the brain is a neural network, therefore the brain is a computer). But individual neurons are also neural networks. Therefore under the same premise individual neurons are also computers." I think your criticism is fair that the first premise is not universally accepted, but his reasoning from that premise is sound enough.

It would be convenient and evocative if it turns out single neurons also satisfy some variant of the CT hypothesis, that they have the full power of computation.

Unless I've misunderstood you, I think you've moved the goalposts here. You stated that a computer is a Turing machine, but now you're insisting that a single neuron must be a universal Turing machine to count as a computer. The computational properties of two-layer feed-forward networks are still interesting and important, particularly when they have dynamic state, as Humphries is arguing they do. Personally I'm fine with the definition of "computer" being more general and including anything that computes, whether or not it does so as a UTM.

Personally I do actually think Humphries is overreaching somewhat by implying that individual neurons are computers in the same sense that the brain is a computer, but I don't think his argument is as prima facie nonsensical as you do. A better version of his argument would probably be something like, "If you want to model the brain as a neural network with nonlinear integrate-and-fire nodes, you need to understand that the nodes are actually dendritic processes, not neurons." But that doesn't make as catchy a story for a Medium article.
posted by biogeo at 4:41 PM on February 20, 2018 [1 favorite]


Yeah, I think "computer" is being used loosely here to mean both a Turing-equivalent system (with the typical proviso that tape length is bounded in practice) in the case of a brain and a function approximator in the case of a neuron. All the article is really saying is that individual neurons are much more closely modeled by multilayer networks than by single perceptrons, and so are more versatile function approximators than we tend to give them credit for in lay discourse.
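A toy illustration of the gap between a single perceptron and a two-layer network: XOR is representable by the latter but by no single perceptron (the weights below are hand-picked, not trained):

```python
import numpy as np

def step(x):
    """Threshold activation: fire iff pre-activation is positive."""
    return (x > 0).astype(float)

# Hand-picked two-layer network computing XOR.
W1 = np.array([[1.0, 1.0], [1.0, 1.0]])  # both hidden units see x1 + x2
b1 = np.array([-0.5, -1.5])              # unit A fires if sum>0.5 (OR), unit B if sum>1.5 (AND)
W2 = np.array([1.0, -2.0])               # output = OR minus 2*AND
b2 = -0.5

def xor_net(x):
    h = step(x @ W1 + b1)
    return step(h @ W2 + b2)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
print(xor_net(X))  # [0. 1. 1. 0.]
```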
posted by invitapriore at 5:19 PM on February 20, 2018


ver Neural net, possibly not. But evolutionary programming, yes. And complex enough that it blurs the line. If you've ever looked at evolved code, it's a total jumble of gibberish: the individual instructions are comprehensible, but the whole structure of the program isn't really something human minds can grasp. You can flow chart and try to basically reverse engineer how it works, and sometimes you can figure it out. But often you can't.

No one actually writes the algorithms for Amazon, Google, Netflix, et al. Not really.

Evolutionary programming works exactly like the name implies. You start with some code, mutate it, and test it to see how well it does whatever job you want it to do. Cull the failures, blend and mutate the survivors, and repeat, and repeat, and repeat, and repeat, and repeat. A few million generations later you'll have code that does the job, but how it works is a black box. There's meaning in there, of course, but it's emergent behavior and not easily analyzed. Any individual line of code makes sense; the program as a whole doesn't, so much.

All the humans do is design the tests to decide how the code will be evaluated. And that, by the way, is tricky as hell. If you don't watch out you can make a test that you **THINK** is training code to identify bunnies, but you didn't notice all your rabbit pictures were taken with one model of camera and all your non-rabbit pictures were taken with another so in fact you've trained the algorithm to identify pictures from a model X camera and you've wasted days or weeks or even months of computer time on bad training.
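The mutate/test/cull loop described above, in miniature. This is a toy: the fitness function, target string, and population sizes are all made up, and real systems evolve programs rather than strings, but the generational structure is the same:

```python
import random

random.seed(42)

TARGET = "hello world"
ALPHABET = "abcdefghijklmnopqrstuvwxyz "

def fitness(s):
    # Higher is better: number of positions matching the target.
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s):
    # Flip one random character to a random letter or space.
    i = random.randrange(len(s))
    return s[:i] + random.choice(ALPHABET) + s[i + 1:]

# Random starting population, then cull/mutate/repeat.
pop = ["".join(random.choice(ALPHABET) for _ in TARGET) for _ in range(100)]
for gen in range(10_000):
    pop.sort(key=fitness, reverse=True)
    if fitness(pop[0]) == len(TARGET):
        break                                  # perfect score: stop early
    survivors = pop[:20]                       # cull the failures
    pop = survivors + [mutate(random.choice(survivors)) for _ in range(80)]

print(gen, pop[0])
```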

I don't think that's what 'singularity' means, it's not a term implying a god computer, just a point where predicting the future becomes impossible because innovation is happening so quickly and widely that no one can keep up.

It's a bit of both. Yes, in general when people say "singularity" they're talking about what you said. But the machine god people also use the term when talking about designing a self improving intelligence that will start at human smart, double its brainpower, then double that in half the time, then double that in half the time, then double that in half the time, and so on, giving a runaway, faster-than-exponential intelligence curve, and you get a "singularity" in that you can't predict what happens after you have a close to infinitely smart machine.
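For what it's worth, that schedule really is a mathematical singularity: if the first doubling takes one time unit and each subsequent doubling takes half as long as the previous one, all infinitely many doublings fit inside two time units, since the total time is a geometric series:

```python
def total_time(n, first=1.0):
    """Time to complete n doublings when each takes half as long as the last."""
    return sum(first * 0.5 ** k for k in range(n))

for n in (1, 5, 20, 60):
    print(n, total_time(n))  # converges toward 2.0 * first
```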

Many of them, like Ray Kurzweil, are basically looking for robot Jesus to save them from death. Kurzweil is interesting in that he predicts a singularity (in the robot Jesus sense) arriving conveniently before his own likely age of death. And he's counting on that to bring about the tech needed to upload minds and understand brain programming so he can be immortal.
posted by sotonohito at 5:22 PM on February 20, 2018 [1 favorite]


All the article is really saying is that individual neurons are much more closely modeled by multilayer networks than by single perceptrons, and so are more versatile function approximators than we tend to give them credit for in lay discourse.

I agree that that's the main takeaway of the article. I didn't like that it oversells the headline by claiming that neurons are computers, because the article didn't do the work to show that. Nobody calls a Boolean circuit a computer: a circuit is a computational device that implies a non-uniform model of computation, i.e., a circuit complexity class. Unless the piece shows that neurons are basically universal functions, and thus equivalent to universal Turing machines, and by the extended Church-Turing thesis are for practical purposes the same as my laptop, i.e. capable of running variable programs, it doesn't fit the definition of a computer. Even my Mac dictionary has it right:

computer |kəmˈpyo͞odər|
noun
an electronic device for storing and processing data, typically in binary form, according to instructions given to it in a variable program.
• a person who makes calculations, especially with a calculating machine.

It's the difference between my Android phone, and an abacus or one of these things. I was just really surprised by the article's equivocation.
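One way to see the distinction being drawn here: a fixed circuit computes exactly one function, while even a toy "computer" takes the function it computes as part of its input. (Everything below is an illustrative sketch, not anyone's real hardware.)

```python
# A fixed "circuit": computes exactly one Boolean function, forever.
def fixed_circuit(a, b):
    return a ^ b  # hard-wired XOR gate

# A toy "computer": the function it computes arrives as data (a program).
def run(program, a, b):
    ops = {"xor": lambda x, y: x ^ y,
           "and": lambda x, y: x & y,
           "or":  lambda x, y: x | y}
    return ops[program](a, b)

print(fixed_circuit(1, 0))                 # always XOR: 1
print(run("and", 1, 0), run("or", 1, 0))   # same inputs, different programs: 0 1
```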
posted by polymodus at 6:04 PM on February 20, 2018


Maybe one day we can figure out how to take advantage of the non-linearity inherent in the transistors we deposit on silicon.

It's been done.

The structures that result share many of the characteristics that we ourselves possess that make us resistant to mass replication: nobody actually knows how they work, or whether they will continue to work under the stress of untested environmental conditions, and there is certainly no way to guarantee that what appears to be a replica on a nominally identical hardware substrate will function satisfactorily or even at all.

There is a fundamental difference between structures rooted in evolution and those rooted in engineering, and that is that evolution is absolutely no respecter of the engineering principle that requires deliberately ignoring the full spectrum of any given component's capabilities, instead using only those aspects known to be reliably replicable under a well-specified range of conditions.

I must admit to being somewhat concerned about a growing attitude in software development grounded in the view that everything we make now is so fucking complex that we might as well give up on trying to have a clue about how any of it actually works, let's just keep pasting library code together and see what sticks, and as long as it Works On My Machine we just push it out worldwide with Docker because fuck engineering, maaan, software is organic now.

Last thing we need is to be doing the same thing with hardware.
posted by flabdablet at 8:52 PM on February 20, 2018


sotonohito wait, you're saying the big companies are using software evolution in the wild now? Like the artificial life concept — randomly mutating source code abstract syntax trees and compiling the results and seeing what sticks? (I thought pretty much all code was still written by hand, even if it is designed to model a learning system.)

I read a Peter Watts novel once where evolved self-modifying programming was a plot point, first in that some evolved systems had selected for an attribute the designers hadn't intended, then (in the background) that other evolving systems had escaped and were now spamming the web with copies of their code like it was a natural ecosystem they were trying to win CPU cycles in. Scary if the preconditions for that are being set up IRL!
posted by ver at 2:12 AM on February 21, 2018 [1 favorite]


ver Yup. Unlike in the Watts novel, the code is evolved for specific purposes and mostly not evolving in the wild. They evolve it until it does whatever job well enough, then take that and deploy. The only evolution happening is on the development servers, not the actual production machines (or at least so I assume; Google et al aren't exactly open about how they're doing it, but evolution on production machines seems very unlikely).

But yes, evolved code has been used for a while now. It's been the only way to actually get some tasks done. Stuff like Amazon's recommendation algorithm exceeded the ability of human software engineers to do in a reasonable time (or perhaps to do at all) a long time back.

The thing that was really holding back evolved code was computer speed, it takes several million generations to get something worthwhile, and in the old days that'd take too long for any sufficiently complex bit of evolved code. These days a decent server farm can run those million generations in a few weeks so we're seeing more evolved code in use.
posted by sotonohito at 5:53 AM on February 21, 2018 [1 favorite]


I, on the other hand, estimate that my computer contains 17 billion cortexes (and cookies, obviously).
posted by Namlit at 7:09 AM on February 21, 2018


These particular 38 comments (mostly jokes) mean we have officially met with Metafilteruselessness on this topic.

But still, art is great and whatever. 🙄
posted by anarch at 9:04 PM on February 22, 2018


One more Cortex-related joke...
posted by oneswellfoop at 11:05 PM on February 22, 2018

