Testing Nexus on 'NIMH' mice
August 30, 2016 12:21 AM   Subscribe

Nanowire Mesh Monitors Mouse Brains - "Injectable 'neural lace'* brain-computer interface works in mice for months at a time." (via)

*"Can we just inject electronic circuits through a needle into the brain, or other tissue, and then connect it, and then monitor? Yes, we can, and that's where we are today."
posted by kliuless (31 comments total) 14 users marked this as a favorite
 
Great, after years of wireless mice, the IEEE wants us to go back to cabled peripherals?
posted by GenjiandProust at 4:10 AM on August 30, 2016 [12 favorites]


Recent developments in nanobots for drug delivery have been in the science news; combine the tech!
posted by sammyo at 4:20 AM on August 30, 2016 [1 favorite]


Previously.
posted by I-baLL at 4:49 AM on August 30, 2016 [1 favorite]


vivisection trigger warnings would be appropriate - graphic photographs of creatures being tortured for "science".
posted by mfoight at 6:11 AM on August 30, 2016


Now if only someone posts about super-intelligent owls! All my Secret of NIMH dreams will come true.
posted by Fizz at 6:12 AM on August 30, 2016 [5 favorites]


mfoight: There are no photographs of vivisection in any of the linked articles. While you are, of course, free to object to animal research, you might be advised to get your terms right.

There is a photo of a rat with a doohickey on its head; that's about as grotesque as it gets.

I'm also curious about your scare quotes around the word science. Even if you have moral or ethical objections to animal experimentation, the research here is hardly done simply for the amusement of people who enjoy cutting rats. Their goals are clearly established, the purpose of the current round of experimentation is clearly established, this is as straightforward an example of science as you're likely to find.

You can, of course, object to the methods. But your implicit claim that no science is being done seems not to match what is going on.
posted by sotonohito at 6:25 AM on August 30, 2016 [30 favorites]


On the one hand, I like the idea of a neural lace. On the other, yeesh, the potential for abuse is huge. Even just the idea of standard malware, say an ad server infesting your visual cortex and overlaying everything with ads for Nike, is bad enough. The potential for abuse in terms of censorship (sorry Citizen, the government has decreed that the new police drones are classified, they will be edited out of your vision, as will all anti-government propaganda) and possibly other, even worse things, is scary as all get out.

OTOH, an open source, trustworthy, neural lace so I can look up things on Wikipedia, have a map overlay, a clock, and so on would be truly amazing.

Can you get realtime linkage to another person's senses eventually? Even leaving aside the obvious sex application for that (not to mention recording and playing back feelies), there are so many other nifty and useful applications. Letting the doctor link into your senses so they can feel your symptoms firsthand instead of having to rely on your description, just for starters.

And, of course, a neural lace brings actual mind mapping within reach. Not for several generations of the tech, of course, but maybe one day. If that can happen, maybe we really can back ourselves up and, after our bodies die, live on as a simulated brain in a computer-simulated environment.

That last probably won't happen soon enough to save me from death; I'm 41, and tech always takes a lot longer to roll out than we'd like to hope. I'll almost certainly die and cease to exist. But perhaps my child, or his children, will be immortal.
posted by sotonohito at 7:45 AM on August 30, 2016 [2 favorites]


Their goals are clearly established

Yes, this is just a catch up piece to make it look like we haven't been doing this for years. Their goals may be what they state, but those funding the research have entirely other goals. Outer Kazakhstan is starting to look good.
posted by Oyéah at 8:09 AM on August 30, 2016


This is a fascinating development. Thanks for the post.

Sotonohito, I agree that there is huge potential for a brain-machine interface from this work. However it strikes me that there is a huge technical gulf between "allow a person to interface with the Web" and "directly interfere and manipulate internal visual signals that both originate and terminate in a person's head". Not just technical either, but a huge difference in terms of what would have to be medically implanted.

Like any technology, I'm sure there is the potential for abuse here. Good and evil are human concepts after all. But the near-term implications look phenomenal and the downside looks distant (at least for now).
posted by Arandia at 8:40 AM on August 30, 2016 [2 favorites]


Human research.
Animal research.
No research.

When it comes to physiology - pick one.
posted by Punkey at 8:44 AM on August 30, 2016 [3 favorites]


...brain-computer interface works in mice for months at a time."

and then what? D:
posted by sexyrobot at 8:51 AM on August 30, 2016


Then you get a headache during the evening news hour for the rest of your life, no wait, that is already happening.
posted by Oyéah at 8:56 AM on August 30, 2016 [4 favorites]


Yes, this is just a catch up piece to make it look like we haven't been doing this for years. Their goals may be what they state, but those funding the research have entirely other goals.

Of course we've been doing work like this for years. This isn't a secret or something that this article is trying to cover for. Work like this is how we understand how the brain works, and how we've developed technologies like deep brain stimulation to treat Parkinson's disease and other neurological disorders. It's how we're able to advance development of new potential therapies like transcranial magnetic stimulation for treatment-refractory depression.

I don't know what you're trying to imply about the goals of "those funding the research." Why don't you say what you mean?

I'm personally excited by this technology. Anyone who does research with animals has drilled into them during their training the "three R's" of animal research ethics: whenever possible, the researcher has an ethical obligation to find ways to replace more sensitive species with less sensitive species or non-animal models where possible, reduce the number of animals used for a study and acquire more data from fewer animals, and refine their methods to minimize any pain or distress imposed on their animals. This nanoelectrode technology vastly increases the amount of information we can gain from each animal, without inducing any additional pain or distress (the brain has no nerve endings and so cannot feel these electrodes). As someone who cares a great deal about laboratory animal welfare as well as scientific advancement, I regard this as a big win.
posted by biogeo at 8:56 AM on August 30, 2016 [6 favorites]


I stand uncorrected, sotonohito.

vivisection: "the practice of performing operations on live animals for the purpose of experimentation or scientific research." The word vivisection doesn't imply a pleasure principle involved in the act. And there really are photographs of vivisection in the linked articles; thus the call to not be subjected to viewing creatures being actively vivisected, and to use an explicit/graphic tag.
posted by mfoight at 9:38 AM on August 30, 2016


Well, it's clearly post-operation. It is definitely not "being actively vivisected". Vivisection clearly happened in the past, but that is different. I mean, not to judge what you personally find objectionable, but it's a mouse with a chip glued to its skull, presumably some 8 months after it was attached. No brain tissue or even blood is visible.

Serious question - would you have similar concerns viewing pictures of a disabled person with an integrated prosthetic? Is it the flesh/chip connection that is bothering you? I can understand being bothered by that, but that is not the same thing as being bothered by a vivisection.
posted by Arandia at 10:00 AM on August 30, 2016 [1 favorite]


"Vivisection" is a loaded pejorative with an intentionally ambiguous meaning, encouraging us to apply the natural and appropriate revulsion we feel at live, unanesthetized dissections (which are no longer done for scientific research, barring some very unusual cases which automatically trigger intense oversight from regulators and veterinarians) to surgery performed by trained professionals under anesthesia. In the past, the term was even used by people who objected to human medical surgery on religious/moral beliefs that cutting into a human is wrong even if done to help them. The term is still used by some people to describe the surgeries done on animals by veterinarians and physicians as part of their surgical training.

Animal research is an important and difficult ethical question. Most people ultimately are in favor of it if it is done with appropriate care and oversight. I believe it's important to be able to discuss this issue with a clear-headed moral assessment of the facts as they are, not on the basis of falsehoods and distortions. Terms like "vivisection" distort the moral landscape of this issue and make it much more difficult to discuss.
posted by biogeo at 10:20 AM on August 30, 2016 [9 favorites]


[mfoight, you've made your point, and there's now a description of the photos in the thread for folks who might be bothered by what they show. Let's leave the general sidebar over animal research now.]
posted by LobsterMitten (staff) at 10:27 AM on August 30, 2016 [2 favorites]


No one seems to have addressed the really important issue here, which is, is this a possible backdoor into the Culture?
posted by Halloween Jack at 10:31 AM on August 30, 2016 [3 favorites]


No one seems to have addressed the really important issue here, which is, is this a possible backdoor into the Culture?

Ya, very possibly. I think Sotonohito was alluding to something like this above. Obviously there is still a long way to go, but a stable and reliable brain-machine interface (BMI) is one of the biggest obstacles for such a development.

Obviously just personal speculation on my part, but what happens when a personal and reliable BMI is developed? Well, one of the simplest applications is being able to look up things like photos from your computer instantly. When it becomes as easy to view a photo as to recall a memory, I think in a very real way the photos accessible over the BMI will become our memories. That is to say: we will have successfully uploaded a portion of our brain onto the computer.

Other brain functions are clearly a different matter, but I see no compelling reason (given sufficient time) that the end result of this won't be the eventual migration of our consciousness out of our flesh-and-blood bodies.
posted by Arandia at 10:47 AM on August 30, 2016


I think one of the most exciting things about this method is mentioned in the Nautilus article:
That this mesh-like structure, which can be injected because it has size, scale, and mechanical properties very similar to the neural network, or neural tissue, turns out to have no immune response, which is unheard of.
(My emphasis.) This is actually really huge, if it holds up in species other than mice. One of the big challenges with neural prostheses is that the brain's immune system (which is a bit different from the rest of the body's immune system) identifies the implant as a foreign object and responds by basically walling it off. This isn't too damaging to the brain tissue, but it ruins the electrical connection between the electrodes and the neurons. For stimulators, this actually isn't a big deal, as they don't need to target individual cells. But for neural prostheses used for brain-machine interfaces (currently in development to restore motor function to people with spinal cord damage), this basically provides a hard limit on the lifetime of the implant. No immune response means that this limitation is lifted.
posted by biogeo at 10:59 AM on August 30, 2016 [4 favorites]


> ...brain-computer interface works in mice for months at a time."

and then what? D:


The natural lifespan of mice is only about two years. "Months at a time" in this case basically means "the lifetime of the animal."
posted by biogeo at 11:02 AM on August 30, 2016


Iain M. Banks coined the term "neural lace," did he not? That people are referring to this as a neural lace seems to me to indicate that, yeah, parallels with the Culture are part of what makes this so interesting.

The potential for abuse with such a neural prosthetic is immense as well, as Banks himself pointed out. They are described in Excession as being possibly the most efficient and comprehensive tool for torturing a humanoid being, if turned to that use.

Of course, this is only a small—though remarkable—step toward such interfaces. It would be pretty incredible though if this turned into a usable technology in my lifetime. It's a technology that would need to be developed and deployed with utmost caution, but the potential for good here is enormous.
posted by Anticipation Of A New Lover's Arrival, The at 11:03 AM on August 30, 2016 [2 favorites]


Personally, I doubt that we are going to see this kind of technology implanted in healthy people in our lifetimes, if ever. But I think it's very likely we'll see something similar to this used to create neural prostheses for people with sensory or movement disorders in the relatively near term, maybe even as soon as the next decade. Less sophisticated electrodes are already in use for such prostheses, so there is fertile ground for this technology to make a big impact. At a somewhat more distant horizon, perhaps two or three decades from now, it's entirely possible that this could be used to treat certain cognitive disorders (e.g., severe, treatment-refractory depression) with more specificity than current deep brain stimulators can achieve. Further out yet, I can see this as part of a research program developing neural recording technologies working with similar resolution but noninvasively. That is something that would very likely be applied to healthy people, and the ethical implications are something we need to consider very carefully.
posted by biogeo at 11:16 AM on August 30, 2016 [3 favorites]


> no immune response, which is unheard of.

(My emphasis.) This is actually really huge


Eh... that's more the simplified "they lied to you in high school" kind of explanation.

There are absolutely immune responses in neural tissue. The actual thing is that the brain (in virtually all organisms, there might be outliers) is "immunologically privileged" (along with testes and the part of the kidney under the adrenal cap, iirc).

That is, there are very few - if any - adaptive immune cells (white blood cells, of many different varieties) in these privileged tissues that can survey, present potentially antigenic epitopes that can be scanned, and then elicit cytotoxic responses.

However, there are plenty of glial cells in the brain that express a spectrum of different Toll-like receptors (among other innate immune receptors) that can detect the pathogen/damage associated molecular patterns from microbes (and viruses) and produce an inflammatory response.

So, if you inject living (sometimes dead) cells from one individual into another (this is especially dramatic if the species are different), antigen-presenting cells will sample them, present some bits, and if a cytotoxic cell comes by and recognizes the epitope as "dangerous", it will go HAM on it.

If you inject those same cells into the brain (or under the adrenal cap) those cells won't get sampled and won't get attacked.

Lipopolysaccharide (LPS) is a P/DAMP and a major component of gram-negative bacterial cell walls. It is detected by TLR4 receptors and CD14, and glia in the brain are fully capable of detecting LPS and mounting an inflammatory response.

For example, the inflammatory symptoms of viral and bacterial meningitis are partly an innate immune response by glial cells recognizing that the brain has a viral or bacterial infection.
posted by porpoise at 3:06 PM on August 30, 2016


You're right, the brain's immune system is definitely different from the rest of the body's, thanks to the blood-brain barrier, and it largely lacks an adaptive immune system. But the brain is perfectly capable of mounting an innate inflammatory response in a variety of contexts, including to electrodes, and this is absolutely a major challenge for brain-machine interfaces. I'm not extrapolating from high school biology here; this is something I've personally seen happen.

More specifically, you get a buildup of glial cells around the electrode as the body attempts to wall off the foreign object. This isn't especially harmful to the nervous tissue, but these glial cells are great electrical insulators, and they drive the impedance of your electrodes way up. The term electrophysiologists use for this is gliosis. An electrode that provokes no immune response, and thus prevents gliosis, really is a huge deal.

Currently, the best strategy for dealing with gliosis is to rely on recording techniques that do fine with high-impedance, insulated electrodes, but those techniques lack the resolution to identify individual neurons. It turns out you can do quite a lot even with this so-called "multi-unit activity," but there are a variety of reasons why single-unit recordings are advantageous.
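To make the gliosis point concrete, here's a toy Python sketch (synthetic numbers, nothing from the paper): when glial encapsulation attenuates the coupling between neuron and electrode, spikes that once cleared a fixed detection threshold disappear into the noise.

```python
import random

random.seed(0)

def detect_spikes(trace, threshold):
    """Count negative threshold crossings (a crude multi-unit detector)."""
    count = 0
    below = False
    for v in trace:
        crossing = v < -threshold
        if crossing and not below:
            count += 1
        below = crossing
    return count

def synthetic_trace(spike_amp_uv, n_spikes=50, n_samples=30000, noise_uv=10.0):
    """Gaussian noise plus fixed-amplitude negative-going spikes."""
    trace = [random.gauss(0.0, noise_uv) for _ in range(n_samples)]
    for i in random.sample(range(n_samples), n_spikes):
        trace[i] -= spike_amp_uv
    return trace

threshold = 50.0  # uV, about 5x the noise s.d.
healthy = detect_spikes(synthetic_trace(spike_amp_uv=100.0), threshold)
gliotic = detect_spikes(synthetic_trace(spike_amp_uv=30.0), threshold)  # attenuated coupling
print(healthy, gliotic)  # the gliotic electrode misses most spikes
```

Real spike detection runs on filtered multi-sample waveforms rather than single samples, but the thresholding logic is the same.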
posted by biogeo at 3:36 PM on August 30, 2016


Let's leave the general sidebar over animal research now

Well, a general argument over it may not be called for. But surely discussing the ethics of this research is on topic for this discussion? I'm not convinced this is ethically defensible from my point of view, although that is obviously a subjective call. Doing general research on animals is tricky, because the goal is some future project enabled by this. It certainly isn't helping the mice, and it may or may not be helping us. It's not really clear how it affects the mice. I'm sure it went through review, but of course the threshold for review is itself a product of ethical decisions that rely on essentially subjective value judgements about what is an acceptable amount of harm to cause in pursuit of knowledge.
posted by thefoxgod at 5:13 PM on August 30, 2016 [2 favorites]


I wish there were some scale for hype-to-substance ratio, maybe separate ones for "research benefit" and "practical applications," so we could easily tell whether we're all on the same page about what's crazy-but-fun speculation and what's realistic.

So this seems really to have a lot of substance, but (as you can guess) I think anything practical is guesswork, probably requiring something like six more breakthroughs on the same level as this one to get to functionality. My takeaway on this part from the IEEE:
Statistical analysis of the signals they recorded showed that they were picking up activity from individual neurons, and that they could follow the same neurons over time. This ability could provide neurologists with a detailed map of what’s going on in, say, the visual cortex during learning, or let them watch the process by which memories are formed and how that process degrades with age.
Notice the two-part split. At present you can (sometimes?) resolve signals from individual neurons, but clearly you can't pick the neurons. You can't look at connections between neurons directly, or query a neuron and see if it did something. I assume you're basically getting a random sampling from regions of the brain, with the ability to tell whether a specific neuron fires repeatedly in different contexts being the most significant improvement over current methods.
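As a toy illustration of what "following the same neurons over time" can mean (entirely made-up amplitudes, not data from the study): if a unit's spike waveform stays stable across sessions, a feature learned on day one still identifies the same unit days later.

```python
import random
from statistics import mean

random.seed(1)

def record_session(n=200):
    """Spike peak amplitudes (uV) from two hypothetical units, A and B."""
    unit_a = [random.gauss(60.0, 5.0) for _ in range(n)]   # unit A: ~60 uV spikes
    unit_b = [random.gauss(120.0, 8.0) for _ in range(n)]  # unit B: ~120 uV spikes
    return unit_a, unit_b

# Session 1: learn an amplitude boundary separating the two clusters.
a1, b1 = record_session()
boundary = (mean(a1) + mean(b1)) / 2.0

# Session 2, days later: a stable electrode means the same boundary
# still assigns spikes to the same units.
a2, b2 = record_session()
a_consistent = sum(x < boundary for x in a2) / len(a2)
b_consistent = sum(x > boundary for x in b2) / len(b2)
print(a_consistent, b_consistent)  # both close to 1.0
```

Actual unit tracking uses full waveform shape and firing statistics, not one scalar, but the session-to-session matching idea is the same.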

This might, as the second part says, improve the mapping of what happens with aging, but even that diagnostic use is speculation; it's a new tool to query neural activity. It seems highly likely it will discover new things (the substance part, which I think is real), but is it guaranteed to be more useful than existing tools?

The Nautilus spin (that this is a key step in defending us from the AI that Elon Musk is scared of but that doesn't actually exist) just annoys me no end. Good science shouldn't need crap made up about it to be interesting.

In terms of the more modest predictions in-thread, if anyone wants to do a bet that in the next fifteen years there are no human brain deployments of this tech (even for research) and the maximum realized upside is "improved understanding of brain function from animal studies*", you can send me a MeMail. I'll give you 12:1 odds, too**. (I'd need to think more about the odds I'd give on the prosthesis idea and what counts as "this tech" in that context, but basically it seems unlikely to me too.)

*Just in case I'm not clear, that is a pretty awesome upside, no?

**If you're wondering, the specific odds are based not on careful Bayesian analysis but on the fact that I'm having a glass of wine as I type. A bottle of wine vs. a case seems about fair.
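Converting that offer into an implied probability is standard odds arithmetic (my gloss, not a claim from the thread): laying 12:1 against human deployment prices the event at about 1 in 13.

```python
# 12:1 odds against an event imply a probability of 1 / (12 + 1).
against, for_ = 12, 1
p_event = for_ / (against + for_)
print(round(p_event, 3))  # 0.077, i.e. roughly an 8% implied chance
```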

posted by mark k at 9:13 PM on August 30, 2016


if anyone wants to do a bet that in the next fifteen years there are no human brain deployments of this tech

I'll play - but it will all come down to definitions. With my loose definitions, I'd say there'll be pre-clinical trials possibly within 5 years and definitely within 10 years. I would not be surprised if a crude version (even looser definition) of this has already been done. I know that it has been tried for peripheral nerves, but with no notable success to date.

With strict definitions ("reading" [and therefore recording] consciousness and "replaying" experience via direct electrophysiological manipulation of neurons), it's not something that's going to happen short of a revolution in miniaturization (atomic-scale programmable robots that can transmit data without perturbing what they observe). And even then, it would need revolutions in multiple disciplines.
posted by porpoise at 9:41 PM on August 30, 2016


But surely discussing the ethics of this research is on topic for this discussion?

Well, I understand and respect the mods' desire not to have this turn into a general discussion about animal research ethics, but to the extent we can discuss the ethics of this specific research, I agree it's a valuable discussion to have. But given how difficult it can be to have discussions around animal research, I'll understand if the mods would prefer us not to go there.

I'm not convinced this is ethically defensible from my point of view, although that is obviously a subjective call. Doing general research on animals is tricky, because the goal is some future project enabled by this. It certainly isn't helping the mice, and it may or may not be helping us.

In my opinion, this is actually one of the more straightforward cases of clear benefit in animal research. There is a demonstrable scientific need that affects a huge number of research projects: namely, that our electrodes seriously limit the amount of data we can collect, over both space (limited number of neurons recorded) and time (limited lifetime of the electrode). These electrodes promise a dramatic advance on both of those problems. I can't really overstate how important that is. Technical discussion of electrodes is one of the most common shop-talk topics in the electrophysiology community, for good reason.

Being able to record hundreds of individual neurons simultaneously for months at a time gives us a valuable new tool for investigating how neural computations are performed, how memories are formed and recalled, how motor plans are prepared and executed, and so on. From a basic science perspective, this is a huge deal for the neuroscience of sensory systems, motor systems, learning and memory, emotion, and neural computation, to name a few. But more pragmatically, electrodes of this type have more immediate potential to improve human health, by improving neural prostheses for people with full or partial paralysis, and even visual prostheses for people with optic nerve damage.
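A back-of-envelope Python sketch of the data volumes at stake (channel count, sampling rate, and sample width are assumed typical values for extracellular recording, not figures from the paper):

```python
# Raw data rate for chronic recording of a few hundred channels.
channels = 300            # "hundreds of individual neurons"
sample_rate_hz = 30_000   # a common extracellular sampling rate
bytes_per_sample = 2      # 16-bit ADC

bytes_per_second = channels * sample_rate_hz * bytes_per_sample
gb_per_day = bytes_per_second * 86_400 / 1e9
print(round(bytes_per_second / 1e6, 1), "MB/s,", round(gb_per_day), "GB/day")
# 18.0 MB/s, 1555 GB/day: months of this is why stable electrodes matter so much
```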

And, of course, although we typically describe (and fund) research of this type in terms of implications for human health, the reality is that the biomedical advances developed also improve veterinary medical practice, as well.

Its not really clear how it affects the mice.

During neurosurgery, human patients are often kept awake while surgeons record neural activity, to help localize specific regions of brain tissue which may need to be removed or spared. When these electrodes are purely recording, people don't notice anything, because the brain lacks nerve endings. When stimulating, surgeons can produce a variety of fascinating effects depending on the site, most of which people describe as weird but harmless. In this case, the mice were implanted with electrodes in somatosensory cortex (which produces the sense of touch) and the hippocampus (involved in memory). Stimulation of somatosensory cortex is usually described by neurosurgery patients as producing tingling or numbness, or phantom touch perceptions depending on the stimulation parameters. Stimulation of the hippocampus produces a variety of weirder effects, including confusion and a dreamlike state. The primary purpose of this research was not to stimulate the brains of these mice, but some stimulation was included to assess the function of the electrodes. We might infer, then, that at most, the mice experienced something similar to the effects described by human neurosurgery patients: that is, mostly nothing, and occasionally some weird phenomena.

Of course, we can't know what mice really experience, but reports from humans who've experienced similar circumstances are the best and only basis we have for making an inference. In this case, I think we have good reason to believe that the answer to the question of how these electrodes are affecting the mice is "probably not much."
posted by biogeo at 10:40 PM on August 30, 2016 [1 favorite]


With my loose definitions, I'd say there'll be pre-clinical trials possibly within 5 years and definitely within 10 years

"Pre-clinical" is indeed pretty loose. What do you mean by that? Arguably the published finding, of "no immune response," is something like pre-clinical.

If you mean human-study enabling efficacy trials, no. (Efficacy trials not intended to validate for humans, sure--success or failure, who knows.)

I know that it has been tried for peripheral nerves but no notable success to date.

Unless I'm missing something (totally possible), this was just published as a new finding, and it is indeed novel; they are merely talking about passive uptake of electronics in a non-directed, non-therapeutic way. Unless you are in contact with researchers in the group, are you referring to other similar tech not being discussed here?
posted by mark k at 11:13 PM on August 30, 2016


It'll take many decades before healthy individuals can be enhanced using such technologies, partially because making enhancements sounds hard. It's a completely different matter for treating psychological disorders characterized by large-scale brain malfunction, though; even serious OCD is a candidate.
posted by jeffburdges at 2:25 PM on August 31, 2016




This thread has been archived and is closed to new comments