The Next Step for Neuromorphic Chips
November 20, 2016 6:33 PM

 
Optical computing has long been the great white hope of computer science.

Oooh. Really?
posted by Tell Me No Lies at 7:00 PM on November 20, 2016 [2 favorites]




Ok, I put the brakes on here:

Optical computing has long been the great white hope of computer science.

?
posted by bonobothegreat at 7:38 PM on November 20, 2016 [6 favorites]




I don't understand what this is supposed to mean, but I don't know if that's because it's nonsense or because I'm not a physicist.

It's not exactly nonsense, but it sounds like a shortcut made by someone who doesn't entirely understand it either. The relevant thing is that it's analog computing, and the greater bandwidth is leveraged to multiplex multiple analog signals onto a single channel.
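If you want the flavor of that multiplexing, here's a toy frequency-division sketch in NumPy: two analog signals sharing one channel on different carriers, then separated again. (Made-up numbers, and nothing to do with the actual hardware, which works with wavelengths of light; it's just the principle.)

    # Toy frequency-division multiplexing: two analog signals share one
    # channel on different carriers, then get separated again.
    import numpy as np

    fs = 100_000                          # sample rate, Hz
    t = np.arange(0, 0.05, 1 / fs)        # 50 ms of signal

    a = np.sin(2 * np.pi * 50 * t)        # two baseband "analog" signals
    b = 0.5 * np.sin(2 * np.pi * 80 * t)

    f1, f2 = 10_000, 20_000               # one carrier per signal
    channel = a * np.cos(2 * np.pi * f1 * t) + b * np.cos(2 * np.pi * f2 * t)

    def demux(signal, carrier_hz):
        """Mix back down to baseband, then low-pass with a moving average."""
        mixed = 2 * signal * np.cos(2 * np.pi * carrier_hz * t)
        kernel = np.ones(200) / 200       # ~500 Hz moving-average filter
        return np.convolve(mixed, kernel, mode="same")

    a_rec, b_rec = demux(channel, f1), demux(channel, f2)
    print("A recovery error:", np.max(np.abs(a - a_rec)[500:-500]))
    print("B recovery error:", np.max(np.abs(b - b_rec)[500:-500]))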
posted by Slothrup at 8:28 PM on November 20, 2016


Yeah, "great white hope" is kind of a troubled idiom that I think should have got cut by the editors.

Anyway, photonic computing has always been appealing as a way to cut optical digitizers out of communications systems, but more importantly to create systems that aren't susceptible to EMP issues. With the end of the Cold War I'm sure photonic computing research funding fell off quite a bit. A neural network is an interesting application, as it wouldn't need to be clocked, and they're all the rage these days. I'm curious how the system is trained or whether they've manually embedded node weightings in their test setup.
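If it's anything like other hardware nets where training on-chip is hard, one common dodge is reservoir computing: leave the internal weights random and fixed, and train only a cheap linear readout off-chip. Pure speculation on my part that that's what they did, but the general shape of the idea looks like this, with the 49-node count borrowed from the article:

    # Reservoir computing sketch: internal weights are random and never
    # trained; only a linear readout is fit (here by least squares).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 49                                   # node count from the article
    W = rng.normal(scale=0.1, size=(n, n))   # fixed random recurrent weights
    W_in = rng.normal(size=n)                # fixed random input weights

    def run_reservoir(u):
        """Drive the fixed network with input sequence u; collect states."""
        x = np.zeros(n)
        states = []
        for u_t in u:
            x = np.tanh(W @ x + W_in * u_t)  # weights never change
            states.append(x.copy())
        return np.array(states)

    # Train ONLY the readout to predict the next input sample.
    u = np.sin(np.linspace(0, 20 * np.pi, 1000))
    states = run_reservoir(u[:-1])
    X, y = states[100:], u[101:]             # drop the warm-up transient
    w_out, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("one-step prediction error:", np.max(np.abs(X @ w_out - y)))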
posted by GuyZero at 9:56 PM on November 20, 2016 [1 favorite]


I'm curious how the system is trained or whether they've manually embedded node weightings in their test setup.

I was wondering this too. It also seems clear that the self-modifying nature exists only as long as the power is on. So it might be great for solving a certain class of problems, but without some way to save and load its state, it's a fairly limited device. But I also think there's some handwaving by the author over 'threshold', so maybe there's a subtlety there that I'm missing.
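For what it's worth, the garden-variety meaning of 'threshold' in a neural node is just a weighted sum compared against a cutoff, as in this made-up example; whether the photonic version means something subtler is exactly what the article glosses over.

    # A plain thresholded node: fire if the weighted input sum clears a cutoff.
    import numpy as np

    def node(inputs, weights, threshold):
        return 1.0 if np.dot(inputs, weights) > threshold else 0.0

    weights = np.array([0.8, -0.3, 0.5])   # hand-embedded, made-up values
    print(node(np.array([1.0, 0.0, 1.0]), weights, 1.0))  # 1.0 (sum = 1.3)
    print(node(np.array([0.0, 1.0, 1.0]), weights, 1.0))  # 0.0 (sum = 0.2)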
posted by lumpenprole at 8:56 AM on November 21, 2016


next step is a positronic brain and we all know how that will end
But we only have two Data points.
posted by ryoshu at 9:36 AM on November 21, 2016 [6 favorites]


Tacky prose and overblown rhetoric are not uncommon even in respectable tech publications, and this may be an important development, yet...

They go on to demonstrate how this can be done using a network consisting of 49 photonic nodes.

This is a smaller chip than anything since the 1960s, and not even a prototype; it's a lab experiment designed for a paper and probably a grant proposal. Raw speed, EMP protection, integration with fiber, and heat reduction are all critical elements that would make a photonic integrated circuit a huge leap in tech progress. The neural nets behind Big Data use math and software that, recent tweaks aside, have been around a long time; it's the growth of the basic computing infrastructure that has made working with terabytes of data almost trivial. I have a small library on my Kindle; "photonic" could quite literally let you carry around the Library of Congress and analyse it on the fly. So, a clumsily worded exclamation, but perhaps the tackiness isn't unwarranted.
posted by sammyo at 9:45 AM on November 21, 2016 [1 favorite]


… but more importantly to create systems that aren't susceptible to EMP issues. With the end of the Cold War I'm sure photonic computing research funding fell off quite a bit.

I'm sure that's about to change, if it hasn't already, given the US military doctrine of "full-spectrum dominance" and the increasing EM countermeasures we're seeing from Russia, China, etc.
posted by Kabanos at 10:06 AM on November 21, 2016


The idea with photonics is that photons don't interact with each other nearly as much as electrons do. A photonic system, where the only interactions are the ones you specify, could potentially run a lot faster, and a lot more reliably, using significantly less power and generating much less heat.

If you haven't noticed, we've hit a few of silicon's limits. We're getting around them by shifting architectures, but serial execution is actually getting slower as more cores generate more heat.
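Quick back-of-the-envelope with Amdahl's law, toy numbers only, to see why the serial path still dominates:

    # Amdahl's law: speedup = 1 / ((1 - p) + p / n) for parallel fraction p
    # on n cores.
    def speedup(p, n):
        return 1.0 / ((1.0 - p) + p / n)

    for cores in (2, 8, 64, 1024):
        print(f"{cores:5d} cores, 90% parallel: {speedup(0.90, cores):5.2f}x")
    # Tops out near 10x no matter how many cores you add -- so if the serial
    # path also has to run cooler and slower, overall you can lose ground.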

The baby step, of course, is just having pure optical interconnects instead of constantly shifting between electrical and optical.
posted by effugas at 3:56 PM on November 21, 2016


Not to mention we'll probably be able to run photonic systems in 3D, and that light moves a lot faster than electrical charge even in a medium.
posted by effugas at 4:49 PM on November 21, 2016


Ok, I put the brakes on here: "Optical computing has long been the great white hope of computer science."

editor's reply to email:
"[...] we should have known better. We've changed it. Thank you for writing to us."



[first post! been reading mefi for well over a decade, and *this* is what gets me to sign up?]
posted by oban at 9:55 AM on November 22, 2016 [3 favorites]


OK, we've got optronics, the EmDrive is looking good... feeling a little better about 2017.
posted by Halloween Jack at 11:56 AM on November 22, 2016




This thread has been archived and is closed to new comments