The Deep Mind of Demis Hassabis - "The big thing is what we call transfer learning. You've mastered one domain of things, how do you abstract that into something that's almost like a library of knowledge that you can now usefully apply in a new domain? That's the key to general knowledge. At the moment, we are good at processing perceptual information and then picking an action based on that. But when it goes to the next level, the concept level, nobody has been able to do that." (previously: 1,2) [more inside]
Two approachable visual presentations of simple neural networks: one showing how a soft activation function allows the successive layers of a neural network to distort the input until the different classes are separable, and the other showing how a hard step activation function can be represented as carving out polygons in the space of inputs. Don't be intimidated by the rather condensed summaries above; the actual articles are very readable.
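A rough illustration of the two pictures above, with made-up weights (this is just a sketch of the idea, not code from either article): each step-activation neuron fires on one side of a line, so ANDing three of them carves a triangle out of the plane; swap in a sigmoid and the same weights give a smooth, differentiable score instead of a hard polygon, which is what lets gradient descent train stacked layers.

```python
import math

def step(x):      # hard activation: fires (1) or doesn't (0)
    return 1.0 if x > 0 else 0.0

def sigmoid(x):   # soft activation: smooth squashing to (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

# Three step-activation neurons, each defining a half-plane a*x + b*y + c > 0.
# (Coefficients invented for illustration.)
half_planes = [
    (1.0, 0.0, 0.0),    # x > 0
    (0.0, 1.0, 0.0),    # y > 0
    (-1.0, -1.0, 1.0),  # x + y < 1
]

def in_triangle(x, y):
    """Hard version: the AND of three half-planes carves out a triangle."""
    votes = [step(a * x + b * y + c) for (a, b, c) in half_planes]
    # second-layer neuron acting as an AND gate over the first layer
    return step(sum(votes) - (len(half_planes) - 0.5))

def soft_score(x, y):
    """Soft version: same weights, sigmoid activations, smooth boundary."""
    acts = [sigmoid(10 * (a * x + b * y + c)) for (a, b, c) in half_planes]
    return min(acts)  # near 1 deep inside the triangle, near 0 far outside

print(in_triangle(0.2, 0.2))  # 1.0: inside the carved-out polygon
print(in_triangle(2.0, 2.0))  # 0.0: outside it
```

The hard version makes the "carving polygons" view literal; the soft version shows why trainable networks prefer smooth activations — the score changes gradually near the boundary, so small weight changes have small, differentiable effects.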
"OpenWorm is an attempt to build a complete cellular-level simulation of the nematode worm Caenorhabditis elegans. Of the 959 cells in the hermaphrodite, 302 are neurons and 95 are muscle cells. The simulation will model electrical activity in all the muscles and neurons. An integrated soft-body physics simulation will also model body movement and physical forces within the worm and from its environment." -- Bonus: explore the worm's cellular anatomy in 3D (WebGL required).
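To get a feel for what "model electrical activity in all the muscles and neurons" means at its simplest, here is a toy leaky integrate-and-fire neuron driving a "muscle" activation. This is only a cartoon of the neuron-to-muscle coupling; OpenWorm itself uses far more detailed biophysical models, and every constant below is invented for illustration.

```python
def simulate(input_current, steps=100, dt=1.0,
             tau=10.0, threshold=1.0, v_reset=0.0):
    """Toy leaky integrate-and-fire neuron whose spikes tense a muscle."""
    v = 0.0        # membrane potential
    muscle = 0.0   # muscle activation, driven by spikes
    spikes = 0
    for _ in range(steps):
        # leaky integration: potential decays toward rest, input pushes it up
        v += dt * (-v / tau + input_current)
        if v >= threshold:    # spike: reset potential, excite the muscle
            v = v_reset
            spikes += 1
            muscle += 0.1
        muscle *= 0.95        # muscle tension relaxes between spikes
    return spikes, muscle

quiet = simulate(0.0)    # no input: the neuron never fires
active = simulate(0.2)   # steady drive: regular spiking, sustained tension
print(quiet, active)
```

Scale this cartoon up to 302 neurons and 95 muscle cells, couple the muscle tensions to a soft-body physics engine, and you have the shape of what the project is attempting.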
SynthNet is a brain emulator. Unlike most modern software neural networks, it works at the electrochemical level. Each neural structure in it is generated from a genetic virtual machine that executes instructions in a genetic assembly language.
SynthNet is at an early stage, but right now, it can emulate a classic fear-conditioning experiment.
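For context on what a fear-conditioning experiment tests, here is the classic Rescorla-Wagner learning rule — emphatically not SynthNet's electrochemical mechanism, just the standard textbook account of the behavior, with made-up parameters: a neutral cue paired with a shock comes to predict it (acquisition), and the association fades when the cue is later presented alone (extinction).

```python
def train(v, trials, shock, alpha=0.3, lam=1.0):
    """Rescorla-Wagner: update associative strength v over repeated trials."""
    for _ in range(trials):
        us = lam if shock else 0.0   # was the unconditioned stimulus present?
        v += alpha * (us - v)        # learn from the prediction error
    return v

v = 0.0                          # cue starts with no fear association
v = train(v, 20, shock=True)     # pairing phase: cue followed by shock
acquired = v                     # near 1.0: the cue now predicts the shock
v = train(v, 20, shock=False)    # extinction phase: cue presented alone
extinguished = v                 # decays back toward 0.0
print(round(acquired, 2), round(extinguished, 2))
```

An emulator that reproduces this acquisition-then-extinction curve from electrochemical first principles, rather than from an abstract learning rule, is what makes the SynthNet demo interesting.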