Two approachable visual presentations of simple neural networks: one shows how a soft activation function lets the successive layers of a network distort the input until the different classes become separable; the other shows how a hard step activation function can be represented as carving out polygons in the space of inputs. Don't be intimidated by these rather condensed summaries; the actual articles are very readable.
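The "carving polygons" idea can be sketched in a few lines of NumPy. This is an illustrative toy, not code from either article, and the weights below are made up: each hidden unit with a step activation fires on one side of a line (a half-plane), and a second-layer unit that requires all of them to fire selects their intersection, here a triangle.

```python
import numpy as np

# Heaviside step activation: 1 on one side of a hyperplane, 0 on the other.
def step(z):
    return (z >= 0).astype(float)

# Hypothetical weights: three half-planes whose intersection is the
# triangle with vertices (0,0), (1,0), (0,1).
W = np.array([[ 0.0,  1.0],   # y >= 0
              [ 1.0,  0.0],   # x >= 0
              [-1.0, -1.0]])  # x + y <= 1
b = np.array([0.0, 0.0, 1.0])

def in_polygon(points):
    h = step(points @ W.T + b)        # which half-planes each point satisfies
    return step(h.sum(axis=1) - 3.0)  # output fires only if all three do (AND)

pts = np.array([[0.2, 0.2],   # inside the triangle
                [0.9, 0.9]])  # outside it
print(in_polygon(pts))        # → [1. 0.]
```

With more hidden units you get more bounding half-planes, and hence more complex polygonal decision regions; replacing the step with a soft activation like tanh blurs those sharp edges into the smooth distortions the first article visualizes.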
"OpenWorm is an attempt to build a complete cellular-level simulation of the nematode worm Caenorhabditis elegans. Of the 959 cells in the hermaphrodite, 302 are neurons and 95 are muscle cells. The simulation will model electrical activity in all the muscles and neurons. An integrated soft-body physics simulation will also model body movement and physical forces within the worm and from its environment." -- Bonus: explore the worm's cellular anatomy in 3D (WebGL required.)
SynthNet is a brain emulator. Unlike most modern software neural networks, it works at the electrochemical level: each neural structure is generated by a genetic virtual machine that executes instructions in a genetic assembly language. SynthNet is still at an early stage, but it can already emulate a classic fear-conditioning experiment.