Google has received a flurry of media attention recently for its psychedelic neural-network image generation technique (called inceptionism). A number of fascinating links about neural network techniques have sprouted up in its wake, so I've collated some of them here with commentary.
As always, Wikipedia is the best place to start. The specialized articles on convolutional neural networks and recurrent neural networks are also worth checking out. Much of the recent buzz is due to advances in deep learning coupled with GPU techniques for training and running neural networks.
The aforementioned Google Research blog post. As far as I can tell, the images were first leaked on reddit a day or so before the blog post went up. This 2014 preprint from Microsoft Research has more neat images in a similar vein, along with insight into a related set of techniques. Honestly, the whole machine learning subreddit is pretty great.
A really great blog post on the unreasonable effectiveness of recurrent neural networks. Includes a Paul Graham text generator, a Wikipedia text generator, a LaTeX text generator, and a Linux code generator. Also includes source code if you want to build your own!
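The post above comes with full source code, so if you want the real thing, go there. As a loose sketch of the core idea only (my own toy code, with made-up hyperparameters, not the post's implementation), here is a minimal character-level RNN in plain numpy that learns to predict the next character of a short string; watching the loss drop is the whole demo.

```python
import numpy as np

np.random.seed(0)
text = "hello world "
chars = sorted(set(text))
vocab = len(chars)
ix = {c: i for i, c in enumerate(chars)}

H = 16     # hidden state size (arbitrary choice for this toy)
lr = 0.1   # learning rate

# Model parameters: input-to-hidden, hidden-to-hidden, hidden-to-output.
Wxh = np.random.randn(H, vocab) * 0.01
Whh = np.random.randn(H, H) * 0.01
Why = np.random.randn(vocab, H) * 0.01
bh = np.zeros(H)
by = np.zeros(vocab)

def onehot(i):
    v = np.zeros(vocab)
    v[i] = 1.0
    return v

def train_step(inputs, targets):
    """One forward/backward pass over the whole sequence; returns total loss."""
    xs, hs, ps = {}, {-1: np.zeros(H)}, {}
    loss = 0.0
    for t, (ci, ti) in enumerate(zip(inputs, targets)):
        xs[t] = onehot(ci)
        hs[t] = np.tanh(Wxh @ xs[t] + Whh @ hs[t - 1] + bh)
        y = Why @ hs[t] + by
        p = np.exp(y - y.max())
        p /= p.sum()                      # softmax over next-character scores
        ps[t] = p
        loss += -np.log(p[ti])            # cross-entropy against the true char
    # Backward pass: backpropagation through time over the sequence.
    dWxh, dWhh, dWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
    dbh, dby = np.zeros_like(bh), np.zeros_like(by)
    dhnext = np.zeros(H)
    for t in reversed(range(len(inputs))):
        dy = ps[t].copy()
        dy[targets[t]] -= 1
        dWhy += np.outer(dy, hs[t])
        dby += dy
        dh = Why.T @ dy + dhnext
        dh_raw = (1 - hs[t] ** 2) * dh    # backprop through tanh
        dWxh += np.outer(dh_raw, xs[t])
        dWhh += np.outer(dh_raw, hs[t - 1])
        dbh += dh_raw
        dhnext = Whh.T @ dh_raw
    for param, grad in [(Wxh, dWxh), (Whh, dWhh), (Why, dWhy), (bh, dbh), (by, dby)]:
        param -= lr * np.clip(grad, -5, 5)  # clip to keep the toy stable
    return loss

inputs = [ix[c] for c in text[:-1]]
targets = [ix[c] for c in text[1:]]
first = train_step(inputs, targets)
for _ in range(200):
    last = train_step(inputs, targets)
print(f"loss: {first:.2f} -> {last:.2f}")
```

The blog post's generators (Paul Graham, Wikipedia, LaTeX, Linux source) are exactly this idea scaled up: a bigger network, a huge corpus, and sampling from the softmax one character at a time instead of just measuring loss.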
UC Berkeley's own Vision and Learning Center hosts an excellent open-source toolkit for working with neural networks called Caffe.
A slightly processor-intensive site that performs live image generation via an adversarial neural network. Essentially, one network (the generator) tries to produce an image that "looks real" according to a second network's (the discriminator's) definition of realness.
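The adversarial game can be sketched in miniature. This is a hypothetical toy of my own, not the site's code: the "images" are single numbers, the generator is one parameter, and the discriminator is a logistic classifier. The generator is rewarded for output the discriminator scores as real, while the discriminator learns to separate real samples (drawn near 3.0) from generated ones.

```python
import numpy as np

rng = np.random.default_rng(0)
real_mean = 3.0

g_mu = 0.0       # generator parameter: the mean of its output
w, b = 0.1, 0.0  # discriminator: D(x) = sigmoid(w*x + b)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

d_lr, g_lr = 0.1, 0.02  # arbitrary learning rates for this toy

for step in range(2000):
    real = real_mean + 0.1 * rng.standard_normal()
    fake = g_mu + 0.1 * rng.standard_normal()

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0
    # (gradient ascent on the log-likelihood of the real/fake labels).
    for x, label in [(real, 1.0), (fake, 0.0)]:
        p = sigmoid(w * x + b)
        w += d_lr * (label - p) * x
        b += d_lr * (label - p)

    # Generator step: nudge g_mu so that D(fake) increases
    # (ascend log D(fake); d/d(g_mu) log D = (1 - D) * w).
    fake = g_mu + 0.1 * rng.standard_normal()
    p = sigmoid(w * fake + b)
    g_mu += g_lr * (1.0 - p) * w

print(f"generator mean: {g_mu:.2f} (real data centered at {real_mean})")
```

At equilibrium the generated distribution sits on top of the real one and the discriminator is reduced to guessing. The real site plays the same game with images and deep networks on both sides, which is where the compute cost comes from.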