Neural Networks

A Unified Design Pattern for Continuous Parameter Optimization

Almost two years ago I developed a neural network library called MindsEye, which has largely sat idle since the release of TensorFlow. Recently, however, I’ve wanted to follow up on some neural network research, and I wanted a “pure” Java option to do it with, so I decided it was time to revive my old project. In this release I have reviewed all of the code and made many improvements.
Deblurring with TensorFlow

(Images: blurred input and deblurred output)
Recently, Google open-sourced a toolkit called TensorFlow, which provides a platform for neural networks. It has a native core written in C++ and many examples written in Python. Although the architecture is extensible and will hopefully be usable from Java/Scala application code in the future, I took some time recently to evaluate it from Python, using it to perform deconvolutions (a.k.a. deblurring), the same task I recently wrote about using my own NN library.
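
Setting the specific library aside, the core idea is to treat deblurring as an optimization problem: given an observed blurred image and a known blur kernel, search for an estimate whose re-blurred version matches the observation. Below is a minimal, hypothetical Java sketch of that idea on a 1-D signal; it is not TensorFlow code or the API of my library, just the gradient-descent formulation of deconvolution in miniature.

```java
import java.util.Arrays;

/** Toy 1-D deconvolution: recover a signal from its blurred observation by gradient descent. */
public class ToyDeconvolution {

  /** Convolve a signal with a small kernel (zero-padded at the borders). */
  static double[] blur(double[] signal, double[] kernel) {
    double[] out = new double[signal.length];
    int half = kernel.length / 2;
    for (int i = 0; i < signal.length; i++) {
      double sum = 0;
      for (int k = 0; k < kernel.length; k++) {
        int j = i + k - half;
        if (j >= 0 && j < signal.length) sum += signal[j] * kernel[k];
      }
      out[i] = sum;
    }
    return out;
  }

  public static void main(String[] args) {
    double[] kernel = {0.25, 0.5, 0.25};                  // known blur kernel
    double[] original = {0, 0, 1, 1, 0, 0, 2, 0, 0, 0};   // "true" signal (unknown to the solver)
    double[] observed = blur(original, kernel);           // what we actually get to see

    double[] estimate = new double[observed.length];      // start from a blank guess
    double learningRate = 0.5;
    for (int iter = 0; iter < 2000; iter++) {
      double[] reblurred = blur(estimate, kernel);
      // Gradient of 0.5 * ||blur(estimate) - observed||^2 w.r.t. estimate is the residual
      // convolved with the flipped kernel; the kernel here is symmetric, so blur() reuses it.
      double[] residual = new double[observed.length];
      for (int i = 0; i < residual.length; i++) residual[i] = reblurred[i] - observed[i];
      double[] grad = blur(residual, kernel);
      for (int i = 0; i < estimate.length; i++) estimate[i] -= learningRate * grad[i];
    }
    System.out.println("observed: " + Arrays.toString(observed));
    System.out.println("estimate: " + Arrays.toString(estimate));
  }
}
```
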
RE: The anatomy of my pet brain

In my last post, I talked about a new project I was working on to explore convolutional neural networks (CNNs). I’ve spent much of the time since playing with and iterating on this library, and I wanted to take a moment to share what has been built so far. I’ve ended up with a library of 30 network layer types which can be wired into an arbitrary directed acyclic (non-recurrent) graph and trained with gradient descent.
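
To make the design concrete, here is a hypothetical Java sketch of the pattern such a library follows (the names and interfaces below are illustrative, not MindsEye's actual API): every layer implements a forward pass and a backward pass, and backpropagation is just running the layers in reverse order and chaining their gradients. A simple linear chain is shown, which is the degenerate case of the directed acyclic graph described above.

```java
import java.util.Arrays;
import java.util.List;

/** A layer exposes a forward pass and a backward pass; gradients flow back through the chain. */
interface Layer {
  double[] forward(double[] input);
  double[] backward(double[] gradOutput); // returns the gradient w.r.t. this layer's input
}

/** Adds a trainable bias to each element. */
class BiasLayer implements Layer {
  final double[] bias;
  double[] gradBias; // filled in during backward, consumed by the optimizer
  BiasLayer(int size) { bias = new double[size]; }
  public double[] forward(double[] input) {
    double[] out = new double[input.length];
    for (int i = 0; i < input.length; i++) out[i] = input[i] + bias[i];
    return out;
  }
  public double[] backward(double[] gradOutput) {
    gradBias = gradOutput.clone(); // d(out)/d(bias) is the identity
    return gradOutput;             // d(out)/d(input) is also the identity
  }
}

/** Rectified linear activation. */
class ReluLayer implements Layer {
  private double[] lastInput;
  public double[] forward(double[] input) {
    lastInput = input;
    double[] out = new double[input.length];
    for (int i = 0; i < input.length; i++) out[i] = Math.max(0, input[i]);
    return out;
  }
  public double[] backward(double[] gradOutput) {
    double[] gradInput = new double[gradOutput.length];
    for (int i = 0; i < gradOutput.length; i++)
      gradInput[i] = lastInput[i] > 0 ? gradOutput[i] : 0;
    return gradInput;
  }
}

public class TinyNetwork {
  public static void main(String[] args) {
    List<Layer> layers = List.of(new BiasLayer(3), new ReluLayer());
    double[] x = {1.0, -2.0, 0.5};
    for (Layer layer : layers) x = layer.forward(x);   // forward pass in order
    double[] grad = {0.1, 0.2, 0.3};                   // gradient of some loss w.r.t. the output
    for (int i = layers.size() - 1; i >= 0; i--)       // backward pass in reverse order
      grad = layers.get(i).backward(grad);
    System.out.println("output: " + Arrays.toString(x) + ", input grad: " + Arrays.toString(grad));
  }
}
```

A full DAG version differs mostly in bookkeeping: each node waits until the gradients from all of its consumers have been summed before running its own backward pass.
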
Fun with Deconvolutions and Convolutional Neural Networks in Java

I’ve gotten to an interesting point in my latest project, inspired by Google’s fascinating recent work with convolutional neural networks. The project can now apply inverse convolution operations using multiple fitness functions. I wanted to explore the technology of image-processing neural networks from the ground up, so I started by building the fundamentals of a backpropagation neural network library. Building the basic components and solving the initial problems has been interesting, and surprisingly complex.
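
As an illustration of what "multiple fitness functions" can mean in practice, here is a hypothetical Java sketch (again, illustrative names, not the project's actual API): each fitness function reports a value and a gradient for a candidate image, and several of them can be folded into a single weighted objective that the deconvolution search then minimizes.

```java
import java.util.Arrays;
import java.util.List;

/** A fitness term reports its value and its gradient for a candidate signal. */
interface FitnessFunction {
  double value(double[] candidate);
  double[] gradient(double[] candidate);
}

/** Reconstruction term: squared distance to an observed/target signal. */
class TargetFitness implements FitnessFunction {
  private final double[] target;
  TargetFitness(double[] target) { this.target = target; }
  public double value(double[] x) {
    double sum = 0;
    for (int i = 0; i < x.length; i++) sum += 0.5 * (x[i] - target[i]) * (x[i] - target[i]);
    return sum;
  }
  public double[] gradient(double[] x) {
    double[] g = new double[x.length];
    for (int i = 0; i < x.length; i++) g[i] = x[i] - target[i];
    return g;
  }
}

/** Smoothness term: penalizes large differences between neighboring values. */
class SmoothnessFitness implements FitnessFunction {
  public double value(double[] x) {
    double sum = 0;
    for (int i = 1; i < x.length; i++) sum += 0.5 * (x[i] - x[i - 1]) * (x[i] - x[i - 1]);
    return sum;
  }
  public double[] gradient(double[] x) {
    double[] g = new double[x.length];
    for (int i = 1; i < x.length; i++) {
      double d = x[i] - x[i - 1];
      g[i] += d;
      g[i - 1] -= d;
    }
    return g;
  }
}

/** Weighted sum of several fitness functions, itself a fitness function. */
class WeightedSumFitness implements FitnessFunction {
  private final List<FitnessFunction> terms;
  private final double[] weights;
  WeightedSumFitness(List<FitnessFunction> terms, double[] weights) {
    this.terms = terms;
    this.weights = weights;
  }
  public double value(double[] x) {
    double total = 0;
    for (int t = 0; t < terms.size(); t++) total += weights[t] * terms.get(t).value(x);
    return total;
  }
  public double[] gradient(double[] x) {
    double[] total = new double[x.length];
    for (int t = 0; t < terms.size(); t++) {
      double[] g = terms.get(t).gradient(x);
      for (int i = 0; i < x.length; i++) total[i] += weights[t] * g[i];
    }
    return total;
  }
}

public class MultiFitnessDemo {
  public static void main(String[] args) {
    double[] observed = {0, 1, 0, 2, 0};
    FitnessFunction objective = new WeightedSumFitness(
        List.of(new TargetFitness(observed), new SmoothnessFitness()),
        new double[]{1.0, 0.1});
    double[] candidate = new double[observed.length];
    for (int iter = 0; iter < 200; iter++) {            // plain gradient descent on the combined objective
      double[] g = objective.gradient(candidate);
      for (int i = 0; i < candidate.length; i++) candidate[i] -= 0.5 * g[i];
    }
    System.out.println("objective = " + objective.value(candidate));
    System.out.println("candidate = " + Arrays.toString(candidate));
  }
}
```

The gradient-descent loop here plays the same role as in the deblurring sketch above; the extra terms simply steer the search toward, for example, smoother reconstructions.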