In my last post, I talked about a new project I was working on to explore convolutional neural networks (CNNs). I’ve spent much of the time since playing with and iterating on this library, and I wanted to take a moment to share what has been built so far. I’ve ended up with a library of 30 network layer types that can be wired into an arbitrary directed acyclic (non-recurrent) graph and trained via gradient descent.
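To make “wired into an arbitrary DAG” concrete, here is a minimal, hypothetical sketch; none of the class or layer names below come from the actual library, they just illustrate the wiring pattern:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch only: each node records which nodes feed it, so layers can
// branch and re-join (a DAG) rather than forming a single chain.
class LayerNode {
    final String layerType;
    final List<LayerNode> inputs = new ArrayList<>();

    LayerNode(String layerType, LayerNode... inputs) {
        this.layerType = layerType;
        for (LayerNode in : inputs) this.inputs.add(in);
    }
}

public class DagSketch {
    public static void main(String[] args) {
        LayerNode image = new LayerNode("Input");
        LayerNode conv  = new LayerNode("Convolution", image);
        LayerNode bias  = new LayerNode("Bias", conv);
        LayerNode relu  = new LayerNode("ReLU", bias);
        LayerNode sum   = new LayerNode("Sum", relu, image);  // two inputs: a graph, not a chain
        LayerNode out   = new LayerNode("Softmax", sum);
        System.out.println(out.layerType + " is fed by " + out.inputs.size() + " upstream node(s)");
    }
}
```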
I’ve gotten to an interesting point in my latest project, inspired by Google’s fascinating recent work with convolutional neural networks. The project can now apply inverse convolution operations using multiple fitness functions.
I wanted to explore the technology of image-processing neural networks from the ground up, so I started by building the fundamentals of a backpropagation neural network library. Building the basic components and solving the initial problems have been interesting, and surprisingly complex.
Happy Father’s Day!
I’ve just finished up a review of the next project in my queue: Volumetry. The README on GitHub has a bunch of pretty pictures now and should be helpful to anyone interested in the research. If it looks interesting, I encourage you to run the code yourself; the 2D images aren’t nearly as interesting as the 3D models.
Included below are some snippets from the new documentation:
Happy Friday!
I’ve just finished reviewing and updating the next project in my backlog of old research to publish. It is an experiment in how to efficiently serialize a Markov tree. I got interested in the idea when exploring some of the curious properties of a Markov tree, specifically one built from a fixed population of N-grams derived from a continuous string. It turns out that most of the data in a piece of text, if not all, can be absorbed into the Markov tree structure and then encoded in the tree’s serialized form more compactly than a straightforward encoding of the string itself!
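As a rough illustration of the starting point (this is my own sketch, not the project’s code), the fixed population of N-grams for a string can be tallied like this; because adjacent N-grams overlap by N−1 characters, those counts pin down most of the original text:

```java
import java.util.Map;
import java.util.TreeMap;

// Sketch only: count the N-grams of a continuous string. A Markov tree built
// from the same string absorbs exactly this information into its structure.
public class NGramSketch {
    static Map<String, Integer> countNGrams(String text, int n) {
        Map<String, Integer> counts = new TreeMap<>();
        for (int i = 0; i + n <= text.length(); i++) {
            counts.merge(text.substring(i, i + n), 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        // Prints {abr=2, aca=1, ada=1, bra=2, cad=1, dab=1, rac=1}
        System.out.println(countNGrams("abracadabra", 3));
    }
}
```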
NOTE: This article is only a draft, but I wanted to make sure it finally gets published in some form. Revisions and improvements (e.g. illustrations) to come.
In this article I will briefly summarize a research project I have been playing with off and on for about a year. My goal is to provide a workable introduction to my research project and a brief discussion of the theory and rationale.
A Java class can be translated into a grammar very simply: `Grammar grammar = GrammarBean.get(XmlTree.class);`
This translation happens according to a number of rules for mapping various Java types into grammar structures:
* __Terminal Classes__ – Java classes are converted into sequence elements, where each field in the class is an element in the sequence.
* __Super Classes__ – Java classes with the @Subclasses annotation become choice elements.
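A hedged sketch of how those rules might read in code: the bean classes below are invented for this example, and I am assuming `@Subclasses` takes the list of permitted subclasses as its value; only the `GrammarBean.get(...)` call pattern and the annotation name come from the post itself.

```java
// Illustrative bean hierarchy (class and field names invented for this example).
@Subclasses({Element.class, Text.class})   // super class -> choice element (assumed annotation form)
abstract class Node {}

class Element extends Node {               // terminal class -> sequence element
    String name;                           // each field becomes one item in the sequence
    java.util.List<Node> children;
}

class Text extends Node {
    String value;
}

class GrammarDemo {
    public static void main(String[] args) {
        // Derive the grammar from the annotated types, as in the post:
        Grammar grammar = GrammarBean.get(Node.class);
        System.out.println(grammar);
    }
}
```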
Today I will talk about the family of cellular automata related to Langton’s Ant. For detailed background I refer you to the excellent articles on Wikipedia, but in short an “ant” is a localized cellular automaton. It has a position on an image, and according to the color at that position it takes an action that modifies its own state, the state of the board, and its position on the board.
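For the classic two-color ant, the rule is small enough to show in full; this is my own minimal sketch rather than this project’s code:

```java
// Classic Langton's Ant on a wrapping grid: on a white cell turn right, on a
// black cell turn left; then flip the cell's color and move forward one cell.
public class LangtonsAnt {
    public static void main(String[] args) {
        int size = 100;
        boolean[][] black = new boolean[size][size];
        int x = size / 2, y = size / 2;
        int dir = 0;                              // 0 = up, 1 = right, 2 = down, 3 = left
        int[] dx = {0, 1, 0, -1};
        int[] dy = {-1, 0, 1, 0};
        for (int step = 0; step < 11_000; step++) {
            dir = black[y][x] ? (dir + 3) % 4     // black: turn left
                              : (dir + 1) % 4;    // white: turn right
            black[y][x] = !black[y][x];           // flip the cell's color
            x = ((x + dx[dir]) % size + size) % size;
            y = ((y + dy[dir]) % size + size) % size;
        }
        // After roughly 10,000 steps the ant famously settles into a repeating "highway".
        System.out.println("Ant ended at (" + x + ", " + y + ")");
    }
}
```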
Welcome to Simia Cryptus, or “yet another code monkey” blog. And happy 2012! I’d like to start off this blog with a general introduction to what I intend to post. I will post roughly once a week, and I will write about the most interesting things I can think of related to the discipline of software development. I don’t like reading walls of text, so I will try to write posts that can be read in less than 10 minutes.