Exponential-capacity Associative Memory
This is a collaboration between Prof. Amin Shokrollahi, Dr. Amin Karbasi (Yale), Dr. Raj K. Kumar, Dr. Lav R. Varshney (UIUC) and Prof. Wulfram Gerstner.
In this project, we focus on improving the performance of artificial neural associative memories by using techniques from graph-based error-correcting codes (such as LDPC codes) and by exploiting the inherent structure of the input patterns, in order to increase the pattern retrieval capacity from O(n) to O(a^n) for some a > 1, where n is the pattern length. The main idea is that it is much easier to memorize patterns that have some redundancy, such as natural scenes, than to memorize purely random patterns.
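To make the capacity claim concrete, here is a small Python sketch; the sizes, the integer generator matrix `G`, and the alphabet are illustrative assumptions rather than part of our model. Patterns confined to a k-dimensional subspace can number q^k, exponential in the subspace dimension, and every such pattern is annihilated by dual vectors of the kind the recall phase later exploits.

```python
import itertools
import numpy as np

# Toy illustration (hypothetical sizes): if patterns are confined to a
# k-dimensional subspace of R^n, the number of storable patterns grows
# like q**k -- exponential in the dimension -- rather than linearly in n.
n, k, q = 6, 3, 4                        # pattern length, subspace dim, alphabet size
rng = np.random.default_rng(0)
G = rng.integers(-2, 3, size=(n, k))     # random basis ("generator") matrix

# Every pattern is x = G @ m for a message m in {0, ..., q-1}^k.
patterns = {tuple(G @ np.array(m))
            for m in itertools.product(range(q), repeat=k)}
print(len(patterns), "patterns out of", q**k, "possible")

# Any dual vector w in the null space of G^T annihilates every pattern;
# linear constraints of this kind are what the recall phase checks.
w = np.linalg.svd(G.T)[2][-1]
assert all(abs(w @ np.array(p)) < 1e-8 for p in patterns)
```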
We propose an online algorithm that learns the neural graph from examples, together with recall algorithms that use iterative message passing over the learned graph to eliminate noise. We gradually improve the proposed neural model until it can correct a number of errors that grows linearly in the pattern length during the recall phase. We also propose a simple trick to extend the model from the linear to the nonlinear regime.
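The following minimal Python sketch illustrates the two phases under simplifying assumptions; `learn_constraint`, `recall`, and all parameters are hypothetical stand-ins, not the exact published algorithms (in particular, the sketch omits the sparsity constraints that keep the learned graph sparse).

```python
import numpy as np

def learn_constraint(patterns, lr=0.1, epochs=200, seed=None):
    """Learn one dual vector w with w @ x ~= 0 for every stored pattern x.
    Each step shrinks w's projection onto the current example; renormalizing
    keeps w away from the trivial all-zero solution."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(patterns.shape[1])
    for _ in range(epochs):
        for x in patterns:                     # one online update per example
            w = w - lr * (w @ x) * x / (x @ x)
        w = w / np.linalg.norm(w)
    return w

def recall(x_noisy, W, max_iter=100):
    """Bit-flipping-style recall: rows of W are learned constraints, so
    W @ x == 0 for every valid pattern. Repeatedly nudge the pattern
    neuron that the violated constraints blame the most."""
    x = x_noisy.astype(float).copy()
    for _ in range(max_iter):
        syndrome = W @ x                       # zero iff all constraints hold
        if np.allclose(syndrome, 0):
            break                              # noise eliminated
        feedback = W.T @ np.sign(syndrome)     # per-neuron blame signal
        j = int(np.argmax(np.abs(feedback)))
        x[j] -= np.sign(feedback[j])           # move one level against the blame
    return x
```

Stacking several learned dual vectors as the rows of `W` yields the bipartite constraint graph over which recall passes messages; the syndrome `W @ x` plays the same role as a parity check in an LDPC decoder.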
We also show how a neural network with noisy neurons, rather counter-intuitively, achieves better performance in the recall phase. The results of this approach are also applied to graph-based error-correcting codes to improve their performance at short block lengths, so that they approach their asymptotic performance much earlier.
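The intuition can again be sketched in Python. This hypothetical variant reuses the `recall` skeleton above; `noise_std` and the best-state bookkeeping are illustrative choices, not the published update rule.

```python
import numpy as np

def noisy_recall(x_noisy, W, max_iter=200, noise_std=0.1, seed=None):
    """Variant of recall() with internal (neural) noise injected into the
    feedback signal. The jitter lets the dynamics escape states where the
    deterministic update cycles without removing the external errors."""
    rng = np.random.default_rng(seed)
    x = x_noisy.astype(float).copy()
    best, best_err = x.copy(), float(np.abs(W @ x).sum())
    for _ in range(max_iter):
        syndrome = W @ x
        if np.allclose(syndrome, 0):
            return x                           # all constraints satisfied
        feedback = W.T @ np.sign(syndrome)
        feedback += noise_std * rng.standard_normal(feedback.shape)  # internal noise
        j = int(np.argmax(np.abs(feedback)))
        x[j] -= np.sign(feedback[j])
        err = float(np.abs(W @ x).sum())
        if err < best_err:                     # remember the best state visited
            best, best_err = x.copy(), err
    return best
```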
Some Useful Resources
- Codebase (on GitHub)