The use of a pattern-based memory with high capacity and real-time recall has the potential to provide human-like performance in a wide range of applications. The present invention accomplishes this through the use of polychronous groups (PCGs). The concept of polychronization is relatively new, and little research exists on PCGs or on their use as a form of memory.
The theoretical potential of polychronization was quickly recognized within the computational neuroscience community; however, no one previously understood how to exploit such a large memory capacity for practical applications. By way of example, Izhikevich introduced the concept of polychronous groups (PCGs) and showed that PCGs respond to different inputs (See Izhikevich, Eugene M., “Polychronization: Computation with Spikes,” Neural Computation 18, 245-282 (2006), which is incorporated herein by reference). However, no published work exists on methods for determining the uniqueness of PCGs, or on making use of unique PCGs to drive the development of a full system including spike encoding and reconstruction.
Although recurrent neural networks have been studied since the 1990s, there was no practical way to control and use them until the early 2000s, when Maass and others developed their use, calling them Liquid State Machines (LSMs) (See Maass, Wolfgang; Natschläger, Thomas; and Markram, Henry, “Real-time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations,” Neural Computation 14 (11): 2531-2560 (2002); and Maass, Wolfgang and Markram, Henry, “On the Computational Power of Recurrent Circuits of Spiking Neurons,” Journal of Computer and System Sciences 69 (4): 593-616 (2004)). LSMs have a theoretical capacity bounded by N, where N is the number of neurons in the liquid. In practice, their capacity with noisy input is typically below 0.20N. Other associative memories, such as Hopfield/Grossberg networks, have a theoretical capacity of approximately 0.14N, but cannot practically be used at this capacity.
Although another research group described experiments with a small-scale network, identifying how many PCGs form under varying parameters, they did not provide a means for determining whether a PCG is unique or how to use PCGs in a memory (See Maier, W. and B. Miller, “A Minimal Model for the Study of Polychronous Groups,” arXiv:0806.1070v1 (2008)).
Yet another research group, Martinez et al., described two types of PCGs, structural and dynamical (See Martinez, R. and H. Paugam-Moisy, “Algorithms for Structural and Dynamical Polychronous Groups Detection,” in C. Alippi et al. (Eds.), ICANN 2009, Part II, LNCS 5769, pp. 75-84 (2009)). Although they defined PCGs in a useful way, they did not analyze the uniqueness of PCGs.
Iannella et al. described how a spiking neural model can be used to approximate any non-linear function (See Iannella, N. and Back, A., “A Spiking Neural Network Architecture for Nonlinear Function Approximation,” Neural Networks for Signal Processing IX, Proc. of the 1999 IEEE Signal Processing Society Workshop, August 1999, pp. 139-146). Maass et al., in turn, described a universal approximation property of liquid state machines with spiking neurons (See Maass, W., T. Natschläger, and H. Markram, “A Model for Real-Time Computation in Generic Neural Microcircuits,” in NIPS 15, 2001).
As noted above, previous researchers have not yet determined how to use PCGs in memory applications and, instead, have focused on traditional memory techniques. Current programmable computer memories typically use either random access memory (RAM) with hard storage locations or pattern-based access into distributed neural networks. RAM is efficient for storing bits and bytes of data, such as numbers, but does not efficiently store spatio-temporal patterns such as sequences of multimedia data. Google's image and video search engines and Microsoft's MyLifeBits are examples of these types of systems. Metadata must be created by humans and stored together with static data structures, such as image frames, that are combined into temporal sequences. Access to stored sequences is through sequential, or in some cases parallel, search techniques that must examine every data item stored in hard memory locations. An alternative to these approaches is distributed networks, such as Hopfield/Grossberg networks, sparse distributed memories, and recurrent networks. These methods train weights on links between nodes (neurons) using large numbers of sample patterns. Access to stored patterns is through a cue or stimulus that is input to the network, and recall fidelity depends on the signal-to-noise ratio of the input data and the completeness of the input cue. Network models have the advantage that they are capable of noise-tolerant, distributed representation of spatial patterns and parallel recall without search. However, these methods have difficulty recalling patterns outside the training set, have difficulty generalizing, and, most importantly, have low capacity to store patterns, typically about 15% of the number of nodes.
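As an illustration of the distributed-network approach described above, a minimal Hopfield-style associative memory can be sketched as follows. The network size, pattern count, and noise level here are illustrative values, not drawn from the cited works; the sketch simply shows cue-based recall and why pattern count must stay well below the roughly 0.14N capacity limit.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200  # neurons (nodes); illustrative size
P = 10   # stored patterns, well below the ~0.14*N ≈ 28 capacity limit
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian outer-product learning: W[i, j] = (1/N) * sum_p x_p[i] * x_p[j]
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)  # no self-connections

def recall(cue, steps=20):
    """Iterate synchronous sign updates from a cue until the state settles."""
    s = cue.astype(float).copy()
    for _ in range(steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1.0
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Corrupt 10% of a stored pattern, then recall from the noisy cue.
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1
out = recall(cue)
overlap = (out @ patterns[0]) / N  # 1.0 means perfect recall
```

With P well under the capacity limit, the overlap between the recalled state and the stored pattern is near 1.0; pushing P toward or past roughly 0.14N causes spurious attractors and recall failures, which is the practical capacity limitation noted above.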
Spiking neural models offer an improvement over previous distributed network models by directly encoding the temporal aspects of the input signal. The newest research on spiking models has resulted in the development of a model that has high fidelity to living neurons and synapses by incorporating conduction delays between neurons and using a learning rule known as spike-timing-dependent plasticity (STDP). These models have been shown to self-organize into polychronous groups (PCGs; time-locked but not synchronous groups). PCGs have great potential for storing large numbers of patterns, similar to the human brain. For this kind of spiking neural model with N neurons, there are N! possible PCGs in a fully connected network with delays. Computer simulations using random stimuli have already shown that there are more PCGs than neurons and potentially more PCGs than synapses.
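The time-locked-but-not-synchronous behavior described above can be sketched with a toy model: two input neurons project to a target neuron through different axonal conduction delays, and the target fires only when the presynaptic spikes are timed so that they arrive together. The delays, weight, and threshold below are illustrative assumptions, not parameters from the cited works.

```python
# Toy polychrony sketch: inputs "A" and "B" project to one target neuron
# through different conduction delays. The target fires only when delayed
# spikes ARRIVE simultaneously, so the firing pattern that activates it is
# time-locked but not synchronous.
DELAYS = {"A": 5, "B": 1}  # conduction delay (ms) from each input to the target
WEIGHT = 0.6               # synaptic weight contributed by each arriving spike
THRESHOLD = 1.0            # target fires when summed coincident input reaches this

def target_fires(spike_times):
    """spike_times maps input name -> firing time (ms).
    Returns True if delayed spikes coincide and exceed the threshold."""
    arrivals = {}
    for name, t in spike_times.items():
        arrival = t + DELAYS[name]
        arrivals[arrival] = arrivals.get(arrival, 0.0) + WEIGHT
    return any(total >= THRESHOLD for total in arrivals.values())

# A at 0 ms and B at 4 ms compensate the delays: both spikes arrive at
# t = 5 ms and the target fires.
print(target_fires({"A": 0, "B": 4}))  # True: polychronous pattern
# Synchronous firing does not work: arrivals at 5 ms and 1 ms never sum.
print(target_fires({"A": 0, "B": 0}))  # False
```

Because activation depends on the precise timing relationships among spikes, not merely on which neurons fire, a network with delays can embed a combinatorially large number of such groups, which is the basis of the N! figure cited above.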
While both PCGs and spiking neural models have been described, their properties have not been well understood, and no one heretofore has applied the technology to work backwards from the set of neural spike codes that activate a PCG, through a function approximator, to pick the best match to the existing input.
The use of a pattern-based memory with high capacity and real-time recall has the potential to provide human-like performance in a wide range of applications where current programmable machines have failed or been too limited. Thus, a continuing need exists for a reconstruction system that is operable to work backwards from a set of neural spike codes to identify a best match and reconstruct an input signal in real-time.