A key challenge for neural modeling is to explain how a continuous stream of multi-modal input from a rapidly changing environment can be processed in real time by stereotypical recurrent circuits of integrate-and-fire neurons. One alternative computational model for real-time computing on time-varying input, which does not rely on paradigms based on Turing machines or attractor neural networks, is the "Liquid State Machine" described in W. Maass et al., "Real-time computing without stable states: A new framework for neural computation based on perturbations," Neural Computation, vol. 14, no. 11, pp. 2531-2560, 2002. This model does not require a task-dependent construction of neural circuits.
FIG. 1 illustrates a model of a Liquid State Machine neural processor 10, wherein a function of time (time series) 12 is injected as input into a "liquid filter" 14, creating at any given time t a "liquid state" 16 which is transformed by a memory-less readout map 18 to generate an output 20.
The input function 12 can be a continuous sequence of disturbances, and the target output 20 can be some chosen function of time that provides a real-time analysis of this sequence. In order for the Liquid State Machine 10 to map input functions of time 12 to output functions of time 20, the Liquid State Machine generates, at every time t, an internal "liquid state" 16, which constitutes its current response to preceding perturbations, i.e., to preceding inputs. In contrast to the "finite state" of a finite state machine (or finite automaton), the liquid state 16 consists of analog values that may change continuously over time.
Importantly, whereas the state set and the state transition function of a finite state machine are in general constructed for a specific task, the liquid states and the transitions between them need not be customized for a specific task. In a physical implementation this liquid state can consist of all information about the current internal state of a dynamical system that is accessible to the readout modules.
In mathematical terms, this liquid state is simply the current output of some operator or filter 14 that maps input functions 12 onto functions 16. In contrast to the liquid filter 14, the readout map 18 can in general be chosen in a task-specific manner. There can actually be many different readout maps that extract different task-specific information in parallel from the current output of filter 14.
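The mapping just described, from an input function of time through a fixed liquid filter to a memory-less readout, can be illustrated by a toy software sketch. The sketch below replaces spiking integrate-and-fire neurons with continuous leaky tanh units for brevity; the function names `liquid_filter` and `readout`, and all parameter values, are illustrative assumptions and not part of the model as published.

```python
import math
import random

def liquid_filter(inputs, n_neurons=20, leak=0.3, seed=0):
    """Toy 'liquid filter' 14: a fixed random recurrent network of leaky
    analog units (a simplification; the published model uses spiking
    neurons). Returns the sequence of liquid states 16, one state vector
    per input time step. The weights are random and task-independent."""
    rng = random.Random(seed)
    w_in = [rng.uniform(-1.0, 1.0) for _ in range(n_neurons)]
    w_rec = [[rng.uniform(-0.5, 0.5) for _ in range(n_neurons)]
             for _ in range(n_neurons)]
    x = [0.0] * n_neurons
    states = []
    for u in inputs:
        # Each unit integrates the input and the recurrent feedback,
        # with a leak term so the state changes continuously over time.
        pre = [w_in[i] * u + sum(w_rec[i][j] * x[j] for j in range(n_neurons))
               for i in range(n_neurons)]
        x = [(1.0 - leak) * x[i] + leak * math.tanh(pre[i])
             for i in range(n_neurons)]
        states.append(list(x))
    return states

def readout(state, weights):
    """Memory-less readout map 18: the output 20 depends only on the
    current liquid state, not on past states."""
    return sum(w * s for w, s in zip(weights, state))
```

Note that only `readout` is task-specific: several different readouts with different weight vectors can be applied in parallel to the same sequence of liquid states.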
B. Schrauwen, "Compact hardware for real-time speech recognition using a Liquid State Machine," Proceedings of the International Joint Conference on Neural Networks, Orlando, Fla., USA, Aug. 12-17, 2007, discloses an implementation of real-time, isolated digit speech recognition using a Liquid State Machine as a recurrent neural network of spiking neurons where only the output layer is trained. This implementation provides a scalable, serialised architecture that allows a compact implementation of spiking neural networks that is still fast enough for real-time processing. However, this implementation is based on digital circuitry and is not suited for low power applications.
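The point that only the output layer is trained can be illustrated in software: given recorded liquid states and target outputs, linear readout weights can be fitted while the recurrent network stays fixed. The sketch below uses a simple least-mean-squares update; the helper name `train_readout` and all hyperparameters are illustrative assumptions and do not describe the hardware implementation of the cited paper.

```python
def train_readout(states, targets, lr=0.05, epochs=200):
    """Fit ONLY the linear readout weights by stochastic gradient descent
    on squared error. The liquid filter that produced `states` is never
    modified, mirroring the readout-only training scheme. (Illustrative
    sketch, not the cited hardware design.)"""
    n = len(states[0])
    w = [0.0] * n
    for _ in range(epochs):
        for x, y in zip(states, targets):
            y_hat = sum(wi * xi for wi, xi in zip(w, x))
            err = y - y_hat
            # Delta rule: move each weight along the error gradient.
            for i in range(n):
                w[i] += lr * err * x[i]
    return w
```

Because training touches only the readout, the same fixed liquid can serve several tasks at once, each with its own trained readout weight vector.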
There exists a need for an implementation of a Liquid State Machine suitable for low power applications.