US 7,321,882 B2
Method for supervised teaching of a recurrent artificial neural network
Herbert Jaeger, Koenigswinter (Germany)
Assigned to Fraunhofer-Gesellschaft zur Foederung der Angewandten Forschung e.V., Munich (Germany)
Appl. No. 10/398,914; PCT Filed Oct. 5, 2001; PCT No. PCT/EP01/11490, § 371(c)(1), (2), (4) Date May 29, 2003; PCT Pub. No. WO02/31764, PCT Pub. Date Apr. 18, 2002
Claims priority of application No. 00122415 (EP), filed on Oct. 13, 2000
Prior Publication US 2004/0015459 A1, Jan. 22, 2004
Int. Cl. G06E 1/00 (2006.01); G06E 3/00 (2006.01); G06F 15/18 (2006.01); G06G 7/00 (2006.01); G06N 3/00 (2006.01); G06N 3/02 (2006.01)
U.S. Cl. 706/30; 706/15. 41 Claims
1. A method for constructing a discrete-time recurrent neural network and training it in order to minimize its output error, comprising:
constructing a recurrent neural network as a reservoir for excitable dynamics (dynamical reservoir network);
providing means of feeding input to the dynamical reservoir network;
attaching output units to the dynamical reservoir network through weighted connections; and
training the weights of the connections only from the dynamical reservoir network to the output units in a supervised training scheme.
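The claimed steps describe what is now commonly called an echo state network: a fixed random recurrent reservoir is driven by the input, and only the reservoir-to-output weights are fitted. A minimal illustrative sketch in Python/NumPy follows; the network sizes, weight scaling, ridge regularization, and the sine next-step prediction task are assumptions chosen for the example, not details taken from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed sizes for the sketch: 1 input, 100 reservoir units, 1 output.
n_in, n_res = 1, 100

# Step 1: construct the dynamical reservoir network with fixed random weights.
# The recurrent matrix is rescaled to spectral radius 0.9 so the reservoir
# has stable, fading (excitable) dynamics.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Step 2: feed the input sequence u into the reservoir, collect states."""
    x = np.zeros(n_res)
    states = []
    for t in range(len(u)):
        x = np.tanh(W_in @ u[t] + W @ x)
        states.append(x.copy())
    return np.array(states)

# Supervised teaching task (assumed for illustration): predict the next
# sample of a sine wave from the current one.
T = 500
u = np.sin(np.linspace(0, 20 * np.pi, T + 1))[:, None]
X = run_reservoir(u[:-1])   # reservoir states, shape (T, n_res)
y = u[1:]                   # teacher signal, shape (T, 1)

# Steps 3-4: attach output units via weighted connections and train ONLY
# those reservoir-to-output weights, here by ridge regression on the
# collected states (discarding an initial washout period).
washout = 100
A, b = X[washout:], y[washout:]
W_out = np.linalg.solve(A.T @ A + 1e-8 * np.eye(n_res), A.T @ b)

pred = X @ W_out
mse = np.mean((pred[washout:] - y[washout:]) ** 2)
```

Because the reservoir weights stay fixed, training reduces to a linear least-squares fit of the readout, which is the key computational advantage the claim's restriction to reservoir-to-output weights provides.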