Artificial Neural Networks with Synaptic Delays
The inclusion of delays in the synapses of a network provides an explicit way of handling the temporal dimension of the information being processed. We have applied these network topologies to different tasks, such as prediction and robot navigation.
In addition, given their pattern recognition and prediction capabilities, we have carried out substantial work on the recognition of signals without any windowing or preprocessing. One example is ECG processing; another is the prediction of future values of different chaotic series. To carry out all of these processes we have developed a generalization of the backpropagation algorithm that trains the delays as well as the weights, which we have called “Discrete Time Backpropagation” (DTB).
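As a rough illustration of the forward pass of such a network, the sketch below shows a single neuron whose synapses carry both a weight and an integer delay, so its output at time t depends on inputs at earlier time steps. This is a minimal illustrative sketch, not the authors' implementation; the function name and the tanh activation are assumptions for the example.

```python
import numpy as np

def delayed_neuron_output(x, weights, delays):
    """Output of one synaptic-delay neuron over a time series.

    y(t) = tanh( sum_i w_i * x_i(t - d_i) )

    x       : array of shape (n_steps, n_inputs), one input signal per column
    weights : synaptic weights w_i, shape (n_inputs,)
    delays  : integer synaptic delays d_i (in time steps), shape (n_inputs,)

    Inputs before t = 0 are taken as zero (illustrative boundary choice).
    """
    n_steps, n_inputs = x.shape
    y = np.zeros(n_steps)
    for t in range(n_steps):
        s = 0.0
        for i in range(n_inputs):
            ti = t - delays[i]       # each synapse reads a delayed sample
            if ti >= 0:
                s += weights[i] * x[ti, i]
        y[t] = np.tanh(s)
    return y
```

With constant unit inputs, a synapse with delay 2 only starts contributing at t = 2, which is the explicit temporal handling the text refers to; DTB would then adjust both `weights` and `delays` during training.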
Duro, R.J. and Santos, J. (2003), “Modeling Temporal Series Through Synaptic Delay Based Neural Networks”, Neural Computing & Applications, 11:224-237.
Santos, J. and Duro, R.J. (2001), “Pi Units in Temporal Time Delay Based Networks Trained with Discrete Time Backpropagation”, International Journal of Computers, Systems and Signals, 2(1).
Santos, J. and Duro, R.J. (2001), “Influence of Noise on Discrete Time Backpropagation Trained Networks”, Neurocomputing, 41(1-4):67-89.
Duro, R.J. and Santos, J. (1999), “Discrete Time Backpropagation for Training Synaptic Delay Based Artificial Neural Networks”, IEEE Transactions on Neural Networks, 10(4):779-789.
Santos, J. and Duro, R.J. (2001), “Π-DTB, Discrete Time Backpropagation with Product Units”, Connectionist Models of Neurons, Learning Processes, and Artificial Intelligence, Lecture Notes in Computer Science, 2084:207-214.