Abstract
We present new techniques for modeling the feedback loops of recurrent neural networks, including networks that incorporate tapped delay lines or gamma delay lines. The result is very fast, simplified programs. Applications include signal prediction and dynamic-model matching. We also suggest directions for future research on improved programs for time-series recognition and classification.
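The abstract mentions tapped delay lines and gamma delay lines as memory structures. As background, a minimal sketch of a gamma delay line is given below; the function name and parameters are illustrative, not from the paper. Each stage is a leaky integrator of the stage before it, and with mixing parameter mu = 1 the structure reduces to an ordinary tapped delay line (a pure shift register).

```python
import numpy as np

def gamma_delay_line(signal, num_taps, mu):
    """Pass a 1-D signal through a gamma delay line with `num_taps` stages.

    Each stage k updates as a leaky integrator of the previous stage:
        x_k[t] = (1 - mu) * x_k[t-1] + mu * x_{k-1}[t-1]
    where x_0[t] is the raw input.  Setting mu = 1 yields a tapped delay
    line: stage k simply holds the input delayed by k samples.
    """
    taps = np.zeros(num_taps + 1)      # taps[0] holds the current input
    history = []
    for sample in signal:
        prev = taps.copy()             # state from the previous time step
        taps[0] = sample
        for k in range(1, num_taps + 1):
            taps[k] = (1.0 - mu) * prev[k] + mu * prev[k - 1]
        history.append(taps[1:].copy())
    return np.array(history)           # shape: (len(signal), num_taps)
```

For example, feeding an impulse through the line with mu = 1 produces a pure delay: tap k fires exactly k steps after the impulse, while 0 < mu < 1 smears the impulse across time, trading temporal resolution for memory depth.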