Hava T. Siegelmann
Analog recurrent neural networks have recently attracted much attention as powerful tools for automatic learning. We formally define a high-level language, called NEural Language (NEL), which is rich enough to express any computer algorithm or rule-based system. We show how to compile a NEL program into a network that computes exactly what the original program computes and requires the same computation time. We suggest this language, along with its compiler, as the ultimate bridge from symbolic to analog computation, and propose its output as an initial network for learning.
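To make the compilation idea concrete, here is a minimal sketch of how a symbolic rule can be realized exactly by neurons with a saturated-linear activation, the activation commonly used in analog recurrent network models of this kind. The encoding of AND/OR as single neurons below is an illustrative assumption for exposition, not NEL's actual compilation scheme.

```python
def sigma(x: float) -> float:
    """Saturated-linear activation: identity on [0, 1], clipped outside."""
    return min(max(x, 0.0), 1.0)

# An illustrative "compilation": the Boolean rule `out := a AND b`
# maps to a single neuron with rational weights, out = sigma(a + b - 1),
# which reproduces the rule exactly on binary inputs in one step.
def and_neuron(a: float, b: float) -> float:
    return sigma(a + b - 1.0)

# Likewise `out := a OR b` maps to out = sigma(a + b).
def or_neuron(a: float, b: float) -> float:
    return sigma(a + b)

if __name__ == "__main__":
    for a in (0.0, 1.0):
        for b in (0.0, 1.0):
            print(a, b, and_neuron(a, b), or_neuron(a, b))
```

Because the activation is exact (not approximate) on binary values, the compiled neurons simulate the symbolic rule step for step, which is the sense in which a compiled network can match the original program's computation time.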