Building Neural Net Software

Report · Portuguese · Open Access
Neto, João Pedro; Costa, José Félix (1999)
  • Publisher: Department of Informatics, University of Lisbon

In a recent paper [Neto et al. 97] we showed that programming languages can be translated into recurrent (analog, rational-weighted) neural nets. The goal was not efficiency but simplicity. Indeed, we used a number-theoretic approach to machine programming, where (integer) numbers were coded in a unary fashion, introducing an exponential slowdown in the computations with respect to a two-symbol-tape Turing machine. Implementing programming languages in neural nets turns out to be not only theoretically exciting, but also to have practical implications for recent efforts to merge symbolic and subsymbolic computation. To be of some use, it should be carried out in a context of bounded resources. Herein, we show how to use resource boundedness to speed up computations over neural nets, through suitable data-type coding as in the usual programming languages. We introduce data types and show how to code and keep them inside the information flow of neural nets. Data types and control structures are part of a suitable programming language called netdef. Each netdef program has a specific neural net that computes it. These nets have a strongly modular structure and a synchronisation mechanism allowing sequential or parallel execution of subnets, despite the massively parallel nature of neural nets. Each instruction denotes an independent neural net. There are constructors for assignment, conditional and loop instructions. Besides the language core, many other features are possible using the same method. There is also a netdef compiler, available at
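The abstract's idea that each instruction is an independent subnet, triggered by a synchronisation signal, can be illustrated with a small simulation. The sketch below is not the paper's actual construction: it assumes the saturated-linear activation σ(x) = min(max(x, 0), 1) of rational-weighted analog nets [Siegelmann and Sontag 95], and the names `sigma`, `step`, `go` are illustrative. It shows an "assignment" subnet that computes y := x only while a control signal fires, using affine combinations and σ alone (no products of neuron values).

```python
def sigma(x):
    """Saturated-linear activation: sigma(x) = min(max(x, 0), 1)."""
    return min(max(x, 0.0), 1.0)

def step(state, go):
    """One synchronous update of a toy 'assignment' subnet.

    Values live in [0, 1]. When the control signal go = 1, y loads x;
    when go = 0, y holds its previous value. Both branches use only
    affine combinations followed by sigma, as in the analog-net model.
    (A real synchronous net would realise load/hold as neurons with a
    one-step delay; that detail is elided here for clarity.)
    """
    x, y = state["x"], state["y"]
    load = sigma(x + go - 1.0)   # = x when go = 1, else 0
    hold = sigma(y - go)         # = y when go = 0, else 0
    return {"x": x, "y": sigma(load + hold)}

state = {"x": 0.75, "y": 0.0}
state = step(state, go=0)        # control off: y unchanged
assert state["y"] == 0.0
state = step(state, go=1)        # control fires: y := x
assert state["y"] == 0.75
```

Chaining such subnets, with each one's "done" signal driving the next one's `go` line, gives the sequential composition the abstract describes; raising several `go` lines at once gives parallel composition.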
  • References (11)

    [GRUAU et al. 95] Gruau, F.; Ratajszczak, J. and Wiber, G., A Neural Compiler, Theoretical Computer Science, [141] (1-2), 1995, 1-52.

    [LESTER 93] Lester, B. P., The Art of Parallel Programming, 1993, Prentice Hall.

    [MCCULLOCH and PITTS 43] McCulloch, W. and Pitts, W., A Logical Calculus of the Ideas Immanent in Nervous Activity, Bulletin of Mathematical Biophysics, 5, 1943, 115-133.

    [MINSKY 67] Minsky, M., Computation: Finite and Infinite Machines, Prentice Hall, 1967.

    [NETO et al. 97] Neto, J. P., Siegelmann, H., Costa, J. F., Araujo, C. S., Turing Universality of Neural Nets (revisited), Lecture Notes in Computer Science - 1333, Springer-Verlag, 1997, 361-366.

    [SIEGELMANN and SONTAG 95] Siegelmann, H. and Sontag, E., On the Computational Power of Neural Nets, Journal of Computer and System Sciences [50] 1, Academic Press, 1995, 132-150.

    [SIEGELMANN 96] Siegelmann, H., On NIL: The Software Constructor of Neural Networks, Parallel Processing Letters, [6] 4, World Scientific Publishing Company, 1996, 575-582.

    [SIEGELMANN 99] Siegelmann, H., Neural Networks and Analog Computation: Beyond the Turing Limit, Birkhäuser, 1999.

    [SGS-THOMSON 95] SGS-THOMSON, Occam® 2.1 Reference Manual, 1995.

    [SONTAG 90] Sontag, E., Mathematical Control Theory: Deterministic Finite Dimensional Systems, Springer-Verlag, New York, 1990.
