The computational power and complexity of discrete feedforward neural networks

Barbara Borowik, Sophie Laird

Abstract


The number of binary functions that can be defined on a set of L vectors in R^N equals 2^L. For L > N the total number of threshold functions on such a set grows only polynomially in L, so on the Boolean cube (L = 2^N points) it is of order 2^(N(N-1)), while the total number of Boolean functions definable on N binary inputs grows as 2^(2^N); consequently, as N increases, the fraction of threshold functions among all Boolean functions goes to zero. This means that for the realization of the majority of tasks a neural network must possess at least two or three layers. Examples of such small computational problems are arithmetic functions, like multiplication, division, addition and exponentiation, as well as comparison and sorting. This article analyses some aspects of two-layer and deeper threshold and Boolean circuits (feedforward neural nets), in connection with their computational power and their node, edge and weight complexity.
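
As an illustration of the counting argument above, the following minimal Python sketch (not part of the original article) enumerates all Boolean functions on N inputs for N = 1, 2, 3 and counts how many of them are threshold (linearly separable) functions. It uses a brute-force search over small integer weights; the grid bound wmax = 3 is an assumption that is ample for these small N but not for general N.

```python
from itertools import product

def is_threshold(truth, n, wmax=3):
    """Check by brute force whether the Boolean function with truth table
    `truth` (indexed in the order of `points`) is a threshold function,
    i.e. realizable as w.x >= theta with integer weights |w_i| <= wmax.
    The bound wmax = 3 suffices for n <= 3 (an assumption for larger n)."""
    points = list(product((0, 1), repeat=n))
    for w in product(range(-wmax, wmax + 1), repeat=n):
        lo = sum(min(0, wi) for wi in w)   # smallest achievable w.x
        hi = sum(max(0, wi) for wi in w)   # largest achievable w.x
        for theta in range(lo, hi + 2):    # theta = hi + 1 gives the constant-0 function
            if all((sum(wi * xi for wi, xi in zip(w, x)) >= theta) == bool(fx)
                   for x, fx in zip(points, truth)):
                return True
    return False

for n in (1, 2, 3):
    total = 2 ** (2 ** n)   # all Boolean functions on n binary inputs
    count = sum(is_threshold(t, n)
                for t in product((0, 1), repeat=2 ** n))
    print(f"N={n}: {count} threshold functions out of {total}")
```

For these small dimensions the counts are known: 4 of 4 for N = 1, 14 of 16 for N = 2 (all but XOR and XNOR) and 104 of 256 for N = 3, so the shrinking fraction is already visible at this scale.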



DOI: http://dx.doi.org/10.17951/ai.2004.2.1.135-141
Date of publication: 2015-01-04 00:00:00
Date of submission: 2016-04-27 10:11:10


Copyright (c) 2015 Annales UMCS Sectio AI Informatica

This work is licensed under a Creative Commons Attribution 4.0 International License.