
doi: 10.1007/bf03159051
A computationally efficient sigmoidal activation function, called the double-exponential signal function, is presented, and its properties are compared with those of other signal functions. The sigmoidal function is monotonically increasing, continuous in all derivatives, and its output is 0.5 for zero input. The weight multiplication can be replaced by an addition when the network is trained offline. We also present an approximation of this signal function, called the polygonal signal function, which reduces the computational effort to bit-set and shift operations alone.
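The abstract does not reproduce the function's definition, so the sketch below is illustrative only: it assumes a base-2 double-exponential form s(x) = 2^(-2^(-x)), which satisfies the stated properties (monotonically increasing, smooth, s(0) = 0.5); the weight trick and the polygonal breakpoints and slopes are likewise assumptions in the spirit of the abstract, not the paper's exact parameters.

```python
def double_exp_sigmoid(x):
    # Assumed double-exponential form (not necessarily the paper's
    # verbatim definition): s(x) = 2^(-2^(-x)).
    # s(0) = 2^(-1) = 0.5, and s is monotonically increasing with
    # derivatives of all orders continuous.
    return 2.0 ** -(2.0 ** -x)

def weighted_term(log2_w, x):
    # Hedged illustration of the offline weight trick: if a trained
    # weight w > 0 is stored as c = log2(w), then w * 2^(-x) equals
    # 2^(c - x), so the multiplication becomes an addition in the
    # exponent.
    return 2.0 ** (log2_w - x)

def polygonal_sigmoid_q8(x_q8):
    # Hedged sketch of a polygonal (piecewise-linear) approximation in
    # Q8 fixed point (values scaled by 256). Breakpoints and slopes
    # are illustrative powers of two, not the paper's table; only
    # comparisons, additions, and right shifts are used.
    neg = x_q8 < 0
    x = -x_q8 if neg else x_q8
    if x < 256:                  # 0 <= |x| < 1: slope 1/4 -> shift by 2
        y = 128 + (x >> 2)
    elif x < 768:                # 1 <= |x| < 3: slope 1/8 -> shift by 3
        y = 192 + ((x - 256) >> 3)
    else:                        # |x| >= 3: saturate at 1.0
        y = 256
    return 256 - y if neg else y
```

The fixed-point variant exploits the symmetry s(-x) = 1 - s(x), so only the non-negative half needs a breakpoint table; restricting slopes to powers of two is what lets each segment be evaluated with a shift instead of a multiplication.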
