The backpropagation algorithm generally employs the quadratic error function; in fact, most minimization problems use it. With alternative error functions, the performance of the optimization scheme can be improved: such error functions help suppress the ill effects of outliers and have shown good robustness to noise. In this paper we evaluate and compare the relative performance of a complex-valued neural network with different error functions. In the first simulation, on the complex XOR problem, it is observed that error functions such as the absolute error and the Cauchy error function can replace the quadratic error function. In the second simulation it is observed that for some error functions the performance of the complex-valued neural network depends on the network architecture, whereas for a few other error functions the convergence speed is independent of the architecture.
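As an illustration of the alternative error functions mentioned above, the following sketch shows (in an assumed, minimal formulation rather than the paper's exact one) how the quadratic, absolute, and Cauchy errors can be computed for a complex-valued network output, with the complex error split into real and imaginary parts as in split-style complex backpropagation. The function names and the Cauchy scale parameter c are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: common error functions applied to a complex-valued output y
# and complex target d. The error is split into real and imaginary parts;
# the Cauchy scale parameter c is an assumed value, not taken from the paper.

def quadratic_error(y, d):
    """Standard quadratic (sum-of-squares) error over real and imaginary parts."""
    e = d - y
    return 0.5 * np.sum(e.real**2 + e.imag**2)

def absolute_error(y, d):
    """Absolute error; less sensitive to outliers than the quadratic error."""
    e = d - y
    return np.sum(np.abs(e.real) + np.abs(e.imag))

def cauchy_error(y, d, c=1.0):
    """Cauchy error with scale c; strongly suppresses the influence of outliers."""
    e = d - y
    return 0.5 * c**2 * np.sum(np.log(1.0 + (e.real**2 + e.imag**2) / c**2))

if __name__ == "__main__":
    y = np.array([0.9 + 0.1j, 0.1 + 0.8j])   # example network outputs
    d = np.array([1.0 + 0.0j, 0.0 + 1.0j])   # complex targets (e.g. complex XOR)
    for f in (quadratic_error, absolute_error, cauchy_error):
        print(f.__name__, f(y, d))
```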
split activation function, complex backpropagation algorithm, complex-valued neural network, complex error functions
| Indicator | Description | Value |
|---|---|---|
| Selected citations | Citations derived from selected sources; an alternative to the "Influence" indicator | 0 |
| Popularity | "Current" impact/attention (the "hype") of the article in the research community at large, based on the underlying citation network | Average |
| Influence | Overall/total impact of the article in the research community at large, based on the underlying citation network (diachronically) | Average |
| Impulse | Initial momentum of the article directly after its publication, based on the underlying citation network | Average |
| Views | | 7 |
| Downloads | | 6 |

Views and downloads provided by UsageCounts.