A Review of Deep Learning Fundamentals – the XOR Neural Network Model

Authors: Alejo Mosso Vázquez; David Juárez-Romero; José Alfredo Hernández-Pérez; Darvi Echeverría Sosa; Jimer Emir Loría Yah; Ramiro José González Horta; Gerardo Israel de Atocha Pech Carveo; +1 Authors


Abstract

This paper explores the fundamentals of Deep Learning by examining a simple Neural Network model of the XOR function and tracing the forward and backward signals that flow through it. Our purpose is to reach a deeper understanding of some outstanding concepts of Deep Learning and to grasp their significance while the Neural Network model of the XOR function is trained by the backpropagation algorithm. The chosen Neural Network model contains just one hidden layer with four neurons and an output layer with one neuron. Although this model is not a deep neural network, its hidden layer embodies enough of the concepts of Deep Learning. The sigmoid is used as the activation function in all neurons. A derivation of a simple version of the Stochastic Gradient Descent algorithm, used to minimize the output error, is presented; backpropagating this error leads to the backpropagation algorithm. Numerical results are presented that show the convergence of the output error and of a selected weight, and their analysis summarizes the understanding of the fundamental concepts of Deep Learning.
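The model described in the abstract — two inputs, one hidden layer of four sigmoid neurons, one sigmoid output neuron, trained on the XOR truth table by Stochastic Gradient Descent with backpropagation — can be sketched in a few lines of NumPy. This is an illustrative sketch, not the authors' code; the learning rate, epoch count, weight initialization, and squared-error loss are assumptions for the example.

```python
import numpy as np

def sigmoid(z):
    """Logistic activation, used in every neuron of the model."""
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# XOR truth table: four input pairs and their targets.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# One hidden layer with four sigmoid neurons, one sigmoid output neuron.
W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros((1, 1))

lr = 0.5  # learning rate (an assumption, not taken from the paper)
for epoch in range(10000):
    for i in rng.permutation(4):          # stochastic: one example at a time
        x, t = X[i:i + 1], y[i:i + 1]
        # Forward signals
        h = sigmoid(x @ W1 + b1)          # hidden-layer activations
        out = sigmoid(h @ W2 + b2)        # network output
        # Backward signals: error deltas for the squared-error loss
        d_out = (out - t) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        # Stochastic Gradient Descent updates
        W2 -= lr * h.T @ d_out; b2 -= lr * d_out
        W1 -= lr * x.T @ d_h;   b1 -= lr * d_h

# After training, the outputs should sit near the XOR targets 0, 1, 1, 0.
pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print(np.round(pred.ravel(), 2))
```

Tracing `h`, `out`, `d_out`, and `d_h` during training is exactly the exercise the paper proposes: the forward pass computes the signals, and the deltas carry the output error backward through the weights.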

Keywords

QA76.75-76.765, backpropagation algorithm, deep learning, computer software, neural networks, XOR function, stochastic gradient descent algorithm
