
The goal of this work is to investigate and compare the effectiveness of four deep learning optimization algorithms, SGD, Adam, Adagrad, and RMSprop, on forecasting and pattern-recognition tasks. In the course of the work, a comparative analysis of these optimization algorithms was carried out. Five data sets were selected and prepared for model training, and a model with an architecture suited to each data set was created, compiled, and trained. At the compilation stage, a loss function appropriate to each task was chosen for each model, along with the metrics by which model performance would be evaluated. After training, the corresponding losses and metrics were measured on the test data; all results were then analyzed and conclusions drawn.
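To make the comparison concrete, the core update rules of the four algorithms can be sketched as below. This is a minimal illustrative implementation on a toy quadratic loss, not the models or hyperparameters used in the study; the learning rate, decay constants, and step counts are assumed defaults chosen only for demonstration.

```python
import numpy as np

def optimize(rule, w0=5.0, lr=0.1, steps=200, eps=1e-8):
    """Minimize the toy loss f(w) = w^2 (gradient 2w) with one of the
    four update rules compared in the work. Hyperparameters are
    illustrative, not taken from the paper."""
    w = w0
    m = v = 0.0  # moment / squared-gradient accumulators
    for t in range(1, steps + 1):
        g = 2.0 * w  # gradient of f(w) = w^2
        if rule == "sgd":
            w -= lr * g
        elif rule == "adagrad":
            v += g * g                        # accumulate all squared gradients
            w -= lr * g / (np.sqrt(v) + eps)
        elif rule == "rmsprop":
            v = 0.9 * v + 0.1 * g * g         # exponential moving average
            w -= lr * g / (np.sqrt(v) + eps)
        elif rule == "adam":
            m = 0.9 * m + 0.1 * g             # biased first moment
            v = 0.999 * v + 0.001 * g * g     # biased second moment
            m_hat = m / (1 - 0.9 ** t)        # bias correction
            v_hat = v / (1 - 0.999 ** t)
            w -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

for rule in ("sgd", "adam", "adagrad", "rmsprop"):
    print(rule, optimize(rule))
```

Even on this toy problem the qualitative differences the paper examines are visible: Adagrad's monotonically growing accumulator shrinks its step size over time, while RMSprop and Adam keep an exponentially decaying average and so retain a roughly constant effective step.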
