
The Models, GroundTruth, and 2021P directories contain preserved data for verifying the AI models described in 'A Deep Learning Approach for Automatic Ionogram Parameters Recognition with Convolutional Neural Networks'. Please use the code from GitHub and install all the necessary libraries from the requirements.txt file. Extract the Models, GroundTruth, and 2021P directories into the same directory as the ModelsEvaluation.py and RunEvaluation.py files. The RunEvaluation.py script calls two functions: ModelsEvaluation.Evaluate() and ModelsEvaluation.IonogramShow(). The outputs are Evaluation_FOF2.csv, Evaluation_FOF1.csv, etc., each containing GroundTruth, Predictions, Difference, MAE, and RMSE values, along with an ionogram_date.png image that shows the relevant GroundTruth and Prediction parameters.
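As a rough illustration of how the per-parameter error columns in the Evaluation_*.csv files relate to each other, the sketch below computes Difference, MAE, and RMSE from paired GroundTruth and Prediction values using the standard definitions. The numeric values are invented for illustration and do not come from the dataset; consult the actual CSV files for the real column layout.

```python
import math

# Invented example values standing in for one parameter (e.g. foF2, in MHz);
# each pair is (GroundTruth, Prediction) for one ionogram.
ground_truth = [5.2, 6.1, 4.8]
predictions = [5.0, 6.4, 4.7]

# Difference column: prediction minus ground truth, per ionogram.
differences = [p - g for p, g in zip(predictions, ground_truth)]

# MAE: mean of absolute differences; RMSE: root of the mean squared difference.
mae = sum(abs(d) for d in differences) / len(differences)
rmse = math.sqrt(sum(d * d for d in differences) / len(differences))

print(round(mae, 3), round(rmse, 3))  # → 0.2 0.216
```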
