
This is the replication package of "Experimental evaluation of architectural software performance design patterns in microservices," accepted for publication in the Journal of Systems & Software (JSS), 2024, DOI: https://doi.org/10.1016/j.jss.2024.112183.

This package includes the material to replicate our experiments:

- Paper preprint.
- Repository: the code used to generate the results, which is also available on GitHub. For detailed replication instructions, refer to the README file inside the muBench-experiment-1.1.0/gssi_experiment folder.
- Experimental results: this study's raw experimental results (i.e., CPU usage and response delay) can be found in the experimental-results folder. Note that the results have also been copied into the relevant folders of the repository to make it easier to run the Python notebooks.
