One of the main goals of the TREX project is the implementation of optimised libraries for QMC computation (QMCkl) and for I/O, enabling interoperability and synergistic use of the TREX codes as well as adoption by software packages outside TREX. A key ingredient in accomplishing this goal is ensuring that the libraries provide an improvement in performance without sacrificing accuracy. Moreover, since development is an ongoing process, it is vital to detect early whether a new update has caused regressions in accuracy, performance or stability. To address this concern, a set of benchmarks has been compiled. These benchmarks offer high-level tests that can be run by users of the QMCkl or I/O libraries to check their installation, and by developers of these libraries to detect any regression in accuracy or performance. They also allow the overall evolution of performance to be monitored during the development of the libraries.

The present document is a short user guide to this benchmark set. It describes how to use the benchmarks and the associated repository of performance reports, and how to add new benchmarks to the set. Chapter 2 recaps the content of the deliverable, chapter 3 explains how to use reference values to check the benchmark results, chapter 4 presents the performance reports, and chapter 5 describes the elements required to add another benchmark to the list. Appendix A presents the list of benchmarks, along with the information needed to execute and analyse them. Appendix B describes how to generate a performance analysis report for a benchmark. Finally, Appendix C gives the correspondence between the existing reports and the benchmark versions.
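A check against reference values of the kind mentioned above typically compares the quantity produced by a benchmark run with a stored reference value, within a given tolerance. The following is a minimal, purely illustrative sketch of that idea; the reference value, tolerance and function names are hypothetical and are not taken from the actual benchmark suite.

```python
# Illustrative sketch of a reference-value check, assuming the benchmark
# produces a single scalar result (e.g. a total energy) and that a reference
# value and tolerance are documented with the benchmark.
# All numbers and names below are hypothetical.
import math

REFERENCE_ENERGY = -76.438    # hypothetical reference value
RELATIVE_TOLERANCE = 1.0e-6   # hypothetical accuracy threshold

def check_result(computed: float,
                 reference: float = REFERENCE_ENERGY,
                 rel_tol: float = RELATIVE_TOLERANCE) -> bool:
    """Return True if the computed value matches the reference within tolerance."""
    return math.isclose(computed, reference, rel_tol=rel_tol)

if __name__ == "__main__":
    computed_energy = -76.437999  # value obtained from a benchmark run
    if check_result(computed_energy):
        print("PASS: result agrees with the reference value")
    else:
        print("FAIL: possible accuracy regression")
```

In practice, the reference values and tolerances to use for each benchmark are those documented with the suite itself (see chapter 3 and Appendix A).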