This archive contains the detailed results of QComp 2020 as well as the scripts and data necessary to reproduce them. Visit http://qcomp.org for more information about QComp.

Overview of Contents

- `qcomp.org/` contains the state of our website at the time of the competition. This includes:
  - all benchmark files, browsable at `qcomp.org/benchmarks/index.html`
  - the detailed competition results in a human-readable format, browsable at `https://qcomp.org/competition/2020/`
- `logs/` contains the raw log files and data gathered by our scripts
- `scripts/` contains the scripts needed to replicate the whole competition
- `toolpackages/` contains a package for each participating tool, which includes:
  - instructions for obtaining and installing the tool
  - a file `invocations.json` listing the command lines used in QComp 2020 (see the usage sketch below)
  - a file `tool.py` providing the functionality to extract the result from the tool's output
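To give a feel for how the archive can be explored programmatically, here is a minimal Python sketch that walks `toolpackages/` and prints the command lines recorded in each `invocations.json`. The schema of these files is not described in this overview, so the assumed structure (a list of invocation records with a `command` field) is a guess for illustration only; consult the files in the archive for the actual format.

```python
import json
from pathlib import Path

# Hypothetical sketch: walk the tool packages and print the command lines
# recorded in each invocations.json. The schema of these files is not
# documented in this overview; the handling below (a list of records,
# each with a "command" field) is an assumption for illustration only.
for package in sorted(Path("toolpackages").iterdir()):
    invocations_file = package / "invocations.json"
    if not invocations_file.is_file():
        continue  # skip stray files or packages without invocations
    with invocations_file.open() as f:
        data = json.load(f)
    print(f"Tool package: {package.name}")
    # assumed: either a plain list of records or an object wrapping one
    entries = data if isinstance(data, list) else data.get("invocations", [])
    for entry in entries:
        # fall back to printing the raw entry if the assumed field is absent
        print(" ", entry.get("command", entry) if isinstance(entry, dict) else entry)
```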
{"references": ["On Correctness, Precision, and Performance in Quantitative Verification - QComp 2020 Competition Report, ISoLA 2020"]}
Keywords: Competition, Quantitative Formal Models