This is the replication package for the paper "Rebasing in Code Review Considered Harmful: A Large-scale Empirical Investigation", published at the 19th IEEE International Working Conference on Source Code Analysis and Manipulation (SCAM'19).

The replication package contains the raw results for each of our four research questions. In addition, it includes the results of the small empirical study we performed to evaluate the methodology we propose for handling rebasing operations in code review data. All reported results were obtained by following the procedures and heuristics described in the paper.

This publication uses the CROP dataset of code review data: https://crop-repo.github.io/