
In recent years, national and international regulatory authorities, notably the FDA, have made their adverse event repositories publicly available through user-friendly dashboards. This accessibility has fueled a deluge of low-quality, misinterpreted research based on adverse event reporting databases (e.g., the FDA Adverse Event Reporting System, FAERS). Such publications, which generate thousands of statistical associations presented as “safety signals”, can create scientifically unfounded alarm with considerable impact on healthcare provider practices and patient behaviors. As specialists in pharmacovigilance, we are committed to increasing the value of research conducted in the field, and we believe the moment is crucial to share our concern and prevent the scientific literature from being flooded with weak, alarmist studies that could harm patients and science itself. In this article, we present a bibliometric analysis showing the exponential rise in scientific publications in the field of pharmacovigilance, and we describe spontaneous reporting systems and signal detection analyses, their biases, and their appropriate interpretation. Moreover, amid the recent surge in pharmacovigilance publications, notably those based on FAERS, many articles display features that make them hard to distinguish from ‘paper mill’ products. We then discuss the pros and cons of publishing these studies and advocate for global collaborative efforts to reshape, promote, and strengthen best practices for conducting pharmacovigilance research. Lastly, we propose criteria to identify high-quality studies and robust safety signals derived from adverse event reporting databases, thus assisting stakeholders, including reviewers, editors, regulators, and researchers, in the critical appraisal of relevant findings.
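To make concrete what the signal detection analyses discussed above compute, the following is a minimal sketch of one common disproportionality measure, the reporting odds ratio (ROR), derived from a 2×2 contingency table of spontaneous reports. The counts and the conventional signal threshold (lower bound of the 95% CI above 1) are illustrative assumptions, not values or criteria taken from this article.

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """Compute the ROR and its 95% confidence interval.

    a: reports mentioning the drug of interest AND the event of interest
    b: reports mentioning the drug, but other events
    c: reports mentioning other drugs AND the event
    d: reports mentioning other drugs and other events
    """
    ror = (a * d) / (b * c)
    # Standard error of ln(ROR) for a 2x2 table
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(ror) - 1.96 * se)
    upper = math.exp(math.log(ror) + 1.96 * se)
    return ror, lower, upper

# Illustrative (hypothetical) counts from a reporting database
ror, lo, hi = reporting_odds_ratio(a=20, b=80, c=100, d=9800)
# A disproportionality "signal" is often flagged when lo > 1,
# but, as the article stresses, this is a hypothesis-generating
# statistic, not evidence of a causal drug-event association.
```

Note that such a computation is trivially easy to run at scale across thousands of drug-event pairs, which is precisely why uncritical mass-produced “signal” papers are possible; the statistic itself carries none of the reporting biases (stimulated reporting, confounding by indication, duplicates) that determine whether a disproportionality finding is meaningful.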
disproportionality analysis, Pharmacovigilance
