
doi: 10.12758/mda.2023.04
There has been considerable debate in the survey research community about the accuracy of nonprobability sample surveys. This work provides empirical evidence on the accuracy of nonprobability samples and investigates the performance of a range of post-survey adjustment approaches (calibration and matching methods) intended to reduce bias and improve inference. We use data from five nonprobability online panel surveys and compare their accuracy, before and after adjustment, with four probability surveys, including data from a probability-based online panel. This article adds to the existing research by assessing methods for causal inference not previously applied for this purpose and by demonstrating the value of various types of covariates in mitigating bias in nonprobability online panels. Investigating different post-survey adjustment scenarios based on the availability of auxiliary data, we demonstrate how carefully designed post-survey adjustment can reduce some of the bias in survey research using nonprobability samples. The results show that the quality of post-survey adjustment depends, first and foremost, on the availability of relevant, high-quality covariates that come from large-scale, representative probability-based survey data and match those in the nonprobability data. Second, we found little difference in the efficiency of the various post-survey adjustment methods, and inconsistent evidence on the suitability of 'webographic' and other internet-associated covariates for mitigating bias in nonprobability samples.
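One common family of post-survey calibration adjustments mentioned in the abstract is raking (iterative proportional fitting), which reweights respondents so that weighted sample margins match population benchmarks. The sketch below is illustrative only; the variable names, categories, and benchmark shares are invented for the example and are not taken from the study.

```python
# Minimal sketch of raking (iterative proportional fitting).
# All category labels and benchmark shares below are hypothetical.

def rake(rows, margins, iters=50):
    """rows: list of dicts, one per respondent, e.g. {"sex": "f", "age": "35+"}.
    margins: {variable: {category: population_share}} benchmark margins.
    Returns one weight per respondent, scaled so weights sum to len(rows)."""
    n = len(rows)
    w = [1.0] * n
    for _ in range(iters):
        for var, target in margins.items():
            # current weighted total in each category of this variable
            totals = {c: 0.0 for c in target}
            for wi, r in zip(w, rows):
                totals[r[var]] += wi
            # rescale weights so weighted shares match the benchmark shares
            factor = {c: target[c] * n / totals[c] for c in target}
            w = [wi * factor[r[var]] for wi, r in zip(w, rows)]
    return w

# Toy example: four respondents, two calibration variables.
rows = [{"sex": "f", "age": "18-34"}, {"sex": "f", "age": "35+"},
        {"sex": "m", "age": "35+"}, {"sex": "f", "age": "35+"}]
margins = {"sex": {"f": 0.5, "m": 0.5}, "age": {"18-34": 0.4, "35+": 0.6}}
weights = rake(rows, margins)
```

After raking, the weighted share of women is pulled down toward the 50% benchmark and the weighted age distribution toward 40/60, which is the basic mechanism by which calibration reduces (but cannot eliminate) selection bias in nonprobability samples.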
methods, data, analyses, 17(7), 171-206
Keywords: nonprobability sampling; volunteer online panels; post-survey adjustment; calibration; matching methods; benchmarking; sampling error; online survey; survey research; sample; data capture; Methods and Techniques of Data Collection and Data Analysis; Statistical Methods; Computer Methods; social sciences; sociology; anthropology; ddc:300
