Pretrained weights for the machine learning models used in the paper "Electrostatic Discovery Atomic Force Microscopy". The weights are stored in the PyTorch .pth format. Several sets of weights are provided:

- base: The base model used for all predictions in the main paper and for comparison in the various tests in the supplementary information of the paper.
- single-channel: Model trained on only a single CO-tip AFM input.
- CO-Cl: Model trained on the alternative tip combination of CO and Cl.
- Xe-Cl: Model trained on the alternative tip combination of Xe and Cl.
- constant-noise: Model trained using a constant noise amplitude instead of a normally distributed amplitude.
- uniform-noise: Model trained using a uniformly random noise amplitude instead of a normally distributed amplitude.
- no-gradient: Model trained without background-gradient augmentation.
- matched-tips: Model trained on data with matched tip distance between CO and Xe, instead of independently randomized distances.
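
As a minimal sketch of how one of these weight files could be loaded in PyTorch: the file name "base.pth" and the model class name are assumptions here, so substitute the actual model definition from the code accompanying the paper.

```python
import torch

# Load the state dict from one of the provided .pth files (file name assumed).
state_dict = torch.load("base.pth", map_location="cpu")

# Instantiate the model architecture from the paper's code release
# (class name below is hypothetical) and restore the weights.
# model = EDAFMNet()
# model.load_state_dict(state_dict)
# model.eval()
```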