
Data Size Effects on Pre-Training Small-BERT (Part 1)

This record pertains to the Data Size Effects on Pre-Training experiments. It includes the following files:

- small-binidx-0-ms-1234-ds-1234.tar.gz
- small-binidx-0-ms-1234-ds-2345.tar.gz
- small-binidx-0-ms-2345-ds-1234.tar.gz
- small-binidx-1-ms-1234-ds-1234.tar.gz
- small-binidx-1-ms-1234-ds-2345.tar.gz
- small-binidx-1-ms-2345-ds-1234.tar.gz
- small-binidx-2-ms-1234-ds-1234.tar.gz
- small-binidx-2-ms-1234-ds-2345.tar.gz
- small-binidx-2-ms-2345-ds-1234.tar.gz
- small-binidx-3-ms-1234-ds-1234.tar.gz
- small-binidx-3-ms-1234-ds-2345.tar.gz
- small-binidx-3-ms-2345-ds-1234.tar.gz

Each tar file contains all model artifacts (checkpoints, random-number-generator states, optimizer states, etc.), training logs (TensorBoard, MLflow, and Weights & Biases), evaluation results, configuration files, run scripts, SLURM sbatch driver scripts, and any additional artifacts generated during the experiments.

Preprint: https://doi.org/10.48550/arXiv.2603.13627
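The archive names follow a regular pattern, `small-binidx-<index>-ms-<seed>-ds-<seed>.tar.gz`. As a minimal sketch for working with this record programmatically, the helper below parses that pattern; note that interpreting `ms` and `ds` as model seed and data seed is an assumption not stated in the record itself.

```python
import re

# Archive names in this record look like:
#   small-binidx-<idx>-ms-<seed1>-ds-<seed2>.tar.gz
# Assumption (not confirmed by the record): "ms" is a model seed
# and "ds" is a data seed.
PATTERN = re.compile(r"small-binidx-(\d+)-ms-(\d+)-ds-(\d+)\.tar\.gz")

def parse_archive_name(name):
    """Return (binidx, ms_seed, ds_seed) as ints, or None if no match."""
    m = PATTERN.fullmatch(name)
    if m is None:
        return None
    return tuple(int(g) for g in m.groups())

# Example: group the record's files by bin index.
files = [
    "small-binidx-0-ms-1234-ds-1234.tar.gz",
    "small-binidx-0-ms-1234-ds-2345.tar.gz",
    "small-binidx-3-ms-2345-ds-1234.tar.gz",
]
for f in files:
    print(f, "->", parse_archive_name(f))
```

This makes it straightforward to, for example, select only the runs that share a data seed when reproducing a comparison across bin indices.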
Masked language modeling, Molecular property prediction, ADME, Pretraining, Finetuning, Chemical language models, BERT
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources; an alternative to the "Influence" indicator, which reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | 0 |
| Popularity | Reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network. | Average |
| Influence | Reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | Average |
| Impulse | Reflects the initial momentum of an article directly after its publication, based on the underlying citation network. | Average |
