Transfer fine-tuned BERT models trained with phrasal paraphrases.
transferFT_bert-base-uncased.pkl is based on the bert-base-uncased model.
transferFT_bert-large-uncased.pkl is based on the bert-large-uncased model.
For usage, please refer to our GitHub page: https://github.com/yukiar/TransferFT
For details of these models, please refer to our paper:
Yuki Arase and Junichi Tsujii. 2019. Transfer Fine-Tuning: A BERT Case Study. In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2019). https://arxiv.org/abs/1909.00931
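The GitHub page above documents the actual usage; as a rough orientation, the following is a minimal sketch of how such a checkpoint could be loaded to obtain sentence representations. It assumes (this is not confirmed by the record) that the .pkl file holds a PyTorch state dict compatible with the Hugging Face bert-base-uncased architecture; the file path and the use of the [CLS] vector are illustrative choices, not the official TransferFT API.

```python
# Sketch only: load a transfer fine-tuned checkpoint and encode one sentence.
# Assumption: the .pkl file is a PyTorch state dict matching bert-base-uncased.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Hypothetical local path to the downloaded checkpoint.
state_dict = torch.load("transferFT_bert-base-uncased.pkl", map_location="cpu")
model.load_state_dict(state_dict, strict=False)  # strict=False in case of extra keys
model.eval()

inputs = tokenizer("An example sentence for paraphrase-aware encoding.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Use the first ([CLS]) token's hidden state as a sentence representation.
sentence_embedding = outputs.last_hidden_state[:, 0, :]
print(sentence_embedding.shape)  # torch.Size([1, 768]) for bert-base-uncased
```

For the large model, the same sketch would use bert-large-uncased and transferFT_bert-large-uncased.pkl; again, defer to the GitHub repository for the supported loading procedure.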
If you publish work that uses these models, please cite our EMNLP 2019 paper.
Keywords: sentence representation, paraphrase, pre-trained model, NLP, BERT
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources; an alternative to the "Influence" indicator, which reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | 0 |
| Popularity | Reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network. | Average |
| Influence | Reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | Average |
| Impulse | Reflects the initial momentum of an article directly after its publication, based on the underlying citation network. | Average |
| Views | | 25 |
| Downloads | | 1 |

Views and downloads provided by UsageCounts.