These are the contextual embedding models for Old French developed for the article “BERTrade: Using Contextual Embeddings to Parse Old French” (Grobol et al., 2022). They are meant to be used with the 🤗 Transformers library and are known to work with versions 4.12.0 to 4.18.0 of that library (other versions may be compatible, but we offer no guarantee). See the article for the list of resources and the settings used to develop these models, as well as an assessment of their suitability for syntactic dependency parsing of Old French.

References: Grobol, Loïc, Mathilde Regnault, Pedro Javier Ortiz Suárez, Benoît Sagot, Laurent Romary, and Benoit Crabbé. 2022. “BERTrade: Using Contextual Embeddings to Parse Old French.” In Proceedings of the 13th International Conference on Language Resources and Evaluation. European Language Resources Association.
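Since the models are standard 🤗 Transformers checkpoints, loading them follows the usual `AutoTokenizer`/`AutoModel` pattern. The sketch below is a minimal illustration, assuming a compatible `transformers` version (4.12.0 to 4.18.0); the identifier `path/to/bertrade-model` is a placeholder, not an actual model name from the article, and the example sentence is simply the opening line of the Chanson de Roland.

```python
# Minimal sketch: obtaining contextual embeddings from a BERTrade checkpoint.
# "path/to/bertrade-model" is a placeholder -- substitute the local path or
# Hub identifier of the checkpoint you downloaded.
from transformers import AutoModel, AutoTokenizer

model_name = "path/to/bertrade-model"  # placeholder, not a real identifier
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode an Old French sentence and extract its contextual embeddings
inputs = tokenizer("Carles li reis, nostre emperere magnes", return_tensors="pt")
outputs = model(**inputs)
embeddings = outputs.last_hidden_state  # shape: (batch, sequence_length, hidden_size)
```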
Contextual embeddings, Old French, Neural Network, Transfer Learning, Natural Language Processing
| Indicator | Description | Value |
|---|---|---|
| Selected citations | Citations derived from selected sources; an alternative to the "Influence" indicator, which also reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | 0 |
| Popularity | Reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network. | Average |
| Influence | Reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | Average |
| Impulse | Reflects the initial momentum of an article directly after its publication, based on the underlying citation network. | Average |
| Views | 21 |
| Downloads | 24 |

Views and downloads provided by UsageCounts.