Powered by OpenAIRE graph
Report · 2025
License: CC BY
Data sources: ZENODO, Datacite

RoBERTaSense-FACIL: A Technical Report and Model Selection Study for Meaning Preservation in Easy-to-Read Spanish Texts

Authors: Diab, Isam; Suárez-Figueroa, Mari Carmen;

Abstract

RoBERTaSense-FACIL is a Spanish Transformer-based model fine-tuned to evaluate meaning preservation in Easy-to-Read (E2R) text adaptations. The model builds upon RoBERTa-base-bne and incorporates a balanced dataset of expert-validated E2R adaptations together with automatically generated hard negatives designed to introduce structural, semantic, and cross-textual distortions. This technical report describes the full methodology used to construct the dataset, the hard negative generation framework, the fine-tuning process, and a comparative evaluation of three models: MeaningBERT, RoBERTa-base-bne, and a BERTScore-based regression variant. Results show that the fine-tuned RoBERTa-base-bne, referred to as RoBERTaSense-FACIL, achieves the most robust and reliable performance for binary meaning-preservation classification in Spanish E2R texts.

Data availability: The datasets and intermediate scripts used in this work cannot be made publicly available due to privacy and copyright restrictions. However, access may be granted upon reasonable request for academic research purposes.

Model availability: The RoBERTaSense-FACIL model is publicly available on Hugging Face: https://huggingface.co/oeg/RoBERTaSense-FACIL
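Since the abstract describes RoBERTaSense-FACIL as a binary meaning-preservation classifier published on Hugging Face, a minimal inference sketch can illustrate the intended use. This assumes the checkpoint exposes a standard `transformers` sequence-classification head and accepts an (original, adaptation) sentence pair; the index-to-label order is an assumption and should be checked against the model card.

```python
# Hedged usage sketch for the public RoBERTaSense-FACIL checkpoint.
# The checkpoint ID comes from the report; the label order is an ASSUMPTION.
LABELS = ["not_preserved", "preserved"]  # assumed index-to-label mapping

def label_from_logits(logits):
    """Map a row of class logits to the highest-scoring label."""
    return LABELS[max(range(len(logits)), key=lambda i: logits[i])]

def meaning_preserved(original: str, adaptation: str,
                      model_id: str = "oeg/RoBERTaSense-FACIL") -> str:
    """Classify whether `adaptation` preserves the meaning of `original`.

    Requires the `transformers` and `torch` packages, plus network access
    the first time the checkpoint is downloaded (imports are kept lazy).
    """
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    model.eval()
    # Encode the (original, adaptation) pair as a single sequence-pair input.
    inputs = tokenizer(original, adaptation, return_tensors="pt",
                       truncation=True, max_length=512)
    with torch.no_grad():
        logits = model(**inputs).logits[0].tolist()
    return label_from_logits(logits)
```

Keeping the heavy imports inside the function lets the pure label-mapping logic be reused or tested without installing `torch`.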

Keywords

easy-to-read, roberta, spanish, meaning preservation

BIP! impact indicators (citation-network metrics provided by BIP!): selected citations: 0 · popularity: Average · influence: Average · impulse: Average
Open Access route: Green