Powered by OpenAIRE graph
ZENODO
Dataset · 2023
License: CC BY
Data sources: ZENODO; Datacite

SE-PQA: a Resource for Personalized Community Question Answering

Authors: Kasela, Pranav; Braga, Marco; Pasi, Gabriella; Perego, Raffaele;

Abstract

Personalization in Information Retrieval has been studied for a long time. Nevertheless, there is still a lack of high-quality, real-world datasets for conducting large-scale experiments and evaluating models for personalized search. This paper contributes to filling this gap by introducing SE-PQA (StackExchange - Personalized Question Answering), a new resource for designing and evaluating personalized models for the two tasks of community Question Answering (cQA). The contributed dataset includes more than 1 million queries and 2 million answers, annotated with a rich set of features modeling the social interactions among the users of a popular cQA platform. We describe the characteristics of SE-PQA and detail the features associated with both questions and answers. We also provide reproducible baseline methods for the cQA task based on the resource, including deep learning models and personalization approaches. The results of the preliminary experiments show that SE-PQA is suitable for training effective cQA models; they also show that personalization remarkably improves the effectiveness of all the methods tested. Furthermore, we show the benefits, in terms of robustness and generalization, of combining data from multiple communities for personalization purposes.
Performance on all communities separately:

| Community | Model (BM25 +) | P@1 | NDCG@3 | NDCG@10 | R@100 | MAP@100 | λ |
|---|---|---|---|---|---|---|---|
| Academia | MiniLM | 0.438 | 0.382 | 0.395 | 0.489 | 0.344 | (.1,.9) |
| | MiniLM + TAG | 0.453 | 0.392 | 0.403 | 0.489 | 0.352 | (.1,.8,.1) |
| Anime | MiniLM + TAG | 0.650 | 0.682 | 0.714 | 0.856 | 0.683 | (.1,.9,.0) |
| Apple | MiniLM | 0.327 | 0.351 | 0.381 | 0.514 | 0.349 | (.1,.9) |
| | MiniLM + TAG | 0.335 | 0.361 | 0.389 | 0.514 | 0.357 | (.1,.8,.1) |
| Bicycles | MiniLM | 0.405 | 0.380 | 0.421 | 0.600 | 0.365 | (.1,.9) |
| | MiniLM + TAG | 0.436 | 0.405 | 0.441 | 0.600 | 0.386 | (.1,.8,.1) |
| Boardgames | MiniLM | 0.681 | 0.694 | 0.728 | 0.866 | 0.692 | (.1,.9) |
| | MiniLM + TAG | 0.696 | 0.702 | 0.736 | 0.866 | 0.699 | (.1,.8,.1) |
| Buddhism | MiniLM + TAG | 0.490 | 0.387 | 0.397 | 0.544 | 0.334 | (.3,.7,.0) |
| Christianity | MiniLM | 0.534 | 0.505 | 0.555 | 0.783 | 0.497 | (.2,.8) |
| | MiniLM + TAG | 0.549 | 0.521 | 0.564 | 0.783 | 0.507 | (.1,.8,.1) |
| Cooking | MiniLM | 0.600 | 0.567 | 0.600 | 0.719 | 0.553 | (.1,.9) |
| | MiniLM + TAG | 0.619 | 0.583 | 0.614 | 0.719 | 0.568 | (.1,.8,.1) |
| DIY | MiniLM | 0.323 | 0.313 | 0.346 | 0.501 | 0.302 | (.1,.9) |
| | MiniLM + TAG | 0.335 | 0.324 | 0.356 | 0.501 | 0.312 | (.1,.8,.1) |
| Expatriates | MiniLM + TAG | 0.596 | 0.653 | 0.682 | 0.832 | 0.645 | (.1,.9,.0) |
| Fitness | MiniLM + TAG | 0.568 | 0.575 | 0.613 | 0.760 | 0.567 | (.2,.8,.0) |
| Freelancing | MiniLM + TAG | 0.513 | 0.472 | 0.506 | 0.654 | 0.457 | (.1,.9,.0) |
| Gaming | MiniLM | 0.510 | 0.534 | 0.562 | 0.686 | 0.532 | (.1,.9) |
| | MiniLM + TAG | 0.519 | 0.547 | 0.571 | 0.686 | 0.541 | (.1,.8,.1) |
| Gardening | MiniLM | 0.344 | 0.362 | 0.396 | 0.520 | 0.359 | (.1,.9) |
| | MiniLM + TAG | 0.345 | 0.369 | 0.399 | 0.520 | 0.363 | (.1,.8,.1) |
| Genealogy | MiniLM + TAG | 0.592 | 0.605 | 0.631 | 0.779 | 0.594 | (.3,.7,.0) |
| Health | MiniLM + TAG | 0.718 | 0.765 | 0.797 | 0.934 | 0.765 | (.2,.8,.0) |
| Hermeneutics | MiniLM | 0.589 | 0.538 | 0.593 | 0.828 | 0.526 | (.2,.8) |
| | MiniLM + TAG | 0.632 | 0.570 | 0.617 | 0.828 | 0.552 | (.1,.8,.1) |
| Hinduism | MiniLM | 0.388 | 0.415 | 0.459 | 0.686 | 0.416 | (.2,.8) |
| | MiniLM + TAG | 0.382 | 0.410 | 0.457 | 0.686 | 0.412 | (.1,.8,.1) |
| History | MiniLM + TAG | 0.740 | 0.735 | 0.764 | 0.862 | 0.730 | (.2,.8,.0) |
| Hsm | MiniLM + TAG | 0.666 | 0.707 | 0.737 | 0.870 | 0.690 | (.2,.8,.0) |
| Interpersonal | MiniLM + TAG | 0.663 | 0.617 | 0.653 | 0.739 | 0.604 | (.2,.8,.0) |
| Islam | MiniLM | 0.382 | 0.412 | 0.453 | 0.642 | 0.410 | (.1,.9) |
| | MiniLM + TAG | 0.395 | 0.427 | 0.464 | 0.642 | 0.421 | (.1,.8,.1) |
| Judaism | MiniLM + TAG | 0.363 | 0.387 | 0.432 | 0.649 | 0.388 | (.2,.8,.0) |
| Law | MiniLM | 0.663 | 0.647 | 0.678 | 0.803 | 0.639 | (.2,.8) |
| | MiniLM + TAG | 0.677 | 0.657 | 0.687 | 0.803 | 0.649 | (.1,.8,.1) |
| Lifehacks | MiniLM | 0.714 | 0.601 | 0.617 | 0.703 | 0.553 | (.1,.9) |
| | MiniLM + TAG | 0.714 | 0.621 | 0.631 | 0.703 | 0.568 | (.1,.8,.1) |
| Linguistics | MiniLM + TAG | 0.584 | 0.588 | 0.630 | 0.794 | 0.587 | (.2,.8,.0) |
| Literature | MiniLM + TAG | 0.871 | 0.878 | 0.889 | 0.934 | 0.876 | (.3,.7,.0) |
| Martialarts | MiniLM | 0.630 | 0.599 | 0.645 | 0.796 | 0.596 | (.1,.9) |
| | MiniLM + TAG | 0.640 | 0.628 | 0.660 | 0.796 | 0.612 | (.1,.8,.1) |
| Money | MiniLM | 0.545 | 0.535 | 0.563 | 0.706 | 0.515 | (.2,.8) |
| | MiniLM + TAG | 0.559 | 0.542 | 0.571 | 0.706 | 0.523 | (.1,.8,.1) |
| Movies | MiniLM | 0.713 | 0.722 | 0.753 | 0.865 | 0.724 | (.1,.9) |
| | MiniLM + TAG | 0.728 | 0.735 | 0.762 | 0.865 | 0.735 | (.1,.8,.1) |
| Music | MiniLM | 0.508 | 0.447 | 0.476 | 0.602 | 0.418 | (.2,.8) |
| | MiniLM + TAG | 0.522 | 0.460 | 0.486 | 0.602 | 0.427 | (.1,.8,.1) |
| Musicfans | MiniLM + TAG | 0.531 | 0.531 | 0.560 | 0.693 | 0.539 | (.1,.9,.0) |
| Opensource | MiniLM | 0.574 | 0.593 | 0.621 | 0.771 | 0.581 | (.2,.8) |
| | MiniLM + TAG | 0.577 | 0.598 | 0.622 | 0.771 | 0.581 | (.1,.8,.1) |
| Outdoors | MiniLM + TAG | 0.681 | 0.643 | 0.675 | 0.819 | 0.629 | (.1,.9,.0) |
| Parenting | MiniLM + TAG | 0.485 | 0.430 | 0.452 | 0.602 | 0.399 | (.1,.9,.0) |
| Pets | MiniLM | 0.509 | 0.531 | 0.565 | 0.685 | 0.523 | (.1,.9) |
| | MiniLM + TAG | 0.519 | 0.549 | 0.581 | 0.685 | 0.541 | (.1,.8,.1) |
| Philosophy | MiniLM + TAG | 0.568 | 0.514 | 0.546 | 0.707 | 0.491 | (.2,.8,.0) |
| Politics | MiniLM + TAG | 0.659 | 0.630 | 0.659 | 0.814 | 0.608 | (.1,.9,.0) |
| Rpg | MiniLM | 0.657 | 0.646 | 0.685 | 0.849 | 0.640 | (.2,.8) |
| | MiniLM + TAG | 0.677 | 0.660 | 0.695 | 0.849 | 0.651 | (.1,.8,.1) |
| Scifi | MiniLM | 0.532 | 0.563 | 0.596 | 0.745 | 0.559 | (.2,.8) |
| | MiniLM + TAG | 0.549 | 0.574 | 0.606 | 0.745 | 0.569 | (.1,.8,.1) |
| Skeptics | MiniLM + TAG | 0.862 | 0.869 | 0.887 | 0.969 | 0.867 | (.2,.8,.0) |
| Sound | MiniLM | 0.377 | 0.410 | 0.451 | 0.626 | 0.405 | (.2,.8) |
| | MiniLM + TAG | 0.380 | 0.423 | 0.454 | 0.626 | 0.413 | (.1,.8,.1) |
| Sports | MiniLM | 0.673 | 0.721 | 0.756 | 0.902 | 0.724 | (.2,.8) |
| | MiniLM + TAG | 0.692 | 0.743 | 0.775 | 0.902 | 0.740 | (.1,.8,.1) |
| Sustainability | MiniLM | 0.657 | 0.677 | 0.735 | 0.895 | 0.675 | (.1,.9) |
| | MiniLM + TAG | 0.694 | 0.716 | 0.763 | 0.895 | 0.706 | (.1,.8,.1) |
| Travel | MiniLM + TAG | 0.546 | 0.547 | 0.576 | 0.700 | 0.530 | (.1,.9,.0) |
| Vegetarianism | MiniLM + TAG | 0.623 | 0.626 | 0.678 | 0.869 | 0.641 | (.3,.7,.0) |
| Woodworking | MiniLM + TAG | 0.656 | 0.654 | 0.692 | 0.847 | 0.645 | (.2,.8,.0) |
| Workplace | MiniLM + TAG | 0.574 | 0.444 | 0.429 | 0.495 | 0.359 | (.3,.7,.0) |
| Writers | MiniLM + TAG | 0.561 | 0.490 | 0.516 | 0.644 | 0.466 | (.2,.8,.0) |
| Average | MiniLM | 0.519 | 0.506 | 0.536 | 0.677 | 0.492 | - |
| | MiniLM + TAG | 0.530 | 0.515 | 0.544 | 0.677 | 0.500 | - |
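The λ tuples in the table (e.g. (.1,.8,.1)) read like convex-combination weights over the BM25, MiniLM, and TAG score components. As a minimal sketch of that idea (an assumption about the setup, not the authors' released code), the function below linearly interpolates normalized per-document scores from each ranker:

```python
def interpolate(score_lists, weights):
    """Combine per-document score dicts with a weighted sum.

    score_lists: list of dicts mapping doc_id -> normalized score,
                 one dict per ranker (e.g. BM25, MiniLM, TAG).
    weights: matching list of interpolation weights, assumed to sum to 1.
    """
    combined = {}
    for w, scores in zip(weights, score_lists):
        for doc_id, s in scores.items():
            combined[doc_id] = combined.get(doc_id, 0.0) + w * s
    return combined

# Toy example with two candidate answers and the (.1,.8,.1) weights:
bm25   = {"d1": 1.0, "d2": 0.2}
minilm = {"d1": 0.4, "d2": 0.9}
tag    = {"d1": 0.0, "d2": 1.0}

ranked = interpolate([bm25, minilm, tag], [0.1, 0.8, 0.1])
# d1: 0.1*1.0 + 0.8*0.4 + 0.1*0.0 = 0.42
# d2: 0.1*0.2 + 0.8*0.9 + 0.1*1.0 = 0.84
```

In practice the raw scores from lexical and neural rankers live on different scales, so some normalization (e.g. min-max per query) would be applied before interpolating; the weight tuples here are per-community values reported in the table.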

Keywords

information retrieval, question answering, personalization, user model

  • BIP! impact indicators
    selected citations (derived from selected sources; an alternative to "Influence", reflecting total impact based on the underlying citation network): 1
    popularity (the "current" attention an article receives in the research community, based on the underlying citation network): Average
    influence (the overall/total impact of an article, based on the underlying citation network, diachronically): Average
    impulse (the initial momentum of an article directly after publication, based on the underlying citation network): Average
  • OpenAIRE UsageCounts
    views: 47
    downloads: 67