This file contains a set-to-sequence model trained on the PROCAT dataset of product catalogues. The model consists of two modules: a Set Transformer that encodes the set of product offers, and a permutation-learning module in the form of a Pointer Network, which implicitly clusters the offers into complementary sections and outputs the target catalogue structure. References to the authors of the individual models used can be found in the paper describing the dataset, which is under review for the NeurIPS 2021 Datasets track as of June 2021. Code for reproducing the experiments is available at: https://github.com/mateuszjurewicz/procat. Using the model requires the appropriate version of PyTorch and the proper model definition, as shown in the provided Jupyter notebooks.
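To make the two-module architecture concrete, below is a minimal PyTorch sketch of a set-to-sequence model of this kind. It is not the repository's actual model definition: the class names, dimensions, two-block encoder depth, and greedy decoding loop are all illustrative assumptions. It pairs a small Set Transformer-style encoder (stacked permutation-equivariant self-attention blocks) with a Pointer Network decoder that attends over the encoded set and emits a permutation of the inputs.

```python
import torch
import torch.nn as nn


class MAB(nn.Module):
    """Multihead Attention Block in the style of the Set Transformer
    (attention + residual + feed-forward); a simplified sketch."""

    def __init__(self, dim, heads):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.ln1 = nn.LayerNorm(dim)
        self.ln2 = nn.LayerNorm(dim)
        self.ff = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, q, kv):
        h = self.ln1(q + self.attn(q, kv, kv)[0])
        return self.ln2(h + self.ff(h))


class SetEncoder(nn.Module):
    """Permutation-equivariant encoder: each offer attends to the whole set."""

    def __init__(self, in_dim, dim=64, heads=4):
        super().__init__()
        self.proj = nn.Linear(in_dim, dim)
        self.sab1 = MAB(dim, heads)  # self-attention block 1
        self.sab2 = MAB(dim, heads)  # self-attention block 2

    def forward(self, x):            # x: (batch, n_offers, in_dim)
        h = self.proj(x)
        h = self.sab1(h, h)
        return self.sab2(h, h)       # (batch, n_offers, dim)


class PointerDecoder(nn.Module):
    """Pointer Network decoder: at each step, additive attention scores over
    the encoded set pick the next offer; chosen offers are masked out."""

    def __init__(self, dim=64):
        super().__init__()
        self.lstm = nn.LSTMCell(dim, dim)
        self.w_enc = nn.Linear(dim, dim, bias=False)
        self.w_dec = nn.Linear(dim, dim, bias=False)
        self.v = nn.Linear(dim, 1, bias=False)

    def forward(self, enc):          # enc: (batch, n, dim)
        b, n, d = enc.shape
        hx = enc.mean(dim=1)         # initialise decoder state from the set
        cx = torch.zeros_like(hx)
        inp = torch.zeros(b, d)
        mask = torch.zeros(b, n, dtype=torch.bool)
        order = []
        for _ in range(n):
            hx, cx = self.lstm(inp, (hx, cx))
            scores = self.v(torch.tanh(self.w_enc(enc) + self.w_dec(hx).unsqueeze(1)))
            scores = scores.squeeze(-1).masked_fill(mask, float("-inf"))
            idx = scores.argmax(dim=-1)              # greedy choice of next offer
            order.append(idx)
            mask = mask.scatter(1, idx.unsqueeze(1), True)
            inp = enc.gather(1, idx.view(b, 1, 1).expand(b, 1, d)).squeeze(1)
        return torch.stack(order, dim=1)             # (batch, n) permutation
```

Given a batch of offer sets, `PointerDecoder(SetEncoder(...)(x))` returns, for each set, an ordering of the offer indices; training would score each step's logits against a target catalogue order, but that loop is omitted here.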
Keywords: machine learning, permutation learning, set-to-sequence, set transformer, pointer network, product catalog, product catalogue
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources; an alternative to the "Influence" indicator. | 0 |
| Popularity | The "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network. | Average |
| Influence | The overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | Average |
| Impulse | The initial momentum of an article directly after its publication, based on the underlying citation network. | Average |
| Views (provided by UsageCounts) | Number of recorded views. | 2 |
