handle: 10810/51271
Knowledge transfer between tasks can significantly improve the efficiency of machine learning algorithms. In supervised natural language understanding problems, this kind of improvement is critical because labelled data are usually scarce. In this paper we address the question of transfer learning between related topic classification tasks. A characteristic of our problem is that the tasks have a hierarchical relationship; we therefore introduce and validate a way of implementing the transfer that exploits this hierarchical structure. Our results on a real-world topic classification task show that the transfer can improve classifier performance for some particular problems. The research presented in this paper is conducted as part of the EMPATHIC project, which has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 769872.
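As a rough illustration of the idea described in the abstract, the sketch below shows one common way to transfer knowledge down a topic hierarchy: train a classifier on the coarse (parent) topics, then reuse its encoder weights to initialise a classifier for the finer-grained (child) topics. This is a minimal sketch under assumed choices (PyTorch, a bag-of-embeddings encoder, weight copying and optional freezing); it is not the architecture or transfer procedure used in the paper, and all names and hyperparameters are illustrative.

```python
# Minimal sketch of hierarchy-aware transfer for topic classification.
# Assumptions (not from the paper): PyTorch, a mean-pooled bag-of-embeddings
# encoder, and transfer by copying encoder weights from a parent-topic model
# to a child-topic model. Sizes and names are purely illustrative.
import torch
import torch.nn as nn


class TopicClassifier(nn.Module):
    """Encoder shared across hierarchy levels plus a level-specific head."""

    def __init__(self, vocab_size: int, embed_dim: int, num_topics: int):
        super().__init__()
        self.encoder = nn.EmbeddingBag(vocab_size, embed_dim)  # mean-pooled embeddings
        self.head = nn.Linear(embed_dim, num_topics)           # logits for this level's topics

    def forward(self, token_ids: torch.Tensor, offsets: torch.Tensor) -> torch.Tensor:
        return self.head(self.encoder(token_ids, offsets))


# 1) Train a classifier on the coarse (parent-level) topics.
parent_model = TopicClassifier(vocab_size=20_000, embed_dim=64, num_topics=5)
# ... standard supervised training loop on parent-level labels ...

# 2) Transfer: initialise the child-level model with the parent encoder,
#    keeping a freshly initialised head for the finer-grained label set.
child_model = TopicClassifier(vocab_size=20_000, embed_dim=64, num_topics=23)
child_model.encoder.load_state_dict(parent_model.encoder.state_dict())

# 3) Optionally freeze the transferred encoder when child-level data are scarce,
#    so only the new head is fitted to the small labelled set.
for param in child_model.encoder.parameters():
    param.requires_grad = False
```

The design choice sketched here (copying and optionally freezing shared lower layers while re-learning a task-specific output layer) is a standard pattern for transfer between related classification tasks when labelled data for the target task are limited.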
transfer learning, neural networks, NLP, hierarchical classification