Resource-Rational Lossy-Context Surprisal is a computationally implemented model of human language processing that predicts the points in complex sentences at which comprehenders experience difficulty. It unifies the memory-based and expectation-based paradigms in psycholinguistics and offers a more refined account of when hierarchically structured sentences are difficult for humans to comprehend.

This repository contains the model's output on a battery of test sentences exhibiting iterated recursive structure, as described in the associated publications on Resource-Rational Lossy-Context Surprisal. The filenames are referenced in the source code, which will be published together with a forthcoming journal article on the model.

The model was first described in: Michael Hahn, Richard Futrell, and Edward Gibson. "Lexical Effects in Structural Forgetting: Evidence for Experience-Based Accounts and a Neural Network Model." 33rd Annual CUNY Human Sentence Processing Conference, 2020.
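A minimal toy sketch of the core idea, not the authors' implementation: comprehension difficulty is modeled as surprisal (negative log probability of a word) computed under a lossy memory representation of the preceding context. The bigram probabilities, erasure rate, and prior below are illustrative assumptions chosen only to show the mechanism, where forgetting the context word with some probability inflates the surprisal of words that the intact context would have predicted well.

```python
import math

# Illustrative bigram "language model": P(word | previous word).
# These probabilities are made up for the sketch, not taken from the model.
BIGRAM = {
    ("the", "reporter"): 0.6,
    ("the", "editor"): 0.4,
    ("reporter", "who"): 0.5,
    ("reporter", "left"): 0.5,
}

FLOOR = 1e-6  # small floor probability for unseen bigrams


def surprisal(word, prev):
    """Plain (lossless) surprisal: -log2 P(word | prev)."""
    return -math.log2(BIGRAM.get((prev, word), FLOOR))


def lossy_context_surprisal(word, prev, erase_p=0.3, prior=None):
    """Expected surprisal when the context word is forgotten with
    probability erase_p and reconstructed from a prior over contexts.
    The prior here is a hypothetical stand-in for the comprehender's
    expectations about likely contexts."""
    prior = prior or {"the": 0.5, "reporter": 0.5}
    # With prob (1 - erase_p) the comprehender retains `prev` intact;
    # with prob erase_p the context is lost and marginalized over the prior.
    p_word = (1 - erase_p) * BIGRAM.get((prev, word), FLOOR)
    for guess, p_guess in prior.items():
        p_word += erase_p * p_guess * BIGRAM.get((guess, word), FLOOR)
    return -math.log2(p_word)
```

Under these toy numbers, `lossy_context_surprisal("reporter", "the")` exceeds `surprisal("reporter", "the")`: losing the context word makes a word that the intact context strongly predicted look less expected, which is the qualitative pattern the model uses to explain difficulty in recursive structures.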
Psycholinguistics
