
Modern machine learning has subverted and bypassed the theoretical framework of Chomsky’s generative approach to linguistics, including its core claims to particular insights, principles, structures, and processes. I describe the sense in which modern language models implement genuine theories of language, and I highlight the links between these models and approaches to linguistics that are based on gradient computations and memorized constructions. I also describe why these models undermine strong claims for the innateness of language and respond to several critiques of large language models, including arguments that they can’t answer “why” questions and skepticism that they are informative about real-life acquisition. Most notably, large language models have attained remarkable success at discovering grammar without using any of the methods that some in linguistics insisted were necessary for a science of language to progress.
