Published July 5, 2024 | Version v1
Book chapter | Open access

Modern language models refute Chomsky's approach to language

Authors/Creators

  • 1. UC Berkeley & Helen Wills Neuroscience Institute

Description

Modern machine learning has subverted and bypassed the theoretical framework
of Chomsky’s generative approach to linguistics, including its core claims to
particular insights, principles, structures, and processes. I describe the sense in which
modern language models implement genuine theories of language, and I highlight
the links between these models and approaches to linguistics that are based on
gradient computations and memorized constructions. I also describe why these
models undermine strong claims for the innateness of language and respond to
several critiques of large language models, including arguments that they can’t
answer “why” questions and skepticism that they are informative about real-life
acquisition. Most notably, large language models have attained remarkable success
at discovering grammar without using any of the methods that some in linguistics
insisted were necessary for a science of language to progress.

Files

434-GibsonPoliak-2024-15.pdf (413.0 kB)
md5:7f418ce592466d3d50db8c75b86cd342

Additional details

Related works

Is part of
978-3-96110-473-4 (ISBN)
10.5281/zenodo.11351540 (DOI)