The Gradient: Perspectives on AI
In episode 107 of The Gradient Podcast, Daniel Bashir speaks to Professor Ted Gibson.
Ted is a Professor of Cognitive Science at MIT. He leads the TedLab, which investigates why languages look the way they do; the relationship between culture and cognition, including language; and how people learn, represent, and process language.
Have suggestions for future podcast guests (or other feedback)? Let us know here or reach us at [email protected]
Subscribe to The Gradient Podcast: Apple Podcasts | Spotify | Pocket Casts | RSS
Follow The Gradient on Twitter
Outline:
* (00:00) Intro
* (02:13) Prof Gibson’s background
* (05:33) The computational linguistics community and NLP, engineering focus
* (10:48) Models of brains
* (12:03) Prof Gibson’s focus on behavioral work
* (12:53) How dependency distances impact language processing
* (14:03) Dependency distances and the origin of the problem
* (18:53) Dependency locality theory
* (21:38) The structures languages tend to use
* (24:58) Sentence parsing: structural integrations and memory costs
* (36:53) Reading strategies vs. ordinary language processing
* (40:23) Legalese
* (46:18) Cross-dependencies
* (50:11) Number as a cognitive technology
* (54:48) Experiments
* (1:03:53) Why counting is useful for Western societies
* (1:05:53) The Whorf hypothesis
* (1:13:05) Language as Communication
* (1:13:28) The noisy channel perspective on language processing
* (1:27:08) Fedorenko lab experiments—language for thought vs. communication and Chomsky’s claims
* (1:43:53) Thinking without language, inner voices, language processing vs. language as an aid for other mental processing
* (1:53:01) Dependency grammars and a critique of Chomsky’s grammar proposals, LLMs
* (2:08:48) LLM behavior and internal representations
* (2:12:53) Outro
Links:
* Re-imagining our theories of language
* Research — linguistic complexity and dependency locality theory
* Linguistic complexity: locality of syntactic dependencies (1998)
* The Dependency Locality Theory: A Distance-Based Theory of Linguistic Complexity (2000)
* Consequences of the Serial Nature of Linguistic Input for Sentential Complexity (2005)
* Large-scale evidence of dependency length minimization in 37 languages (2015)
* Dependency locality as an explanatory principle for word order (2020)
* A resource-rational model of human processing of recursive linguistic structure (2022)
* Research — language processing / communication and cross-linguistic universals
* Number as a cognitive technology: Evidence from Pirahã language and cognition (2008)
* The communicative function of ambiguity in language (2012)
* Color naming across languages reflects color use (2017)
* How Efficiency Shapes Human Language (2019)