Description
By: Yosef Grodzinsky
A leading neurolinguist explains linguistic theory and large language models—the top contenders for understanding human language—and evaluates them in the context of the brain.
Contemporary linguistics, founded and inspired by Noam Chomsky, seeks to understand the hallmark of our humanity—language. Linguists develop powerful tools to discover how knowledge of language is acquired and how the brain puts it to use. AI experts, using vastly different methods, create remarkable neural networks—large language models (LLMs) such as ChatGPT—said to learn and use language like us.
Chomsky called LLMs “a false promise.” AI leader Geoffrey Hinton has declared that “neural nets are much better at processing language than anything ever produced by the Chomsky School of Linguistics.”
Who is right, and how can we tell? Do we learn everything from scratch, or could some knowledge be innate? Is our brain one big network, or is it built out of modules, language being one of them?
In How Deeply Human Is Language?, Yosef Grodzinsky explains both approaches and confronts them with reality as it emerges from the engineering, linguistic, and neurological records. He walks readers through the vastly different methods, tools, and findings of these fields. Aiming to find a common path forward, he describes the conflict but also locates points of potential contact, and sketches a joint research program that may unite these communities in a common effort to understand knowledge and learning in the brain.