Description
By: Yosef Grodzinsky
A leading neurolinguist explains linguistic theory and large language models—the top contenders for understanding human language—and evaluates them in the context of the brain.
Contemporary linguistics, founded and inspired by Noam Chomsky, seeks to understand the hallmark of our humanity—language. Linguists develop powerful tools to discover how knowledge of language is acquired and how the brain puts it to use. AI experts, using vastly different methods, create remarkable neural networks—large language models (LLMs) such as ChatGPT—said to learn and use language like us.
Chomsky called LLMs “a false promise.” AI leader Geoffrey Hinton has declared that “neural nets are much better at processing language than anything ever produced by the Chomsky School of Linguistics.”
Who is right, and how can we tell? Do we learn everything from scratch, or could some knowledge be innate? Is our brain one big network, or is it built out of modules, language being one of them?
In How Deeply Human Is Language?, Yosef Grodzinsky explains both approaches and confronts them with the reality as it emerges from the engineering, the linguistic, and the neurological record. He walks readers through vastly different methods, tools, and findings from all these fields. Aiming to find a common path forward, he describes the conflict, but also locates points of potential contact, and sketches a joint research program that may unite these communities in a common effort to understand knowledge and learning in the brain.