Learning the Language

Kenneth Wexler is combining biological research and linguistic theory to unravel the mystery of how we acquire language.
August 1, 2005

Philosophers are content to hem and haw around one question for centuries without ever making an inch of tangible progress. Socrates, after all, asked questions; he didn’t answer them. But nowadays, some of the most interesting philosophical questions are scientific. Kenneth Wexler, professor of linguistics and brain and cognitive sciences, says the question he would like to see answered by science is one that is already centuries old and well worn by philosophers: how does the human mind produce and interpret language?

Sitting in his fourth-floor office overlooking Kendall Square, Wexler, who came to MIT in 1988, talks about language with the animation and enthusiasm of a researcher who is just beginning his career. He’s undaunted that the question he’s trying to answer has challenged the greatest minds in philosophy. “For the first time, I think we’re in a position to really study it,” he says.

That’s because researchers like Wexler are developing a new approach–one that combines the rich tradition of theory from linguistics with decades of data from cognitive-science research. Over the past 15 years, advances in scanning and imaging technology, along with progress in mapping the genome, have raised hopes of linking biological research ever more tightly with linguistic studies. Linguistic researchers across the country are starting to use traditional linguistic theory to drive experiments and, in turn, are feeding empirical results back into the theory. It’s a formula that works in most sciences, and one that has made Wexler a leading figure in psycholinguistics–the study of the cognitive aspects of language acquisition and use–for almost two decades.

In the late 1980s and throughout the 1990s, Wexler unraveled the details of how and when language emerges in children. He showed that some of the more complex aspects of language use develop in discrete stages. For example, certain elements of language, such as passive construction, normally show up in the speech of children between five and seven years of age–a pattern that is generally consistent across languages. Most adults understand that the sentence “John was pushed by Mary” describes exactly the same event as “Mary pushed John.” But to a young child, that relationship may not be so clear. Wexler maintains that children have trouble using and understanding the passive construction until they are about six or seven because the necessary biology hasn’t yet matured. In this view, children don’t “learn” the more complex aspects of their native language any more than they “learn” to grow adult teeth.

Wexler’s findings are now widely accepted. Nevertheless, questions about language development abound. How do toddlers pick up the subtleties and nuances of grammar without formal training and with such inconsistent and incomplete input from adults? Which linguistic faculties (if any) are present at birth, and which have to be learned? Why does language develop in stages? A classic method of determining how something works is to study what happens when it breaks. So Wexler is focusing on the speech of stroke patients and of children with genetically based developmental disorders. He hopes that by studying what happens when language breaks down, he’ll be able to address broader questions, such as which parts of the brain govern language use, and how genetics might steer language acquisition.

Philosophical Origins
Wexler’s ideas draw heavily from traditional linguistic theory–and particularly from the work of his MIT colleague Noam Chomsky, who in the late 1950s revolutionized the study of language by presenting a forceful argument that people have innate linguistic knowledge. Chomsky claimed that humans are endowed with a knowledge of “universal grammar”–a set of principles that are consistent across all languages. Such principles aren’t learned: we’re born with them. It is up to children to learn the particulars of their native languages.

Chomsky’s nativism–the belief that the mind has ideas that don’t come from external sources–broke from the predominant belief of the time, that language is entirely learned through exposure and experience. His ideas were “absolutely radical,” says Wexler. “Instead of thinking of language primarily as a cultural phenomenon, or a social phenomenon, he resituated it within the sciences as part of human biology. That was one of the biggest revolutions in the study of language.”

The idea of a universal grammar has been hotly debated ever since Chomsky introduced it. Now, research like Wexler’s is adding credence to Chomsky’s theories. “Ken’s work is important in that it has supported Chomsky’s nativist position,” says Rosalind Thornton, a senior lecturer in the linguistics department at Macquarie University in Sydney, Australia, who worked as a postdoc in Wexler’s lab from 1990 to 1993. “Ken has maintained and supported the idea over the years that there is a universal grammar that we’re born with.”

Gathering the Evidence
If you want to see Wexler at work, you’re better off heading to a day-care center than to a lab. His graduate students evaluate children’s speech by playing simple games with them. The students chart each child’s age and which grammatical constructions he or she uses and understands. Later, the results can be compared with data from children with developmental disorders.

One area that holds promise for Wexler’s research is the study of Williams syndrome, a rare genetic condition that comes with a host of developmental delays and learning disabilities. Children with Williams syndrome have, in the past, been used to argue that linguistic and cognitive abilities are not identical: they have low IQs (usually around 50 or 60) but are said to have exceptional language skills, making them sound, at times, more fluent and expressive than non–Williams syndrome children of the same age with similarly low IQs.

But new research is challenging that analysis. Last summer, Alexandra Perovic, a postdoc in Wexler’s lab, conducted a study of Williams syndrome children. Wexler and Perovic found that while language development in Williams syndrome children mirrors that in typical children, those grammatical structures that are delayed in typical development are even more delayed in Williams syndrome children, or may not be acquired at all. Children with Williams syndrome “were supposed to be an example of people who really have language under control,” says Wexler. “But it turns out they actually do have problems.” Perovic will resume her work with Williams syndrome patients later this year.

Because Williams syndrome is a well-defined genetic disorder, Wexler says, it may indicate correlations between particular sets of genes and specific linguistic problems. Wexler hopes that studying Williams syndrome and similar disorders will yield clues about the genetic makeup of language. “Thirty years ago, I couldn’t imagine you could even begin to think about how to do this.” He has no illusions that the task will be easy, however, and figures that his studies will bear fruit “maybe not in my lifetime, but maybe in my students’ lifetime.”

Wexler’s studies have already yielded a practical diagnostic tool. The description of typical language development that he presented in the 1990s led to a major test used to identify a condition known as specific language impairment, in which a child’s ability to use language, but not his or her cognition, is impaired. Other research may eventually result in better speech therapy for stroke patients. But for the most part, Wexler’s work has the flavor of philosophy: plenty of questions and very few answers.

“It’s hard for me to think that anything could be as exciting as understanding what the nature of language is in the brain,” says Wexler. Philosophers have often said that language is what distinguishes us from animals, what makes us fundamentally human. “Some theologians might say it’s the soul. But how would you have religion without language? Language is what allows cognition,” says Wexler. “We couldn’t be anything like what we are without language. We couldn’t do science, we couldn’t have social interaction of the type that we do, we couldn’t talk about the weather.”

For all his emphasis on biology and empirical evidence, Wexler seems a philosopher at heart.
