How Can Humans Talk?

A new study links the evolution of a single gene to the human capacity for language.
November 11, 2009

The first concrete evidence of a genetic link to the evolution of language in humans was published today in the journal Nature. Researchers led by UCLA neurogeneticist Daniel Geschwind have shown that two small differences between the human and chimpanzee versions of a protein called FOXP2 result in significant differences in the behavior of dozens of genes in human neurons.

FOXP2 is a protein known as a transcription factor; its role is to turn other genes off or on. Geschwind and his collaborators deleted the native gene for FOXP2 from a lab-grown line of human neurons. They then inserted either the gene for human or chimp FOXP2 into the cells and screened the cells to see which genes were being expressed, or actively producing proteins. The researchers identified dozens of genes that were expressed at either higher or lower levels depending on whether the cells were making human or chimp FOXP2. They verified these findings by examining gene expression patterns in post-mortem brain tissue from both chimps and humans who died of natural causes.

[Image] Lab-grown nerve cells expressing human FOXP2, a gene believed to be involved in the evolution of language. The orange/red cells are those making the most FOXP2. Credit: Gena Konopka

Geschwind says that the new study demonstrates that the two mutations believed to be important to FOXP2’s evolution in humans change not only how the protein looks but also how it works, resulting in different gene targets being switched on or off in human and chimp brains. “Our findings may shed light on why human brains are born with the circuitry for speech and language and chimp brains are not,” Geschwind said in a UCLA press release on the research.

Geschwind and other scientists have been studying FOXP2’s role in the development of language since the gene’s discovery in 2001. Jon Cohen reported on some of Geschwind’s ground-breaking research on the genetics of language in the January 2008 issue of Technology Review.
