
Do we need language to think?

For thousands of years, philosophers have argued about the purpose of language. Plato believed it was essential to thinking. Thought “is the silent inner conversation of the soul with itself,” he wrote.

Many modern scientists express similar views. Beginning in the 1960s, Noam Chomsky, a linguist at MIT, argued that we use language for reasoning and other forms of thinking. “If there is a serious deficit in language, there will be a serious deficit in thought,” he wrote.

As an undergraduate, Evelina Fedorenko attended Dr. Chomsky’s class and heard him describe his theory. “I really liked the idea,” she recalled. But she was puzzled by the lack of evidence. “A lot of the things he was saying were just stated as if they were facts – the truth,” she said.

Dr. Fedorenko became a cognitive neuroscientist at MIT, using brain scans to study how the brain produces language. And after 15 years, her research led her to a surprising conclusion: We don’t need language to think.

“When you start evaluating it, you just don’t find support for this role of language in thinking,” she said.

When Dr. Fedorenko began this work in 2009, studies had found that the same brain regions required for language were also active when people reasoned or did arithmetic.

But Dr. Fedorenko and other researchers found that this overlap was a mirage. Part of the problem with the early results was that the scanners were relatively crude. The scientists made the most of their fuzzy scans by combining the results from all their volunteers, creating an overall average of brain activity.

In her own research, Dr. Fedorenko used more powerful scanners and ran more tests on each volunteer. These steps allowed her and her colleagues to collect enough data from each person to create a fine-grained picture of the individual brain.

The scientists then conducted studies to pinpoint the brain circuits involved in language tasks, such as retrieving words from memory and following the rules of grammar. In a typical experiment, volunteers read strings of nonsense words as well as real sentences. The researchers identified certain regions of the brain that became active only when the volunteers were processing actual language.

Each volunteer had a language network—a constellation of regions that become active during language tasks. “It’s very stable,” said Dr. Fedorenko. “If I scan you today and 10 or 15 years from now, it will be in the same place.”

The researchers then scanned the same people while they performed other kinds of thinking, such as solving a puzzle. “Other regions in the brain are working really hard when you’re doing all these forms of thinking,” she said. But the language network remained silent. “It became clear that none of these things seemed to engage the language circuits,” she said.

In a paper published Wednesday in Nature, Dr. Fedorenko and her colleagues say that studies of people with brain injuries point to the same conclusion.

Strokes and other forms of brain damage can destroy the language network, leaving people struggling to process words and grammar, a condition known as aphasia. But scientists have found that people can still do algebra and play chess even with aphasia. In experiments, people with aphasia can look at two numbers—say 123 and 321—and figure out that, using the same pattern, 456 must be followed by 654.
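The rule in that task is simple digit reversal. As a purely illustrative sketch (the code and function name are mine, not part of the study), the pattern the participants infer can be written as:

```python
# Illustrative only: the pattern in the aphasia task described above is
# digit reversal, so the rule inferred from 123 -> 321 carries over to 456 -> 654.

def reverse_digits(n: int) -> int:
    # Read the digits of n from right to left and return the resulting number.
    return int(str(n)[::-1])

assert reverse_digits(123) == 321  # the example pair shown to participants
assert reverse_digits(456) == 654  # the inference people with aphasia can still make
```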

If language is not fundamental to thought, then what is language for? Communication, Dr. Fedorenko and her colleagues argue. Dr. Chomsky and other researchers rejected this idea, citing the ambiguity of words and the difficulty of expressing our intuitions out loud. “The system is not well designed in many functional respects,” Dr. Chomsky once said.

But large-scale studies show that languages are optimized to convey information clearly and effectively.

In one study, researchers found that frequently used words tend to be shorter, which makes languages easier to learn and speeds up the flow of information. In another, researchers who examined 37 languages found that the rules of grammar place related words close together, so that their combined meaning is easier to grasp.

Kyle Mahowald, a linguist at the University of Texas at Austin who was not involved in the new work, said the separation of thought and language may help explain why AI systems like ChatGPT are so good at some tasks and so bad at others.

Computer scientists train these programs on vast amounts of text, from which the programs extract rules about how words relate to one another. Dr. Mahowald suspects that these programs come to mimic the language network in the human brain, but not its capacity for reasoning.

“We may have very fluent, grammatical text that may or may not have a coherent underlying thought,” Dr. Mahowald said.

Dr. Fedorenko noted that many people intuitively believe that language is essential to thought because they have an inner voice that narrates their every thought. But not everyone has this ongoing monologue. And few studies have investigated the phenomenon.

“I don’t have a model of it yet,” she said. “I haven’t even done the work I would need to do to speculate about it.”
