For a Better Brain, Learn Another Language
The cognitive benefits of multilingualism
There’s a certain sinking feeling one gets when thinking of the perfect thing to say just a moment too late. Perhaps a witty parting word could have made all the difference. There is no English word to express this feeling, but the French have the term l’esprit de l’escalier—translated, “stairwell wit”—for this very phenomenon. Nor is there an English word to describe the binge eating that follows an emotional blow, but the Germans have Kummerspeck—“grief-bacon”—to do just that. If we had the Swedish word lagom—which means something is just right—the English explanation of Goldilocks’ perfectly temperate soup could have been a lot more succinct. Or take the term koi no yokan, a poetic Japanese turn of phrase that expresses the feeling of knowing that you will soon fall in love with the person you have just met. It’s not love at first sight so much as an understanding that love is inevitable. Keats and Byron could have really used a word like that. There are many words that English speakers don’t have. Sometimes Anglophones borrow from other languages, but often we have to explain our way around a specific feeling or emotion that doesn’t have its own word, never quite touching on it exactly. “The reason why we borrow words like savoir faire from French is because it’s not part of the culture [in the United States] and therefore that word did not evolve as part of our language,” says George Lakoff, a professor of cognitive science and linguistics at the University of California at Berkeley. (via For a Better Brain, Learn Another Language - The Atlantic)
What’s Up With That: Why It’s So Hard to Lose an Accent
Overcoming an accent is difficult, even for people who have lived in a foreign country most of their lives, or actors who have spent years training themselves to sound authentic in a second language. And studies have shown that accents can often be a burden. Sure, some accents can be adorable and others can make people sound smarter. But in some places, people with certain accents (mostly foreign, but some regional ones as well) are sometimes seen as unintelligent, uneducated, incompetent, and flat out unpleasant to converse with. While our brains are pretty good at picking up (and on) even very subtle accents, we struggle to transfer that insight to our own speech. Why is that? Scientists say it may come down to the first few months of our lives, before we’ve spoken our first word. For over two decades, researchers at the University of Washington have been figuring out how our brains learn language. Many of their experiments have involved measuring how babies from different parts of the world respond to sounds over time. In one study, they played a reel of sounds common in both Japanese and English to children from each culture. At around 6 months, all of the babies responded equally to sounds from both languages. But by the time they reached 10 months, babies no longer noticed sounds that don’t exist in their mother tongue. For instance, at 10 months, the Japanese babies were ignoring the “r” and “l” sounds that are nonexistent in Japanese, but common in English. (via What’s Up With That: Why It’s So Hard to Lose an Accent | WIRED)
go read: The Linguistics of LOL
What Internet vernacular reveals about the evolution of language
When two friends created the site I Can Has Cheezburger?, in 2007, to share cat photos with funny, misspelled captions, it was a way of cheering themselves up. They probably weren’t thinking about long-term sociolinguistic implications. But seven years later, the “cheezpeep” community is still active online, chattering away in lolspeak, its own distinctive variety of English. Lolspeak was meant to sound like the twisted language inside a cat’s brain, and has ended up resembling a down-South baby talk with some very strange characteristics, including deliberate misspellings (teh, ennyfing), unique verb forms (gotted, can haz), and word reduplication (fastfastfast). It can be difficult to master. One user writes that it used to take at least 10 minutes “to read adn unnerstand” a paragraph. (“Nao, it’z almost like a sekund lanjuaje.”) To a linguist, all of this sounds a lot like a sociolect: a language variety that’s spoken within a social group, like Valley Girl–influenced ValTalk or African American Vernacular English. (The word dialect, by contrast, commonly refers to a variety spoken by a geographic group—think Appalachian or Lumbee.) Over the past 20 years, online sociolects have been springing up around the world, from Jejenese in the Philippines to Ali G Language, a British lingo inspired by the Sacha Baron Cohen character. There’s also Padonkaffsky, an aughts-era slang beloved by Russia’s self-described “scum” (they call themselves Padonki—a garbling of podonok, the actual Russian word for “scum”), with phonetic spellings, offensive language, and a popular meme involving outdoor sex and an inopportune bear. Israel has Fakatsa, a sociolect beloved by teen girls—terms from which have popped up on baby clothes and menstrual-pain products. (via The Linguistics of LOL - Britt Peterson - The Atlantic)
Neuroscientists identify key role of language gene
Neuroscientists have found that a gene mutation that arose more than half a million years ago may be key to humans’ unique ability to produce and understand speech. Researchers from MIT and several European universities have shown that the human version of a gene called Foxp2 makes it easier to transform new experiences into routine procedures. When they engineered mice to express humanized Foxp2, the mice learned to run a maze much more quickly than normal mice. The findings suggest that Foxp2 may help humans with a key component of learning language — transforming experiences, such as hearing the word “glass” when we are shown a glass of water, into a nearly automatic association of that word with objects that look and function like glasses, says Ann Graybiel, an MIT Institute Professor, member of MIT’s McGovern Institute for Brain Research, and a senior author of the study. “This really is an important brick in the wall saying that the form of the gene that allowed us to speak may have something to do with a special kind of learning, which takes us from having to make conscious associations in order to act to a nearly automatic-pilot way of acting based on the cues around us,” Graybiel says. Wolfgang Enard, a professor of anthropology and human genetics at Ludwig-Maximilians University in Germany, is also a senior author of the study, which appears in the Proceedings of the National Academy of Sciences this week. The paper’s lead authors are Christiane Schreiweis, a former visiting graduate student at MIT, and Ulrich Bornschein of the Max Planck Institute for Evolutionary Anthropology in Germany. All animal species communicate with each other, but humans have a unique ability to generate and comprehend language. Foxp2 is one of several genes that scientists believe may have contributed to the development of these linguistic skills. 
The gene was first identified in a group of family members who had severe difficulties in speaking and understanding speech, and who were found to carry a mutated version of the Foxp2 gene. In 2009, Svante Pääbo, director of the Max Planck Institute for Evolutionary Anthropology, and his team engineered mice to express the human form of the Foxp2 gene, which encodes a protein that differs from the mouse version by only two amino acids. His team found that these mice had longer dendrites — the slender extensions that neurons use to communicate with each other — in the striatum, a part of the brain implicated in habit formation. They were also better at forming new synapses, or connections between neurons. (via Neuroscientists identify key role of language gene — ScienceDaily)
Neuroscientists test the theory that your body shapes your ideas
The player kicked the ball.
The patient kicked the habit.
The villain kicked the bucket.
The verbs are the same.
The syntax is identical.
Does the brain notice, or care, that the first is literal, the second metaphorical, the third idiomatic?
It sounds like a question that only a linguist could love. But neuroscientists have been trying to answer it using exotic brain-scanning technologies. Their findings have varied wildly, in some cases contradicting one another. If they make progress, the payoff will be big. Their findings will enrich a theory that aims to explain how wet masses of neurons can understand anything at all. And they may drive a stake into the widespread assumption that computers will inevitably become conscious in a humanlike way. The hypothesis driving their work is that metaphor is central to language. Metaphor used to be thought of as merely poetic ornamentation, aesthetically pretty but otherwise irrelevant. “Love is a rose, but you better not pick it,” sang Neil Young in 1977, riffing on the timeworn comparison between a sexual partner and a pollinating perennial. For centuries, metaphor was just the place where poets went to show off. But in their 1980 book, Metaphors We Live By, the linguist George Lakoff (at the University of California at Berkeley) and the philosopher Mark Johnson (now at the University of Oregon) revolutionized linguistics by showing that metaphor is actually a fundamental constituent of language. For example, they showed that in the seemingly literal statement “He’s out of sight,” the visual field is metaphorized as a container that holds things. The visual field isn’t really a container, of course; one simply sees objects or not. But the container metaphor is so ubiquitous that it wasn’t even recognized as a metaphor until Lakoff and Johnson pointed it out. From such examples they argued that ordinary language is saturated with metaphors. Our eyes point to where we’re going, so we tend to speak of future time as being “ahead” of us. When things increase, they tend to go up relative to us, so we tend to speak of stocks “rising” instead of getting more expensive. “Our ordinary conceptual system is fundamentally metaphorical in nature,” they wrote.