Home/ Words R Us/ Group items matching "consonants" in title, tags, annotations or url

Lisa Stewart

Pokorny Root Index - 1 views

  • Each consonant and vowel sound from Proto-Indo-European (links from each letter take you to all the traceable words).
Ryan Catalani

Consonants: The thorny thicket of "th-" sounds | The Economist - 2 views

  • "Indeed the dental fricatives, as they're known, are rare, existing in European languages today only in languages on the continental periphery. ... Read below for the intriguing, but possibly chimerical, link between dental fricatives and blood type."
kaylynfukuji17

Orangutan squeaks reveal language evolution, says study - BBC News - 0 views

  • In this article, scientists analyze 5,000 orangutan "kiss squeaks" and suggest that similar calls in our ancestors may have been precursors of consonants and vowels. Kiss squeaks require coordinated action of the lips, tongue, and jaw, much like the movements needed to produce consonants, leading scientists to view such calls as crucial "building blocks" in the evolution of language.
kainoapaul22

Neanderthals Listened to the World Much Like Us - 0 views

  • This article describes a recent study in which scientists were able to use CT scans to generate 3-D models of Neanderthal ear structures. In the past, attempts to determine whether Neanderthals used language hinged on the hyoid bone, a single piece of the Neanderthal vocal tract. However, these scientists took a different approach by looking at the ears of Neanderthals to give clues about Neanderthal language. By running the ear models through computer programs, scientists were able to determine that the Neanderthal ear's "sweet spot" included higher frequencies characteristic of consonant production, and therefore human language. This is exciting because it gives scientists another piece in the puzzle of early human language development.
Lara Cowell

Pretending to Understand What Babies Say Can Make Them Smarter - 0 views

  • New research suggests it's how parents talk to their infants, not just how often, that makes a difference for language development. Infants whose mothers had shown "sensitive" responses--verbally replied to or imitated the babies' sounds--showed increased rates of consonant-vowel vocalizations, meaning that their babbling more closely resembled something like real syllables, paving the way for real words. The same babies were also more likely to direct their noises at their mothers, indicating that they were "speaking" to them rather than simply babbling for babbling's sake. "The infants were using vocalizations in a communicative way, in a sense, because they learned they are communicative," study author Julie Gros-Louis, a psychology professor at the University of Iowa, said in a statement. In other words, by acting like they understood what their babies were saying and responding accordingly, the mothers were helping to introduce the concept that voices, more than just instruments for making fun noises, could also be tools for social interaction.
Ryan Catalani

"Language X is essentially language Y under conditions Z." - 4 views

  • For example: "English is essentially bad Dutch with outrageously pronounced French and Latin vocabulary." "Hawaiian is a cousin of Indonesian with a fear of consonants." "Spanish is essentially Italian spoken by Arabs."
misamurata17

'Th' sound to vanish from English language by 2066 because of multiculturalism, say linguists - 1 views

  • Linguists predict that by 2066 the "th" sound will have vanished completely in the capital because so many foreigners struggle to pronounce interdental consonants - the term for sounds created by pushing the tongue against the upper teeth.
calistaagmata21

Opposite Patterns of Hemisphere Dominance for Early Auditory Processing of Lexical Tones and Consonants: JSTOR - 0 views

  • Researchers have looked into how tonal languages are processed in the brain. They found that lexical tones are processed in the right hemisphere, while consonants are processed in the left hemisphere.
Lara Cowell

The Linguistic Mystery of Tonal Languages - The Atlantic - 0 views

  • In many languages, such as Mandarin Chinese, pitch is as important as consonants and vowels for distinguishing one word from another. Tone languages are spoken all over the world, but they tend to cluster in three places: East and Southeast Asia; sub-Saharan Africa; and among the indigenous communities of Mexico. There are certain advantages to speaking tone languages. Speakers of some African languages can communicate across long distances by playing the tones on drums, and Mazatec speakers in Mexico use whistling for the same purpose. Also, speakers of tonal languages are better at identifying musical pitches than speakers of non-tonal languages.
Lara Cowell

Neuroscientists Pinpoint Brain Cells Responsible For Recognizing Intonation : Shots - Health News : NPR - 1 views

  • Scientists are reporting in the journal Science that they have identified specialized brain cells that help us understand what a speaker really means. These cells do this by keeping track of changes in the pitch of the voice. "We found that there were groups of neurons that were specialized and dedicated just for the processing of pitch," says Dr. Eddie Chang, a professor of neurological surgery at the University of California, San Francisco. Chang says these neurons allow the brain to detect "the melody of speech," or intonation, while other specialized brain cells identify vowels and consonants. "Intonation is about how we say things," Chang says. "It's important because we can change the meaning even without actually changing the words themselves." The identification of specialized cells that track intonation shows just how much importance the human brain assigns to hearing, says Nina Kraus, a neurobiologist who runs the Auditory Neuroscience Laboratory at Northwestern University. "Processing sound is one of the most complex jobs that we ask our brain to do," Kraus says. And it's a skill that some brains learn better than others, she says. According to a study conducted by Kraus, musicians are better than non-musicians at recognizing the subtle tonal changes found in Mandarin Chinese. On the other hand, recognizing intonation is a skill that's often impaired in people with autism, Kraus says. "A typically developing child will process those pitch contours very precisely," Kraus says. "But some kids on the autism spectrum don't. They understand the words you are saying, but they are not understanding how you mean it." The new study suggests that may be because the brain cells that usually keep track of pitch aren't working the way they should.
Lara Cowell

The Linguistic Mystery of Tonal Languages - The Atlantic - 1 views

  • In many languages, pitch is as important as consonants and vowels for distinguishing one word from another. In English, "pay" and "bay" are different because they have different starting sounds. But imagine if "pay" said on a high pitch meant "to give money," while "pay" said on a low pitch meant "a broad inlet of the sea where the land curves inward." That's what it feels like to speak what linguists call a tonal language. At least a billion and a half people worldwide do it their entire lives and think nothing of it. The article goes on to discuss which areas of the world have the highest concentration of tonal languages, reasons why that might be, and some of the advantages of speaking a tonal language.
Lara Cowell

Device taps brain waves to help paralyzed man communicate - 1 views

  • Today, people who can't speak or write because of paralysis have very limited ways of communicating, e.g. using a pointer to touch words or letters on a screen or having computers track their eye movements. In a medical first, researchers harnessed the brain waves of a paralyzed man unable to speak - and turned what he intended to say into sentences on a computer screen. Dr. Edward Chang, a neurosurgeon at the University of California, San Francisco, led the work in developing a "speech neuroprosthetic" -- decoding brain waves that normally control the vocal tract, the tiny muscle movements of the lips, jaw, tongue and larynx that form each consonant and vowel.
1 - 12 of 12