Words R Us: Group items matching "linguist" in title, tags, annotations or URL

Sarah Thomason will speak on world's vanishing languages - 1 views

  •  
    Sarah Thomason, a linguistics professor, estimates that by 2100 only 700 of the world's 7,000 languages will remain. This isn't a full article, mainly an advertisement for her lecture on the subject, but a full recording will most likely be available online after the event!

Want to Learn Cherokee? How About Ainu? This Startup Is Teaching Endangered Languages ... - 0 views

  •  
    Some linguists estimate that roughly half of the world's 7,000 or so languages are on the verge of extinction. The UK-based startup Tribalingual is trying to prevent those types of sociolinguistic losses, offering classes that connect students with some of the few remaining speakers of endangered languages.

What is Priming? A Psychological Look at Priming & Consumer Behavior - 1 views

  •  
    Priming is a linguistic and psychological concept in which a "prime" (a word, image, etc.) is presented before a "target"; the prime can influence how a viewer perceives or responds to the target. Psychological studies use priming in tasks such as sentence completion or lexical decision to probe other phenomena. Priming is also a strategy used in marketing: advertisers use priming to get you to see appeal in their product, for example a commercial in which statistics comparing the product against other companies' products are shown to make their own look better. It can also be as simple as playing moody music in a restaurant before you sit down! (A toy sketch of the lexical decision task follows below.)
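
The lexical decision task mentioned above is easy to picture in code. The sketch below is a simulation with invented stimuli, reaction times, and effect sizes, not data from any real study: a participant sees a prime, then judges whether a target string is a real word, and related primes are expected to speed that judgment.

```python
# Toy simulation of semantic priming in a lexical decision task.
# All stimuli, baseline times, and effect sizes are invented for illustration.
import random
import statistics

def simulated_rt(prime, target, related_pairs, base_ms=600.0, boost_ms=40.0):
    """Return a fake reaction time in ms; related primes speed the response."""
    rt = random.gauss(base_ms, 50.0)   # noisy baseline response time
    if (prime, target) in related_pairs:
        rt -= boost_ms                 # facilitation from a related prime
    return rt

related_pairs = {("doctor", "nurse"), ("bread", "butter")}
unrelated_pairs = {("doctor", "butter"), ("bread", "nurse")}

random.seed(0)
related_rts = [simulated_rt(p, t, related_pairs)
               for p, t in related_pairs for _ in range(50)]
unrelated_rts = [simulated_rt(p, t, related_pairs)
                 for p, t in unrelated_pairs for _ in range(50)]

print("mean RT, related primes:  ", round(statistics.mean(related_rts), 1), "ms")
print("mean RT, unrelated primes:", round(statistics.mean(unrelated_rts), 1), "ms")
```

The roughly 40 ms gap between the two averages is the priming effect an experimenter would look for; in the marketing examples above, the "measurement" is purchase behavior rather than reaction time.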

The Influence of Working Memory Load on Semantic Priming - 1 views

  •  
    This research article was published in the Journal of Experimental Psychology, and the experiment did involve linguistics. Its purpose was to see whether a more heavily loaded working memory changes how quickly people can decide whether a string of letters is a real word, in other words, the effect of working memory load on semantic priming. The researchers found that a high working memory load impaired both the priming effect and task efficiency.

Linguistics: The pronunciation paradox - 0 views

  •  
    This article describes a study showing that we tend to overestimate our pronunciation skills when learning a foreign language. It explains how familiarity with our own accent shapes how we perceive our own pronunciation. Finally, the article theorizes about how that self-perception can lead to "fossilization".

Why It's So Hard To Learn Another Language After Childhood - 0 views

  •  
    This article discusses the difficulty of learning new languages past a certain age. It explains that there are competing views on when it becomes difficult for us to become fluent in a second language: some scientists say our learning ability drops around age 10, while others put it at 17-18. This remains an open question in linguistics.

Picking up a second language is predicted by ability to learn patterns - 2 views

  •  
    Some people seem to pick up a second language with relative ease, while others have a much more difficult time. Now, a new study suggests that learning to understand and read a second language may be driven, at least in part, by our ability to pick up on statistical regularities. Some research suggests that learning a second language draws on capacities that are language-specific, while other research suggests that it reflects a more general capacity for learning patterns. According to psychological scientist and lead researcher Ram Frost of Hebrew University, the data from the new study clearly point to the latter: "These new results suggest that learning a second language is determined to a large extent by an individual ability that is not at all linguistic," says Frost. In the study, Frost and colleagues used three different tasks to measure how well American students in an overseas program picked up on the structure of words and sounds in Hebrew. The students were tested once in the first semester and again in the second semester. The students also completed a task that measured their ability to pick up on statistical patterns in visual stimuli. The participants watched a stream of complex shapes that were presented one at a time. Unbeknownst to the participants, the 24 shapes were organized into 8 triplets -- the order of the triplets was randomized, though the shapes within each triplet always appeared in the same sequence. After viewing the stream of shapes, the students were tested to see whether they implicitly picked up the statistical regularities of the shape sequences.
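
The shape-triplet design described above is essentially a statistical-learning stream, and its key property is easy to reproduce. The sketch below uses placeholder shape labels of my own (the study's actual materials are not given here): 24 "shapes" grouped into 8 fixed triplets and presented in a randomized triplet order, so that transitional probability is perfect inside a triplet and much lower across triplet boundaries.

```python
# Rough reconstruction of a triplet-based statistical-learning stream.
# Shape labels and stream length are placeholders, not the study's materials.
import random
from collections import Counter

random.seed(1)
shapes = [f"shape{i:02d}" for i in range(24)]
triplets = [tuple(shapes[i:i + 3]) for i in range(0, 24, 3)]   # 8 fixed triplets

# Familiarization stream: many passes, triplet order shuffled on each pass,
# but the order of shapes inside each triplet never changes.
stream = []
for _ in range(100):
    for trip in random.sample(triplets, len(triplets)):
        stream.extend(trip)

# Estimate transitional probabilities P(next shape | current shape).
pair_counts = Counter(zip(stream, stream[1:]))
first_counts = Counter(stream[:-1])
tp = {(a, b): n / first_counts[a] for (a, b), n in pair_counts.items()}

within_pairs = {(t[i], t[i + 1]) for t in triplets for i in range(2)}
within = [p for pair, p in tp.items() if pair in within_pairs]
across = [p for pair, p in tp.items() if pair not in within_pairs]

print("mean TP within triplets:  ", round(sum(within) / len(within), 2))
print("mean TP across boundaries:", round(sum(across) / len(across), 2))
```

Within-triplet transitions come out at 1.0 while boundary transitions hover around 0.14, and it is this contrast, not any property of the individual shapes, that participants who "pick up on statistical regularities" are implicitly tracking.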

Study of police language aims to find patterns that may lead to tragic outcomes - Scien... - 0 views

  •  
    With police brutality recently becoming a prominent political topic, linguists are trying to find links between the language police use during these incidents and the incidents' outcomes. In the study, researchers analyzed police scanner transcripts and examined the ramifications of how police communicate. The goal of this ongoing study is to infer what an officer is thinking and assuming at the time of an incident.

Infant siblings of autistic children miss language-learning clues | Spectrum | Autism R... - 0 views

  •  
    An unpublished study tracked the gaze of two groups of infants, one with autistic older siblings and one without, while they watched a video of an adult speaking surrounded by toys. The two groups spent similar amounts of time watching the screen and the actor's mouth and face as a whole, but the results suggest that the younger siblings of autistic children may not internalize the linguistic clues that help babies learn language.

Northern Cities Vowel Shift: How Americans in the Great Lakes region are revolutionizin... - 0 views

  •  
    This article discusses different dialects in America and how they are continuing to diverge, primarily in their vowel sounds. In particular, cities around the Great Lakes have been observed to be revolutionizing the sound of English. Linguists have observed what's called a "chain shift," in which changing one sound, such as the short "a," triggers changes in multiple other sounds, altering the Northern Cities dialects. The article goes on to outline the history behind these changes, how unaware speakers from these cities are of the shift, the racial aspect of how this dialect is diverging, and other points.

Teenagers' role in language change is overstated, linguistics research finds - 1 views

  •  
    This article explains why teenagers are, in fact, not affecting the evolution of language as drastically as we initially thought.

9 Tips to Design Conversational Style for Your Bot - 0 views

  •  
    Interesting article on designing an efficient and responsive bot: leveraging key features of human language so that the machine can "converse" with humans, comprehend their needs, and respond appropriately. Programming a machine to exhibit the conversational nuances and sophisticated comprehension of a normal human is hard. (A toy intent-matching sketch follows below.)
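
To make the "comprehend their needs and respond appropriately" loop concrete, here is a toy sketch of my own (not code or an API from the article): match the user's utterance to an intent with simple keyword patterns and fall back gracefully when nothing matches, instead of pretending to understand.

```python
# Minimal intent-matching bot: keyword patterns map an utterance to an
# intent; anything unrecognized gets an explicit fallback response.
import re

INTENTS = {
    "greeting": re.compile(r"\b(hi|hello|hey)\b", re.IGNORECASE),
    "hours":    re.compile(r"\b(open|hours|close)\b", re.IGNORECASE),
    "goodbye":  re.compile(r"\b(bye|goodbye|thanks)\b", re.IGNORECASE),
}

RESPONSES = {
    "greeting": "Hi! I can answer questions about store hours.",
    "hours":    "We're open 9am-5pm, Monday through Saturday.",
    "goodbye":  "Happy to help. Goodbye!",
    "fallback": "Sorry, I didn't catch that. Try asking about our hours.",
}

def reply(user_text: str) -> str:
    """Return the response for the first matching intent, else a fallback."""
    for intent, pattern in INTENTS.items():
        if pattern.search(user_text):
            return RESPONSES[intent]
    return RESPONSES["fallback"]

if __name__ == "__main__":
    for utterance in ["hey there", "when do you close?", "what's the meaning of life?"]:
        print(f"> {utterance}\n{reply(utterance)}")
```

Production bots swap the regular expressions for trained intent classifiers and add dialogue state, but the overall shape of the loop, and the need for a well-designed fallback, stays the same.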

How sign language users learn intonation - 2 views

  •  
    A spoken language is more than just words and sounds. Speakers use changes in pitch and rhythm, known as prosody, to provide emphasis, show emotion, and otherwise add meaning to what they say. But a language does not need to be spoken to have prosody: sign languages, such as American Sign Language (ASL), use movements, pauses and facial expressions to achieve the same goals. In a study appearing in the September 2015 issue of Language, three linguists look at intonation (a key part of prosody) in ASL and find that native ASL signers learn intonation in much the same way that users of spoken languages do. Children learning ASL acquired prosodic features in three stages of "appearance, reorganization, and mastery": accurately replicating their use in simpler contexts, attempting unsuccessfully at first to use them in more challenging contexts, then using them accurately in all contexts as they fully learn the rules of prosody. Previous research has shown that native learners of spoken languages acquire intonation following a similar pattern.

Creating Bilingual Minds - 1 views

  •  
    In this TED Talk, Dr. Naja Ferjan Ramirez, a linguistics professor at the University of Washington and a specialist in the brain processes of children aged 0-3, lays out the benefits of bilingualism, tells how to optimize language learning to achieve better acquisition, and dispels some common concerns about the downsides of raising a bilingual child. No surprises here: start early, and create conditions in which babies are exposed to the desired target languages, which enables them to process the sounds of both languages, not just one. Ideally, babies will have frequent, social interactions with fully competent, fluent speakers of the target languages. Ramirez also mentions a major cognitive benefit of bilingualism: a strengthened prefrontal cortex, the area of the brain that handles task-switching and flexible thinking.

Post-Neolithic Diet-Induced Dental Changes Led to Introduction of 'F' and 'V' Sounds - 3 views

  •  
    One of the central questions of Words R Us is what conditions fostered the emergence of language. In this article, you can discover where the 'F' and 'V' sounds, so challenging to replicate in ventriloquism, came from. A hint is that diet influenced the human bite and mouth shape, but take a peek to find out more!

What sign language teaches us about the brain - 3 views

  •  
    Neuroimaging studies suggest that sign languages are complex linguistic systems processed much like spoken languages, even though they are gestural and visual rather than oral. Wernicke's area activates when perceiving sign language and Broca's area when producing it. In deaf signers, lesions in left-hemisphere "speech centres" such as Broca's and Wernicke's areas led to significantly more sign errors on naming, repetition, and sentence-comprehension tasks than right-hemisphere damage did.

Does Your Language Shape How You Think? - 4 views

  •  
    Some 50 years ago, the renowned linguist Roman Jakobson pointed out a crucial fact about differences between languages in a pithy maxim: "Languages differ essentially in what they must convey and not in what they may convey." This maxim offers us the key to unlocking the real force of the mother tongue: if different languages influence our minds in different ways, this is not because of what our language allows us to think but rather because of what it habitually obliges us to think about. When your language routinely obliges you to specify certain types of information, it forces you to be attentive to certain details in the world and to certain aspects of experience that speakers of other languages may not be required to think about all the time. And since such habits of speech are cultivated from the earliest age, it is only natural that they can settle into habits of mind that go beyond language itself, affecting your experiences, perceptions, associations, feelings, memories and orientation in the world.

Profanity's Roots in Brain Chemistry? Damn Right - 5 views

  •  
    Over the years, we have found that our words come from different parts of the brain. In addition to the part of the brain we use to formulate thoughts into sentences, we also use the part of the brain that deals with emotion when we swear. Researchers discovered that patients with neurological damage, such as from a stroke, were still able to swear. Studies of patients with Tourette syndrome have also shown that swearing recruits many areas of the brain. Since swearing involves the emotional part of the brain, we know that profanity is used to express intense emotions.
  •  
    Regular speech is generated in the left hemisphere, in an area of the brain close to the surface. The cerebral cortex, or "gray matter," is often associated with higher thought processes such as thought and action. "It's sophisticated," says Bergen, "and comports with the idea of what it means to be human." Swearing, on the other hand, is generated much deeper in the brain, in regions that are older and more primitive in evolutionary terms, says Bergen. These regions are often found in the right hemisphere in the brain's emotional center, the limbic system. "These are words that express intense emotions: surprise, frustration, anger, happiness, fear," says psychologist and linguist Timothy Jay, who began studying profanity more than 40 years ago. "[Swearing] serves my need to vent, and it conveys my emotions to other people very effectively and symbolically," he says. "Where other animals like to bite and scratch each other, I can say 'f*ck you' and you get my contempt; I don't have to do it physically." Profanity serves other purposes, too. Lovers use it as part of enticing sex talk; athletes and soldiers use it to forge camaraderie; and people in positions of power use it to reaffirm their superiority. Profanity is even used as a celebratory expression, says Adams, citing "F*ck yeah!" as an example. The meaning of a profanity, like any other word, changes with time, culture and context. While swear words have been around since Greek and Roman times, and maybe even earlier, the types of things people consider offensive have changed. "People of the Middle Ages had no problems talking about sex or excrement, that was not their hang-up," Adams explains. "Their hang-up was talking about God disrespectfully...so that was what a profanity was."
  •  
    Swearing draws on the parts of the brain responsible for emotions like happiness, sadness, and anger, in addition to the part we use to formulate thoughts into sentences. Studies of people with brain injuries and diseases have allowed researchers to understand that profanity is used to express extreme emotions.

Why some words hurt some people and not others - 0 views

  •  
    The author, a specialist and researcher in linguistics and discourse analysis, is interested in communication between individuals from different cultures. The misunderstandings it provokes are often based on unconscious reflexes and reference points, which makes them all the more damaging. Communication between humans would be very difficult, if not impossible, without discursive memory: our memories allow us to understand each other. Gregory Charles said in a tweet after the attack on the Grand Mosque in 2017, "Every nasty word we utter joins sentences, then paragraphs, pages and manifestos and ends up killing the world." Specialists in discourse analysis capture this idea with the concept of interdiscourse. Not being aware of this discursive mechanism can cause many misunderstandings; understanding it certainly helps us communicate better. Putting yourself in your audience's place is the key to good communication.