Words R Us: Group items tagged Visual

Speech in the Home - Forbes.com - 3 views

  •  Check out the visualization: http://www.forbes.com/2010/12/21/speechome-interactive-visualization-language-acquisition.html "This interactive visualization provides a look into the most complete record of a single child's speech development ever created... But parentese is not universal. It varies between different parents and cultures, and in some cases has been reported to be absent altogether. What effect, then, does it have on child development? Answering this question could help guide better ways to help children that have difficulty learning language." With links to the actual studies at the bottom of the page.

Think You're An Auditory Or Visual Learner? Maybe Not - 1 views

  •  This article discusses learning styles (auditory, visual, kinesthetic) and how there has not yet been conclusive proof that they actually exist. It cites the journal Psychological Science, as well as psychologist Dan Willingham. However, while it states that there is no scientific evidence proving that learning styles exist, it does not actively disprove their existence.

An interactive visual database for American Sign Language reveals how signs are organiz... - 0 views

  •  This article describes the ASL-LEX database created by four scientists. The map of signs is meant to represent a mental lexicon and allows them to examine how signs are organized in the human mind. The article explains that signs may "rhyme" visually even when the corresponding spoken words do not, which leads the brain to relate groups of signs together. One pattern they noticed is that more commonly used signs tend to be simpler and shorter than rare ones, which is comparable to spoken language. They also found that common signs are more likely to sit in clusters of visually similar signs, while rare signs are more isolated.

Parts of brain can switch functions | MIT News | Massachusetts Institute of Technology - 0 views

  •  When your brain encounters sensory stimuli, such as the scent of your morning coffee or the sound of a honking car, that input gets shuttled to the appropriate brain region for analysis. The coffee aroma goes to the olfactory cortex, while sounds are processed in the auditory cortex. That division of labor suggests that the brain's structure follows a predetermined, genetic blueprint. However, evidence is mounting that brain regions can take over functions they were not genetically destined to perform. In a landmark 1996 study of people blinded early in life, neuroscientists showed that the visual cortex could participate in a nonvisual function - reading Braille. Now, a study from MIT neuroscientists shows that in individuals born blind, parts of the visual cortex are recruited for language processing. The finding suggests that the visual cortex can dramatically change its function - from visual processing to language - and it also appears to overturn the idea that language processing can only occur in highly specialized brain regions that are genetically programmed for language tasks. "Your brain is not a prepackaged kind of thing. It doesn't develop along a fixed trajectory, rather, it's a self-building toolkit. The building process is profoundly influenced by the experiences you have during your development," says Marina Bedny, an MIT postdoctoral associate in the Department of Brain and Cognitive Sciences and lead author of the study, which appears in the Proceedings of the National Academy of Sciences the week of Feb. 28.

Baby's first words based on what they see most often: Research - 0 views

  •  A baby's first words usually relate to their visual experiences. Familiar objects (e.g., shirts, the table, a spoon, a bottle) can predict which words they'll learn first. This could suggest new ways to help treat autism and language deficits: there could be a correlation between visual processing problems and difficulty learning words. For example, children with autism often have visual processing problems, which could help explain why they have trouble communicating.

How "twist my arm" engages the brain - 0 views

  •  (This article was by my college friend, Quinn Eastman, who's a trained scientist and science writer for Emory University.) Listening to metaphors involving arms or legs loops in a region of the brain responsible for visual perception of those body parts, scientists have discovered. The finding, recently published in Brain & Language, is another example of how neuroscience studies are providing evidence for "grounded cognition" - the idea that comprehension of abstract concepts in the brain is built upon concrete experiences, a proposal whose history extends back millennia to Aristotle. When study participants heard sentences that included phrases such as "shoulder responsibility," "foot the bill" or "twist my arm", they tended to engage a region of the brain called the left extrastriate body area or EBA. The same level of activation was not seen when participants heard literal sentences containing phrases with a similar meaning, such as "take responsibility" or "pay the bill." The study included 12 right-handed, English-speaking people, and blood flow in their brains was monitored by functional MRI (magnetic resonance imaging). "The EBA is part of the extrastriate visual cortex, and it was known to be involved in identifying body parts," says senior author Krish Sathian, MD, PhD, professor of neurology, rehabilitation medicine, and psychology at Emory University. "We found that the metaphor selectivity of the EBA matches its visual selectivity." The EBA was not activated when study participants heard literal, non-metaphorical sentences describing body parts. "This suggests that deep semantic processing is needed to recruit the EBA, over and above routine use of the words for body parts," Sathian says. Sathian's research team had previously observed that metaphors involving the sense of touch, such as "a rough day", activate a region of the brain important for sensing texture. In addition, other researchers have shown…

Bedtime Stories for Young Brains - 3 views

  •  This month, the journal Pediatrics published a study that used functional magnetic resonance imaging to study brain activity in 3- to 5-year-old children as they listened to age-appropriate stories. The researchers found differences in brain activation according to how much the children had been read to at home. Children whose parents reported more reading at home and more books in the home showed significantly greater activation of brain areas in a region of the left hemisphere called the parietal-temporal-occipital association cortex. This brain area is "a watershed region, all about multisensory integration, integrating sound and then visual stimulation," said the lead author, Dr. John S. Hutton, a clinical research fellow at Cincinnati Children's Hospital Medical Center. This region of the brain is known to be very active when older children read to themselves, but Dr. Hutton notes that it also lights up when younger children are hearing stories. What was especially novel was that children who were exposed to more books and home reading showed significantly more activity in the areas of the brain that process visual association, even though the children were in the scanner just listening to a story and could not see any pictures. "When kids are hearing stories, they're imagining in their mind's eye when they hear the story," said Dr. Hutton. "For example, 'The frog jumped over the log.' I've seen a frog before, I've seen a log before, what does that look like?" The different levels of brain activation, he said, suggest that children who have more practice in developing those visual images, as they look at picture books and listen to stories, may develop skills that will help them make images and stories out of words later on. "It helps them understand what things look like, and may help them transition to books without pictures," he said. "It will help them later be better readers because they've developed that part of the brain."

Visual/spatial learning - 0 views

shared by madiendo15 on 08 Dec 14
  •  Information regarding visual/spatial learners

When your eyes override your ears: New insights into the McGurk effect - 0 views

  •  This article discusses an illusion our brains fall for called the McGurk effect. This occurs when visual speech is mismatched with auditory speech, which can result in the perception of an entirely different message. In the article's example, combining the visual "ga" with the sound "ba" results in the perception of "da." Based on the principle of causal inference, researchers were able to create an algorithmic model of multisensory speech perception. What this means is that when your brain is given a particular pair of auditory and visual syllables, it calculates the likelihood that they come from a single talker rather than from multiple talkers, and uses this likelihood to determine the final speech percept.
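The causal-inference computation described above can be sketched in a few lines. Below is a minimal Python illustration, not the researchers' actual model: the Gaussian assumptions, the 1-D "syllable feature axis," and all parameter names and values are mine. It shows how a model can weigh "one talker" against "two talkers" given an auditory and a visual cue:

```python
import numpy as np

def prob_single_talker(x_a, x_v, sigma_a=1.0, sigma_v=1.0,
                       mu_p=0.0, sigma_p=5.0, p_common=0.5):
    """Posterior probability that auditory cue x_a and visual cue x_v
    (positions on a hypothetical 1-D syllable feature axis) come from a
    single talker, under a Gaussian causal-inference model."""
    var_a, var_v, var_p = sigma_a**2, sigma_v**2, sigma_p**2

    # Likelihood of both cues given ONE shared syllable (common cause),
    # with the hidden source integrated out (closed form for Gaussians).
    denom = var_a * var_v + var_a * var_p + var_v * var_p
    quad = ((x_a - x_v) ** 2 * var_p
            + (x_a - mu_p) ** 2 * var_v
            + (x_v - mu_p) ** 2 * var_a) / denom
    like_one = np.exp(-0.5 * quad) / (2 * np.pi * np.sqrt(denom))

    # Likelihood given TWO independent talkers: each cue has its own source.
    like_a = np.exp(-0.5 * (x_a - mu_p) ** 2 / (var_a + var_p)) \
        / np.sqrt(2 * np.pi * (var_a + var_p))
    like_v = np.exp(-0.5 * (x_v - mu_p) ** 2 / (var_v + var_p)) \
        / np.sqrt(2 * np.pi * (var_v + var_p))
    like_two = like_a * like_v

    # Bayes' rule over the two causal structures.
    return like_one * p_common / (like_one * p_common + like_two * (1 - p_common))

# Nearby cues (e.g., an auditory "ba" and a visual "ga" on this axis)
# lean toward a single talker, so the cues get fused; distant cues
# strongly favor two talkers, so they are kept separate.
print(prob_single_talker(x_a=-0.5, x_v=0.5))   # leans toward one talker
print(prob_single_talker(x_a=-3.0, x_v=3.0))   # strongly favors two talkers
```

On this account, a McGurk-style percept arises when the one-talker posterior is high enough that the brain fuses the conflicting cues into a single compromise syllable such as "da."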

Researchers Study What Makes Dyslexic Brains Different - 0 views

  •  Dyslexia is the most common learning disability in the U.S. Scientists are exploring how human brains learn to read, and are discovering new ways that brains with dyslexia can learn to cope. Two areas on the left side of the brain are key for reading: 1. the left temporoparietal cortex, traditionally used to process spoken language; when learning to read, we start using it to sound out words. 2. the occipitotemporal cortex, part of the visual processing center, located at the base of the brain, behind our ears. A person who never learned to read uses this part of the brain to recognize objects - like a toaster or a chair. But, as we become fluent readers, we train this brain area to recognize letters and words visually. These words are called sight words: any word that you can see and instantly know without thinking about the letters and sounds. This requires retraining the brain. When recognizing a chair, the brain naturally sees it from many different angles - left, right, up, down - and, regardless of the perspective, knows it is a chair. But that doesn't work for letters. Look at a lowercase 'b' from the back of the page, and it looks like a lowercase 'd.' They are the same basic shape and yet two totally different letters; as it does with a chair, the brain wants to recognize them as the same object. Everyone - not just people with dyslexia - has to teach the brain not to conflate 'b' and 'd.' The good news: intervention and training can help. At the end of six-week training sessions with dyslexic readers, the brain areas typically associated with reading, in the left hemisphere, became more active. Additionally, right-hemisphere areas started lighting up and helping out with the reading process. The lead scientist, Dr. Eden, says this is similar to what scientists see in stroke victims, where other parts of the brain start compensating.

Why Mental Pictures Can Sway Your Moral Judgment - 3 views

  •  Harvard psychologist Joshua Greene posits that we have two competing moral circuits in our brains: a utilitarian, rational, cost-benefit circuit and an emotional circuit. Both circuits battle for dominance in the brain's ventromedial prefrontal cortex. When dilemmas produce vivid images in our heads, we tend to respond emotionally, due to our natural wiring; take away the pictures, and the brain goes into rational calculation mode. In another experiment, Greene and a colleague, Amit, also found that people who think visually make more emotional moral judgments, whereas verbal people make more rational calculations.

Try The McGurk Effect! - Horizon: Is Seeing Believing? - BBC Two - 1 views

shared by Ryan Catalani on 20 Sep 11
  •  Nice BBC clip explaining and illustrating the McGurk effect. "The McGurk effect is a compelling demonstration of how we all use visual speech information. The effect shows that we can't help but integrate visual speech into what we 'hear'."

Left/Right Brain - 4 views

  •  Our brains develop two distinct halves, the left and right hemispheres, that think in very different ways. The left hemisphere processes information rationally, sequentially, and logically and relies on language, while the right hemisphere processes information randomly and holistically and relies more on visuals. Each of us has a dominant side of the brain that we prefer to use, and by catering to this dominant side (e.g., learning through listening for the left side, learning through visuals for the right), we are able to learn and understand more easily. We cannot survive using only one side of the brain; it is necessary to use and train both.

MIT Scientist Captures 90,000 Hours of Video of His Son's First Words, Graphs It | Fast... - 5 views

  •  "In one 40-second clip, you can hear how "gaga" turned into "water" over the course of six months. In a video clip, below, you can hear and watch the evolution of "ball." .... Unreal 3-D visualizations allowed his team to zoom through the house like a dollhouse and map the utterance of each word in its context. In a landscape-like image with peaks and valleys, you can see that the word "water" was uttered most often in the kitchen, while "bye" took place at the door."

Futurity.org - To read words, brain detects motion - 1 views

  •  "An area of the brain called the Visual Word Form Area, or VWFA, is activated whenever it sees something that looks like a word-and is so adept at packaging visual input for the brain's language centers that activation happens within a few tens of milliseconds. ... Instead of being "luminance-defined," words can be "motion-defined," distinguishable from their background not by color or contrast, but by their apparent direction of movement. Against a field of dots moving one way, words made up of dots moving in the other direction will "pop out" to most viewers, even if the word and background dots are the same shade."
  •  Example of the motion-defined words used in the study: http://news.stanford.edu/news/2011/september/videos/973.html
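To see how a "motion-defined" word works, note that nothing distinguishes letter dots from background dots in any single frame; only their direction of drift differs. A minimal sketch with NumPy and Pillow (the word, dot counts, sizes, and speeds are illustrative, not the study's actual stimuli):

```python
import numpy as np
from PIL import Image, ImageDraw, ImageFont

def motion_defined_word(word="WORD", size=(240, 80), n_dots=2000,
                        speed=2, n_frames=30, seed=0):
    """Frames of a motion-defined word: every dot has the same luminance;
    the word is carried only by dots inside the letters drifting the
    opposite way from the background dots."""
    w, h = size
    rng = np.random.default_rng(seed)

    # Rasterize the word into a boolean mask (True = inside a letter),
    # drawing small and upscaling so the default font fills the frame.
    txt = Image.new("L", (60, 20), 0)
    ImageDraw.Draw(txt).text((2, 4), word, fill=255,
                             font=ImageFont.load_default())
    mask = np.array(txt.resize(size, Image.NEAREST)) > 0   # shape (h, w)

    # Random dot field; there is no luminance difference anywhere.
    xy = rng.uniform(low=[0, 0], high=[w, h], size=(n_dots, 2))

    frames = []
    for _ in range(n_frames):
        # A dot's direction depends only on whether it sits inside a letter:
        # letter dots drift left, background dots drift right.
        inside = mask[xy[:, 1].astype(int) % h, xy[:, 0].astype(int) % w]
        xy[:, 0] = (xy[:, 0] + np.where(inside, -speed, speed)) % w
        frame = np.zeros((h, w), dtype=np.uint8)
        frame[xy[:, 1].astype(int) % h, xy[:, 0].astype(int) % w] = 255
        frames.append(frame)
    return frames
```

A single frozen frame is a uniform field of identical dots; the word only "pops out" once the frames are played in sequence, which is the point of the stimulus.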

Neuroscience for Kids - Language - 2 views

  •  Good visuals, short summary.

What's Going On In Your Child's Brain When You Read Them A Story? : NPR Ed : NPR - 0 views

  •  For the study, conducted by Dr. John Hutton, a researcher and pediatrician at Cincinnati Children's Hospital with an interest in emergent literacy, 27 children around age 4 went into an fMRI machine. They were presented with the same story in three conditions: audio only; the illustrated pages of a storybook with an audio voiceover; and an animated cartoon. While the children paid attention to the stories, the machine scanned for activation within certain brain networks, and for connectivity between the networks. Here's what the researchers found. In the audio-only condition (too cold): language networks were activated, but there was less connectivity overall. "There was more evidence the children were straining to understand." In the animation condition (too hot): there was a lot of activity in the audio and visual perception networks, but not a lot of connectivity among the various brain networks. "The language network was working to keep up with the story," says Hutton. "Our interpretation was that the animation was doing all the work for the child. They were expending the most energy just figuring out what it means." The children's comprehension of the story was worst in this condition. The illustration condition was what Hutton called "just right." When children could see illustrations, language-network activity dropped a bit compared to the audio condition. Instead of only paying attention to the words, Hutton says, the children's understanding of the story was "scaffolded" by having the images as clues. Most importantly, in the illustrated-book condition, researchers saw increased connectivity between and among all the networks they were looking at: visual perception, imagery, default mode, and language. One interesting note: because of the constraints of an MRI machine, which encloses and immobilizes the body, the story-with-illustrations condition wasn't actually as good as reading on Mom or Dad's lap. The emotional bond…

What Is the Hardest Language in the World to Lipread? - Atlas Obscura - 0 views

  •  Interesting article: not simply about lipreading per se, but generally about the importance of visual cues in discerning language and comprehending messages, and the connection between vision and speech perception.

Babies Able to tell Through Visual Cues When Speakers Switch Languages - 0 views

  •  A study of monolingual and bilingual babies under a year old showed that in the earlier months, both groups could discriminate between languages, but by eight months only the bilinguals retained this ability. The study was done by showing only visual clips (no audio) of people speaking different languages. This means that the babies could differentiate languages by looking at facial movements and the rhythm and shape of the speaker's mouth.

Top 23 World Languages in One Visualization, By Native Speakers - 0 views

  •  This post (a 2021 update of a 2018 post) contains several useful infographics and charts, including a visualization of the top 23 most-spoken languages in the world, the distribution of those languages by country, and a family tree of Indo-European languages.