
TOK@ISPrague / Group items tagged: Language


markfrankel18

What's a Metaphor For? - The Chronicle Review - The Chronicle of Higher Education - 1 views

  • "Metaphorical thinking—our instinct not just for describing but for comprehending one thing in terms of another—shapes our view of the world, and is essential to how we communicate, learn, discover and invent. ... Our understanding of metaphor is in the midst of a metamorphosis. For centuries, metaphor has been seen as a kind of cognitive frill, a pleasant but essentially useless embellishment to 'normal' thought. Now, the frill is gone. New research in the social and cognitive sciences makes it increasingly plain that metaphorical thinking influences our attitudes, beliefs, and actions in surprising, hidden, and often oddball ways." Geary further unpacks metaphor's influence in his foreword: "Metaphor conditions our interpretations of the stock market and, through advertising, it surreptitiously infiltrates our purchasing decisions. In the mouths of politicians, metaphor subtly nudges public opinion; in the minds of businesspeople, it spurs creativity and innovation. In science, metaphor is the preferred nomenclature for new theories and new discoveries; in psychology, it is the natural language of human relationships and emotions."
  • The upshot of the boom in metaphor studies, Geary makes clear, is the overturning of that presumption toward literalism: Nowadays, it's believers in a literalism that goes all the way down (so to speak) who are on the defensive in intellectual life, and explorers of metaphor who are on the ascendant. As a result, Geary hardly feels a need to address literalism, devoting most of his book to how metaphor connects to etymology, money, mind, politics, pleasure, science, children, the brain, the body, and such literary forms as the proverb and aphorism.
Lawrence Hrubes

How Children Learn To Read - The New Yorker - 0 views

  • Why is it easy for some people to learn to read, and difficult for others? It’s a tough question with a long history. We know that it’s not just about raw intelligence, nor is it wholly about repetition and dogged persistence. We also know that there are some conditions that, effort aside, can hold a child back. Socioeconomic status, for instance, has been reliably linked to reading achievement. And, regardless of background, children with lower general verbal ability and those who have difficulty with phonetic processing seem to struggle. But what underlies those differences? How do we learn to translate abstract symbols into meaningful sounds in the first place, and why are some children better at it than others?
  • When Hoeft took into account all of the explanatory factors that had been linked to reading difficulty in the past—genetic risk, environmental factors, pre-literate language ability, and over-all cognitive capacity—she found that only one thing consistently predicted how well a child would learn to read. That was the growth of white matter in one specific area of the brain, the left temporoparietal region. The amount of white matter that a child arrived with in kindergarten didn’t make a difference. But the change in volume between kindergarten and third grade did.
  • She likens it to the Dr. Seuss story of Horton and the egg. Horton sits on an egg that isn’t his own, and, because of his dedication, the creature that eventually hatches looks half like his mother, and half like the elephant. In this particular case, Hoeft and her colleagues can’t yet separate cause and effect: Were certain children predisposed to develop strong white-matter pathways that then helped them to learn to read, or was superior instruction and a rich environment prompting the building of those pathways?
markfrankel18

Creativity Creep - The New Yorker - 3 views

  • How did we come to care so much about creativity? The language surrounding it, of unleashing, unlocking, awakening, developing, flowing, and so on, makes it sound like an organic and primordial part of ourselves which we must set free—something with which it’s natural to be preoccupied. But it wasn’t always so; people didn’t always care so much about, or even think in terms of, creativity.
  • It was Romanticism, in the late eighteenth and early nineteenth centuries, which took the imagination and elevated it, giving us the “creative imagination.”
  • How did creativity transform from a way of being to a way of doing? The answer, essentially, is that it became a scientific subject, rather than a philosophical one.
  • All of this measuring and sorting has changed the way we think about creativity. For the Romantics, creativity’s center of gravity was in the mind. But for us, it’s in whatever the mind decides to share—that is, in the product. It’s not enough for a person to be “imaginative” or “creative” in her own consciousness. We want to know that the product she produces is, in some sense, “actually” creative; that the creative process has come to a workable conclusion. To today’s creativity researchers, the “self-styled creative person,” with his inner, unverifiable, possibly unproductive creativity, is a kind of bogeyman; a great deal of time is spent trampling on the scarf of the lone, Romantic genius. Instead, attention is paid to the systems of influence, partnership, power, funding, and reception that surround creativity—the social structures, in other words, that enable managers to reap the fruits of creative labor. Often, this is imagined to be some sort of victory over Romanticism and its fusty, pretentious, élitist ideas about creativity.
  • But this kind of thinking misses the point of the Romantic creative imagination. The Romantics weren’t obsessed with who created what, because they thought you could be creative without “creating” anything other than the liveliness in your own head.
  • It sounds bizarre, in some ways, to talk about creativity apart from the creation of a product. But that remoteness and strangeness is actually a measure of how much our sense of creativity has taken on the cast of our market-driven age
  • Thus the rush, in my pile of creativity books, to reconceive every kind of life style as essentially creative—to argue that you can “unleash your creativity” as an investor, a writer, a chemist, a teacher, an athlete, or a coach.
  • Among the many things we lost when we abandoned the Romantic idea of creativity, the most valuable may have been the idea of creativity’s stillness. If you’re really creative, really imaginative, you don’t have to make things. You just have to live, observe, think, and feel.
markfrankel18

What's your mother's maiden name? It's none of your business | Media | The Guardian - 2 views

  • It’s sexist to ask a woman (but not a man) her maiden name, or to ask anyone for their mother’s maiden name. It’s none of their business, just as it’s none of their business to know whether a woman is married (“Mrs”) or not (“Miss”) unless she chooses to tell them.
Lawrence Hrubes

Beyond 'he' and 'she': The rise of non-binary pronouns - BBC News - 0 views

  • In the English language, the word "he" is used to refer to males and "she" to refer to females. But some people identify as neither gender, or both - which is why an increasing number of US universities are making it easier for people to choose to be referred to by other pronouns. Kit Wilson's introduction when meeting other people is: "Hi, I'm Kit. I use they/them pronouns." That means that when people refer to Kit in conversation, the first-year student at the University of Wisconsin-Milwaukee would prefer them to use "they" rather than "she" or "he".
markfrankel18

The Washington Post Style Guide Now Accepts Singular 'They' | Mental Floss - 0 views

  • Proponents of singular they have long argued that the prohibition makes no sense. Not only is it natural, it has been used in English for centuries. It’s in the King James Bible. Authors like Chaucer, Shakespeare, Swift, Austen, Thackeray, and Shaw used it. Before the production of school textbooks for grammar in the 19th century, no one complained about it or even noticed it. Avoiding it is awkward or necessitates sexist language. Now, in the most recent update to The Washington Post style guide, singular they has been given official approval.
  • it is “the only sensible solution to English’s lack of a gender-neutral third-person singular personal pronoun.”
Lawrence Hrubes

University tells students Britain 'invaded' Australia - BBC News - 0 views

  • A top Australian university has rejected claims it is trying to rewrite the nation's colonial history. Students are being encouraged to use the term "invaded" rather than "settled" or "discovered", and avoid the word "Aborigines". The University of New South Wales (UNSW) Indigenous Terminology guide states that Australia was "invaded, occupied and colonised". But UNSW says it does not mandate what language can and cannot be used.
Lawrence Hrubes

Why this man wants to take the words 'Allahu akbar' back from terrorists - Home | As It... - 1 views

  • Extremists on all sides not only hijack religion and identity and narratives, they also hijack language to rationalize their violent ideology and their violent actions. I want to take it back and say, "No. Allahu akbar means God is great. I use it in prayer."
Lawrence Hrubes

The Great A.I. Awakening - The New York Times - 1 views

  • Translation, however, is an example of a field where this approach fails horribly, because words cannot be reduced to their dictionary definitions, and because languages tend to have as many exceptions as they have rules. More often than not, a system like this is liable to translate “minister of agriculture” as “priest of farming.” Still, for math and chess it worked great, and the proponents of symbolic A.I. took it for granted that no activities signaled “general intelligence” better than math and chess.
  • A rarefied department within the company, Google Brain, was founded five years ago on this very principle: that artificial “neural networks” that acquaint themselves with the world via trial and error, as toddlers do, might in turn develop something like human flexibility. This notion is not new — a version of it dates to the earliest stages of modern computing, in the 1940s — but for much of its history most computer scientists saw it as vaguely disreputable, even mystical. Since 2011, though, Google Brain has demonstrated that this approach to artificial intelligence could solve many problems that confounded decades of conventional efforts. Speech recognition didn’t work very well until Brain undertook an effort to revamp it; the application of machine learning made its performance on Google’s mobile platform, Android, almost as good as human transcription. The same was true of image recognition. Less than a year ago, Brain for the first time commenced with the gut renovation of an entire consumer product, and its momentous results were being celebrated tonight.
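The "priest of farming" mistranslation in the excerpt above comes from treating each word as if it had a single fixed dictionary sense. As a rough illustration only (this is not code from the article, and the toy lexicon below is invented), a word-by-word lookup translator reproduces exactly that failure:

```python
# Toy sketch of the dictionary-lookup ("symbolic") approach the article
# criticises: every word is swapped for one fixed gloss, so context is lost.
# The lexicon entries are hypothetical, chosen only to show the failure mode.
LEXICON = {
    "minister": "priest",      # only the religious sense is recorded
    "of": "of",
    "agriculture": "farming",
}

def word_by_word(phrase: str) -> str:
    """Replace each word with its single dictionary entry, ignoring context."""
    return " ".join(LEXICON.get(word, word) for word in phrase.lower().split())

print(word_by_word("Minister of Agriculture"))
# -> "priest of farming": words cannot be reduced to one dictionary definition.
```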
Lawrence Hrubes

Think You Always Say Thank You? Oh, Please - The New York Times - 1 views

  • But as it turns out, human beings say thank you far less often than we might think. A new study of everyday language use around the world has found that, in informal settings, people almost always complied with requests for an object, service or help. For their efforts, they received expressions of gratitude only rarely — in about one of 20 occasions.
Lawrence Hrubes

Actually, Gender-Neutral Pronouns Can Change a Culture | WIRED - 1 views

  • So this was the real test. Would native-speaker Swedes, seven years after getting a new pronoun plugged into their language, be more likely to assume this androgynous cartoon was a man? A woman? Either, or neither? Now that they had a word for it, a nonbinary option, would they think to use it?
markfrankel18

Is Economics More Like History Than Physics? | Guest Blog, Scientific American Blog Net... - 3 views

  • "Is economics like physics, or more like history? Steven Pinker says, "No sane thinker would try to explain World War I in the language of physics." Yet some economists aim close to such craziness. Pinker says the "mindset of science" eliminates errors by "open debate, peer review, and double-blind methods," and especially, experimentation. But experiments require repetition and control over all relevant variables. We can experiment on individual behavior, but not with history or macroeconomics."
markfrankel18

The Touch-Screen Generation - Hanna Rosin - The Atlantic - 0 views

    • markfrankel18: "This is important!"
  • What, really, would Maria Montessori have made of this scene? The 30 or so children here were not down at the shore poking their fingers in the sand or running them along mossy stones or digging for hermit crabs. Instead they were all inside, alone or in groups of two or three, their faces a few inches from a screen, their hands doing things Montessori surely did not imagine. A couple of 3-year-old girls were leaning against a pair of French doors, reading an interactive story called Ten Giggly Gorillas and fighting over which ape to tickle next. A boy in a nearby corner had turned his fingertip into a red marker to draw an ugly picture of his older brother. On an old oak table at the front of the room, a giant stuffed Angry Bird beckoned the children to come and test out tablets loaded with dozens of new apps. Some of the chairs had pillows strapped to them, since an 18-month-old might not otherwise be able to reach the table, though she’d know how to swipe once she did.
markfrankel18

One of Us - Lapham's Quarterly - 0 views

  • These are stimulating times for anyone interested in questions of animal consciousness. On what seems like a monthly basis, scientific teams announce the results of new experiments, adding to a preponderance of evidence that we’ve been underestimating animal minds, even those of us who have rated them fairly highly. New animal behaviors and capacities are observed in the wild, often involving tool use—or at least object manipulation—the very kinds of activity that led the distinguished zoologist Donald R. Griffin to found the field of cognitive ethology (animal thinking) in 1978: octopuses piling stones in front of their hideyholes, to name one recent example; or dolphins fitting marine sponges to their beaks in order to dig for food on the seabed; or wasps using small stones to smooth the sand around their egg chambers, concealing them from predators. At the same time neurobiologists have been finding that the physical structures in our own brains most commonly held responsible for consciousness are not as rare in the animal kingdom as had been assumed. Indeed they are common. All of this work and discovery appeared to reach a kind of crescendo last summer, when an international group of prominent neuroscientists meeting at the University of Cambridge issued “The Cambridge Declaration on Consciousness in Non-Human Animals,” a document stating that “humans are not unique in possessing the neurological substrates that generate consciousness.” It goes further to conclude that numerous documented animal behaviors must be considered “consistent with experienced feeling states.”