TOK Friends / Group items tagged: vocabulary

Javier E

E.D. Hirsch Jr.'s 'Cultural Literacy' in the 21st Century - The Atlantic - 0 views

  • much of this angst can be interpreted as part of a noisy but inexorable endgame: the end of white supremacy. From this vantage point, Americanness and whiteness are fitfully, achingly, but finally becoming delinked—and like it or not, over the course of this generation, Americans are all going to have to learn a new way to be American.
  • What is the story of “us” when “us” is no longer by default “white”? The answer, of course, will depend on how aware Americans are of what they are, of what their culture already (and always) has been.
  • The thing about the list, though, was that it was—by design—heavy on the deeds and words of the “dead white males” who had formed the foundations of American culture but who had by then begun to fall out of academic fashion.
  • ...38 more annotations...
  • Conservatives thus embraced Hirsch eagerly and breathlessly. He was a stout defender of the patrimony. Liberals eagerly and breathlessly attacked him with equal vigor. He was retrograde, Eurocentric, racist, sexist.
  • Lost in all the crossfire, however, were two facts: First, Hirsch, a lifelong Democrat who considered himself progressive, believed his enterprise to be in service of social justice and equality. Cultural illiteracy, he argued, is most common among the poor and power-illiterate, and compounds both their poverty and powerlessness. Second: He was right.
  • A generation of hindsight now enables Americans to see that it is indeed necessary for a nation as far-flung and entropic as the United States, one where rising economic inequality begets worsening civic inequality, to cultivate continuously a shared cultural core. A vocabulary. A set of shared referents and symbols.
  • So, first of all, Americans do need a list. But second, it should not be Hirsch’s list. And third, it should not be made the way he made his. In the balance of this essay, I want to unpack and explain each of those three statements.
  • If you take the time to read the book attached to Hirsch’s appendix, you’ll find a rather effective argument about the nature of background knowledge and public culture. Literacy is not just a matter of decoding the strings of letters that make up words or the meaning of each word in sequence. It is a matter of decoding context: the surrounding matrix of things referred to in the text and things implied by it
  • That means understanding what’s being said in public, in the media, in colloquial conversation. It means understanding what’s not being said. Literacy in the culture confers power, or at least access to power. Illiteracy, whether willful or unwitting, creates isolation from power.
  • his point about background knowledge and the content of shared public culture extends well beyond schoolbooks. They are applicable to the “texts” of everyday life, in commercial culture, in sports talk, in religious language, in politics. In all cases, people become literate in patterns—“schema” is the academic word Hirsch uses. They come to recognize bundles of concept and connotation like “Party of Lincoln.” They perceive those patterns of meaning the same way a chess master reads an in-game chessboard or the way a great baseball manager reads an at bat. And in all cases, pattern recognition requires literacy in particulars.
  • Lots and lots of particulars. This isn’t, or at least shouldn’t be, an ideologically controversial point. After all, parents on both left and right have come to accept recent research that shows that the more spoken words an infant or toddler hears, the more rapidly she will learn and advance in school. Volume and variety matter. And what is true about the vocabulary of spoken or written English is also true, one fractal scale up, about the vocabulary of American culture.
  • those who demonized Hirsch as a right-winger missed the point. Just because an endeavor requires fluency in the past does not make it worshipful of tradition or hostile to change.
  • radicalism is made more powerful when garbed in traditionalism. As Hirsch put it: “To be conservative in the means of communication is the road to effectiveness in modern life, in whatever direction one wishes to be effective.”
  • Hence, he argued, an education that in the name of progressivism disdains past forms, schema, concepts, figures, and symbols is an education that is in fact anti-progressive and “helps preserve the political and economic status quo.” This is true. And it is made more urgently true by the changes in American demography since Hirsch gave us his list in 1987.
  • If you are an immigrant to the United States—or, if you were born here but are the first in your family to go to college, and thus a socioeconomic new arrival; or, say, a black citizen in Ferguson, Missouri deciding for the first time to participate in a municipal election, and thus a civic neophyte—you have a single overriding objective shared by all immigrants at the moment of arrival: figure out how stuff really gets done here.
  • So, for instance, a statement like “One hundred and fifty years after Appomattox, our house remains deeply divided” assumes that the reader knows that Appomattox is both a place and an event; that the event signified the end of a war; that the war was the Civil War and had begun during the presidency of a man, Abraham Lincoln, who earlier had famously declared that “a house divided against itself cannot stand”; that the divisions then were in large part about slavery; and that the divisions today are over the political, social, and economic legacies of slavery and how or whether we are to respond to those legacies.
  • But why a list, one might ask? Aren’t lists just the very worst form of rote learning and standardized, mechanized education? Well, yes and no.
  • it’s not just newcomers who need greater command of common knowledge. People whose families have been here ten generations are often as ignorant about American traditions, mores, history, and idioms as someone “fresh off the boat.”
  • The more serious challenge, for Americans new and old, is to make a common culture that’s greater than the sum of our increasingly diverse parts. It’s not enough for the United States to be a neutral zone where a million little niches of identity might flourish; in order to make our diversity a true asset, Americans need those niches to be able to share a vocabulary. Americans need to be able to have a broad base of common knowledge so that diversity can be most fully activated.
  • as the pool of potential culture-makers has widened, the modes of culture creation have similarly shifted away from hierarchies and institutions to webs and networks. Wikipedia is the prime embodiment of this reality, both in how the online encyclopedia is crowd-created and how every crowd-created entry contains links to other entries.
  • so any endeavor that makes it easier for those who do not know the memes and themes of American civic life to attain them closes the opportunity gap. It is inherently progressive.
  • since I started writing this essay, dipping into the list has become a game my high-school-age daughter and I play together.
  • I’ll name each of those entries, she’ll describe what she thinks to be its meaning. If she doesn’t know, I’ll explain it and give some back story. If I don’t know, we’ll look it up together. This of course is not a good way for her teachers to teach the main content of American history or English. But it is definitely a good way for us both to supplement what school should be giving her.
  • And however long we end up playing this game, it is already teaching her a meta-lesson about the importance of cultural literacy. Now anytime a reference we’ve discussed comes up in the news or on TV or in dinner conversation, she can claim ownership. Sometimes she does so proudly, sometimes with a knowing look. My bet is that the satisfaction of that ownership, and the value of it, will compound as the years and her education progress.
  • The trouble is, there are also many items on Hirsch’s list that don’t seem particularly necessary for entry into today’s civic and economic mainstream.
  • Which brings us back to why diversity matters. The same diversity that makes it necessary to have and to sustain a unifying cultural core demands that Americans make the core less monochromatic, more inclusive, and continuously relevant for contemporary life
  • it’s worth unpacking the baseline assumption of both Hirsch’s original argument and the battles that erupted around it. The assumption was that multiculturalism sits in polar opposition to a traditional common culture, that the fight between multiculturalism and the common culture was zero-sum.
  • As scholars like Ronald Takaki made clear in books like A Different Mirror, the dichotomy made sense only to the extent that one imagined that nonwhite people had had no part in shaping America until they started speaking up in the second half of the twentieth century.
  • The truth, of course, is that since well before the formation of the United States, the United States has been shaped by nonwhites in its mores, political structures, aesthetics, slang, economic practices, cuisine, dress, song, and sensibility.
  • In its serious forms, multiculturalism never asserted that every racial group should have its own sealed and separate history or that each group’s history was equally salient to the formation of the American experience. It simply claimed that the omni-American story—of diversity and hybridity—was the legitimate American story.
  • as Nathan Glazer has put it (somewhat ruefully), “We are all multiculturalists now.” Americans have come to see—have chosen to see—that multiculturalism is not at odds with a single common culture; it is a single common culture.
  • it is true that in a finite school year, say, with finite class time and books of finite heft, not everything about everyone can be taught. There are necessary trade-offs. But in practice, recognizing the true and longstanding diversity of American identity is not an either-or. Learning about the internment of Japanese Americans does not block out knowledge of D-Day or Midway. It is additive.
  • As more diverse voices attain ever more forms of reach and power, we need to re-integrate and reimagine Hirsch’s list of what literate Americans ought to know.
  • To be clear: A 21st-century omni-American approach to cultural literacy is not about crowding out “real” history with the perishable stuff of contemporary life. It’s about drawing lines of descent from the old forms of cultural expression, however formal, to their progeny, however colloquial.
  • Nor is Omni-American cultural literacy about raising the “self-esteem” of the poor, nonwhite, and marginalized. It’s about raising the collective knowledge of all—and recognizing that the wealthy, white, and powerful also have blind spots and swaths of ignorance
  • What, then, would be on your list? It’s not an idle question. It turns out to be the key to rethinking how a list should even get made.
  • the Internet has transformed who makes culture and how. As barriers to culture creation have fallen, orders of magnitude more citizens—amateurs—are able to shape the culture in which we must all be literate. Cat videos and Star Trek fan fiction may not hold up long beside Toni Morrison. But the entry of new creators leads to new claims of right: The right to be recognized. The right to be counted. The right to make the means of recognition and accounting.
  • It is true that lists alone, with no teaching to bring them to life and no expectation that they be connected to a broader education, are somewhere between useless and harmful.
  • This will be a list of nodes and nested networks. It will be a fractal of associations, which reflects far more than a linear list how our brains work and how we learn and create. Hirsch himself nodded to this reality in Cultural Literacy when he described the process he and his colleagues used for collecting items for their list, though he raised it by way of pointing out the danger of infinite regress.
  • His conclusion, appropriate to his times, was that you had to draw boundaries somewhere with the help of experts. My take, appropriate to our times, is that Americans can draw not boundaries so much as circles and linkages, concept sets and pathways among them.
  • Because 5,000 or even 500 items is too daunting a place to start, I ask here only for your top ten. What are ten things every American—newcomer or native born, affluent or indigent—should know? What ten things do you feel are both required knowledge and illuminating gateways to those unenlightened about American life? Here are my entries: Whiteness; The Federalist Papers; The Almighty Dollar; Organized labor; Reconstruction; Nativism; The American Dream; The Reagan Revolution; DARPA; A sucker born every minute.
kushnerha

BBC - Future - Will emoji become a new language? - 2 views

  • Emoji are now used in around half of every sentence on sites like Instagram, and Facebook looks set to introduce them alongside the famous “like” button as a way of expressing your reaction to a post.
  • If you were to believe the headlines, this is just the tipping point: some outlets have claimed that emoji are an emerging language that could soon compete with English in global usage. To many, this would be an exciting evolution of the way we communicate; to others, it is linguistic Armageddon.
  • Do emoji show the same characteristics of other communicative systems and actual languages? And what do they help us to express that words alone can’t say? When emoji appear with text, they often supplement or enhance the writing. This is similar to gestures that appear along with speech. Over the past three decades, research has shown that our hands provide important information that often transcends and clarifies the message in speech. Emoji serve this function too – for instance, adding a kissy or winking face can disambiguate whether a statement is flirtatiously teasing or just plain mean.
  • ...17 more annotations...
  • This is a key point about language use: rarely is natural language ever limited to speech alone. When we are speaking, we constantly use gestures to illustrate what we mean. For this reason, linguists say that language is “multi-modal”. Writing takes away that extra non-verbal information, but emoji may allow us to re-incorporate it into our text.
  • Emoji are not always used as embellishments, however – sometimes, strings of the characters can themselves convey meaning in a longer sequence on their own. But to constitute their own language, they would need a key component: grammar.
  • A grammatical system is a set of constraints that governs how the meaning of an utterance is packaged in a coherent way. Natural language grammars have certain traits that distinguish them. For one, they have individual units that play different roles in the sequence – like nouns and verbs in a sentence. Also, grammar is different from meaning
  • When emoji are isolated, they are primarily governed by simple rules related to meaning alone, without these more complex rules. For instance, according to research by Tyler Schnoebelen, people often create strings of emoji that share a common meaning
  • This sequence has little internal structure; even when it is rearranged, it still conveys the same message. These images are connected solely by their broader meaning. We might consider them to be a visual list: “here are all things related to celebrations and birthdays.” Lists are certainly a conventionalised way of communicating, but they don’t have grammar the way that sentences do.
  • What if the order did matter though? What if they conveyed a temporal sequence of events? Consider this example, which means something like “a woman had a party where they drank, and then opened presents and then had cake”:
  • In all cases, the doer of the action (the agent) precedes the action. In fact, this pattern is commonly found in both full languages and simple communication systems. For example, the majority of the world’s languages place the subject before the verb of a sentence.
  • These rules may seem like the seeds of grammar, but psycholinguist Susan Goldin-Meadow and colleagues have found this order appears in many other systems that would not be considered a language. For example, this order appears when people arrange pictures to describe events from an animated cartoon, or when speaking adults communicate using only gestures. It also appears in the gesture systems created by deaf children who cannot hear spoken languages and are not exposed to sign languages.
  • These children lack exposure to a language and thus invent their own manual systems to communicate, called “homesigns”. These systems are limited in the size of their vocabularies and the types of sequences they can create. For this reason, the agent-act order seems not to be due to a grammar, but to come from basic heuristics – practical workarounds – based on meaning alone. Emoji seem to tap into this same system.
  • Nevertheless, some may argue that despite emoji’s current simplicity, this may be the groundwork for emerging complexity – that although emoji do not constitute a language at the present time, they could develop into one over time.
  • Could an emerging “emoji visual language” be developing in a similar way, with actual grammatical structure? To answer that question, you need to consider the intrinsic constraints on the technology itself.Emoji are created by typing into a computer like text. But, unlike text, most emoji are provided as whole units, except for the limited set of emoticons which convert to emoji, like :) or ;). When writing text, we use the building blocks (letters) to create the units (words), not by searching through a list of every whole word in the language.
  • emoji force us to convey information in a linear unit-unit string, which limits how complex expressions can be made. These constraints may mean that they will never be able to achieve even the most basic complexity that we can create with normal and natural drawings.
  • What’s more, these limits also prevent users from creating novel signs – a requisite for all languages, especially emerging ones. Users have no control over the development of the vocabulary. As the “vocab list” for emoji grows, it will become increasingly unwieldy: using them will require a conscious search process through an external list, not an easy generation from our own mental vocabulary, like the way we naturally speak or draw. This is a key point – it means that emoji lack the flexibility needed to create a new language.
  • we already have very robust visual languages, as can be seen in comics and graphic novels. As I argue in my book, The Visual Language of Comics, the drawings found in comics use a systematic visual vocabulary (such as stink lines to represent smell, or stars to represent dizziness). Importantly, the available vocabulary is not constrained by technology and has developed naturally over time, like spoken and written languages.
  • grammar of sequential images is more of a narrative structure – not of nouns and verbs. Yet, these sequences use principles of combination like any other grammar, including roles played by images, groupings of images, and hierarchic embedding.
  • measured participants’ brainwaves while they viewed sequences one image at a time where a disruption appeared either within the groupings of panels or at the natural break between groupings. The particular brainwave responses that we observed were similar to those that experimenters find when violating the syntax of sentences. That is, the brain responds the same way to violations of “grammar”, whether in sentences or sequential narrative images.
  • I would hypothesise that emoji can use a basic narrative structure to organise short stories (likely made up of agent-action sequences), but I highly doubt that they would be able to create embedded clauses like these. I would also doubt that you would see the same kinds of brain responses that we saw with the comic strip sequences.
Emily Freilich

Are You Smarter Than Your Grandfather? Probably Not. | Science | Smithsonian - 1 views

  • IQ test scores had significantly risen from one generation to the next.
  • widespread increase in IQ scores, and reveals some new ones, regarding teenagers’ vocabularies and the mental decline of the extremely bright in old age. Ultimately, Flynn concludes that human beings are not smarter—just more modern
  • there is a subtest called “similarities,” which asks questions like, what do dogs and rabbits have in common? Or what do truth and beauty have in common? On this subtest, the gains over those 50 years have been quite extraordinary, something like 25 points. The arithmetic subtest essentially tests arithmetical reasoning, and on that, the gains have been extremely small.
  • ...9 more annotations...
  • Formal schooling is terribly important; it helps you think in the way that IQ testers like.
  • In 1910, schools were focused on kids memorizing things about the real world. Today, they are entirely about relationships.
  • One of the fundamental things is the switch from “utilitarian spectacles” to “scientific spectacles.” The fact that we wear scientific spectacles doesn’t mean that we actually know a lot about science.
  • in 1900 in America, if you asked a child, what do dogs and rabbits have in common, they would say, “Well, you use dogs to hunt rabbits.” This is not the answer that the IQ tests want. They want you to classify. Today, a child would be likely to say, “They are both animals.” They picked up the habit of classification and use the vocabulary of science.
  • we have learned to use logic to attack the hypothetical. We have an ability to deal with a much wider range of problems than our ancestors would.
  • In 1950, teenagers could not only understand their parents, but they could also mimic their speech. Today, teenagers can still understand their parents. Their passive vocabularies are good enough. But when it comes to the words they actively use, they are much less capable of adult speak.
  • The brighter you are, the quicker after the age of 65 you have a downward curve for your analytic abilities
  • Retire from your job, but read great literature. Read about the history of science. Try and keep up your problem solving skills
  • One of the most interesting predictions is what will happen to the developing world. If they industrialize, in theory, they should have the explosive IQ gains in the coming century that we had in the last century.
caelengrubb

Our Language Affects What We See - Scientific American - 0 views

  • Does the language you speak influence how you think? This is the question behind the famous linguistic relativity hypothesis, that the grammar or vocabulary of a language imposes on its speakers a particular way of thinking about the world. 
  • The strongest form of the hypothesis is that language determines thought
  • A weak form is now thought to be obviously true, which is that if one language has a specific vocabulary item for a concept but another language does not, then speaking about the concept may happen more frequently or more easily.
  • ...6 more annotations...
  • Scholars are now interested in whether having a vocabulary item for a concept influences thought in domains far from language, such as visual perception.
  • In the journal Psychological Science,  Martin Maier and Rasha Abdel Rahman investigated whether the color distinction in the Russian blues would help the brain become consciously aware of a stimulus which might otherwise go unnoticed.
  • The task selected to investigate this is the "attentional blink." This is an experimental paradigm frequently used to test whether a stimulus is consciously noticed.
  • The current study is an important advance in documenting how linguistic categories influence perception. Consider how this updates the original Russian blues study, in which observers pressed a button to indicate whether two shades of blue were the same or different
  • In that study, it seems likely that observers silently labeled colors in order to make fast decisions. It is less likely that labeling was used during the attentional blink task, because paying attention to color is not required and indeed was irrelevant to the task.
  •  The current finding indicates that linguistic knowledge can influence perception, contradicting the traditional view that perception is processed independently from other aspects of cognition, including language.
caelengrubb

How Did Language Begin? | Linguistic Society of America - 0 views

  • The question is not how languages gradually developed over time into the languages of the world today. Rather, it is how the human species developed over time so that we - and not our closest relatives, the chimpanzees and bonobos - became capable of using language.
  • Human language can express thoughts on an unlimited number of topics (the weather, the war, the past, the future, mathematics, gossip, fairy tales, how to fix the sink...). It can be used not just to convey information, but to solicit information (questions) and to give orders.
  • Every human language has a vocabulary of tens of thousands of words, built up from several dozen speech sounds
  • ...14 more annotations...
  • Animal communication systems, in contrast, typically have at most a few dozen distinct calls, and they are used only to communicate immediate issues such as food, danger, threat, or reconciliation. Many of the sorts of meanings conveyed by chimpanzee communication have counterparts in human 'body language'.
  • The basic difficulty with studying the evolution of language is that the evidence is so sparse. Spoken languages don't leave fossils, and fossil skulls only tell us the overall shape and size of hominid brains, not what the brains could do
  • All present-day languages, including those of hunter-gatherer cultures, have lots of words, can be used to talk about anything under the sun, and can express negation. As far back as we have written records of human language - 5000 years or so - things look basically the same.
  • According to current thinking, the changes crucial for language were not just in the size of the brain, but in its character: the kinds of tasks it is suited to do - as it were, the 'software' it comes furnished with.
  • So the properties of human language are unique in the natural world.
  • About the only definitive evidence we have is the shape of the vocal tract (the mouth, tongue, and throat): Until anatomically modern humans, about 100,000 years ago, the shape of hominid vocal tracts didn't permit the modern range of speech sounds. But that doesn't mean that language necessarily began then.
  • Some researchers even propose that language began as sign language, then (gradually or suddenly) switched to the vocal modality, leaving modern gesture as a residue.
  • In an early stage, sounds would have been used to name a wide range of objects and actions in the environment, and individuals would be able to invent new vocabulary items to talk about new things.
  • In order to achieve a large vocabulary, an important advance would have been the ability to 'digitize' signals into sequences of discrete speech sounds - consonants and vowels - rather than unstructured calls.
  • These two changes alone would yield a communication system of single signals - better than the chimpanzee system but far from modern language. A next plausible step would be the ability to string together several such 'words' to create a message built out of the meanings of its parts.
  • This has led some researchers to propose that the system of 'protolanguage' is still present in modern human brains, hidden under the modern system except when the latter is impaired or not yet developed.
  • Again, it's very hard to tell. We do know that something important happened in the human line between 100,000 and 50,000 years ago: This is when we start to find cultural artifacts such as art and ritual objects, evidence of what we would call civilization.
  • One tantalizing source of evidence has emerged recently. A mutation in a gene called FOXP2 has been shown to lead to deficits in language as well as in control of the face and mouth. This gene is a slightly altered version of a gene found in apes, and it seems to have achieved its present form between 200,000 and 100,000 years ago.
  • Nevertheless, if we are ever going to learn more about how the human language ability evolved, the most promising evidence will probably come from the human genome, which preserves so much of our species' history. The challenge for the future will be to decode it.
Javier E

The Failure of Rational Choice Philosophy - NYTimes.com - 1 views

  • According to Hegel, history is idea-driven.
  • Ideas for him are public, rather than in our heads, and serve to coordinate behavior. They are, in short, pragmatically meaningful words.  To say that history is “idea driven” is to say that, like all cooperation, nation building requires a common basic vocabulary.
  • One prominent component of America’s basic vocabulary is ”individualism.”
  • ...12 more annotations...
  • individualism, the desire to control one’s own life, has many variants. Tocqueville viewed it as selfishness and suspected it, while Emerson and Whitman viewed it as the moment-by-moment expression of one’s unique self and loved it.
  • individualism as the making of choices so as to maximize one’s preferences. This differed from “selfish individualism” in that the preferences were not specified: they could be altruistic as well as selfish. It differed from “expressive individualism” in having general algorithms by which choices were made. These made it rational.
  • it was born in 1951 as “rational choice theory.” Rational choice theory’s mathematical account of individual choice, originally formulated in terms of voting behavior, made it a point-for-point antidote to the collectivist dialectics of Marxism
  • Functionaries at RAND quickly expanded the theory from a tool of social analysis into a set of universal doctrines that we may call “rational choice philosophy.” Governmental seminars and fellowships spread it to universities across the country, aided by the fact that any alternative to it would by definition be collectivist.
  • rational choice philosophy moved smoothly on the backs of their pupils into the “real world” of business and government
  • Today, governments and businesses across the globe simply assume that social reality  is merely a set of individuals freely making rational choices.
  • At home, anti-regulation policies are crafted to appeal to the view that government must in no way interfere with Americans’ freedom of choice.
  • But the real significance of rational choice philosophy lay in ethics. Rational choice theory, being a branch of economics, does not question people’s preferences; it simply studies how they seek to maximize them. Rational choice philosophy seems to maintain this ethical neutrality (see Hans Reichenbach’s 1951 “The Rise of Scientific Philosophy,” an unwitting masterpiece of the genre); but it does not.
  • Whatever my preferences are, I have a better chance of realizing them if I possess wealth and power. Rational choice philosophy thus promulgates a clear and compelling moral imperative: increase your wealth and power!
  • Today, institutions which help individuals do that (corporations, lobbyists) are flourishing; the others (public hospitals, schools) are basically left to rot. Business and law schools prosper; philosophy departments are threatened with closure.
  • Hegel, for one, had denied all three of its central claims in his “Encyclopedia of the Philosophical Sciences” over a century before. In that work, as elsewhere in his writings, nature is not neatly causal, but shot through with randomness. Because of this chaos, we cannot know the significance of what we have done until our community tells us; and ethical life correspondingly consists, not in pursuing wealth and power, but in integrating ourselves into the right kinds of community.
  • By 1953, W. V. O. Quine was exposing the flaws in rational choice epistemology. John Rawls, somewhat later, took on its sham ethical neutrality, arguing that rationality in choice includes moral constraints. The neat causality of rational choice ontology, always at odds with quantum physics, was further jumbled by the environmental crisis, exposed by Rachel Carson’s 1962 book “Silent Spring,” which revealed that the causal effects of human actions were much more complex, and so less predictable, than previously thought.
sissij

Scientists Figure Out When Different Cognitive Abilities Peak Throughout Life | Big Think - 0 views

  • Such skills come from accumulated knowledge which benefits from a lifetime of experience. 
  • Vocabulary, in fact, peaked even later, in the late 60s to early 70s. So now you know why grandpa is so good at crosswords.
  • And here’s a win for the 40+ folks - the below representation of a test of 10,000 visitors to TestMyBrain.org shows that older subjects did better than the young on the vocabulary test.
  • ...4 more annotations...
  • The under-30 group did much better on memory-related tasks, however.
  • Is there one age when all of your mental powers are at their maximum? The researchers don’t think so.  
  • In general, the researchers found 24 to be a key age, after which player abilities slowly declined, losing about 15% of the speed every 15 years. 
  • Older players did perform better in some aspects, making up for the slower brain processing by using simpler strategies and being more efficient. They were, in other words, wiser.
    It is really surprising to me that cognitive abilities are so directly related to age, but it is understandable, since there also seems to be a gulf between seniors and teenagers. There is always something we are especially good at at a certain age. I think this aligns with the logic of evolution: society consists of people of different ages, so they can cooperate and reach the maximum benefit by working together. Society is really diverse, and having people of different ages on the same team can let them cover for one another's cognitive disadvantages. --Sissi (4/4/2017)
Javier E

Obesity: Another thing it's too late to prevent | The Economist - 0 views

  • It's not that it's impossible for governments to hold down obesity; France, which had rapidly rising childhood obesity early this century, instituted an aggressive set of public-health interventions including school-based food and exercise shifts, nurse assessments of overweight kids, visits to families where overweight kids were identified, and so forth. Their childhood obesity rates stabilised at a fraction of America's.
  • The problem isn't that it's not possible; rather, it's that America is incapable of doing it.
  • America's national governing ideology is based almost entirely on the assertion of negative rights, with a few exceptions for positive rights and public goods such as universal elementary education, national defence and highways. But it's become increasingly clear over the past decade that the country simply doesn't have the political vocabulary that would allow it to institute effective national programmes to improve eating and exercise habits or culture. A country that can't think of a vision of public life beyond freedom of individual choice, including the individual choice to watch TV and eat a Big Mac, is not going to be able to craft public policies that encourage people to exercise and eat right. We're the fattest country on earth because that's what our political philosophy leads to. We ought to incorporate that into the way we see ourselves; it's certainly the way other countries see us.
sissij

How Does Expectation Affect Perception - 3 views

  • One important fact is that the brain works in some ways like television transmission, in that it processes stable backgrounds without much attention and moving parts more intensely and differently.
  • Recent research in babies shows that they respond most to unexpected events and use these to evaluate the environment and learn.
  • But, the over arching analysis of visual signals depends on what is expected.
  • ...7 more annotations...
  • Picture of bright light causes eye pupils to react, as if a real light.
  • Good hitters in baseball view the ball as larger.
  • Large people judge the absolute measurement of a doorway as more narrow than others will.
  • Words and thoughts alter sensory information:
  • “She kicked the ball” or “grasped the subject” stimulates the leg or arm brain regions related to kicking or grasping.
  • Experienced observers of ballet or classical Indian dance who have never danced themselves activate, when watching a dance, the specific muscles used in that dance.
  • The brain has many interacting pathways and loops that create expectations with different probabilities from our previous experiences.
    I found this article very interesting because it explains some aspects of how our expectation can influence our perception. The article also mentions language: different vocabulary can alter our perception. I think this can be related to the definition of words we talked about recently. This article suggests that the definition of a word is the result of our expectation, as we often define things in our favor when no clear definition is stated. The relationship can also be reversed, as we use definitions to describe and organize our expectations. --Sissi (11/16/2016)
Javier E

The Story Behind the SAT Overhaul - NYTimes.com - 2 views

  • “When you cover too many topics,” Coleman said, “the assessments designed to measure those standards are inevitably superficial.” He pointed to research showing that more students entering college weren’t prepared and were forced into “remediation programs from which they never escape.” In math, for example, if you examined data from top-performing countries, you found an approach that emphasized “far fewer topics, far deeper,” the opposite of the curriculums he found in the United States, which he described as “a mile wide and an inch deep.”
  • The lessons he brought with him from thinking about the Common Core were evident — that American education needed to be more focused and less superficial, and that it should be possible to test the success of the newly defined standards through an exam that reflected the material being taught in the classroom.
  • she and her team had extensive conversations with students, teachers, parents, counselors, admissions officers and college instructors, asking each group to tell them in detail what they wanted from the test. What they arrived at above all was that a test should reflect the most important skills that were imparted by the best teachers
  • ...12 more annotations...
  • for example, a good instructor would teach Martin Luther King Jr.’s “I Have a Dream” speech by encouraging a conversation that involved analyzing the text and identifying the evidence, both factual and rhetorical, that makes it persuasive. “The opposite of what we’d want is a classroom where a teacher might ask only: ‘What was the year the speech was given? Where was it given?’ ”
  • in the past, assembling the SAT focused on making sure the questions performed on technical grounds, meaning: Were they appropriately easy or difficult among a wide range of students, and were they free of bias when tested across ethnic, racial and religious subgroups? The goal was “maximizing differentiation” among kids, which meant finding items that were answered correctly by those students who were expected to get them right and incorrectly by the weaker students. A simple way of achieving this, Coleman said, was to test the kind of obscure vocabulary words for which the SAT was famous
  • In redesigning the test, the College Board shifted its emphasis. It prioritized content, measuring each question against a set of specifications that reflect the kind of reading and math that students would encounter in college and their work lives. Schmeiser and others then spent much of early last year watching students as they answered a set of 20 or so problems, discussing the questions with the students afterward. “The predictive validity is going to come out the same,” she said of the redesigned test. “But in the new test, we have much more control over the content and skills that are being measured.”
  • Evidence-based reading and writing, he said, will replace the current sections on reading and writing. It will use as its source materials pieces of writing — from science articles to historical documents to literature excerpts — which research suggests are important for educated Americans to know and understand deeply. “The Declaration of Independence, the Constitution, the Bill of Rights and the Federalist Papers,” Coleman said, “have managed to inspire an enduring great conversation about freedom, justice, human dignity in this country and the world” — therefore every SAT will contain a passage from either a founding document or from a text (like Lincoln’s Gettysburg Address) that is part of the “great global conversation” the founding documents inspired.
  • The idea is that the test will emphasize words students should be encountering, like “synthesis,” which can have several meanings depending on their context. Instead of encouraging students to memorize flashcards, the test should promote the idea that they must read widely throughout their high-school years.
  • The Barbara Jordan vocabulary question would have a follow-up — “How do you know your answer is correct?” — to which students would respond by identifying lines in the passage that supported their answer.
  • “No longer will it be good enough to focus on tricks and trying to eliminate answer choices. We are not interested in students just picking an answer, but justifying their answers.”
  • the essay portion of the test will also be reformulated so that it will always be the same, some version of: “As you read the passage in front of you, consider how the author uses evidence such as facts or examples; reasoning to develop ideas and to connect claims and evidence; and stylistic or persuasive elements to add power to the ideas expressed. Write an essay in which you explain how the author builds an argument to persuade an audience.”
  • The math section, too, will be predicated on research that shows that there are “a few areas of math that are a prerequisite for a wide range of college courses” and careers. Coleman conceded that some might treat the news that they were shifting away from more obscure math problems to these fewer fundamental skills as a dumbing-down of the test, but he was adamant that this was not the case. He explained that there will be three areas of focus: problem solving and data analysis, which will include ratios and percentages and other mathematical reasoning used to solve problems in the real world; the “heart of algebra,” which will test how well students can work with linear equations (“a powerful set of tools that echo throughout many fields of study”); and what will be called the “passport to advanced math,” which will focus on the student’s familiarity with complex equations and their applications in science and social science.
  • “Sometimes in the past, there’s been a feeling that tests were measuring some sort of ineffable entity such as intelligence, whatever that might mean. Or ability, whatever that might mean. What this is is a clear message that good hard work is going to pay off and achievement is going to pay off. This is one of the most significant developments that I have seen in the 40-plus years that I’ve been working in admissions in higher education.”
  • The idea of creating a transparent test and then providing a free website that any student could use — not to learn gimmicks but to get a better grounding and additional practice in the core knowledge that would be tested — was appealing to Coleman.
  • (The College Board won’t pay Khan Academy.) They talked about a hypothetical test-prep experience in which students would log on to a personal dashboard, indicate that they wanted to prepare for the SAT and then work through a series of preliminary questions to demonstrate their initial skill level and identify the gaps in their knowledge. Khan said he could foresee a way to estimate the amount of time it would take to achieve certain benchmarks. “It might go something like, ‘O.K., we think you’ll be able to get to this level within the next month and this level within the next two months if you put in 30 minutes a day,’ ” he said. And he saw no reason the site couldn’t predict for anyone, anywhere the score he or she might hope to achieve with a commitment to a prescribed amount of work.
Javier E

Opinion | Gen Z slang terms are influenced by incels - The Washington Post - 0 views

  • Incels (as they’re known) are infamous for sharing misogynistic attitudes and bitter hostility toward the romantically successful
  • somehow, incels’ hateful rhetoric has bizarrely become popularized via Gen Z slang.
  • it’s common to hear the suffix “pilled” as a funny way to say “convinced into a lifestyle.” Instead of “I now love eating burritos,” for instance, one might say, “I’m so burritopilled.” “Pilled” as a suffix comes from a scene in 1999’s “The Matrix” where Neo (Keanu Reeves) had to choose between the red pill and the blue pill, but the modern sense is formed through analogy with “blackpilled,” an online slang term meaning “accepting incel ideology.
  • ...11 more annotations...
  • the popular suffix “maxxing” for “maximizing” (e.g., “I’m burritomaxxing” instead of “I’m eating a lot of burritos”) is drawn from the incel idea of “looksmaxxing,” or “maximizing attractiveness” through surgical or cosmetic techniques.
  • Then there’s the word “cucked” for “weakened” or “emasculated.” If the taqueria is out of burritos, you might be “tacocucked,” drawing on the incel idea of being sexually emasculated by more attractive “chads.
  • These slang terms developed on 4chan precisely because of the site’s anonymity. Since users don’t have identifiable aliases, they signal their in-group status through performative fluency in shared slang
  • there’s a dark side to the site as well — certain boards, like /r9k/, are known breeding grounds for incel discussion, and the source of the incel words being used today.
  • finally, we have the word “sigma” for “assertive male,” which comes from an incel’s desired position outside the social hierarchy.
  • Memes and niche vocabulary become a form of cultural currency, fueling their proliferation.
  • From there, those words filter out to more mainstream websites such as Reddit and eventually become popularized by viral memes and TikTok trends. Social media algorithms do the rest of the work by curating recommended content for viewers.
  • Because these terms often spread in ironic contexts, people find them funny, engage with them and are eventually rewarded with more memes featuring incel vocabulary.
  • Creators are not just aware of this process — they are directly incentivized to abet it. We know that using trending audio helps our videos perform better and that incorporating popular metadata with hashtags or captions will help us reach wider audiences
  • kids aren’t actually saying “cucked” because they’re “blackpilled”; they’re using it for the same reason all kids use slang: It helps them bond as a group. And what are they bonding over? A shared mockery of incel ideas.
  • These words capture an important piece of the Gen Z zeitgeist. We should therefore be aware of them, keeping in mind that they’re being used ironically.
Javier E

What's the secret to learning a second language? - Salon.com - 0 views

  • “Arabic is a language of memorization,” he said. “You just have to drill the words into your head, which unfortunately takes a lot of time.” He thought, “How can I maximize the number of words I learn in the minimum amount of time?”
  • Siebert started studying the science of memory and second-language acquisition and found two concepts that went hand in hand to make learning easier: selective learning and spaced repetition. With selective learning, you spend more time on the things you don’t know, rather than on the things you already do
  • Siebert designed his software to use spaced repetition. If you get cup right, the program will make the interval between seeing the word cup longer and longer, but it will cycle cup back in just when you’re about to forget it. If you’ve forgotten cup entirely, the cycle starts again. This system moves the words from your brain’s short-term memory into long-term memory and maximizes the number of words you can learn effectively in a period. You don’t have to cram. (A minimal sketch of this scheduling idea follows this list.)
  • ...8 more annotations...
  • ARABIC IS ONE of the languages the U.S. Department of State dubs “extremely hard.” Chinese, Japanese, and Korean are the others. These languages’ structures are vastly different from that of English, and they are memorization-driven.
  • To help meet its language-learning goals, in 2003 the Department of Defense established the University of Maryland Center for Advanced Study of Language.
  • MICHAEL GEISLER, a vice president at Middlebury College, which runs the foremost language-immersion school in the country, was blunt: “The drill-and-kill approach we used 20 years ago doesn’t work.” He added, “The typical approach that most programs take these days—Rosetta Stone is one example—is scripted dialogue and picture association. You have a picture of the Eiffel Tower, and you have a sentence to go with it. But that’s not going to teach you the language.”
  • According to Geisler, you need four things to learn a language. First, you have to use it. Second, you have to use it for a purpose. Research shows that doing something while learning a language—preparing a cooking demonstration, creating an art project, putting on a play—stimulates an exchange of meaning that goes beyond using the language for the sake of learning it.Third, you have to use the language in context. This is where Geisler says all programs have fallen short.
  • Fourth, you have to use language in interaction with others. In a 2009 study led by Andrew Meltzoff at the University of Washington, researchers found that young children easily learned a second language from live human interaction while playing and reading books. But audio and DVD approaches with the same material, without the live interaction, fostered no learning progress at all. Two people in conversation constantly give each other feedback that can be used to make changes in how they respond.
  • “our research shows that the ideal model is a blended one,” one that blends technology and a teacher. “Our latest research shows that with the proper use of technology and cognitive neuroscience, we can make language learning more efficient.”
  • The school released its first two online programs, for French and Spanish, last year. The new courses use computer avatars for virtual collaboration; rich video of authentic, unscripted conversations with native speakers; and 3-D role-playing games in which students explore life in a city square, acting as servers and taking orders from customers in a café setting. The goal at the end of the day, as Geisler put it, is for you to “actually be able to interact with a native speaker in his own language and have him understand you, understand him, and, critically, negotiate when you don’t understand what he is saying.” 
  • The program includes the usual vocabulary lists and lessons in how to conjugate verbs, but students are also consistently immersed in images, audio, and video of people from different countries speaking with different accents. Access to actual teachers is another critical component.
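
  As a rough illustration of the spaced-repetition and selective-learning ideas described above, here is a minimal Python sketch. The interval multiplier, reset rule, and all names are illustrative assumptions, not the actual design of Siebert’s software or of any particular product.

```python
from dataclasses import dataclass

@dataclass
class Card:
    word: str
    interval_days: float = 1.0   # wait this long before showing the word again
    due_day: float = 0.0         # day on which the word should next be reviewed

def review(card: Card, today: float, recalled: bool, multiplier: float = 2.0) -> None:
    """Grow the review interval after a correct answer; reset it after a miss."""
    if recalled:
        card.interval_days *= multiplier   # you knew "cup", so wait longer next time
    else:
        card.interval_days = 1.0           # you forgot "cup", so the cycle starts again
    card.due_day = today + card.interval_days

def due_cards(deck: list[Card], today: float) -> list[Card]:
    """Selective learning: only words at risk of being forgotten come back for review."""
    return [c for c in deck if c.due_day <= today]

# Usage: a tiny deck reviewed over five simulated days.
deck = [Card("cup"), Card("kitab")]
for day in range(5):
    for card in due_cards(deck, day):
        review(card, today=day, recalled=(card.word == "cup"))
```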
Javier E

The varieties of denialism | Scientia Salon - 1 views

  • a stimulating conference at Clark University about “Manufacturing Denial,” which brought together scholars from wildly divergent disciplines — from genocide studies to political science to philosophy — to explore the idea that “denialism” may be a sufficiently coherent phenomenon underlying the willful disregard of factual evidence by ideologically motivated groups or individuals.
  • the Oxford defines a denialist as “a person who refuses to admit the truth of a concept or proposition that is supported by the majority of scientific or historical evidence,” which represents a whole different level of cognitive bias or rationalization. Think of it as bias on steroids.
  • First, as a scientist: it’s just not about the facts, indeed — as Brendan showed data in hand during his presentation — insisting on facts may have counterproductive effects, leading the denialist to double down on his belief.
  • ...22 more annotations...
  • if I think that simply explaining the facts to the other side is going to change their mind, then I’m in for a rude awakening.
  • As a philosopher, I found to be somewhat more disturbing the idea that denialism isn’t even about critical thinking.
  • what the large variety of denialisms have in common is a very strong, overwhelming, ideological commitment that helps define the denialist identity in a core manner. This commitment can be religious, ethnical or political in nature, but in all cases it fundamentally shapes the personal identity of the people involved, thus generating a strong emotional attachment, as well as an equally strong emotional backlash against critics.
  • To begin with, of course, they think of themselves as “skeptics,” thus attempting to appropriate a word with a venerable philosophical pedigree and which is supposed to indicate a cautiously rational approach to a given problem. As David Hume put it, a wise person (i.e., a proper skeptic) will proportion her beliefs to the evidence. But there is nothing of the Humean attitude in people who are “skeptical” of evolution, climate change, vaccines, and so forth.
  • Denialists have even begun to appropriate the technical language of informal logic: when told that a majority of climate scientists agree that the planet is warming up, they are all too happy to yell “argument from authority!” When they are told that they should distrust statements coming from the oil industry and from “think tanks” in their pockets they retort “genetic fallacy!” And so on. Never mind that informal fallacies are such only against certain background information, and that it is eminently sensible and rational to trust certain authorities (at the least provisionally), as well as to be suspicious of large organizations with deep pockets and an obvious degree of self-interest.
  • What commonalities can we uncover across instances of denialism that may allow us to tackle the problem beyond facts and elementary logic?
  • the evidence from the literature is overwhelming that denialists have learned to use the vocabulary of critical thinking against their opponents.
  • Another important issue to understand is that denialists exploit the inherently tentative nature of scientific or historical findings to seek refuge for their doctrines.
  • Scientists have been wrong before, and doubtlessly will be again in the future, many times. But the issue is rather one of where it is most rational to place your bets as a Bayesian updater: with the scientific community or with Faux News?
  • Science should be portrayed as a human story of failure and discovery, not as a body of barely comprehensible facts arrived at by epistemic priests.
  • Is there anything that can be done in this respect? I personally like the idea of teaching “science appreciation” classes in high school and college [2], as opposed to more traditional (usually rather boring, both as a student and as a teacher) science instruction
  • Denialists also exploit the media’s self imposed “balanced” approach to presenting facts, which leads to the false impression that there really are two approximately equal sides to every debate.
  • This is a rather recent phenomenon, and it is likely the result of a number of factors affecting the media industry. One, of course, is the onset of the 24-hr media cycle, with its pernicious reliance on punditry. Another is the increasing blurring of the once rather sharp line between reporting and editorializing.
  • The problem with the media is of course made far worse by the ongoing crisis in contemporary journalism, with newspapers, magazines and even television channels constantly facing an uncertain future of revenues.
  • The pushback against denialism, in all its varied incarnations, is likely to be more successful if we shift the focus from persuading individual members of the public to making political and media elites accountable.
  • This is a major result coming out of Brendan’s research. He showed data set after data set demonstrating two fundamental things: first, large sections of the general public do not respond to the presentation of even highly compelling facts; indeed — as mentioned above — they are actually more likely to entrench further into their positions.
  • Second, whenever one can put pressure on either politicians or the media, they do change their tune, becoming more reasonable and presenting things in a truly (as opposed to artificially) balanced way.
  • Third, and most crucially, there is plenty of evidence from political science studies that the public does quickly rally behind a unified political leadership. Hard as it is to fathom now, this has happened a number of times, even in fairly recent memory.
  • when leaders really do lead, the people follow. It’s just that of late the extreme partisan bickering in Washington has made the two major parties entirely incapable of working together on the common ground that they have demonstrably had in the past.
  • Another thing we can do about denialism: we should learn from the detailed study of successful cases and see what worked and how it can be applied to other instances.
  • Yet another thing we can do: seek allies. In the case of evolution denial — for which I have the most first-hand experience — it has been increasingly obvious to me that it is utterly counterproductive for a strident atheist like Dawkins (or even a relatively good-humored one like yours truly) to engage creationists directly. It is far more effective when clergy (Barry Lynn of Americans United for the Separation of Church and State [6] comes to mind) and religious scientists engage them instead.
  • Make no mistake about it: denialism in its various forms is a pernicious social phenomenon, with potentially catastrophic consequences for our society. This should be a rallying call for all serious public intellectuals, academic or not, who have the expertise and the stamina to join the fray and help make this an even marginally better world for us all. It’s most definitely worth the fight.
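A toy illustration of the Bayesian-updating remark above (this is not from the article; the prior and the reliability figures below are invented purely for illustration): how much a source’s endorsement should move your belief depends on how differently that source behaves when a claim is true versus false, which is exactly why a track record of accuracy matters more than volume or confidence.

```python
# Minimal sketch of Bayes' rule applied to "whom should I update on?"
# All probabilities are made-up illustrative assumptions.

def update(prior, p_endorse_if_true, p_endorse_if_false):
    """Posterior probability of a claim after a source endorses it."""
    numerator = p_endorse_if_true * prior
    evidence = numerator + p_endorse_if_false * (1 - prior)
    return numerator / evidence

prior = 0.5  # start agnostic about the claim

# A source far more likely to endorse the claim when it is true than when it
# is false (a rough stand-in for the scientific consensus).
posterior_consensus = update(prior, p_endorse_if_true=0.95, p_endorse_if_false=0.10)

# A source that endorses roughly the same things regardless of the truth
# carries almost no information, so the posterior barely moves.
posterior_pundit = update(prior, p_endorse_if_true=0.60, p_endorse_if_false=0.55)

print(f"after hearing the consensus: {posterior_consensus:.2f}")  # ~0.90
print(f"after hearing the pundit:    {posterior_pundit:.2f}")     # ~0.52
```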
Javier E

Eric Kandel's Visions - The Chronicle Review - The Chronicle of Higher Education - 0 views

  • Judith, "barely clothed and fresh from the seduction and slaying of Holofernes, glows in her voluptuousness. Her hair is a dark sky between the golden branches of Assyrian trees, fertility symbols that represent her eroticism. This young, ecstatic, extravagantly made-up woman confronts the viewer through half-closed eyes in what appears to be a reverie of orgasmic rapture," writes Eric Kandel in his new book, The Age of Insight. Wait a minute. Writes who? Eric Kandel, the Nobel-winning neuroscientist who's spent most of his career fixated on the generously sized neurons of sea snails
  • Kandel goes on to speculate, in a bravura paragraph a few hundred pages later, on the exact neurochemical cognitive circuitry of the painting's viewer:
  • "At a base level, the aesthetics of the image's luminous gold surface, the soft rendering of the body, and the overall harmonious combination of colors could activate the pleasure circuits, triggering the release of dopamine. If Judith's smooth skin and exposed breast trigger the release of endorphins, oxytocin, and vasopressin, one might feel sexual excitement. The latent violence of Holofernes's decapitated head, as well as Judith's own sadistic gaze and upturned lip, could cause the release of norepinephrine, resulting in increased heart rate and blood pressure and triggering the fight-or-flight response. In contrast, the soft brushwork and repetitive, almost meditative, patterning may stimulate the release of serotonin. As the beholder takes in the image and its multifaceted emotional content, the release of acetylcholine to the hippocampus contributes to the storing of the image in the viewer's memory. What ultimately makes an image like Klimt's 'Judith' so irresistible and dynamic is its complexity, the way it activates a number of distinct and often conflicting emotional signals in the brain and combines them to produce a staggeringly complex and fascinating swirl of emotions."
  • His key findings on the snail, for which he shared the 2000 Nobel Prize in Physiology or Medicine, showed that learning and memory change not the neuron's basic structure but rather the nature, strength, and number of its synaptic connections. Further, through focus on the molecular biology involved in a learned reflex like Aplysia's gill retraction, Kandel demonstrated that experience alters nerve cells' synapses by changing their pattern of gene expression. In other words, learning doesn't change what neurons are, but rather what they do.
  • In Search of Memory (Norton), Kandel offered what sounded at the time like a vague research agenda for future generations in the budding field of neuroaesthetics, saying that the science of memory storage lay "at the foothills of a great mountain range." Experts grasp the "cellular and molecular mechanisms," he wrote, but need to move to the level of neural circuits to answer the question, "How are internal representations of a face, a scene, a melody, or an experience encoded in the brain?"
  • Since giving a talk on the matter in 2001, he has been piecing together his own thoughts in relation to his favorite European artists.
  • The field of neuroaesthetics, says one of its founders, Semir Zeki, of University College London, is just 10 to 15 years old. Through brain imaging and other studies, scholars like Zeki have explored the cognitive responses to, say, color contrasts or ambiguities of line or perspective in works by Titian, Michelangelo, Cubists, and Abstract Expressionists. Researchers have also examined the brain's pleasure centers in response to appealing landscapes.
  • it is fundamental to an understanding of human cognition and motivation. Art isn't, as Kandel paraphrases a concept from the late philosopher of art Denis Dutton, "a byproduct of evolution, but rather an evolutionary adaptation—an instinctual trait—that helps us survive because it is crucial to our well-being." The arts encode information, stories, and perspectives that allow us to appraise courses of action and the feelings and motives of others in a palatable, low-risk way.
  • "as far as activity in the brain is concerned, there is a faculty of beauty that is not dependent on the modality through which it is conveyed but which can be activated by at least two sources—musical and visual—and probably by other sources as well." Specifically, in this "brain-based theory of beauty," the paper says, that faculty is associated with activity in the medial orbitofrontal cortex.
  • It also enables Kandel—building on the work of Gombrich and the psychoanalyst and art historian Ernst Kris, among others—to compare the painters' rendering of emotion, the unconscious, and the libido with contemporaneous psychological insights from Freud about latent aggression, pleasure and death instincts, and other primal drives.
  • Kandel views the Expressionists' art through the powerful multiple lenses of turn-of-the-century Vienna's cultural mores and psychological insights. But then he refracts them further, through later discoveries in cognitive science. He seeks to reassure those who fear that the empirical and chemical will diminish the paintings' poetic power. "In art, as in science," he writes, "reductionism does not trivialize our perception—of color, light, and perspective—but allows us to see each of these components in a new way. Indeed, artists, particularly modern artists, have intentionally limited the scope and vocabulary of their expression to convey, as Mark Rothko and Ad Reinhardt do, the most essential, even spiritual ideas of their art."
  • The author of a classic textbook on neuroscience, he seems here to have written a layman's cognition textbook wrapped within a work of art history.
  • "our initial response to the most salient features of the paintings of the Austrian Modernists, like our response to a dangerous animal, is automatic. ... The answer to James's question of how an object simply perceived turns into an object emotionally felt, then, is that the portraits are never objects simply perceived. They are more like the dangerous animal at a distance—both perceived and felt."
  • If imaging is key to gauging therapeutic practices, it will be key to neuroaesthetics as well, Kandel predicts—a broad, intense array of "imaging experiments to see what happens with exaggeration, distorted faces, in the human brain and the monkey brain," viewers' responses to "mixed eroticism and aggression," and the like.
  • while the visual-perception literature might be richer at the moment, there's no reason that neuroaesthetics should restrict its emphasis to the purely visual arts at the expense of music, dance, film, and theater.
  • although Kandel considers The Age of Insight to be more a work of intellectual history than of science, the book summarizes centuries of research on perception. And so you'll find, in those hundreds of pages between Kandel's introduction to Klimt's "Judith" and the neurochemical cadenza about the viewer's response to it, dossiers on vision as information processing; the brain's three-dimensional-space mapping and its interpretations of two-dimensional renderings; face recognition; the mirror neurons that enable us to empathize and physically reflect the affect and intentions we see in others; and many related topics. Kandel elsewhere describes the scientific evidence that creativity is nurtured by spells of relaxation, which foster a connection between conscious and unconscious cognition.
  • Zeki's message to art historians, aesthetic philosophers, and others who chafe at that idea is twofold. The more diplomatic pitch is that neuroaesthetics is different, complementary, and not oppositional to other forms of arts scholarship. But "the stick," as he puts it, is that if arts scholars "want to be taken seriously" by neurobiologists, they need to take advantage of the discoveries of the past half-century. If they don't, he says, "it's a bit like the guys who said to Galileo that we'd rather not look through your telescope."
  • Matthews, a co-author of The Bard on the Brain: Understanding the Mind Through the Art of Shakespeare and the Science of Brain Imaging (Dana Press, 2003), seems open to the elucidations that science and the humanities can cast on each other. The neural pathways of our aesthetic responses are "good explanations," he says. But "does one [type of] explanation supersede all the others? I would argue that they don't, because there's a fundamental disconnection still between ... explanations of neural correlates of conscious experience and conscious experience" itself.
  • There are, Matthews says, "certain kinds of problems that are fundamentally interesting to us as a species: What is love? What motivates us to anger?" Writers put their observations on such matters into idiosyncratic stories, psychologists conceive their observations in a more formalized framework, and neuroscientists like Zeki monitor them at the level of functional changes in the brain. All of those approaches to human experience "intersect," Matthews says, "but no one of them is the explanation."
  • "Conscious experience," he says, "is something we cannot even interrogate in ourselves adequately. What we're always trying to do in effect is capture the conscious experience of the last moment. ... As we think about it, we have no way of capturing more than one part of it."
  • Kandel sees art and art history as "parent disciplines" and psychology and brain science as "antidisciplines," to be drawn together in an E.O. Wilson-like synthesis toward "consilience as an attempt to open a discussion between restricted areas of knowledge." Kandel approvingly cites Stephen Jay Gould's wish for "the sciences and humanities to become the greatest of pals ... but to keep their ineluctably different aims and logics separate as they ply their joint projects and learn from each other."
Javier E

Let's All Feel Superior - NYTimes.com - 0 views

  • People are really good at self-deception. We attend to the facts we like and suppress the ones we don’t. We inflate our own virtues and predict we will behave more nobly than we actually do. As Max H. Bazerman and Ann E. Tenbrunsel write in their book, “Blind Spots,” “When it comes time to make a decision, our thoughts are dominated by thoughts of how we want to behave; thoughts of how we should behave disappear.”
  • In centuries past, people built moral systems that acknowledged this weakness. These systems emphasized our sinfulness. They reminded people of the evil within themselves. Life was seen as an inner struggle against the selfish forces inside. These vocabularies made people aware of how their weaknesses manifested themselves and how to exercise discipline over them. These systems gave people categories with which to process savagery and scripts to follow when they confronted it. They helped people make moral judgments and hold people responsible amidst our frailties.
  • We live in a society oriented around our inner wonderfulness. So when something atrocious happens, people look for some artificial, outside force that must have caused it — like the culture of college football, or some other favorite bogey. People look for laws that can be changed so it never happens again.
  • The proper question is: How can we ourselves overcome our natural tendency to evade and self-deceive? That was the proper question after Abu Ghraib, Madoff, the Wall Street follies and a thousand other scandals. But it’s a question this society has a hard time asking because the most seductive evasion is the one that leads us to deny the underside of our own nature.
Javier E

The Rediscovery of Character - NYTimes.com - 0 views

  • broken windows was only a small piece of what Wilson contributed, and he did not consider it the center of his work. The best way to understand the core Wilson is by borrowing the title of one of his essays: “The Rediscovery of Character.”
  • When Wilson began looking at social policy, at the University of Redlands, the University of Chicago and Harvard, most people did not pay much attention to character. The Marxists looked at material forces. Darwinians at the time treated people as isolated products of competition. Policy makers of right and left thought about how to rearrange economic incentives. “It is as if it were a mark of sophistication for us to shun the language of morality in discussing the problems of mankind,” he once recalled.
  • during the 1960s and ’70s, he noticed that the nation’s problems could not be understood by looking at incentives.
  • “At root,” Wilson wrote in 1985 in The Public Interest, “in almost every area of important concern, we are seeking to induce persons to act virtuously, whether as schoolchildren, applicants for public assistance, would-be lawbreakers or voters and public officials.”
  • When Wilson wrote about character and virtue, he didn’t mean anything high flown or theocratic. It was just the basics, befitting a man who grew up in the middle-class suburbs of Los Angeles in the 1940s: Behave in a balanced way. Think about the long-term consequences of your actions. Cooperate. Be decent.
  • Wilson argued that American communities responded to the stresses of industrialization by fortifying self-control.
  • It was habituated by practicing good manners, by being dependable, punctual and responsible day by day.
  • Wilson set out to learn how groups created good order, and why that order sometimes frayed.
  • In “The Moral Sense,” he brilliantly investigated the virtuous sentiments we are born with and how they are cultivated by habit. Wilson’s broken windows theory was promoted in an essay with George Kelling called “Character and Community.” Wilson and Kelling didn’t think of crime primarily as an individual choice. They saw it as something that emerged from the social psychology of a community. When neighborhoods feel disorganized and scary, crime increases.
  • he emphasized that character was formed in groups. As he wrote in “The Moral Sense,” his 1993 masterpiece, “Order exists because a system of beliefs and sentiments held by members of a society sets limits to what those members can do.”
  • But America responded to the stresses of the information economy by reducing the communal buttresses to self-control, with unfortunate results.
  • Wilson was not a philosopher. He was a social scientist. He just understood that people are moral judgers and moral actors, and he reintegrated the vocabulary of character into discussions of everyday life.
Javier E

Which Language and Grammar Rules to Flout - Room for Debate - NYTimes.com - 0 views

  • Welcome to another round of the Language Wars. By now we know the battle lines: As a “descriptivist,” I try to describe language as it is used. As a “prescriptivist,” you focus on how language should be used.
  • Your excellent guide, “Garner’s Modern American Usage,” shows you to be, in your words, a “descriptive prescriber.” You give not just “right” or “wrong” rulings on usage, but often a 1-5 score, in which a given usage may be a 1 (definitely a mistake), 3 (common, but …) or 5 (perfectly acceptable). This notion of correctness as a scale, not a binary state, makes you different from many prescriptivists.
  • “There is a set of standard conventions everyone needs for formal writing and speaking. Except under unusual circumstances, you should use the grammar and vocabulary of standard written English for these purposes.”
  • One could defensibly call me a descriptivist. I just describe something that dogmatic egalitarians don’t want described: the linguistic choices of a fully informed, highly literate but never uptight user of language. It’s a rational construct — rather like the law’s “reasonable person” — and a highly useful one at that.
  • But that’s all that the reputable usage experts were ever doing.
  • descriptivists have moderated the indefensible positions they once took. The linguists have switched their position — without, of course, acknowledging that this is what they’ve done.
  • The fact that you and other linguists are now embracing the prescriptive tradition is cause for celebration.
Javier E

The Older Mind May Just Be a Fuller Mind - NYTimes.com - 0 views

  • Memory’s speed and accuracy begin to slip around age 25 and keep on slipping.
  • Now comes a new kind of challenge to the evidence of a cognitive decline, from a decidedly digital quarter: data mining, based on theories of information processing.
  • Since educated older people generally know more words than younger people, simply by virtue of having been around longer, the experiment simulates what an older brain has to do to retrieve a word. And when the researchers incorporated that difference into the models, the aging “deficits” largely disappeared.
  • Neuroscientists have some reason to believe that neural processing speed, like many reflexes, slows over the years; anatomical studies suggest that the brain also undergoes subtle structural changes that could affect memory.
  • doubts about the average extent of the decline are rooted not in individual differences but in study methodology. Many studies comparing older and younger people, for instance, did not take into account the effects of pre-symptomatic Alzheimer’s disease.
  • The new data-mining analysis also raises questions about many of the measures scientists use. Dr. Ramscar and his colleagues applied leading learning models to an estimated pool of words and phrases that an educated 70-year-old would have seen, and another pool suitable for an educated 20-year-old. Their model accounted for more than 75 percent of the difference in scores between older and younger adults on items in a paired-associate test.
  • That is to say, the larger the library you have in your head, the longer it usually takes to find a particular word (or pair); a toy simulation of this effect appears after this list.
  • Scientists who study thinking and memory often make a broad distinction between “fluid” and “crystallized” intelligence. The former includes short-term memory, like holding a phone number in mind, analytical reasoning, and the ability to tune out distractions, like ambient conversation. The latter is accumulated knowledge, vocabulary and expertise.
  • an increase in crystallized intelligence can account for a decrease in fluid intelligence.
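The “bigger library, slower lookup” idea in the annotations above can be made concrete with a deliberately crude simulation. This is not the study’s actual learning model; the lexicon sizes, the uniform competitor strengths, and the use of rank as a stand-in for retrieval time are all assumptions chosen only to show how added knowledge by itself can slow retrieval without any underlying decline.

```python
import random

# Retrieval is modelled as ranking the cue's target among competitors drawn
# from the rest of the lexicon. A larger lexicon means more competitors, so
# the target's expected rank (our stand-in for retrieval time) drifts upward.

def mean_retrieval_rank(lexicon_size, target_strength=0.9, trials=100, seed=42):
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        competitors = (rng.random() for _ in range(lexicon_size - 1))
        # rank 1 = retrieved first; each stronger competitor pushes the target down
        total += 1 + sum(1 for s in competitors if s > target_strength)
    return total / trials

for size in (10_000, 30_000, 90_000):  # purely illustrative vocabulary sizes
    print(f"lexicon of {size:>6} items -> mean retrieval rank {mean_retrieval_rank(size):8.1f}")
```

Running the sketch shows the mean rank growing roughly in proportion to lexicon size, even though the “knowledge” of the simulated rememberer has only increased.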
Javier E

The Painful Hunt for Malaysia Airlines Flight 370 - NYTimes.com - 0 views

  • Humanity now produces as much data in two days as it did in all of history till the year 2003 — and the amount of data is doubling every two years.
  • In the time you take to read this piece, the human race will generate as much data as currently exists in the Library of Congress.
  • 10 percent of all the pictures ever taken as of the end of 2011 were taken in 2011.
  • it’s our folly to assume we know very much at all. There’s “a highly objectionable word,” he writes, “which should be removed from our vocabulary in discussions of major events,” and that word is “knew.”
  • Whatever the field of our expertise, most of us realize that the more data we acquire, the less, very often, we know. The universe is not a fixed sum, in which the amount you know subtracts from the amount you don’t.
Javier E

Metaphors: Johnson: The impossibility of being literal | The Economist - 1 views

  • IT IS literally impossible to be literal.
  • Guy Deutscher, a linguist at the University of Manchester, calls all language “a reef of dead metaphor”. Most of the time we do not realise that nearly every word that comes out of our mouths has made some kind of jump from older, concrete meanings to the ones we use today.  This process is simple language change. Yesterday’s metaphors become so common that today we don’t process them as metaphors at all. 
  • if “tree” and “rock” aren’t metaphors, nearly everything else in our vocabulary seems to be. For example, you cannot use “independent” without metaphor, unless you mean “not hanging from”. You can’t use “transpire” unless you mean “to breathe through”. The first English meaning of a book was “a written document”. If we want to avoid all metaphorised language (if we want to be “literal”), we must constantly rush to a historical dictionary and frantically check the history of every word we use.
  • In every language, pretty much everything is metaphor—even good old “literally”, the battle-axe of those who think that words can always be pinned down precisely.
  • The body of educated English speakers has decided, by voice and by deed, that “literally” does mean something real in the real world. Namely, “not figuratively, allegorically”. Widespread educated usage is ultimately what determines its meaning. And perhaps that is concrete enough.