TOK Friends: Group items tagged Metaphors

sissij

The Right Way to Fall - The New York Times - 1 view

  • According to paratroopers, stunt professionals, physical therapists and martial arts instructors, there is indeed a “right way” to fall — and it can save you a lot of grief if you know how to do it.
  • The Agency for Healthcare Research and Quality estimates that falls cause more than a third of injury-related emergency room visits, around 7.9 million a year.
  • Moreover, falling straight forward or backward raises the risk of damaging your spine and vital organs.
  • ...4 more annotations...
  • You similarly don’t want to come crashing down on your knee so you break your kneecap or do that maneuver where you kind of pedal with your feet to catch yourself, which can lead to broken bones in your foot and ankle.
  • Paratroopers’ goal is to fall sideways in the direction the wind is carrying them — in no way resisting the momentum of the fall. When the balls of their feet barely reach the ground, they immediately distribute the impact in rapid sequence up through the calf to the thigh and buttocks.
  • Accept that you’re falling and go with it, round your body, and don’t stiffen and distribute the energy so you take the fall in the widest area possible.
  • Young children are arguably the best fallers because they have yet to develop fear or embarrassment, so they just tumble and roll without tensing up and trying to catch themselves.
  • There are techniques and science even in how you choose to fall. After reading this article, I take the advice somewhat metaphorically. The article says: "Accept that you're falling and go with it, round your body, and don't stiffen and distribute the energy so you take the fall in the widest area possible." I think it also applies to times when we meet obstacles and fall in our lives. We sometimes just have to accept the grief and go with it. Although there are many novels depicting heroes fighting against their fall, as individuals in reality, I think the better way to deal with our low points is to go with them and let them fade away. Always keeping your pain and grief at high concentration will only lead to a broken heart. --Sissi (1/26/2017)
demetriar

How Culture Shapes Our Senses - NYTimes.com - 3 views

  • social psychologist Daryl J. Bem described the knowledge we gain from our senses as “zero-order beliefs,” so taken for granted that we do not even notice them as beliefs. The sky is blue. The fan hums. Ice is cold. That’s the nature of reality, and it seems peculiar that different people with their senses intact would experience it subjectively.
  • sensory perception is culturally specific.
  • But more and more are willing to argue that sensory perception is as much about the cultural training of attention as it is about biological capacity.
  • ...4 more annotations...
  • That’s why we think of scent as a trigger for personal memory — leading to the recall of something specific, particular, uniquely our own.
  • When the research team visited the Jahai, rain-forest foragers on the Malay Peninsula, they found that the Jahai were succinct and more accurate with the scratch-and-sniff cards.
  • The team also found that several communities — speakers of Persian, Turkish and Zapotec — used different metaphors than English and Dutch speakers to describe pitch, or frequency: Sounds were thin or thick rather than high or low. In later work, they demonstrated that the metaphors were powerful enough to disrupt perception.
  • younger Cantonese speakers had fewer words for tastes and smells than older ones, a shift attributed to rapid socioeconomic development and Western-style schooling.
Javier E

Opinion | The Strange Failure of the Educated Elite - The New York Times - 0 views

  • We replaced a system based on birth with a fairer system based on talent. We opened up the universities and the workplace to Jews, women and minorities. University attendance surged, creating the most educated generation in history. We created a new boomer ethos, which was egalitarian (bluejeans everywhere!), socially conscious (recycling!) and deeply committed to ending bigotry.
  • The older establishment won World War II and built the American Century. We, on the other hand, led to Donald Trump. The chief accomplishment of the current educated elite is that it has produced a bipartisan revolt against itself.
  • the new meritocratic aristocracy has come to look like every other aristocracy. The members of the educated class use their intellectual, financial and social advantages to pass down privilege to their children, creating a hereditary elite that is ever more insulated from the rest of society. We need to build a meritocracy that is true to its values, truly open to all.
  • ...17 more annotations...
  • But the narrative is insufficient. The real problem with the modern meritocracy can be found in the ideology of meritocracy itself. Meritocracy is a system built on the maximization of individual talent, and that system unwittingly encourages several ruinous beliefs:
  • Exaggerated faith in intelligence.
  • Many of the great failures of the last 50 years, from Vietnam to Watergate to the financial crisis, were caused by extremely intelligent people who didn’t care about the civic consequences of their actions.
  • Misplaced faith in autonomy
  • The meritocracy is based on the metaphor that life is a journey. On graduation days, members of the educated class give their young Dr. Seuss’ “Oh, the Places You’ll Go!” which shows a main character, “you,” who goes on a solitary, unencumbered journey through life toward success. If you build a society upon this metaphor you will wind up with a society high in narcissism and low in social connection
  • Life is not really an individual journey. Life is more like settling a sequence of villages. You help build a community at home, at work, in your town and then you go off and settle more villages.
  • Misplaced notion of the self
  • Instead of seeing the self as the seat of the soul, the meritocracy sees the self as a vessel of human capital, a series of talents to be cultivated and accomplishments to be celebrated.
  • If you base a society on a conception of self that is about achievement, not character, you will wind up with a society that is demoralized; that puts little emphasis on the sorts of moral systems that create harmony within people, harmony between people and harmony between people and their ultimate purpose.
  • Inability to think institutionally.
  • Previous elites poured themselves into institutions and were pretty good at maintaining existing institutions, like the U.S. Congress, and building new ones, like the postwar global order.
  • The current generation sees institutions as things they pass through on the way to individual success. Some institutions, like Congress and the political parties, have decayed to the point of uselessness, while others, like corporations, lose their generational consciousness
  • Misplaced idolization of diversity
  • But diversity is a midpoint, not an endpoint. Just as a mind has to be opened so that it can close on something, an organization has to be diverse so that different perspectives can serve some end.
  • Diversity for its own sake, without a common telos, is infinitely centrifugal, and leads to social fragmentation.
  • The essential point is this: Those dimwitted, stuck up blue bloods in the old establishment had something we meritocrats lack — a civic consciousness, a sense that we live life embedded in community and nation, that we owe a debt to community and nation and that the essence of the admirable life is community before self.
  • The meritocracy is here to stay, thank goodness, but we probably need a new ethos to reconfigure it — to redefine how people are seen, how applicants are selected, how social roles are understood and how we narrate a common national purpose
Javier E

Opinion | The 1619 Chronicles - The New York Times - 0 views

  • The 1619 Project introduced a date, previously obscure to most Americans, that ought always to have been thought of as seminal — and probably now will. It offered fresh reminders of the extent to which Black freedom was a victory gained by courageous Black Americans, and not just a gift obtained from benevolent whites.
  • in a point missed by many of the 1619 Project’s critics, it does not reject American values. As Nikole Hannah-Jones, its creator and leading voice, concluded in her essay for the project, “I wish, now, that I could go back to the younger me and tell her that her people’s ancestry started here, on these lands, and to boldly, proudly, draw the stars and those stripes of the American flag.” It’s an unabashedly patriotic thought.
  • ambition can be double-edged. Journalists are, most often, in the business of writing the first rough draft of history, not trying to have the last word on it. We are best when we try to tell truths with a lowercase t, following evidence in directions unseen, not the capital-T truth of a pre-established narrative in which inconvenient facts get discarded
  • ...25 more annotations...
  • on these points — and for all of its virtues, buzz, spinoffs and a Pulitzer Prize — the 1619 Project has failed.
  • That doesn’t mean that the project seeks to erase the Declaration of Independence from history. But it does mean that it seeks to dethrone the Fourth of July by treating American history as a story of Black struggle against white supremacy — of which the Declaration is, for all of its high-flown rhetoric, supposed to be merely a part.
  • The deleted assertions went to the core of the project’s most controversial goal, “to reframe American history by considering what it would mean to regard 1619 as our nation’s birth year.”
  • She then challenged me to find any instance in which the project stated that “using 1776 as our country’s birth date is wrong,” that it “should not be taught to schoolchildren,” and that the only one “that should be taught” was 1619. “Good luck unearthing any of us arguing that,” she added.
  • I emailed her to ask if she could point to any instances before this controversy in which she had acknowledged that her claims about 1619 as “our true founding” had been merely metaphorical. Her answer was that the idea of treating the 1619 date metaphorically should have been so obvious that it went without saying.
  • Here is an excerpt from the introductory essay to the project by The New York Times Magazine’s editor, Jake Silverstein, as it appeared in print in August 2019 (italics added):
  • “1619. It is not a year that most Americans know as a notable date in our country’s history. Those who do are at most a tiny fraction of those who can tell you that 1776 is the year of our nation’s birth. What if, however, we were to tell you that this fact, which is taught in our schools and unanimously celebrated every Fourth of July, is wrong, and that the country’s true birth date, the moment that its defining contradictions first came into the world, was in late August of 1619?”
  • In his introduction, Silverstein argues that America’s “defining contradictions” were born in August 1619, when a ship carrying 20 to 30 enslaved Africans from what is present-day Angola arrived in Point Comfort, in the English colony of Virginia. And the title page of Hannah-Jones’s essay for the project insists that “our founding ideals of liberty and equality were false when they were written.”
  • What was surprising was that in 1776 a politically formidable “defining contradiction” — “that all men are created equal” — came into existence through the Declaration of Independence. As Abraham Lincoln wrote in 1859, that foundational document would forever serve as a “rebuke and stumbling block to the very harbingers of reappearing tyranny and oppression.”
  • As for the notion that the Declaration’s principles were “false” in 1776, ideals aren’t false merely because they are unrealized, much less because many of the men who championed them, and the nation they created, hypocritically failed to live up to them.
  • These two flaws led to a third, conceptual, error. “Out of slavery — and the anti-Black racism it required — grew nearly everything that has truly made America exceptional,” writes Silverstein.
  • Nearly everything? What about, say, the ideas contained by the First Amendment? Or the spirit of openness that brought millions of immigrants through places like Ellis Island? Or the enlightened worldview of the Marshall Plan and the Berlin airlift? Or the spirit of scientific genius and discovery exemplified by the polio vaccine and the moon landing?
  • On the opposite side of the moral ledger, to what extent does anti-Black racism figure in American disgraces such as the brutalization of Native Americans, the Chinese Exclusion Act or the internment of Japanese-Americans in World War II?
  • The world is complex. So are people and their motives. The job of journalism is to take account of that complexity, not simplify it out of existence through the adoption of some ideological orthodoxy.
  • This mistake goes far to explain the 1619 Project’s subsequent scholarly and journalistic entanglements. It should have been enough to make strong yet nuanced claims about the role of slavery and racism in American history. Instead, it issued categorical and totalizing assertions that are difficult to defend on close examination.
  • It should have been enough for the project to serve as curator for a range of erudite and interesting voices, with ample room for contrary takes. Instead, virtually every writer in the project seems to sing from the same song sheet, alienating other potential supporters of the project and polarizing national debate.
  • James McPherson, the Pulitzer Prize-winning author of “Battle Cry of Freedom” and a past president of the American Historical Association. He was withering: “Almost from the outset,” McPherson told the World Socialist Web Site, “I was disturbed by what seemed like a very unbalanced, one-sided account, which lacked context and perspective.”
  • In particular, McPherson objected to Hannah-Jones’s suggestion that the struggle against slavery and racism and for civil rights and democracy was, if not exclusively then mostly, a Black one. As she wrote in her essay: “The truth is that as much democracy as this nation has today, it has been borne on the backs of Black resistance.”
  • McPherson demurs: “From the Quakers in the 18th century, on through the abolitionists in the antebellum, to the Radical Republicans in the Civil War and Reconstruction, to the N.A.A.C.P., which was an interracial organization founded in 1909, down through the civil rights movements of the 1950s and 1960s, there have been a lot of whites who have fought against slavery and racial discrimination, and against racism,” he said. “And that’s what’s missing from this perspective.”
  • Wilentz’s catalog of the project’s mistakes is extensive. Hannah-Jones’s essay claimed that by 1776 Britain was “deeply conflicted” over its role in slavery. But despite the landmark Somerset v. Stewart court ruling in 1772, which held that slavery was not supported by English common law, it remained deeply embedded in the practices of the British Empire. The essay claimed that, among Londoners, “there were growing calls to abolish the slave trade” by 1776. But the movement to abolish the British slave trade only began about a decade later — inspired, in part, Wilentz notes, by American antislavery agitation that had started in the 1760s and 1770s.
  • Leslie M. Harris, an expert on pre-Civil War African-American life and slavery. “On Aug. 19 of last year,” Harris wrote, “I listened in stunned silence as Nikole Hannah-Jones … repeated an idea that I had vigorously argued against with her fact checker: that the patriots fought the American Revolution in large part to preserve slavery in North America.”
  • The larger problem is that The Times’s editors, however much background reading they might have done, are not in a position to adjudicate historical disputes. That should have been an additional reason for the 1619 Project to seek input from, and include contributions by, an intellectually diverse range of scholarly voices. Yet not only does the project choose a side, it also brooks no doubt.
  • “It is finally time to tell our story truthfully,” the magazine declares on its 1619 cover page. Finally? Truthfully? Is The Times suggesting that distinguished historians, like the ones who have seriously disputed aspects of the project, had previously been telling half-truths or falsehoods?
  • unlike other dates, 1776 uniquely marries letter and spirit, politics and principle: The declaration that something new is born, combined with the expression of an ideal that — because we continue to believe in it even as we struggle to live up to it — binds us to the date.
  • On the other, the 1619 Project has become, partly by its design and partly because of avoidable mistakes, a focal point of the kind of intense national debate that columnists are supposed to cover, and that is being widely written about outside The Times. To avoid writing about it on account of the first scruple is to be derelict in our responsibility toward the second.
Javier E

'I Think This Guy Is, Like, Passed Out in His Tesla' - The New York Times - 0 views

  • Tesla’s response to these videos has been consistent: Autopilot is meant to function as a complement to a conscious driver, not a replacement. If you don’t keep a hand on the wheel, your Tesla is supposed to beep at you; eventually it’s supposed to slow to a stop and put its hazard lights on. Anyway, who knows if these clips were real? Couldn’t some of them be the work of pranksters?
  • of course you can still fall asleep with a hand on the wheel — or you can go on YouTube and watch Tesla drivers swap tips for using a water bottle or custom “cellphone holder” to fool the system.
  • What’s fascinating is the way the sci-fi novelty of Autopilot — combined with the deep familiarity of old-fashioned driving — manages to warp our danger-detecting radar. There are instances in which investigators have found that the Autopilot system contributed to crashes, but none of those have been captured on film.
  • ...6 more annotations...
  • driving is already one of the more dangerous activities Americans undertake on a daily basis. According to the National Highway Traffic Safety Administration, “drowsy driving” was a factor in 91,000 crashes, resulting in 50,000 people injured and 810 deaths in 2017, so it’s theoretically possible that what some of these videos are showing us is disaster averted, not disaster in motion.
  • Tesla once generated widespread good will by promising affordable electric cars that would make the world cleaner and safer. But over time, its image was tarnished by missed deadlines, worrying crash reports, signs of a cultlike corporate culture and a chief executive, Elon Musk, who habitually exaggerates progress while announcing extravagant new ideas. This was hardly the institution you would want determining the future of highway safety.
  • These technologies — and the companies that engineer them — keep turning out to be less benign than imagined. We fell in love with Amazon, but now we miss the local stores it closed. We couldn’t resist the convenience of Uber and Lyft, but now we’ve seen their effect on public transit and drivers. “Jetsons”-esque smart-home technology turned out to be riddled with glitches and vulnerable to hackers.
  • Tech companies have hollowed out old industries, shredded privacy, disregarded regulations and created new vectors for the spread of misinformation and extremism, and now there is a sense that choices we have already made — tectonic shifts already in motion, terms of service already accepted — may be changing us in ways that we are only beginning to process, ready to leap up and bite us in the collective behind.
  • It’s hard to imagine a more potent visual metaphor for this feeling than a human lulled to sleep inside a hunk of metal and glass, hurtling down a highway under the control of proprietary algorithms beamed on board from Palo Alto
  • These videos are magnetic not just because of the eerie images they contain, but also because, watching them, we can’t actually be sure what we’re seeing. Is this danger or safety or both at once? Perhaps in a different era we would have cried out in excitement: How cool! Today we are more tempted to gasp in shock and call out a warning: Wake up!
manhefnawi

An Axiom of Feeling: Werner Herzog on the Absolute, the Sublime, and Ecstatic Truth - B... - 0 views

  • “The soul of the listener or the spectator… actualizes truth through the experience of sublimity: that is, it completes an independent act of creation.”
  • Nietzsche defined truth as “a movable host of metaphors, metonymies, and anthropomorphisms: in short, a sum of human relations which have been poetically and rhetorically intensified, transferred, and embellished.” Truth, of course, is not reality but a subset of reality, alongside the catalogue of fact and the question of meaning, inside which human consciousness dwells. “Only art penetrates … the seeming realities of this world,” Saul Bellow asserted in his superb Nobel Prize acceptance speech. “There is another reality, the genuine one, which we lose sight of. This other reality is always sending us hints, which without art, we can’t receive.”
Javier E

The Constitution of Knowledge - Persuasion - 0 views

  • When Americans think about how we find truth amid a world full of discordant viewpoints, we usually turn to a metaphor, that of the marketplace of ideas
  • It is a good metaphor as far as it goes, yet woefully incomplete. It conjures up an image of ideas being traded by individuals in a kind of flea market, or of disembodied ideas clashing and competing in some ethereal realm of their own
  • But ideas in the marketplace do not talk directly to each other, and for the most part neither do individuals.
  • ...31 more annotations...
  • Rather, our conversations are mediated through institutions like journals and newspapers and social-media platforms. They rely on a dense network of norms and rules, like truthfulness and fact-checking. They depend on the expertise of professionals, like peer reviewers and editors. The entire system rests on a foundation of values: a shared understanding that there are right and wrong ways to make knowledge.
  • Those values and rules and institutions do for knowledge what the U.S. Constitution does for politics: They create a governing structure, forcing social contestation onto peaceful and productive pathways.
  • I call them, collectively, the Constitution of Knowledge. If we want to defend that system from its many persistent attackers, we need to understand it—and its very special notion of reality.
  • What reality really is
  • The question “What is reality?” may seem either too metaphysical to answer meaningfully or too obvious to need answering
  • The whole problem is that humans have no direct access to an objective world independent of our minds and senses, and subjective certainty is no guarantee of truth. Faced with those problems and others, philosophers and practitioners think of reality as a set of propositions (or claims, or statements) that have been validated in some way, and that have thereby been shown to be at least conditionally true—true, that is, unless debunked
  • Some propositions reflect reality as we perceive it in everyday life (“The sky is blue”). Others, like the equations on a quantum physicist’s blackboard, are incomprehensible to intuition. Many fall somewhere in between.
  • a phrase I used a few sentences ago, “validated in some way,” hides a cheat. In epistemology, the whole question is, validated in what way? If we care about knowledge, freedom, and peace, then we need to stake a strong claim: Anyone can believe anything, but liberal science—open-ended, depersonalized checking by an error-seeking social network—is the only legitimate validator of knowledge, at least in the reality-based community.
  • That is a very bold, very broad, very tough claim, and it goes down very badly with lots of people and communities who feel ignored or oppressed by the Constitution of Knowledge: creationists, Christian Scientists, homeopaths, astrologists, flat-earthers, anti-vaxxers, birthers, 9/11 truthers, postmodern professors, political partisans, QAnon followers, and adherents of any number of other belief systems and religions.
  • But, like the U.S. Constitution’s claim to exclusivity in governing (“unconstitutional” means “illegal,” period), the Constitution of Knowledge’s claim to exclusivity is its sine qua non.
  • Rules for reality
  • Say you believe something (X) to be true, and you believe that its acceptance as true by others is important or at least warranted
  • The specific proposition does not matter. What does matter is that the only way to validate it is to submit it to the reality-based community. Otherwise, you could win dominance for your proposition by, say, brute force, threatening and jailing and torturing and killing those who see things differently—a standard method down through history
  • Or you and your like-minded friends could go off and talk only to each other, in which case you would have founded a cult—which is lawful but socially divisive and epistemically worthless.
  • Or you could engage in a social-media campaign to shame and intimidate those who disagree with you—a very common method these days, but one that stifles debate and throttles knowledge (and harms a lot of people).
  • What the reality-based community does is something else again. Its distinctive qualities derive from two core rules: 
  • The fallibilist rule: No one gets the final say. You may claim that a statement is established as knowledge only if it can be debunked, in principle, and only insofar as it withstands attempts to debunk it.
  • What counts is the way the rule directs us to behave: You must assume your own and everyone else’s fallibility and you must hunt for your own and others’ errors, even if you are confident you are right. Otherwise, you are not reality-based.
  • The empirical rule: No one has personal authority. You may claim that a statement has been established as knowledge only insofar as the method used to check it gives the same result regardless of the identity of the checker, and regardless of the source of the statement
  • Who you are does not count; the rules apply to everybody and persons are interchangeable. If your method is valid only for you or your affinity group or people who believe as you do, then you are not reality-based.
  • Whatever you do to check a proposition must be something that anyone can do, at least in principle, and get the same result. Also, no one proposing a hypothesis gets a free pass simply because of who she is or what group she belongs to.
  • Both rules have very profound social implications. “No final say” insists that to be knowledge, a statement must be checked; and it also says that knowledge is always provisional, standing only as long as it withstands checking.
  • “No personal authority” adds a crucial second step by defining what properly counts as checking. The point, as the great American philosopher Charles Sanders Peirce emphasized more than a century ago, is not that I look or you look but that we look; and then we compare, contest, and justify our views. Critically, then, the empirical rule is a social principle that forces us into the same conversation—a requirement that all of us, however different our viewpoints, agree to discuss what is in principle only one reality.
  • By extension, the empirical rule also dictates what does not count as checking: claims to authority by dint of a personally or tribally privileged perspective.
  • In principle, persons and groups are interchangeable. If I claim access to divine revelation, or if I claim the support of miracles that only believers can witness, or if I claim that my class or race or historically dominant status or historically oppressed status allows me to know and say things that others cannot, then I am breaking the empirical rule by exempting my views from contestability by others.
  • Though seemingly simple, the two rules define a style of social learning that prohibits a lot of the rhetorical moves we see every day.
  • Claiming that a conversation is too dangerous or blasphemous or oppressive or traumatizing to tolerate will almost always break the fallibilist rule.
  • Claims which begin “as a Jew,” or “as a queer,” or for that matter “as minister of information” or “as Pope” or “as head of the Supreme Soviet,” can be valid if they provide useful information about context or credentials; but if they claim to settle an argument by appealing to personal or tribal authority, rather than earned authority, they violate the empirical rule. 
  • “No personal authority” says nothing against trying to understand where people are coming from. If we are debating same-sex marriage, I may mention my experience as a gay person, and my experience may (I hope) be relevant.
  • But statements about personal standing and interest inform the conversation; they do not control it, dominate it, or end it. The rule acknowledges, and to an extent accepts, that people’s social positions and histories matter; but it asks its adherents not to burrow into their social identities, and not to play them as rhetorical trump cards, but to bring them to the larger project of knowledge-building and thereby transcend them.
  • the fallibilist and empirical rules are the common basis of science, journalism, law, and all the other branches of today’s reality-based community. For that reason, both rules also attract hostility, defiance, interference, and open warfare from those who would rather manipulate truth than advance it.
Javier E

Obsessed? You're Not Alone - NYTimes.com - 1 view

  • “We think that things are not good unless we’re obsessed about them,” he continued. “If you’re only mildly interested in your partner, that’s not as hot as being obsessed about somebody. Being blandly detached and mild seems like a failure.”
  • Contrary to our claims of obsession, Professor Davis believes that “the generation now is very low key — the emotions are flat — compared to movies from the ’50s, when people look sentimental.”
  • the etymology for “obsess” dates back to the turn of the 16th century (though “obsessed” itself wasn’t used until the mid-19th century) and is related to the devil. “ ‘Possession’ meant the devil occupied your soul and you had no awareness. If you were ‘obsessed,’ it meant the devil occupied your body but you were aware.” The meaning, then, has changed from something people “are forced to do but want to stop, to something they want to do and get pleasure from,” he said.
  • ...3 more annotations...
  • Consequently, he said, “it inflates the language. We’re using this powerful word, but lowering the standard by having everybody be obsessed by everything.” (This jibes with a general culture of hyperbole on the Internet, where in an attempt to stand out from the noise, everything is the best. Thing. EVER!)
  • “It’s a gesture of self-promotion, innocent self-aggrandizement,” he said. “They’re compensating for not having the capacity to get all that interested in things. Younger people create an atmosphere of hysteria around themselves and get caught up in it.”
  • “It’s basically a case of hyperbolic metaphorical extension — which is just a fancy way of saying it is fun to use over-the-top words about less-than-over-the-top things,” she said. “This is pretty common in English. ‘Awesome’ used to mean ‘inspiring awe’ and now it means, ‘That was a great burrito, dude.’ ”
Javier E

Is Confidence in Science as a Source of Progress Based on Faith or Fact? - NYTimes.com - 3 views

  • There’s been a range of interesting reactions to my piece on Pete Seeger’s question about whether confidence in science as a source of human progress is underpinned by fact or faith.
  • the discussion was not about confidence in science as an enterprise, but confidence that benefits would always accrue to society from applications of scientific knowledge
  • Theologically speaking, science constantly reminds us of the sense in which we are nearly – but clearly not quite – gods. Perhaps the trickiest value issues surrounding science are hidden behind the seemingly innocent metaphors of ‘getting into the mind of God’ (physics) and ‘playing God’ (biomedicine). Notwithstanding scientists’ own disclaimers, as a matter of fact science has done as well as it has because scientists have adopted a ‘godlike’ attitude toward nature. We have allowed ourselves to imagine and intervene in things at very high levels of abstraction and in ways that can only be justified in terms of the power unleashed by the resulting systematic view of things. The costs incurred have included devaluing our most immediate experiences of nature and subjecting things to quite artificial conditions in order to extract knowledge.
  • ...3 more annotations...
  • For Francis Bacon and the other early Scientific Revolutionaries, this was a fair price to pay for doing divine work – God, after all, was thought to be himself transcendent and perhaps even alienated from nature. But without this theistic assumption, it becomes difficult to justify the unfettered pursuit of science, once both the costs and benefits are each given their due. Of course, we could simply say that science is what turns humans into gods. For all its hubris, this response would at least possess the virtues of candor and consistency. As it stands, scientists shy away from any such strong self-understandings, preferring to hide behind more passive accounts of their activities – e.g. they ‘describe’ rather than generate phenomena, they ‘explain’ rather than justify nature, etc. Lost in this secular translation of an originally sacred mission is the scientist’s sense of personal responsibility qua scientist.
  • Rather than thinking of science as a “force for good”, we should think of it as an inherent human activity, like commerce.
Emily Horwitz

True Blue Stands Out in an Earthy Crowd - NYTimes.com - 0 views

  • blue was the only color with enough strength of character to remain blue “in all its tones.”
  • Scientists, too, have lately been bullish on blue, captivated by its optical purity, complexity and metaphorical fluency.
  • Still other researchers are tracing the history of blue pigments in human culture, and the role those pigments have played in shaping our notions of virtue, authority, divinity and social class. “Blue pigments played an outstanding role in human development,” said Heinz Berke, an emeritus professor of chemistry at the University of Zurich. For some cultures, he said, they were as valuable as gold.
  • ...8 more annotations...
  • Ask people their favorite color, and in most parts of the world roughly half will say blue, a figure three to four times the support accorded common second-place finishers like purple or green
  • Young patients preferred nurses wearing blue uniforms to those in white or yellow.
  • blue’s basic emotional valence is calmness and open-endedness, in contrast to the aggressive specificity associated with red. Blue is sea and sky, a pocket-size vacation.
  • Computer screen color affected participants’ ability to solve either creative or detail-oriented problems
  • blue can also imply coldness, sorrow and death. On learning of a good friend’s suicide in 1901, Pablo Picasso fell into a severe depression, and he began painting images of beggars, drunks, the poor and the halt, all famously rendered in a palette of blue.
  • The association arose from the look of the body when it’s in a low-energy, low-oxygen state. “The lips turn blue, there’s a blue pallor to the complexion,” she said. “It’s the opposite of the warm flushing of the skin that we associate with love, kindness and affection.”
  • “A blue glow makes food look very unappetizing.”
  • That blue can connote coolness and tranquillity is one of nature’s little inside jokes. Blue light is on the high-energy end of the visible spectrum, and the comparative shortness of its wavelengths explains why the blue portion of the white light from the sun is easily scattered by the nitrogen and oxygen molecules in our atmosphere, and thus why the sky looks blue.
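    A quick check of the scattering claim above, added here as a back-of-the-envelope sketch (the bookmarked article gives only the qualitative version): for molecules much smaller than the wavelength of light, Rayleigh scattering makes the scattered intensity scale as the inverse fourth power of wavelength,

        I(\lambda) \propto \lambda^{-4}

    so blue light at roughly 450 nm is scattered about (700/450)^4 ≈ 5.9 times more strongly than red light at 700 nm. That ratio is why the scattered skylight overhead looks blue even though sunlight spans the whole visible spectrum.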
Javier E

Book Review: Models Behaving Badly - WSJ.com - 1 view

  • Mr. Derman is perhaps a bit too harsh when he describes EMM—the so-called Efficient Market Model. EMM does not, as he claims, imply that prices are always correct and that price always equals value. Prices are always wrong. What EMM says is that we can never be sure if prices are too high or too low.
  • The Efficient Market Model does not suggest that any particular model of valuation—such as the Capital Asset Pricing Model—fully accounts for risk and uncertainty or that we should rely on it to predict security returns. EMM does not, as Mr. Derman says, "stubbornly assume that all uncertainty about the future is quantifiable." The basic lesson of EMM is that it is very difficult—well nigh impossible—to beat the market consistently. (For reference, the CAPM relation mentioned here is written out after this list.)
  • Mr. Derman gives an eloquent description of James Clerk Maxwell's electromagnetic theory in a chapter titled "The Sublime." He writes: "The electromagnetic field is not like Maxwell's equations; it is Maxwell's equations."
  • ...4 more annotations...
  • He sums up his key points about how to keep models from going bad by quoting excerpts from his "Financial Modeler's Manifesto" (written with Paul Wilmott), a paper he published a couple of years ago. Among its admonitions: "I will always look over my shoulder and never forget that the model is not the world"; "I will not be overly impressed with mathematics"; "I will never sacrifice reality for elegance"; "I will not give the people who use my models false comfort about their accuracy"; "I understand that my work may have enormous effects on society and the economy, many beyond my apprehension."
  • As the collapse of the subprime collateralized debt market in 2008 made clear, it is a terrible mistake to put too much faith in models purporting to value financial instruments. "In crises," Mr. Derman writes, "the behavior of people changes and normal models fail. While quantum electrodynamics is a genuine theory of all reality, financial models are only mediocre metaphors for a part of it."
  • Although financial models employ the mathematics and style of physics, they are fundamentally different from the models that science produces. Physical models can provide an accurate description of reality. Financial models, despite their mathematical sophistication, can at best provide a vast oversimplification of reality. In the universe of finance, the behavior of individuals determines value—and, as he says, "people change their minds."
  • Bringing ethics into his analysis, Mr. Derman has no patience for coddling the folly of individuals and institutions who over-rely on faulty models and then seek to escape the consequences. He laments the aftermath of the 2008 financial meltdown, when banks rebounded "to record profits and bonuses" thanks to taxpayer bailouts. If you want to benefit from the seven fat years, he writes, "you must suffer the seven lean years too, even the catastrophically lean ones. We need free markets, but we need them to be principled."
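    For reference, a sketch of the model named in the annotations above (the review itself never writes it out): the Capital Asset Pricing Model prices an asset's expected excess return in proportion to its covariance with the market,

        E[R_i] = R_f + \beta_i (E[R_m] - R_f),  where  \beta_i = Cov(R_i, R_m) / Var(R_m)

    Derman's point, as the review presents it, is that a relation like this is a metaphor for how risk might be rewarded, an oversimplification of human behavior, not a physical law on par with Maxwell's equations.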
Javier E

What Happened Before the Big Bang? The New Philosophy of Cosmology - Ross Andersen - Te... - 1 view

  • This question of accounting for what we call the "big bang state" -- the search for a physical explanation of it -- is probably the most important question within the philosophy of cosmology, and there are a couple different lines of thought about it.
  • One that's becoming more and more prevalent in the physics community is the idea that the big bang state itself arose out of some previous condition, and that therefore there might be an explanation of it in terms of the previously existing dynamics by which it came about
  • The problem is that quantum mechanics was developed as a mathematical tool. Physicists understood how to use it as a tool for making predictions, but without an agreement or understanding about what it was telling us about the physical world. And that's very clear when you look at any of the foundational discussions. This is what Einstein was upset about; this is what Schrodinger was upset about. Quantum mechanics was merely a calculational technique that was not well understood as a physical theory. Bohr and Heisenberg tried to argue that asking for a clear physical theory was something you shouldn't do anymore. That it was something outmoded. And they were wrong, Bohr and Heisenberg were wrong about that. But the effect of it was to shut down perfectly legitimate physics questions within the physics community for about half a century. And now we're coming out of that
  • ...9 more annotations...
  • One common strategy for thinking about this is to suggest that what we used to call the whole universe is just a small part of everything there is, and that we live in a kind of bubble universe, a small region of something much larger
  • Newton realized there had to be some force holding the moon in its orbit around the earth, to keep it from wandering off, and he knew also there was a force that was pulling the apple down to the earth. And so what suddenly struck him was that those could be one and the same thing, the same force
  • That was a physical discovery, a physical discovery of momentous importance, as important as anything you could ever imagine because it knit together the terrestrial realm and the celestial realm into one common physical picture. It was also a philosophical discovery in the sense that philosophy is interested in the fundamental natures of things. (A back-of-the-envelope version of Newton’s “moon test” is sketched after this list.)
  • There are other ideas, for instance that maybe there might be special sorts of laws, or special sorts of explanatory principles, that would apply uniquely to the initial state of the universe.
  • The basic philosophical question, going back to Plato, is "What is x?" What is virtue? What is justice? What is matter? What is time? You can ask that about dark energy - what is it? And it's a perfectly good question.
  • right now there are just way too many freely adjustable parameters in physics. Everybody agrees about that. There seem to be many things we call constants of nature that you could imagine setting at different values, and most physicists think there shouldn't be that many, that many of them are related to one another. Physicists think that at the end of the day there should be one complete equation to describe all physics, because any two physical systems interact and physics has to tell them what to do. And physicists generally like to have only a few constants, or parameters of nature. This is what Einstein meant when he famously said he wanted to understand what kind of choices God had --using his metaphor-- how free his choices were in creating the universe, which is just asking how many freely adjustable parameters there are. Physicists tend to prefer theories that reduce that number
  • You have others saying that time is just an illusion, that there isn't really a direction of time, and so forth. I myself think that all of the reasons that lead people to say things like that have very little merit, and that people have just been misled, largely by mistaking the mathematics they use to describe reality for reality itself. If you think that mathematical objects are not in time, and mathematical objects don't change -- which is perfectly true -- and then you're always using mathematical objects to describe the world, you could easily fall into the idea that the world itself doesn't change, because your representations of it don't.
  • physicists for almost a hundred years have been dissuaded from trying to think about fundamental questions. I think most physicists would quite rightly say "I don't have the tools to answer a question like 'what is time?' - I have the tools to solve a differential equation." The asking of fundamental physical questions is just not part of the training of a physicist anymore.
  • The question remains as to how often, after life evolves, you'll have intelligent life capable of making technology. What people haven't seemed to notice is that on earth, of all the billions of species that have evolved, only one has developed intelligence to the level of producing technology. Which means that kind of intelligence is really not very useful. It's not actually, in the general case, of much evolutionary value. We tend to think, because we love to think of ourselves, human beings, as the top of the evolutionary ladder, that the intelligence we have, that makes us human beings, is the thing that all of evolution is striving toward. But what we know is that that's not true. Obviously it doesn't matter that much if you're a beetle, that you be really smart. If it were, evolution would have produced much more intelligent beetles. We have no empirical data to suggest that there's a high probability that evolution on another planet would lead to technological intelligence.
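    As a companion to the Newton annotations above, here is the standard reconstruction of the "moon test" (added for illustration, not quoted from the interview): the moon orbits at about 60 Earth radii, so if gravity weakens as the inverse square of distance, the moon's acceleration should be

        a = g / 60^2 = (9.8 m/s^2) / 3600 ≈ 2.7 × 10^-3 m/s^2

    The moon's actual centripetal acceleration, computed from its orbital radius r ≈ 3.84 × 10^8 m and period T ≈ 27.3 days ≈ 2.36 × 10^6 s, is

        a = 4π^2 r / T^2 ≈ 2.7 × 10^-3 m/s^2

    The agreement between the two numbers is the quantitative content of the claim that one and the same force holds the moon in orbit and pulls the apple down.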
Javier E

Sexual Freelancing in the Gig Economy - The New York Times - 0 views

  • We constantly use economic metaphors to describe romantic and sexual relations. Few people today refer to women as “damaged goods” or wonder why a man would “buy the cow when he can get the milk for free,” but we have “friends with benefits” and “invest in relationships.” An ex may be “on” or “off the market.” Online dating makes “shopping around” explicit. Blog after blog strategizes about how to maximize your “return on investment” on OkCupid.
  • The ways that people date — who contacts whom, where they meet and what happens next — have always been tied to the economy. Dating applies the logic of capitalism to courtship. On the dating market, everyone competes for him or herself.
  • If you want to understand why “Netflix and chill” has replaced dinner and a movie, you need to look at how people work.
  • ...13 more annotations...
  • Today, people are constantly told that we must be flexible and adaptable in order to succeed. Is it surprising that these values are reshaping how many of us approach sex and love?
  • part-timers, contractors and other contingent workers — who constitute some 40 percent of the American work force — are more inclined to text one another “u still up?” than to make plans in advance
  • Smartphones have altered expectations about when we are “on” and “off,” and working from home or from cafes has blurred the lines between labor and leisure.
  • The 2013 and 2014 Work and Education Poll conducted by Gallup found that the average full-time American worker reported working 47 hours per week. Moreover, 21 percent of the people surveyed reported working 50 to 59 hours per week; and another 18 percent said they worked 60 or more hours a week.
  • marriage rates have declined significantly since 1960. The median age of first marriage has risen to a record high: 27 for women and 29 for men.
  • “Knot Yet: The Benefits and Costs of Delayed Marriage in America” observed that young adults have gone from seeing marriage as a “cornerstone” of adult life to its “capstone,” something you enter only after you complete your education and attain professional stability
  • DATING itself is a recent invention. It developed when young people began moving to cities and women began working outside private homes. By 1900, 44 percent of single American women worked. Previously, courtship had taken place under adult supervision, in private places: a parlor, a factory dance or church social. But once women started going out and earning wages, they had more freedom over where and how they met prospective mates. Because men vastly out-earned women, they typically paid for entertainment.
  • In the 1920s and ‘30s, as more and more middle-class women started going to college, parents and faculty panicked over the “rating and dating” culture, which led kids to participate in “petting parties” and take “joy rides” with members of the opposite sex.
  • By the 1950s, a new kind of dating took over: “going steady.”
  • by the post-war era of full employment, this form of courtship made perfect sense. The booming economy, which was targeting the newly flush “teen” demographic, dictated that in order for everyone to partake in new consumer pleasures — for everyone to go out for a burger and root beer float on the weekends — young people had to pair off
  • The generation of Americans that came of age around the time of the 2008 financial crisis has been told constantly that we must be “flexible” and “adaptable.” Is it so surprising that we have turned into sexual freelancers? Many of us treat relationships like unpaid internships: We cannot expect them to lead to anything long-term, so we use them to get experience. If we look sharp, we might get a free lunch.
  • this kind of dating isn’t any more transactional than it was back when suitors paid women family-supervised visits or parents sought out a yenta to introduce their children at a synagogue mixer.
  • Courtship has always been dictated by changes in the market. The good news is that dating is not the same thing as love. And as anyone who has ever been in love can attest, the laws of supply and demand do not control our feelings.
Javier E

Big Data Is Great, but Don't Forget Intuition - NYTimes.com - 2 views

  • THE problem is that a math model, like a metaphor, is a simplification. This type of modeling came out of the sciences, where the behavior of particles in a fluid, for example, is predictable according to the laws of physics.
  • In so many Big Data applications, a math model attaches a crisp number to human behavior, interests and preferences. The peril of that approach, as in finance, was the subject of a recent book by Emanuel Derman, a former quant at Goldman Sachs and now a professor at Columbia University. Its title is “Models. Behaving. Badly.”
  • A report last year by the McKinsey Global Institute, the research arm of the consulting firm, projected that the United States needed 140,000 to 190,000 more workers with “deep analytical” expertise and 1.5 million more data-literate managers, whether retrained or hired.
  • ...4 more annotations...
  • A major part of managing Big Data projects, he says, is asking the right questions: How do you define the problem? What data do you need? Where does it come from? What are the assumptions behind the model that the data is fed into? How is the model different from reality?
  • Society might be well served if the model makers pondered the ethical dimensions of their work as well as studying the math, according to Rachel Schutt, a senior statistician at Google Research. “Models do not just predict, but they can make things happen,” says Ms. Schutt, who taught a data science course this year at Columbia. “That’s not discussed generally in our field.”
  • the increasing use of software that microscopically tracks and monitors online behavior has raised privacy worries. Will Big Data usher in a digital surveillance state, mainly serving corporate interests?
  • my bigger concern is that the algorithms that are shaping my digital world are too simple-minded, rather than too smart. That was a theme of a book by Eli Pariser, titled “The Filter Bubble: What the Internet Is Hiding From You.”
Javier E

[Six Questions] | Astra Taylor on The People's Platform: Taking Back Power and Culture ... - 1 view

  • Astra Taylor, a cultural critic and the director of the documentaries Zizek! and Examined Life, challenges the notion that the Internet has brought us into an age of cultural democracy. While some have hailed the medium as a platform for diverse voices and the free exchange of information and ideas, Taylor shows that these assumptions are suspect at best. Instead, she argues, the new cultural order looks much like the old: big voices overshadow small ones, content is sensationalist and powered by advertisements, quality work is underfunded, and corporate giants like Google and Facebook rule. The Internet does offer promising tools, Taylor writes, but a cultural democracy will be born only if we work collaboratively to develop the potential of this powerful resource
  • Most people don’t realize how little information can be conveyed in a feature film. The transcripts of both of my movies are probably equivalent in length to a Harper’s cover story.
  • why should Amazon, Apple, Facebook, and Google get a free pass? Why should we expect them to behave any differently over the long term? The tradition of progressive media criticism that came out of the Frankfurt School, not to mention the basic concept of political economy (looking at the way business interests shape the cultural landscape), was nowhere to be seen, and that worried me. It’s not like political economy became irrelevant the second the Internet was invented.
  • ...15 more annotations...
  • How do we reconcile our enjoyment of social media even as we understand that the corporations who control them aren’t always acting in our best interests?
  • That was because the underlying economic conditions hadn’t been changed or “disrupted,” to use a favorite Silicon Valley phrase. Google has to serve its shareholders, just like NBCUniversal does. As a result, many of the unappealing aspects of the legacy-media model have simply carried over into a digital age — namely, commercialism, consolidation, and centralization. In fact, the new system is even more dependent on advertising dollars than the one that preceded it, and digital advertising is far more invasive and ubiquitous
  • the popular narrative — new communications technologies would topple the establishment and empower regular people — didn’t accurately capture reality. Something more complex and predictable was happening. The old-media dinosaurs weren’t dying out, but were adapting to the online environment; meanwhile the new tech titans were coming increasingly to resemble their predecessors
  • I use lots of products that are created by companies whose business practices I object to and that don’t act in my best interests, or the best interests of workers or the environment — we all do, since that’s part of living under capitalism. That said, I refuse to invest so much in any platform that I can’t quit without remorse
  • these services aren’t free even if we don’t pay money for them; we pay with our personal data, with our privacy. This feeds into the larger surveillance debate, since government snooping piggybacks on corporate data collection. As I argue in the book, there are also negative cultural consequences (e.g., when advertisers are paying the tab we get more of the kind of culture marketers like to associate themselves with and less of the stuff they don’t) and worrying social costs. For example, the White House and the Federal Trade Commission have both recently warned that the era of “big data” opens new avenues of discrimination and may erode hard-won consumer protections.
  • I’m resistant to the tendency to place this responsibility solely on the shoulders of users. Gadgets and platforms are designed to be addictive, with every element from color schemes to headlines carefully tested to maximize clickability and engagement. The recent news that Facebook tweaked its algorithms for a week in 2012, showing hundreds of thousands of users only “happy” or “sad” posts in order to study emotional contagion — in other words, to manipulate people’s mental states — is further evidence that these platforms are not neutral. In the end, Facebook wants us to feel the emotion of wanting to visit Facebook frequently
  • social inequalities that exist in the real world remain meaningful online. What are the particular dangers of discrimination on the Internet?
  • That it’s invisible or at least harder to track and prove. We haven’t figured out how to deal with the unique ways prejudice plays out over digital channels, and that’s partly because some folks can’t accept the fact that discrimination persists online. (After all, there is no sign on the door that reads Minorities Not Allowed.)
  • just because the Internet is open doesn’t mean it’s equal; offline hierarchies carry over to the online world and are even amplified there. For the past year or so, there has been a lively discussion taking place about the disproportionate and often outrageous sexual harassment women face simply for entering virtual space and asserting themselves there — research verifies that female Internet users are dramatically more likely to be threatened or stalked than their male counterparts — and yet there is very little agreement about what, if anything, can be done to address the problem.
  • What steps can we take to encourage better representation of independent and non-commercial media? We need to fund it, first and foremost. As individuals this means paying for the stuff we believe in and want to see thrive. But I don’t think enlightened consumption can get us where we need to go on its own. I’m skeptical of the idea that we can shop our way to a better world. The dominance of commercial media is a social and political problem that demands a collective solution, so I make an argument for state funding and propose a reconceptualization of public media. More generally, I’m struck by the fact that we use these civic-minded metaphors, calling Google Books a “library” or Twitter a “town square” — or even calling social media “social” — but real public options are off the table, at least in the United States. We hand the digital commons over to private corporations at our peril.
  • 6. You advocate for greater government regulation of the Internet. Why is this important?
  • I’m for regulating specific things, like Internet access, which is what the fight for net neutrality is ultimately about. We also need stronger privacy protections and restrictions on data gathering, retention, and use, which won’t happen without a fight.
  • I challenge the techno-libertarian insistence that the government has no productive role to play and that it needs to keep its hands off the Internet for fear that it will be “broken.” The Internet and personal computing as we know them wouldn’t exist without state investment and innovation, so let’s be real.
  • there’s a pervasive and ill-advised faith that technology will promote competition if left to its own devices (“competition is a click away,” tech executives like to say), but that’s not true for a variety of reasons. The paradox of our current media landscape is this: our devices and consumption patterns are ever more personalized, yet we’re simultaneously connected to this immense, opaque, centralized infrastructure. We’re all dependent on a handful of firms that are effectively monopolies — from Time Warner and Comcast on up to Google and Facebook — and we’re seeing increased vertical integration, with companies acting as both distributors and creators of content. Amazon aspires to be the bookstore, the bookshelf, and the book. Google isn’t just a search engine, a popular browser, and an operating system; it also invests in original content
  • So it’s not that the Internet needs to be regulated but that these big tech corporations need to be subject to governmental oversight. After all, they are reaching farther and farther into our intimate lives. They’re watching us. Someone should be watching them.
kushnerha

A new atlas maps word meanings in the brain | PBS NewsHour - 0 views

  • like Google Maps for your cerebral cortex: A new interactive atlas, developed with the help of such unlikely tools as public radio podcasts and Wikipedia, purports to show which bits of your brain help you understand which types of concepts.
  • Hear a word relating to family, loss, or the passing of time — such as “wife,” “month,” or “remarried” — and a ridge called the right angular gyrus may be working overtime. Listening to your contractor talking about the design of your new front porch? Thank a pea-sized spot of brain behind your left ear.
  • The research on the “brain dictionary” has the hallmarks of a big scientific splash: Published on Wednesday in Nature, it’s accompanied by both a video and an interactive website where you can click your way from brain region to brain region, seeing what kinds of words are processed in each. Yet neuroscientists aren’t uniformly impressed.
  • ...9 more annotations...
  • invoked an old metaphor to explain why he isn’t convinced by the analysis: He compared it to establishing a theory of how weather works by pointing a video camera out the window for 7 hours.
  • Indeed, among neuroscientists, the new “comprehensive atlas” of the cerebral cortex is almost as controversial as a historical atlas of the Middle East. That’s because every word has a constellation of meanings and associations — and it’s hard for scientists to agree about how best to study them in the lab.
  • For this study, neuroscientist Jack Gallant and his team at the University of California, Berkeley played more than two hours’ worth of stories from the Moth Radio Hour for seven grad students and postdocs while measuring their cerebral blood flow using functional magnetic resonance imaging. Then, they linked the activity in some 50,000 pea-sized regions of the cortex to the “meaning” of the words being heard at that moment.
  • How, you might ask, did they establish the meaning of words? The neuroscientists pulled all the nouns and verbs from the podcasts. With a computer program, they then looked across millions of pages of text to see how often the words from the podcasts are used near 985 common words taken from Wikipedia’s List of 1,000 Basic Words. “Wolf,” for instance, would presumably be used more often in proximity to “dog” than to, say, “eggplant.” Using that data, the program assigned numbers that approximated the meaning of each individual word from the podcasts — and, with some fancy number crunching, they figured out what areas of the brain were activated when their research subjects heard words with certain meanings. (A toy sketch of this co-occurrence step appears after this list.)
  • Everyone agrees that the research is innovative in its method. After all, linking up the meanings of thousands of words to the second-by-second brain activity in thousands of tiny brain regions is no mean feat. “That’s way more data than any human being can possibly think about,” said Gallant.
  • What they can’t agree on is what it means. “In this study, our goal was not to ask a specific question. Our goal was to map everything so that we can ask questions after that,” said Gallant. “One of the most frequent questions we get is, ‘What does it mean?’ If I gave you a globe, you wouldn’t ask what it means, you’d start using it for stuff. You can look for the smallest ocean or how long it will take to get to San Francisco.”
  • This “data-driven approach” still involves assumptions about how to break up language into different categories of meaning
  • “Of course it’s a very simplified version of how meaning is captured in our minds, but it seems to be a pretty good proxy,” she said.
  • hordes of unanswered questions: “We can map where your brain represents the meaning of a narrative text that is associated with family, but we don’t know why the brain is responding to family at that location. Is it the word ‘father’ itself? Is it your memories of your own father? Is it your own thinking about being a parent yourself?” He hopes that it’s just those types of questions that researchers will ask, using his brain map as a guide.
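The co-occurrence step described above lends itself to a toy illustration. The sketch below is a guess at the general technique (count how often basis words appear near a target word, then compare the resulting vectors), not the Berkeley team's actual pipeline; the corpus, the basis list standing in for the ~985 common words, the window size, and the function names are all invented for the example.

```python
# A toy sketch of the co-occurrence idea; the corpus, the basis words
# (standing in for the ~985 common words), and the window size are all
# stand-ins for illustration, not the study's real choices.
from collections import Counter
import math

BASIS = ["dog", "house", "water", "family"]  # hypothetical basis words
WINDOW = 5  # assumed meaning of "near": within 5 tokens

def cooccurrence_vector(target, tokens):
    """Count how often each basis word appears within WINDOW tokens of target."""
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == target:
            for neighbor in tokens[max(0, i - WINDOW): i + WINDOW + 1]:
                if neighbor in BASIS:
                    counts[neighbor] += 1
    return [counts[b] for b in BASIS]

def cosine(u, v):
    """Compare two meaning vectors; higher means the words keep similar company."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

tokens = "the wolf chased the dog near the house by the water".split()
v_wolf = cooccurrence_vector("wolf", tokens)
v_dog = cooccurrence_vector("dog", tokens)  # note: "dog" also counts itself once here
print(v_wolf)                          # [1, 0, 0, 0]: only "dog" falls near "wolf"
print(round(cosine(v_wolf, v_dog), 2)) # ~0.71 in this tiny corpus
```

In the real study the resulting meaning vectors were then linked, with further number crunching, to the second-by-second fMRI activity; this sketch stops at the vectors themselves.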
sissij

Why Does Time Seem to Move Faster as We Grow Older? | Big Think - 0 views

  • Today it feels like the days peel on by, and a vacation which may last days or weeks is gone in mere hours. This is a common human experience.
  • There are an awful lot of theories that give us insight. But a direct scientific law so far remains elusive.
  • In 1877, the “ratio theory” was born, which states that we compare intervals to the total amount of time we’ve been alive. This can also be called “proportional theory,” the idea that as we age, our sense of the present begins to feel short as compared to our total lifespan. (A small numerical sketch of this idea follows the comment below.)
  • ...4 more annotations...
  • While youth colors everything anew, as we age, we become more or less familiar with every aspect of our surroundings, and so the nuance wears off.
  • Young people tended to select steady or unchanging metaphors for time, while older people chose those which had to do with swiftness and speed.
  • This creates a “reminiscence bump.” The farther we move away from the bump, the quicker time seems to move.
  • A neurochemical explanation lies in the dopamine level in our brains. This is the pleasure neurotransmitter that gives us a feeling of wellbeing and reward.
  •  
    I have sometimes thought about this issue, and the explanation I came up with is that when we are seven, one year is one seventh of our life. When we are seventy, the denominator gets much bigger and the proportion gets significantly smaller: one seventieth. I am very surprised to find that scientists have a similar theory. It also shows how unreliable our sense perception is. We are very unreliable about time since time is a concept that we cannot see or touch. How we feel about time largely depends on our individual emotions. Our brain also tends to remember first times, and we also have a tendency to believe vivid memories. --Sissi (2/8/2017)
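Sissi's arithmetic can be made concrete. Here is a minimal sketch, assuming the simplest reading of proportional theory (one additional year weighed against the total years already lived); the formula is an illustration of the idea, not an established law.

```python
# A minimal sketch of "proportional theory," assuming its simplest form:
# the felt length of a year is the fraction of your life it represents.
# This 1/age formula is an assumption for illustration, not a settled law.
def relative_year_length(age: float) -> float:
    """One more year, as a fraction of the life lived so far."""
    return 1.0 / age

for age in (7, 35, 70):
    print(f"age {age:>2}: one year = {relative_year_length(age):.3f} of life so far")
# age  7: one year = 0.143 of life so far
# age 35: one year = 0.029 of life so far
# age 70: one year = 0.014 of life so far
```

On this reading, a year at seventy carries a tenth of the subjective weight it had at seven, which is exactly the one-seventh-versus-one-seventieth comparison in the comment above.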
Javier E

The American Scholar: The Decline of the English Department - William M. Chace - 1 views

  • The number of young men and women majoring in English has dropped dramatically; the same is true of philosophy, foreign languages, art history, and kindred fields, including history. As someone who has taught in four university English departments over the last 40 years, I am dismayed by this shift, as are my colleagues here and there across the land. And because it is probably irreversible, it is important to attempt to sort out the reasons—the many reasons—for what has happened.
  • English: from 7.6 percent of the majors to 3.9 percent
  • In one generation, then, the numbers of those majoring in the humanities dropped from a total of 30 percent to a total of less than 16 percent; during that same generation, business majors climbed from 14 percent to 22 percent.
  • ...23 more annotations...
  • History: from 18.5 percent to 10.7 percent
  • But the deeper explanation resides not in something that has happened to it, but in what it has done to itself. English has become less and less coherent as a discipline and, worse, has come near exhaustion as a scholarly pursuit.
  • The twin focus, then, was on the philological nature of the enterprise and the canon of great works to be studied in their historical evolution.
  • Studying English taught us how to write and think better, and to make articulate many of the inchoate impulses and confusions of our post-adolescent minds. We began to see, as we had not before, how such books could shape and refine our thinking. We began to understand why generations of people coming before us had kept them in libraries and bookstores and in classes such as ours. There was, we got to know, a tradition, a historical culture, that had been assembled around these books. Shakespeare had indeed made a difference—to people before us, now to us, and forever to the language of English-speaking people.
  • today there are stunning changes in the student population: there are more and more gifted and enterprising students coming from immigrant backgrounds, students with only slender connections to Western culture and to the assumption that the “great books” of England and the United States should enjoy a fixed centrality in the world. What was once the heart of the matter now seems provincial. Why throw yourself into a study of something not emblematic of the world but representative of a special national interest? As the campus reflects the cultural, racial, and religious complexities of the world around it, reading British and American literature looks more and more marginal. From a global perspective, the books look smaller.
  • With the cost of a college degree surging upward during the last quarter century—tuition itself increasing far beyond any measure of inflation—and with consequent growth in loan debt after graduation, parents have become anxious about the relative earning power of a humanities degree. Their college-age children doubtless share such anxiety. When college costs were lower, anxiety could be kept at bay. (Berkeley in the early ’60s cost me about $100 a year, about $700 in today’s dollars.)
  • Economists, chemists, biologists, psychologists, computer scientists, and almost everyone in the medical sciences win sponsored research, grants, and federal dollars. By and large, humanists don’t, and so they find themselves as direct employees of the institution, consuming money in salaries, pensions, and operating needs—not external money but institutional money.
  • These, then, are some of the external causes of the decline of English: the rise of public education; the relative youth and instability (despite its apparent mature solidity) of English as a discipline; the impact of money; and the pressures upon departments within the modern university to attract financial resources rather than simply use them up.
  • several of my colleagues around the country have called for a return to the aesthetic wellsprings of literature, the rock-solid fact, often neglected, that it can indeed amuse, delight, and educate. They urge the teaching of English, or French, or Russian literature, and the like, in terms of the intrinsic value of the works themselves, in all their range and multiplicity, as well-crafted and appealing artifacts of human wisdom. Second, we should redefine our own standards for granting tenure, placing more emphasis on the classroom and less on published research, and we should prepare to contest our decisions with administrators whose science-based model is not an appropriate means of evaluation.
  • “It may be that what has happened to the profession is not the consequence of social or philosophical changes, but simply the consequence of a tank now empty.” His homely metaphor pointed to the absence of genuinely new frontiers of knowledge and understanding for English professors to explore.
  • In this country and in England, the study of English literature began in the latter part of the 19th century as an exercise in the scientific pursuit of philological research, and those who taught it subscribed to the notion that literature was best understood as a product of language.
  • no one has come forward in years to assert that the study of English (or comparative literature or similar undertakings in other languages) is coherent, does have self-limiting boundaries, and can be described as this but not that.
  • to teach English today is to do, intellectually, what one pleases. No sense of duty remains toward works of English or American literature; amateur sociology or anthropology or philosophy or comic books or studies of trauma among soldiers or survivors of the Holocaust will do. You need not even believe that works of literature have intelligible meaning; you can announce that they bear no relationship at all to the world beyond the text.
  • With everything on the table, and with foundational principles abandoned, everyone is free, in the classroom or in prose, to exercise intellectual laissez-faire in the largest possible way—I won’t interfere with what you do and am happy to see that you will return the favor
  • Consider the English department at Harvard University. It has now agreed to remove its survey of English literature for undergraduates, replacing it and much else with four new “affinity groups”
  • there would be no one book, or family of books, that every English major at Harvard would have read by the time he or she graduates. The direction to which Harvard would lead its students in this “clean slate” or “trickle down” experiment is to suspend literary history, thrusting into the hands of undergraduates the job of cobbling together intellectual coherence for themselves
  • Those who once strove to give order to the curriculum will have learned, from Harvard, that terms like core knowledge and foundational experience only trigger acrimony, turf protection, and faculty mutinies. No one has the stomach anymore to refight the Western culture wars. Let the students find their own way to knowledge.
  • In English, the average number of years spent earning a doctoral degree is almost 11. After passing that milestone, only half of new Ph.D.’s find teaching jobs, the number of new positions having declined over the last year by more than 20 percent; many of those jobs are part-time or come with no possibility of tenure. News like that, moving through student networks, can be matched against, at least until recently, the reputed earning power of recent graduates of business schools, law schools, and medical schools. The comparison is akin to what young people growing up in Rust Belt cities are forced to see: the work isn’t here anymore; our technology is obsolete.
  • unlike other members of the university community, they might well have been plying their trade without proper credentials: “Whereas economists or physicists, geologists or climatologists, physicians or lawyers must master a body of knowledge before they can even think of being licensed to practice,” she said, “we literary scholars, it is tacitly assumed, have no definable expertise.”
  • English departments need not refight the Western culture wars. But they need to fight their own book wars. They must agree on which texts to teach and argue out the choices and the principles of making them if they are to claim the respect due a department of study.
  • They can teach their students to write well, to use rhetoric. They should place their courses in composition and rhetoric at the forefront of their activities. They should announce that the teaching of composition is a skill their instructors have mastered and that students majoring in English will be certified, upon graduation, as possessing rigorously tested competence in prose expression.
  • The study of literature will then take on the profile now held, with moderate dignity, by the study of the classics, Greek and Latin.
  • But we can, we must, do better. At stake are the books themselves and what they can mean to the young. Yes, it is just a literary tradition. That’s all. But without such traditions, civil societies have no compass to guide them.
Javier E

Rich People Just Care Less - NYTimes.com - 0 views

  • Turning a blind eye. Giving someone the cold shoulder. Looking down on people. Seeing right through them. These metaphors for condescending or dismissive behavior are more than just descriptive. They suggest, to a surprisingly accurate extent, the social distance between those with greater power and those with less
  • A growing body of recent research shows that people with the most social power pay scant attention to those with little such power. This tuning out has been observed, for instance, with strangers in a mere five-minute get-acquainted session, where the more powerful person shows fewer signals of paying attention, like nodding or laughing. Higher-status people are also more likely to express disregard, through facial expressions, and are more likely to take over the conversation and interrupt or look past the other speaker.
  • A prerequisite to empathy is simply paying attention to the person in pain. In 2008, social psychologists from the University of Amsterdam and the University of California, Berkeley, studied pairs of strangers telling one another about difficulties they had been through, like a divorce or death of a loved one. The researchers found that the differential expressed itself in the playing down of suffering. The more powerful were less compassionate toward the hardships described by the less powerful.
  • ...7 more annotations...
  • research finds that the poor, compared with the wealthy, have keenly attuned interpersonal attention in all directions; in general, those with the most power in society seem to pay particularly little attention to those with the least power. To be sure, high-status people do attend to those of equal rank — but not as well as those of lower status do.
  • This has profound implications for societal behavior and government policy. Tuning in to the needs and feelings of another person is a prerequisite to empathy, which in turn can lead to understanding, concern and, if the circumstances are right, compassionate action.
  • In politics, readily dismissing inconvenient people can easily extend to dismissing inconvenient truths about them. The insistence by some House Republicans in Congress on cutting financing for food stamps and impeding the implementation of Obamacare, which would allow patients, including those with pre-existing health conditions, to obtain and pay for insurance coverage, may stem in part from the empathy gap.
  • redistricting and gerrymandering have led to the creation of more and more safe districts, in which elected officials don’t even have to encounter many voters from the rival party, much less empathize with them.
  • Social distance makes it all the easier to focus on small differences between groups and to put a negative spin on the ways of others and a positive spin on our own. Freud called this “the narcissism of minor differences,”
  • Since the 1970s, the gap between the rich and everyone else has skyrocketed. Income inequality is at its highest level in a century.
  • Apart from the financial inequities, I fear the expansion of an entirely different gap, caused by the inability to see oneself in a less advantaged person’s shoes.
Ellie McGinnis

The Drugs of Work-Performance Enhancement - Steven Petrow - The Atlantic - 1 views

  • Adderall makes everything easier to understand; it makes you more alert and focused. Some college students scarf them like M&Ms and think they’re more effective at cognitive enhancement than energy drinks and safer than a smoke or a beer.
  • 4.4 percent of the adult U.S. population has ADHD, which if left untreated is associated with significant morbidity, divorce, unemployment, and substance abuse.
  • Nonetheless, for untold healthy adults (those whom researchers refer to as “mentally competent”) the cognitive-enhancing drug has led to positive changes in their lives.
  • ...7 more annotations...
  • “[Adderall] makes me so happy I can be at a family function or out socializing and not get too distracted by other events/conversations around me. I can hear them, but am not taken in by them.”
  • “Since being on Adderall, I have been insanely productive… I have paid all my outstanding bills and parking tickets (and even renewed my car's registration before it was due). I'm not late for things anymore… I have not spent a single day lying around my house doing nothing in the past few months. I have a budget, and a scheduler that I actually use.”
  • When she asked me why I needed it, I replied just as the college kids had on 60 Minutes: “For focus.”  
  • Did it make me smarter? No. Did it make me a faster writer? Yes. Previously, when I’d sit down at my desk, I felt adrift at sea. It was as though my MacBook and research materials, piled high, swayed from left to right and then back again. It was dizzying; I just couldn’t get a grip.
  • My metaphoric double vision snapped to mono and I could see and think as clearly as if I’d stepped out of a fog. I’d never had such concentration and it showed in the number of well-written pages I produced daily
  • But with Adderall, I had knowledge aplenty and knew that once I stopped it, my depression would quickly lift. I also know that not everyone has that kind of previous experience or perspective, which is when folks get into deep trouble.
  • “Under medical supervision, stimulant medications are considered safe.” I’d add, as the Nature authors did, especially for “mentally competent adults.”