
Home/ TOK Friends/ Group items tagged classics


lenaurick

IQ can predict your risk of death, and 8 other smart facts about intelligence - Vox - 0 views

  • But according to Stuart Ritchie, an intelligence researcher at the University of Edinburgh, there's a massive amount of data showing that it's one of the best predictors of someone's longevity, health, and prosperity
  • In a new book, Intelligence: All That Matters, Ritchie persuasively argues that IQ doesn't necessarily set the limit for what we can do, but it does give us a starting point.
  • Most people you meet are probably average, and a few are extraordinarily smart. Just 2.2 percent have an IQ of 130 or greater.
  • ...17 more annotations...
  • "The classic finding — I would say it is the most replicated finding in psychology — is that people who are good at one type of mental task tend to be good at them all,"
  • G-factor is real in the sense it can predict outcomes in our lives — how much money you'll make, how productive of a worker you might be, and, most chillingly, how likely you are to die an earlier death.
  • According to the research, people with high IQs tend to be healthier and live longer than the rest of us
  • One is the fact that people with higher IQs tend to make more money than people with lower scores. Money is helpful in maintaining weight, nutrition, and accessing good health care.
  • IQ often beats personality when it comes to predicting life outcomes: Personality traits, a recent study found, can explain about 4 percent of the variance in test scores for students under age 16. IQ can explain 25 percent, or an even higher proportion, depending on the study.
  • Many of these correlations are less than .5, which means there's plenty of room for individual differences. So, yes, very smart people who are awful at their jobs exist. You're just less likely to come across them.
  • "The correlation between IQ and happiness is usually positive, but also usually smaller than one might expect (and sometimes not statistically significant)," Ritchie says.
  • It could also be that people with higher IQs are smart enough to avoid accidents and mishaps. There's actually some evidence to support this: Higher-IQ people are less likely to die in traffic accidents.
  • Even though intelligence generally declines with age, those who had high IQs as children were most likely to retain their smarts as very old people.
  • "If we know the genes related to intelligence — and we know these genes are related to cognitive decline as well — then we can start to predict who is going to have the worst cognitive decline, and devote health care medical resources to them," he says.
  • Studies comparing identical and fraternal twins find about half of IQ can be explained by genetics.
  • genetics seems to become more predictive of IQ with age.
  • The idea is as we age, we grow more in control of our environments. Those environments we create can then "amplify" the potential of our genes.
  • About half the variability in IQ is attributed to the environment. Access to nutrition, education, and health care appear to play a big role.
  • "People’s lives are really messy, and the environments they are in are messy. There’s a possibility that a lot of the environmental effect on a person’s intelligence is random."
  • Hurray! Mean IQ scores appear to be increasing between 2 and 3 points per decade.
  • This phenomenon is known as the Flynn effect, and it is likely the result of increasing quality of childhood nutrition, health care, and education.
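The distribution claims above can be checked directly: IQ scores are conventionally scaled to a normal distribution with mean 100 and standard deviation 15, and a correlation of r explains r² of the variance. A minimal Python sketch (the `share_above` helper is ours, for illustration only):

```python
import math

def share_above(iq, mean=100.0, sd=15.0):
    """Fraction of a normal(mean, sd) population scoring at or above `iq`."""
    z = (iq - mean) / sd
    # Normal survival function via the complementary error function (stdlib only).
    return 0.5 * math.erfc(z / math.sqrt(2))

print(f"IQ >= 130: {share_above(130):.1%}")  # about 2.3%, in line with the ~2.2 percent above

# A correlation of r explains r**2 of the variance, so even a "strong" r = 0.5
# (larger than most IQ-outcome correlations cited above) leaves 75% unexplained.
r = 0.5
print(f"r = {r} explains {r**2:.0%} of the variance")
```

This is why "many of these correlations are less than .5" still leaves, as the excerpt says, plenty of room for individual differences.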
proudsa

Hitler Is a Rock Star in South Asia | VICE | United States - 0 views

  • Hitler Is a Rock Star in South Asia
    • proudsa
       
      TOK - perspective
  • In Asia, though, Mein Kampf is treated like an old classic. It's long been a popular read for businessmen in India, sold alongside titles like Rich Dad Poor Dad, Who Moved My Cheese?, and the various motivational books by Donald Trump.
    • proudsa
       
      The idea that people halfway across the globe associate Hitler with Trump in their ways of thinking should say something to the American public.
  • "we [in Nepal] need a leader like Hitler."
    • proudsa
       
      did they get a different version of history than we did?
  • ...1 more annotation...
  • When Nepal hasn't been under the blanket of armed insurgencies, it's been in the grip of corrupt political leaders. People in Nepal seem to be looking for a leader that can carry them out of developmental paralysis, no matter the cost.
    • proudsa
       
      Similar to post-WWI Germany
Javier E

No matter who wins the presidential election, Nate Silver was right - The Washington Post - 1 views

  • I don’t fault Silver for his caution. It’s honest. What it really says is he doesn’t know with much confidence what’s going to happen.
  • That’s because there’s a lot of human caprice and whim in electoral behavior that can’t always be explained or predicted with scientific precision. Politics ain’t moneyball. Good-quality polls give an accurate sense of where a political race is at a point in time, but they don’t predict the future.
  • Predictive models, generally based on historical patterns, work until they don’t.
  • ...2 more annotations...
  • In his hedged forecasts this time, Silver appears to be acknowledging that polling and historical patterns don’t always capture what John Maynard Keynes, in his classic 1936 work “The General Theory of Employment, Interest and Money,” described as “animal spirits.”
  • There is, Keynes wrote, “the instability due to the characteristic of human nature that a large proportion of our positive activities depend on spontaneous optimism rather than on a mathematical expectation, whether moral or hedonistic or economic. Most, probably, of our decisions to do something positive, the full consequences of which will be drawn out over many days to come, can only be taken as a result of animal spirits — of a spontaneous urge to action rather than inaction, and not as the outcome of a weighted average of quantitative benefits multiplied by quantitative probabilities.”
Javier E

The decline effect and the scientific method : The New Yorker - 3 views

  • The test of replicability, as it’s known, is the foundation of modern research. Replicability is how the community enforces itself. It’s a safeguard against the creep of subjectivity. Most of the time, scientists know what results they want, and that can influence the results they get. The premise of replicability is that the scientific community can correct for these flaws.
  • But now all sorts of well-established, multiply confirmed findings have started to look increasingly uncertain. It’s as if our facts were losing their truth: claims that have been enshrined in textbooks are suddenly unprovable.
  • This phenomenon doesn’t yet have an official name, but it’s occurring across a wide range of fields, from psychology to ecology.
  • ...39 more annotations...
  • If replication is what separates the rigor of science from the squishiness of pseudoscience, where do we put all these rigorously validated findings that can no longer be proved? Which results should we believe?
  • Schooler demonstrated that subjects shown a face and asked to describe it were much less likely to recognize the face when shown it later than those who had simply looked at it. Schooler called the phenomenon “verbal overshadowing.”
  • The most likely explanation for the decline is an obvious one: regression to the mean. As the experiment is repeated, that is, an early statistical fluke gets cancelled out. The extrasensory powers of Schooler’s subjects didn’t decline—they were simply an illusion that vanished over time.
  • Yet Schooler has noticed that many of the data sets that end up declining seem statistically solid—that is, they contain enough data that any regression to the mean shouldn’t be dramatic. “These are the results that pass all the tests,” he says. “The odds of them being random are typically quite remote, like one in a million. This means that the decline effect should almost never happen. But it happens all the time!”
  • this is why Schooler believes that the decline effect deserves more attention: its ubiquity seems to violate the laws of statistics
  • In 2001, Michael Jennions, a biologist at the Australian National University, set out to analyze “temporal trends” across a wide range of subjects in ecology and evolutionary biology. He looked at hundreds of papers and forty-four meta-analyses (that is, statistical syntheses of related studies), and discovered a consistent decline effect over time, as many of the theories seemed to fade into irrelevance.
  • Jennions admits that his findings are troubling, but expresses a reluctance to talk about them publicly. “This is a very sensitive issue for scientists,” he says. “You know, we’re supposed to be dealing with hard facts, the stuff that’s supposed to stand the test of time. But when you see these trends you become a little more skeptical of things.”
  • Jennions, similarly, argues that the decline effect is largely a product of publication bias, or the tendency of scientists and scientific journals to prefer positive data over null results, which is what happens when no effect is found. The bias was first identified by the statistician Theodore Sterling, in 1959, after he noticed that ninety-seven per cent of all published psychological studies with statistically significant data found the effect they were looking for.
  • While publication bias almost certainly plays a role in the decline effect, it remains an incomplete explanation. For one thing, it fails to account for the initial prevalence of positive results among studies that never even get submitted to journals. It also fails to explain the experience of people like Schooler, who have been unable to replicate their initial data despite their best efforts.
  • Sterling saw that if ninety-seven per cent of psychology studies were proving their hypotheses, either psychologists were extraordinarily lucky or they published only the outcomes of successful experiments.
  • One of John Ioannidis’s most cited papers has a deliberately provocative title: “Why Most Published Research Findings Are False.”
  • The biologist Richard Palmer suspects that an equally significant issue is the selective reporting of results—the data that scientists choose to document in the first place. Palmer’s most convincing evidence relies on a statistical tool known as a funnel graph. When a large number of studies have been done on a single subject, the data should follow a pattern: studies with a large sample size should all cluster around a common value—the true result—whereas those with a smaller sample size should exhibit a random scattering, since they’re subject to greater sampling error. This pattern gives the graph its name, since the distribution resembles a funnel.
  • after Palmer plotted every study of fluctuating asymmetry, he noticed that the distribution of results with smaller sample sizes wasn’t random at all but instead skewed heavily toward positive results. Palmer has since documented a similar problem in several other contested subject areas. “Once I realized that selective reporting is everywhere in science, I got quite depressed,” Palmer told me. “As a researcher, you’re always aware that there might be some nonrandom patterns, but I had no idea how widespread it is.”
  • Palmer summarized the impact of selective reporting on his field: “We cannot escape the troubling conclusion that some—perhaps many—cherished generalities are at best exaggerated in their biological significance and at worst a collective illusion nurtured by strong a-priori beliefs often repeated.”
  • Palmer emphasizes that selective reporting is not the same as scientific fraud. Rather, the problem seems to be one of subtle omissions and unconscious misperceptions, as researchers struggle to make sense of their results. Stephen Jay Gould referred to this as the “shoehorning” process.
  • “A lot of scientific measurement is really hard,” Simmons told me. “If you’re talking about fluctuating asymmetry, then it’s a matter of minuscule differences between the right and left sides of an animal. It’s millimetres of a tail feather. And so maybe a researcher knows that he’s measuring a good male”—an animal that has successfully mated—“and he knows that it’s supposed to be symmetrical. Well, that act of measurement is going to be vulnerable to all sorts of perception biases. That’s not a cynical statement. That’s just the way human beings work.”
  • For Simmons, the steep rise and slow fall of fluctuating asymmetry is a clear example of a scientific paradigm, one of those intellectual fads that both guide and constrain research: after a new paradigm is proposed, the peer-review process is tilted toward positive results. But then, after a few years, the academic incentives shift—the paradigm has become entrenched—so that the most notable results are now those that disprove the theory.
  • John Ioannidis, an epidemiologist at Stanford University, argues that such distortions are a serious issue in biomedical research. “These exaggerations are why the decline has become so common,” he says. “It’d be really great if the initial studies gave us an accurate summary of things. But they don’t. And so what happens is we waste a lot of money treating millions of patients and doing lots of follow-up studies on other themes based on results that are misleading.”
  • In 2005, Ioannidis published an article in the Journal of the American Medical Association that looked at the forty-nine most cited clinical-research studies in three major medical journals.
  • the data Ioannidis found were disturbing: of the thirty-four claims that had been subject to replication, forty-one per cent had either been directly contradicted or had their effect sizes significantly downgraded.
  • the most troubling fact emerged when he looked at the test of replication: out of four hundred and thirty-two claims, only a single one was consistently replicable. “This doesn’t mean that none of these claims will turn out to be true,” he says. “But, given that most of them were done badly, I wouldn’t hold my breath.”
  • According to Ioannidis, the main problem is that too many researchers engage in what he calls “significance chasing,” or finding ways to interpret the data so that it passes the statistical test of significance—the ninety-five-per-cent boundary invented by Ronald Fisher.
  • One of the classic examples of selective reporting concerns the testing of acupuncture in different countries. While acupuncture is widely accepted as a medical treatment in various Asian countries, its use is much more contested in the West. These cultural differences have profoundly influenced the results of clinical trials.
  • The problem of selective reporting is rooted in a fundamental cognitive flaw, which is that we like proving ourselves right and hate being wrong.
  • “It feels good to validate a hypothesis,” Ioannidis said. “It feels even better when you’ve got a financial interest in the idea or your career depends upon it. And that’s why, even after a claim has been systematically disproven”—he cites, for instance, the early work on hormone replacement therapy, or claims involving various vitamins—“you still see some stubborn researchers citing the first few studies
  • That’s why Schooler argues that scientists need to become more rigorous about data collection before they publish. “We’re wasting too much time chasing after bad studies and underpowered experiments,”
  • The current “obsession” with replicability distracts from the real problem, which is faulty design.
  • “Every researcher should have to spell out, in advance, how many subjects they’re going to use, and what exactly they’re testing, and what constitutes a sufficient level of proof. We have the tools to be much more transparent about our experiments.”
  • Schooler recommends the establishment of an open-source database, in which researchers are required to outline their planned investigations and document all their results. “I think this would provide a huge increase in access to scientific work and give us a much better way to judge the quality of an experiment,”
  • scientific research will always be shadowed by a force that can’t be curbed, only contained: sheer randomness. Although little research has been done on the experimental dangers of chance and happenstance, the research that exists isn’t encouraging.
  • The disturbing implication of the Crabbe study (in which genetically identical mice, tested under tightly standardized conditions in three different labs, nonetheless behaved very differently) is that a lot of extraordinary scientific data are nothing but noise. The hyperactivity of those coked-up Edmonton mice wasn’t an interesting new fact—it was a meaningless outlier, a by-product of invisible variables we don’t understand.
  • The problem, of course, is that such dramatic findings are also the most likely to get published in prestigious journals, since the data are both statistically significant and entirely unexpected
  • This suggests that the decline effect is actually a decline of illusion. While Karl Popper imagined falsification occurring with a single, definitive experiment—Galileo refuted Aristotelian mechanics in an afternoon—the process turns out to be much messier than that.
  • Many scientific theories continue to be considered true even after failing numerous experimental tests.
  • Even the law of gravity hasn’t always been perfect at predicting real-world phenomena. (In one test, physicists measuring gravity by means of deep boreholes in the Nevada desert found a two-and-a-half-per-cent discrepancy between the theoretical predictions and the actual data.)
  • Such anomalies demonstrate the slipperiness of empiricism. Although many scientific ideas generate conflicting results and suffer from falling effect sizes, they continue to get cited in the textbooks and drive standard medical practice. Why? Because these ideas seem true. Because they make sense. Because we can’t bear to let them go. And this is why the decline effect is so troubling. Not because it reveals the human fallibility of science, in which data are tweaked and beliefs shape perceptions. (Such shortcomings aren’t surprising, at least for scientists.) And not because it reveals that many of our most exciting theories are fleeting fads and will soon be rejected. (That idea has been around since Thomas Kuhn.)
  • The decline effect is troubling because it reminds us how difficult it is to prove anything. We like to pretend that our experiments define the truth for us. But that’s often not the case. Just because an idea is true doesn’t mean it can be proved. And just because an idea can be proved doesn’t mean it’s true. When the experiments are done, we still have to choose what to believe. ♦
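Sterling’s publication-bias mechanism and the regression-to-the-mean account can both be seen in a toy simulation: if journals mostly print the studies that clear the significance bar, the first published estimates of a small true effect are inflated, and honest replications then “decline” back toward the truth. The numbers below are illustrative assumptions, not data from the article:

```python
import math
import random

random.seed(42)

TRUE_EFFECT = 0.2                # the real (small) standardized effect
N_PER_GROUP = 20                 # a typical underpowered pilot study
SE = math.sqrt(2 / N_PER_GROUP)  # approximate standard error of the effect estimate

# Each study observes the true effect plus sampling noise.
estimates = [random.gauss(TRUE_EFFECT, SE) for _ in range(10_000)]

# Journals mostly see the "significant" results (z > 1.96).
published = [d for d in estimates if d / SE > 1.96]

mean_all = sum(estimates) / len(estimates)
mean_pub = sum(published) / len(published)

print(f"true effect:               {TRUE_EFFECT:.2f}")
print(f"average across ALL runs:   {mean_all:.2f}")   # close to the true effect
print(f"average of PUBLISHED runs: {mean_pub:.2f}")   # several times larger
```

Later replications, which get reported regardless of outcome, average near the true effect, so the literature shows a big initial finding that appears to shrink over time, the decline pattern Schooler and Jennions observed, with no fraud anywhere in the process.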
charlottedonoho

Beware Eurosceptic versions of history and science| Rebekah Higgitt | Science | The Gua... - 1 views

  • Readers of the Guardian Science pages may not have noticed the group called Historians for Britain, or a recent piece in History Today by David Abulafia asserting their belief “that Britain’s unique history sets it apart from the rest of Europe”.
  • It requires critical scrutiny from everyone with an interest in Britain’s relationship with the rest of the world, and in evidence-based political discussion.
  • Abulafia’s article is a classic example of an old-fashioned “Whiggish” narrative. It claims a uniquely moderate and progressive advance toward the development of British institutions, traced continuously from Magna Carta and isolated from the rages and radicalism of the Continent.
  • ...3 more annotations...
  • The answer is not “because Britain is better and unique” but “because I am British and these are the stories I have been brought up on” at school, university, on TV and elsewhere. Go to another country and you will see that they have their own, equally admirable, pantheon of greats.
  • The area that I have been working on, the eighteenth-century search for longitude, likewise reveals the need to challenge nationalistic assumptions.
  • Historians and readers of history both need to be aware of the biases of our education and literature. Accounts of British exceptionalism, especially those that lump the rest of Europe or the world into an amorphous group of also-rans, are more the result of national tradition and wishful thinking than a careful reading of the sources.
kushnerha

If Philosophy Won't Diversify, Let's Call It What It Really Is - The New York Times - 0 views

  • The vast majority of philosophy departments in the United States offer courses only on philosophy derived from Europe and the English-speaking world. For example, of the 118 doctoral programs in philosophy in the United States and Canada, only 10 percent have a specialist in Chinese philosophy as part of their regular faculty. Most philosophy departments also offer no courses on Africana, Indian, Islamic, Jewish, Latin American, Native American or other non-European traditions. Indeed, of the top 50 philosophy doctoral programs in the English-speaking world, only 15 percent have any regular faculty members who teach any non-Western philosophy.
  • Given the importance of non-European traditions in both the history of world philosophy and in the contemporary world, and given the increasing numbers of students in our colleges and universities from non-European backgrounds, this is astonishing. No other humanities discipline demonstrates this systematic neglect of most of the civilizations in its domain. The present situation is hard to justify morally, politically, epistemically or as good educational and research training practice.
  • While a few philosophy departments have made their curriculums more diverse, and while the American Philosophical Association has slowly broadened the representation of the world’s philosophical traditions on its programs, progress has been minimal.
  • ...9 more annotations...
  • Many philosophers and many departments simply ignore arguments for greater diversity; others respond with arguments for Eurocentrism that we and many others have refuted elsewhere. The profession as a whole remains resolutely Eurocentric.
  • Instead, we ask those who sincerely believe that it does make sense to organize our discipline entirely around European and American figures and texts to pursue this agenda with honesty and openness. We therefore suggest that any department that regularly offers courses only on Western philosophy should rename itself “Department of European and American Philosophy.”
  • We see no justification for resisting this minor rebranding (though we welcome opposing views in the comments section to this article), particularly for those who endorse, implicitly or explicitly, this Eurocentric orientation.
  • Some of our colleagues defend this orientation on the grounds that non-European philosophy belongs only in “area studies” departments, like Asian Studies, African Studies or Latin American Studies. We ask that those who hold this view be consistent, and locate their own departments in “area studies” as well, in this case, Anglo-European Philosophical Studies.
  • Others might argue against renaming on the grounds that it is unfair to single out philosophy: We do not have departments of Euro-American Mathematics or Physics. This is nothing but shabby sophistry. Non-European philosophical traditions offer distinctive solutions to problems discussed within European and American philosophy, raise or frame problems not addressed in the American and European tradition, or emphasize and discuss more deeply philosophical problems that are marginalized in Anglo-European philosophy. There are no comparable differences in how mathematics or physics are practiced in other contemporary cultures.
  • Of course, we believe that renaming departments would not be nearly as valuable as actually broadening the philosophical curriculum and retaining the name “philosophy.” Philosophy as a discipline has a serious diversity problem, with women and minorities underrepresented at all levels among students and faculty, even while the percentage of these groups increases among college students. Part of the problem is the perception that philosophy departments are nothing but temples to the achievement of males of European descent. Our recommendation is straightforward: Those who are comfortable with that perception should confirm it in good faith and defend it honestly; if they cannot do so, we urge them to diversify their faculty and their curriculum.
  • This is not to disparage the value of the works in the contemporary philosophical canon: Clearly, there is nothing intrinsically wrong with philosophy written by males of European descent; but philosophy has always become richer as it becomes increasingly diverse and pluralistic.
  • We hope that American philosophy departments will someday teach Confucius as routinely as they now teach Kant, that philosophy students will eventually have as many opportunities to study the “Bhagavad Gita” as they do the “Republic,” that the Flying Man thought experiment of the Persian philosopher Avicenna (980-1037) will be as well-known as the Brain-in-a-Vat thought experiment of the American philosopher Hilary Putnam (1926-2016), that the ancient Indian scholar Candrakirti’s critical examination of the concept of the self will be as well-studied as David Hume’s, that Frantz Fanon (1925-1961), Kwasi Wiredu (1931- ), Lame Deer (1903-1976) and Maria Lugones will be as familiar to our students as their equally profound colleagues in the contemporary philosophical canon. But, until then, let’s be honest, face reality and call departments of European-American Philosophy what they really are.
  • For demographic, political and historical reasons, the change to a more multicultural conception of philosophy in the United States seems inevitable. Heed the Stoic adage: “The Fates lead those who come willingly, and drag those who do not.”
Javier E

A Harvard Scholar on the Enduring Lessons of Chinese Philosophy - The New York Times - 0 views

  • Since 2006, Michael Puett has taught an undergraduate survey course at Harvard University on Chinese philosophy, examining how classic Chinese texts are relevant today. The course is now Harvard’s third most popular, behind only “Introduction to Computer Science” and “Principles of Economics.”
  • So-called Confucianism, for example, is read as simply being about forcing people to accept their social roles, while so-called Taoism is about harmonizing with the larger natural world. So Confucianism is often presented as bad and Taoism as good. But in neither case are we really learning from them.
  • we shouldn’t domesticate them to our own way of thinking. When we read them as self-help, we are assuming our own definition of the self and then simply picking up pieces of these ideas that fit into such a vision
  • ...11 more annotations...
  • these ideas are not about looking within and finding oneself. They are about overcoming the self. They are, in a sense, anti-self-help.
  • Today, we are often told that our goal should be to look within and find ourselves, and, once we do, to strive to be sincere and authentic to that true self, always loving ourselves and embracing ourselves for who we are. All of this sounds great and is a key part of what we think of as a properly “modern” way to live.
  • But what if we’re, on the contrary, messy selves that tend to fall into ruts and patterns of behavior? If so, the last thing we would want to be doing is embracing ourselves for who we are — embracing, in other words, a set of patterns we’ve fallen into. The goal should rather be to break these patterns and ruts, to train ourselves to interact better with those around us.
  • Certainly some strains of Chinese political theory will take this vision of the self — that we tend to fall into patterns of behavior — to argue for a more paternalistic state that will, to use a more recent term, “nudge” us into better patterns.
  • many of the texts we discuss in the book go the other way, and argue that the goal should be to break us from being such passive creatures — calling on us to do things that break us out of these patterns and allow us to train ourselves to start altering our behavior for the better.
  • You argue that Chinese philosophy views rituals as tools that can liberate us from these ruts.
  • Rituals force us for a brief moment to become a different person and to interact with those around us in a different way. They work because they break us from the patterns that we fall into and that otherwise dominate our behavior.
  • In the early Han dynasty, for example, we have examples of rituals that called for role reversals. The father would be called upon to play the son, and the son would play the father. Each is forced to see the world from the other’s perspective, with the son learning what it’s like to be in a position of authority and the father remembering what it was like to be the more subservient one
  • We tend to think that we live in a globalized world, but in a lot of ways we really don’t. The truth is that for a long time only a very limited number of ideas have dominated the world, while ideas that arose elsewhere were seen as “traditional” and not worth learning from.
  • imagine future generations that grow up reading Du Fu along with Shakespeare, and Confucius along with Plato. Imagine that type of world, where great ideas — wherever they arose — are thought about and wrestled with.
  • There’s a very strong debate going on in China about values — a sense that everything has become about wealth and power, and a questioning about whether this should be rethought. And among the ideas that are being brought into the debate are these earlier notions about the self and about how one can lead a good life. So, while the government is appropriating some of these ideas in particular ways, the broader public is debating them, and certainly with very different interpretations.
Javier E

How 'Concept Creep' Made Americans So Sensitive to Harm - The Atlantic - 0 views

  • How did American culture arrive at these moments? A new research paper by Nick Haslam, a professor of psychology at the University of Melbourne, Australia, offers as useful a framework for understanding what’s going on as any I’ve seen. In “Concept Creep: Psychology's Expanding Concepts of Harm and Pathology,”
  • concepts like abuse, bullying, trauma, mental disorder, addiction, and prejudice, “now encompass a much broader range of phenomena than before,” expanded meanings that reflect “an ever-increasing sensitivity to harm.”
  • “they also have potentially damaging ramifications for society and psychology that cannot be ignored.”
  • ...20 more annotations...
  • He calls these expansions of meaning “concept creep.”
  • Although critics may hold concept creep responsible for damaging cultural trends, he writes, “such as supposed cultures of fear, therapy, and victimhood,” the shifts he presents “have some positive implications.”
  • Concept creep is inevitable and vital if society is to make good use of new information. But why has the direction of concept creep, across so many different concepts, trended toward greater sensitivity to harm as opposed to lesser sensitivity?
  • The concept of abuse expanded too far.
  • Classically, psychological investigations recognized two forms of child abuse, physical and sexual, Haslam writes. In more recent decades, however, the concept of abuse has witnessed “horizontal creep” as new forms of abuse were recognized or studied. For example, “emotional abuse” was added as a new subtype of abuse. Neglect, traditionally a separate category, came to be seen as a type of abuse, too.
  • Meanwhile, the concept of abuse underwent “vertical creep.” That is, the behavior seen as qualifying for a given kind of abuse became steadily less extreme. Some now regard any spanking as physical abuse. Within psychology, “the boundary of neglect is indistinct,” Haslam writes. “As a consequence, the concept of neglect can become over-inclusive, identifying behavior as negligent that is substantially milder or more subtle than other forms of abuse. This is not to deny that some forms of neglect are profoundly damaging, merely to argue that the concept’s boundaries are sufficiently vague and elastic to encompass forms that are not severe.”
  • How did a working-class mom get arrested, lose her fast food job, and temporarily lose custody of her 9-year-old for letting the child play alone at a nearby park?
  • One concerns the field of psychology and its incentives. “It could be argued that just as successful species increase their territory, invading and adapting to new habitats, successful concepts and disciplines also expand their range into new semantic niches,” he theorizes. “Concepts that successfully attract the attention of researchers and practitioners are more likely to be applied in new ways and new contexts than those that do not.”
  • Concept creep can be necessary or needless. It can align concepts more or less closely with underlying realities. It can change society for better or worse. Yet many who push for more sensitivity to harm seem unaware of how oversensitivity can do harm.
  • The other theory posits an ideological explanation. “Psychology has played a role in the liberal agenda of sensitivity to harm and responsiveness to the harmed,” he writes, “and its increased focus on negative phenomena—harms such as abuse, addiction, bullying, mental disorder, prejudice, and trauma—has been symptomatic of the success of that social agenda.”
  • Jonathan Haidt, who believes it has gone too far, offers a fourth theory. “If an increasingly left-leaning academy is staffed by people who are increasingly hostile to conservatives, then we can expect that their concepts will shift, via motivated scholarship, in ways that will help them and their allies (e.g., university administrators) to prosecute and condemn conservatives,
  • While Haslam and Haidt appear to have meaningfully different beliefs about why concept creep arose within academic psychology and spread throughout society, they were in sufficient agreement about its dangers to co-author a Guardian op-ed on the subject.
  • It focuses on how greater sensitivity to harm has affected college campuses.
  • “Of course young people need to be protected from some kinds of harm, but overprotection is harmful, too, for it causes fragility and hinders the development of resilience,” they wrote. “As Nassim Taleb pointed out in his book Antifragile, muscles need resistance to develop, bones need stress and shock to strengthen and the growing immune system needs to be exposed to pathogens in order to function. Similarly, he noted, children are by nature anti-fragile – they get stronger when they learn to recover from setbacks, failures and challenges to their cherished ideas.”
  • police officers fearing harm from dogs kill them by the hundreds or perhaps thousands every year in what the DOJ calls an epidemic.
  • After the terrorist attacks of September 11, 2001, the Bush Administration and many Americans grew increasingly sensitive to harms, real and imagined, from terrorism
  • Dick Cheney declared, “If there's a 1% chance that Pakistani scientists are helping al-Qaeda build or develop a nuclear weapon, we have to treat it as a certainty in terms of our response. It's not about our analysis ... It's about our response.” The invasion of Iraq was predicated, in part, on the idea that 9/11 “changed everything,”
  • Before 9/11, the notion of torturing prisoners was verboten. After the Bush Administration’s torture was made public, popular debate focused on mythical “ticking time bomb” scenarios, in which a whole city would be obliterated but for torture. Now Donald Trump suggests that torture should be used more generally against terrorists. Torture is, as well, an instance in which people within the field of psychology pushed concept creep in the direction of less sensitivity to harm,
  • Haslam endorses two theories
  • there are many reasons to be concerned about excessive sensitivity to harm:
Javier E

China: A Modern Babel - WSJ - 0 views

  • The oft-repeated claim that we must all learn Mandarin Chinese, the better to trade with our future masters, is one that readers of David Moser’s “A Billion Voices” will rapidly end up re-evaluating.
  • In fact, many Chinese don’t speak it: Even Chinese authorities quietly admit that only about 70% of the population speaks Mandarin, and merely one in 10 of those speak it fluently.
  • Mr. Moser presents a history of what is more properly called Putonghua, or “common speech,” along with a clear, concise and often amusing introduction to the limits of its spoken and written forms.
  • ...12 more annotations...
  • what Chinese schoolchildren are encouraged to think of as the longstanding natural speech of the common people is in fact an artificial hybrid, only a few decades old, although it shares a name—Mandarin—with the language of administration from imperial times. It’s a designed-by-committee camel of a language that has largely lost track of its past.
  • The idea of a national Chinese language began with the realization by the accidentally successful revolutionaries of 1911 that retaining control over a country speaking multiple languages and myriad dialects would necessitate reform. Long-term unification and the introduction of mass education would require a common language.
  • Whatever the province they originated from, the administrators of the now-toppled Great Qing Empire had all learned to communicate with one another in a second common language—Guanhua, China’s equivalent, in practical terms, of medieval Latin
  • To understand this highly compressed idiom required a considerable knowledge of the Chinese classics. Early Jesuit missionaries had labeled it Mandarin,
  • The committee decided that the four-tone dialect of the capital would be the base for a new national language but added a fifth tone whose use had lapsed in the north but not in southern dialects. The result was a language that no one actually spoke.
  • After the Communist victory of 1949, the process began all over again with fresh conferences, leading finally to the decision to use Beijing sounds, northern dialects and modern literature in the vernacular (of which there was very little) as a source of grammar.
  • This new spoken form is what is now loosely labeled Mandarin, still as alien to most Chinese as all the other Chinese languages.
  • A Latin alphabet system called Pinyin was introduced to help children learn to pronounce Chinese characters, but today it is usually abandoned after the first few years of elementary school.
  • The view that Mandarin is too difficult for mere foreigners to learn is essential to Chinese amour propre. But it is belied by the number of foreign high-school students who now learn the language by using Pinyin as a key to pronunciation — and who bask in the admiration they receive as a result.
  • Since 1949, the Chinese government, obsessed with promoting the image of a nation completely united in its love of the Communist Party, has decided that the Chinese people speak not several different languages but the same one in a variety of dialects. To say otherwise is to suggest, dangerously, that China is not one nation
  • Yet on Oct. 1, 1949, Mao Zedong announced the founding of the People’s Republic in a Hunan accent so thick that members of his audience subsequently differed about what he had said. He never mastered the Beijing sounds on which Putonghua is based, nor did Sichuanese-speaking Deng Xiaoping or most of his successors.
  • When Xi Jinping took power in 2012, many online commentators rejoiced. “At last! A Chinese leader who can speak Putonghua!” One leader down, only 400 million more common people to go.
oliviaodon

The Cult of Coincidence | The Huffington Post - 0 views

  • Most people readily believe that they themselves are essentially fully independent thinkers, and that closed-mindedness, intellectual inflexibility and an irrational commitment to pre-conceived thinking dwells only in the feeble minds of others. Think about it: When was the last time in the course of discussion that someone admitted to you something like, “You’re right, I have just blindly swallowed all of the positions and cultural mores of my milieu” or, “Yes, I agree that no amount of oppositional information will ever dissuade me from the beliefs I hold?” No one is immune from this state of affairs, and it requires courage and perpetual vigilance to even venture outside of the intellectual echo chamber that most of us inhabit.
  • There are those who believe that the scientific community is uniquely positioned to avoid these pitfalls. They suggest that the system of peer review is inherently self-critical, and as such is structurally quarantined from bias. Some scientists think otherwise and note that science, in as much as it is conducted by human beings, is subject to the same partiality as every other endeavor.
  • like the communist party under Lenin, science is [in its own eyes] infallible because its judgments are collective. Critics are unneeded, and since they are unneeded, they are not welcome.
  • ...2 more annotations...
  • A classic example of this endemic bias at work is illustrated through Einstein. He was disturbed by the implications of an expanding universe. For thousands of years it was assumed — outside of some theological circles — that matter was eternal. The notion that it came into being at a discrete point in time naturally implied that something had caused it and quite possibly that that something had done it on purpose. Not willing to accept this new information, Einstein added a now famous “fudge factor” to his equations to maintain the static universe that he was comfortable with — something he would later describe as “the greatest blunder of my career.”
  • If there is great resistance to notions of design and causality in science, it is exponentially greater when it comes to theology.
sissij

Quantum Gravity Loops Back To Ancient Atomic Logic, and The Big Bang Becomes A Big Boun... - 0 views

  • Greeks had the “first true alphabet”: a “universal” writing system that used a few letters to encode the infinite variety of all possible utterances. Similarly, all matter is written in a "language… of atoms."
  • Mysterious “meanings” still surround 100-year-old quantum mechanics equations
  • Their meaning/function/grammar is relational and sequential and word-like. The information encoded in matching sequential text-like compositions matters (DNA—>RNA, letters—>“social cartesian” lexicon).
  • ...3 more annotations...
  • Beyond the grammars of geometry and algebra lies a domain of not math-like but text-like compositions and meanings (of semantics beyond mathematics).
  • 17. Word and world both have grammars that don’t fit our available mathematical rules.
  • 18. Reality is relational, and not entirely objective. Subject and object aren’t separable, they’re entangled, inescapably. “Objective” is always relative to some other system/observer. 
  •  
    I find it very interesting that the author is trying to look at the world from a different perspective than mathematics. He thinks of atoms as a language that has grammar and meanings. He thinks mathematical rules cannot fully explain our world because they are too objective. He involves the idea of language to describe how the world is entangled and relational. As we learned in TOK, language is an important AOK that shows human civilization in a very complicated way. Language is flexible, emotional and relational. It gives things meaning, as humans like to assign meaning and pattern to things around them. The world around us is not just cold fact; we as observers give it meaning. In that sense, the concept of language can better help us depict the world. --Sissi (2/27/2017)
Javier E

'The Death of Expertise' Explores How Ignorance Became a Virtue - The New York Times - 1 views

  • a larger wave of anti-rationalism that has been accelerating for years — manifested in the growing ascendance of emotion over reason in public debates, the blurring of lines among fact and opinion and lies, and denialism in the face of scientific findings about climate change and vaccination.
  • “Americans have reached a point where ignorance, especially of anything related to public policy, is an actual virtue,”
  • “To reject the advice of experts is to assert autonomy, a way for Americans to insulate their increasingly fragile egos from ever being told they’re wrong about anything. It is a new Declaration of Independence: No longer do we hold these truths to be self-evident, we hold all truths to be self-evident, even the ones that aren’t true. All things are knowable and every opinion on any subject is as good as any other.”
  • ...10 more annotations...
  • iterating arguments explored in more depth in books like Al Gore’s “The Assault on Reason,” Susan Jacoby’s “The Age of American Unreason,” Robert Hughes’s “Culture of Complaint” and, of course, Richard Hofstadter’s 1963 classic, “Anti-Intellectualism in American Life.” Nichols’s source notes are one of the highlights of the volume, pointing the reader to more illuminating books and articles.
  • “resistance to intellectual authority” naturally took root in a country dedicated to the principles of liberty and egalitarianism, and how American culture tends to fuel “romantic notions about the wisdom of the common person or the gumption of the self-educated genius.”
  • the “protective swaddling environment of the modern university infantilizes students,”
  • today’s populism has magnified disdain for elites and experts of all sorts, be they in foreign policy, economics, even science.
  • Trump won the 2016 election, Nichols writes, because “he connected with a particular kind of voter who believes that knowing about things like America’s nuclear deterrent is just so much pointy-headed claptrap.” Worse, he goes on, some of these voters “not only didn’t care that Trump is ignorant or wrong, they likely were unable to recognize his ignorance or errors,” thanks to their own lack of knowledge.
  • While the internet has allowed more people more access to more information than ever before, it has also given them the illusion of knowledge when in fact they are drowning in data and cherry-picking what they choose to read
  • it becomes easy for one to succumb to “confirmation bias” — the tendency, as Nichols puts it, “to look for information that only confirms what we believe, to accept facts that only strengthen our preferred explanations, and to dismiss data that challenge what we accept as truth.”
  • When confronted with hard evidence that they are wrong, many will simply double down on their original assertions. “This is the ‘backfire effect,’” Nichols writes, “in which people redouble their efforts to keep their own internal narrative consistent, no matter how clear the indications that they’re wrong.” As a result, extreme views are amplified online, just as fake news and propaganda easily go viral.
  • Today, all these factors have combined to create a maelstrom of unreason that’s not just killing respect for expertise, but also undermining institutions, thwarting rational debate and spreading an epidemic of misinformation. These developments, in turn, threaten to weaken the very foundations of our democracy.
  • “Laypeople complain about the rule of experts and they demand greater involvement in complicated national questions, but many of them only express their anger and make these demands after abdicating their own important role in the process: namely, to stay informed and politically literate enough to choose representatives who can act on their behalf.”
Javier E

Eric Kandel's Visions - The Chronicle Review - The Chronicle of Higher Education - 0 views

  • Judith, "barely clothed and fresh from the seduction and slaying of Holofernes, glows in her voluptuousness. Her hair is a dark sky between the golden branches of Assyrian trees, fertility symbols that represent her eroticism. This young, ecstatic, extravagantly made-up woman confronts the viewer through half-closed eyes in what appears to be a reverie of orgasmic rapture," writes Eric Kandel in his new book, The Age of Insight. Wait a minute. Writes who? Eric Kandel, the Nobel-winning neuroscientist who's spent most of his career fixated on the generously sized neurons of sea snails
  • Kandel goes on to speculate, in a bravura paragraph a few hundred pages later, on the exact neurochemical cognitive circuitry of the painting's viewer:
  • "At a base level, the aesthetics of the image's luminous gold surface, the soft rendering of the body, and the overall harmonious combination of colors could activate the pleasure circuits, triggering the release of dopamine. If Judith's smooth skin and exposed breast trigger the release of endorphins, oxytocin, and vasopressin, one might feel sexual excitement. The latent violence of Holofernes's decapitated head, as well as Judith's own sadistic gaze and upturned lip, could cause the release of norepinephrine, resulting in increased heart rate and blood pressure and triggering the fight-or-flight response. In contrast, the soft brushwork and repetitive, almost meditative, patterning may stimulate the release of serotonin. As the beholder takes in the image and its multifaceted emotional content, the release of acetylcholine to the hippocampus contributes to the storing of the image in the viewer's memory. What ultimately makes an image like Klimt's 'Judith' so irresistible and dynamic is its complexity, the way it activates a number of distinct and often conflicting emotional signals in the brain and combines them to produce a staggeringly complex and fascinating swirl of emotions."
  • ...18 more annotations...
  • His key findings on the snail, for which he shared the 2000 Nobel Prize in Physiology or Medicine, showed that learning and memory change not the neuron's basic structure but rather the nature, strength, and number of its synaptic connections. Further, through focus on the molecular biology involved in a learned reflex like Aplysia's gill retraction, Kandel demonstrated that experience alters nerve cells' synapses by changing their pattern of gene expression. In other words, learning doesn't change what neurons are, but rather what they do.
  • In Search of Memory (Norton), Kandel offered what sounded at the time like a vague research agenda for future generations in the budding field of neuroaesthetics, saying that the science of memory storage lay "at the foothills of a great mountain range." Experts grasp the "cellular and molecular mechanisms," he wrote, but need to move to the level of neural circuits to answer the question, "How are internal representations of a face, a scene, a melody, or an experience encoded in the brain?
  • Since giving a talk on the matter in 2001, he has been piecing together his own thoughts in relation to his favorite European artists
  • The field of neuroaesthetics, says one of its founders, Semir Zeki, of University College London, is just 10 to 15 years old. Through brain imaging and other studies, scholars like Zeki have explored the cognitive responses to, say, color contrasts or ambiguities of line or perspective in works by Titian, Michelangelo, Cubists, and Abstract Expressionists. Researchers have also examined the brain's pleasure centers in response to appealing landscapes.
  • it is fundamental to an understanding of human cognition and motivation. Art isn't, as Kandel paraphrases a concept from the late philosopher of art Denis Dutton, "a byproduct of evolution, but rather an evolutionary adaptation—an instinctual trait—that helps us survive because it is crucial to our well-being." The arts encode information, stories, and perspectives that allow us to appraise courses of action and the feelings and motives of others in a palatable, low-risk way.
  • "as far as activity in the brain is concerned, there is a faculty of beauty that is not dependent on the modality through which it is conveyed but which can be activated by at least two sources—musical and visual—and probably by other sources as well." Specifically, in this "brain-based theory of beauty," the paper says, that faculty is associated with activity in the medial orbitofrontal cortex.
  • It also enables Kandel—building on the work of Gombrich and the psychoanalyst and art historian Ernst Kris, among others—to compare the painters' rendering of emotion, the unconscious, and the libido with contemporaneous psychological insights from Freud about latent aggression, pleasure and death instincts, and other primal drives.
  • Kandel views the Expressionists' art through the powerful multiple lenses of turn-of-the-century Vienna's cultural mores and psychological insights. But then he refracts them further, through later discoveries in cognitive science. He seeks to reassure those who fear that the empirical and chemical will diminish the paintings' poetic power. "In art, as in science," he writes, "reductionism does not trivialize our perception—of color, light, and perspective—but allows us to see each of these components in a new way. Indeed, artists, particularly modern artists, have intentionally limited the scope and vocabulary of their expression to convey, as Mark Rothko and Ad Reinhardt do, the most essential, even spiritual ideas of their art."
  • The author of a classic textbook on neuroscience, he seems here to have written a layman's cognition textbook wrapped within a work of art history.
  • "our initial response to the most salient features of the paintings of the Austrian Modernists, like our response to a dangerous animal, is automatic. ... The answer to James's question of how an object simply perceived turns into an object emotionally felt, then, is that the portraits are never objects simply perceived. They are more like the dangerous animal at a distance—both perceived and felt."
  • If imaging is key to gauging therapeutic practices, it will be key to neuroaesthetics as well, Kandel predicts—a broad, intense array of "imaging experiments to see what happens with exaggeration, distorted faces, in the human brain and the monkey brain," viewers' responses to "mixed eroticism and aggression," and the like.
  • while the visual-perception literature might be richer at the moment, there's no reason that neuroaesthetics should restrict its emphasis to the purely visual arts at the expense of music, dance, film, and theater.
  • although Kandel considers The Age of Insight to be more a work of intellectual history than of science, the book summarizes centuries of research on perception. And so you'll find, in those hundreds of pages between Kandel's introduction to Klimt's "Judith" and the neurochemical cadenza about the viewer's response to it, dossiers on vision as information processing; the brain's three-dimensional-space mapping and its interpretations of two-dimensional renderings; face recognition; the mirror neurons that enable us to empathize and physically reflect the affect and intentions we see in others; and many related topics. Kandel elsewhere describes the scientific evidence that creativity is nurtured by spells of relaxation, which foster a connection between conscious and unconscious cognition.
  • Zeki's message to art historians, aesthetic philosophers, and others who chafe at that idea is twofold. The more diplomatic pitch is that neuroaesthetics is different, complementary, and not oppositional to other forms of arts scholarship. But "the stick," as he puts it, is that if arts scholars "want to be taken seriously" by neurobiologists, they need to take advantage of the discoveries of the past half-century. If they don't, he says, "it's a bit like the guys who said to Galileo that we'd rather not look through your telescope."
  • Matthews, a co-author of The Bard on the Brain: Understanding the Mind Through the Art of Shakespeare and the Science of Brain Imaging (Dana Press, 2003), seems open to the elucidations that science and the humanities can cast on each other. The neural pathways of our aesthetic responses are "good explanations," he says. But "does one [type of] explanation supersede all the others? I would argue that they don't, because there's a fundamental disconnection still between ... explanations of neural correlates of conscious experience and conscious experience" itself.
  • There are, Matthews says, "certain kinds of problems that are fundamentally interesting to us as a species: What is love? What motivates us to anger?" Writers put their observations on such matters into idiosyncratic stories, psychologists conceive their observations in a more formalized framework, and neuroscientists like Zeki monitor them at the level of functional changes in the brain. All of those approaches to human experience "intersect," Matthews says, "but no one of them is the explanation."
  • "Conscious experience," he says, "is something we cannot even interrogate in ourselves adequately. What we're always trying to do in effect is capture the conscious experience of the last moment. ... As we think about it, we have no way of capturing more than one part of it."
  • Kandel sees art and art history as "parent disciplines" and psychology and brain science as "antidisciplines," to be drawn together in an E.O. Wilson-like synthesis toward "consilience as an attempt to open a discussion between restricted areas of knowledge." Kandel approvingly cites Stephen Jay Gould's wish for "the sciences and humanities to become the greatest of pals ... but to keep their ineluctably different aims and logics separate as they ply their joint projects and learn from each other."
Javier E

What Is College For? (Part 2) - NYTimes.com - 0 views

  • How, exactly, does college prepare students for the workplace? For most jobs, it provides basic intellectual skills: the ability to understand relatively complex instructions, to write and speak clearly and cogently, to evaluate options critically. Beyond these intellectual skills, earning a college degree shows that you have the “moral qualities” needed for most jobs: you have (to put it a bit cynically), for a period of four years and with relatively little supervision, deferred to authority, met deadlines and carried out difficult tasks even when you found them pointless and boring.
  • This sort of intellectual and moral training, however, does not require studying with experts doing cutting-edge work on, say, Homeric poetry, elementary particle theory or the philosophy of Kant. It does not, that is, require the immersion in the world of intellectual culture that a college faculty is designed to provide. It is, rather, the sort of training that ought to result from good elementary and high school education.
  • students graduating from high school should, to cite one plausible model, be able to read with understanding classic literature (from, say, Austen and Browning to Whitman and Hemingway) and write well-organized and grammatically sound essays; they should know the basic outlines of American and European history, have a good beginner’s grasp of at least two natural sciences as well as pre-calculus mathematics, along with a grounding in a foreign language.
  • ...4 more annotations...
  • Is it really possible to improve grade school and high school teaching to the level I’m suggesting? Yes, provided we employ the same sort of selection criteria for pre-college teachers as we do for other professionals such as doctors, lawyers and college professors. In contrast to other professions, teaching is not now the domain of the most successful students — quite the contrary. I’ve known many very bright students who had an initial interest in such teaching but soon realized that there is no comparison in terms of salary, prestige and working conditions.
  • Given this transformation in pre-college education, we could expect it to provide basic job-training for most students. At that point, we would still face a fundamental choice regarding higher education. We could see it as a highly restricted enterprise, educating only professionals who require advanced specialized skills. Correspondingly, only such professionals would have access to higher education as a locus of intellectual culture.
  • On the other hand, we could — as I would urge — see college as the entrée to intellectual culture for everyone who is capable of and interested in working at that level of intellectual engagement
  • Raising high school to the level I am proposing and opening college to everyone who will profit from it would be an expensive enterprise. We would need significant government support to ensure that all students receive an education commensurate with their abilities and aspirations, regardless of family resources. But the intellectual culture of our citizens should be a primary part of our national well-being, not just the predilection of an eccentric elite. As such, it should be among our highest priorities.
Javier E

Chick-fil-A is Bad For Your Political Health | Patrol - A review of religion and the mo... - 0 views

  • The premise is that politics and economics are separate realms, and we are “creating a culture” of division by dragging politics into such things as economic transactions. One could hardly better encapsulate the reality we live under, where economics have completely replaced politics. That’s pretty much the definition of classical liberalism: true politics, where human values are disputed, are expected to be sublimated by economic transactions.
  • The winner is the corporation, which can now reap the profits of a society where no human value is allowed to be more important than a business deal. (If you question that orthodoxy, you’re likely to be labeled a “radical” or a “partisan,” or better yet, just “too political.”) This ideology owes its entire existence to the need for capitalists to keep human values out of the way of the market. Above all, it must keep politics a dirty word, because people who know what politics are and how to use them can cause trouble for capitalists very quickly.
  • our commercial and our political lives are already completely intermeshed, because under the current regime we basically only have commercial lives. The only political power to be had in the United States is money, and even if you don’t have enough to make a corporation hurt, how you consume is one of the few expressions of political will open to the average citizen. They may not have enough money to shake the economy, and may not even when pooled with a large group of like-minded people. But a visceral awareness that money is politics is an excellent first step toward the average person realizing his or her political agency and taking responsibility for it.
  • ...2 more annotations...
  • Even if the corporatization of our society is so complete that it is objectively impossible to avoid giving money to entities that are at that very moment working to undermine our political freedoms, every religious and ethically-minded institution should be urging those under its influence to be aware and resist wherever possible.
  • In sum, you should absolutely be supporting corporations that put human values ahead of profit, and doing your best to keep your dollars away from ones that exploit workers and try to obstruct democracy, whether directly by stripping workers of their rights or indirectly by supporting the exclusionary social fantasies of religious reactionaries.
Javier E

The Crowd Pleaser - NYTimes.com - 0 views

  • Obama seems self-sufficient while Romney seems other-directed.
  • I’m borrowing the phrase “other-directed” from David Riesman’s 1950 classic, “The Lonely Crowd.”
  • Riesman argued that different eras nurture different personality types. The agricultural economy nurtured tradition-directed individuals. People lived according to the ancient cycles, customs and beliefs. Children grew up and performed the same roles as their parents.
  • ...2 more annotations...
  • The industrial era favored the inner-directed personality type. The inner-directed person was guided by a set of strong internal convictions, like Victorian morality. The inner-directed person was a hardy pioneer, the stolid engineer or the resilient steelworker — working on physical things. This person was often rigid, but also steadfast.
  • The other-directed personality type emerges in a service or information age economy. In this sort of economy, most workers are not working with physical things; they are manipulating people. The other-directed person becomes adept at pleasing others, at selling him or herself. The other-directed person is attuned to what other people want him to be. The other-directed person is a pliable member of a team and yearns for acceptance. He or she is less notable for having a rigid character than for having a smooth personality.
Duncan H

Phobias: Things to Fear and Loathe - NYTimes.com - 0 views

  • a new app for the treatment of phobias. You stare at pictures of dental drills, snakes or airplane interiors, depending on your affliction, and these totems of menace — interspersed with reassuring images of teddy bears — gradually cease to provoke you.
  • Another person wrote: “I am terrified of string. You know, when you have a loose string hanging off your clothes. Most people just shrug it off.” (Who knew?) “But I go insane until I get it off the item.” Balloons, pigeons, boats, bald men, cotton batten, garden peas. These have all acted as the culprits, according to reports I’ve received, in making otherwise reasonable human beings assume the visage of Edvard Munch’s screamer. People fear chins, condiments, towels, cut fruit. The object appears to be irrelevant, in many cases, beyond its subconscious assignation as the Very Thing to Fear.
  • One attempts to find logical causes for phobia at one’s peril.
  • ...6 more annotations...
  • According to the psychologist Stéphane Bouchard, who studies phobia at the University of Quebec, about a third of phobias are indeed set off by direct exposure to frightening encounters, such as a dog bite. Roughly another third are culturally suggested: a classic example being the increase in shark and water phobias after the movie “Jaws.” With that final third, Mr. Bouchard told me, shrugging, “we just have no clue.” Let me zero in on that final third. “I have a fear of honeycomb shapes,” a woman once wrote to me when I solicited examples of phobias for my research. “I can’t look at something like a beehive. The other day, I saw a box of honeycomb-shaped pasta at the grocery store and it really creeped me out.”
  • Of all the manifestations of anxiety, specific phobias are by far the most idiosyncratic. About 6 percent of Americans have an acute fear of animals like rats and birds. But after that, the sources of terror are myriad.
  • Oddly, this act of transmuting anxiety into fear does possess a kind of logic. Anxiety has been described as fear in search of a cause, and there’s little question that fear is more actionable. Instead of being paralyzed by a sense of directionless menace, as would be the case with a generalized anxiety disorder where danger is everywhere and nowhere, the phobic can pour all dread into one vessel, and then swiftly run away. In other words, phobia can be a form of compartmentalization.
  • A fear of flying, for instance, can relate to acrophobia (fear of heights), or to claustrophobia, or it can be a stand-in for a much more threatening prospect that dare not be confronted at any cost, such as the death of a parent. You’re avoiding grief, and the next thing you know you would rather be trapped in an elevator with bees than board an airplane. The airplane is departing for another world but no, that’s too obvious.
  • We are not simple creatures, we human beings, and we know it; yet we still insist on imposing simple explanations upon our emotional conduct. “They’re just freaking dandelions, Mom,” my son tells me. It’s just a garter snake. They’re merely peas. How in the world can you be so idiotically afraid of clowns? There are wider implications here for our civic and political discourse. Certain people may be neurologically prone to anxiety, true, but fear is also circumstantial. The current economic climate is extremely anxiety-provoking, and research has shown that people can tolerate uncertainty for only so long. At some point, the neurotically wired begin to prefer negative certitudes — or compartmentalized threats — to ambiguity.
  • If we cannot tolerate uncertainty, then it might be reasonable to expect an increase in phobic behaviors: xenophobia, Islamophobia, Obamafear, a terror of newts. These aren’t stances that can be dealt with by counterargument. They can be quelled only by exposure, by a reminder that the threat is symbolic, a stand-in. Let’s invite the enemy we fear to dine, then, and rescue ourselves from irrational conflict.
  • If only we could apply her suggestions to politics.
Javier E

The American Scholar: The Decline of the English Department - William M. Chace - 1 views

  • The number of young men and women majoring in English has dropped dramatically; the same is true of philosophy, foreign languages, art history, and kindred fields, including history. As someone who has taught in four university English departments over the last 40 years, I am dismayed by this shift, as are my colleagues here and there across the land. And because it is probably irreversible, it is important to attempt to sort out the reasons—the many reasons—for what has happened.
  • English: from 7.6 percent of the majors to 3.9 percent
  • In one generation, then, the numbers of those majoring in the humanities dropped from a total of 30 percent to a total of less than 16 percent; during that same generation, business majors climbed from 14 percent to 22 percent.
  • ...23 more annotations...
  • History: from 18.5 percent to 10.7 percent
  • But the deeper explanation resides not in something that has happened to it, but in what it has done to itself. English has become less and less coherent as a discipline and, worse, has come near exhaustion as a scholarly pursuit.
  • The twin focus, then, was on the philological nature of the enterprise and the canon of great works to be studied in their historical evolution.
  • Studying English taught us how to write and think better, and to make articulate many of the inchoate impulses and confusions of our post-adolescent minds. We began to see, as we had not before, how such books could shape and refine our thinking. We began to understand why generations of people coming before us had kept them in libraries and bookstores and in classes such as ours. There was, we got to know, a tradition, a historical culture, that had been assembled around these books. Shakespeare had indeed made a difference—to people before us, now to us, and forever to the language of English-speaking people.
  • today there are stunning changes in the student population: there are more and more gifted and enterprising students coming from immigrant backgrounds, students with only slender connections to Western culture and to the assumption that the “great books” of England and the United States should enjoy a fixed centrality in the world. What was once the heart of the matter now seems provincial. Why throw yourself into a study of something not emblematic of the world but representative of a special national interest? As the campus reflects the cultural, racial, and religious complexities of the world around it, reading British and American literature looks more and more marginal. From a global perspective, the books look smaller.
  • With the cost of a college degree surging upward during the last quarter century—tuition itself increasing far beyond any measure of inflation—and with consequent growth in loan debt after graduation, parents have become anxious about the relative earning power of a humanities degree. Their college-age children doubtless share such anxiety. When college costs were lower, anxiety could be kept at bay. (Berkeley in the early ’60s cost me about $100 a year, about $700 in today’s dollars.)
  • Economists, chemists, biologists, psychologists, computer scientists, and almost everyone in the medical sciences win sponsored research, grants, and federal dollars. By and large, humanists don’t, and so they find themselves as direct employees of the institution, consuming money in salaries, pensions, and operating needs—not external money but institutional money.
  • These, then, are some of the external causes of the decline of English: the rise of public education; the relative youth and instability (despite its apparent mature solidity) of English as a discipline; the impact of money; and the pressures upon departments within the modern university to attract financial resources rather than simply use them up.
  • several of my colleagues around the country have called for a return to the aesthetic wellsprings of literature, the rock-solid fact, often neglected, that it can indeed amuse, delight, and educate. They urge the teaching of English, or French, or Russian literature, and the like, in terms of the intrinsic value of the works themselves, in all their range and multiplicity, as well-crafted and appealing artifacts of human wisdom. Second, we should redefine our own standards for granting tenure, placing more emphasis on the classroom and less on published research, and we should prepare to contest our decisions with administrators whose science-based model is not an appropriate means of evaluation.
  • “It may be that what has happened to the profession is not the consequence of social or philosophical changes, but simply the consequence of a tank now empty.” His homely metaphor pointed to the absence of genuinely new frontiers of knowledge and understanding for English professors to explore.
  • In this country and in England, the study of English literature began in the latter part of the 19th century as an exercise in the scientific pursuit of philological research, and those who taught it subscribed to the notion that literature was best understood as a product of language.
  • no one has come forward in years to assert that the study of English (or comparative literature or similar undertakings in other languages) is coherent, does have self-limiting boundaries, and can be described as this but not that.
  • to teach English today is to do, intellectually, what one pleases. No sense of duty remains toward works of English or American literature; amateur sociology or anthropology or philosophy or comic books or studies of trauma among soldiers or survivors of the Holocaust will do. You need not even believe that works of literature have intelligible meaning; you can announce that they bear no relationship at all to the world beyond the text.
  • With everything on the table, and with foundational principles abandoned, everyone is free, in the classroom or in prose, to exercise intellectual laissez-faire in the largest possible way—I won’t interfere with what you do and am happy to see that you will return the favor
  • Consider the English department at Harvard University. It has now agreed to remove its survey of English literature for undergraduates, replacing it and much else with four new “affinity groups”
  • there would be no one book, or family of books, that every English major at Harvard would have read by the time he or she graduates. The direction to which Harvard would lead its students in this “clean slate” or “trickle down” experiment is to suspend literary history, thrusting into the hands of undergraduates the job of cobbling together intellectual coherence for themselves
  • Those who once strove to give order to the curriculum will have learned, from Harvard, that terms like core knowledge and foundational experience only trigger acrimony, turf protection, and faculty mutinies. No one has the stomach anymore to refight the Western culture wars. Let the students find their own way to knowledge.
  • In English, the average number of years spent earning a doctoral degree is almost 11. After passing that milestone, only half of new Ph.D.’s find teaching jobs, the number of new positions having declined over the last year by more than 20 percent; many of those jobs are part-time or come with no possibility of tenure. News like that, moving through student networks, can be matched against, at least until recently, the reputed earning power of recent graduates of business schools, law schools, and medical schools. The comparison is akin to what young people growing up in Rust Belt cities are forced to see: the work isn’t here anymore; our technology is obsolete.
  • unlike other members of the university community, they might well have been plying their trade without proper credentials: “Whereas economists or physicists, geologists or climatologists, physicians or lawyers must master a body of knowledge before they can even think of being licensed to practice,” she said, “we literary scholars, it is tacitly assumed, have no definable expertise.”
  • English departments need not refight the Western culture wars. But they need to fight their own book wars. They must agree on which texts to teach and argue out the choices and the principles of making them if they are to claim the respect due a department of study.
  • They can teach their students to write well, to use rhetoric. They should place their courses in composition and rhetoric at the forefront of their activities. They should announce that the teaching of composition is a skill their instructors have mastered and that students majoring in English will be certified, upon graduation, as possessing rigorously tested competence in prose expression.
  • The study of literature will then take on the profile now held, with moderate dignity, by the study of the classics, Greek and Latin.
  • But we can, we must, do better. At stake are the books themselves and what they can mean to the young. Yes, it is just a literary tradition. That’s all. But without such traditions, civil societies have no compass to guide them.
Javier E

Interview: Ted Chiang | The Asian American Literary Review - 0 views

  • I think most people’s ideas of science fiction are formed by Hollywood movies, so they think most science fiction is a special effects-driven story revolving around a battle between good and evil
  • I don’t think of that as a science fiction story. You can tell a good-versus-evil story in any time period and in any setting. Setting it in the future and adding robots to it doesn’t make it a science fiction story.
  • I think science fiction is fundamentally a post-industrial revolution form of storytelling. Some literary critics have noted that the good-versus-evil story follows a pattern where the world starts out as a good place, evil intrudes, the heroes fight and eventually defeat evil, and the world goes back to being a good place. Those critics have said that this is fundamentally a conservative storyline because it’s about maintaining the status quo. This is a common story pattern in crime fiction, too—there’s some disruption to the order, but eventually order is restored. Science fiction offers a different kind of story, a story where the world starts out as recognizable and familiar but is disrupted or changed by some new discovery or technology. At the end of the story, the world is changed permanently. The original condition is never restored. And so in this sense, this story pattern is progressive because its underlying message is not that you should maintain the status quo, but that change is inevitable. The consequences of this new discovery or technology—whether they’re positive or negative—are here to stay and we’ll have to deal with them.
  • ...3 more annotations...
  • There’s also a subset of this progressive story pattern that I’m particularly interested in, and that’s the “conceptual breakthrough” story, where the characters discover something about the nature of the universe which radically expands their understanding of the world. This is a classic science fiction storyline.
  • one of the cool things about science fiction is that it lets you dramatize the process of scientific discovery, that moment of suddenly understanding something about the universe. That is what scientists find appealing about science, and I enjoy seeing the same thing in science fiction.
  • when you mention myth or mythic structure, yes, I don’t think myths can do that, because in general, myths reflect a pre-industrial view of the world. I don’t know if there is room in mythology for a strong conception of the future, other than an end-of-the-world or Armageddon scenario …
Javier E

New Thinking and Old Books Revisited - NYTimes.com - 0 views

  • Mark Thoma’s classic crack — “I’ve learned that new economic thinking means reading old books” — has a serious point to it. We’ve had a couple of centuries of economic thought at this point, and quite a few smart people doing the thinking. It’s possible to come up with truly new concepts and approaches, but it takes a lot more than good intentions and casual observation to get there.
  • There is definitely a faction within economics that considers it taboo to introduce anything into its analysis that isn’t grounded in rational behavior and market equilibrium
  • what I do, and what everyone I’ve just named plus many others does, is a more modest, more eclectic form of analysis. You use maximization and equilibrium where it seems reasonably consistent with reality, because of its clarifying power, but you introduce ad hoc deviations where experience seems to demand them — downward rigidity of wages, balance-sheet constraints, bubbles (which are hard to predict, but you can say a lot about their consequences).
  • ...4 more annotations...
  • You may say that what we need is reconstruction from the ground up — an economics with no vestige of equilibrium analysis. Well, show me some results. As it happens, the hybrid, eclectic approach I’ve just described has done pretty well in this crisis, so you had better show me some really superior results before it gets thrown out the window.
  • if you think you’ve found a fundamental logical flaw in one of our workhorse economic models, the odds are very strong that you’ve just made a mistake.
  • it’s quite clear that the teaching of macroeconomics has gone seriously astray. As Saraceno says, the simple models that have proved so useful since 2008 are by and large taught only at the undergrad level — they’re treated as too simple, too ad hoc, whatever, to make it into the grad courses even at places that aren’t very ideological.
  • to temper your modeling with a sense of realism you need to know something about reality — and not just the statistical properties of U.S. time series since 1947. Economic history — global economic history — should be a core part of the curriculum. Nobody should be making pronouncements on macro without knowing a fair bit about the collapse of the gold standard in the 1930s, what actually happened in the stagflation of the 1970s, the Asian financial crisis of the 90s, and, looking forward, the euro crisis.
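The "equilibrium plus ad hoc deviation" approach described above can be made concrete with a toy model. The sketch below (not from the post; all numbers and functional forms are illustrative assumptions) solves a linear labor market for its clearing wage, then re-solves it with a downward-rigid wage floor — the kind of ad hoc deviation, like downward wage rigidity, that the excerpt mentions:

```python
def labor_demand(w):
    """Firms hire less as the wage rises (illustrative linear demand)."""
    return 100 - 2 * w

def labor_supply(w):
    """Workers supply more as the wage rises (illustrative linear supply)."""
    return 10 + w

def equilibrium_wage():
    """Market-clearing wage where demand equals supply: 100 - 2w = 10 + w."""
    return 90 / 3  # w* = 30

def employment_with_floor(wage_floor):
    """With downward wage rigidity, the wage cannot fall below the floor.

    Employment is then the short side of the market; any excess supply
    at the rigid wage shows up as unemployment.
    """
    w = max(equilibrium_wage(), wage_floor)
    employed = min(labor_demand(w), labor_supply(w))
    unemployed = max(labor_supply(w) - labor_demand(w), 0)
    return w, employed, unemployed

# Flexible wages: the market clears at w* = 30 with no unemployment.
print(employment_with_floor(0))    # → (30.0, 40.0, 0.0)
# A rigid wage above w*: demand falls short of supply -> unemployment.
print(employment_with_floor(40))   # → (40, 20, 30)
```

With flexible wages the equilibrium machinery does all the work; adding one constraint (the floor) is enough to generate persistent unemployment, which is the eclectic-hybrid point the excerpt is making.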