TOK Friends: Group items matching "Reading" in title, tags, annotations or url


Emilio Ergueta

The truth about Darwin and God - Salon.com - 0 views

  • as Darwin himself often confessed, natural selection cannot work without prior variations in the organisms that will be selected or not for survival.
  • whatever “Darwinism” is, this is not a book about Darwinism. Nor is it a book about contemporary evolutionary theory or the “new synthesis” or the “extended synthesis.” It is rather a book about “chance” in Darwin’s writing. To that extent it must confront “Darwinism” more broadly, even in its recent and contemporary incarnations, if only to situate the problems it deals with in a proper context.
  • My view is that “Darwinism” had a single meaning to Darwin from beginning to end. Yes, changes were made in exposition, over and over again, and in one sense, as a philosophical platitude, one cannot change one’s way of saying something without changing what one says, and therefore what one is taken to mean.
  • An examination of the Darwinian corpus shows that many of the most important changes centered on how he wished to present the role of “chance” in evolution to an ever-expanding reading public, especially after the Origin first appeared.
  • But I try to set out and move through with a clean slate, basing my claims on Darwin’s ipsissima verba rather than on what others have said.
  • The earth, its geological features, and its organic inhabitants are here only through lucky accidents? For many people that was a hard pill to swallow. Darwin did accept it, but also knew he would have to get his audience to accept it too if he were to succeed in establishing his theory as the correct account of the origin of species.
Emilio Ergueta

Nietzsche on Love | Issue 104 | Philosophy Now - 0 views

  • What could Friedrich Nietzsche (1844-1900) have to teach us about love? More than we might suppose.
  • Even during these times, between physical suffering and intense periods of writing, he pursued the company of learned women. Moreover, Nietzsche grew up in a family of women, turned to women for friendship, and witnessed his friends courting
  • By calling our attention to the base, vulgar and selfish qualities of (heterosexual) erotic or sexual love, Nietzsche aims to strip love of its privileged status and demonstrate that what we conceive to be its opposites, such as egoism and greed, are in many instances inextricably bound up in the experience of love.
  • In doing so, Nietzsche disassociates love from its other-worldly Christian-Platonic heritage, and so asserts his ethical claims concerning the value of the Earth over the other-worldly, and the truth of the body over the sacred.
  • Nietzsche speaks critically about the possessive or tyrannical qualities of masculine love alongside its fictionalising tendencies, stating that the natural functions of a woman’s body disgust men because they prevent him having complete access to her as a possession; they also encroach upon the conceptual perfection of love. He writes, “‘The human being under the skin’ is for all lovers a horror and unthinkable, a blasphemy against God and love.”
  • He proposes that love is close to greed and the lust for possession. Love is an instinctual force related to our biological and cultural drives, and as such, cannot be considered a moral good (GS 363).
  • Nietzsche pointedly distinguishes masculine from feminine love by the notions of devotion and fidelity. Whereas women want to surrender completely to love, to approach it as a faith, “to be taken and accepted as a possession” (363), Nietzsche claims male love hinges upon the possessive thirst to acquire more from the lover, and states that men who are inclined towards complete devotion are “not men.”
  • In other words, the experiences of both greed and love are the same drive or instinct, but depending upon the level of satisfaction one has achieved, this drive will be alternatively named ‘greed’ or ‘love’: satisfied people who feel their possessions (their lover for example) threatened by others will name other’s instinct for gain greed or avarice, whereas those who are still searching out something new to desire will impose a positive evaluation on that instinct and call it ‘love’.
  • In order to be successful in love, he counsels women to “simulate a lack of love” and to enact the roles that men find attractive. Nietzsche finds love comedic because it does not consist in some attempt to know the other deeply, but rather in the confirmation of male fantasies in which women perform their constructed gender roles.
  • Nietzsche’s writings on love have not surprisingly been influential on many feminist reflections on sex/gender. Although he is not making moralising claims about how one should love, his discussion of the difficult impact erotic and romantic relationships have on women, as well as his commentary on the ironies both sexes face in love, force his readers of both sexes to examine the roles that they play in love. It is difficult when reading him not to question one’s own performances in romantic relationships.
Javier E

Digital Dog Collar - NYTimes.com - 0 views

  • I hate the new Apple Watch. Hate what it will do to conversation, to the pace of the day, to my friends, to myself. I hate that it will enable the things that already make life so incremental, now-based and hyper-connected. That, and make things far worse.
  • People check their phones about 150 times a day. Now, imagine how many glances they’ll take with all the information in the world on their wrists.
  • To the complaints that our smartphone addiction has produced a world where nobody talks much anymore, nobody listens and nobody reads, you can add a new one with the smartwatch: nobody makes eye contact.
  • “The Apple Watch is the most personal device we have ever created,” he said. “It’s not just with you, it’s on you.”
  • From here on out, there is no down time, and no excuses for reality escapes. You are connected, 24/7.
  • There is some evidence that heavy smartphone use makes you dumber. The theory is that having the world at the other end of a mobile search makes for lazy minds, while people who depend less on their devices develop more analytical skills.
  • Add to this concerns about privacy: that the watch is a tracking device, which sends all your personal information to a central database — a corporate control center that already knows far too much about the preferences and habits of smartphone users.
Javier E

Ta-Nehisi Coates defines a new race beat - Columbia Journalism Review - 0 views

  • “The Case for Reparations,” Coates’ 16,000-word cover story for The Atlantic, where he is a national correspondent. Published online in May, it was a close look at housing discrimination, such as redlining, that was really about the need for America to take a brutally honest look in the mirror and acknowledge its deep racial divisions.
  • The story broke a single-day traffic record for a magazine story on The Atlantic’s website, and in its wake, Politico named him to its list of 50 thinkers changing American politics
  • Coates believes that if there is an answer to contemporary racism, it lies in confronting the past.
  • For Coates, true equality means “black people in this country have the right to be as mediocre as white people,” he says. “Not that individual black people will be as excellent, or more excellent, than other white people.”
  • he came to see black respectability—the idea that, to succeed, African-Americans must stoically prevail against the odds and be “twice as good” as white people to get the same rights—as deeply immoral.
  • He is no soothsayer, telling people what to think from on high, but rather is refreshingly open about what he doesn’t know, inviting readers to learn with him. Coates is not merely an ivory-tower pontificator or a shiny Web 2.0 brand. He is a public intellectual for the digital age.
  • we miss the real question of why there is a systemic, historical difference in the way police treat blacks versus whites.
  • Another term for that road is “white supremacy.” This refers not so much to hate groups, but, as Coates defines it, a system of policies and beliefs that aims to keep African-Americans as “a peon class.”
  • To be “white” in this sense does not refer merely to skin color but to the degree that someone qualifies as “normal,” and thus worthy of the same rights as all Americans
  • The pool where all these ideas eventually arrive is a question: “How big-hearted can democracy be?” he says. “How many people can it actually include and sustain itself? That is the question I’m asking over and over again.”
  • it is a question of empathy. Are humans capable of forming a society where everyone can flourish?
  • there was the coverage of Michael Brown (or Jordan Davis, or Renisha McBride, or Eric Garner): unarmed African-Americans killed by police or others under controversial circumstances. In each case, the storyline was that these horrific encounters were caused either by genuine provocation, or by race-fueled fear or hatred. Either way, they were stories of personal failings.
  • When an event becomes news, there is often an implication that it is an exception—that the world is mostly working as it should and this event is newsworthy because it’s an aberration. If the race-related stories we see most often in the media are about personal bigotry, then our conception of racism is limited to the bigoted remarks or actions—racism becomes little more than uttering the n-word.
  • he cites research that in 1860 slaves were the largest asset in the US economy. “It is almost impossible to think of democracy, as it was formed in America, without the enslavement of African-Americans,” he says. “Not that these things were bumps in the road along the way, but that they were the road.”
  • a lack of historical perspective in the media’s approach to race. “Journalism privileges what’s happening now over the long reasons for things happening,” he says. “And for African-Americans, that has a particular effect.”
  • Even the very existence of racism is questioned: A recent study published by the Association of Psychological Science has shown that whites think they are discriminated against due to race as much if not more than blacks.
  • “So when you’re talking about something like institutional racism and prejudice, how do you talk about that as an objective reality?”
  • Coates’ strength is in connecting contemporary problems to historical scholarship. “I think if I bring anything to the table it’s the ability to synthesize all of that into something that people find emotionally moving,” he says. The irony of the reparations piece, as unoriginal as it may have been to scholars, is that it was news to many people.
  • Reporting on race requires simultaneously understanding multiple, contradictory worlds, with contradictory narratives. Widespread black poverty exists; so do a black middle class and a black president
  • Progress is key to the myth of American Exceptionalism, and the notion that America is built on slavery and on freedom are discordant ideas that threaten any simple storyline. Coates, together with others who join him, is trying to claim the frontier of a new narrative.
  • reading Coates is like building a worldview, piece by piece, on an area of contemporary life that’s otherwise difficult to grasp.
  • “To come and tell someone may not be as effective in convincing them as allowing them to learn on their own. If you believe you come to a conclusion on your own, you’re more likely to agree.”
  • It’s brave to bare yourself intellectually on the Web, and to acknowledge mistakes, especially when the capital that public intellectuals appear to have is their ability to be “right.”
  • Coates is equally demanding of his followers. Online he is blunt, and willing to call people out. He cares enough to be rigorous
  • despite being a master of online engagement, Coates insists he does not write for others, an idea he explained in a recent post: “I have long believed that the best part of writing is not the communication of knowledge to other people, but the acquisition and synthesizing of knowledge for oneself. The best thing I can say about the reparations piece is that I now understand.”
  • To him, it’s an open question whether or not America will ever be capable of fostering true equality. “How big-hearted can democracy be? It points to a very ugly answer: maybe not that big-hearted at all. That in fact America is not exceptional. That it’s just like every other country. That it passes its democracy and it passes all these allegedly big-hearted programs [the New Deal, the G.I. Bill] but still excludes other people,
  • In a 2010 post about antebellum America, Coates mentioned feminist and abolitionist Angelina Grimke. “Suffice to say that much like Abe Lincoln, and Ulysses Grant, Angelina Grimke was a Walker,” he wrote. “What was the Walker reference?” Rosemartian asked in the comments section. “Just someone who spends their life evolving, or, walking,” Coates replied. “Grant and Lincoln fit in there for me. Malcolm X was another Walker. Walkers tend to be sometimes—even often—wrong. But they are rarely bigots, in the sense of nakedly clinging to ignorance.”
Javier E

The Death of Adulthood in American Culture - NYTimes.com - 0 views

  • It seems that, in doing away with patriarchal authority, we have also, perhaps unwittingly, killed off all the grown-ups.
  • The journalist and critic Ruth Graham published a polemical essay in Slate lamenting the popularity of young-adult fiction among fully adult readers. Noting that nearly a third of Y.A. books were purchased by readers ages 30 to 44 (most of them presumably without teenage children of their own), Graham insisted that such grown-ups “should feel embarrassed about reading literature for children.”
  • In my main line of work as a film critic, I have watched over the past 15 years as the studios committed their vast financial and imaginative resources to the cultivation of franchises (some of them based on those same Y.A. novels) that advance an essentially juvenile vision of the world. Comic-book movies, family-friendly animated adventures, tales of adolescent heroism and comedies of arrested development do not only make up the commercial center of 21st-century Hollywood. They are its artistic heart.
  • At sea or in the wilderness, these friends managed to escape both from the institutions of patriarchy and from the intimate authority of women, the mothers and wives who represent a check on male freedom.
  • What all of these shows grasp at, in one way or another, is that nobody knows how to be a grown-up anymore. Adulthood as we have known it has become conceptually untenable.
  • From the start, American culture was notably resistant to the claims of parental authority and the imperatives of adulthood. Surveying the canon of American literature in his magisterial “Love and Death in the American Novel,” Leslie A. Fiedler suggested, more than half a century before Ruth Graham, that “the great works of American fiction are notoriously at home in the children’s section of the library.”
  • “The typical male protagonist of our fiction has been a man on the run, harried into the forest and out to sea, down the river or into combat — anywhere to avoid ‘civilization,’ which is to say the confrontation of a man and woman which leads to the fall to sex, marriage and responsibility. One of the factors that determine theme and form in our great books is this strategy of evasion, this retreat to nature and childhood which makes our literature (and life!) so charmingly and infuriatingly ‘boyish.’ ”
  • What Fiedler notes, and what most readers of “Huckleberry Finn” will recognize, is Twain’s continual juxtaposition of Huck’s innocence and instinctual decency with the corruption and hypocrisy of the adult world.
  • we’ve also witnessed the erosion of traditional adulthood in any form, at least as it used to be portrayed in the formerly tried-and-true genres of the urban cop show, the living-room or workplace sitcom and the prime-time soap opera. Instead, we are now in the age of “Girls,” “Broad City,” “Masters of Sex” (a prehistory of the end of patriarchy), “Bob’s Burgers” (a loopy post-“Simpsons” family cartoon) and a flood of goofy, sweet, self-indulgent and obnoxious improv-based web videos.
  • we have a literature of boys’ adventures and female sentimentality. Or, to put it another way, all American fiction is young-adult fiction.
  • The bad boys of rock ‘n’ roll and the pouting screen rebels played by James Dean and Marlon Brando proved Fiedler’s point even as he was making it. So did Holden Caulfield, Dean Moriarty, Augie March and Rabbit Angstrom — a new crop of semi-antiheroes
  • We devolve from Lenny Bruce to Adam Sandler, from “Catch-22” to “The Hangover,” from “Goodbye, Columbus” to “The Forty-Year-Old Virgin.”
  • Unlike the antiheroes of eras past, whose rebellion still accepted the fact of adulthood as its premise, the man-boys simply refused to grow up, and did so proudly. Their importation of adolescent and preadolescent attitudes into the fields of adult endeavor (see “Billy Madison,” “Knocked Up,” “Step Brothers,” “Dodgeball”) delivered a bracing jolt of subversion, at least on first viewing. Why should they listen to uptight bosses, stuck-up rich guys and other readily available symbols of settled male authority?
  • That was only half the story, though. As before, the rebellious animus of the disaffected man-child was directed not just against male authority but also against women.
  • their refusal of maturity also invites some critical reflection about just what adulthood is supposed to mean. In the old, classic comedies of the studio era — the screwbally roller coasters of marriage and remarriage, with their dizzying verbiage and sly innuendo — adulthood was a fact. It was inconvertible and burdensome but also full of opportunity. You could drink, smoke, flirt and spend money.
  • The desire of the modern comic protagonist, meanwhile, is to wallow in his own immaturity, plumbing its depths and reveling in its pleasures.
sgardner35

A High School Is Making All Of Its Female Students Get Their Prom Dresses 'Pre-Approved' - MTV - 0 views

  • While students at Delone Catholic High School in Pennsylvania will have a prom to attend, they will not necessarily be allowed to wear what they want. The school has a new policy—instated this year—that requires “all young women” who plan on attending the prom—whether they attend the school or are the guest of a student—to “submit a photo of the gown that will be worn to the prom for pre-approval.”
  • The petition reads, “Our children will not undergo scrutiny of prom gowns based on outdated, unrealistic expectations and rules implemented at such short notice.”
  • “distracting” and “unacceptable” and the #ClothingHasNoGender campaign that was launched by his friends in response.
maddieireland334

Is your teen using apps to keep this secret? - CNN.com - 0 views

  • And if you think the only teens who sext are the ones engaging in high-risk behaviors, like drinking, using drugs or skipping school, keep reading. Studies suggest that sexting is more common than many parents might realize or want to admit.
Javier E

The Power of Nudges, for Good and Bad - The New York Times - 0 views

  • Nudges, small design changes that can markedly affect individual behavior, have been catching on. These techniques rely on insights from behavioral science
  • when used ethically, they can be very helpful. But we need to be sure that they aren’t being employed to sway people to make bad decisions that they will later regret.
  • Three principles should guide the use of nudges:
    ■ All nudging should be transparent and never misleading.
    ■ It should be as easy as possible to opt out of the nudge, preferably with as little as one mouse click.
    ■ There should be good reason to believe that the behavior being encouraged will improve the welfare of those being nudged.
  • the government teams in Britain and the United States that have focused on nudging have followed these guidelines scrupulously.
  • the private sector is another matter. In this domain, I see much more troubling behavior.
  • Many companies are nudging purely for their own profit and not in customers’ best interests. In a recent column in The New York Times, Robert Shiller called such behavior “phishing.” Mr. Shiller and George Akerlof, both Nobel-winning economists, have written a book on the subject, “Phishing for Phools.”
  • Some argue that phishing — or evil nudging — is more dangerous in government than in the private sector. The argument is that government is a monopoly with coercive power, while we have more choice in the private sector over which newspapers we read and which airlines we fly.
  • I think this distinction is overstated. In a democracy, if a government creates bad policies, it can be voted out of office. Competition in the private sector, however, can easily work to encourage phishing rather than stifle it.
  • One example is the mortgage industry in the early 2000s. Borrowers were encouraged to take out loans that they could not repay when real estate prices fell. Competition did not eliminate this practice, because it was hard for anyone to make money selling the advice “Don’t take that loan.”
Javier E

The Rich Have Higher Level of Narcissism, Study Shows | TIME.com - 1 views

  • The rich really are different — and, apparently more self-absorbed, according to the latest research.
  • Recent studies show, for example, that wealthier people are more likely to cut people off in traffic and to behave unethically in simulated business and charity scenarios.
  • Earlier this year, statistics on charitable giving revealed that while the wealthy donate about 1.3% of their income to charity, the poorest actually give more than twice as much as a proportion of their earnings — 3.2%.
  • In five different experiments involving several hundred undergraduates and 100 adults recruited from online communities, the researchers found higher levels of both narcissism and entitlement among those of higher income and social class.
  • when asked to visually depict themselves as circles, with size indicating relative importance, richer people picked larger circles for themselves and smaller ones for others. Another experiment found that they also looked in the mirror more frequently.
  • The wealthier participants were also more likely to agree with statements like “I honestly feel I’m just more deserving than other people
  • But which came first — did gaining wealth increase self-aggrandizement? Were self-infatuated people more likely to seek and then gain riches
  • To explore that relationship further, the researchers also asked the college students in one experiment to report the educational attainment and annual income of their parents. Those with more highly educated and wealthier parents remained higher in their self-reported entitlement and narcissistic characteristics. “That would suggest that it’s not just [that] people who feel entitled are more likely to become wealthy,” says Piff. Wealth, in other words, may breed narcissistic tendencies — and wealthy people justify their excess by convincing themselves that they are more deserving of it
  • “The strength of the study is that it uses multiple methods for measuring narcissism and entitlement and social class and multiple populations, and that can really increase our confidence in the results,”
  • “This paper should not be read as saying that narcissists are more successful because we know from lots of other studies that that’s not true.
  • “entitlement is a facet of narcissism,” says Twenge. “And [it’s the] one most associated with high social class. It’s the idea that you deserve special treatment and that things will come to you without working hard.”
  • Manipulating the sense of entitlement, however, may provide a way to influence narcissism. In the final experiment in the paper, the researchers found that having participants list three benefits of seeing others as equals eliminated class differences in narcissism, while simply listing three daily activities did not.
  • In the meantime, the connection between wealth and entitlement could have troubling social implications. “You have this bifurcation of rich and poor,” says Levine. “The rich are increasingly entitled, and since they set the cultural tone for advertising and all those kinds of things, I think there’s a pervasive sense of entitlement.”
  • That could perpetuate a deepening lack of empathy that could fuel narcissistic tendencies. “You could imagine negative attitudes toward wealth redistribution as a result of entitlement,” says Piff. “The more severe inequality becomes, the more entitled people may feel and the less likely to share those resources they become.”
Javier E

The Eight-Second Attention Span - The New York Times - 3 views

  • A survey of Canadian media consumption by Microsoft concluded that the average attention span had fallen to eight seconds, down from 12 in the year 2000. We now have a shorter attention span than goldfish, the study found.
  • “The true scarce commodity” of the near future, he said, will be “human attention.”
  • there seems little doubt that our devices have rewired our brains. We think in McNugget time. The trash flows, unfiltered, along with the relevant stuff, in an eternal stream.
  • I can no longer wait in a grocery store line, or linger for a traffic light, or even pause long enough to let a bagel pop from the toaster, without reflexively reaching for my smartphone.
  • You see it in our politics, with fear-mongering slogans replacing anything that requires sustained thought. And the collapse of a fact-based democracy, where, for example, 60 percent of Trump supporters believe Obama was born in another country, has to be a byproduct of the pick-and-choose news from the buffet line of our screens.
  • I’ve found a pair of antidotes, very old school, for my shrinking attention span.
  • You plant something in the cold, wet soil of the fall
  • The second is deep reading
silveiragu

BBC - Future - The countries that don't exist - 2 views

  • In the deep future, every territory we know could eventually become a country that doesn’t exist.
    • silveiragu
       
      Contrary to the human expectation that situations remain constant. 
  • There really is a secret world of hidden independent nations
  • Middleton, however, is here to talk about countries missing from the vast majority of books and maps for sale here. He calls them the “countries that don’t exist”
    • silveiragu
       
      Reminds us of our strange relationship with nationalism-that we forget how artificial countries' boundaries are. 
  • The problem, he says, is that we don’t have a watertight definition of what a country is. “Which as a geographer, is kind of shocking
  • The globe, it turns out, is full of small (and not so small) regions that have all the trappings of a real country
  • and are ignored on most world maps.
  • Middleton, a geographer at the University of Oxford, has now charted these hidden lands in his new book, An Atlas of Countries that Don’t Exist
  • Middleton’s quest began, appropriately enough, with Narnia
    • silveiragu
       
      Interesting connection to imagination as a way of knowing.
  • a defined territory, a permanent population, a government, and “the capacity to enter into relations with other states”.
  • In Australia, meanwhile, the Republic of Murrawarri was founded in 2013, after the indigenous tribe wrote a letter to Queen Elizabeth II asking her to prove her legitimacy to govern their land.
  • Yet many countries that meet these criteria aren‘t members of the United Nations (commonly accepted as the final seal of a country’s statehood).
  • many of them are instead members of the “Unrepresented United Nations” – an alternative body to champion their rights.
  • A handful of the names will be familiar to anyone who has read a newspaper: territories such as Taiwan, Tibet, Greenland, and Northern Cyprus.
  • The others are less famous, but they are by no means less serious
    • silveiragu
       
      By what criterion, "serious"?
  • One of the most troubling histories, he says, concerns the Republic of Lakotah (with a population of 100,000). Bang in the centre of the United States of America (just east of the Rocky Mountains), the republic is an attempt to reclaim the sacred Black Hills for the Lakota Sioux tribe.
  • Their plight began in the 18th Century, and by 1868 they had finally signed a deal with the US government that promised the right to live on the Black Hills. Unfortunately, they hadn’t accounted for a gold rush
  • Similar battles are being fought across every continent.
  • In fact, you have almost certainly, unknowingly, visited one.
  • Christiania, an enclave in the heart of Copenhagen.
  • On 26 September that year, they declared it independent, with its own “direct democracy”, in which each of the inhabitants (now numbering 850) could vote on any important matter.
    • silveiragu
       
      Interesting reminder that the label "country" does not only have to arise from military or economic struggles, as is tempting to think in our study of history. Also, interesting reminder that the label of "country"-by itself-means nothing. 
  • a blind eye to the activities
    • silveiragu
       
      That is really why any interest is demonstrated towards this topic. Not that some country named Christiania exists in the heart of Denmark, but that they can legitimately call themselves a nation. We have grown up, and our parents have grown up, with a rigid definition of nationalism, and the strange notion that the lines in an atlas were always there. One interpretation of the Danish government's response to Christiania is simply that they do not know what to think. Although probably not geopolitically significant, such enclave states represent a challenge to our perception of countries, one which fascinates Middleton's readers because it disconcerts them.
  • perhaps we need to rethink the concept of the nation-state altogether? He points to Antarctica, a continent shared peacefully among the international community
    • silveiragu
       
      A sign of progress, perhaps, from the industrialism-spurred cycle of divide land, industrialize, and repeat-even if the chief reason is the region's climate. 
  • The last pages of Middleton’s Atlas contain two radical examples that question everything we think we mean by the word ‘country’.
    • silveiragu
       
      These "nonexistent countries"-and our collective disregard for them-are reminiscent of the 17th and 18th centuries: then, the notion of identifying by national lines was almost as strange and artificial as these countries' borders seem to us today.
  • “They all raise the possibility that countries as we know them are not the only legitimate basis for ordering the planet,
kushnerha

Philosophy's True Home - The New York Times - 0 views

  • We’ve all heard the argument that philosophy is isolated, an “ivory tower” discipline cut off from virtually every other progress-making pursuit of knowledge, including math and the sciences, as well as from the actual concerns of daily life. The reasons given for this are many. In a widely read essay in this series, “When Philosophy Lost Its Way,” Robert Frodeman and Adam Briggle claim that it was philosophy’s institutionalization in the university in the late 19th century that separated it from the study of humanity and nature, now the province of social and natural sciences.
  • This institutionalization, the authors claim, led it to betray its central aim of articulating the knowledge needed to live virtuous and rewarding lives. I have a different view: Philosophy isn’t separated from the social, natural or mathematical sciences, nor is it neglecting the study of goodness, justice and virtue, which was never its central aim.
  • identified philosophy with informal linguistic analysis. Fortunately, this narrow view didn’t stop them from contributing to the science of language and the study of law. Now long gone, neither movement defined the philosophy of its day and neither arose from locating it in universities.
  • The authors claim that philosophy abandoned its relationship to other disciplines by creating its own purified domain, accessible only to credentialed professionals. It is true that from roughly 1930 to 1950, some philosophers — logical empiricists, in particular — did speak of philosophy having its own exclusive subject matter. But since that subject matter was logical analysis aimed at unifying all of science, interdisciplinarity was front and center.
  • Philosophy also played a role in 20th-century physics, influencing the great physicists Albert Einstein, Niels Bohr and Werner Heisenberg. The philosophers Moritz Schlick and Hans Reichenbach reciprocated that interest by assimilating the new physics into their philosophies.
  • developed ideas relating logic to linguistic meaning that provided a framework for studying meaning in all human languages. Others, including Paul Grice and J.L. Austin, explained how linguistic meaning mixes with contextual information to enrich communicative contents and how certain linguistic performances change social facts. Today a new philosophical conception of the relationship between meaning and cognition adds a further dimension to linguistic science.
  • Decision theory — the science of rational norms governing action, belief and decision under uncertainty — was developed by the 20th-century philosophers Frank Ramsey, Rudolf Carnap, Richard Jeffrey and others. It plays a foundational role in political science and economics by telling us what rationality requires, given our evidence, priorities and the strength of our beliefs. Today, no area of philosophy is more successful in attracting top young minds.
  • Philosophy also assisted psychology in its long march away from narrow behaviorism and speculative Freudianism. The mid-20th-century functionalist perspective pioneered by Hilary Putnam was particularly important. According to it, pain, pleasure and belief are neither behavioral dispositions nor bare neurological states. They are interacting internal causes, capable of very different physical realizations, that serve the goals of individuals in specific ways. This view is now embedded in cognitive psychology and neuroscience.
  • philosopher-mathematicians Gottlob Frege, Bertrand Russell, Kurt Gödel, Alonzo Church and Alan Turing invented symbolic logic, helped establish the set-theoretic foundations of mathematics, and gave us the formal theory of computation that ushered in the digital age
  • Philosophy of biology is following a similar path. Today’s philosophy of science is less accessible than Aristotle’s natural philosophy chiefly because it systematizes a larger, more technically sophisticated body of knowledge.
  • Philosophy’s interaction with mathematics, linguistics, economics, political science, psychology and physics requires specialization. Far from fostering isolation, this specialization makes communication and cooperation among disciplines possible. This has always been so.
  • Nor did scientific progress rob philosophy of its former scientific subject matter, leaving it to concentrate on the broadly moral. In fact, philosophy thrives when enough is known to make progress conceivable, but it remains unachieved because of methodological confusion. Philosophy helps break the impasse by articulating new questions, posing possible solutions and forging new conceptual tools.
  • Our knowledge of the universe and ourselves expands like a ripple surrounding a pebble dropped in a pool. As we move away from the center of the spreading circle, its area, representing our secure knowledge, grows. But so does its circumference, representing the border where knowledge blurs into uncertainty and speculation, and methodological confusion returns. Philosophy patrols the border, trying to understand how we got there and to conceptualize our next move.  Its job is unending.
  • Although progress in ethics, political philosophy and the illumination of life’s meaning has been less impressive than advances in some other areas, it is accelerating.
  • the advances in our understanding because of careful formulation and critical evaluation of theories of goodness, rightness, justice and human flourishing by philosophers since 1970 compare well to the advances made by philosophers from Aristotle to 1970
  • The knowledge required to maintain philosophy’s continuing task, including its vital connection to other disciplines, is too vast to be held in one mind. Despite the often-repeated idea that philosophy’s true calling can only be fulfilled in the public square, philosophers actually function best in universities, where they acquire and share knowledge with their colleagues in other disciplines. It is also vital for philosophers to engage students — both those who major in the subject, and those who do not. Although philosophy has never had a mass audience, it remains remarkably accessible to the average student; unlike the natural sciences, its frontiers can be reached in a few undergraduate courses.
kushnerha

New Critique Sees Flaws in Landmark Analysis of Psychology Studies - The New York Times - 0 views

  • A landmark 2015 report that cast doubt on the results of dozens of published psychology studies has exposed deep divisions in the field, serving as a reality check for many working researchers but as an affront to others who continue to insist the original research was sound.
  • On Thursday, a group of four researchers publicly challenged the report, arguing that it was statistically flawed and, as a result, wrong. The 2015 report, called the Reproducibility Project, found that less than 40 studies in a sample of 100 psychology papers in leading journals held up when retested by an independent team. The new critique by the four researchers countered that when that team’s statistical methodology was adjusted, the rate was closer to 100 percent. Neither the original analysis nor the critique found evidence of fraud or manipulation of data.
  • “That study got so much press, and the wrong conclusions were drawn from it,” said Timothy D. Wilson, a professor of psychology at the University of Virginia and an author of the new critique. “It’s a mistake to make generalizations from something that was done poorly, and this we think was done poorly.”
  • countered that the critique was highly biased: “They are making assumptions based on selectively interpreting data and ignoring data that’s antagonistic to their point of view.”
  • The challenge comes as the field of psychology is facing a generational change, with young researchers beginning to share their data and study designs before publication, to improve transparency. Still, the new critique is likely to feed an already lively debate about how best to conduct and evaluate so-called replication projects of studies. Such projects are underway in several fields, scientists on both sides of the debate said.
  • “On some level, I suppose it is appealing to think everything is fine and there is no reason to change the status quo,” said Sanjay Srivastava, a psychologist at the University of Oregon, who was not a member of either team. “But we know too much, from many other sources, to put too much credence in an analysis that supports that remarkable conclusion.”
  • One issue the critique raised was how faithfully the replication team had adhered to the original design of the 100 studies it retested. Small alterations in design can make the difference between whether a study replicates or not, scientists say.
  • Another issue that the critique raised had to do with statistical methods. When Dr. Nosek began his study, there was no agreed-upon protocol for crunching the numbers. He and his team settled on five measures
  • He said that the original replication paper and the critique use statistical approaches that are “predictably imperfect” for this kind of analysis. One way to think about the dispute, Dr. Simonsohn said, is that the original paper found that the glass was about 40 percent full, and the critique argues that it could be 100 percent full. In fact, he said in an email, “State-of-the-art techniques designed to evaluate replications say it is 40 percent full, 30 percent empty, and the remaining 30 percent could be full or empty, we can’t tell till we get more data.”
kushnerha

There's nothing wrong with grade inflation - The Washington Post - 0 views

  • By the early ’90s, so long as one had the good sense to major in the humanities — all bets were off in the STEM fields — it was nearly impossible to get a final grade below a B-minus at an elite college. According to a 2012 study, the average college GPA, which in the 1930s was a C-plus, had risen to a B at public universities and a B-plus at private schools. At Duke, Pomona and Harvard, D’s and F’s combine for just 2 percent of all grades. A Yale report found that 62 percent of all Yale grades are A or A-minus. According to a 2013 article in the Harvard Crimson, the median grade at Harvard was an A-minus, while the most common grade was an A.
  • The result is widespread panic about grade inflation at elite schools. (The phenomenon is not as prevalent at community colleges and less-selective universities.) Some blame students’ consumer mentality, a few see a correlation with small class sizes (departments with falling enrollments want to keep students happy), and many cite a general loss of rigor in a touchy-feely age.
  • Yet whenever elite schools have tried to fight grade inflation, it’s been a mess. Princeton instituted strict caps on the number of high grades awarded, then abandoned the plan, saying the caps dissuaded applicants and made students miserable. At Wellesley, grade-inflated humanities departments mandated that the average result in their introductory and intermediate classes not exceed a B-plus. According to one study, enrollment fell by one-fifth, and students were 30 percent less likely to major in one of these subjects.
  • I liked the joy my students found when they actually earned a grade they’d been reaching for. But whereas I once thought we needed to contain grades, I now see that we may as well let them float skyward. If grade inflation is bad, fighting it is worse. Our goal should be ending the centrality of grades altogether. For years, I feared that a world of only A’s would mean the end of meaningful grades; today, I’m certain of it. But what’s so bad about that?
  • It’s easy to see why schools want to fight grade inflation. Grades should motivate certain students: those afraid of the stigma of a bad grade or those ambitious, by temperament or conditioning, to succeed in measurable ways. Periodic grading during a term, on quizzes, tests or papers, provides feedback to students, which should enable them to do better. And grades theoretically signal to others, such as potential employers or graduate schools, how well the student did. (Grade-point averages are also used for prizes and class rankings, though that doesn’t strike me as an important feature.)
  • But it’s not clear that grades work well as motivators. Although recent research on the effects of grades is limited, several studies in the 1970s, 1980s and 1990s measured how students related to a task or a class when it was graded compared to when it was ungraded. Overall, graded students are less interested in the topic at hand and — for obvious, common-sense reasons — more inclined to pick the easiest possible task when given the chance. In the words of progressive-education theorist Alfie Kohn, author of “The Homework Myth,” “the quality of learning declines” when grades are introduced, becoming “shallower and more superficial when the point is to get a grade.”
  • Even where grades can be useful, as in describing what material a student has mastered, they are remarkably crude instruments. Yes, the student who gets a 100 on a calculus exam probably grasps the material better than the student with a 60 — but only if she retains the knowledge, which grades can’t show.
  • I still can’t say very well what separates a B from an A. What’s more, I never see the kind of incompetence or impudence that would merit a D or an F. And now, in our grade-inflated world, it’s even harder to use grades to motivate, or give feedback, or send a signal to future employers or graduate schools.
  • According to a 2012 study by the Chronicle of Higher Education, GPA was seventh out of eight factors employers considered in hiring, behind internships, extracurricular activities and previous employment. Last year, Stanford’s registrar told the Chronicle about “a clamor” from employers “for something more meaningful” than the traditional transcript. The Lumina Foundation gave a $1.27 million grant to two organizations for college administrators working to develop better student records, with grades only one part of a student’s final profile.
  • Some graduate schools, too, have basically ditched grades. “As long as you don’t bomb and flunk out, grades don’t matter very much in M.F.A. programs,” the director of one creative-writing program told the New York Times. To top humanities PhD programs, letters of reference and writing samples matter more than overall GPA (although students are surely expected to have received good grades in their intended areas of study). In fact, it’s impossible to get into good graduate or professional schools without multiple letters of reference, which have come to function as the kind of rich, descriptive comments that could go on transcripts in place of grades.
  • suggests that GPAs serve not to validate students from elite schools but to keep out those from less-prestigious schools and large public universities, where grades are less inflated. Grades at community colleges “have actually dropped” over the years, according to Stuart Rojstaczer, a co-author of the 2012 grade-inflation study. That means we have two systems: one for students at elite schools, who get jobs based on references, prestige and connections, and another for students everywhere else, who had better maintain a 3.0. Grades are a tool increasingly deployed against students without prestige.
  • The trouble is that, while it’s relatively easy for smaller colleges to go grade-free, with their low student-to-teacher ratios, it’s tough for professors at larger schools, who must evaluate more students, more quickly, with fewer resources. And adjuncts teaching five classes for poverty wages can’t write substantial term-end comments, so grades are a necessity if they want to give any feedback at all.
  • It would mean hiring more teachers and paying them better (which schools should do anyway). And if transcripts become more textured, graduate-school admission offices and employers will have to devote more resources to reading them, and to getting to know applicants through interviews and letters of reference — a salutary trend that is underway already.
  • When I think about getting rid of grades, I think of happier students, with whom I have more open, democratic relationships. I think about being forced to pay more attention to the quiet ones, since I’ll have to write something truthful about them, too. I’ve begun to wonder if a world without grades may be one of those states of affairs (like open marriages, bicycle lanes and single-payer health care) that Americans resist precisely because they seem too good, suspiciously good. Nothing worth doing is supposed to come easy.
  • Alfie Kohn, too, sees ideology at work in the grade-inflation panic. “Most of what powers the arguments against grade inflation is a very right-wing idea that excellence consists in beating everyone else around you,” he says. “Even when you have sorted them — even when they get to Harvard! — we have to sort them again.” In other words, we can trust only a system in which there are clear winners and losers.
anonymous

A Bad Review Is Forever: How to Counter Online Complaints - The New York Times - 0 views

  • One of his new Slapfish restaurants, serving sustainable seafood, was hit this year with dozens of bad reviews that complained about its prices (too high) and portions (too small).
  • “You can get buried by bad reviews,” said Mr. Gruel, whose fast-casual restaurants serve food like fish tacos and lobster burgers. “So it’s a race to stop the bleeding.”
  • “Star ratings persist forever,”
  • “Meanwhile, actual reviews can fall off the first pages of review sites. And consumers rarely read reviews older than three months.”
  • “Your customers are already talking about you,” Mr. Campbell said. “So you can’t just ignore them. Anyway, businesses that engage with their customers are growing.”
  • He fears using “canned responses that aren’t personal.”
  • “The minute you see a bad review, look for a shard of truth,”
  • “I want people to know my true heart,” said Ms. Piercy, who doesn’t want to outsource her review scans either. “And I’m thankful for their feedback.”
  • “I look at the quality of the review,” she said.
Javier E

The Unpopular Virtue of Moral Certainty | Foreign Policy - 1 views

  • We are different, of course. Our household gods are not Plato and Aristotle — philosophers of a fixed cosmos — but Darwin and Freud.
  • We know the past better than Adams did, but it speaks to us from a far greater remove. And our implicit notion of what lies at the bottom of history is not a moral but a psychological one
  • What does Adams have to say to us today? I have trouble answering this question without resorting to Adams’s own habits of thought — without, that is, thinking in moral rather than psychological terms. Born in 1767, old enough to have seen the Battle of Bunker Hill with his own eyes, drilled by both parents in the imperishable virtues of republicanism, Adams exalted the ideal of public service to a degree that almost beggars our imagination.
  • after five years of reading, writing, and thinking about Adams, I’ve concluded that he really wasn’t like us at all. Of course his consciousness was different, but I imagine he was different even in the workings of his subconscious. Living in a moral rather than a psychological world, a world that does not acknowledge a subconscious realm, makes you radically different, especially if, like Adams, you have fashioned your entire life around principle
  • “I know few things in modern times so grand as that old man … a President’s son, himself a President, standing there the champion of the neediest of the oppressed.”
  • What, then, does Adams say to us — at least in the moral terms with which he, himself, would have been familiar? He says that a man can inscribe himself in the annals of posterity not only despite, but because of, his indifference to popular opinion. He might even, as Adams did, gain the esteem of his fellow man in his own lifetime, though he could do so only by virtue of not seeking it.
Javier E

After the Fact - The New Yorker - 1 views

  • newish is the rhetoric of unreality, the insistence, chiefly by Democrats, that some politicians are incapable of perceiving the truth because they have an epistemological deficit: they no longer believe in evidence, or even in objective reality.
  • the past of proof is strange and, on its uncertain future, much in public life turns. In the end, it comes down to this: the history of truth is cockamamie, and lately it’s been getting cockamamier.
  • Michael P. Lynch is a philosopher of truth. His fascinating new book, “The Internet of Us: Knowing More and Understanding Less in the Age of Big Data,” begins with a thought experiment: “Imagine a society where smartphones are miniaturized and hooked directly into a person’s brain.” As thought experiments go, this one isn’t much of a stretch. (“Eventually, you’ll have an implant,” Google’s Larry Page has promised, “where if you think about a fact it will just tell you the answer.”) Now imagine that, after living with these implants for generations, people grow to rely on them, to know what they know and forget how people used to learn—by observation, inquiry, and reason. Then picture this: overnight, an environmental disaster destroys so much of the planet’s electronic-communications grid that everyone’s implant crashes. It would be, Lynch says, as if the whole world had suddenly gone blind. There would be no immediate basis on which to establish the truth of a fact. No one would really know anything anymore, because no one would know how to know. I Google, therefore I am not.
  • In England, the abolition of trial by ordeal led to the adoption of trial by jury for criminal cases. This required a new doctrine of evidence and a new method of inquiry, and led to what the historian Barbara Shapiro has called “the culture of fact”: the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth and the only kind of evidence that’s admissible not only in court but also in other realms where truth is arbitrated. Between the thirteenth century and the nineteenth, the fact spread from law outward to science, history, and journalism.
  • Lynch isn’t terribly interested in how we got here. He begins at the arrival gate. But altering the flight plan would seem to require going back to the gate of departure.
  • Lynch thinks we are frighteningly close to this point: blind to proof, no longer able to know. After all, we’re already no longer able to agree about how to know. (See: climate change, above.)
  • Empiricists believed they had deduced a method by which they could discover a universe of truth: impartial, verifiable knowledge. But the movement of judgment from God to man wreaked epistemological havoc.
  • For the length of the eighteenth century and much of the nineteenth, truth seemed more knowable, but after that it got murkier. Somewhere in the middle of the twentieth century, fundamentalism and postmodernism, the religious right and the academic left, met up: either the only truth is the truth of the divine or there is no truth; for both, empiricism is an error.
  • That epistemological havoc has never ended: much of contemporary discourse and pretty much all of American politics is a dispute over evidence. An American Presidential debate has a lot more in common with trial by combat than with trial by jury,
  • came the Internet. The era of the fact is coming to an end: the place once held by “facts” is being taken over by “data.” This is making for more epistemological mayhem, not least because the collection and weighing of facts require investigation, discernment, and judgment, while the collection and analysis of data are outsourced to machines
  • “Most knowing now is Google-knowing—knowledge acquired online,”
  • We now only rarely discover facts, Lynch observes; instead, we download them.
  • “The Internet didn’t create this problem, but it is exaggerating it,”
  • nothing could be less well settled in the twenty-first century than whether people know what they know from faith or from facts, or whether anything, in the end, can really be said to be fully proved.
  • In his 2012 book, “In Praise of Reason,” Lynch identified three sources of skepticism about reason: the suspicion that all reasoning is rationalization, the idea that science is just another faith, and the notion that objectivity is an illusion. These ideas have a specific intellectual history, and none of them are on the wane.
  • Their consequences, he believes, are dire: “Without a common background of standards against which we measure what counts as a reliable source of information, or a reliable method of inquiry, and what doesn’t, we won’t be able to agree on the facts, let alone values.
  • When we Google-know, Lynch argues, we no longer take responsibility for our own beliefs, and we lack the capacity to see how bits of facts fit into a larger whole
  • Essentially, we forfeit our reason and, in a republic, our citizenship. You can see how this works every time you try to get to the bottom of a story by reading the news on your smartphone.
  • what you see when you Google “Polish workers” is a function of, among other things, your language, your location, and your personal Web history. Reason can’t defend itself. Neither can Google.
  • Trump doesn’t reason. He’s a lot like that kid who stole my bat. He wants combat. Cruz’s appeal is to the judgment of God. “Father God, please . . . awaken the body of Christ, that we might pull back from the abyss,” he preached on the campaign trail. Rubio’s appeal is to Google.
  • Is there another appeal? People who care about civil society have two choices: find some epistemic principles other than empiricism on which everyone can agree or else find some method other than reason with which to defend empiricism
  • Lynch suspects that doing the first of these things is not possible, but that the second might be. He thinks the best defense of reason is a common practical and ethical commitment.
  • That, anyway, is what Alexander Hamilton meant in the Federalist Papers, when he explained that the United States is an act of empirical inquiry: “It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.”
kushnerha

In Science, It's Never 'Just a Theory' - The New York Times - 0 views

  • In everyday conversation, we tend to use the word “theory” to mean a hunch, an idle speculation, or a crackpot notion.
  • That’s not what “theory” means to scientists. “In science, the word theory isn’t applied lightly,” Kenneth R. Miller, a cell biologist at Brown University, said. “It doesn’t mean a hunch or a guess. A theory is a system of explanations that ties together a whole bunch of facts. It not only explains those facts, but predicts what you ought to find from other observations and experiments.”
  • In 2002, the board of education in Cobb County, Ga., adopted the textbook but also required science teachers to put a warning sticker inside the cover of every copy. “Evolution is a theory, not a fact, regarding the origin of living things,” the sticker read, in part. In 2004, several Cobb County parents filed a lawsuit against the county board of education to have the stickers removed. They called Dr. Miller, who testified for about two hours, explaining, among other things, the strength of evidence for the theory of evolution.
  • It’s helpful, he argues, to think about theories as being like maps. “To say something is a map is not to say it’s a hunch,” said Dr. Godfrey-Smith, a professor at the City University of New York and the University of Sydney. “It’s an attempt to represent some territory.” A theory, likewise, represents a territory of science. Instead of rivers, hills, and towns, the pieces of the territory are facts. “To call something a map is not to say anything about how good it is,” Dr. Godfrey-Smith added. “There are fantastically good maps where there’s not a shred of doubt about their accuracy. And there are maps that are speculative.”
  • To judge a map’s quality, we can see how well it guides us through its territory. In a similar way, scientists test out new theories against evidence. Just as many maps have proven to be unreliable, many theories have been cast aside. But other theories have become the foundation of modern science, such as the theory of evolution, the general theory of relativity, the theory of plate tectonics, the theory that the sun is at the center of the solar system, and the germ theory of disease. “To the best of our ability, we’ve tested them, and they’ve held up,” said Dr. Miller. “And that’s why we’ve held on to these things.”
Javier E

The Yoda of Silicon Valley - The New York Times - 0 views

  • Of course, all the algorithmic rigmarole is also causing real-world problems. Algorithms written by humans — tackling harder and harder problems, but producing code embedded with bugs and biases — are troubling enough
  • More worrisome, perhaps, are the algorithms that are not written by humans, algorithms written by the machine, as it learns.
  • Programmers still train the machine, and, crucially, feed it data
  • However, as Kevin Slavin, a research affiliate at M.I.T.’s Media Lab said, “We are now writing algorithms we cannot read. That makes this a unique moment in history, in that we are subject to ideas and actions and efforts by a set of physics that have human origins without human comprehension.
  • As Slavin has often noted, “It’s a bright future, if you’re an algorithm.”
  • “Today, programmers use stuff that Knuth, and others, have done as components of their algorithms, and then they combine that together with all the other stuff they need,”
  • “With A.I., we have the same thing. It’s just that the combining-together part will be done automatically, based on the data, rather than based on a programmer’s work. You want A.I. to be able to combine components to get a good answer based on the data
  • But you have to decide what those components are. It could happen that each component is a page or chapter out of Knuth, because that’s the best possible way to do some task.”
  • “I am worried that algorithms are getting too prominent in the world,” he added. “It started out that computer scientists were worried nobody was listening to us. Now I’m worried that too many people are listening.”