Home/ TOK Friends/ Group items tagged boundaries


carolinewren

Humans push planet beyond boundaries towards 'danger zone' › News in Science ... - 0 views

  • Human activity has pushed the planet across four of nine environmental boundaries, sending the world towards a "danger zone", warns an international team of scientists.
  • Climate change, biodiversity loss, changes in land use, and altered biogeochemical cycles due in part to fertiliser use have fundamentally changed how the planet functions
  • destabilise complex interactions between people, oceans, land and the atmosphere,
  • ...9 more annotations...
  • For the first time in human history, we need to
  • relate to the risk of destabilising the entire planet,
  • climate change the most serious crossed boundary
  • makes the planet less hospitable,
  • - ozone depletion, ocean acidification, freshwater use, microscopic particles in the atmosphere and chemical pollution -- have not been crossed
  • Passing the boundaries does not cause immediate chaos but pushes the planet into a period of uncertainty.
  • nine planetary boundaries within which humanity can develop and thrive.
  • we're seeing extreme weather events become worse, loss of polar ice and other worrying impacts,
  • Commodity prices, a measure of scarcity for energy and other basic goods, are also falling, leading some economists to question warnings from climate scientists and environmentalists.
johnsonle1

Scientists Find First Observed Evidence That Our Universe May Be a Hologram | Big Think - 1 views

  • all the information in our 3-dimensional reality may actually be included in the 2-dimensional surface of its boundaries. It's like watching a 3D show on a 2D television.
  • the team found that the observational data they found was largely predictable by the math of holographic theory. 
  • After this phase comes to a close, the Universe goes into a geometric phase, which can be described by Einstein's equations.
  • ...1 more annotation...
  • It's a new paradigm for a physical reality.
  • As we watched in the video "Spooky Science" in TOK, the 2D and 3D worlds are very distinct, but in this article the author discusses another theory: that our 3D reality may actually be contained in the 2D surface of its boundaries. This theory is a rival to the theory of cosmic inflation. The holographic theory not only explains the anomalies, it is also a simpler theory of the early universe. Now scientists find that the math of holographic theory largely predicts the data, so it has the potential to become a new paradigm for physical reality. --Sissi (2/6/2017)
  • What is the holographic universe idea? It's not exactly that we are living in some kind of Star Trekky computer simulation. Rather the idea, first proposed in the 1990s by Leonard Susskind and Gerard 't Hooft, says that all the information in our 3-dimensional reality may actually be included in the 2-dimensional surface of its boundaries. It's like watching a 3D show on a 2D television.
Javier E

The Choice Explosion - The New York Times - 0 views

  • the social psychologist Sheena Iyengar asked 100 American and Japanese college students to take a piece of paper. On one side, she had them write down the decisions in life they would like to make for themselves. On the other, they wrote the decisions they would like to pass on to others.
  • The Americans desired choice in four times more domains than the Japanese.
  • Americans now have more choices over more things than any other culture in human history. We can choose between a broader array of foods, media sources, lifestyles and identities. We have more freedom to live out our own sexual identities and more religious and nonreligious options to express our spiritual natures.
  • ...15 more annotations...
  • But making decisions well is incredibly difficult, even for highly educated professional decision makers. As Chip Heath and Dan Heath point out in their book “Decisive,” 83 percent of corporate mergers and acquisitions do not increase shareholder value, 40 percent of senior hires do not last 18 months in their new position, 44 percent of lawyers would recommend that a young person not follow them into the law.
  • It’s becoming incredibly important to learn to decide well, to develop the techniques of self-distancing to counteract the flaws in our own mental machinery. The Heath book is a very good compilation of those techniques.
  • assume positive intent. When in the midst of some conflict, start with the belief that others are well intentioned. It makes it easier to absorb information from people you’d rather not listen to.
  • Suzy Welch’s 10-10-10 rule. When you’re about to make a decision, ask yourself how you will feel about it 10 minutes from now, 10 months from now and 10 years from now. People are overly biased by the immediate pain of some choice, but they can put the short-term pain in long-term perspective by asking these questions.
  • An "explosion" that may also be a "dissolution" or "disintegration," in my view. Unlimited choices. Conduct without boundaries. All of which may be viewed as either "great" or "terrible." The poor suffer when they have no means to pursue choices, which is terrible. The rich seem only to want more and more, wealth without boundaries, which is great for those so able to do. Yes, we need a new decision-making tool, but perhaps one that is also very old: simplify, simplify, simplify by setting moral boundaries that apply to all and which define concisely what our life together ought to be.
  • our tendency to narrow-frame, to see every decision as a binary “whether or not” alternative. Whenever you find yourself asking “whether or not,” it’s best to step back and ask, “How can I widen my options?”
  • deliberate mistakes. A survey of new brides found that 20 percent were not initially attracted to the man they ended up marrying. Sometimes it’s useful to make a deliberate “mistake” — agreeing to dinner with a guy who is not your normal type. Sometimes you don’t really know what you want and the filters you apply are hurting you.
  • It makes you think that we should have explicit decision-making curriculums in all schools. Maybe there should be a common course publicizing the work of Daniel Kahneman, Cass Sunstein, Dan Ariely and others who study the way we mess up and the techniques we can adopt to prevent error.
  • The explosion of choice places extra burdens on the individual. Poorer Americans have fewer resources to master decision-making techniques, less social support to guide their decision-making and less of a safety net to catch them when they err.
  • the stress of scarcity itself can distort decision-making. Those who experienced stress as children often perceive threat more acutely and live more defensively.
  • The explosion of choice means we all need more help understanding the anatomy of decision-making.
  • living in an area of concentrated poverty can close down your perceived options, and comfortably “relieve you of the burden of choosing life.” It’s hard to maintain a feeling of agency when you see no chance of opportunity.
  • In this way the choice explosion has contributed to widening inequality.
  • The relentless all-hour reruns of "Law and Order" in 100 channel cable markets provide direct rebuff to the touted but hollow promise/premise of wider "choice." The small group of personalities debating a pre-framed trivial point of view, over and over, nightly/daily (in video clips), without data, global comparison, historic reference, regional content, or a deep commitment to truth or knowledge of facts has resulted in many choosing narrower limits: streaming music, coffee shops, Facebook--now a "choice" of 1.65 billion users.
  • It’s important to offer opportunity and incentives. But we also need lessons in self-awareness — on exactly how our decision-making tool is fundamentally flawed, and on mental frameworks we can adopt to avoid messing up even more than we do.
Javier E

Are we in the Anthropocene? Geologists could define new epoch for Earth - 0 views

  • If the nearly two dozen voting members of the Anthropocene Working Group (AWG), a committee of scientists formed by the International Commission on Stratigraphy (ICS), agree on a site, the decision could usher in the end of the roughly 12,000-year-old Holocene epoch. And it would officially acknowledge that humans have had a profound influence on Earth.
  • Scientists coined the term Anthropocene in 2000, and researchers from several fields now use it informally to refer to the current geological time interval, in which human activity is driving Earth’s conditions and processes.
  • Formalizing the Anthropocene would unite efforts to study people’s influence on Earth’s systems, in fields including climatology and geology, researchers say. Transitioning to a new epoch might also coax policymakers to take into account the impact of humans on the environment during decision-making.
  • ...13 more annotations...
  • Defining the Anthropocene: nine sites are in the running to be given the ‘golden spike’ designation
  • Mentioning the Jurassic period, for instance, helps scientists to picture plants and animals that were alive during that time
  • “The Anthropocene represents an umbrella for all of these different changes that humans have made to the planet,”
  • Typically, researchers will agree that a specific change in Earth’s geology must be captured in the official timeline. The ICS will then determine which set of rock layers, called strata, best illustrates that change, and it will choose which layer marks its lower boundary
  • This is called the Global Stratotype Section and Point (GSSP), and it is defined by a signal, such as the first appearance of a fossil species, trapped in the rock, mud or other material. One location is chosen to represent the boundary, and researchers mark this site physically with a golden spike, to commemorate it.
  • “It’s a label,” says Colin Waters, who chairs the AWG and is a geologist at the University of Leicester, UK. “It’s a great way of summarizing a lot of concepts into one word.”
  • But the Anthropocene has posed problems. Geologists want to capture it in the timeline, but its beginning isn’t obvious in Earth’s strata, and signs of human activity have never before been part of the defining process.
  • “We had a vague idea about what it might be, [but] we didn’t know what kind of hard evidence would go into it.”
  • Years of debate among the group’s multidisciplinary members led them to identify a host of signals — radioactive isotopes from nuclear-bomb tests, ash from fossil-fuel combustion, microplastics, pesticides — that would be trapped in the strata of an Anthropocene-defining site. These began to appear in the early 1950s, when a booming human population started consuming materials and creating new ones faster than ever.
  • Why do some geologists oppose the Anthropocene as a new epoch? “It misrepresents what we do” in the ICS, says Stanley Finney, a stratigrapher at California State University, Long Beach, and secretary-general for the International Union of Geological Sciences (IUGS). The AWG is working backwards, Finney says: normally, geologists identify strata that should enter the geological timescale before considering a golden spike; in this case, they’re seeking out the lower boundary of an undefined set of geological layers.
  • Lucy Edwards, a palaeontologist who retired in 2008 from the Florence Bascom Geoscience Center in Reston, Virginia, agrees. For her, the strata that might define the Anthropocene do not yet exist because the proposed epoch is so young. “There is no geologic record of tomorrow,”
  • Edwards, Finney and other researchers have instead proposed calling the Anthropocene a geological ‘event’, a flexible term that can stretch in time, depending on human impact. “It’s all-encompassing,” Edwards says.
  • Zalasiewicz disagrees. “The word ‘event’ has been used and stretched to mean all kinds of things,” he says. “So simply calling something an event doesn’t give it any wider meaning.”
Javier E

How 'Concept Creep' Made Americans So Sensitive to Harm - The Atlantic - 0 views

  • How did American culture arrive at these moments? A new research paper by Nick Haslam, a professor of psychology at the University of Melbourne, Australia, offers as useful a framework for understanding what’s going on as any I’ve seen. In “Concept Creep: Psychology's Expanding Concepts of Harm and Pathology,”
  • concepts like abuse, bullying, trauma, mental disorder, addiction, and prejudice, “now encompass a much broader range of phenomena than before,”expanded meanings that reflect “an ever-increasing sensitivity to harm.”
  • “they also have potentially damaging ramifications for society and psychology that cannot be ignored.”
  • ...20 more annotations...
  • He calls these expansions of meaning “concept creep.”
  • critics may hold concept creep responsible for damaging cultural trends, he writes, “such as supposed cultures of fear, therapy, and victimhood, the shifts I present have some positive implications.”
  • Concept creep is inevitable and vital if society is to make good use of new information. But why has the direction of concept creep, across so many different concepts, trended toward greater sensitivity to harm as opposed to lesser sensitivity?
  • The concept of abuse expanded too far.
  • Classically, psychological investigations recognized two forms of child abuse, physical and sexual, Haslam writes. In more recent decades, however, the concept of abuse has witnessed “horizontal creep” as new forms of abuse were recognized or studied. For example, “emotional abuse” was added as a new subtype of abuse. Neglect, traditionally a separate category, came to be seen as a type of abuse, too.
  • Meanwhile, the concept of abuse underwent “vertical creep.” That is, the behavior seen as qualifying for a given kind of abuse became steadily less extreme. Some now regard any spanking as physical abuse. Within psychology, “the boundary of neglect is indistinct,” Haslam writes. “As a consequence, the concept of neglect can become over-inclusive, identifying behavior as negligent that is substantially milder or more subtle than other forms of abuse. This is not to deny that some forms of neglect are profoundly damaging, merely to argue that the concept’s boundaries are sufficiently vague and elastic to encompass forms that are not severe.”
  • How did a working-class mom get arrested, lose her fast food job, and temporarily lose custody of her 9-year-old for letting the child play alone at a nearby park?
  • One concerns the field of psychology and its incentives. “It could be argued that just as successful species increase their territory, invading and adapting to new habitats, successful concepts and disciplines also expand their range into new semantic niches,” he theorizes. “Concepts that successfully attract the attention of researchers and practitioners are more likely to be applied in new ways and new contexts than those that do not.”
  • Concept creep can be necessary or needless. It can align concepts more or less closely with underlying realities. It can change society for better or worse. Yet many who push for more sensitivity to harm seem unaware of how oversensitivity can do harm.
  • The other theory posits an ideological explanation. “Psychology has played a role in the liberal agenda of sensitivity to harm and responsiveness to the harmed,” he writes, “and its increased focus on negative phenomena—harms such as abuse, addiction, bullying, mental disorder, prejudice, and trauma—has been symptomatic of the success of that social agenda.”
  • Jonathan Haidt, who believes it has gone too far, offers a fourth theory. “If an increasingly left-leaning academy is staffed by people who are increasingly hostile to conservatives, then we can expect that their concepts will shift, via motivated scholarship, in ways that will help them and their allies (e.g., university administrators) to prosecute and condemn conservatives,
  • While Haslam and Haidt appear to have meaningfully different beliefs about why concept creep arose within academic psychology and spread throughout society, they were in sufficient agreement about its dangers to co-author a Guardian op-ed on the subject.
  • It focuses on how greater sensitivity to harm has affected college campuses.
  • “Of course young people need to be protected from some kinds of harm, but overprotection is harmful, too, for it causes fragility and hinders the development of resilience,” they wrote. “As Nassim Taleb pointed out in his book Antifragile, muscles need resistance to develop, bones need stress and shock to strengthen and the growing immune system needs to be exposed to pathogens in order to function. Similarly, he noted, children are by nature anti-fragile – they get stronger when they learn to recover from setbacks, failures and challenges to their cherished ideas.”
  • police officers fearing harm from dogs kill them by the hundreds or perhaps thousands every year in what the DOJ calls an epidemic.
  • After the terrorist attacks of September 11, 2001, the Bush Administration and many Americans grew increasingly sensitive to harms, real and imagined, from terrorism
  • Dick Cheney declared, “If there's a 1% chance that Pakistani scientists are helping al-Qaeda build or develop a nuclear weapon, we have to treat it as a certainty in terms of our response. It's not about our analysis ... It's about our response.” The invasion of Iraq was predicated, in part, on the idea that 9/11 “changed everything,”
  • Before 9/11, the notion of torturing prisoners was verboten. After the Bush Administration’s torture was made public, popular debate focused on mythical “ticking time bomb” scenarios, in which a whole city would be obliterated but for torture. Now Donald Trump suggests that torture should be used more generally against terrorists. Torture is, as well, an instance in which people within the field of psychology pushed concept creep in the direction of less sensitivity to harm,
  • Haslam endorses two theories
  • there are many reasons to be concerned about excessive sensitivity to harm:
Javier E

Why It's OK to Let Apps Make You a Better Person - Evan Selinger - Technology - The Atl... - 0 views

  • one theme emerges from the media coverage of people's relationships with our current set of technologies: Consumers want digital willpower. App designers in touch with the latest trends in behavioral modification--nudging, the quantified self, and gamification--and good old-fashioned financial incentive manipulation, are tackling weakness of will. They're harnessing the power of payouts, cognitive biases, social networking, and biofeedback. The quantified self becomes the programmable self.
  • the trend still has multiple interesting dimensions
  • Individuals are turning ever more aspects of their lives into managerial problems that require technological solutions. We have access to an ever-increasing array of free and inexpensive technologies that harness incredible computational power that effectively allows us to self-police behavior everywhere we go. As pervasiveness expands, so does trust.
  • ...20 more annotations...
  • Some embrace networked, data-driven lives and are comfortable volunteering embarrassing, real time information about what we're doing, whom we're doing it with, and how we feel about our monitored activities.
  • Put it all together and we can see that our conception of what it means to be human has become "design space." We're now Humanity 2.0, primed for optimization through commercial upgrades. And today's apps are more harbinger than endpoint.
  • philosophers have had much to say about the enticing and seemingly inevitable dispersion of technological mental prosthetics that promise to substitute for or enhance some of our motivational powers.
  • beyond the practical issues lie a constellation of central ethical concerns.
  • they should cause us to pause as we think about a possible future that significantly increases the scale and effectiveness of willpower-enhancing apps. Let's call this hypothetical future Digital Willpower World and characterize the ethical traps we're about to discuss as potential general pitfalls
  • it is antithetical to the ideal of "resolute choice." Some may find the norm overly perfectionist, Spartan, or puritanical. However, it is not uncommon for folks to defend the idea that mature adults should strive to develop internal willpower strong enough to avoid external temptations, whatever they are, and wherever they are encountered.
  • In part, resolute choosing is prized out of concern for consistency, as some worry that lapse of willpower in any context indicates a generally weak character.
  • Fragmented selves behave one way while under the influence of digital willpower, but another when making decisions without such assistance. In these instances, inconsistent preferences are exhibited and we risk underestimating the extent of our technological dependency.
  • It simply means that when it comes to digital willpower, we should be on our guard to avoid confusing situational with integrated behaviors.
  • the problem of inauthenticity, a staple of the neuroethics debates, might arise. People might start asking themselves: Has the problem of fragmentation gone away only because devices are choreographing our behavior so powerfully that we are no longer in touch with our so-called real selves -- the selves who used to exist before Digital Willpower World was formed?
  • Infantilized subjects are morally lazy, quick to have others take responsibility for their welfare. They do not view the capacity to assume personal responsibility for selecting means and ends as a fundamental life goal that validates the effort required to remain committed to the ongoing project of maintaining willpower and self-control.
  • Michael Sandel's Atlantic essay, "The Case Against Perfection." He notes that technological enhancement can diminish people's sense of achievement when their accomplishments become attributable to human-technology systems and not an individual's use of human agency.
  • Borgmann worries that this environment, which habituates us to be on auto-pilot and delegate deliberation, threatens to harm the powers of reason, the most central component of willpower (according to the rationalist tradition).
  • In several books, including Technology and the Character of Contemporary Life, he expresses concern about technologies that seem to enhance willpower but only do so through distraction. Borgmann's paradigmatic example of the non-distracted, focally centered person is a serious runner. This person finds the practice of running maximally fulfilling, replete with the rewarding "flow" that comes only when mind/body and means/ends are unified, while skill gets pushed to the limit.
  • Perhaps the very conception of a resolute self was flawed. What if, as psychologist Roy Baumeister suggests, willpower is more a "staple of folk psychology" than a real way of thinking about our brain processes?
  • novel approaches suggest the will is a flexible mesh of different capacities and cognitive mechanisms that can expand and contract, depending on the agent's particular setting and needs. Contrary to the traditional view that identifies the unified and cognitively transparent self as the source of willed actions, the new picture embraces a rather diffused, extended, and opaque self who is often guided by irrational trains of thought. What actually keeps the self and its will together are the given boundaries offered by biology, a coherent self narrative created by shared memories and experiences, and society. If this view of the will as an expanding and contracting system with porous and dynamic boundaries is correct, then it might seem that the new motivating technologies and devices can only increase our reach and further empower our willing selves.
  • "It's a mistake to think of the will as some interior faculty that belongs to an individual--the thing that pushes the motor control processes that cause my action," Gallagher says. "Rather, the will is both embodied and embedded: social and physical environment enhance or impoverish our ability to decide and carry out our intentions; often our intentions themselves are shaped by social and physical aspects of the environment."
  • It makes perfect sense to think of the will as something that can be supported or assisted by technology. Technologies, like environments and institutions can facilitate action or block it. Imagine I have the inclination to go to a concert. If I can get my ticket by pressing some buttons on my iPhone, I find myself going to the concert. If I have to fill out an application form and carry it to a location several miles away and wait in line to pick up my ticket, then forget it.
  • Perhaps the best way forward is to put a digital spin on the Socratic dictum of knowing myself and submit to the new freedom: the freedom of consuming digital willpower to guide me past the sirens.
Javier E

'Trespassing on Einstein's Lawn,' by Amanda Gefter - NYTimes.com - 0 views

  • It all began when Warren Gefter, a radiologist “prone to posing Zen-koan-like questions,” asked his 15-year-old daughter, Amanda, over dinner at a Chinese restaurant near their home just outside Philadelphia: “How would you define nothing?”
  • “I think we should figure it out,” he said. And his teenage daughter — sullen, rebellious, wallowing in existential dread — smiled for the first time “in what felt like years.” The project proved to be a gift from a wise, insightful father. It was Warren Gefter’s way of rescuing his child.
  • “If observers create reality, where do the observers come from?” But the great man responded in riddles. “The universe is a self-excited circuit,” Wheeler said. “The boundary of a boundary is zero.” The unraveling of these mysteries propels the next 400 or so pages.
  • ...6 more annotations...
  • She became a science journalist. At first it was a lark, a way to get free press passes to conferences where she and her father could ask questions of the greatest minds in quantum mechanics, string theory and cosmology. But within a short time, as she started getting assignments, journalism became a calling, and an identity.
  • Tracking down the meaning of nothing — and, by extension, secrets about the origin of the universe and whether observer-independent reality exists — became the defining project of their lives. They spent hours together working on the puzzle, two dark heads bent over their physics books far into the night.
  • she has an epiphany — that for something to be real, it must be invariant — she flies home to share it with her father. They discuss her insight over breakfast at a neighborhood haunt, where they make a list on what they will affectionately call “the IHOP napkin.” They list all the possible “ingredients of ultimate reality,” planning to test each item for whether it is “real,” that is, whether it is invariant and can exist in the absence of an observer.
  • their readings and interviews reveal that each item in turn is observer-dependent. Space? Observer-dependent, and therefore not real. Gravity, electromagnetism, angular momentum? No, no, and no. In the end, every putative “ingredient of ultimate reality” is eliminated, including one they hadn’t even bothered to put on the list because it seemed weird to: reality itself
  • What remained was an unsettling and essential insight: that “physics isn’t the machinery behind the workings of the world; physics is the machinery behind the illusion that there is a world.”
  • In the proposal, she clarifies how cosmology and quantum mechanics have evolved as scientists come to grips with the fact that things they had taken to be real — quantum particles, space-time, gravity, dimension — turn out to be observer-dependent.
carolinewren

Shady Science: How the Brain Remembers Colors - 0 views

  • When you bring home the wrong color of paint from the hardware store, it may not be your foggy memory at fault
  • Flombaum and his colleagues conducted four experiments on four different groups of people.
  • while the human brain can distinguish between millions of colors, it has difficulty remembering specific shades.
  • ...7 more annotations...
  • The exercise was designed to find the perceived boundaries between colors, the researchers said
  • scientists showed different people the same colors, but this time they asked them to find the "best example" of a particular color.
  • researchers showed participants colored squares, and asked them to select the best match on the color wheel. In a fourth experiment, another group of participants completed the same task, but there was a delay of 90 milliseconds between when each color square was displayed and when they were asked to select the best match on the color wheel.
  • This tendency to lump colors together could explain why it's so hard to match the color of house paint based on memory alone, the researchers said
  • categories are indeed important in how people identify and remember colors.
  • participants who were asked to name the colors reliably saw five hues: blue, yellow, pink, purple and green
  • "Where that fuzzy naming happened, those are the boundaries"
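The lumping the researchers describe can be sketched in a toy simulation. This is an illustrative sketch only: the prototype positions on the hue wheel, the category weight, and the noise level are assumptions, not values from the study. Recall is modeled as a blend of the hue actually seen and its nearest category prototype, so remembered hues drift toward the center of their category.

```python
import random

# Hypothetical prototypes on a 0-360 hue wheel for the five reliably
# named hues from the study (blue, yellow, pink, purple, green);
# the positions are illustrative, not measured.
PROTOTYPES = {"yellow": 60, "green": 120, "blue": 240, "purple": 280, "pink": 330}

def nearest_prototype(hue):
    """Return the prototype hue closest to `hue` on the circular wheel."""
    def circ_dist(a, b):
        d = abs(a - b) % 360
        return min(d, 360 - d)
    return min(PROTOTYPES.values(), key=lambda p: circ_dist(hue, p))

def recalled_hue(true_hue, category_weight=0.3, noise_sd=5.0):
    """Simulate recall as a blend of the seen hue and its category
    prototype, plus perceptual noise. In this toy model a longer
    memory delay would correspond to a larger category_weight."""
    proto = nearest_prototype(true_hue)
    # signed offset to the prototype along the shorter arc
    diff = (proto - true_hue + 180) % 360 - 180
    return (true_hue + category_weight * diff + random.gauss(0, noise_sd)) % 360

random.seed(0)
seen = 100  # a yellowish green, 20 degrees from the green prototype
recalls = [recalled_hue(seen) for _ in range(1000)]
mean_recall = sum(recalls) / len(recalls)
# mean recall drifts from 100 toward the green prototype at 120
print(round(mean_recall, 1))
```

Averaged over many trials, the remembered hue sits between the seen hue and the category prototype, which is the "lumping" that would make a paint chip chosen from memory come out wrong.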
Javier E

E.D. Hirsch Jr.'s 'Cultural Literacy' in the 21st Century - The Atlantic - 0 views

  • much of this angst can be interpreted as part of a noisy but inexorable endgame: the end of white supremacy. From this vantage point, Americanness and whiteness are fitfully, achingly, but finally becoming delinked—and like it or not, over the course of this generation, Americans are all going to have to learn a new way to be American.
  • What is the story of “us” when “us” is no longer by default “white”? The answer, of course, will depend on how aware Americans are of what they are, of what their culture already (and always) has been.
  • The thing about the list, though, was that it was—by design—heavy on the deeds and words of the “dead white males” who had formed the foundations of American culture but who had by then begun to fall out of academic fashion.
  • ...38 more annotations...
  • Conservatives thus embraced Hirsch eagerly and breathlessly. He was a stout defender of the patrimony. Liberals eagerly and breathlessly attacked him with equal vigor. He was retrograde, Eurocentric, racist, sexist.
  • Lost in all the crossfire, however, were two facts: First, Hirsch, a lifelong Democrat who considered himself progressive, believed his enterprise to be in service of social justice and equality. Cultural illiteracy, he argued, is most common among the poor and power-illiterate, and compounds both their poverty and powerlessness. Second: He was right.
  • A generation of hindsight now enables Americans to see that it is indeed necessary for a nation as far-flung and entropic as the United States, one where rising economic inequality begets worsening civic inequality, to cultivate continuously a shared cultural core. A vocabulary. A set of shared referents and symbols.
  • So, first of all, Americans do need a list. But second, it should not be Hirsch’s list. And third, it should not be made the way he made his. In the balance of this essay, I want to unpack and explain each of those three statements.
  • If you take the time to read the book attached to Hirsch’s appendix, you’ll find a rather effective argument about the nature of background knowledge and public culture. Literacy is not just a matter of decoding the strings of letters that make up words or the meaning of each word in sequence. It is a matter of decoding context: the surrounding matrix of things referred to in the text and things implied by it
  • That means understanding what’s being said in public, in the media, in colloquial conversation. It means understanding what’s not being said. Literacy in the culture confers power, or at least access to power. Illiteracy, whether willful or unwitting, creates isolation from power.
  • his point about background knowledge and the content of shared public culture extends well beyond schoolbooks. They are applicable to the “texts” of everyday life, in commercial culture, in sports talk, in religious language, in politics. In all cases, people become literate in patterns—“schema” is the academic word Hirsch uses. They come to recognize bundles of concept and connotation like “Party of Lincoln.” They perceive those patterns of meaning the same way a chess master reads an in-game chessboard or the way a great baseball manager reads an at bat. And in all cases, pattern recognition requires literacy in particulars.
  • Lots and lots of particulars. This isn’t, or at least shouldn’t be, an ideologically controversial point. After all, parents on both left and right have come to accept recent research that shows that the more spoken words an infant or toddler hears, the more rapidly she will learn and advance in school. Volume and variety matter. And what is true about the vocabulary of spoken or written English is also true, one fractal scale up, about the vocabulary of American culture.
  • those who demonized Hirsch as a right-winger missed the point. Just because an endeavor requires fluency in the past does not make it worshipful of tradition or hostile to change.
  • radicalism is made more powerful when garbed in traditionalism. As Hirsch put it: “To be conservative in the means of communication is the road to effectiveness in modern life, in whatever direction one wishes to be effective.”
  • Hence, he argued, an education that in the name of progressivism disdains past forms, schema, concepts, figures, and symbols is an education that is in fact anti-progressive and “helps preserve the political and economic status quo.” This is true. And it is made more urgently true by the changes in American demography since Hirsch gave us his list in 1987.
  • If you are an immigrant to the United States—or, if you were born here but are the first in your family to go to college, and thus a socioeconomic new arrival; or, say, a black citizen in Ferguson, Missouri deciding for the first time to participate in a municipal election, and thus a civic neophyte—you have a single overriding objective shared by all immigrants at the moment of arrival: figure out how stuff really gets done here.
  • So, for instance, a statement like “One hundred and fifty years after Appomattox, our house remains deeply divided” assumes that the reader knows that Appomattox is both a place and an event; that the event signified the end of a war; that the war was the Civil War and had begun during the presidency of a man, Abraham Lincoln, who earlier had famously declared that “a house divided against itself cannot stand”; that the divisions then were in large part about slavery; and that the divisions today are over the political, social, and economic legacies of slavery and how or whether we are to respond to those legacies.
  • But why a list, one might ask? Aren’t lists just the very worst form of rote learning and standardized, mechanized education? Well, yes and no.
  • it’s not just newcomers who need greater command of common knowledge. People whose families have been here ten generations are often as ignorant about American traditions, mores, history, and idioms as someone “fresh off the boat.”
  • The more serious challenge, for Americans new and old, is to make a common culture that’s greater than the sum of our increasingly diverse parts. It’s not enough for the United States to be a neutral zone where a million little niches of identity might flourish; in order to make our diversity a true asset, Americans need those niches to be able to share a vocabulary. Americans need to be able to have a broad base of common knowledge so that diversity can be most fully activated.
  • as the pool of potential culture-makers has widened, the modes of culture creation have similarly shifted away from hierarchies and institutions to webs and networks. Wikipedia is the prime embodiment of this reality, both in how the online encyclopedia is crowd-created and how every crowd-created entry contains links to other entries.
  • so any endeavor that makes it easier for those who do not know the memes and themes of American civic life to attain them closes the opportunity gap. It is inherently progressive.
  • since I started writing this essay, dipping into the list has become a game my high-school-age daughter and I play together.
  • I’ll name each of those entries, she’ll describe what she thinks to be its meaning. If she doesn’t know, I’ll explain it and give some back story. If I don’t know, we’ll look it up together. This of course is not a good way for her teachers to teach the main content of American history or English. But it is definitely a good way for us both to supplement what school should be giving her.
  • And however long we end up playing this game, it is already teaching her a meta-lesson about the importance of cultural literacy. Now anytime a reference we’ve discussed comes up in the news or on TV or in dinner conversation, she can claim ownership. Sometimes she does so proudly, sometimes with a knowing look. My bet is that the satisfaction of that ownership, and the value of it, will compound as the years and her education progress.
  • The trouble is, there are also many items on Hirsch’s list that don’t seem particularly necessary for entry into today’s civic and economic mainstream.
  • Which brings us back to why diversity matters. The same diversity that makes it necessary to have and to sustain a unifying cultural core demands that Americans make the core less monochromatic, more inclusive, and continuously relevant for contemporary life
  • it’s worth unpacking the baseline assumption of both Hirsch’s original argument and the battles that erupted around it. The assumption was that multiculturalism sits in polar opposition to a traditional common culture, that the fight between multiculturalism and the common culture was zero-sum.
  • As scholars like Ronald Takaki made clear in books like A Different Mirror, the dichotomy made sense only to the extent that one imagined that nonwhite people had had no part in shaping America until they started speaking up in the second half of the twentieth century.
  • The truth, of course, is that since well before the formation of the United States, the United States has been shaped by nonwhites in its mores, political structures, aesthetics, slang, economic practices, cuisine, dress, song, and sensibility.
  • In its serious forms, multiculturalism never asserted that every racial group should have its own sealed and separate history or that each group’s history was equally salient to the formation of the American experience. It simply claimed that the omni-American story—of diversity and hybridity—was the legitimate American story.
  • as Nathan Glazer has put it (somewhat ruefully), “We are all multiculturalists now.” Americans have come to see—have chosen to see—that multiculturalism is not at odds with a single common culture; it is a single common culture.
  • it is true that in a finite school year, say, with finite class time and books of finite heft, not everything about everyone can be taught. There are necessary trade-offs. But in practice, recognizing the true and longstanding diversity of American identity is not an either-or. Learning about the internment of Japanese Americans does not block out knowledge of D-Day or Midway. It is additive.
  • As more diverse voices attain ever more forms of reach and power we need to re-integrate and reimagine Hirsch’s list of what literate Americans ought to know.
  • To be clear: A 21st-century omni-American approach to cultural literacy is not about crowding out “real” history with the perishable stuff of contemporary life. It’s about drawing lines of descent from the old forms of cultural expression, however formal, to their progeny, however colloquial.
  • Nor is Omni-American cultural literacy about raising the “self-esteem” of the poor, nonwhite, and marginalized. It’s about raising the collective knowledge of all—and recognizing that the wealthy, white, and powerful also have blind spots and swaths of ignorance
  • What, then, would be on your list? It’s not an idle question. It turns out to be the key to rethinking how a list should even get made.
  • the Internet has transformed who makes culture and how. As barriers to culture creation have fallen, orders of magnitude more citizens—amateurs—are able to shape the culture in which we must all be literate. Cat videos and Star Trek fan fiction may not hold up long beside Toni Morrison. But the entry of new creators leads to new claims of right: The right to be recognized. The right to be counted. The right to make the means of recognition and accounting.
  • It is true that lists alone, with no teaching to bring them to life and no expectation that they be connected to a broader education, are somewhere between useless and harmful.
  • This will be a list of nodes and nested networks. It will be a fractal of associations, which reflects far more than a linear list how our brains work and how we learn and create. Hirsch himself nodded to this reality in Cultural Literacy when he described the process he and his colleagues used for collecting items for their list, though he raised it by way of pointing out the danger of infinite regress.
  • His conclusion, appropriate to his times, was that you had to draw boundaries somewhere with the help of experts. My take, appropriate to our times, is that Americans can draw not boundaries so much as circles and linkages, concept sets and pathways among them.
  • Because 5,000 or even 500 items is too daunting a place to start, I ask here only for your top ten. What are ten things every American—newcomer or native born, affluent or indigent—should know? What ten things do you feel are both required knowledge and illuminating gateways to those unenlightened about American life? Here are my entries: Whiteness; The Federalist Papers; The Almighty Dollar; Organized labor; Reconstruction; Nativism; The American Dream; The Reagan Revolution; DARPA; A sucker born every minute.
Javier E

The New Atlantis » Science and the Left - 0 views

  • A casual observer of American politics in recent years could be forgiven for imagining that the legitimacy of scientific inquiry and empirical knowledge are under assault by the right, and that the left has mounted a heroic defense. Science is constantly on the lips of Democratic politicians and liberal activists, and is generally treated by them as a vulnerable and precious inheritance being pillaged by Neanderthals.
  • But beneath these grave accusations, it turns out, are some remarkably flimsy grievances, most of which seem to amount to political disputes about policy questions in which science plays a role.
  • But if this notion of a “war on science” tells us little about the right, it does tell us something important about the American left and its self-understanding. That liberals take attacks against their own political preferences to be attacks against science helps us see the degree to which they identify themselves—their ideals, their means, their ends, their cause, and their culture—with the modern scientific enterprise.
  • There is indeed a deep and well-established kinship between science and the left, one that reaches to the earliest days of modern science and politics and has grown stronger with time. Even though they go astray in caricaturing conservatives as anti-science Luddites, American liberals and progressives are not mistaken to think of themselves as the party of science. They do, however, tend to focus on only a few elements and consequences of that connection, and to look past some deep and complicated problems in the much-valued relationship. The profound ties that bind science and the left can teach us a great deal about both.
  • It is not unfair to suggest that the right emerged in response to the left, as the anti-traditional theory and practice of the French Revolution provoked a powerful reaction in defense of a political order built to suit human nature and tested and tried through generations of practice and reform.
  • The left, however, did not emerge in response to the right. It emerged in response to a new set of ideas and intellectual possibilities that burst onto the European scene in the seventeenth and eighteenth centuries—ideas and possibilities that we now think of as modern scientific thought.
  • Both as action and as knowledge, then, science has been a source of inspiration for progressives and for liberals, and its advancement has been one of their great causes. That does not mean that science captures all there is to know about the left. Far from it. The left has always had a deeply romantic and even anti-rationalist side too, reaching back almost as far as its scientism. But in its basic view of knowledge, power, nature, and man, the left owes much to science. And in the causes it chooses to advance in our time, it often looks to scientific thought and practice for guidance. In its most essential disagreements with the right—in particular, about tradition—the vision defended by the left is also a vision of scientific progress.
  • Not all environmentalism indulges in such anti-humanism, to be sure. But in all of its forms, the environmentalist ethic calls for a science of beholding nature, not of mastering it. Far from viewing nature as the oppressor, this new vision sees nature as a precious, vulnerable, and almost benevolent passive environment, held in careful balance, and under siege by human action and human power. This view of nature calls for human restraint and humility—and for diminished expectations of human power and potential. The environmental movement is, in this sense, not a natural fit for the progressive and forward-looking mentality of the left. Indeed, in many important respects environmentalism is deeply conservative. It takes no great feat of logic to show that conservation is conservative, of course, but the conservatism of the environmental movement runs far deeper than that. The movement seeks to preserve a given balance which we did not create, are not capable of fully understanding, and should not delude ourselves into imagining we can much improve—in other words, its attitude toward nature is much like the attitude of conservatism toward society.
  • Moreover, contemporary environmentalism is deeply moralistic. It speaks of duties and responsibilities, of curbing arrogance and vice.
  • But whatever the reason, environmentalism, and with it a worldview deeply at odds with that behind the scientific enterprise, has come to play a pivotal role in the thinking of the left.
  • The American left seeks to be both the party of science and the party of equality. But in the coming years, as the biotechnology revolution progresses, it will increasingly be forced to confront the powerful tension between these two aspirations.
  • To choose well, the American left will need first to understand that a choice is even needed at all—that this tension exists between the ideals of progressives, and the ideology of science.
  • The answer, as ever, is moderation. The American left, like the American right, must understand science as a human endeavor with ethical purposes and practical limits, one which must be kept within certain boundaries by a self-governing people. In failing to observe and to enforce those boundaries, the left threatens its own greatest assets, and exacerbates tensions at the foundations of American political life. To make the most of the benefits scientific advancement can bring us, we must be alert to the risks it may pose. That awareness is endangered by the closing of the gap between science and the left—and the danger is greatest for the left itself.
Javier E

Geology's Timekeepers Are Feuding - The Atlantic - 0 views

  • In 2000, the Nobel Prize-winning chemist Paul Crutzen won permanent fame for stratigraphy. He proposed that humans had so thoroughly altered the fundamental processes of the planet—through agriculture, climate change, nuclear testing, and other phenomena—that a new geological epoch had commenced: the Anthropocene, the age of humans.
  • Zalasiewicz should know. He is the chair of the Anthropocene working group, which the ICS established in 2009 to investigate whether the new epoch deserved a place in stratigraphic time.
  • In 2015, the group announced that the Anthropocene was a plausible new layer and that it should likely follow the Holocene. But the team has yet to propose a “golden spike” for the epoch: a boundary in the sedimentary rock record where the Anthropocene clearly begins.
  • Officially, the Holocene is still running today. You have lived your entire life in the Holocene, and the Holocene has constituted the geological “present” for as long as there have been geologists. But if we now live in a new epoch, the Anthropocene, then the ICS will have to chop the Holocene somewhere. It will have to choose when the Holocene ended, and it will move some amount of time out of the purview of the Holocene working group and into that of the Anthropocene working group.
  • This is politically difficult. And right now, the Anthropocene working group seems intent on not carving too deep into the Holocene. In a paper published earlier this year in Earth-Science Reviews, the Anthropocene working group’s members strongly imply that they will propose starting the new epoch in the mid-20th century.
  • Some geologists argue that the Anthropocene started even earlier: perhaps 4,000 or 6,000 years ago, as farmers began to remake the land surface. “Most of the world’s forests that were going to be converted to cropland and agriculture were already cleared well before 1950,” says Bill Ruddiman, a geology professor at the University of Virginia and an advocate of this extremely early Anthropocene.
  • “Most of the world’s prairies and steppes that were going to be cleared for crops were already gone, by then. How can you argue the Anthropocene started in 1950 when all of the major things that affect Earth’s surface were already over?” Van der Pluijm agreed that the Anthropocene working group was picking 1950 for “not very good reasons.” “Agriculture was the revolution that allowed society to develop,” he said. “That was really when people started to force the land to work for them. That massive land movement—it’s like a landslide, except it’s a humanslide. And it is not, of course, as dramatic as today’s motion of land, but it starts the clock.”
  • This muddle had to stop. The Holocene comes up constantly in discussions of modern global warming. Geologists and climate scientists did not make their jobs any easier by slicing it in different ways and telling contradictory stories about it.
  • This process started almost 10 years ago. For this reason, Zalasiewicz, the chair of the Anthropocene working group, said he wasn’t blindsided by the new subdivisions at all. In fact, he voted to adopt them as a member of the Quaternary working group. “Whether the Anthropocene works with a unified Holocene or one that’s in three parts makes for very little difference,” he told me. In fact, it had made the Anthropocene group’s work easier. “It has been useful to compare the scale of the two climate events that mark the new boundaries [within the Holocene] with the kind of changes that we’re assessing in the Anthropocene. It has been quite useful to have the compare and contrast,” he said. “Our view is that some of the changes in the Anthropocene are rather bigger.”
  • Zalasiewicz said that he and his colleagues were going as fast as they could. When the working group began its work in 2009, it was “really starting from scratch,” he told me. While other working groups have a large body of stratigraphic research to consider, the Anthropocene working group had nothing. “We had to spend a fair bit of time deciding whether the Anthropocene was geology at all,” he said. Then they had to decide where its signal could show up. Now, they’re looking for evidence that shows it.
  • This cycle of “glacials” and “interglacials” has played out about 50 times over the last several million years. When the Holocene began, it was only another interglacial—albeit the one we live in. Until recently, glaciers were still on schedule to descend in another 30,000 years or so. Yet geologists still call the Holocene an epoch, even though they do not bestow this term on any of the previous 49 interglacials. It gets special treatment because we live in it.
  • Much of this science is now moot. Humanity’s vast emissions of greenhouse gas have now so warmed the climate that they have offset the next glaciation. They may even knock us out of the ongoing cycle of Ice Ages, sending the Earth hurtling back toward a “greenhouse” climate after the more amenable “icehouse” climate during which humans evolved. For this reason, van der Pluijm wants the Anthropocene to supplant the Holocene entirely. Humans made their first great change to the environment at the close of the last glaciation, when they seem to have hunted the world’s largest mammals—the woolly mammoth, the saber-toothed tiger—to extinction. Why not start the Anthropocene then? He would even rename the pre-1800 period “the Holocene Age” as a consolation prize:
  • Zalasiewicz said he would not start the Anthropocene too early in time, as it would be too work-intensive for the field to rename such a vast swath of time. “The early-Anthropocene idea would crosscut against the Holocene as it’s seen by Holocene workers,” he said. If other academics didn’t like this, they could create their own timescales and start the Anthropocene Epoch where they choose. “We have no jurisdiction over the word Anthropocene,” he said.
  • Ruddiman, the University of Virginia professor who first argued for a very early Anthropocene, now makes an even broader case. He’s not sure it makes sense to formally define the Anthropocene at all. In a paper published this week, he objects to designating the Anthropocene as starting in the 1950s—and then he objects to delineating the Anthropocene, or indeed any new geological epoch, by name. “Keep the use of the term informal,” he told me. “Don’t make it rigid. Keep it informal so people can say the early-agricultural Anthropocene, or the industrial-era Anthropocene.”
  • “This is the age of geochemical dating,” he said. Geologists have stopped looking to the ICS to place each rock sample into the rock sequence. Instead, field geologists use laboratory techniques to get a precise year or century of origin for each rock sample. “The community just doesn’t care about these definitions,” he said.
anonymous

The Happiness Course: Here’s What Some Learned - The New York Times - 0 views

  • Over 3 Million People Took This Course on Happiness. Here’s What Some Learned.
  • It may seem simple, but it bears repeating: sleep, gratitude and helping other people.
  • The Yale happiness class, formally known as Psyc 157: Psychology and the Good Life, is one of the most popular classes to be offered in the university’s 320-year history
  • To date, over 3.3 million people have signed up, according to the website.
  • “Everyone knows what they need to do to protect their physical health: wash your hands, and social distance, and wear a mask,” she added. “People were struggling with what to do to protect their mental health.”
  • The Coursera curriculum, adapted from the one Dr. Santos taught at Yale, asks students to, among other things, track their sleep patterns, keep a gratitude journal, perform random acts of kindness, and take note of whether, over time, these behaviors correlate with a positive change in their general mood.
  • Ms. McIntire took the class. She called it “life-changing.”
  • A night owl, she had struggled with sleep and enforcing her own time boundaries.
  • “It’s hard to set those boundaries with yourself sometimes and say, ‘I know this book is really exciting, but it can wait till tomorrow, sleep is more important,’”
  • “That’s discipline, right? But I had never done it in that way, where it’s like, ‘It’s going to make you happier. It’s not just good for you; it’s going to actually legitimately make you happier.’”
  • has stuck with it even after finishing the class
  • Meditation also helped her to get off social media.
  • “I found myself looking inward. It helped me become more introspective,” she said. “Honestly, it was the best thing I ever did.”
  • “There’s no reason I shouldn’t be happy,” she said. “I have a wonderful marriage. I have two kids. I have a nice job and a nice house. And I just could never find happiness.”
  • Since taking the course, Ms. Morgan, 52, has made a commitment to do three things every day: practice yoga for one hour, take a walk outside in nature no matter how cold it may be in Alberta, and write three to five entries in her gratitude journal before bed
  • “When you start writing down those things at the end of the day, you only think about it at the end of the day, but once you make it a routine, you start to think about it all throughout the day,”
  • some studies show that finding reasons to be grateful can increase your general sense of well-being.
  • “Somewhere along the second or third year, you do feel a bit burned out, and you need strategies for dealing with it,”
  • “I’m still feeling that happiness months later,”
  • Matt Nadel, 21, a Yale senior, was among the 1,200 students taking the class on campus in 2018. He said the rigors of Yale were a big adjustment when he started at the university in the fall of 2017.
  • “Did the class impact my life in a long term, tangible way? The answer is no.”
  • While the class wasn’t life-changing for him, Mr. Nadel said that he is more expressive now when he feels gratitude.
  • “I think I was struggling to reconcile, and to intellectually interrogate, my religion,” he said. “Also acknowledging that I just really like to hang out with this kind of community that I think made me who I am.”
  • Life-changing? No. But certainly life-affirming
  • “The class helped make me more secure and comfortable in my pre-existing religious beliefs,”
  • negative visualization. This entails thinking of a good thing in your life (like your gorgeous, reasonably affordable apartment) and then imagining the worst-case scenario (suddenly finding yourself homeless and without a safety net).
  • If gratitude is something that doesn’t come naturally, negative visualization can help you to get there.
  • “That’s something that I really keep in mind, especially when I feel like my mind is so trapped in thinking about future hurdles,
  • “I should be so grateful for everything that I have. Because you’re not built to notice these things.”
pier-paolo

THE CLOSE READER; Powers of Perception - The New York Times - 0 views

  • Keller's writing jars the contemporary reader in three ways. First, she composes in the grandiose manner favored by the late-19th-century genteel essayist, with lots of quotations and inverted sentences. Second, she gushes with a girlish gratefulness that registers, in our more cynical time, as more ingratiating than genuine
  • Keller violates a cardinal rule of autobiography, which is to distinguish what you have been told from what you know from experience. She narrates, as if she knew them firsthand, events from very early childhood and the first stages of her education -- neither of which she could possibly remember herself, at least not in such detail.
  • When Keller's book came out in 1903, she was criticized by one reviewer for her constant, un-self-conscious allusions to color and music. ''All her knowledge is hearsay knowledge,'' this critic wrote in The Nation, ''her very sensations are for the most part vicarious, and yet she writes of things beyond her powers of perception with the assurance of one who has verified every word.''
  • Maybe Shattuck is right and we are all like this -- creatures of language, rather than its masters. Much of what we think we know firsthand we probably picked up from books or newspapers or friends or lovers and never checked against the world at all.
  • What she knew of her own observation is exactly what we want to know from her. We want to know what it felt like to be Helen Keller. We want to locate the boundaries between what was real to her and what she was forced to imagine. At least in this book, she seems not to have known where that boundary might lie.
  • Her ability to experience what others felt and heard, she said, illustrated the power of imagination, particularly one that had been developed and extended, as hers was, by books.
  • He tries to remember what he looks like and discovers that he cannot. He asks: ''To what extent is loss of the image of the face connected with loss of the image of the self? Is this one of the reasons why I often feel that I am mere spirit, a ghost, a memory?''
  • Keller, in short, matured, both as a person and a writer. She mastered a lesson that relatively few with all their senses have ever mastered, which is to write about what you know.
Javier E

Obscurity: A Better Way to Think About Your Data Than 'Privacy' - Woodrow Hartzog and E... - 1 views

  • Obscurity is the idea that when information is hard to obtain or understand, it is, to some degree, safe. Safety, here, doesn't mean inaccessible. Competent and determined data hunters armed with the right tools can always find a way to get it. Less committed folks, however, experience great effort as a deterrent.
  • Online, obscurity is created through a combination of factors. Being invisible to search engines increases obscurity. So does using privacy settings and pseudonyms. Disclosing information in coded ways that only a limited audience will grasp enhances obscurity, too
  • What obscurity draws our attention to is that while the records were accessible to any member of the public prior to the rise of big data, more effort was required to obtain, aggregate, and publish them. In that prior context, technological constraints implicitly protected privacy interests.
  • the "you choose who to let in" narrative is powerful because it trades on traditional notions of space and boundary regulation, and further appeals to our heightened sense of individual responsibility, and, possibly even vanity. The basic message is that so long as we exercise good judgment when selecting our friends, no privacy problems will arise
  • What this appeal to status quo relations and existing privacy settings conceals is the transformative potential of Graph: new types of searching can emerge that, due to enhanced frequency and newly created associations between data points, weaken, and possibly obliterate, obscurity.
  • the stalker frame muddies the concept, implying that the problem is people with bad intentions getting our information. Determined stalkers certainly pose a threat to the obscurity of information because they represent an increased likelihood that obscure information will be found and understood.
  • The other dominant narrative emerging is that the Graph will simplify "stalking."
  • Well-intentioned searches can be problematic, too.
  • It is not a stretch to assume Graph could enable searching through the content of posts a user has liked or commented on and generating categories of interests from it. For example, users could search which of their friends are interested in politics, or, perhaps, specifically, in left-wing politics.
  • In this scenario, a user who wasn't a fan of political groups or causes, didn't list political groups or causes as interests, and didn't post political stories, could still be identified as political.
  • In a system that purportedly relies upon user control, it is still unclear how and if users will be able to detect when their personal information is no longer obscure. How will they be able to anticipate the numerous different queries that might expose previously obscure information? Will users even be aware of all the composite results including their information?
  • Obscurity is a protective state that can further a number of goals, such as autonomy, self-fulfillment, socialization, and relative freedom from the abuse of power. A major task ahead is for society to determine how much obscurity citizens need to thrive.
Javier E

The Importance of Doing Recent History | History News Network - 1 views

  • We argue that writing contemporary history is different from the role historians might play as public intellectuals who draw on their expertise to comment on recent events in the media. Instead, the writing of recent history shifts the boundaries of what might be considered a legitimate topic of historical study. The very definition of “history” has hinged on the sense of a break between past and present that allows for critical perspective. The historians’ traditional task has been to bring a “dead,” absent past back into the present. However, those doing recent history recognize that their subject matter is not fully past, or as Renee Romano puts it in our edited collection about recent history, it’s “not dead yet.”
  • studying the recent past presents real methodological challenges. It untethers the academic historian from the aspects of our practice that give us all, regardless of field or political bent, a sense of common enterprise: objectivity, perspective, a defined archive, and a secondary literature that is there to be argued with, corrected and leaned upon.
Javier E

Coping with Chaos in the White House - Medium - 0 views

  • I am not a professional and this is not a diagnosis. My post is not intended to persuade anyone or provide a comprehensive description of NPD. I am speaking purely from decades of dealing with NPD and sharing strategies that were helpful for me in coping and predicting behavior.
  • Here are a few things to keep in mind:
  • 1) It’s not curable and it’s barely treatable. He is who he is. There is no getting better, or learning, or adapting. He’s not going to “rise to the occasion” for more than maybe a couple hours. So just put that out of your mind.
  • 2) He will say whatever feels most comfortable or good to him at any given time. He will lie a lot, and say totally different things to different people. Stop being surprised by this. While it’s important to pretend “good faith” and remind him of promises, as Bernie Sanders and others are doing, that’s for his supporters, so *they* can see the inconsistency as it comes. He won’t care. So if you’re trying to reconcile or analyze his words, don’t. It’s 100% not worth your time. Only pay attention to and address his actions.
  • 3) You can influence him by making him feel good. There are already people like Bannon who appear ready to use him for their own ends. The GOP is excited to try. Watch them, not him.
  • 4) Entitlement is a key aspect of the disorder. As we are already seeing, he will likely not observe traditional boundaries of the office. He has already stated that rules don’t apply to him. This particular attribute has huge implications for the presidency and it will be important for everyone who can to hold him to the same standards as previous presidents.
  • 5) We should expect that he only cares about himself and those he views as extensions of himself, like his children. (People with NPD often can’t understand others as fully human or distinct.) He desires accumulation of wealth and power because it fills a hole.
  • He will have no qualms *at all* about stealing everything he can from the country, and he’ll be happy to help others do so, if they make him feel good. He won’t view it as stealing but rather as something he’s entitled to do. This is likely the only thing he will intentionally accomplish.
  • 6) It’s very, very confusing for non-disordered people to experience a disordered person with NPD. While often intelligent, charismatic and charming, they do not reliably observe social conventions or demonstrate basic human empathy. It’s very common for non-disordered people to lower their own expectations and try to normalize the behavior. DO NOT DO THIS
  • 7) People with NPD often recruit helpers, referred to in the literature as “enablers” when they allow or cover for bad behavior and “flying monkeys” when they perpetrate bad behavior
  • 8) People with NPD often foster competition for sport in people they control. Expect lots of chaos, firings and recriminations. He will probably behave worst toward those closest to him, but that doesn’t mean (obviously) that his actions won’t have consequences for the rest of us. He will punish enemies.
  • 9) Gaslighting — where someone tries to convince you that the reality you’ve experienced isn’t true — is real and torturous. He will gaslight, his followers will gaslight.
  • Learn the signs and find ways to stay focused on what you know to be true. Note: it is typically not helpful to argue with people who are attempting to gaslight. You will only confuse yourself. Just walk away.
  • 10) Whenever possible, do not focus on the narcissist or give him attention. Unfortunately we can’t and shouldn’t ignore the president, but don’t circulate his tweets or laugh at him — you are enabling him and getting his word out.
Javier E

The decline effect and the scientific method : The New Yorker - 3 views

  • The test of replicability, as it’s known, is the foundation of modern research. Replicability is how the community enforces itself. It’s a safeguard for the creep of subjectivity. Most of the time, scientists know what results they want, and that can influence the results they get. The premise of replicability is that the scientific community can correct for these flaws.
  • But now all sorts of well-established, multiply confirmed findings have started to look increasingly uncertain. It’s as if our facts were losing their truth: claims that have been enshrined in textbooks are suddenly unprovable.
  • This phenomenon doesn’t yet have an official name, but it’s occurring across a wide range of fields, from psychology to ecology.
  • If replication is what separates the rigor of science from the squishiness of pseudoscience, where do we put all these rigorously validated findings that can no longer be proved? Which results should we believe?
  • Schooler demonstrated that subjects shown a face and asked to describe it were much less likely to recognize the face when shown it later than those who had simply looked at it. Schooler called the phenomenon “verbal overshadowing.”
  • The most likely explanation for the decline is an obvious one: regression to the mean. As the experiment is repeated, that is, an early statistical fluke gets cancelled out. The extrasensory powers of Schooler’s subjects didn’t decline—they were simply an illusion that vanished over time.
  • yet Schooler has noticed that many of the data sets that end up declining seem statistically solid—that is, they contain enough data that any regression to the mean shouldn’t be dramatic. “These are the results that pass all the tests,” he says. “The odds of them being random are typically quite remote, like one in a million. This means that the decline effect should almost never happen. But it happens all the time!”
  • this is why Schooler believes that the decline effect deserves more attention: its ubiquity seems to violate the laws of statistics
  • In 2001, Michael Jennions, a biologist at the Australian National University, set out to analyze “temporal trends” across a wide range of subjects in ecology and evolutionary biology. He looked at hundreds of papers and forty-four meta-analyses (that is, statistical syntheses of related studies), and discovered a consistent decline effect over time, as many of the theories seemed to fade into irrelevance.
  • Jennions admits that his findings are troubling, but expresses a reluctance to talk about them publicly. “This is a very sensitive issue for scientists,” he says. “You know, we’re supposed to be dealing with hard facts, the stuff that’s supposed to stand the test of time. But when you see these trends you become a little more skeptical of things.”
  • While publication bias almost certainly plays a role in the decline effect, it remains an incomplete explanation. For one thing, it fails to account for the initial prevalence of positive results among studies that never even get submitted to journals. It also fails to explain the experience of people like Schooler, who have been unable to replicate their initial data despite their best efforts.
  • Jennions, similarly, argues that the decline effect is largely a product of publication bias, or the tendency of scientists and scientific journals to prefer positive data over null results, which is what happens when no effect is found. The bias was first identified by the statistician Theodore Sterling, in 1959, after he noticed that ninety-seven per cent of all published psychological studies with statistically significant data found the effect they were looking for
  • Sterling saw that if ninety-seven per cent of psychology studies were proving their hypotheses, either psychologists were extraordinarily lucky or they published only the outcomes of successful experiments.
  • One of his most cited papers has a deliberately provocative title: “Why Most Published Research Findings Are False.”
  • suspects that an equally significant issue is the selective reporting of results—the data that scientists choose to document in the first place. Palmer’s most convincing evidence relies on a statistical tool known as a funnel graph. When a large number of studies have been done on a single subject, the data should follow a pattern: studies with a large sample size should all cluster around a common value—the true result—whereas those with a smaller sample size should exhibit a random scattering, since they’re subject to greater sampling error. This pattern gives the graph its name, since the distribution resembles a funnel.
  • after Palmer plotted every study of fluctuating asymmetry, he noticed that the distribution of results with smaller sample sizes wasn’t random at all but instead skewed heavily toward positive results. Palmer has since documented a similar problem in several other contested subject areas. “Once I realized that selective reporting is everywhere in science, I got quite depressed,” Palmer told me. “As a researcher, you’re always aware that there might be some nonrandom patterns, but I had no idea how widespread it is.”
  • Palmer summarized the impact of selective reporting on his field: “We cannot escape the troubling conclusion that some—perhaps many—cherished generalities are at best exaggerated in their biological significance and at worst a collective illusion nurtured by strong a-priori beliefs often repeated.”
  • Palmer emphasizes that selective reporting is not the same as scientific fraud. Rather, the problem seems to be one of subtle omissions and unconscious misperceptions, as researchers struggle to make sense of their results. Stephen Jay Gould referred to this as the “shoehorning” process.
  • “A lot of scientific measurement is really hard,” Simmons told me. “If you’re talking about fluctuating asymmetry, then it’s a matter of minuscule differences between the right and left sides of an animal. It’s millimetres of a tail feather. And so maybe a researcher knows that he’s measuring a good male”—an animal that has successfully mated—“and he knows that it’s supposed to be symmetrical. Well, that act of measurement is going to be vulnerable to all sorts of perception biases. That’s not a cynical statement. That’s just the way human beings work.”
  • For Simmons, the steep rise and slow fall of fluctuating asymmetry is a clear example of a scientific paradigm, one of those intellectual fads that both guide and constrain research: after a new paradigm is proposed, the peer-review process is tilted toward positive results. But then, after a few years, the academic incentives shift—the paradigm has become entrenched—so that the most notable results are now those that disprove the theory.
  • John Ioannidis, an epidemiologist at Stanford University, argues that such distortions are a serious issue in biomedical research. “These exaggerations are why the decline has become so common,” he says. “It’d be really great if the initial studies gave us an accurate summary of things. But they don’t. And so what happens is we waste a lot of money treating millions of patients and doing lots of follow-up studies on other themes based on results that are misleading.”
  • In 2005, Ioannidis published an article in the Journal of the American Medical Association that looked at the forty-nine most cited clinical-research studies in three major medical journals.
  • the data Ioannidis found were disturbing: of the thirty-four claims that had been subject to replication, forty-one per cent had either been directly contradicted or had their effect sizes significantly downgraded.
  • the most troubling fact emerged when he looked at the test of replication: out of four hundred and thirty-two claims, only a single one was consistently replicable. “This doesn’t mean that none of these claims will turn out to be true,” he says. “But, given that most of them were done badly, I wouldn’t hold my breath.”
  • According to Ioannidis, the main problem is that too many researchers engage in what he calls “significance chasing,” or finding ways to interpret the data so that it passes the statistical test of significance—the ninety-five-per-cent boundary invented by Ronald Fisher.
  • One of the classic examples of selective reporting concerns the testing of acupuncture in different countries. While acupuncture is widely accepted as a medical treatment in various Asian countries, its use is much more contested in the West. These cultural differences have profoundly influenced the results of clinical trials.
  • The problem of selective reporting is rooted in a fundamental cognitive flaw, which is that we like proving ourselves right and hate being wrong.
  • “It feels good to validate a hypothesis,” Ioannidis said. “It feels even better when you’ve got a financial interest in the idea or your career depends upon it. And that’s why, even after a claim has been systematically disproven”—he cites, for instance, the early work on hormone replacement therapy, or claims involving various vitamins—“you still see some stubborn researchers citing the first few studies
  • That’s why Schooler argues that scientists need to become more rigorous about data collection before they publish. “We’re wasting too much time chasing after bad studies and underpowered experiments,”
  • The current “obsession” with replicability distracts from the real problem, which is faulty design.
  • “Every researcher should have to spell out, in advance, how many subjects they’re going to use, and what exactly they’re testing, and what constitutes a sufficient level of proof. We have the tools to be much more transparent about our experiments.”
  • Schooler recommends the establishment of an open-source database, in which researchers are required to outline their planned investigations and document all their results. “I think this would provide a huge increase in access to scientific work and give us a much better way to judge the quality of an experiment,”
  • scientific research will always be shadowed by a force that can’t be curbed, only contained: sheer randomness. Although little research has been done on the experimental dangers of chance and happenstance, the research that exists isn’t encouraging.
  • The disturbing implication of the Crabbe study is that a lot of extraordinary scientific data are nothing but noise. The hyperactivity of those coked-up Edmonton mice wasn’t an interesting new fact—it was a meaningless outlier, a by-product of invisible variables we don’t understand.
  • The problem, of course, is that such dramatic findings are also the most likely to get published in prestigious journals, since the data are both statistically significant and entirely unexpected
  • This suggests that the decline effect is actually a decline of illusion. While Karl Popper imagined falsification occurring with a single, definitive experiment—Galileo refuted Aristotelian mechanics in an afternoon—the process turns out to be much messier than that.
  • Many scientific theories continue to be considered true even after failing numerous experimental tests.
  • Even the law of gravity hasn’t always been perfect at predicting real-world phenomena. (In one test, physicists measuring gravity by means of deep boreholes in the Nevada desert found a two-and-a-half-per-cent discrepancy between the theoretical predictions and the actual data.)
  • Such anomalies demonstrate the slipperiness of empiricism. Although many scientific ideas generate conflicting results and suffer from falling effect sizes, they continue to get cited in the textbooks and drive standard medical practice. Why? Because these ideas seem true. Because they make sense. Because we can’t bear to let them go. And this is why the decline effect is so troubling. Not because it reveals the human fallibility of science, in which data are tweaked and beliefs shape perceptions. (Such shortcomings aren’t surprising, at least for scientists.) And not because it reveals that many of our most exciting theories are fleeting fads and will soon be rejected. (That idea has been around since Thomas Kuhn.)
  • The decline effect is troubling because it reminds us how difficult it is to prove anything. We like to pretend that our experiments define the truth for us. But that’s often not the case. Just because an idea is true doesn’t mean it can be proved. And just because an idea can be proved doesn’t mean it’s true. When the experiments are done, we still have to choose what to believe. ♦
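The statistical mechanisms quoted in the highlights above — publication bias filtering for "positive" results, followed by regression to the mean on replication — can be sketched with a toy simulation. This is a minimal illustration, not taken from the article: the study size, noise level, and threshold are all invented for demonstration, and the threshold is only a crude stand-in for a p < .05 significance filter.

```python
import random
import statistics

random.seed(42)

def run_study(true_effect=0.0, n=20, sigma=1.0):
    """Simulate one study: the mean of n noisy measurements of an effect."""
    data = [random.gauss(true_effect, sigma) for _ in range(n)]
    return statistics.mean(data)

# Run many small studies of a truly null effect, but "publish" only
# those whose observed effect clears an arbitrary positive threshold.
threshold = 0.4
published = [e for e in (run_study() for _ in range(1000)) if e > threshold]

# Replicate each published finding once, with no filter applied.
replications = [run_study() for _ in published]

print(f"published studies:       {len(published)}")
print(f"mean published effect:   {statistics.mean(published):.2f}")
print(f"mean replication effect: {statistics.mean(replications):.2f}")
```

Under these assumptions the published effects all exceed the threshold by construction, while the unfiltered replications scatter around the true value of zero — the "decline" is just the selection filter being removed, which is the regression-to-the-mean account quoted above.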
Javier E

'Messages of Shame Are Organized Around Gender' - The Atlantic - 3 views

  • Brown is a bestselling author and research professor who studies "vulnerability, courage, worthiness, and shame,
  • I recognized the culture she described in which institutionalized, nagging shame can cause people to put on so much emotional armor that they can't connect with others or access their authentic selves. I might have called those repressive forces "fear," or "self-doubt," or "insecurity," but, yeah, "shame" kind of covers all the bases.
  • the antidote to crippling shame is vulnerability. We tend to think of vulnerability as weakness; but in fact, she argues, it is the highest form of courage. To admit fear and pain, to reach out to others for help, to quiet the "gremlins" that tell us to keep our mouths shut and soldier on: this is how we become engaged, make human connections, and live "wholeheartedly."
  • "messages of shame are organized around gender." For women, she said, there are whole constellations of often contradictory expectations that, if not met, are sources of shame.
  • But for men, the overarching message is that any weakness is shameful. And since vulnerability is often perceived as weakness, it is especially risky for men to practice vulnerability.
  • men's shame is not primarily inflicted by other men. Instead, it is the women in their lives who tend to be repelled when men show the chinks in their armor.
  • Ironically, she explained, men are often pressured to open up and talk about their feelings, and they are criticized for being emotionally walled-off; but if they get too real, they are met with revulsion. She recalled the first time she realized that she had been complicit in the shaming: "Holy Shit!" she said. "I am the patriarchy!"
  • there are three main practices men, in particular, need to engage in. The first is asking for help. The second is setting boundaries; for example, not taking on work or activities that you don't want to do. And the third is apologizing and "owning it" when you are wrong.
Javier E

How One Stupid Tweet Blew Up Justine Sacco's Life - NYTimes.com - 1 views

  • I started to wonder about the recipients of our shamings, the real humans who were the virtual targets of these campaigns. So for the past two years, I’ve been interviewing individuals like Justine Sacco: everyday people pilloried brutally, most often for posting some poorly considered joke on social media. Whenever possible, I have met them in person, to truly grasp the emotional toll at the other end of our screens. The people I met were mostly unemployed, fired for their transgressions, and they seemed broken somehow — deeply confused and traumatized.
  • Read literally, she said that white people don’t get AIDS, but it seems doubtful many interpreted it that way. More likely it was her apparently gleeful flaunting of her privilege that angered people. But after thinking about her tweet for a few seconds more, I began to suspect that it wasn’t racist but a reflexive critique of white privilege — on our tendency to naïvely imagine ourselves immune from life’s horrors. Sacco, like Stone, had been yanked violently out of the context of her small social circle. Right?
  • “To me it was so insane of a comment for anyone to make,” she said. “I thought there was no way that anyone could possibly think it was literal.” (She would later write me an email to elaborate on this point. “Unfortunately, I am not a character on ‘South Park’ or a comedian, so I had no business commenting on the epidemic in such a politically incorrect manner on a public platform,” she wrote. “To put it simply, I wasn’t trying to raise awareness of AIDS or piss off the world or ruin my life. Living in America puts us in a bit of a bubble when it comes to what is going on in the third world. I was making fun of that bubble.”)
  • Her extended family in South Africa were African National Congress supporters — the party of Nelson Mandela. They were longtime activists for racial equality. When Justine arrived at the family home from the airport, one of the first things her aunt said to her was: “This is not what our family stands for. And now, by association, you’ve almost tarnished the family.”
  • I wanted to learn about the last era of American history when public shaming was a common form of punishment, so I was seeking out court transcripts from the 18th and early 19th centuries. I had assumed that the demise of public punishments was caused by the migration from villages to cities. Shame became ineffectual, I thought, because a person in the stocks could just lose himself or herself in the anonymous crowd as soon as the chastisement was over. Modernity had diminished shame’s power to shame — or so I assumed.
  • The pillory and whippings were abolished at the federal level in 1839, although Delaware kept the pillory until 1905 and whippings until 1972. An 1867 editorial in The Times excoriated the state for its obstinacy. “If [the convicted person] had previously existing in his bosom a spark of self-respect this exposure to public shame utterly extinguishes it. . . . The boy of 18 who is whipped at New Castle for larceny is in nine cases out of 10 ruined. With his self-respect destroyed and the taunt and sneer of public disgrace branded upon his forehead, he feels himself lost and abandoned by his fellows.”
  • I told her what Biddle had said — about how she was probably fine now. I was sure he wasn’t being deliberately glib, but like everyone who participates in mass online destruction, uninterested in learning that it comes with a cost.
  • “Well, I’m not fine yet,” Sacco said to me. “I had a great career, and I loved my job, and it was taken away from me, and there was a lot of glory in that. Everybody else was very happy about that.”
  • her shaming wasn’t really about her at all. Social media is so perfectly designed to manipulate our desire for approval, and that is what led to her undoing. Her tormentors were instantly congratulated as they took Sacco down, bit by bit, and so they continued to do so. Their motivation was much the same as Sacco’s own — a bid for the attention of strangers — as she milled about Heathrow, hoping to amuse people she couldn’t see.
  • Social media is, on the whole, a very bad thing. It wastes time, gives at best ephemeral pleasure with a modicum of interest, causes privacy and necessary social boundaries to disintegrate, and enriches people very much at the expense of others. Anyone can make a statement they later regret. It is now impossible to genuinely retract or escape such a statement. This is outrageous. Social media brings out the very worst in people. Rather than free speech, it also promotes - essentially requires - a ridiculous level of self-censorship or imposition of extreme global shaming. This is not a societal good.
  • Reading this article, it made me very happy to not have a Twitter account. Anyone can say something some group doesn't like and interpret its meaning in negative ways, gang up on someone and bring them down
  • Look at Sacco's tweets on her flight and at the airport...absolutely meaningless junk that has no value to anyone. Why did she feel the need to post such thoughts? Post enough mindless thoughts and you'll probably post something really, really stupid you'd wished you hadn't.
  • I do feel sorry for the guy that made a stupid joke at a conference. When he said it, it was directed to one person and someone else decided to post to the world. That kind of stuff keeps up and nobody will ever do anything remotely interesting in public for fear it is misrepresented and their life ends. Getting fired for making a (to me, anyway) harmless joke seems severe
  • The offendee, it seems to me, would have done herself and others a favor by addressing the issue directly with him. Why the need to bypass any direct communication when you can post it and shame the person for the world? That's the act of a coward and someone who's out to punish.
Javier E

Science on the Rampage by Freeman Dyson | The New York Review of Books - 0 views

  • science is only a small part of human capability. We gain knowledge of our place in the universe not only from science but also from history, art, and literature. Science is a creative interaction of observation with imagination. “Physics at the Fringe” is what happens when imagination loses touch with observation. Imagination by itself can still enlarge our vision when observation fails. The mythologies of Carter and Velikovsky fail to be science, but they are works of art and high imagining. As William Blake told us long ago, “You never know what is enough unless you know what is more than enough.”
  • Over most of the territory of physics, theorists and experimenters are engaged in a common enterprise, and theories are tested rigorously by experiment. The theorists listen to the voice of nature speaking through experimental tools. This was true for the great theorists of the early twentieth century, Einstein and Heisenberg and Schrödinger, whose revolutionary theories of relativity and quantum mechanics were tested by precise experiments and found to fit the facts of nature. The new mathematical abstractions fit the facts, while the old mechanical models did not.
  • String cosmology is different. String cosmology is a part of theoretical physics that has become detached from experiments. String cosmologists are free to imagine universes and multiverses, guided by intuition and aesthetic judgment alone. Their creations must be logically consistent and mathematically elegant, but they are otherwise unconstrained.
  • The fringe of physics is not a sharp boundary with truth on one side and fantasy on the other. All of science is uncertain and subject to revision. The glory of science is to imagine more than we can prove. The fringe is the unexplored territory where truth and fantasy are not yet disentangled.