
TOK Friends / Group items tagged: 20th century


Javier E

Where We Went Wrong | Harvard Magazine - 0 views

  • John Kenneth Galbraith assessed the trajectory of America’s increasingly “affluent society.” His outlook was not a happy one. The nation’s increasingly evident material prosperity was not making its citizens any more satisfied. Nor, at least in its existing form, was it likely to do so
  • One reason, Galbraith argued, was the glaring imbalance between the opulence in consumption of private goods and the poverty, often squalor, of public services like schools and parks
  • Another was that even the bountifully supplied private goods often satisfied no genuine need, or even desire; a vast advertising apparatus generated artificial demand for them, and satisfying this demand failed to provide meaningful or lasting satisfaction.
  • ...28 more annotations...
  • economist J. Bradford DeLong ’82, Ph.D. ’87, looking back on the twentieth century two decades after its end, comes to a similar conclusion but on different grounds.
  • DeLong, professor of economics at Berkeley, looks to matters of “contingency” and “choice”: at key junctures the economy suffered “bad luck,” and the actions taken by the responsible policymakers were “incompetent.”
  • these were “the most consequential years of all humanity’s centuries.” The changes they saw, while in the first instance economic, also “shaped and transformed nearly everything sociological, political, and cultural.”
  • DeLong’s look back over the twentieth century energetically encompasses political and social trends as well; nor is his scope limited to the United States. The result is a work of strikingly expansive breadth and scope
  • labeling the book an economic history fails to convey its sweeping frame.
  • The century that is DeLong’s focus is what he calls the “long twentieth century,” running from just after the Civil War to the end of the 2000s when a series of events, including the biggest financial crisis since the 1930s followed by likewise the most severe business downturn, finally rendered the advanced Western economies “unable to resume economic growth at anything near the average pace that had been the rule since 1870.”
  • And behind those missteps in policy stood not just failures of economic thinking but a voting public that reacted perversely, even if understandably, to the frustrations poor economic outcomes had brought them.
  • Within this 140-year span, DeLong identifies two eras of “El Dorado” economic growth, each facilitated by expanding globalization, and each driven by rapid advances in technology and changes in business organization for applying technology to economic ends
  • from 1870 to World War I, and again from World War II to 1973
  • fellow economist Robert J. Gordon ’62, who in his monumental treatise The Rise and Fall of American Growth (reviewed in “How America Grew,” May-June 2016, page 68) hailed 1870-1970 as a “special century” in this regard (interrupted midway by the disaster of the 1930s).
  • Gordon highlighted the role of a cluster of once-for-all-time technological advances—the steam engine, railroads, electrification, the internal combustion engine, radio and television, powered flight
  • Pessimistic that future technological advances (most obviously, the computer and electronics revolutions) will generate productivity gains to match those of the special century, Gordon therefore saw little prospect of a return to the rapid growth of those halcyon days.
  • DeLong instead points to a series of noneconomic (and non-technological) events that slowed growth, followed by a perverse turn in economic policy triggered in part by public frustration: In 1973 the OPEC cartel tripled the price of oil, and then quadrupled it yet again six years later.
  • For all too many Americans (and citizens of other countries too), the combination of high inflation and sluggish growth meant that “social democracy was no longer delivering the rapid progress toward utopia that it had delivered in the first post-World War II generation.”
  • Frustration over these and other ills in turn spawned what DeLong calls the “neoliberal turn” in public attitudes and economic policy. The new economic policies introduced under this rubric “did not end the slowdown in productivity growth but reinforced it.”
  • the tax and regulatory changes enacted in this new climate channeled most of what economic gains there were to people already at the top of the income scale
  • Meanwhile, progressive “inclusion” of women and African Americans in the economy (and in American society more broadly) meant that middle- and lower-income white men saw even smaller gains—and, perversely, reacted by providing still greater support for policies like tax cuts for those with far higher incomes than their own.
  • Daniel Bell’s argument in his 1976 classic The Cultural Contradictions of Capitalism. Bell famously suggested that the very success of a capitalist economy would eventually undermine a society’s commitment to the values and institutions that made capitalism possible in the first place.
  • In DeLong’s view, the “greatest cause” of the neoliberal turn was “the extraordinary pace of rising prosperity during the Thirty Glorious Years, which raised the bar that a political-economic order had to surpass in order to generate broad acceptance.” At the same time, “the fading memory of the Great Depression led to the fading of the belief, or rather recognition, by the middle class that they, as well as the working class, needed social insurance.”
  • what the economy delivered to “hard-working white men” no longer matched what they saw as their just deserts: in their eyes, “the rich got richer, the unworthy and minority poor got handouts.”
  • As Bell would have put it, the politics of entitlement, bred by years of economic success that so many people had come to take for granted, squeezed out the politics of opportunity and ambition, giving rise to the politics of resentment.
  • The new era therefore became “a time to question the bourgeois virtues of hard, regular work and thrift in pursuit of material abundance.”
  • DeLong’s unspoken agenda would surely include rolling back many of the changes made in the U.S. tax code over the past half-century, as well as reinvigorating antitrust policy to blunt the dominance, and therefore outsize profits, of the mega-firms that now tower over key sectors of the economy
  • He would also surely reverse the recent trend moving away from free trade. Central bankers should certainly behave like Paul Volcker (appointed by President Carter), whose decisive action finally broke the 1970s inflation even at considerable economic cost
  • Not only Galbraith’s main themes but many of his more specific observations as well seem as pertinent, and important, today as they did then.
  • What will future readers of Slouching Towards Utopia conclude?
  • If anything, DeLong’s narratives will become more valuable as those events fade into the past. Alas, his description of fascism as having at its center “a contempt for limits, especially those implied by reason-based arguments; a belief that reality could be altered by the will; and an exaltation of the violent assertion of that will as the ultimate argument” will likely strike a nerve with many Americans not just today but in years to come.
  • what about DeLong’s core explanation of what went wrong in the latter third of his, and our, “long century”? I predict that it too will still look right, and important.
Javier E

Why the Past 10 Years of American Life Have Been Uniquely Stupid - The Atlantic - 0 views

  • Social scientists have identified at least three major forces that collectively bind together successful democracies: social capital (extensive social networks with high levels of trust), strong institutions, and shared stories.
  • Social media has weakened all three.
  • gradually, social-media users became more comfortable sharing intimate details of their lives with strangers and corporations. As I wrote in a 2019 Atlantic article with Tobias Rose-Stockwell, they became more adept at putting on performances and managing their personal brand—activities that might impress others but that do not deepen friendships in the way that a private phone conversation will.
  • ...118 more annotations...
  • the stage was set for the major transformation, which began in 2009: the intensification of viral dynamics.
  • Before 2009, Facebook had given users a simple timeline––a never-ending stream of content generated by their friends and connections, with the newest posts at the top and the oldest ones at the bottom
  • That began to change in 2009, when Facebook offered users a way to publicly “like” posts with the click of a button. That same year, Twitter introduced something even more powerful: the “Retweet” button, which allowed users to publicly endorse a post while also sharing it with all of their followers.
  • “Like” and “Share” buttons quickly became standard features of most other platforms.
  • Facebook developed algorithms to bring each user the content most likely to generate a “like” or some other interaction, eventually including the “share” as well.
  • Later research showed that posts that trigger emotions––especially anger at out-groups––are the most likely to be shared.
  • By 2013, social media had become a new game, with dynamics unlike those in 2008. If you were skillful or lucky, you might create a post that would “go viral” and make you “internet famous”
  • If you blundered, you could find yourself buried in hateful comments. Your posts rode to fame or ignominy based on the clicks of thousands of strangers, and you in turn contributed thousands of clicks to the game.
  • This new game encouraged dishonesty and mob dynamics: Users were guided not just by their true preferences but by their past experiences of reward and punishment.
  • As a social psychologist who studies emotion, morality, and politics, I saw this happening too. The newly tweaked platforms were almost perfectly designed to bring out our most moralistic and least reflective selves. The volume of outrage was shocking.
  • It was just this kind of twitchy and explosive spread of anger that James Madison had tried to protect us from as he was drafting the U.S. Constitution.
  • The Framers of the Constitution were excellent social psychologists. They knew that democracy had an Achilles’ heel because it depended on the collective judgment of the people, and democratic communities are subject to “the turbulency and weakness of unruly passions.”
  • The key to designing a sustainable republic, therefore, was to build in mechanisms to slow things down, cool passions, require compromise, and give leaders some insulation from the mania of the moment while still holding them accountable to the people periodically, on Election Day.
  • The tech companies that enhanced virality from 2009 to 2012 brought us deep into Madison’s nightmare.
  • a less quoted yet equally important insight, about democracy’s vulnerability to triviality.
  • Madison notes that people are so prone to factionalism that “where no substantial occasion presents itself, the most frivolous and fanciful distinctions have been sufficient to kindle their unfriendly passions and excite their most violent conflicts.”
  • Social media has both magnified and weaponized the frivolous.
  • It’s not just the waste of time and scarce attention that matters; it’s the continual chipping-away of trust.
  • a democracy depends on widely internalized acceptance of the legitimacy of rules, norms, and institutions.
  • when citizens lose trust in elected leaders, health authorities, the courts, the police, universities, and the integrity of elections, then every decision becomes contested; every election becomes a life-and-death struggle to save the country from the other side
  • The most recent Edelman Trust Barometer (an international measure of citizens’ trust in government, business, media, and nongovernmental organizations) showed stable and competent autocracies (China and the United Arab Emirates) at the top of the list, while contentious democracies such as the United States, the United Kingdom, Spain, and South Korea scored near the bottom (albeit above Russia).
  • The literature is complex—some studies show benefits, particularly in less developed democracies—but the review found that, on balance, social media amplifies political polarization; foments populism, especially right-wing populism; and is associated with the spread of misinformation.
  • When people lose trust in institutions, they lose trust in the stories told by those institutions. That’s particularly true of the institutions entrusted with the education of children.
  • Facebook and Twitter make it possible for parents to become outraged every day over a new snippet from their children’s history lessons––and math lessons and literature selections, and any new pedagogical shifts anywhere in the country
  • The motives of teachers and administrators come into question, and overreaching laws or curricular reforms sometimes follow, dumbing down education and reducing trust in it further.
  • young people educated in the post-Babel era are less likely to arrive at a coherent story of who we are as a people, and less likely to share any such story with those who attended different schools or who were educated in a different decade.
  • former CIA analyst Martin Gurri predicted these fracturing effects in his 2014 book, The Revolt of the Public. Gurri’s analysis focused on the authority-subverting effects of information’s exponential growth, beginning with the internet in the 1990s. Writing nearly a decade ago, Gurri could already see the power of social media as a universal solvent, breaking down bonds and weakening institutions everywhere it reached.
  • he notes a constructive feature of the pre-digital era: a single “mass audience,” all consuming the same content, as if they were all looking into the same gigantic mirror at the reflection of their own society.
  • The digital revolution has shattered that mirror, and now the public inhabits those broken pieces of glass. So the public isn’t one thing; it’s highly fragmented, and it’s basically mutually hostile
  • Facebook, Twitter, YouTube, and a few other large platforms unwittingly dissolved the mortar of trust, belief in institutions, and shared stories that had held a large and diverse secular democracy together.
  • I think we can date the fall of the tower to the years between 2011 (Gurri’s focal year of “nihilistic” protests) and 2015, a year marked by the “great awokening” on the left and the ascendancy of Donald Trump on the right.
  • Twitter can overpower all the newspapers in the country, and stories cannot be shared (or at least trusted) across more than a few adjacent fragments—so truth cannot achieve widespread adherence.
  • After Babel, nothing really means anything anymore––at least not in a way that is durable and on which people widely agree.
  • Politics After Babel
  • “Politics is the art of the possible,” the German statesman Otto von Bismarck said in 1867. In a post-Babel democracy, not much may be possible.
  • The ideological distance between the two parties began increasing faster in the 1990s. Fox News and the 1994 “Republican Revolution” converted the GOP into a more combative party.
  • So cross-party relationships were already strained before 2009. But the enhanced virality of social media thereafter made it more hazardous to be seen fraternizing with the enemy or even failing to attack the enemy with sufficient vigor.
  • What changed in the 2010s? Let’s revisit that Twitter engineer’s metaphor of handing a loaded gun to a 4-year-old. A mean tweet doesn’t kill anyone; it is an attempt to shame or punish someone publicly while broadcasting one’s own virtue, brilliance, or tribal loyalties. It’s more a dart than a bullet
  • from 2009 to 2012, Facebook and Twitter passed out roughly 1 billion dart guns globally. We’ve been shooting one another ever since.
  • The group furthest to the right, the “devoted conservatives,” comprised 6 percent of the U.S. population.
  • the warped “accountability” of social media has also brought injustice—and political dysfunction—in three ways.
  • First, the dart guns of social media give more power to trolls and provocateurs while silencing good citizens.
  • a small subset of people on social-media platforms are highly concerned with gaining status and are willing to use aggression to do so.
  • Across eight studies, Bor and Petersen found that being online did not make most people more aggressive or hostile; rather, it allowed a small number of aggressive people to attack a much larger set of victims. Even a small number of jerks were able to dominate discussion forums.
  • Additional research finds that women and Black people are harassed disproportionately, so the digital public square is less welcoming to their voices.
  • Second, the dart guns of social media give more power and voice to the political extremes while reducing the power and voice of the moderate majority.
  • The “Hidden Tribes” study, by the pro-democracy group More in Common, surveyed 8,000 Americans in 2017 and 2018 and identified seven groups that shared beliefs and behaviors.
  • Social media has given voice to some people who had little previously, and it has made it easier to hold powerful people accountable for their misdeeds
  • The group furthest to the left, the “progressive activists,” comprised 8 percent of the population. The progressive activists were by far the most prolific group on social media: 70 percent had shared political content over the previous year. The devoted conservatives followed, at 56 percent.
  • These two extreme groups are similar in surprising ways. They are the whitest and richest of the seven groups, which suggests that America is being torn apart by a battle between two subsets of the elite who are not representative of the broader society.
  • they are the two groups that show the greatest homogeneity in their moral and political attitudes.
  • likely a result of thought-policing on social media:
  • political extremists don’t just shoot darts at their enemies; they spend a lot of their ammunition targeting dissenters or nuanced thinkers on their own team.
  • Finally, by giving everyone a dart gun, social media deputizes everyone to administer justice with no due process. Platforms like Twitter devolve into the Wild West, with no accountability for vigilantes.
  • Enhanced-virality platforms thereby facilitate massive collective punishment for small or imagined offenses, with real-world consequences, including innocent people losing their jobs and being shamed into suicide
  • we don’t get justice and inclusion; we get a society that ignores context, proportionality, mercy, and truth.
  • Since the tower fell, debates of all kinds have grown more and more confused. The most pervasive obstacle to good thinking is confirmation bias, which refers to the human tendency to search only for evidence that confirms our preferred beliefs
  • search engines were supercharging confirmation bias, making it far easier for people to find evidence for absurd beliefs and conspiracy theories.
  • The most reliable cure for confirmation bias is interaction with people who don’t share your beliefs. They confront you with counterevidence and counterargument.
  • In his book The Constitution of Knowledge, Jonathan Rauch describes the historical breakthrough in which Western societies developed an “epistemic operating system”—that is, a set of institutions for generating knowledge from the interactions of biased and cognitively flawed individuals
  • English law developed the adversarial system so that biased advocates could present both sides of a case to an impartial jury.
  • Newspapers full of lies evolved into professional journalistic enterprises, with norms that required seeking out multiple sides of a story, followed by editorial review, followed by fact-checking.
  • Universities evolved from cloistered medieval institutions into research powerhouses, creating a structure in which scholars put forth evidence-backed claims with the knowledge that other scholars around the world would be motivated to gain prestige by finding contrary evidence.
  • Part of America’s greatness in the 20th century came from having developed the most capable, vibrant, and productive network of knowledge-producing institutions in all of human history
  • But this arrangement, Rauch notes, “is not self-maintaining; it relies on an array of sometimes delicate social settings and understandings, and those need to be understood, affirmed, and protected.”
  • This, I believe, is what happened to many of America’s key institutions in the mid-to-late 2010s. They got stupider en masse because social media instilled in their members a chronic fear of getting darted
  • it was so pervasive that it established new behavioral norms backed by new policies seemingly overnight
  • Participants in our key institutions began self-censoring to an unhealthy degree, holding back critiques of policies and ideas—even those presented in class by their students—that they believed to be ill-supported or wrong.
  • The stupefying process plays out differently on the right and the left because their activist wings subscribe to different narratives with different sacred values.
  • The “Hidden Tribes” study tells us that the “devoted conservatives” score highest on beliefs related to authoritarianism. They share a narrative in which America is eternally under threat from enemies outside and subversives within; they see life as a battle between patriots and traitors.
  • they are psychologically different from the larger group of “traditional conservatives” (19 percent of the population), who emphasize order, decorum, and slow rather than radical change.
  • The traditional punishment for treason is death, hence the battle cry on January 6: “Hang Mike Pence.”
  • Right-wing death threats, many delivered by anonymous accounts, are proving effective in cowing traditional conservatives
  • The wave of threats delivered to dissenting Republican members of Congress has similarly pushed many of the remaining moderates to quit or go silent, giving us a party ever more divorced from the conservative tradition, constitutional responsibility, and reality.
  • The stupidity on the right is most visible in the many conspiracy theories spreading across right-wing media and now into Congress.
  • The Democrats have also been hit hard by structural stupidity, though in a different way. In the Democratic Party, the struggle between the progressive wing and the more moderate factions is open and ongoing, and often the moderates win.
  • The problem is that the left controls the commanding heights of the culture: universities, news organizations, Hollywood, art museums, advertising, much of Silicon Valley, and the teachers’ unions and teaching colleges that shape K–12 education. And in many of those institutions, dissent has been stifled:
  • Liberals in the late 20th century shared a belief that the sociologist Christian Smith called the “liberal progress” narrative, in which America used to be horrifically unjust and repressive, but, thanks to the struggles of activists and heroes, has made (and continues to make) progress toward realizing the noble promise of its founding.
  • It is also the view of the “traditional liberals” in the “Hidden Tribes” study (11 percent of the population), who have strong humanitarian values, are older than average, and are largely the people leading America’s cultural and intellectual institutions.
  • when the newly viralized social-media platforms gave everyone a dart gun, it was younger progressive activists who did the most shooting, and they aimed a disproportionate number of their darts at these older liberal leaders.
  • Confused and fearful, the leaders rarely challenged the activists or their nonliberal narrative in which life at every institution is an eternal battle among identity groups over a zero-sum pie, and the people on top got there by oppressing the people on the bottom. This new narrative is rigidly egalitarian––focused on equality of outcomes, not of rights or opportunities. It is unconcerned with individual rights.
  • The universal charge against people who disagree with this narrative is not “traitor”; it is “racist,” “transphobe,” “Karen,” or some related scarlet letter marking the perpetrator as one who hates or harms a marginalized group.
  • The punishment that feels right for such crimes is not execution; it is public shaming and social death.
  • anyone on Twitter had already seen dozens of examples teaching the basic lesson: Don’t question your own side’s beliefs, policies, or actions. And when traditional liberals go silent, as so many did in the summer of 2020, the progressive activists’ more radical narrative takes over as the governing narrative of an organization.
  • This is why so many epistemic institutions seemed to “go woke” in rapid succession that year and the next, beginning with a wave of controversies and resignations at The New York Times and other newspapers, and continuing on to social-justice pronouncements by groups of doctors and medical associations
  • The problem is structural. Thanks to enhanced-virality social media, dissent is punished within many of our institutions, which means that bad ideas get elevated into official policy.
  • In a 2018 interview, Steve Bannon, the former adviser to Donald Trump, said that the way to deal with the media is “to flood the zone with shit.” He was describing the “firehose of falsehood” tactic pioneered by Russian disinformation programs to keep Americans confused, disoriented, and angry.
  • artificial intelligence is close to enabling the limitless spread of highly believable disinformation. The AI program GPT-3 is already so good that you can give it a topic and a tone and it will spit out as many essays as you like, typically with perfect grammar and a surprising level of coherence.
  • Renée DiResta, the research manager at the Stanford Internet Observatory, explained that spreading falsehoods—whether through text, images, or deep-fake videos—will quickly become inconceivably easy. (She co-wrote the essay with GPT-3.)
  • American factions won’t be the only ones using AI and social media to generate attack content; our adversaries will too.
  • In the 20th century, America’s shared identity as the country leading the fight to make the world safe for democracy was a strong force that helped keep the culture and the polity together.
  • In the 21st century, America’s tech companies have rewired the world and created products that now appear to be corrosive to democracy, obstacles to shared understanding, and destroyers of the modern tower.
  • What changes are needed?
  • I can suggest three categories of reforms––three goals that must be achieved if democracy is to remain viable in the post-Babel era.
  • We must harden democratic institutions so that they can withstand chronic anger and mistrust, reform social media so that it becomes less socially corrosive, and better prepare the next generation for democratic citizenship in this new age.
  • Harden Democratic Institutions
  • we must reform key institutions so that they can continue to function even if levels of anger, misinformation, and violence increase far above those we have today.
  • Reforms should reduce the outsize influence of angry extremists and make legislators more responsive to the average voter in their district.
  • One example of such a reform is to end closed party primaries, replacing them with a single, nonpartisan, open primary from which the top several candidates advance to a general election that also uses ranked-choice voting
  • A second way to harden democratic institutions is to reduce the power of either political party to game the system in its favor, for example by drawing its preferred electoral districts or selecting the officials who will supervise elections
  • These jobs should all be done in a nonpartisan way.
  • Reform Social Media
  • Social media’s empowerment of the far left, the far right, domestic trolls, and foreign agents is creating a system that looks less like democracy and more like rule by the most aggressive.
  • it is within our power to reduce social media’s ability to dissolve trust and foment structural stupidity. Reforms should limit the platforms’ amplification of the aggressive fringes while giving more voice to what More in Common calls “the exhausted majority.”
  • the main problem with social media is not that some people post fake or toxic stuff; it’s that fake and outrage-inducing content can now attain a level of reach and influence that was not possible before
  • Perhaps the biggest single change that would reduce the toxicity of existing platforms would be user verification as a precondition for gaining the algorithmic amplification that social media offers.
  • One of the first orders of business should be compelling the platforms to share their data and their algorithms with academic researchers.
  • Prepare the Next Generation
  • Childhood has become more tightly circumscribed in recent generations––with less opportunity for free, unstructured play; less unsupervised time outside; more time online. Whatever else the effects of these shifts, they have likely impeded the development of abilities needed for effective self-governance for many young adults
  • Depression makes people less likely to want to engage with new people, ideas, and experiences. Anxiety makes new things seem more threatening. As these conditions have risen and as the lessons on nuanced social behavior learned through free play have been delayed, tolerance for diverse viewpoints and the ability to work out disputes have diminished among many young people
  • Students did not just say that they disagreed with visiting speakers; some said that those lectures would be dangerous, emotionally devastating, a form of violence. Because rates of teen depression and anxiety have continued to rise into the 2020s, we should expect these views to continue in the generations to follow, and indeed to become more severe.
  • The most important change we can make to reduce the damaging effects of social media on children is to delay entry until they have passed through puberty.
  • The age should be raised to at least 16, and companies should be held responsible for enforcing it.
  • Let them out to play. Stop starving children of the experiences they most need to become good citizens: free play in mixed-age groups of children with minimal adult supervision.
  • while social media has eroded the art of association throughout society, it may be leaving its deepest and most enduring marks on adolescents. A surge in rates of anxiety, depression, and self-harm among American teens began suddenly in the early 2010s. (The same thing happened to Canadian and British teens, at the same time.) The cause is not known, but the timing points to social media as a substantial contributor—the surge began just as the large majority of American teens became daily users of the major platforms.
  • What would it be like to live in Babel in the days after its destruction? We know. It is a time of confusion and loss. But it is also a time to reflect, listen, and build.
  • In recent years, Americans have started hundreds of groups and organizations dedicated to building trust and friendship across the political divide, including BridgeUSA, Braver Angels (on whose board I serve), and many others listed at BridgeAlliance.us. We cannot expect Congress and the tech companies to save us. We must change ourselves and our communities.
  • when we look away from our dysfunctional federal government, disconnect from social media, and talk with our neighbors directly, things seem more hopeful. Most Americans in the More in Common report are members of the “exhausted majority,” which is tired of the fighting and is willing to listen to the other side and compromise. Most Americans now see that social media is having a negative impact on the country, and are becoming more aware of its damaging effects on children.
caelengrubb

How Einstein Challenged Newtonian Physics - 0 views

  • Any discussion of Einstein should begin with what is probably his single greatest contribution to physics—the theory of relativity.
  • Between the late 1600s and the beginning of the 20th century, the field of physics was dominated by the ideas of Isaac Newton. The Newtonian laws of motion and gravitation had, up to that point in time, been the most successful scientific theory in all of history.
  • Newton’s ideas were, of course, challenged from time to time during those two centuries, but these ideas always seemed to hold up
  • ...9 more annotations...
  • There were many new phenomena that were discovered and that came to be understood in the centuries that followed Newton’s era. Take electricity and magnetism, for example. Until the 19th century, we didn’t really know what electricity or magnetism were, or how they worked. Isaac Newton certainly didn’t have a clue.
  • To many physicists around the turn of the 20th century, the state of physics seemed very settled. The Newtonian worldview had been very successful, and for a very long time.
  • In 1905, however, a revolution in physics did come. And perhaps even more surprising than the revolution itself was where that revolution came from. In 1905, Albert Einstein was not working as a professor at some prestigious university. He was not famous, or even well-known among other physicists.
  • Things didn’t stay this way for long, however. In 1905, Einstein wrote not one or two, but four absolutely groundbreaking papers. Any one of these four papers would have made him a star within the field of physics, and would have certainly secured him a position of prominence in the history of science.
  • It seems that having so many breakthroughs of this magnitude in such a short period of time had never happened before, and has never happened since. In the first of Einstein’s 1905 papers, he proposed that light doesn’t only behave like a wave, but that it is also made up of individual pieces or particles.
  • But Einstein’s paper provided concrete empirical evidence that atoms were, in fact, real and tangible objects. He was even able to use these arguments to make a pretty good estimate for the size and mass of atoms and molecules. It was a huge step forward.
  • The equations that physicists use to describe the propagation of light waves—what are known as Maxwell’s equations—predict that light should move through space at a speed of about 670 million miles per hour. And more interestingly, these equations don’t make any reference to any medium that the light waves propagate through.
  • Although no experiment had ever detected this aether, they argued that it must fill virtually all of space. After all, they argued, the light from a distant star could only reach us if there was a continuous path filled with aether, extending all the way from the star to us.
  • Eventually, though, physicists discovered that there was no aether. It would be Einstein who would come up with an equation to explain this conundrum.
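
A quick numerical check of the "670 million miles per hour" figure in the notes above: the speed of electromagnetic waves follows from the two vacuum constants in Maxwell's equations, c = 1/sqrt(mu0 * eps0). This is a minimal sketch using standard textbook values for the constants (the values are assumptions of the sketch, not taken from the article):

```python
import math

# Standard vacuum constants in SI units (textbook values, assumed here).
mu_0 = 4 * math.pi * 1e-7        # vacuum permeability, H/m
epsilon_0 = 8.8541878128e-12     # vacuum permittivity, F/m

# Maxwell's equations give the speed of electromagnetic waves in vacuum.
c_m_per_s = 1 / math.sqrt(mu_0 * epsilon_0)

# Convert metres per second to miles per hour (1 mile = 1609.344 m).
c_mph = c_m_per_s * 3600 / 1609.344

print(f"c = {c_m_per_s:.3e} m/s = {c_mph / 1e6:.0f} million mph")
# Prints roughly 2.998e8 m/s, about 671 million miles per hour,
# consistent with the figure quoted in the annotation above.
```

That this value falls out of the field equations alone, with no reference to any medium, is exactly the puzzle the aether was invented to solve.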
blythewallick

Why Americans turn to conspiracy theories - The Washington Post - 0 views

  • As the impeachment inquiry heats up, members of Congress and the media are left with the difficult job of untangling the conspiracy theory that seems to have driven the president’s actions in Ukraine: a wild tale of a missing computer server whisked off to Eastern Europe for nefarious, if never entirely clear, purposes, and something involving Joe Biden, his son Hunter and, for good measure, China, too.
  • Seeing the full ideological array of conspiratorial thinking and understanding its deep history are essential to understanding how paranoid thinking about Russian conspiracies, which so troubled the McCarthyites in the 1950s and 1960s, could jump from right to left in the wake of the 2016 election.
  • Republican fears of power’s expansionist tendencies spurred the revolutionary generation to regard British taxation after 1763 as not simply a deviation from prior norms, but as the first step on a swift descent toward political enslavement. American revolutionaries were not simply whiny about taxes; they were paranoid.
  • ...5 more annotations...
  • Did Federalists just use the specter of the Illuminati to tar their rivals? Or did they mean it? Did the Jeffersonians really think the Federalists were conspiring to bring back monarchy as they alleged? Or were they just trying to win elections? The answer depends on who and when, but it’s safe to say that some did believe these theories.
  • Conspiracy theory after theory, Americans cast a paranoid eye on their partisan opponents throughout the 19th and 20th centuries. According to the Oxford English Dictionary, the phrase “conspiracy theory” first appeared in the early 20th century United States, in the context of political histories of the 19th century.
  • Democrats’ anxieties about Russian conspiracies to interfere in the 2016 campaign cannot be extricated from this historical context of paranoia just because they have a significant basis in fact. As Joseph Heller wrote, “Just because you’re paranoid doesn’t mean they’re not after you.”
  • The republican political theory underlying the American paranoid style had its origin in the writings of opposition politicians in 18th-century Britain. Since then, conspiratorial thinking has remained most attractive to opposition parties seeking to discredit their establishment rivals. This is the nature of Trump’s criticism of Democratic investigations of Russian conspiracies to hack the 2016 campaign. They’re just whining because they lost, Trump has said repeatedly.
  • If Trump’s embrace of the Ukraine conspiracy doesn’t sink his political future by leading to impeachment, it may nonetheless signal that his political future is bleak.
Javier E

In Defense of Facts - The Atlantic - 1 views

  • over 13 years, he has published a series of anthologies—of the contemporary American essay, of the world essay, and now of the historical American essay—that misrepresents what the essay is and does, that falsifies its history, and that contains, among its numerous selections, very little one would reasonably classify within the genre. And all of this to wide attention and substantial acclaim
  • D’Agata’s rationale for his “new history,” to the extent that one can piece it together from the headnotes that preface each selection, goes something like this. The conventional essay, nonfiction as it is, is nothing more than a delivery system for facts. The genre, as a consequence, has suffered from a chronic lack of critical esteem, and thus of popular attention. The true essay, however, deals not in knowing but in “unknowing”: in uncertainty, imagination, rumination; in wandering and wondering; in openness and inconclusion
  • Every piece of this is false in one way or another.
  • ...31 more annotations...
  • There are genres whose principal business is fact—journalism, history, popular science—but the essay has never been one of them. If the form possesses a defining characteristic, it is that the essay makes an argument
  • That argument can rest on fact, but it can also rest on anecdote, or introspection, or cultural interpretation, or some combination of all these and more
  • what makes a personal essay an essay and not just an autobiographical narrative is precisely that it uses personal material to develop, however speculatively or intuitively, a larger conclusion.
  • Nonfiction is the source of the narcissistic injury that seems to drive him. “Nonfiction,” he suggests, is like saying “not art,” and if D’Agata, who has himself published several volumes of what he refers to as essays, desires a single thing above all, it is to be known as a maker of art.
  • D’Agata tells us that the term has been in use since about 1950. In fact, it was coined in 1867 by the staff of the Boston Public Library and entered widespread circulation after the turn of the 20th century. The concept’s birth and growth, in other words, did coincide with the rise of the novel to literary preeminence, and nonfiction did long carry an odor of disesteem. But that began to change at least as long ago as the 1960s, with the New Journalism and the “nonfiction novel.”
  • What we really seem to get in D’Agata’s trilogy, in other words, is a compendium of writing that the man himself just happens to like, or that he wants to appropriate as a lineage for his own work.
  • What it’s like is abysmal: partial to trivial formal experimentation, hackneyed artistic rebellion, opaque expressions of private meaning, and modish political posturing
  • If I bought a bag of chickpeas and opened it to find that it contained some chickpeas, some green peas, some pebbles, and some bits of goat poop, I would take it back to the store. And if the shopkeeper said, “Well, they’re ‘lyric’ chickpeas,” I would be entitled to say, “You should’ve told me that before I bought them.”
  • when he isn’t cooking quotes or otherwise fudging the record, he is simply indifferent to issues of factual accuracy, content to rely on a mixture of guesswork, hearsay, and his own rather faulty memory.
  • His rejoinders are more commonly a lot more hostile—not to mention juvenile (“Wow, Jim, your penis must be so much bigger than mine”), defensive, and in their overarching logic, deeply specious. He’s not a journalist, he insists; he’s an essayist. He isn’t dealing in anything as mundane as the facts; he’s dealing in “art, dickhead,” in “poetry,” and there are no rules in art.
  • D’Agata replies that there is something between history and fiction. “We all believe in emotional truths that could never hold water, but we still cling to them and insist on their relevance.” The “emotional truths” here, of course, are D’Agata’s, not Presley’s. If it feels right to say that tae kwon do was invented in ancient India (not modern Korea, as Fingal discovers it was), then that is when it was invented. The term for this is truthiness.
  • D’Agata clearly wants to have it both ways. He wants the imaginative freedom of fiction without relinquishing the credibility (and for some readers, the significance) of nonfiction. He has his fingers crossed, and he’s holding them behind his back. “John’s a different kind of writer,” an editor explains to Fingal early in the book. Indeed he is. But the word for such a writer isn’t essayist. It’s liar.
  • The point of all this nonsense, and a great deal more just like it, is to advance an argument about the essay and its history. The form, D’Agata’s story seems to go, was neglected during the long ages that worshiped “information” but slowly emerged during the 19th and 20th centuries as artists learned to defy convention and untrammel their imaginations, coming fully into its own over the past several decades with the dawning recognition of the illusory nature of knowledge.
  • Most delectable is when he speaks about “the essay’s traditional ‘five-paragraph’ form.” I almost fell off my chair when I got to that one. The five-paragraph essay—introduction, three body paragraphs, conclusion; stultifying, formulaic, repetitive—is the province of high-school English teachers. I have never met one outside of a classroom, and like any decent college writing instructor, I never failed to try to wean my students away from them. The five-paragraph essay isn’t an essay; it’s a paper.
  • When he refers to his selections as essays, he does more than falsify the essay as a genre. He also effaces all the genres that they do belong to: not only poetry, fiction, journalism, and travel, but, among his older choices, history, parable, satire, the sermon, and more—genres that possess their own particular traditions, conventions, and expectations.
  • —by ignoring the actual contexts of his selections, and thus their actual intentions—D’Agata makes the familiar contemporary move of imposing his own conceits and concerns upon the past. That is how ethnography turns into “song,” Socrates into an essayist, and the whole of literary history into a single man’s “emotional truth.”
  • The history of the essay is indeed intertwined with “facts,” but in a very different way than D’Agata imagines. D’Agata’s mind is Manichaean. Facts bad, imagination good
  • What he fails to understand is that facts and the essay are not antagonists but siblings, offspring of the same historical moment
  • one needs to recognize that facts themselves have a history.
  • Facts are not just any sort of knowledge, such as also existed in the ancient and medieval worlds. A fact is a unit of information that has been established through uniquely modern methods
  • Fact, etymologically, means “something done”—that is, an act or deed
  • It was only in the 16th century—an age that saw the dawning of a new empirical spirit, one that would issue not only in modern science, but also in modern historiography, journalism, and scholarship—that the word began to signify our current sense of “real state of things.”
  • It was at this exact time, and in this exact spirit, that the essay was born. What distinguished Montaigne’s new form—his “essays” or attempts to discover and publish the truth about himself—was not that it was personal (precursors like Seneca also wrote personally), but that it was scrupulously investigative. Montaigne was conducting research into his soul, and he was determined to get it right.
  • His famous motto, Que sais-je?—“What do I know?”—was an expression not of radical doubt but of the kind of skepticism that fueled the modern revolution in knowledge.
  • It is no coincidence that the first English essayist, Galileo’s contemporary Francis Bacon, was also the first great theorist of science.
  • That knowledge is problematic—difficult to establish, labile once created, often imprecise and always subject to the limitations of the human mind—is not the discovery of postmodernism. It is a foundational insight of the age of science, of fact and information, itself.
  • The point is not that facts do not exist, but that they are unstable (and are becoming more so as the pace of science quickens). Knowledge is always an attempt. Every fact was established by an argument—by observation and interpretation—and is susceptible to being overturned by a different one
  • A fact, you might say, is nothing more than a frozen argument, the place where a given line of investigation has come temporarily to rest.
  • Sometimes those arguments are scientific papers. Sometimes they are news reports, which are arguments with everything except the conclusions left out (the legwork, the notes, the triangulation of sources—the research and the reasoning).
  • When it comes to essays, though, we don’t refer to those conclusions as facts. We refer to them as wisdom, or ideas
  • the essay draws its strength not from separating reason and imagination but from putting them in conversation. A good essay moves fluidly between thought and feeling. It subjects the personal to the rigors of the intellect and the discipline of external reality. The truths it finds are more than just emotional.
Emily Horwitz

Mining Books To Map Emotions Through A Century : Shots - Health News : NPR - 1 views

  • Were people happier in the 1950s than they are today? Or were they more frustrated, repressed and sad? To find out, you'd have to compare the emotions of one generation to another. British anthropologists think they may have found the answer — embedded in literature.
  • This effort began simply with lists of "emotion" words: 146 different words that connote anger; 92 words for fear; 224 for joy; 115 for sadness; 30 for disgust; and 41 words for surprise. All were from standardized word lists used in linguistic research.
  • "We didn't really expect to find anything," he says. "We were just curious. We really expected the use of emotion words to be constant through time." Instead, in the study they published in the journal PLOS ONE, the anthropologists found very distinct peaks and valleys, Bentley says.
  • ...4 more annotations...
  • "The twenties were the highest peak of joy-related words that we see," he says. "They really were roaring." But then there came 1941, which, of course, marked the beginning of America's entry into World War II. It doesn't take a historian to see that peaks and valleys like these roughly mirror the major economic and social events of the century. "In 1941, sadness is at its peak," Bently says.
  • They weren't just novels or books about current events, Bentley says. Many were books without clear emotional content — technical manuals about plants and animals, for example, or automotive repair guides. "It's not like the change in emotion is because people are writing about the Depression, and people are writing about the war," he says. "There might be a little bit of that, but this is just, kind of, averaged over all books and it's just kind of creeping in."
  • "Generally speaking, the usage of these commonly known emotion words has been in decline over the 20th century," Bentley says. We used words that expressed our emotions less in the year 2000 than we did 100 years earlier — words about sadness, and joy, and anger, and disgust and surprise. In fact, there is only one exception that Bentley and his colleagues found: fear. "The fear-related words start to increase just before the 1980s," he says.
  • For psychologists, he says, there are only a handful of ways to try to understand what is actually going on with somebody emotionally. "One is what a person says," Pennebaker explains, "kind of the 'self report' of emotion. Another might be the physiological links, and the third is what slips out when they're talking to other people, when they're writing a book or something like that."
  • Researchers have found a connection between economic troubles and the emotions used in various genres of literature over time. What I found most interesting was that, even in non-fiction, technical literature, the researchers still found differences in words used that had a certain emotional connotation, depending on the emotions of the time period. It seems that these anthropologists are finding a link between the way our culture and emotions influence our language.
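
The counting method described in the notes above (fixed lexicons of emotion words, tallied year by year across a large corpus) is simple enough to sketch in a few lines. This is a hedged, minimal illustration: the word lists and the tiny in-memory "corpus" below are made up for the example, whereas the study used standardized lexicons (146 anger words, 92 fear, 224 joy, 115 sadness, 30 disgust, 41 surprise) and millions of digitized books.

```python
from collections import Counter
import re

# Toy emotion lexicons, assumed for illustration only.
LEXICONS = {
    "joy":  {"happy", "joy", "delight", "cheer"},
    "fear": {"fear", "afraid", "terror", "dread"},
}

def emotion_rates(texts_by_year):
    """For each year, return each emotion's share of all word tokens that year."""
    rates = {}
    for year, text in texts_by_year.items():
        tokens = re.findall(r"[a-z']+", text.lower())
        counts = Counter(tokens)
        total = sum(counts.values()) or 1   # avoid division by zero on empty text
        rates[year] = {
            emotion: sum(counts[w] for w in words) / total
            for emotion, words in LEXICONS.items()
        }
    return rates

# Hypothetical two-"year" corpus, just to show the shape of the output.
sample = {
    1925: "joy and cheer filled the happy crowd with delight",
    1941: "fear and dread spread as terror gripped the afraid city",
}
print(emotion_rates(sample))
# -> {1925: {'joy': 0.44..., 'fear': 0.0}, 1941: {'joy': 0.0, 'fear': 0.4}}
```

Normalizing each emotion count by the year's total word count is what makes decades comparable even though the number of books published changes over time.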
Adam Clark

Does Your Language Shape How You Think? - NYTimes.com - 0 views

  • "Seventy years ago, in 1940, a popular science magazine published a short article that set in motion one of the trendiest intellectual fads of the 20th century. At first glance, there seemed little about the article to augur its subsequent celebrity. Neither the title, "Science and Linguistics," nor the magazine, M.I.T.'s Technology Review, was most people's idea of glamour. And the author, a chemical engineer who worked for an insurance company and moonlighted as an anthropology lecturer at Yale University, was an unlikely candidate for international superstardom. And yet Benjamin Lee Whorf let loose an alluring idea about language's power over the mind, and his stirring prose seduced a whole generation into believing that our mother tongue restricts what we are able to think."
anonymous

The Artist's Intentions and the Intentional Fallacy in Fine Arts Conservation on JSTOR - 0 views

shared by anonymous on 04 Nov 20
  • A formal claim was made in the mid-20th century that the goal of art conservation is to present the artwork as the artist intended it to be seen.
  • Dispute over this claim among conservators and art historians involved differences of perspective on the relative roles of science and art history in the interpretation of artist's intention.
  • The author finds that the interpretation and application of artist's intention is an interdisciplinary task and that its evaluation in conservation contexts is limited to consideration of distinctive stylistic characteristics that demonstrate the correlated individuality of artists and their work
  • ...4 more annotations...
  • In the mid-20th century, it was claimed that the goal of restoration was to restore works of art to the appearance that the artist had wanted to give them.
  • In this particular debate, the difficulty in assessing and applying the artist's intention arose from the very ambiguity of the term "intention."
  • The author examines 11 different meanings of this word and poses the problem of the artist's intention in contexts related to the field of restoration.
  • The author believes that the interpretation and study of the artist's intentions is an interdisciplinary task, and that its evaluation in the contexts of restoration must be limited to consideration of the particular stylistic characteristics which demonstrate the correlative individuality of the artists and their works
  • Hopefully, this is long enough, but I had already read it. This is basically a description of a book and how people talk about artists' intentions.
tongoscar

Quartz - Global business news and insights - 0 views

  • Climate change is already here. It’s not something that can simply be ignored by cable news or dismissed by sitting US senators in a Twitter joke.
  • Instead, we are seeing its creeping effects now—with hurricanes like Maria and Harvey that caused hundreds of deaths and billions of dollars in economic damage; with the Mississippi River and its tributaries overflowing their banks this spring.
  • Climate change is, at this very moment, taking a real toll on wildlife, ecosystems, economies, and human beings, particularly in the global south, which experts expect will be hit first and hardest.
  • ...7 more annotations...
  • “The amount of change that we’re going to see—whether it’s serious, whether it’s dangerous, whether it’s devastating, whether it’s civilization-threatening—the amount of change we’re going to see is up to us,” she continued. “It depends on our choices today and in the next few years.”
  • Houston’s starting to get hot. It’s now about one degree Fahrenheit warmer in Houston than it was in the second half of the 20th century. Houstonians can expect especially balmy falls this decade, as autumns are warming faster than other seasons in Texas.
  • This decade, St. Louis is expected to be more than two degrees Fahrenheit warmer than it was, on average, during the latter half of the 20th century. While locals have endured more sweltering summer days, they have felt the change the most during the cold months.
  • Warmer air holds more water, which can lead to more severe rainfall. In recent years, rainstorms have pummeled the Midwest and led to widespread flooding across the region. In 2019 in St. Louis, rivers reached near-historic levels, and floodwaters inundated the area around the city’s iconic Gateway Arch.
  • For San Franciscans, the beginning of the decade will feel only a little different from past years. In 2020, it’s expected to be less than one degree Fahrenheit warmer in San Francisco than it was, on average, between 1950 and 2000.
  • But there are new worries for the city. Rising temperatures have fueled ongoing drought in recent years, which has, in turn, led to more wildfires. Fires now burn more regularly across the Sierra Nevada as well as coastal mountain ranges.
  • By 2030, temperatures are expected to have warmed almost two degrees Fahrenheit in Houston. Seas are expected to have risen a little more than a foot, enough to occasionally flood some low-lying areas outside the city.
Javier E

Some on the Left Turn Against the Label 'Progressive' - The New York Times - 0 views

  • Christopher Lasch, the historian and social critic, posed a banger of a question in his 1991 book, “The True and Only Heaven: Progress and Its Critics.”
  • “How does it happen,” Lasch asked, “that serious people continue to believe in progress, in the face of massive evidence that might have been expected to refute the idea of progress once and for all?”
  • A review in The New York Times Book Review by William Julius Wilson, a professor at Harvard, was titled: “Where Has Progress Got Us?”
  • ...17 more annotations...
  • Essentially, Lasch was attacking the notion, fashionable as Americans basked in their seeming victory over the Soviet Union in the Cold War, that history had a direction — and that one would be wise to stay on the “right side” of it.
  • Francis Fukuyama expressed a version of this triumphalist idea in his famous 1992 book, “The End of History and the Last Man,” in which he celebrated the notion that History with a capital “H,” in the sense of a battle between competing ideas, was ending with communism left to smolder on Ronald Reagan’s famous ash heap.
  • One of Martin Luther King Jr.’s most frequently quoted lines speaks to a similar thought, albeit in a different context: “T​he arc of the moral universe is long, but it bends toward justice.” Though he had read Lasch, Obama quoted that line often, just as he liked to say that so-and-so would be “on the wrong side of history” if they didn’t live up to his ideals — whether the issue was same-sex marriage, health policy or the Russian occupation of Crimea.
  • The memo goes on to list two sets of words: “Optimistic Positive Governing Words” and “Contrasting Words,” which carried negative connotations. One of the latter group was the word “liberal,” sandwiched between “intolerant” and “lie.”
  • So what’s the difference between a progressive and a liberal? To vastly oversimplify matters, liberal usually refers to someone on the center-left on a two-dimensional political spectrum, while progressive refers to someone further left.
  • But “liberal” has taken a beating in recent decades — from both left and right.
  • In the late 1980s and 1990s, Republicans successfully demonized the word “liberal,” to the point where many Democrats shied away from it in favor of labels like “conservative Democrat” or, more recently, “progressive.”
  • “Is the story of the 20th century about the defeat of the Soviet Union, or was it about two world wars and a Holocaust?” asked Matthew Sitman, the co-host of the “Know Your Enemy” podcast, which recently hosted a discussion on Lasch and the fascination many conservatives have with his ideas. “It really depends on how you look at it.”
  • None of this was an accident. In 1990, Representative Newt Gingrich of Georgia circulated a now-famous memo called “Language: A Key Mechanism of Control.”
  • The authors urged their readers: “The words and phrases are powerful. Read them. Memorize as many as possible.”
  • Republicans subsequently had a great deal of success in associating the term “liberal” with other words and phrases voters found uncongenial: wasteful spending, high rates of taxation and libertinism that repelled socially conservative voters.
  • Many on the left began identifying themselves as “progressive” — which had the added benefit of harking back to movements of the late 19th and early 20th centuries that fought against corruption, opposed corporate monopolies, pushed for good-government reforms and food safety and labor laws and established women’s right to vote.
  • Allies of Bill Clinton founded the Progressive Policy Institute, a think tank associated with so-called Blue Dog Democrats from the South.
  • Now, scrambling the terminology, groups like the Progressive Change Campaign Committee agitate on behalf of proudly left-wing candidates
  • In 2014, Charles Murray, the polarizing conservative scholar, urged readers of The Wall Street Journal’s staunchly right-wing editorial section to “start using ‘liberal’ to designate the good guys on the left, reserving ‘progressive’ for those who are enthusiastic about an unrestrained regulatory state.”
  • As Sanders and acolytes like Representative Alexandria Ocasio-Cortez of New York have gained prominence over the last few election cycles, many on the left-wing end of the spectrum have begun proudly applying other labels to themselves, such as “democratic socialist.”
  • To little avail so far, Kazin, the Georgetown historian, has been urging them to call themselves “social democrats” instead — as many mainstream parties do in Europe. “It’s not a good way to win elections in this country, to call yourself a socialist,” he said.
Javier E

Michael Chwe, Author, Sees Jane Austen as Game Theorist - NYTimes.com - 0 views

  • It’s not every day that someone stumbles upon a major new strategic thinker during family movie night. But that’s what happened to Michael Chwe, an associate professor of political science at the University of California, Los Angeles, when he sat down with his children some eight years ago to watch “Clueless,” the 1995 romantic comedy based on Jane Austen’s “Emma.”
  • In 230 diagram-heavy pages, Mr. Chwe argues that Austen isn’t merely fodder for game-theoretical analysis, but an unacknowledged founder of the discipline itself: a kind of Empire-waisted version of the mathematician and cold war thinker John von Neumann, ruthlessly breaking down the stratagems of 18th-century social warfare.
  • Or, as Mr. Chwe puts it in the book, “Anyone interested in human behavior should read Austen because her research program has results.”
  • ...7 more annotations...
  • Modern game theory is generally dated to 1944, with the publication of von Neumann’s “Theory of Games and Economic Behavior,” which imagined human interactions as a series of moves and countermoves aimed at maximizing “payoff.” Since then the discipline has thrived, often dominating political science, economics and biology. (A toy payoff-table sketch of this idea follows these notes.)
  • But a century and a half earlier, Mr. Chwe argues, Austen was very deliberately trying to lay philosophical groundwork for a new theory of strategic action, sometimes charting territory that today’s theoreticians have themselves failed to reach.
  • Game theory, he argues, isn’t just part of “hegemonic cold war discourse,” but what the political scientist James Scott called a subversive “weapon of the weak.”
  • many situations, Mr. Chwe points out, involve parties with unequal levels of strategic thinking. Sometimes a party may simply lack ability. But sometimes a powerful party faced with a weaker one may not realize it even needs to think strategically.
  • Mr. Chwe, who identifies some 50 “strategic manipulations” in Austen
  • First among her as yet unequaled concepts is “cluelessness
  • Even some humanists who admire Mr. Chwe’s work suggest that when it comes to appreciating Austen, social scientists may be the clueless ones. Austen scholars “will not be surprised at all to see the depths of her grasp of strategic thinking and the way she anticipated a 20th-century field of inquiry,”
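The “moves and countermoves to maximize payoff” framing quoted above can be made concrete with a toy example. The sketch below is only a minimal illustration in Python: the two-player game, its payoff numbers, and the names `payoffs`, `best_response`, “propose,” “wait,” “accept,” and “refuse” are invented assumptions and do not come from Chwe’s book, von Neumann’s text, or Austen.

```python
# A minimal sketch of the "moves and countermoves to maximize payoff" idea.
# The game, payoff values, and move names below are invented for illustration.

# Payoff table for a 2x2 game: payoffs[(my_move, their_move)] = (mine, theirs)
payoffs = {
    ("propose", "accept"): (3, 2),
    ("propose", "refuse"): (0, 1),
    ("wait",    "accept"): (1, 1),
    ("wait",    "refuse"): (2, 0),
}

def best_response(their_move):
    """Return the move that maximizes my payoff, given the other player's move."""
    my_moves = {mine for (mine, _) in payoffs}
    return max(my_moves, key=lambda m: payoffs[(m, their_move)][0])

if __name__ == "__main__":
    for their_move in ("accept", "refuse"):
        print(their_move, "->", best_response(their_move))
```

Read against this sketch, “cluelessness” in Chwe’s sense is the failure to run the other side’s computation at all: a strategically aware party anticipates the opponent’s best response before moving, while a clueless one never considers it.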
Javier E

About Face: Emotions and Facial Expressions May Not Be Directly Related | Boston Magazine - 0 views

  • Ekman had traveled the globe with photographs that showed faces experiencing six basic emotions—happiness, sadness, fear, disgust, anger, and surprise. Everywhere he went, from Japan to Brazil to the remotest village of Papua New Guinea, he asked subjects to look at those faces and then to identify the emotions they saw on them. To do so, they had to pick from a set list of options presented to them by Ekman. The results were impressive. Everybody, it turned out, even preliterate Fore tribesmen in New Guinea who’d never seen a foreigner before in their lives, matched the same emotions to the same faces. Darwin, it seemed, had been right.
  • Ekman’s findings energized the previously marginal field of emotion science. Suddenly, researchers had an objective way to measure and compare human emotions—by reading the universal language of feeling written on the face. In the years that followed, Ekman would develop this idea, arguing that each emotion is like a reflex, with its own circuit in the brain and its own unique pattern of effects on the face and the body. He and his peers came to refer to it as the Basic Emotion model—and it had significant practical applications
  • What if he’s wrong?
  • ...15 more annotations...
  • Barrett is a professor of psychology at Northeastern
  • her research has led her to conclude that each of us constructs them in our own individual ways, from a diversity of sources: our internal sensations, our reactions to the environments we live in, our ever-evolving bodies of experience and learning, our cultures.
  • if Barrett is correct, we’ll need to rethink how we interpret mental illness, how we understand the mind and self, and even what psychology as a whole should become in the 21st century.
  • The problem was the options that Ekman had given his subjects when asking them to identify the emotions shown on the faces they were presented with. Those options, Barrett discovered, had limited the ways in which people allowed themselves to think. Barrett explained the problem to me this way: “I can break that experiment really easily, just by removing the words. I can just show you a face and ask how this person feels. Or I can show you two faces, two scowling faces, and I can say, ‘Do these people feel the same thing?’ And agreement drops into the toilet.”
  • Just as that first picture of the bee actually wasn’t a picture of a bee for me until I taught myself that it was, my emotions aren’t actually emotions until I’ve taught myself to think of them that way. Without that, I have only a meaningless mishmash of information about what I’m feeling.
  • emotion isn’t a simple reflex or a bodily state that’s hard-wired into our DNA, and it’s certainly not universally expressed. It’s a contingent act of perception that makes sense of the information coming in from the world around you, how your body is feeling in the moment, and everything you’ve ever been taught to understand as emotion. Culture to culture, person to person even, it’s never quite the same. What’s felt as sadness in one person might as easily be felt as weariness in another, or frustration in someone else.
  • The brain, it turns out, doesn’t consciously process every single piece of information that comes its way. Think of how impossibly distracting the regular act of blinking would be if it did. Instead, it pays attention to what you need to pay attention to, then raids your memory stores to fill in the blanks.
  • In many quarters, Barrett was angrily attacked for her ideas, and she’s been the subject of criticism ever since. “I think Lisa does a disservice to the actual empirical progress that we’re making,” says Dacher Keltner, a Berkeley psychologist
  • Keltner told me that he himself has coded thousands of facial expressions using Ekman’s system, and the results are strikingly consistent: Certain face-emotion combinations recur regularly, and others never occur. “That tells me, ‘Wow, this approach to distinct emotions has real power,’” he says.
  • Ekman reached the peak of his fame in the years following 2001. That’s the year the American Psychological Association named him one of the most influential psychologists of the 20th century. The next year, Malcolm Gladwell wrote an article about him in the New Yorker, and in 2003 he began working pro bono for the TSA. A year later, riding the updraft of success, he left his university post and started the Paul Ekman Group,
  • a small research team to visit the isolated Himba tribe in Namibia, in southern Africa. The plan was this: The team, led by Maria Gendron, would do a study similar to Ekman’s original cross-cultural one, but without providing any of the special words or context-heavy stories that Ekman had used to guide his subjects’ answers. Barrett’s researchers would simply hand a jumbled pile of different expressions (happy, sad, fearful, angry, disgusted, and neutral) to their subjects, and would ask them to sort them into six piles. If emotional expressions are indeed universal, they reasoned, then the Himba would put all low-browed, tight-lipped expressions into an anger pile, all wrinkled-nose faces into a disgust pile, and so on.
  • It didn’t happen that way. The Himba sorted some of the faces in ways that aligned with Ekman’s theory: smiling faces went into one pile, wide-eyed fearful faces went into another, and affectless faces went mostly into a third. But in the other three piles, the Himba mixed up angry scowls, disgusted grimaces, and sad frowns. Without any suggestive context, of the kind that Ekman had originally provided, they simply didn’t recognize the differences that leap out so naturally to Westerners.
  • “What we’re trying to do,” she told me, “is to just get people to pay attention to the fact that there’s a mountain of evidence that does not support the idea that facial expressions are universally recognized as emotional expressions.” That’s the crucial point, of course, because if we acknowledge that, then the entire edifice that Paul Ekman and others have been constructing for the past half-century comes tumbling down. And all sorts of things that we take for granted today—how we understand ourselves and our relationships with others, how we practice psychology
  • Barrett’s theory is still only in its infancy. But other researchers are beginning to take up her ideas, sometimes in part, sometimes in full, and where the science will take us as it expands is impossible to predict. It’s even possible that Barrett will turn out to be wrong, as she herself acknowledges. “Every scientist has to face that,” she says. Still, if she is right, then perhaps the most important change we’ll need to make is in our own heads. If our emotions are not universal physiological responses but concepts we’ve constructed from various biological signals and stashed memories, then perhaps we can exercise more control over our emotional lives than we’ve assumed.
  • “Every experience you have now is seeding your experience for the future,” Barrett told me. “Knowing that, would you choose to do what you’re doing now?” She paused a beat and looked me in the eye. “Well? Would you? You are the architect of your own experience.”
Duncan H

Rick Santorum Campaigning Against the Modern World - NYTimes.com - 0 views

  • As a journalist who covered Rick Santorum in Pennsylvania for years, I can understand the Tea Party’s infatuation with him. It’s his anger. It is in perfect synch with the constituency he is wooing.
  • Even at the height of his political success, when he had a lot to be happy about, Santorum was an angry man. I found it odd. I was used to covering politicians who had good dispositions — or were good at pretending they had good dispositions.
  • You could easily get him revved by bringing up the wrong topic or taking an opposing point of view. His nostrils would flare, his eyes would glare and he would launch into a disquisition on how, deep down, you were a shallow guy who could not grasp the truth and rightness of his positions.
  • ...11 more annotations...
  • “It’s just a curious bias of the media around here. It’s wonderful. One person says something negative and the media rushes and covers that. The wonderful balanced media that I love in this community.”
  • Santorum had reason to be peeved. He was running against the Democrat Bob Casey. He was trailing by double digits and knew he was going to lose. He was not a happy camper, but then he rarely is.
  • As he has shown in the Republican debates, Santorum can be equable. The anger usually flares on matters closest to his heart: faith, family and morals. And if, by chance, you get him started on the role of religion in American life, get ready for a Vesuvius moment.
  • Outside of these areas, he was more pragmatic. Then and now, Santorum held predictably conservative views, but he was astute enough to bend on some issues and be — as he put it in the Arizona debate — “a team player.”
  • In the Senate, he represented a state with a relentlessly moderate-to-centrist electorate, so when campaigning he emphasized the good deeds he did in Washington. Editorial board meetings with Santorum usually began with him listing federal money he had brought in for local projects. People who don’t know him — and just see the angry Rick — don’t realize what a clever politician Santorum is. He didn’t rise to become a Washington insider through the power of prayer. He may say the Rosary, but he knows his Machiavelli.
  • That said, Santorum’s anger is not an act. It is genuine. It has its roots in the fact that he had the misfortune to be born in the second half of the 20th century. In his view, it was an era when moral relativism and anti-religious feeling held sway, where traditional values were ignored or mocked, where heretics ruled civic and political life. If anything, it’s gotten worse in the 21st, with the election of Barack Obama. Leave it to Santorum to attack Obama on his theology, of all things. He sees the president as an exemplar of mushy, feel-good Christianity that emphasizes tolerance over rectitude, and the love of Jesus over the wrath of God.
  • Like many American Catholics, I struggle with the church’s teachings as they apply to the modern world. Santorum does not.
  • I once wrote that Santorum has one of the finest minds of the 13th century. It was meant to elicit a laugh, but there’s truth behind the remark. No Vatican II for Santorum. His belief system is the fixed and firm Catholicism of the Council of Trent in the mid-16th century. And Santorum is a warrior for those beliefs.
  • During the campaign, he has regularly criticized the media for harping on his public statements on homosexuality, contraception, abortion, the decline in American morals. Still, he can’t resist talking about them. These are the issues that get his juices flowing, not the deficit or federal energy policy.
  • Santorum went to Houston not to praise Kennedy but to bash him. To Santorum, the Kennedy speech did permanent damage because it led to secularization of American politics. He said it laid the foundation for attacks on religion by the secular left that has led to denial of free speech rights to religious people. “John F. Kennedy chose not to just dispel fear,” Santorum said, “he chose to expel faith.”
  • Ultimately Kennedy’s attempt to reassure Protestants that the Catholic Church would not control the government and suborn its independence advanced a philosophy of strict separation that would create a purely secular public square cleansed of all religious wisdom and the voice of religious people of all faiths. He laid the foundation for attacks on religious freedom and freedom of speech by the secular left and its political arms like the A.C.L.U. and the People for the American Way. This has and will continue to create dissension and division in this country as people of faith increasingly feel like second-class citizens. One consequence of Kennedy’s speech, Santorum said, is the debasement of our First Amendment right of religious freedom. Of all the great and necessary freedoms listed in the First Amendment, freedom to exercise religion (not just to believe, but to live out that belief) is the most important; before freedom of speech, before freedom of the press, before freedom of assembly, before freedom to petition the government for redress of grievances, before all others. This freedom of religion, freedom of conscience, is the trunk from which all other branches of freedom on our great tree of liberty get their life. And so it went for 5,000 words. It is a revelatory critique of the modern world, and Santorum quoted G.K. Chesterton, Edmund Burke, St. Thomas Aquinas and Martin Luther King to give heft to his assertions. That said, it was an angry speech, conjuring up images of people of faith cowering before leftist thought police. Who could rescue us from this predicament? Who could banish the secularists and restore religious morality to its throne?
  •  
    An interesting critique of Santorum and his religious beliefs.
Javier E

Why Not Just Weigh the Fish? - NYTimes.com - 0 views

  • The essence of philosophy is abstract reasoning – not because the philosopher is too lazy to attempt a more hands-on approach, but because the subjects at issue do not readily submit to it. If we could simply weigh the fish, we gladly would. In recent centuries, philosophers in fact have discovered how to weigh that allegorical fish, in various fields, and on each occasion a new discipline has been born: physics in the 17th century; chemistry in the 18th; biology in the 19th and psychology in the 20th.
  • The remaining problems of philosophy today concern issues like justice, morality, free will, knowledge and the origins of the universe. In dismissing philosophy as an antiquated relic of our prescientific past, the scientist is making a very large and dubious assumption: that the abstract methods of philosophy, despite the discipline’s string of successes over recent centuries, have nothing more to contribute to our developing understanding of the world
  • Philosophy today continues to rest, albeit somewhat precariously, at the center of human inquiry, continuous at one end with the sciences to which it gave rise, and at the other end with history and literature. Professional philosophers tend to crowd around the scientific end of that spectrum, busily looking for new ways to weigh their fish. But much of what gives philosophy its continuing fascination is its connection with the humanities. To weigh the fish is doubtless desirable, but there is just as much to be learned in understanding where that fish came from, and in telling stories about where it might go.
julia rhodes

"Carrot and Stick" Motivation Revisited by New Research | Psychology Today - 1 views

  • We continue to revisit the issue of motivation and, specifically, the “carrot and stick” aspect. New research seems to indicate that brain chemicals may control behavior, and that both punishment and reward may be necessary for people to learn and adapt in the world.
  • The real question is, which route would you choose—positive or negative? Most people are taught to refrain from engaging in a certain behavior by being given punishments that create negative feelings.
  • Different players use different strategies. It all depends on their genetic material. People's tendency to change their choice immediately after receiving a punishment depends on which serotonin gene variant they inherited from their parents. The dopamine gene variant, on the other hand, exerts influence on whether people can stop themselves making the choice that was previously rewarded, but no longer is
  • ...7 more annotations...
  • What do we mean by motivation? It's been defined as a predisposition to behave in a purposeful manner to achieve specific, unmet needs and the will to achieve, and the inner force that drives individuals to accomplish personal and organizational goals. And why do we need motivated employees? The answer is survival.
  • It turns out that people are motivated by interesting work, challenge, and increasing responsibility—intrinsic factors. People have a deep-seated need for growth and achievement.
  • Even understanding what constitutes human motivation has been a centuries-old puzzle, addressed as far back as Aristotle.
  • Pink concludes that extrinsic motivators work only in a surprisingly narrow band of circumstances; rewards often destroy creativity and employee performance; and the secret to high performance isn’t reward and punishment but that unseen intrinsic drive—the drive to do something because it is meaningful.
  • true motivation boils down to three elements: autonomy, the desire to direct our own lives; mastery, the desire to continually improve at something that matters to us; and purpose, the desire to do things in service of something larger than ourselves.
  • The carrot-and-stick approach worked well for typical tasks of the early 20th century — routine, unchallenging and highly controlled. For these tasks, where the process is straightforward and lateral thinking is not required, rewards can provide a small motivational boost without any harmful side effects
  • Jobs in the 21st century have changed dramatically. They have become more complex, more interesting and more self-directed, and this is where the carrot-and-stick approach has become unstuck.
Javier E

Opinion | Is There Such a Thing as an Authoritarian Voter? - The New York Times - 0 views

  • Jonathan Weiler, a political scientist at the University of North Carolina at Chapel Hill, has spent much of his career studying the appeal of authoritarian figures: politicians who preach xenophobia, beat up on the press and place themselves above the law while extolling “law and order” for everyone else.
  • He is one of many scholars who believe that deep-seated psychological traits help explain voters’ attraction to such leaders. “These days,” he told me, “audiences are more receptive to the idea” than they used to be.
  • “In 2018, the sense of fear and panic — the disorientation about how people who are not like us could see the world the way they do — it’s so elemental,” Mr. Weiler said. “People understand how deeply divided we are, and they are looking for explanations that match the depth of that division.”
  • ...24 more annotations...
  • Moreover, using the child-rearing questionnaire, African-Americans score as far more authoritarian than whites
  • what, exactly, is an “authoritarian” personality? How do you measure it?
  • for more than half a century — social scientists have tried to figure out why some seemingly mild-mannered people gravitate toward a strongman
  • the philosopher (and German refugee) Theodor Adorno collaborated with social scientists at the University of California at Berkeley to investigate why ordinary people supported fascist, anti-Semitic ideology during the war. They used a questionnaire called the F-scale (F is for fascism) and follow-up interviews to analyze the “total personality” of the “potentially antidemocratic individual.”
  • The resulting 1,000-page tome, “The Authoritarian Personality,” published in 1950, found that subjects who scored high on the F-scale disdained the weak and marginalized. They fixated on sexual deviance, embraced conspiracy theories and aligned themselves with domineering leaders “to serve powerful interests and so participate in their power,”
  • “Globalized free trade has shafted American workers and left us looking for a strong male leader, a ‘real man,’” he wrote. “Trump offers exactly what my maladapted unconscious most craves.”
  • one of the F-scale’s prompts: “Obedience and respect for authority are the most important virtues children should learn.” Today’s researchers often diagnose latent authoritarians through a set of questions about preferred traits in children: Would you rather your child be independent or have respect for elders? Have curiosity or good manners? Be self-reliant or obedient? Be well behaved or considerate?
  • a glance at the Christian group Focus on the Family’s “biblical principles for spanking” reminds us that your approach to child rearing is not pre-political; it is shorthand for your stance in the culture wars.
  • “All the social sciences are brought to bear to try to explain all the evil that persists in the world, even though the liberal Enlightenment worldview says that we should be able to perfect things,” said Mr. Strouse, the Trump voter
  • what should have been obvious:
  • “Trump’s electoral strength — and his staying power — have been buoyed, above all, by Americans with authoritarian inclinations,” wrote Matthew MacWilliams, a political consultant who surveyed voters during the 2016 election
  • The child-trait test, then, is a tool to identify white people who are anxious about their decline in status and power.
  • new book, “Prius or Pickup?,” by ditching the charged term “authoritarian.” Instead, they divide people into three temperamental camps: fixed (people who are wary of change and “set in their ways”), fluid (those who are more open to new experiences and people) and mixed (those who are ambivalent).
  • “The term ‘authoritarian’ connotes a fringe perspective, and the perspective we’re describing is far from fringe,” Mr. Weiler said. “It’s central to American public opinion, especially on cultural issues like immigration and race.”
  • Other scholars apply a typology based on the “Big Five” personality traits identified by psychologists in the mid-20th century: extroversion, agreeableness, conscientiousness, neuroticism and openness to experience. (It seems that liberals are open but possibly neurotic, while conservatives are more conscientious.)
  • Historical context matters — it shapes who we are and how we debate politics. “Reason moves slowly,” William English, a political economist at Georgetown, told me. “It’s constituted sociologically, by deep community attachments, things that change over generations.”
  • “it is a deep-seated aspiration of many social scientists — sometimes conscious and sometimes unconscious — to get past wishy-washy culture and belief. Discourses that can’t be scientifically reduced are problematic” for researchers who want to provide “a universal account of behavior.”
  • in our current environment, where polarization is so unyielding, the apparent clarity of psychological and biological explanations becomes seductive
  • Attitudes toward parenting vary across cultures, and for centuries African-Americans have seen the consequences of a social and political hierarchy arrayed against them, so they can hardly be expected to favor it — no matter what they think about child rearing
  • — we know that’s not going to happen. People have wicked tendencies.”
  • as the social scientific portrait of humanity grows more psychological and irrational, it comes closer and closer to approximating the old Adam of traditional Christianity: a fallen, depraved creature, unable to see himself clearly except with the aid of a higher power
  • The conclusions of political scientists should inspire humility rather than hubris. In the end, they have confirmed what so many observers of our species have long suspected: None of us are particularly free or rational creatures.
  • Allen Strouse is not the archetypal Trump voter whom journalists discover in Rust Belt diners. He is a queer Catholic poet and scholar of medieval literature who teaches at the New School in New York City. He voted for Mr. Trump “as a protest against the Democrats’ failures on economic issues,” but the psychological dimensions of his vote intrigue him. “Having studied Freudian analysis, and being in therapy for 10 years, I couldn’t not reflexively ask myself, ‘How does this decision have to do with my psychology?’” he told me.
  • their preoccupation with childhood and “primitive and irrational wishes and fears” has influenced the study of authoritarianism ever since.
Javier E

Opinion | Have Some Sympathy - The New York Times - 0 views

  • Schools and parenting guides instruct children in how to cultivate empathy, as do workplace culture and wellness programs. You could fill entire bookshelves with guides to finding, embracing and sharing empathy. Few books or lesson plans extol sympathy’s virtues.
  • “Sympathy focuses on offering support from a distance,” a therapist explains on LinkedIn, whereas empathy “goes beyond sympathy by actively immersing oneself in another person’s emotions and attempting to comprehend their point of view.”
  • In use since the 16th century, when the Greek “syn-” (“with”) combined with pathos (experience, misfortune, emotion, condition) to mean “having common feelings,” sympathy preceded empathy by a good four centuries
  • ...8 more annotations...
  • Empathy (the “em” means “into”) barged in from the German in the 20th century and gained popularity through its usage in fields like philosophy, aesthetics and psychology. According to my benighted 1989 edition of Webster’s Unabridged, empathy was the more self-centered emotion, “the intellectual identification with or vicarious experiencing of the feelings, thoughts or attitudes of another.”
  • in more updated lexicons, it’s as if the two words had reversed. Sympathy now implies a hierarchy whereas empathy is the more egalitarian sentiment.
  • Sympathy, the session’s leader explained to school staff members, was seeing someone in a hole and saying, “Too bad you’re in a hole,” whereas empathy meant getting in the hole, too.
  • “Empathy is a choice and it’s a vulnerable choice because in order to connect with you, I have to connect with something in myself that knows that feeling,”
  • Still, it’s hard to square the new emphasis on empathy — you must feel what others feel — with another element of the current discourse. According to what’s known as “standpoint theory,” your view necessarily depends on your own experience: You can’t possibly know what others feel.
  • In short, no matter how much an empath you may be, unless you have actually been in someone’s place, with all its experiences and limitations, you cannot understand where that person is coming from. The object of your empathy may find it presumptuous of you to think that you “get it.”
  • Bloom asks us to imagine what empathy demands should a friend’s child drown. “A highly empathetic response would be to feel what your friend feels, to experience, as much as you can, the terrible sorrow and pain,” he writes. “In contrast, compassion involves concern and love for your friend, and the desire and motivation to help, but it need not involve mirroring your friend’s anguish.”
  • Bloom argues for a more rational, modulated, compassionate response. Something that sounds a little more like our old friend sympathy.
Javier E

Among the Disrupted - The New York Times - 0 views

  • even as technologism, which is not the same as technology, asserts itself over more and more precincts of human life, so too does scientism, which is not the same as science.
  • The notion that the nonmaterial dimensions of life must be explained in terms of the material dimensions, and that nonscientific understandings must be translated into scientific understandings if they are to qualify as knowledge, is increasingly popular inside and outside the university,
  • So, too, does the view that the strongest defense of the humanities lies not in the appeal to their utility — that literature majors may find good jobs, that theaters may economically revitalize neighborhoods
  • ...27 more annotations...
  • The contrary insistence that the glories of art and thought are not evolutionary adaptations, or that the mind is not the brain, or that love is not just biology’s bait for sex, now amounts to a kind of heresy.
  • Greif’s book is a prehistory of our predicament, of our own “crisis of man.” (The “man” is archaic, the “crisis” is not.) It recognizes that the intellectual history of modernity may be written in part as the epic tale of a series of rebellions against humanism
  • We are not becoming transhumanists, obviously. We are too singular for the Singularity. But are we becoming posthumanists?
  • In American culture right now, as I say, the worldview that is ascendant may be described as posthumanism.
  • The posthumanism of the 1970s and 1980s was more insular, an academic affair of “theory,” an insurgency of professors; our posthumanism is a way of life, a social fate.
  • In “The Age of the Crisis of Man: Thought and Fiction in America, 1933-1973,” the gifted essayist Mark Greif, who reveals himself to be also a skillful historian of ideas, charts the history of the 20th-century reckonings with the definition of “man.
  • Here is his conclusion: “Anytime your inquiries lead you to say, ‘At this moment we must ask and decide who we fundamentally are, our solution and salvation must lie in a new picture of ourselves and humanity, this is our profound responsibility and a new opportunity’ — just stop.” Greif seems not to realize that his own book is a lasting monument to precisely such inquiry, and to its grandeur
  • “Answer, rather, the practical matters,” he counsels, in accordance with the current pragmatist orthodoxy. “Find the immediate actions necessary to achieve an aim.” But before an aim is achieved, should it not be justified? And the activity of justification may require a “picture of ourselves.” Don’t just stop. Think harder. Get it right.
  • — but rather in the appeal to their defiantly nonutilitarian character, so that individuals can know more than how things work, and develop their powers of discernment and judgment, their competence in matters of truth and goodness and beauty, to equip themselves adequately for the choices and the crucibles of private and public life.
  • Who has not felt superior to humanism? It is the cheapest target of all: Humanism is sentimental, flabby, bourgeois, hypocritical, complacent, middlebrow, liberal, sanctimonious, constricting and often an alibi for power
  • what is humanism? For a start, humanism is not the antithesis of religion, as Pope Francis is exquisitely demonstrating
  • The worldview takes many forms: a philosophical claim about the centrality of humankind to the universe, and about the irreducibility of the human difference to any aspect of our animality
  • Here is a humanist proposition for the age of Google: The processing of information is not the highest aim to which the human spirit can aspire, and neither is competitiveness in a global economy. The character of our society cannot be determined by engineers.
  • And posthumanism? It elects to understand the world in terms of impersonal forces and structures, and to deny the importance, and even the legitimacy, of human agency.
  • There have been humane posthumanists and there have been inhumane humanists. But the inhumanity of humanists may be refuted on the basis of their own worldview
  • the condemnation of cruelty toward “man the machine,” to borrow the old but enduring notion of an 18th-century French materialist, requires the importation of another framework of judgment. The same is true about universalism, which every critic of humanism has arraigned for its failure to live up to the promise of a perfect inclusiveness
  • there has never been a universalism that did not exclude. Yet the same is plainly the case about every particularism, which is nothing but a doctrine of exclusion; and the correction of particularism, the extension of its concept and its care, cannot be accomplished in its own name. It requires an idea from outside, an idea external to itself, a universalistic idea, a humanistic idea.
  • Asking universalism to keep faith with its own principles is a perennial activity of moral life. Asking particularism to keep faith with its own principles is asking for trouble.
  • there is no more urgent task for American intellectuals and writers than to think critically about the salience, even the tyranny, of technology in individual and collective life
  • a methodological claim about the most illuminating way to explain history and human affairs, and about the essential inability of the natural sciences to offer a satisfactory explanation; a moral claim about the priority, and the universal nature, of certain values, not least tolerance and compassion
  • “Our very mastery seems to escape our mastery,” Michel Serres has anxiously remarked. “How can we dominate our domination; how can we master our own mastery?”
  • universal accessibility is not the end of the story, it is the beginning. The humanistic methods that were practiced before digitalization will be even more urgent after digitalization, because we will need help in navigating the unprecedented welter
  • Searches for keywords will not provide contexts for keywords. Patterns that are revealed by searches will not identify their own causes and reasons
  • The new order will not relieve us of the old burdens, and the old pleasures, of erudition and interpretation.
  • Is all this — is humanism — sentimental? But sentimentality is not always a counterfeit emotion. Sometimes sentiment is warranted by reality.
  • The persistence of humanism through the centuries, in the face of formidable intellectual and social obstacles, has been owed to the truth of its representations of our complexly beating hearts, and to the guidance that it has offered, in its variegated and conflicting versions, for a soulful and sensitive existence
  • a complacent humanist is a humanist who has not read his books closely, since they teach disquiet and difficulty. In a society rife with theories and practices that flatten and shrink and chill the human subject, the humanist is the dissenter.
carolinewren

Playing Dumb on Climate Change - NYTimes.com - 1 views

  • SCIENTISTS have often been accused of exaggerating the threat of climate change,
  • The year just concluded is about to be declared the hottest one on record,
  • Science is conservative, and new claims of knowledge are greeted with high degrees of skepticism.
  • ...14 more annotations...
  • if there’s more than even a scant 5 percent possibility that an event occurred by chance, scientists will reject the causal claim. (A toy numerical illustration of this 5 percent convention follows these excerpts.)
  • correlation is not necessarily causation, because we need to rule out the possibility that we are just observing a coincidence.
  • In the 18th and 19th centuries, this conservatism generally took the form of a demand for a large amount of evidence; in the 20th century, it took on the form of a demand for statistical significance
  • The 95 percent confidence level is generally credited to the British statistician R. A. Fisher, who was interested in the problem of how to be sure an observed effect of an experiment was not just the result of chance.
  • the 95 percent level has no actual basis in nature. It is a convention, a value judgment.
  • scientists place the burden of proof on the person making an affirmative claim.
  • It places the burden of proof on the victim rather than, for example, on the manufacturer of a harmful product.
  • it might be reasonable to accept a lower statistical threshold when examining effects in people, because you already have reason to believe that the observed effect is not just chance.
  • WHY don’t scientists pick the standard that is appropriate to the case at hand, instead of adhering to an absolutist one?
  • the history of science in relation to religion.
  • long tradition in the history of science that valorizes skepticism as an antidote to religious faith
  • scientists consciously rejected religion as a basis of natural knowledge, they held on to certain cultural presumptions about what kind of person had access to reliable knowledge.
  • they do practice a form of self-denial, denying themselves the right to believe anything that has not passed very high intellectual hurdles.
  • vigorously denying its relation to religion, modern science retains symbolic vestiges of prophetic tradition, so many scientists bend over backward to avoid these associations.
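In practice, the 5 percent convention described in these excerpts usually shows up as a p-value threshold. The short Python sketch below is a minimal illustration with invented numbers — 13 “hot” years out of 15 under an assumed coin-flip chance model — and the function name `binomial_p_value` is my own; neither the data nor the code comes from the op-ed.

```python
# A toy significance test illustrating the 5 percent convention discussed in
# the excerpts above. The data (13 "hot" years out of 15) and the coin-flip
# chance model are invented for illustration, not taken from the op-ed.
from math import comb

def binomial_p_value(successes, trials, p_chance=0.5):
    """One-sided p-value: probability of seeing at least `successes` out of
    `trials` if each trial were pure chance with probability `p_chance`."""
    return sum(
        comb(trials, k) * p_chance**k * (1 - p_chance) ** (trials - k)
        for k in range(successes, trials + 1)
    )

if __name__ == "__main__":
    p = binomial_p_value(successes=13, trials=15)
    alpha = 0.05  # the conventional 95 percent confidence threshold
    print(f"p-value = {p:.4f}")
    print("reject 'just chance'" if p < alpha else "withhold judgment")
```

Raising or lowering `alpha` is exactly the value judgment the excerpts describe: a stricter threshold guards against false alarms at the price of missing real effects, and nothing in nature fixes it at 0.05.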
Javier E

Elon studies future of "Generation Always-On" - 1 views

  • Elon studies the future of "Generation Always-On"
  • By the year 2020, it is expected that youth of the “always-on generation,” brought up from childhood with a continuous connection to each other and to information, will be nimble, quick-acting multitaskers who count on the Internet as their external brain and who approach problems in a different way from their elders. "There is no doubt that brains are being rewired,"
  • the Internet Center, refers to the teens-to-20s age group born since the turn of the century as Generation AO, for “always-on." “They have grown up in a world that has come to offer them instant access to nearly the entirety of human knowledge, and incredible opportunities to connect, create and collaborate,"
  • ...10 more annotations...
  • some said they are already witnessing deficiencies in young people’s abilities to focus their attention, be patient and think deeply. Some experts expressed concerns that trends are leading to a future in which most people become shallow consumers of information, endangering society.
  • Many of the respondents in this survey predict that Gen AO will exhibit a thirst for instant gratification and quick fixes and a lack of patience and deep-thinking ability due to what one referred to as “fast-twitch wiring.”
  • “The replacement of memorization by analysis will be the biggest boon to society since the coming of mass literacy in the late 19th to early 20th century.” — Paul Jones, University of North Carolina-Chapel Hill
  • “Teens find distraction while working, distraction while driving, distraction while talking to the neighbours. Parents and teachers will have to invest major time and efforts into solving this issue – silence zones, time-out zones, meditation classes without mobile, lessons in ignoring people.”
  • “Society is becoming conditioned into dependence on technology in ways that, if that technology suddenly disappears or breaks down, will render people functionally useless. What does that mean for individual and social resiliency?
  • “Short attention spans resulting from quick interactions will be detrimental to focusing on the harder problems and we will probably see a stagnation in many areas: technology, even social venues such as literature. The people who will strive and lead the charge will be the ones able to disconnect themselves to focus.”
  • “The underlying issue is that they will become dependent on the Internet in order to solve problems and conduct their personal, professional, and civic lives. Thus centralized powers that can control access to the Internet will be able to significantly control future generations. It will be much as in Orwell's 1984, where control was achieved by using language to shape and limit thought, so future regimes may use control of access to the Internet to shape and limit thought.”
  • “Increasingly, teens and young adults rely on the first bit of information they find on a topic, assuming that they have found the ‘right’ answer, rather than using context and vetting/questioning the sources of information to gain a holistic view of a topic.”
  • “Parents and kids will spend less time developing meaningful and bonded relationships in deference to the pursuit and processing of more and more segmented information competing for space in their heads, slowly changing their connection to humanity.”
  • “It’s simply not possible to discuss, let alone form societal consensus around major problems without lengthy, messy conversations about those problems. A generation that expects to spend 140 or fewer characters on a topic and rejects nuance is incapable of tackling these problems.”