
TOK Friends: Group items matching "Criticism" in title, tags, annotations or url


Javier E

Guns, Germs, and The Future of Us - Wyatt Edward Gates - Medium - 0 views

  • Jared Diamond’s seminal work Guns, Germs, and Steel has many flaws, but it provides some useful anecdotes about how narrative and consciousness shape the way human organization progresses
  • Past critical transformations of thought can help us see how we need to transform ourselves now in order to survive the future.
  • something both ancient and immediate: the way we define who is in our tribe plays a critical role in what kind of social organization we can build and maintain
  • You can’t have a blood family of 300 million, nor even a large enough one to do things like build an agrarian society
  • In order to have large cities built on agrarianism it was necessary not only to innovate technology, but to transform our very consciousness as it related to how we defined what a person was, both ourselves and others
  • Instead of needing to have real, flowing blood with common DNA from birth, it was merely necessary to be among the same abstract family organized under a king of some kind — a kind of stand in for the father or patriarch. We developed law and law enforcement as abstract disembodied voices of the father. This allowed total strangers without any family ties to interact in the same society in a constructive and organized way. Thus: civilization as we know it
  • Those ancient polities have developed finally into the Nation, a kind of tribe so fully abstracted that you can be of any blood and language and religion and still function within it.
  • So, too, are all other forms of human separation — and the opposition and conflicts they spawn — illusory in nature. We moved beyond blood, but then it was language or religion or fealty that made it impossible to work together, and we warred over that
  • we’re told these borders mean everything, that they are real and urgent and demand constant sacrifice to maintain.
  • why is that border there? Why borders?
  • We’re stuck in a mode of thinking that’s no longer sensible. There isn’t a reason for borders. There never really was, but now more than ever we have no utility for them, no need for them
  • What humanity has to do is wake up to the reality of post-tribalism. This means seeing through all these invented borders to the truth that we are all people, we are all fundamentally the same, and we can all learn to live with one another.
  • It was the idea of necessary conflict based on blood that preceded the fights that appeared to justify the belief in that blood-based conflict.
  • Nations have saturated the entire globe. There are no more frontiers. It’s all Nations butting up against one another.
  • We are all people of a similar nature and we do have the option to relate to one another as people for the sake of saving our shared homes and futures. We all hunger and thirst and become lonely, we all laugh and weep in the same language. Stripped of confounding symbols we are undivided.
  • There are a lot of people upset about the illusion of borders. They want a different reality, one in which there are Good Tribes (their tribe) and Bad Tribes (all the other ones).
  • but the world is already so mixed together they can’t draw those borders anymore. Hence: fascism.
  • There are no firm foundations for defining this tribe, however, so he’s left to cobble together some kind of ad hoc notion of in- and out-group. Like a magpie he collects ways of dividing people as appeals to his caprice: race, sex, Nation, etc., but there’s no greater sense to it, so it’s all arbitrary, all a mess.
  • No amount of magical thinking from conservatives can change the reality of globalism, however; what one Nation does to pollute will affect us all, and that is according to the laws of physics. No political movement can change those physics. We have to adapt or perish.
  • a key part of it is a simple lack of imagination. He just doesn’t realize there’s an option to not have borders, because his entire consciousness is married to the idea of of-me and not-of-me, Us and Them, and if there is no Them there can’t be an Us, and therefore life stops making sense
  • What has to be true if there are no tribes? We have no need to discriminate among who we may love. Loving and caring for all people as if they were blood family is the path forward
  • There needs to be a new story for us to share. It’s not enough to stop believing in the old way of borders, we have to actively seek out a new way of thinking and speaking and living that reflects the world as it is and as it can be.
  • there are others who have more tangible investments in borders: Those who have grown fat off the conflicts driven by these invented borders don’t want us to see how pointless it all is. These billionaires and presidents and kings want us to keep fighting against one another over the borders they so lazily define because it gives them a means of power and control.
  • We have to be ready for their opposition, however. They’ll do what they can to force us to act as if their borders are real. We don’t need to listen, though we do need to be ready to sacrifice.
  • Without a globally-coordinated response we can’t resolve a globally-driven problem such as climate change. If we can grant the humanity of all people we can start to imagine ways of relating to one another that aren’t opposed and antagonistic, but which are cooperative and aimed at harmony.
  • This transformation of consciousness must happen in our own hearts and minds before it can happen in concert.
  • the Nation has already been shown to be unnecessary because of social globalism. Pick a major city on earth and you’ll find every kind of person living together in peace! Not perfect peace, but not constant and unavoidable war, and that is what counts.
  • We can’t keep pretending as if borders matter when we can so clearly see that they don’t, but we can’t just have no story at all, there must be a way of contextualizing a future without borders. I don’t know what that story is, exactly, but I believe it is something like love writ large. Once we’re ready to start telling it we can start living it.
sissij

Google and Facebook Take Aim at Fake News Sites - The New York Times - 0 views

  • Over the last week, two of the world’s biggest internet companies have faced mounting criticism over how fake news on their sites may have influenced the presidential election’s outcome.
  • Hours later, Facebook, the social network, updated the language in its Facebook Audience Network policy, which already says it will not display ads in sites that show misleading or illegal content, to include fake news sites.
  • Google did not escape the glare, with critics saying the company gave too much prominence to false news stories.
  • Facebook has long spoken of how it helped influence and stoke democratic movements in places like the Middle East, and it tells its advertisers that it can help sway its users with ads.
  • It remains to be seen how effective Google’s new policy on fake news will be in practice. The policy will rely on a combination of automated and human reviews to help determine what is fake. Although satire sites like The Onion are not the target of the policy, it is not clear whether some of them, which often run fake news stories written for humorous effect, will be inadvertently affected by Google’s change.
  • Companies are starting to pay attention to fake news on social media. It reminded me of government involvement in economics. Although the internet should be a place of free speech, there is now such a mounting amount of fake news and alternative facts that companies need to regulate it and make rules to restrict it. I think as long as there is human society, we need rules. In free markets, we also need government regulation to maintain a balance. --Sissi (3/6/2017)
sissij

Trump Wants It Known: Grading 100 Days Is 'Ridiculous' (but His Were the Best) - The New York Times - 0 views

  • “As with so much else, Trump is a study in inconsistency,” said Robert Dallek, the presidential historian. “One minute he says his 100 days have been the best of any president, and the next minute he decries the idea of measuring a president by the 100 days.”
  • Mr. Trump has already told supporters not to believe contrary assessments, anticipating more critical evaluations by journalists, not to mention partisan attacks by Democrats.
  • If nothing else, Mr. Trump’s first 100 days have certainly been eventful.
  • Whether they have accomplished much is more a subject of debate.
  • Others were less weighty, like one officially naming a veterans’ health center in Butler County, Pa., the “Abie Abraham V.A. Clinic.”
  • To the extent that he is being held to a measurement he disdains, he has no one to blame but himself.
  • only one has even been introduced.
  • “It is hard to judge any of these other presidents after that, and I think all of them are cursing the idea that this got started,” said Doris Kearns Goodwin, author of “No Ordinary Time,” a book about Roosevelt. “That’s the one thing they might all agree on, the post-F.D.R. presidents: ‘No way; this isn’t fair.’”
  • “I don’t think the first 100 days are by themselves that important,” he said. “The first year is critically important, and the first 100 days set the tone for the first year.”
  • I think there is confirmation bias in Mr. Trump’s argument. He quoted previous presidents to suggest that the first 100 days of a presidency are not important. But when previous presidents said the "100 days" is not a fair grading mark, they meant that the time is too short to show anything, not that it is unimportant. I think Mr. Trump himself is not even convinced of that, since he tried so hard to make his first hundred days look good. Quantity does not equal quality. --Sissi (4/25/2017)
Javier E

There's No Such Thing As 'Sound Science' | FiveThirtyEight - 1 views

  • Science is being turned against itself. For decades, its twin ideals of transparency and rigor have been weaponized by those who disagree with results produced by the scientific method. Under the Trump administration, that fight has ramped up again.
  • The same entreaties crop up again and again: We need to root out conflicts. We need more precise evidence. What makes these arguments so powerful is that they sound quite similar to the points raised by proponents of a very different call for change that’s coming from within science.
  • Despite having dissimilar goals, the two forces espouse principles that look surprisingly alike: Science needs to be transparent. Results and methods should be openly shared so that outside researchers can independently reproduce and validate them. The methods used to collect and analyze data should be rigorous and clear, and conclusions must be supported by evidence.
  • they’re also used as talking points by politicians who are working to make it more difficult for the EPA and other federal agencies to use science in their regulatory decision-making, under the guise of basing policy on “sound science.” Science’s virtues are being wielded against it.
  • What distinguishes the two calls for transparency is intent: Whereas the “open science” movement aims to make science more reliable, reproducible and robust, proponents of “sound science” have historically worked to amplify uncertainty, create doubt and undermine scientific discoveries that threaten their interests.
  • “Our criticisms are founded in a confidence in science,” said Steven Goodman, co-director of the Meta-Research Innovation Center at Stanford and a proponent of open science. “That’s a fundamental difference — we’re critiquing science to make it better. Others are critiquing it to devalue the approach itself.”
  • Calls to base public policy on “sound science” seem unassailable if you don’t know the term’s history. The phrase was adopted by the tobacco industry in the 1990s to counteract mounting evidence linking secondhand smoke to cancer.
  • The sound science tactic exploits a fundamental feature of the scientific process: Science does not produce absolute certainty. Contrary to how it’s sometimes represented to the public, science is not a magic wand that turns everything it touches to truth. Instead, it’s a process of uncertainty reduction, much like a game of 20 Questions.
  • Any given study can rarely answer more than one question at a time, and each study usually raises a bunch of new questions in the process of answering old ones. “Science is a process rather than an answer,” said psychologist Alison Ledgerwood of the University of California, Davis. Every answer is provisional and subject to change in the face of new evidence. It’s not entirely correct to say that “this study proves this fact,” Ledgerwood said. “We should be talking instead about how science increases or decreases our confidence in something.”
  • While insisting that they merely wanted to ensure that public policy was based on sound science, tobacco companies defined the term in a way that ensured that no science could ever be sound enough. The only sound science was certain science, which is an impossible standard to achieve.
  • “Doubt is our product,” wrote one employee of the Brown & Williamson tobacco company in a 1969 internal memo. The note went on to say that doubt “is the best means of competing with the ‘body of fact’” and “establishing a controversy.” These strategies for undermining inconvenient science were so effective that they’ve served as a sort of playbook for industry interests ever since
  • Doubt merchants aren’t pushing for knowledge, they’re practicing what Proctor has dubbed “agnogenesis” — the intentional manufacture of ignorance. This ignorance isn’t simply the absence of knowing something; it’s a lack of comprehension deliberately created by agents who don’t want you to know,
  • In the hands of doubt-makers, transparency becomes a rhetorical move. “It’s really difficult as a scientist or policy maker to make a stand against transparency and openness, because well, who would be against it?
  • But at the same time, “you can couch everything in the language of transparency and it becomes a powerful weapon.” For instance, when the EPA was preparing to set new limits on particulate pollution in the 1990s, industry groups pushed back against the research and demanded access to primary data (including records that researchers had promised participants would remain confidential) and a reanalysis of the evidence. Their calls succeeded and a new analysis was performed. The reanalysis essentially confirmed the original conclusions, but the process of conducting it delayed the implementation of regulations and cost researchers time and money.
  • Delay is a time-tested strategy. “Gridlock is the greatest friend a global warming skeptic has,” said Marc Morano, a prominent critic of global warming research
  • which has received funding from the oil and gas industry. “We’re the negative force. We’re just trying to stop stuff.”
  • these ploys are getting a fresh boost from Congress. The Data Quality Act (also known as the Information Quality Act) was reportedly written by an industry lobbyist and quietly passed as part of an appropriations bill in 2000. The rule mandates that federal agencies ensure the “quality, objectivity, utility, and integrity of information” that they disseminate, though it does little to define what these terms mean. The law also provides a mechanism for citizens and groups to challenge information that they deem inaccurate, including science that they disagree with. “It was passed in this very quiet way with no explicit debate about it — that should tell you a lot about the real goals,” Levy said.
  • in the 20 months following its implementation, the act was repeatedly used by industry groups to push back against proposed regulations and bog down the decision-making process. Instead of deploying transparency as a fundamental principle that applies to all science, these interests have used transparency as a weapon to attack very particular findings that they would like to eradicate.
  • Now Congress is considering another way to legislate how science is used. The Honest Act, a bill sponsored by Rep. Lamar Smith of Texas (passed by the House but still awaiting a vote in the Senate), is another example of what Levy calls a “Trojan horse” law that uses the language of transparency as a cover to achieve other political goals. Smith’s legislation would severely limit the kind of evidence the EPA could use for decision-making. Only studies whose raw data and computer codes were publicly available would be allowed for consideration.
  • It might seem like an easy task to sort good science from bad, but in reality it’s not so simple. “There’s a misplaced idea that we can definitively distinguish the good from the not-good science, but it’s all a matter of degree,” said Brian Nosek, executive director of the Center for Open Science. “There is no perfect study.” Requiring regulators to wait until they have (nonexistent) perfect evidence is essentially “a way of saying, ‘We don’t want to use evidence for our decision-making,’
  • Most scientific controversies aren’t about science at all, and once the sides are drawn, more data is unlikely to bring opponents into agreement.
  • objective knowledge is not enough to resolve environmental controversies. “While these controversies may appear on the surface to rest on disputed questions of fact, beneath often reside differing positions of value; values that can give shape to differing understandings of what ‘the facts’ are.” What’s needed in these cases isn’t more or better science, but mechanisms to bring those hidden values to the forefront of the discussion so that they can be debated transparently. “As long as we continue down this unabashedly naive road about what science is, and what it is capable of doing, we will continue to fail to reach any sort of meaningful consensus on these matters,”
  • The dispute over tobacco was never about the science of cigarettes’ link to cancer. It was about whether companies have the right to sell dangerous products and, if so, what obligations they have to the consumers who purchased them.
  • Similarly, the debate over climate change isn’t about whether our planet is heating, but about how much responsibility each country and person bears for stopping it
  • While researching her book “Merchants of Doubt,” science historian Naomi Oreskes found that some of the same people who were defending the tobacco industry as scientific experts were also receiving industry money to deny the role of human activity in global warming. What these issues had in common, she realized, was that they all involved the need for government action. “None of this is about the science. All of this is a political debate about the role of government,”
  • These controversies are really about values, not scientific facts, and acknowledging that would allow us to have more truthful and productive debates. What would that look like in practice? Instead of cherry-picking evidence to support a particular view (and insisting that the science points to a desired action), the various sides could lay out the values they are using to assess the evidence.
  • For instance, in Europe, many decisions are guided by the precautionary principle — a system that values caution in the face of uncertainty and says that when the risks are unclear, it should be up to industries to show that their products and processes are not harmful, rather than requiring the government to prove that they are harmful before they can be regulated. By contrast, U.S. agencies tend to wait for strong evidence of harm before issuing regulations
  • the difference between them comes down to priorities: Is it better to exercise caution at the risk of burdening companies and perhaps the economy, or is it more important to avoid potential economic downsides even if it means that sometimes a harmful product or industrial process goes unregulated?
  • But science can’t tell us how risky is too risky to allow products like cigarettes or potentially harmful pesticides to be sold — those are value judgements that only humans can make.
lucieperloff

Biden Administration Ramps Up Debt Relief Program to Help Black Farmers - The New York Times - 0 views

  • pledging to reverse decades of discriminatory agricultural lending and subsidy policies that have left Black farmers at an economic disadvantage and is racing to deploy $5 billion in aid and debt relief to help them.
  • Now the department is in the middle of a drastic overhaul, both of its personnel and of policies that it acknowledges have perpetuated inequality in rural America for years.
  • root out the vestiges of racism at his agency and to redress “systemic discrimination” that Black farmers had faced.
  • The Agriculture Department has faced sharp criticism from minority farmer groups for lacking diversity and ignoring complaints of bias in its programs.
  • That includes $8 billion that the Agriculture Department will use toward crop purchases, grants to food processors and distributors, and other programs to help farmers struggling with the pandemic.
  • The bill also provides “sums as may be necessary” from the Treasury Department to help minority farmers and ranchers pay off loans granted or guaranteed by the Agriculture Department, providing debt relief and aid for members of minority racial and ethnic groups that have long experienced discrimination at the hands of the government.
  • With the onset of the coronavirus, many farmers were forced to plow under crops or dump their milk, even as grocery store shelves emptied out and many American families went hungry.
  • A driving force behind the provisions for minority farmers was Raphael Warnock, the Democratic senator from Georgia whose election in January helped give Democrats control of the chamber.
  • Those factors led to a substantial loss of land.
  • The bill would provide billions of dollars in rental and utility assistance to people who are struggling and in danger of being evicted from their homes.
  • “If we had the same amount of investment that the other farmers had, a lot of Black farmers would still be farming this date.”
  • Injecting race into the relief effort has stirred backlash and criticism from some Republican lawmakers who have described the program as a kind of “reparations” for discrimination toward Black farmers.
  • “The same people who are complaining and using language to try to vilify this priority for African-American farmers are the same people who didn’t say a peep when tens of billions of dollars went out to the richest farmers in America,”
  • Mr. Vilsack is creating a commission to be a watchdog for racial equity issues at the agency
  • “Black farmers don’t trust the United States Department of Agriculture.”
caelengrubb

Free Market - Econlib - 0 views

  • “Free market” is a summary term for an array of exchanges that take place in society.
  • Each exchange is undertaken as a voluntary agreement between two people or between groups of people represented by agents. These two individuals (or agents) exchange two economic goods, either tangible commodities or nontangible services
  • Both parties undertake the exchange because each expects to gain from it. Also, each will repeat the exchange next time (or refuse to) because his expectation has proved correct (or incorrect) in the recent past.
  • Trade, or exchange, is engaged in precisely because both parties benefit; if they did not expect to gain, they would not agree to the exchange.
  • This simple reasoning refutes the argument against free trade typical of the “mercantilist” period of sixteenth- to eighteenth-century Europe and classically expounded by the famed sixteenth-century French essayist Montaigne.
  • The mercantilists argued that in any trade, one party can benefit only at the expense of the other—that in every transaction there is a winner and a loser, an “exploiter” and an “exploited.”
  • We can immediately see the fallacy in this still-popular viewpoint: the willingness and even eagerness to trade means that both parties benefit. In modern game-theory jargon, trade is a win-win situation, a “positive-sum” rather than a “zero-sum” or “negative-sum” game.
  • Each one values the two goods or services differently, and these differences set the scene for an exchange.
  • Two factors determine the terms of any agreement: how much each participant values each good in question, and each participant’s bargaining skills.
  • the market in relation to how favorably buyers evaluate these goods—in shorthand, by the interaction of their supply with the demand for them.
  • On the other hand, given the buyers’ evaluation, or demand, for a good, if the supply increases, each unit of supply—each baseball card or loaf of bread—will fall in value, and therefore the price of the good will fall. The reverse occurs if the supply of the good decreases.
  • The market, then, is not simply an array; it is a highly complex, interacting latticework of exchanges.
  • Production begins with natural resources, and then various forms of machines and capital goods, until finally, goods are sold to the consumer.
  • At each stage of production from natural resource to consumer good, money is voluntarily exchanged for capital goods, labor services, and land resources. At each step of the way, terms of exchanges, or prices, are determined by the voluntary interactions of suppliers and demanders. This market is “free” because choices, at each step, are made freely and voluntarily.
  • The free market and the free price system make goods from around the world available to consumers.
  • Saving and investment can then develop capital goods and increase the productivity and wages of workers, thereby increasing their standard of living.
  • The free competitive market also rewards and stimulates technological innovation that allows the innovator to get a head start in satisfying consumer wants in new and creative ways.
  • Government, in every society, is the only lawful system of coercion. Taxation is a coerced exchange, and the heavier the burden of taxation on production, the more likely it is that economic growth will falter and decline
  • The ultimate in government coercion is socialism.
  • Under socialist central planning the socialist planning board lacks a price system for land or capital goods.
  • Market socialism is, in fact, a contradiction in terms.
  • The fashionable discussion of market socialism often overlooks one crucial aspect of the market: When two goods are exchanged, what is really exchanged is the property titles in those goods.
  • This means that the key to the existence and flourishing of the free market is a society in which the rights and titles of private property are respected, defended, and kept secure.
  • The key to socialism, on the other hand, is government ownership of the means of production, land, and capital goods.
  • Under socialism, therefore, there can be no market in land or capital goods worthy of the name.
  • Some critics of the free market argue that property rights are in conflict with “human” rights. But the critics fail to realize that in a free-market system, every person has a property right over his own person and his own labor and can make free contracts for those services.
  • A common charge against the free-market society is that it institutes “the law of the jungle,” of “dog eat dog,” that it spurns human cooperation for competition and exalts material success as opposed to spiritual values, philosophy, or leisure activities.
  • It is the coercive countries with little or no market activity—the notable examples in the last half of the twentieth century were the communist countries—where the grind of daily existence not only impoverishes people materially but also deadens their spirit.
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the court
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
Javier E

Revisiting the prophetic work of Neil Postman about the media » MercatorNet - 1 views

  • The NYU professor was surely prophetic. “Our own tribe is undergoing a vast and trembling shift from the magic of writing to the magic of electronics,” he cautioned.
  • “We face the rapid dissolution of the assumptions of an education organised around the slow-moving printed word, and the equally rapid emergence of a new education based on the speed-of-light electronic message.”
  • What Postman perceived in television has been dramatically intensified by smartphones and social media
  • Postman also recognised that technology was changing our mental processes and social habits.
  • Today corporations like Google and Amazon collect data on Internet users based on their browsing history, the things they purchase, and the apps they use
  • Yet all citizens are undergoing this same transformation. Our digital devices undermine social interactions by isolating us,
  • “Years from now, it will be noticed that the massive collection and speed-of-light retrieval of data have been of great value to large-scale organisations, but have solved very little of importance to most people, and have created at least as many problems for them as they may have solved.”
  • “Television has by its power to control the time, attention, and cognitive habits of our youth gained the power to control their education.”
  • As a student of Canadian philosopher Marshall McLuhan, Postman believed that the medium of information was critical to understanding its social and political effects. Every technology has its own agenda. Postman worried that the very nature of television undermined American democratic institutions.
  • Many Americans tuned in to the presidential debate looking for something substantial and meaty
  • It was simply another manifestation of the incoherence and vitriol of cable news
  • “When, in short, a people become an audience and their public business a vaudeville act, then a nation finds itself at risk; culture-death is a clear possibility,” warned Postman.
  • Technology Is Never Neutral
  • As for new problems, we have increased addictions (technological and pornographic); increased loneliness, anxiety, and distraction; and inhibited social and intellectual maturation.
  • The average length of a shot on network television is only 3.5 seconds, so that the eye never rests, always has something new to see. Moreover, television offers viewers a variety of subject matter, requires minimal skills to comprehend it, and is largely aimed at emotional gratification.
  • This is far truer of the Internet and social media, where more than a third of Americans, and almost half of young people, now get their news.
  • with smartphones now ubiquitous, the Internet has replaced television as the “background radiation of the social and intellectual universe.”
  • Is There Any Solution?
  • Reading news or commentary in print, in contrast, requires concentration, patience, and careful reflection, virtues that our digital age vitiates.
  • Politics as Entertainment
  • “How television stages the world becomes the model for how the world is properly to be staged,” observed Postman. In the case of politics, television fashions public discourse into yet another form of entertainment
  • In America, the fundamental metaphor for political discourse is the television commercial. The television commercial is not at all about the character of products to be consumed. … They tell everything about the fears, fancies, and dreams of those who might buy them.
  • The television commercial has oriented business away from making products of value and towards making consumers feel valuable, which means that the business of business has now become pseudo-therapy. The consumer is a patient assured by psycho-dramas.
  • Such is the case with the way politics is “advertised” to different subsets of the American electorate. The “consumer,” depending on his political leanings, may be manipulated by fears of either an impending white-nationalist, fascist dictatorship, or a radical, woke socialist takeover.
  • This paradigm is aggravated by the hypersiloing of media content, which explains why Americans who read left-leaning media view the Proud Boys as a legitimate, existential threat to national civil order, while those who read right-leaning media believe the real immediate enemies of our nation are Antifa
  • Regardless of whether either of these groups represents a real public menace, the loss of any national consensus over what constitutes objective news means that Americans effectively talk past one another: they use the Proud Boys or Antifa as rhetorical barbs to smear their ideological opponents as extremists.
  • Yet these technologies are far from neutral. They are, rather, “equipped with a program for social change.
  • Postman’s analysis of technology is prophetic and profound. He warned of the trivialising of our media, defined by “broken time and broken attention,” in which “facts push other facts into and then out of consciousness at speeds that neither permit nor require evaluation.” He warned of “a neighborhood of strangers and pointless quantity.”
  • does Postman offer any solutions to this seemingly uncontrollable technological juggernaut?
  • Postman’s suggestions regarding education are certainly relevant. He unequivocally condemned education that mimics entertainment, and urged a return to learning that is hierarchical, meaning that it first gives students a foundation of essential knowledge before teaching “critical thinking.”
  • Postman also argued that education must avoid a lowest-common-denominator approach in favor of complexity and the perplexing: the latter method elicits in the student a desire to make sense of what perplexes him.
  • Finally, Postman promoted education of vigorous exposition, logic, and rhetoric, all being necessary for citizenship
  • Another course of action is to understand what these media, by their very nature, do to us and to public discourse.
  • We must, as Postman exhorts us, “demystify the data” and dominate our technology, lest it dominate us. We must identify and resist how television, social media, and smartphones manipulate our emotions, infantilise us, and weaken our ability to rebuild what 2020 has ravaged.
caelengrubb

Cognitive Bias and Public Health Policy During the COVID-19 Pandemic | Critical Care Medicine | JAMA | JAMA Network - 0 views

  • As the coronavirus disease 2019 (COVID-19) pandemic abates in many countries worldwide, and a new normal phase arrives, critically assessing policy responses to this public health crisis may promote better preparedness for the next wave or the next pandemic
  • A key lesson is revealed by one of the earliest and most sizeable US federal responses to the pandemic: the investment of $3 billion to build more ventilators. These extra ventilators, even had they been needed, would likely have done little to improve population survival because of the high mortality among patients with COVID-19 who require mechanical ventilation and diversion of clinicians away from more health-promoting endeavors.
  • Why are so many people distressed at the possibility that a patient in plain view—such as a person presenting to an emergency department with severe respiratory distress—would be denied an attempt at rescue because of a ventilator shortfall, but do not mount similarly impassioned concerns regarding failures to implement earlier, more aggressive physical distancing, testing, and contact tracing policies that would have saved far more lives?
  • ...12 more annotations...
  • These cognitive errors, which distract leaders from optimal policy making and citizens from taking steps to promote their own and others’ interests, cannot merely be ascribed to repudiations of science.
  • The first error that thwarts effective policy making during crises stems from what economists have called the “identifiable victim effect.” Humans respond more aggressively to threats to identifiable lives, ie, those that an individual can easily imagine being their own or belonging to people they care about (such as family members) or care for (such as a clinician’s patients) than to the hidden, “statistical” deaths reported in accounts of the population-level tolls of the crisis
  • Yet such views represent a second reason for the broad endorsement of policies that prioritize saving visible, immediately jeopardized lives: that humans are imbued with a strong and neurally mediated tendency to predict outcomes that are systematically more optimistic than observed outcomes
  • A third driver of misguided policy responses is that humans are present biased, ie, people tend to prefer immediate benefits to even larger benefits in the future.
  • Even if the tendency to prioritize visibly affected individuals could be resisted, many people would still place greater value on saving a life today than a life tomorrow.
  • Similar psychology helps explain the reluctance of many nations to limit refrigeration and air conditioning, forgo fuel-inefficient transportation, and take other near-term steps to reduce the future effects of climate change
  • The fourth contributing factor is that virtually everyone is subject to omission bias, which involves the tendency to prefer that a harm occur by failure to take action rather than as a direct consequence of the actions that are taken
  • Although those who set policies for rationing ventilators and other scarce therapies do not intend the deaths of those who receive insufficient priority for these treatments, such policies nevertheless prevent clinicians from taking all possible steps to save certain lives.
  • An important goal of governance is to mitigate the effects of these and other biases on public policy and to effectively communicate the reasons for difficult decisions to the public. However, health systems’ routine use of wartime terminology of “standing up” and “standing down” intensive care units illustrates problematic messaging aimed at the need to address immediate danger
  • Second, had governments, health systems, and clinicians better understood the “identifiable victim effect,” they may have realized that promoting flattening the curve as a way to reduce pressure on hospitals and health care workers would be less effective than promoting early restaurant and retail store closures by saying “The lives you save when you close your doors include your own.”
  • Third, these leaders’ routine use of terms such as “nonpharmaceutical interventions” portrays public health responses negatively by labeling them according to what they are not. Instead, support for heavily funding contact tracing could have been generated by communicating such efforts as “lifesaving.”
  • Fourth, although errors of human cognition are challenging to surmount, policy making, even in a crisis, occurs over a sufficient period to be meaningfully improved by deliberate efforts to counter untoward biases
Javier E

Opinion | The 1619 Chronicles - The New York Times - 0 views

  • The 1619 Project introduced a date, previously obscure to most Americans, that ought always to have been thought of as seminal — and probably now will. It offered fresh reminders of the extent to which Black freedom was a victory gained by courageous Black Americans, and not just a gift obtained from benevolent whites.
  • in a point missed by many of the 1619 Project’s critics, it does not reject American values. As Nikole Hannah-Jones, its creator and leading voice, concluded in her essay for the project, “I wish, now, that I could go back to the younger me and tell her that her people’s ancestry started here, on these lands, and to boldly, proudly, draw the stars and those stripes of the American flag.” It’s an unabashedly patriotic thought.
  • ambition can be double-edged. Journalists are, most often, in the business of writing the first rough draft of history, not trying to have the last word on it. We are best when we try to tell truths with a lowercase t, following evidence in directions unseen, not the capital-T truth of a pre-established narrative in which inconvenient facts get discarded
  • ...25 more annotations...
  • on these points — and for all of its virtues, buzz, spinoffs and a Pulitzer Prize — the 1619 Project has failed.
  • That doesn’t mean that the project seeks to erase the Declaration of Independence from history. But it does mean that it seeks to dethrone the Fourth of July by treating American history as a story of Black struggle against white supremacy — of which the Declaration is, for all of its high-flown rhetoric, supposed to be merely a part.
  • The deleted assertions went to the core of the project’s most controversial goal, “to reframe American history by considering what it would mean to regard 1619 as our nation’s birth year.”
  • She then challenged me to find any instance in which the project stated that “using 1776 as our country’s birth date is wrong,” that it “should not be taught to schoolchildren,” and that the only one “that should be taught” was 1619. “Good luck unearthing any of us arguing that,” she added.
  • I emailed her to ask if she could point to any instances before this controversy in which she had acknowledged that her claims about 1619 as “our true founding” had been merely metaphorical. Her answer was that the idea of treating the 1619 date metaphorically should have been so obvious that it went without saying.
  • “1619. It is not a year that most Americans know as a notable date in our country’s history. Those who do are at most a tiny fraction of those who can tell you that 1776 is the year of our nation’s birth. What if, however, we were to tell you that this fact, which is taught in our schools and unanimously celebrated every Fourth of July, is wrong, and that the country’s true birth date, the moment that its defining contradictions first came into the world, was in late August of 1619?”
  • Here is an excerpt from the introductory essay to the project by The New York Times Magazine’s editor, Jake Silverstein, as it appeared in print in August 2019 (italics added):
  • In his introduction, Silverstein argues that America’s “defining contradictions” were born in August 1619, when a ship carrying 20 to 30 enslaved Africans from what is present-day Angola arrived in Point Comfort, in the English colony of Virginia. And the title page of Hannah-Jones’s essay for the project insists that “our founding ideals of liberty and equality were false when they were written.”
  • What was surprising was that in 1776 a politically formidable “defining contradiction” — “that all men are created equal” — came into existence through the Declaration of Independence. As Abraham Lincoln wrote in 1859, that foundational document would forever serve as a “rebuke and stumbling block to the very harbingers of reappearing tyranny and oppression.”
  • As for the notion that the Declaration’s principles were “false” in 1776, ideals aren’t false merely because they are unrealized, much less because many of the men who championed them, and the nation they created, hypocritically failed to live up to them.
  • These two flaws led to a third, conceptual, error. “Out of slavery — and the anti-Black racism it required — grew nearly everything that has truly made America exceptional,” writes Silverstein.
  • Nearly everything? What about, say, the ideas contained by the First Amendment? Or the spirit of openness that brought millions of immigrants through places like Ellis Island? Or the enlightened worldview of the Marshall Plan and the Berlin airlift? Or the spirit of scientific genius and discovery exemplified by the polio vaccine and the moon landing?
  • On the opposite side of the moral ledger, to what extent does anti-Black racism figure in American disgraces such as the brutalization of Native Americans, the Chinese Exclusion Act or the internment of Japanese-Americans in World War II?
  • The world is complex. So are people and their motives. The job of journalism is to take account of that complexity, not simplify it out of existence through the adoption of some ideological orthodoxy.
  • This mistake goes far to explain the 1619 Project’s subsequent scholarly and journalistic entanglements. It should have been enough to make strong yet nuanced claims about the role of slavery and racism in American history. Instead, it issued categorical and totalizing assertions that are difficult to defend on close examination.
  • It should have been enough for the project to serve as curator for a range of erudite and interesting voices, with ample room for contrary takes. Instead, virtually every writer in the project seems to sing from the same song sheet, alienating other potential supporters of the project and polarizing national debate.
  • James McPherson, the Pulitzer Prize-winning author of “Battle Cry of Freedom” and a past president of the American Historical Association. He was withering: “Almost from the outset,” McPherson told the World Socialist Web Site, “I was disturbed by what seemed like a very unbalanced, one-sided account, which lacked context and perspective.”
  • In particular, McPherson objected to Hannah-Jones’s suggestion that the struggle against slavery and racism and for civil rights and democracy was, if not exclusively then mostly, a Black one. As she wrote in her essay: “The truth is that as much democracy as this nation has today, it has been borne on the backs of Black resistance.”
  • McPherson demurs: “From the Quakers in the 18th century, on through the abolitionists in the antebellum, to the Radical Republicans in the Civil War and Reconstruction, to the N.A.A.C.P., which was an interracial organization founded in 1909, down through the civil rights movements of the 1950s and 1960s, there have been a lot of whites who have fought against slavery and racial discrimination, and against racism,” he said. “And that’s what’s missing from this perspective.”
  • Wilentz’s catalog of the project’s mistakes is extensive. Hannah-Jones’s essay claimed that by 1776 Britain was “deeply conflicted” over its role in slavery. But despite the landmark Somerset v. Stewart court ruling in 1772, which held that slavery was not supported by English common law, it remained deeply embedded in the practices of the British Empire. The essay claimed that, among Londoners, “there were growing calls to abolish the slave trade” by 1776. But the movement to abolish the British slave trade only began about a decade later — inspired, in part, Wilentz notes, by American antislavery agitation that had started in the 1760s and 1770s.
  • Leslie M. Harris, an expert on pre-Civil War African-American life and slavery. “On Aug. 19 of last year,” Harris wrote, “I listened in stunned silence as Nikole Hannah-Jones … repeated an idea that I had vigorously argued against with her fact checker: that the patriots fought the American Revolution in large part to preserve slavery in North America.”
  • The larger problem is that The Times’s editors, however much background reading they might have done, are not in a position to adjudicate historical disputes. That should have been an additional reason for the 1619 Project to seek input from, and include contributions by, an intellectually diverse range of scholarly voices. Yet not only does the project choose a side, it also brooks no doubt.
  • “It is finally time to tell our story truthfully,” the magazine declares on its 1619 cover page. Finally? Truthfully? Is The Times suggesting that distinguished historians, like the ones who have seriously disputed aspects of the project, had previously been telling half-truths or falsehoods?
  • unlike other dates, 1776 uniquely marries letter and spirit, politics and principle: The declaration that something new is born, combined with the expression of an ideal that — because we continue to believe in it even as we struggle to live up to it — binds us to the date.
  • On the other, the 1619 Project has become, partly by its design and partly because of avoidable mistakes, a focal point of the kind of intense national debate that columnists are supposed to cover, and that is being widely written about outside The Times. To avoid writing about it on account of the first scruple is to be derelict in our responsibility toward the second.
Javier E

Why Study History? (1985) | AHA - 0 views

  • Isn't there quite enough to learn about the world today? Why add to the burden by looking at the past?
  • Historical knowledge is no more and no less than carefully and critically constructed collective memory. As such it can both make us wiser in our public choices and more richly human in our private lives.
  • Without individual memory, a person literally loses his or her identity, and would not know how to act in encounters with others. Imagine waking up one morning unable to tell total strangers from family and friends!
  • ...37 more annotations...
  • Collective memory is similar, though its loss does not immediately paralyze everyday private activity. But ignorance of history-that is, absent or defective collective memory-does deprive us of the best available guide for public action, especially in encounters with outsiders
  • Often it is enough for experts to know about outsiders, if their advice is listened to. But democratic citizenship and effective participation in the determination of public policy require citizens to share a collective memory, organized into historical knowledge and belief
  • This value of historical knowledge obviously justifies teaching and learning about what happened in recent times, for the way things are descends from the way they were yesterday and the day before that
  • in fact, institutions that govern a great deal of our everyday behavior took shape hundreds or even thousands of years ago
  • Only an acquaintance with the entire human adventure on earth allows us to understand these dimensions of contemporary reality.
  • it follows that study of history is essential for every young person.
  • Collective memory is quite the same. Historians are always at work reinterpreting the past, asking new questions, searching new sources and finding new meanings in old documents in order to bring the perspective of new knowledge and experience to bear on the task of understanding the past.
  • what we know and believe about history is always changing. In other words, our collective, codified memory alters with time just as personal memories do, and for the same reasons.
  • skeptics are likely to conclude that history has no right to take student time from other subjects. If what is taught today is not really true, how can it claim space in a crowded school curriculum?
  • what if the world is more complicated and diverse than words can ever tell? What if human minds are incapable of finding neat pigeon holes into which everything that happens will fit?
  • What if we have to learn to live with uncertainty and probabilities, and act on the basis of the best guesswork we are capable of?
  • Then, surely, the changing perspectives of historical understanding are the very best introduction we can have to the practical problems of real life. Then, surely, a serious effort to understand the interplay of change and continuity in human affairs is the only adequate introduction human beings can have to the confusing flow of events that constitutes the actual, adult world.
  • Memory is not something fixed and forever. As time passes, remembered personal experiences take on new meanings.
  • Early in this century, teachers and academic administrators pretty well agreed that two sorts of history courses were needed: a survey of the national history of the United States and a survey of European history.
  • Memory, indeed, makes us human. History, our collective memory, carefully codified and critically revised, makes us social, sharing ideas and ideals with others so as to form all sorts of different human groups
  • The varieties of history are enormous; facts and probabilities about the past are far too numerous for anyone to comprehend them all. Every sort of human group has its own history
  • Where to start? How bring some sort of order to the enormous variety of things known and believed about the past?
  • Systematic sciences are not enough. They discount time, and therefore oversimplify reality, especially human reality.
  • This second course was often broadened into a survey of Western civilization in the 1930s and 1940s
  • But by the 1960s and 1970s these courses were becoming outdated, left behind by the rise of new kinds of social and quantitative history, especially the history of women, of Blacks, and of other formerly overlooked groups within the borders of the United States, and of peoples emerging from colonial status in the world beyond our borders.
  • much harder to combine old with new to make an inclusive, judiciously balanced (and far less novel) introductory course for high school or college students.
  • But abandoning the effort to present a meaningful portrait of the entire national and civilizational past destroyed the original justification for requiring students to study history
  • Competing subjects abounded, and no one could or would decide what mattered most and should take precedence. As this happened, studying history became only one among many possible ways of spending time in school.
  • The costs of this change are now becoming apparent, and many concerned persons agree that returning to a more structured curriculum, in which history ought to play a prominent part, is imperative.
  • three levels of generality seem likely to have the greatest importance for ordinary people.
  • First is family, local, neighborhood history
  • Second is national history, because that is where political power is concentrated in our time.
  • Last is global history, because intensified communications make encounters with all the other peoples of the earth increasingly important.
  • Other pasts are certainly worth attention, but are better studied in the context of a prior acquaintance with personal-local, national, and global history. That is because these three levels are the ones that affect most powerfully what all other groups and segments of society actually do.
  • National history that leaves out Blacks and women and other minorities is no longer acceptable; but American history that leaves out the Founding Fathers and the Constitution is not acceptable either. What is needed is a vision of the whole, warts and all.
  • the study of history does not lead to exact prediction of future events. Though it fosters practical wisdom, knowledge of the past does not permit anyone to know exactly what is going to happen
  • Consequently, the lessons of history, though supremely valuable when wisely formulated, become grossly misleading when oversimplifiers try to transfer them mechanically from one age to another, or from one place to another.
  • Predictable fixity is simply not the human way of behaving. Probabilities and possibilities-together with a few complete surprises-are what we live with and must learn to expect.
  • Second, as acquaintance with the past expands, delight in knowing more and more can and often does become an end in itself.
  • On the other hand, studying alien religious beliefs, strange customs, diverse family patterns and vanished social structures shows how differently various human groups have tried to cope
  • Broadening our humanity and extending our sensibilities by recognizing sameness and difference throughout the recorded past is therefore an important reason for studying history, and especially the history of peoples far away and long ago
  • For we can only know ourselves by knowing how we resemble and how we differ from others. Acquaintance with the human past is the only way to such self knowledge.
Javier E

The View from Nowhere: Questions and Answers » Pressthink - 2 views

  • In pro journalism, American style, the View from Nowhere is a bid for trust that advertises the viewlessness of the news producer. Frequently it places the journalist between polarized extremes, and calls that neither-nor position “impartial.” Second, it’s a means of defense against a style of criticism that is fully anticipated: charges of bias originating in partisan politics and the two-party system. Third: it’s an attempt to secure a kind of universal legitimacy that is implicitly denied to those who stake out positions or betray a point of view. American journalists have almost a lust for the View from Nowhere because they think it has more authority than any other possible stance.
  • Who gets credit for the phrase, “view from nowhere?” # A. The philosopher Thomas Nagel, who wrote a very important book with that title.
  • Q. What does it say? # A. It says that human beings are, in fact, capable of stepping back from their position to gain an enlarged understanding, which includes the more limited view they had before the step back. Think of the cinema: when the camera pulls back to reveal where a character had been standing and shows us a fuller tableau. To Nagel, objectivity is that kind of motion. We try to “transcend our particular viewpoint and develop an expanded consciousness that takes in the world more fully.” #
  • ...11 more annotations...
  • But there are limits to this motion. We can’t transcend all our starting points. No matter how far it pulls back the camera is still occupying a position. We can’t actually take the “view from nowhere,” but this doesn’t mean that objectivity is a lie or an illusion. Our ability to step back and the fact that there are limits to it– both are real. And realism demands that we acknowledge both.
  • Q. So is objectivity a myth… or not? # A. One of the many interesting things Nagel says in that book is that “objectivity is both underrated and overrated, sometimes by the same persons.” It’s underrated by those who scoff at it as a myth. It is overrated by people who think it can replace the view from somewhere or transcend the human subject. It can’t.
  • When MSNBC suspends Keith Olbermann for donating without company permission to candidates he supports– that’s dumb. When NPR forbids its “news analysts” from expressing a view on matters they are empowered to analyze– that’s dumb. When reporters have to “launder” their views by putting them in the mouths of think tank experts: dumb. When editors at the Washington Post decline even to investigate whether the size of rallies on the Mall can be reliably estimated because they want to avoid charges of “leaning one way or the other,” as one of them recently put it, that is dumb. When CNN thinks that, because it’s not MSNBC and it’s not Fox, it’s the only “real news network” on cable, CNN is being dumb about itself.
  • Let some in the press continue on with the mask of impartiality, which has advantages for cultivating sources and soothing advertisers. Let others experiment with transparency as the basis for trust. When you click on their by-line it takes you to a disclosure page where there is a bio, a kind of mission statement, and a creative attempt to say: here’s where I’m coming from (one example) along with campaign contributions, any affiliations or memberships, and–I’m just speculating now–a list of heroes and villains, or major influences, along with an archive of the work, plus anything else that might assist the user in placing this person on the user’s mattering map.
  • if objectivity means trying to ground truth claims in verifiable facts, I am definitely for that. If it means there’s a “hard” reality out there that exists beyond any of our descriptions of it, sign me up. If objectivity is the requirement to acknowledge what is, regardless of whether we want it to be that way, then I want journalists who can be objective in that sense.
  • If it means trying to see things in that fuller perspective Thomas Nagel talked about–pulling the camera back, revealing our previous position as only one of many–I second the motion. If it means the struggle to get beyond the limited perspective that our experience and upbringing afford us… yeah, we need more of that, not less. I think there is value in acts of description that do not attempt to say whether the thing described is good or bad
  • I think we are in the midst of a shift in the system by which trust is sustained in professional journalism. David Weinberger tried to capture it with his phrase: transparency is the new objectivity. My version of that: it’s easier to trust in “here’s where I’m coming from” than the View from Nowhere. These are two different ways of bidding for the confidence of the users.
  • In the newer way, the logic is different. “Look, I’m not going to pretend that I have no view. Instead, I am going to level with you about where I’m coming from on this. So factor that in when you evaluate my report. Because I’ve done the work and this is what I’ve concluded…”
  • it has unearned authority in the American press. If in doing the serious work of journalism–digging, reporting, verification, mastering a beat–you develop a view, expressing that view does not diminish your authority. It may even add to it. The View from Nowhere doesn’t know from this. It also encourages journalists to develop bad habits. Like: criticism from both sides is a sign that you’re doing something right, when you could be doing everything wrong.
melnikju

ABOUT EDUCATION; CRITICAL THINKING - The New York Times - 1 views

    • melnikju
       
      Essentially what our TOK class is!
  • Often, it is more difficult to find out what causes a problem than to solve it
  • Everyday problems usually do not have one ''right'' solution.
  • ...2 more annotations...
    • melnikju
       
      Extremely stressful for high school and college students because your choices can decide your entire future
  • Finally, somebody warned some time ago that when a problem is insoluble, it is not a problem: it is a fact
Javier E

Facebook and Twitter Dodge a 2016 Repeat, and Ignite a 2020 Firestorm - The New York Times - 1 views

  • It’s true that banning links to a story published by a 200-year-old American newspaper — albeit one that is now a Rupert Murdoch-owned tabloid — is a more dramatic step than cutting off WikiLeaks or some lesser-known misinformation purveyor. Still, it’s clear that what Facebook and Twitter were actually trying to prevent was not free expression, but a bad actor using their services as a conduit for a damaging cyberattack or misinformation.
  • These decisions get made quickly, in the heat of the moment, and it’s possible that more contemplation and debate would produce more satisfying choices. But time is a luxury these platforms don’t always have. In the past, they have been slow to label or remove dangerous misinformation about Covid-19, mail-in voting and more, and have only taken action after the bad posts have gone viral, defeating the purpose.
  • That left the companies with three options, none of them great. Option A: They could treat the Post’s article as part of a hack-and-leak operation, and risk a backlash if it turned out to be more innocent. Option B: They could limit the article’s reach, allowing it to stay up but choosing not to amplify it until more facts emerged. Or, Option C: They could do nothing, and risk getting played again by a foreign actor seeking to disrupt an American election.
  • ...8 more annotations...
  • On Wednesday, several prominent Republicans, including Mr. Trump, repeated their calls for Congress to repeal Section 230 of the Communications Decency Act, a law that shields tech platforms from many lawsuits over user-generated content.
  • That leaves the companies in a precarious spot. They are criticized when they allow misinformation to spread. They are also criticized when they try to prevent it.
  • Perhaps the strangest idea to emerge in the past couple of days, though, is that these services are only now beginning to exert control over what we see. Representative Doug Collins, Republican of Georgia, made this point in a letter to Mark Zuckerberg, the chief executive of Facebook, in which he derided the social network for using “its monopoly to control what news Americans have access to.”
  • The truth, of course, is that tech platforms have been controlling our information diets for years, whether we realized it or not. Their decisions were often buried in obscure “community standards” updates, or hidden in tweaks to the black-box algorithms that govern which posts users see.
  • Their leaders have always been editors masquerading as engineers.
  • What’s happening now is simply that, as these companies move to rid their platforms of bad behavior, their influence is being made more visible.
  • Rather than letting their algorithms run amok (which is an editorial choice in itself), they’re making high-stakes decisions about flammable political misinformation in full public view, with human decision makers who can be debated and held accountable for their choices.
  • After years of inaction, Facebook and Twitter are finally starting to clean up their messes. And in the process, they’re enraging the powerful people who have thrived under the old system.
Javier E

Harold Bloom Is Dead. But His 'Rage for Reading' Is Undiminished. - The New York Times - 0 views

  • It’s a series of meditations on what Bloom believes to be the most important novels we have, and it takes for granted that its readers already know the books under consideration; in other words, that they have already absorbed “the canon,” and are eager to reconsider it later in their lives.
  • A not atypical, almost throwaway passage for you to test the waters on: “Tolstoy, as befits the writer since Shakespeare who most has the art of the actual, combines in his representational praxis the incompatible powers of Homer and the Yahwist.” This is not Bloom showing off; it’s the way Bloom thinks and proceeds.
  • Apart from his novelists, his frame of reference rests on Shakespeare above all others, Homer, Chaucer, Dante, Montaigne, Emerson, Dr. Johnson (the “shrewdest of all literary critics”), Blake, Wordsworth, Whitman (for him, the central American writer of the 19th century), Wallace Stevens, Freud
  • ...6 more annotations...
  • Among the novelists, Cervantes, Tolstoy (supreme), Melville, Austen, Proust, Joyce.
  • He is inevitably at his strongest when dealing with those writers he cares most about. With Jane Austen, for one. And, above all, with Tolstoy:
  • As for Dickens, whose “David Copperfield” was a direct influence on Tolstoy, to Bloom his greatest achievement is “Bleak House”
  • He pairs it with Dickens’s final complete novel, “Our Mutual Friend,” a book I care for so extravagantly that I’ve read it three times
  • The two works in which Bloom is most fully invested are “Moby-Dick” (40 pages) and “Ulysses” (54)
  • He chooses to give room to not one but two of Le Guin’s novels, “The Left Hand of Darkness” and “The Dispossessed,”
Javier E

The Dictionary Is Telling People How to Speak Again - The Atlantic - 1 views

  • print dictionaries have embodied certain ideas about democracy and capitalism that seem especially American—specifically, the notion that “good” English can be packaged and sold, becoming accessible to anyone willing to work hard enough to learn it.
  • Massive social changes in the 1960s accompanied the appearance of Webster’s Third, and a new era arose for dictionaries: one in which describing how people use language became more important than showing them how to do so properly. But that era might finally be coming to an end, thanks to the internet, the decline of print dictionaries, and the political consequences of an anything-goes approach to language.
  • The standard way of describing these two approaches in lexicography is to call them “descriptivist” and “prescriptivist.” Descriptivist lexicographers, steeped in linguistic theory, eschew value judgements about so-called correct English and instead describe how people are using the language. Prescriptivists, by contrast, inform readers which usage is “right” and which is “wrong.”
  • ...11 more annotations...
  • Many American readers, though, didn’t want a non-hierarchical assessment of their language. They wanted to know which usages were “correct,” because being able to rely on a dictionary to tell you how to sound educated and upper class made becoming upper class seem as if it might be possible. That’s why the public responded badly to Webster’s latest: They craved guidance and rules.
  • Webster’s Third so unnerved critics and customers because the American idea of social mobility is limited, provisional, and full of paradoxes
  • There’s no such thing as social mobility if everyone can enjoy it. To be allowed to move around within a hierarchy implies that the hierarchy must be left largely intact. But in America, people have generally accepted the idea of inherited upper-class status, while seeing upward social mobility as something that must be earned.
  • In a 2001 Harper’s essay about the Webster’s Third controversy, David Foster Wallace called the publication of the dictionary “the Fort Sumter of the contemporary usage wars.”
  • for decades after the publication of Webster’s Third, people still had intense opinions about dictionaries. In the 1990s, an elderly copy editor once told me, with considerable vehemence, that Merriam-Webster’s Dictionaries were “garbage.” She would only use Houghton Mifflin’s American Heritage Dictionary, which boasted a Usage Panel of experts to advise readers about the finer points of English grammar
  • what descriptivists do: They describe rather than judge. Nowadays, this approach to dictionary making is generally not contested or even really discussed.
  • In his 2009 book Going Nucular, Geoffrey Nunberg observes that we now live in a culture in which there are no clear distinctions between highbrow, middlebrow, and lowbrow culture. It stands to reason that in a society in which speaking in a recognizably “highbrow” way confers no benefits, dictionaries will likely matter less
  • If American Heritage was aggressively branding itself in the 1960s, Merriam-Webster is doing the same now.
  • The company has a feisty blog and Twitter feed that it uses to criticize linguistic and grammatical choices. President Trump and his administration are regular catalysts for social-media clarifications by Merriam-Webster. The company seems bothered when Trump and his associates change the meanings of words for their own convenience, or when they debase the language more generally.
  • it seems that the way the company has regained its relevance in the post-print era is by having strong opinions about how people should use English.
  • It may be that in spite of Webster’s Third’s noble intentions, language may just be too human a thing to be treated in an entirely detached, scientific way. Indeed, I’m not sure I want to live in a society in which citizens can’t call out government leaders when they start subverting language in distressing ways.
Javier E

'I Like to Watch,' by Emily Nussbaum book review - The Washington Post - 0 views

  • Nussbaum’s case: That television could be great, and not because it was “novelistic” or “cinematic” but because it was, simply, television, “episodic, collaborative, writer-driven, and formulaic” by design.
  • According to Nussbaum, a TV show achieved greatness not despite these facts (which assumes they are limitations) but because of them (which sees them as an infrastructure that provokes creativity and beauty — “the sort that govern sonnets,”
  • Nussbaum’s once-iconoclastic views have become mainstream.
  • ...8 more annotations...
  • It is increasingly common to find yourself apologizing not for watching too much TV but for having failed to spend 70 hours of your precious, finite life binge-watching one of the Golden Age of Television’s finest offerings.
  • Nussbaum writes of her male classmates at NYU, where she was a literature doctoral student in the late 1990s. These men worshiped literature and film; they thought TV was trash. These men “were also, not coincidentally, the ones whose opinions tended to dominate mainstream media conversation.”
  • the same forces that marginalize the already-marginalized still work to keep TV shows by and about women, people of color, and LGBTQ+ individuals on a lower tier than those about cis, straight, white men: Your Tony Sopranos, your Walter Whites, your Don Drapers, your True Detectives
  • Over and over, Nussbaum pushes back against a hierarchy that rewards dramas centered on men and hyperbolically masculine pursuits (dealing drugs, being a cop, committing murders, having sex with beautiful women) and shoves comedies and whatever scans as “female” to the side.
  • Nussbaum sticks up for soaps, rom-coms, romance novels and reality television, “the genres that get dismissed as fluff, which is how our culture regards art that makes women’s lives look like fun.”
  • Nussbaum’s writing consistently comes back to the question of “whose stories carried weight . . . what kind of creativity counted as ambitious, and who . . . deserved attention . . . Whose story counted as universal?”
  • What does it mean to think morally about the art we consume — and, by extension, financially support, and center in our emotional and imaginative lives? The art that informs, on some near-cellular level, who we want to know and love and be?
  • maybe the next frontier of cultural thought is in thinking more cohesively about what we’ve long compartmentalized — of not stashing conflicting feelings about good art by bad men in some dark corner of our minds, but in holding our discomfort and contradictions up to the light, for a clearer view.
ilanaprincilus06

Humans need to become smarter thinkers to beat climate denial | Dana Nuccitelli | Environment | The Guardian - 0 views

  • using ‘misconception-based learning’ to dislodge climate myths from people’s brains and replace them with facts, and beating denial by inoculating people against misinformers’ tricks.
  • The idea is that when people are faced with a myth and a competing fact, the fact will more easily win out if the fallacy underpinning the myth is revealed.
  • If people can learn to implement a simple six-step critical thinking process, they’ll be able to evaluate whether climate-related claims are valid (a rough code sketch of these steps appears after the final annotation below).
  • ...13 more annotations...
  • Identify the claim being made
  • the most popular contrarian argument: “Earth’s climate has changed naturally in the past, so current climate change is natural.”
  • Construct the argument by identifying the premises leading to that conclusion.
  • Determine whether the argument is deductive, meaning that it starts out with a general statement and reaches a definitive conclusion.
  • the first premise is that Earth’s climate has changed in the past through natural processes, and the second premise is that the climate is currently changing.
  • Check the argument for validity; does the conclusion follow from the premises?
  • Identify hidden premises. By adding an extra premise to make an invalid argument valid, we can gain a deeper understanding of why the argument is flawed.
  • the hidden assumption is “if nature caused climate change in the past, it must always be the cause of climate change.”
  • Check to see if the argument relies on ambiguity.
  • Not all climate change is equal
  • Therefore, human activity is necessary to explain current climate change.
  • If the argument hasn’t yet been ruled out, determine the truth of its premises.
  • the argument that “if something was the cause in the past, it will be the cause in the future” is invalid if the effect has multiple plausible causes or mechanisms
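  A minimal Python sketch (not from the Guardian article) of how the six-step evaluation above might be applied to the “climate changed naturally in the past” myth; the Argument structure, the field names, and the example wording are all illustrative assumptions rather than anything the article specifies.

    # Hypothetical sketch: walking the "past natural change" myth through the
    # six-step check described in the annotations above.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Argument:
        claim: str                 # Step 1: the claim being made
        premises: List[str]        # Step 2: the stated premises
        deductive: bool            # Step 3: offered as a deductive argument?
        conclusion: str
        hidden_premises: List[str] = field(default_factory=list)  # Step 5

    def evaluate(arg: Argument) -> None:
        print(f"Claim: {arg.claim}")
        for i, p in enumerate(arg.premises, 1):
            print(f"  Premise {i}: {p}")
        print(f"Deductive: {arg.deductive}")
        # Step 4: the stated premises alone do not entail the conclusion,
        # so step 5 surfaces the hidden premise needed to make it valid.
        for h in arg.hidden_premises:
            print(f"  Hidden premise needed for validity: {h}")
        print(f"Conclusion offered: {arg.conclusion}")
        # Step 6: that hidden premise is false when the effect has multiple
        # plausible causes, so the argument fails on the truth of its premises.
        print("Verdict: invalid as stated; it rests on a false hidden premise.")

    natural_change = Argument(
        claim="Current climate change is natural.",
        premises=[
            "Earth's climate has changed through natural processes in the past.",
            "The climate is currently changing.",
        ],
        deductive=True,
        conclusion="Current climate change is natural.",
        hidden_premises=[
            "If nature caused climate change in the past, it is always the cause.",
        ],
    )

    evaluate(natural_change)

  Keeping the hidden premise as explicit data is the point of the sketch: once it is written down, checking its truth (step 6) becomes a separate, answerable question rather than something smuggled past the reader.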
pier-paolo

THE CLOSE READER; Powers of Perception - The New York Times - 0 views

  • Keller's writing jars the contemporary reader in three ways. First, she composes in the grandiose manner favored by the late-19th-century genteel essayist, with lots of quotations and inverted sentences. Second, she gushes with a girlish gratefulness that registers, in our more cynical time, as more ingratiating than genuine
  • Keller violates a cardinal rule of autobiography, which is to distinguish what you have been told from what you know from experience. She narrates, as if she knew them firsthand, events from very early childhood and the first stages of her education -- neither of which she could possibly remember herself, at least not in such detail.
  • When Keller's book came out in 1903, she was criticized by one reviewer for her constant, un-self-conscious allusions to color and music. ''All her knowledge is hearsay knowledge,'' this critic wrote in The Nation, ''her very sensations are for the most part vicarious, and yet she writes of things beyond her powers of perception with the assurance of one who has verified every word.''
  • ...5 more annotations...
  • Maybe Shattuck is right and we are all like this -- creatures of language, rather than its masters. Much of what we think we know firsthand we probably picked up from books or newspapers or friends or lovers and never checked against the world at all.
  • Her ability to experience what others felt and heard, she said, illustrated the power of imagination, particularly one that had been developed and extended, as hers was, by books.
  • What she knew of her own observation is exactly what we want to know from her. We want to know what it felt like to be Helen Keller. We want to locate the boundaries between what was real to her and what she was forced to imagine. At least in this book, she seems not to have known where that boundary might lie.
  • He tries to remember what he looks like and discovers that he cannot. He asks: ''To what extent is loss of the image of the face connected with loss of the image of the self? Is this one of the reasons why I often feel that I am mere spirit, a ghost, a memory?''
  • Keller, in short, matured, both as a person and a writer. She mastered a lesson that relatively few with all their senses have ever mastered, which is to write about what you know.
runlai_jiang

Trump told to mute Twitter critics, not block them, by New York judge - BBC News - 0 views

  • A judge has advised US President Donald Trump to mute rather than block his Twitter critics after users of the service filed a lawsuit against him.