
TOK Friends: Group items matching "Entertainment" in title, tags, annotations or url

dpittenger

What Your Job Says About Your Politics - 0 views

  • Why is an air traffic controller more likely to be a Democrat than a pilot? Why is nearly the entire entertainment industry Democratic, while the majority of surgeons are Republican?
  • Edmond said it was also interesting to see how “unsurprising” many of the results were.
  • “Most of us probably already have the notion that, say, coal miners lean to the right and environmentalists to the left, and it's amusing to see how much that's confirmed by the numbers,” he said.
Javier E

Reimagining Televised Debates - The Daily Dish | By Andrew Sullivan - 0 views

  • television forces those who appear on it to argue "directly, and pointedly, in a short amount of time." This shapes how debates unfold because "concision actually favors the spouting of conventional thinking."
  • What if a television network tried to run a debate show like the back-and-forths that sometimes occur in print?
  • if executed correctly, the quality of argument and entertainment would be far better than any of the talking head exchanges currently broadcast on cable.
Javier E

How the Internet Gets Inside Us : The New Yorker - 0 views

  • It isn’t just that we’ve lived one technological revolution among many; it’s that our technological revolution is the big social revolution that we live with
  • The idea, for instance, that the printing press rapidly gave birth to a new order of information, democratic and bottom-up, is a cruel cartoon of the truth. If the printing press did propel the Reformation, one of the biggest ideas it propelled was Luther’s newly invented absolutist anti-Semitism. And what followed the Reformation wasn’t the Enlightenment, a new era of openness and freely disseminated knowledge. What followed the Reformation was, actually, the Counter-Reformation, which used the same means—i.e., printed books—to spread ideas about what jerks the reformers were, and unleashed a hundred years of religious warfare.
  • Robert K. Logan’s “The Sixth Language,” begins with the claim that cognition is not a little processing program that takes place inside your head, Robby the Robot style. It is a constant flow of information, memory, plans, and physical movements, in which as much thinking goes on out there as in here. If television produced the global village, the Internet produces the global psyche: everyone keyed in like a neuron, so that to the eyes of a watching Martian we are really part of a single planetary brain. Contraptions don’t change consciousness; contraptions are part of consciousness.
  • In a practical, immediate way, one sees the limits of the so-called “extended mind” clearly in the mob-made Wikipedia, the perfect product of that new vast, supersized cognition: when there’s easy agreement, it’s fine, and when there’s widespread disagreement on values or facts, as with, say, the origins of capitalism, it’s fine, too; you get both sides. The trouble comes when one side is right and the other side is wrong and doesn’t know it. The Shakespeare authorship page and the Shroud of Turin page are scenes of constant conflict and are packed with unreliable information. Creationists crowd cyberspace every bit as effectively as evolutionists, and extend their minds just as fully. Our trouble is not the over-all absence of smartness but the intractable power of pure stupidity, and no machine, or mind, seems extended enough to cure that.
  • “The medium does matter,” Carr has written. “As a technology, a book focuses our attention, isolates us from the myriad distractions that fill our everyday lives. A networked computer does precisely the opposite. It is designed to scatter our attention. . . . Knowing that the depth of our thought is tied directly to the intensity of our attentiveness, it’s hard not to conclude that as we adapt to the intellectual environment of the Net our thinking becomes shallower.”
  • when people struggle to describe the state that the Internet puts them in they arrive at a remarkably familiar picture of disassociation and fragmentation. Life was once whole, continuous, stable; now it is fragmented, multi-part, shimmering around us, unstable and impossible to fix.
  • The odd thing is that this complaint, though deeply felt by our contemporary Better-Nevers, is identical to Baudelaire’s perception about modern Paris in 1855, or Walter Benjamin’s about Berlin in 1930, or Marshall McLuhan’s in the face of three-channel television (and Canadian television, at that) in 1965.
  • If all you have is a hammer, the saying goes, everything looks like a nail; and, if you think the world is broken, every machine looks like the hammer that broke it.
  • Blair argues that the sense of “information overload” was not the consequence of Gutenberg but already in place before printing began.
  • Anyway, the crucial revolution was not of print but of paper: “During the later Middle Ages a staggering growth in the production of manuscripts, facilitated by the use of paper, accompanied a great expansion of readers outside the monastic and scholastic contexts.” For that matter, our minds were altered less by books than by index slips. Activities that seem quite twenty-first century, she shows, began when people cut and pasted from one manuscript to another; made aggregated news in compendiums; passed around précis. “Early modern finding devices” were forced into existence: lists of authorities, lists of headings.
  • The book index was the search engine of its era, and needed to be explained at length to puzzled researchers—as, for that matter, did the Hermione-like idea of “looking things up.” That uniquely evil and necessary thing, the comprehensive review of many different books on a related subject, with the necessary oversimplification of their ideas that it demanded, was already around in 1500, and already being accused of missing all the points.
  • at any given moment, our most complicated machine will be taken as a model of human intelligence, and whatever media kids favor will be identified as the cause of our stupidity. When there were automatic looms, the mind was like an automatic loom; and, since young people in the loom period liked novels, it was the cheap novel that was degrading our minds. When there were telephone exchanges, the mind was like a telephone exchange, and, in the same period, since the nickelodeon reigned, moving pictures were making us dumb. When mainframe computers arrived and television was what kids liked, the mind was like a mainframe and television was the engine of our idiocy. Some machine is always showing us Mind; some entertainment derived from the machine is always showing us Non-Mind.
  • What we live in is not the age of the extended mind but the age of the inverted self. The things that have usually lived in the darker recesses or mad corners of our mind—sexual obsessions and conspiracy theories, paranoid fixations and fetishes—are now out there: you click once and you can read about the Kennedy autopsy or the Nazi salute or hog-tied Swedish flight attendants. But things that were once external and subject to the social rules of caution and embarrassment—above all, our interactions with other people—are now easily internalized, made to feel like mere workings of the id left on its own.
  • A social network is crucially different from a social circle, since the function of a social circle is to curb our appetites and of a network to extend them.
  • And so the peacefulness, the serenity that we feel away from the Internet, and which all the Better-Nevers rightly testify to, has less to do with being no longer harried by others than with being less oppressed by the force of your own inner life. Shut off your computer, and your self stops raging quite as much or quite as loud.
  • Now television is the harmless little fireplace over in the corner, where the family gathers to watch “Entourage.” TV isn’t just docile; it’s positively benevolent. This makes you think that what made television so evil back when it was evil was not its essence but its omnipresence. Once it is not everything, it can be merely something. The real demon in the machine is the tirelessness of the user.
  • the Internet screen has always been like the palantír in Tolkien’s “Lord of the Rings”—the “seeing stone” that lets the wizards see the entire world. Its gift is great; the wizard can see it all. Its risk is real: evil things will register more vividly than the great mass of dull good. The peril isn’t that users lose their knowledge of the world. It’s that they can lose all sense of proportion. You can come to think that the armies of Mordor are not just vast and scary, which they are, but limitless and undefeatable, which they aren’t.
Javier E

Facebook's New Strategy to Turn Eyeballs Into Influence - NYTimes.com - 1 views

  • Facebook, in short, aims not to be a Web site you spend a lot of time on, but something that defines your online — and increasingly offline — life.
  • Searching the Web is still the way most people discover content — whether it is news, information about wedding photographers or Swiss chard recipes. Facebook is trying to change that: in effect, friends will direct other friends to content.
  • it is teaming up with companies that distribute music, movies, information and games in positioning itself to become the conduit where news and entertainment is found and consumed.
  • A new feature called Timeline lets users post information about their past, like weddings and big vacations. And everywhere on the site, users will be able to more precisely signal what they are reading, watching, hearing or eating. This will let Facebook reap even more valuable data than it does now about its users’ habits and desires, which in turn can be used to sell more fine-tuned advertising.
  • The site’s evolution could make it easier for them to decide how to spend their time and money. But it could also potentially allow them to shut out alternative viewpoints and information that is not being shared among their set of friends.
Javier E

We Are Just Not Digging The Whole Anymore : NPR - 1 views

  • We just don't do whole things anymore. We don't read complete books — just excerpts. We don't listen to whole CDs — just samplings. We don't sit through whole baseball games — just a few innings. Don't even write whole sentences. Or read whole stories like this one. Long-form reading, listening and viewing habits are giving way to browse-and-choose consumption. "With the increase in the number of media options — or distractions, depending on how you look at them — something has to give, and that something is our attention span." (Adam Thierer, senior research fellow at George Mason University) We care more about the parts and less about the entire. We are into snippets and smidgens and clips and tweets. We are not only a fragmented society, but a fragment society.
  • One Duke University student was famously quoted in a 2006 Time magazine essay telling his history professor, "We don't read whole books anymore."
  • Now there are lots of websites that present whole books and concepts in nano form
  • Here is the ultra-condensation of Pride and Prejudice by Jane Austen: Mr. Darcy: Nothing is good enough for me. Ms. Elizabeth Bennet: I could never marry that proud man. (They change their minds.) THE END
  • nearly half of all adults — 47 percent — get some of their local news and information on mobile computing devices. We are receiving our news in kibbles and bits, sacrificing context and quality for quickness and quantity.
  • Fewer and fewer gamers are following gaming storylines all the way to completion, according to a recent blog post on the IGN Entertainment video game website.
  • "With the increase in the number of media options — or distractions, depending on how you look at them — something has to give, and that something is our attention span." He ticks off a long list of bandied-about terms. Here's a shortened version: cognitive overload; information paralysis; techno stress; and data asphyxiation.
  • Rockmore believes that the way many people learn — or try to learn — these days is via this transporter technique. "The truth is," he says, "that modern pedagogy probably needs to address this in the sense that there is so much information out there, for free, so that obtaining it — even in bits and pieces — is not the challenge, rather integrating it into a coherent whole is. That's a new paradigm."
Javier E

Is Twitter dominated by 0.05% of users? | Felix Salmon - 0 views

  • Twitter — along with Facebook — is at the forefront of what Arianna Huffington astutely identified as an incredibly important and powerful new trend: “self-expression has become the new entertainment”. Think about it this way: what would happen if Twitter was reduced to just those 20,000 accounts broadcasting to the Twitter user base, with nobody else writing anything at all? The service, obviously, would die in a matter of days. The 20,000 most-read Twitter accounts are the bread in the typical user’s sandwich; the flavor comes from everything else — their friends, their unique interests, and, crucially, their own contributions to the stream.
Javier E

Our Machine Masters - NYTimes.com - 0 views

  • the smart machines of the future won’t be humanlike geniuses like HAL 9000 in the movie “2001: A Space Odyssey.” They will be more modest machines that will drive your car, translate foreign languages, organize your photos, recommend entertainment options and maybe diagnose your illnesses. “Everything that we formerly electrified we will now cognitize,” Kelly writes. Even more than today, we’ll lead our lives enmeshed with machines that do some of our thinking tasks for us.
  • This artificial intelligence breakthrough, he argues, is being driven by cheap parallel computation technologies, big data collection and better algorithms. The upshot is clear, “The business plans of the next 10,000 start-ups are easy to forecast: Take X and add A.I.”
  • Two big implications flow from this. The first is sociological. If knowledge is power, we’re about to see an even greater concentration of power.
  • in 2001, the top 10 websites accounted for 31 percent of all U.S. page views, but, by 2010, they accounted for 75 percent of them.
  • The Internet has created a long tail, but almost all the revenue and power is among the small elite at the head.
  • Advances in artificial intelligence will accelerate this centralizing trend. That’s because A.I. companies will be able to reap the rewards of network effects. The bigger their network and the more data they collect, the more effective and attractive they become.
  • As a result, “our A.I. future is likely to be ruled by an oligarchy of two or three large, general-purpose cloud-based commercial intelligences.”
  • engineers at a few gigantic companies will have vast-though-hidden power to shape how data are collected and framed, to harvest huge amounts of information, to build the frameworks through which the rest of us make decisions and to steer our choices. If you think this power will be used for entirely benign ends, then you have not read enough history.
  • The second implication is philosophical. A.I. will redefine what it means to be human. Our identity as humans is shaped by what machines and other animals can’t do
  • On the other hand, machines cannot beat us at the things we do without conscious thinking: developing tastes and affections, mimicking each other and building emotional attachments, experiencing imaginative breakthroughs, forming moral sentiments.
  • For the last few centuries, reason was seen as the ultimate human faculty. But now machines are better at many of the tasks we associate with thinking — like playing chess, winning at Jeopardy, and doing math.
  • In the age of smart machines, we’re not human because we have big brains. We’re human because we have social skills, emotional capacities and moral intuitions.
  • I could paint two divergent A.I. futures, one deeply humanistic, and one soullessly utilitarian.
  • In the cold, utilitarian future, on the other hand, people become less idiosyncratic. If the choice architecture behind many decisions is based on big data from vast crowds, everybody follows the prompts and chooses to be like each other. The machine prompts us to consume what is popular, the things that are easy and mentally undemanding.
  • In this future, there is increasing emphasis on personal and moral faculties: being likable, industrious, trustworthy and affectionate. People are evaluated more on these traits, which supplement machine thinking, and not the rote ones that duplicate it
  • In the humanistic one, machines liberate us from mental drudgery so we can focus on higher and happier things. In this future, differences in innate I.Q. are less important. Everybody has Google on their phones so having a great memory or the ability to calculate with big numbers doesn’t help as much.
  • In the current issue of Wired, the technology writer Kevin Kelly says that we had all better get used to this level of predictive prowess. Kelly argues that the age of artificial intelligence is finally at hand.
aqconces

As Mainstream Religion Embraces Marriage Equality, The "Threat to Religious Liberty" Argument Crumbles | Jeffrey S. Trachtman - 0 views

  • Allowing all couples to marry has zero impact on anybody's legally protected free exercise of religion
  • The legal basics of marriage equality and religion are not that complicated: The courts are addressing only civil marriage. No church will ever be forced to change its definition of marriage
  • Problems arise only when religious organizations act in the commercial realm -- such as by operating a public catering hall -- or when business owners harbor religious qualms about serving gay people (or black people or Japanese people) in their restaurants, bakeries, entertainment venues, or other public accommodations. In these settings, civil rights laws trump personal preferences -- even those based on religious conviction. That's a consensus we came to as a society long ago. Have a religious objection to serving Mexicans or Mormons or interracial couples? Don't open a lunch counter.
  • The related suggestion that the freedom to marry violates some uniform "religious" definition of marriage is equally misguided.
  • But the freedom to marry is now endorsed as well by the Episcopal Church, the Religious Society of Friends (Quakers)
Javier E

Ann Coulter Is Right to Fear the World Cup - Peter Beinart - The Atlantic - 1 views

  • Ann Coulter penned a column explaining why soccer is un-American. First, it’s collectivist. (“Individual achievement is not a big factor…blame is dispersed.”) Second, it’s effeminate. (“It’s a sport in which athletic talent finds so little expression that girls can play with boys.”) Third, it’s culturally elitist. (“The same people trying to push soccer on Americans are the ones demanding that we love HBO’s “Girls,” light-rail, Beyoncé and Hillary Clinton.”) Fourth, and most importantly, “It’s foreign…Soccer is like the metric system, which liberals also adore because it’s European.”
  • Soccer hatred, in other words, exemplifies American exceptionalism.
  • For Coulter and many contemporary conservatives, by contrast, part of what makes America exceptional is its individualism, manliness and populism
  • Coulter’s deeper point is that for America to truly be America, it must stand apart
  • The core problem with embracing soccer is that in so doing, America would become more like the rest of the world.
  • America’s own league, Major League Soccer, draws as many fans to its stadiums as do the NHL and NBA.
  • I wrote an essay entitled “The End of American Exceptionalism,” which argued that on subjects where the United States has long been seen as different, attitudes in America increasingly resemble those in Europe. Soccer is one of the best examples yet.
  • “Soccer,” Markovits and Hellerman argue, “was perceived by both native-born Americans and immigrants as a non-American activity at a time in American history when nativism and nationalism emerged to create a distinctly American self-image … if one liked soccer, one was viewed as at least resisting—if not outright rejecting—integration into America.”
  • The average age of Americans who call baseball their favorite sport is 53. Among Americans who like football best, it’s 46. Among Americans who prefer soccer, by contrast, the average age is only 37.
  • Old-stock Americans, in other words, were elevating baseball, football, and basketball into symbols of America’s distinct identity. Immigrants realized that embracing those sports offered a way to claim that identity for themselves. Clinging to soccer, by contrast, was a declaration that you would not melt.
  • why is interest in soccer rising now? Partly, because the United States is yet again witnessing mass immigration from soccer-mad nations.
  • the key shift is that America’s sports culture is less nativist. More native-born Americans now accept that a game invented overseas can become authentically American, and that the immigrants who love it can become authentically American too. Fewer believe that to have merit, something must be invented in the United States.
  • why didn’t soccer gain a foothold in the U.S. in the decades between the Civil War and World War I, when it was gaining dominance in Europe? Precisely because it was gaining dominance in Europe. The arbiters of taste in late 19th and early 20th century America wanted its national pastimes to be exceptional.
  • Americans over the age of 50 were 15 points more likely to say “our culture is superior” than were people over 50 in Germany, Spain, Britain, and France
  • Americans under 30, by contrast, were actually less likely to say “our culture is superior” than their counterparts in Germany, Spain, and Britain.
  • Americans today are less likely to insist that America’s way of doing things is always best. In 2002, 60 percent of Americans told the Pew Research Center that, “our culture is superior to others.” By 2011, it was down to 49 percent.
  • the third major pro-soccer constituency is liberals. They’re willing to embrace a European sport for the same reason they’re willing to embrace a European-style health care system: because they see no inherent value in America being an exception to the global rule
  • When the real-estate website Estately created a seven part index to determine a state’s love of soccer, it found that Washington State, Maryland, the District of Columbia, New York, and New Jersey—all bright blue—loved soccer best, while Alabama, Arkansas, North Dakota, Mississippi and Montana—all bright red—liked it least.
  • the soccer coalition—immigrants, liberals and the young—looks a lot like the Obama coalition.
  • Sports-wise, therefore, Democrats constitute an alliance between soccer and basketball fans while Republicans disproportionately follow baseball, golf, and NASCAR. Football, by far America’s most popular sport, crosses the aisle.
  • The willingness of growing numbers of Americans to embrace soccer bespeaks their willingness to imagine a different relationship with the world. Historically, conservative foreign policy has oscillated between isolationism and imperialism. America must either retreat from the world or master it. It cannot be one among equals, bound by the same rules as everyone else
  • Exceptionalists view sports the same way. Coulter likes football, baseball, and basketball because America either plays them by itself, or—when other countries play against us—we dominate them.
  • Embracing soccer, by contrast, means embracing America’s role as merely one nation among many, without special privileges. It’s no coincidence that young Americans, in addition to liking soccer, also like the United Nations. In 2013, Pew found that Americans under 30 were 24 points more favorable to the U.N. than Americans over 50.
  • Millennials were also 23 points more likely than the elderly to say America should take its allies’ opinion into account even if it means compromising our desires.
  • In embracing soccer, Americans are learning to take something we neither invented nor control, and nonetheless make it our own. It’s a skill we’re going to need in the years to come.
Javier E

What 'White Privilege' Really Means - NYTimes.com - 0 views

  • This week’s conversation is with Naomi Zack, a professor of philosophy at the University of Oregon and the author of “The Ethics and Mores of Race: Equality After the History of Philosophy.”
  • My first book, “Race and Mixed Race” (1991) was an analysis of the incoherence of U.S. black/white racial categories in their failure to allow for mixed race. In “Philosophy of Science and Race,” I examined the lack of a scientific foundation for biological notions of human races, and in “The Ethics and Mores of Race,” I turned to the absence of ideas of universal human equality in the Western philosophical tradition.
  • Critical philosophy of race, like critical race theory in legal studies, seeks to understand the disadvantages of nonwhite racial groups in society (blacks especially) by understanding social customs, laws, and legal practices.
  • What’s happening in Ferguson is the result of several recent historical factors and deeply entrenched racial attitudes, as well as a breakdown in participatory democracy.
  • In Ferguson, the American public has awakened to images of local police, fully decked out in surplus military gear from our recent wars in Iraq and Afghanistan, who are deploying all that in accordance with a now widespread “broken windows” policy, which was established on the hypothesis that if small crimes and misdemeanors are checked in certain neighborhoods, more serious crimes will be deterred. But this policy quickly intersected with police racial profiling already in existence to result in what has recently become evident as a propensity to shoot first.
  • How does this “broken windows” policy relate to the tragic deaths of young black men/boys? N.Z.: People are now stopped by the police for suspicion of misdemeanor offenses and those encounters quickly escalate.
  • Young black men are the convenient target of choice in the tragic intersection of the broken windows policy, the domestic effects of the war on terror and police racial profiling.
  • Why do you think that young black men are disproportionately targeted? N.Z.: Exactly why unarmed young black men are the target of choice, as opposed to unarmed young white women, or unarmed old black women, or even unarmed middle-aged college professors, is an expression of a long American tradition of suspicion and terrorization of members of those groups who have the lowest status in our society and have suffered the most extreme forms of oppression, for centuries.
  • Police in the United States are mostly white and mostly male. Some confuse their work roles with their own characters. As young males, they naturally pick out other young male opponents. They have to win, because they are the law, and they have the moral charge of protecting.
  • So young black males, who have less status than they do, and are already more likely to be imprisoned than young white males, are natural suspects.
  • Besides the police, a large segment of the white American public believes they are in danger from blacks, especially young black men, who they think want to rape young white women. This is an old piece of American mythology that has been invoked to justify crimes against black men, going back to lynching. The perceived danger of blacks becomes very intense when blacks are harmed.
  • The term “white privilege” is misleading. A privilege is special treatment that goes beyond a right. It’s not so much that being white confers privilege but that not being white means being without rights in many cases. Not fearing that the police will kill your child for no reason isn’t a privilege. It’s a right. 
  • that is what “white privilege” is meant to convey, that whites don’t have many of the worries nonwhites, especially blacks, do.
  • Other examples of white privilege include all of the ways that whites are unlikely to end up in prison for some of the same things blacks do, not having to worry about skin-color bias, not having to worry about being pulled over by the police while driving or stopped and frisked while walking in predominantly white neighborhoods, having more family wealth because your parents and other forebears were not subject to Jim Crow and slavery.
  • Probably all of the ways in which whites are better off than blacks in our society are forms of white privilege.
  • Over half a century later, it hasn’t changed much in the United States. Black people are still imagined to have a hyper-physicality in sports, entertainment, crime, sex, politics, and on the street. Black people are not seen as people with hearts and minds and hopes and skills but as cyphers that can stand in for anything whites themselves don’t want to be or think they can’t be.
  • race is through and through a social construct, previously constructed by science, now by society, including its most extreme victims. But, we cannot abandon race, because people would still discriminate and there would be no nonwhite identities from which to resist. Also, many people just don’t want to abandon race and they have a fundamental right to their beliefs. So race remains with us as something that needs to be put right.
Emilio Ergueta

How To Be A Philosopher | Issue 81 | Philosophy Now - 0 views

  • Philosophers rarely get worked up about clothing. Clothes can be a source of aesthetic pleasure, and few philosophers are adamantly opposed to pleasure.
  • From the fascist’s brown shirt to the bishop’s purple cassock, authorities have a fetishistic attraction to the tailor and milliner. Some uniforms, for example the footballer’s jersey, serve the practical function of making it easier to adopt certain roles. These cases aside, if you find yourself tempted to don a uniform, or worse, impose one on others, you might like to reconsider your philosophical credentials.
  • there is a strong tendency towards vegetarianism, at least in contemporary English-speaking philosophy.
  • Over the last twenty years a large number of philosophical dictionaries, handbooks and companions/study guides have sprung up. These can be both incredibly useful and very entertaining. Three of my favourites are the Blackwell Companion to the Philosophy of Mind edited by Samuel Guttenplan; the Oxford Dictionary of Philosophy by Simon Blackburn; and the on-line Stanford Encyclopedia of Philosophy, edited by Edward Zalta. Indulge yourself.
  • there’s an overwhelming preference amongst philosophers for red wine and coffee.
  • There are very few intellectual endeavours into which the philosopher cannot productively stick her nose. All the natural and social sciences provide fertile ground for philosophy; as do the arts, literature, politics, history and current affairs
  • philosophers don’t sit around shooting the breeze. It’s hard work finding a good argument. It takes practise to become skilled at judging the degree of support the premises and steps of an argument provide for the conclusion. Familiarizing yourself with the arguments of the great philosophers of the past is an excellent way to get the requisite practise.
  • Arguments – rational derivations of conclusions from premises – are central to philosophy. But arguments in another sense – vigorous interchanges of ideas, either verbally or in writing – are also very common in philosophy.
Javier E

How Poor Are the Poor? - NYTimes.com - 0 views

  • “Anyone who studies the issue seriously understands that material poverty has continued to fall in the U.S. in recent decades, primarily due to the success of anti-poverty programs” and the declining cost of “food, air-conditioning, communications, transportation, and entertainment,”
  • Despite the rising optimism, there are disagreements over how many poor people there are and the conditions they live under. There are also questions about the problem of relative poverty, what we are now calling inequality
  • There are strong theoretical justifications for the use of a relative poverty measure. The Organization for Economic Cooperation and Development puts it this way: In order to participate fully in the social life of a community, individuals may need a level of resources that is not too inferior to the norms of a community. For example, the clothing budget that allows a child not to feel ashamed of his school attire is much more related to national living standards than to strict requirements for physical survival
  • Democratic supporters of safety net programs can use Jencks’s finding that poverty has dropped below 5 percent as evidence that the war on poverty has been successful.
  • At the same time liberals are wary of positive news because, as Jencks notes: It is easier to rally support for such an agenda by saying that the problem in question is getting worse
  • The plus side for conservatives of Jencks’s low estimate of the poverty rate is the implication that severe poverty has largely abated, which then provides justification for allowing enemies of government entitlement programs to further cut social spending.
  • At the same time, however, Jencks’s data undermines Republican claims that the war on poverty has been a failure – a claim exemplified by Ronald Reagan’s famous 1987 quip: “In the sixties we waged a war on poverty, and poverty won.”
  • Jencks’s conclusion: “The absolute poverty rate has declined dramatically since President Johnson launched his war on poverty in 1964.” At 4.8 percent, Jencks’s calculation is the lowest poverty estimate by a credible expert in the field.
  • his conclusion — that instead of the official count of 45.3 million people living in poverty, the number of poor people in America is just under 15 million — understates the scope of hardship in this country.
  • Jencks argues that the actual poverty rate has dropped over the past five decades – far below the official government level — if poverty estimates are adjusted for food and housing benefits, refundable tax credits and a better method of determining inflation rates. In Jencks’s view, the war on poverty worked.
  • using a relative measure shows that the United States lags well behind other developed countries: If you use the O.E.C.D. standard of 50 percent of median income as a poverty line, the United States looks pretty bad in cross-national relief. We have a relative poverty rate exceeded only by Chile, Turkey, Mexico and Israel (which has seen a big increase in inequality in recent years). And that rate in 2010 was essentially where it was in 1995
  • While the United States “has achieved real progress in reducing absolute poverty over the past 50 years,” according to Burtless, “the country may have made no progress at all in reducing the relative economic deprivation of folks at the bottom.”
  • the heart of the dispute: How severe is the problem of poverty?
  • Kathryn Edin, a professor of sociology at Johns Hopkins, and Luke Schaefer, a professor of social work at the University of Michigan, contend that the poverty debate overlooks crucial changes that have taken place within the population of the poor.
  • welfare reform, signed into law by President Clinton in 1996 (the Personal Responsibility and Work Opportunity Act), which limited eligibility for welfare benefits to five years. The limitation has forced many of the poor off welfare: over the past 19 years, the percentage of families falling under the official poverty line who receive welfare benefits has fallen to 26 percent from 68 percent. Currently, three-quarters of those in poverty, under the official definition, receive no welfare payments.
  • The enactment of expanded benefits for the working poor through the earned-income tax credit and the child tax credit. According to Edin and Schaefer, the consequence of these changes, taken together, has been to divide the poor who no longer receive welfare into two groups. The first group is made up of those who have gone to work and have qualified for tax credits. Expanded tax credits lifted about 3.2 million children out of poverty in 2013
  • The second group, though, has really suffered. These are the very poor who are without work, part of a population that is struggling desperately. Edin and Schaefer write that among the losers are an estimated 3.4 million “children who over the course of a year live for at least three months under a $2 per person per day threshold.”
  • Focusing on these findings, Mishel argues, diverts attention from the more serious problem of “the failure of the labor market to adequately reward low-wage workers.” To support his case, Mishel points out that hourly pay for those in the bottom fifth grew only 7.7 percent from 1979 to 2007, while productivity grew by 64 percent, and education levels among workers in this quintile substantially improved.
Javier E

95,000 Words, Many of Them Ominous, From Donald Trump's Tongue - The New York Times - 2 views

  • The New York Times analyzed every public utterance by Mr. Trump over the past week from rallies, speeches, interviews and news conferences to explore the leading candidate’s hold on the Republican electorate for the past five months.
  • The transcriptions yielded 95,000 words and several powerful patterns
  • The most striking hallmark was Mr. Trump’s constant repetition of divisive phrases, harsh words and violent imagery that American presidents rarely use
  • He has a particular habit of saying “you” and “we” as he inveighs against a dangerous “them” or unnamed other — usually outsiders like illegal immigrants (“they’re pouring in”), Syrian migrants (“young, strong men”) and Mexicans, but also leaders of both political parties.
  • Mr. Trump appears unrivaled in his ability to forge bonds with a sizable segment of Americans over anxieties about a changing nation, economic insecurities, ferocious enemies and emboldened minorities (like the first black president, whose heritage and intelligence he has all but encouraged supporters to malign).
  • “ ‘We vs. them’ creates a threatening dynamic, where ‘they’ are evil or crazy or ignorant and ‘we’ need a candidate who sees the threat and can alleviate it,”
  • “He appeals to the masses and makes them feel powerful again: ‘We’ need to build a wall on the Mexican border — not ‘I,’ but ‘we.’ ”
  • And as much as he likes the word “attack,” the Times analysis shows, he often uses it to portray himself as the victim of cable news channels and newspapers that, he says, do not show the size of his crowds.
  • The specter of violence looms over much of his speech, which is infused with words like kill, destroy and fight.
  • “Such statements and accusations make him seem like a guy who can and will cut through all the b.s. and do what in your heart you know is right — and necessary,”
  • And Mr. Trump uses rhetoric to erode people’s trust in facts, numbers, nuance, government and the news media, according to specialists in political rhetoric.
  • In another pattern, Mr. Trump tends to attack a person rather than an idea or a situation, like calling political opponents “stupid” (at least 30 times), “horrible” (14 times), “weak” (13 times) and other names, and criticizing foreign leaders, journalists and so-called anchor babies
  • He insists that Mr. Obama wants to accept 250,000 Syrian migrants, even though no such plan exists, and repeats discredited rumors that thousands of Muslims were cheering in New Jersey during the Sept. 11, 2001, attacks.
  • “Nobody knows,” he likes to declare, where illegal immigrants are coming from or the rate of increase of health care premiums under the Affordable Care Act, even though government agencies collect and publish this information.
  • This pattern of elevating emotional appeals over rational ones is a rhetorical style that historians, psychologists and political scientists placed in the tradition of political figures like Goldwater, George Wallace, Joseph McCarthy, Huey Long and Pat Buchanan,
  • “His entire campaign is run like a demagogue’s — his language of division, his cult of personality, his manner of categorizing and maligning people with a broad brush,”
  • “If you’re an illegal immigrant, you’re a loser. If you’re captured in war, like John McCain, you’re a loser. If you have a disability, you’re a loser. It’s rhetoric like Wallace’s — it’s not a kind or generous rhetoric.”
  • “And then there are the winners, most especially himself, with his repeated references to his wealth and success and intelligence,”
  • Historically, demagogues have flourished when they tapped into the grievances of citizens and then identified and maligned outside foes, as McCarthy did with attacking Communists, Wallace with pro-integration northerners and Mr. Buchanan with cultural liberals
  • Mr. Trump, by contrast, is an energetic and charismatic speaker who can be entertaining and ingratiating with his audiences. There is a looseness to his language that sounds almost like water-cooler talk or neighborly banter, regardless of what it is about.
  • he presents himself as someone who is always right in his opinions — even prophetic, a visionary
  • It is the sort of trust-me-and-only-me rhetoric that, according to historians, demagogues have used to insist that they have unique qualities that can lead the country through turmoil
kushnerha

BBC - Future - The surprising downsides of being clever - 0 views

  • If ignorance is bliss, does a high IQ equal misery? Popular opinion would have it so. We tend to think of geniuses as being plagued by existential angst, frustration, and loneliness. Think of Virginia Woolf, Alan Turing, or Lisa Simpson – lone stars, isolated even as they burn their brightest. As Ernest Hemingway wrote: “Happiness in intelligent people is the rarest thing I know.”
  • Combing California’s schools for the crème de la crème, he selected 1,500 pupils with an IQ of 140 or more – 80 of whom had IQs above 170. Together, they became known as the “Termites”, and the highs and lows of their lives are still being studied to this day.
  • Termites’ average salary was twice that of the average white-collar job. But not all the group met Terman’s expectations – there were many who pursued more “humble” professions such as police officers, seafarers, and typists. For this reason, Terman concluded that “intellect and achievement are far from perfectly correlated”. Nor did their smarts endow personal happiness. Over the course of their lives, levels of divorce, alcoholism and suicide were about the same as the national average.
  • One possibility is that knowledge of your talents becomes something of a ball and chain. Indeed, during the 1990s, the surviving Termites were asked to look back at the events in their 80-year lifespan. Rather than basking in their successes, many reported that they had been plagued by the sense that they had somehow failed to live up to their youthful expectations.
  • The most notable, and sad, case concerns the maths prodigy Sufiah Yusof. Enrolled at Oxford University aged 12, she dropped out of her course before taking her finals and started waitressing. She later worked as a call girl, entertaining clients with her ability to recite equations during sexual acts.
  • Another common complaint, often heard in student bars and internet forums, is that smarter people somehow have a clearer vision of the world’s failings. Whereas the rest of us are blinkered from existential angst, smarter people lie awake agonising over the human condition or other people’s folly.
  • MacEwan University in Canada found that those with the higher IQ did indeed feel more anxiety throughout the day. Interestingly, most worries were mundane, day-to-day concerns, though; the high-IQ students were far more likely to be replaying an awkward conversation, than asking the “big questions”. “It’s not that their worries were more profound, but they are just worrying more often about more things,” says Penney. “If something negative happened, they thought about it more.”
  • seemed to correlate with verbal intelligence – the kind tested by word games in IQ tests, compared to prowess at spatial puzzles (which, in fact, seemed to reduce the risk of anxiety). He speculates that greater eloquence might also make you more likely to verbalise anxieties and ruminate over them. It’s not necessarily a disadvantage, though. “Maybe they were problem-solving a bit more than most people,” he says – which might help them to learn from their mistakes.
  • The harsh truth, however, is that greater intelligence does not equate to wiser decisions; in fact, in some cases it might make your choices a little more foolish.
  • spent the last decade building tests for rationality, and he has found that fair, unbiased decision-making is largely independent of IQ.
  • “my-side bias” – our tendency to be highly selective in the information we collect so that it reinforces our previous attitudes. The more enlightened approach would be to leave your assumptions at the door as you build your argument – but Stanovich found that smarter people are almost no more likely to do so than people with distinctly average IQs.
  • People who ace standard cognitive tests are in fact slightly more likely to have a “bias blind spot”. That is, they are less able to see their own flaws, even though they are quite capable of criticising the foibles of others. And they have a greater tendency to fall for the “gambler’s fallacy”
  • A tendency to rely on gut instincts rather than rational thought might also explain why a surprisingly high number of Mensa members believe in the paranormal; or why someone with an IQ of 140 is about twice as likely to max out their credit card.
  • “The people pushing the anti-vaccination meme on parents and spreading misinformation on websites are generally of more than average intelligence and education.” Clearly, clever people can be dangerously, and foolishly, misguided.
  • we need to turn our minds to an age-old concept: “wisdom”. His approach is more scientific than it might at first sound. “The concept of wisdom has an ethereal quality to it,” he admits. “But if you look at the lay definition of wisdom, many people would agree it’s the idea of someone who can make good unbiased judgement.”
  • Crucially, Grossmann found that IQ was not related to any of these measures, and certainly didn’t predict greater wisdom. “People who are very sharp may generate, very quickly, arguments [for] why their claims are the correct ones – but may do it in a very biased fashion.”
  • employers may well begin to start testing these abilities in place of IQ; Google has already announced that it plans to screen candidates for qualities like intellectual humility, rather than sheer cognitive prowess.
  • He points out that we often find it easier to leave our biases behind when we consider other people, rather than ourselves. Along these lines, he has found that simply talking through your problems in the third person (“he” or “she”, rather than “I”) helps create the necessary emotional distance, reducing your prejudices and leading to wiser arguments.
  • If you’ve been able to rest on the laurels of your intelligence all your life, it could be very hard to accept that it has been blinding your judgement. As Socrates had it: the wisest person really may be the one who can admit he knows nothing.
Javier E

The Real Victims of Victimhood - The New York Times - 0 views

  • BACK in 1993, the misanthropic art critic Robert Hughes published a grumpy, entertaining book called “Culture of Complaint,” in which he predicted that America was doomed to become increasingly an “infantilized culture” of victimhood. It was a rant against what he saw as a grievance industry appearing all across the political spectrum.
  • Members of one group were prompted to write a short essay about a time when they felt bored; the other to write about “a time when your life seemed unfair. Perhaps you felt wronged or slighted by someone.” After writing the essay, the participants were interviewed and asked if they wanted to help the scholars in a simple, easy task. The results were stark. Those who wrote the essays about being wronged were 26 percent less likely to help the researchers, and were rated by the researchers as feeling 13 percent more entitled.
  • “Victimhood culture” has now been identified as a widening phenomenon by mainstream sociologists. And it is impossible to miss the obvious examples all around us.
  • On campuses, activists interpret ordinary interactions as “microaggressions” and set up “safe spaces” to protect students from certain forms of speech. And presidential candidates on both the left and the right routinely motivate supporters by declaring that they are under attack by immigrants or wealthy people.
  • victimhood makes it more and more difficult for us to resolve political and social conflicts. The culture feeds a mentality that crowds out a necessary give and take — the very concept of good-faith disagreement — turning every policy difference into a pitched battle between good (us) and evil (them).
  • Consider a 2014 study in the Proceedings of the National Academy of Sciences, which examined why opposing groups, including Democrats and Republicans, found compromise so difficult. The researchers concluded that there was a widespread political “motive attribution asymmetry,” in which both sides attributed their own group’s aggressive behavior to love, but the opposite side’s to hatred. Today, millions of Americans believe that their side is basically benevolent while the other side is evil and out to get them.
  • the intervening two decades have made Mr. Hughes look prophetic
  • In a separate experiment, the researchers found that members of the unfairness group were 11 percent more likely to express selfish attitudes. In a comical and telling aside, the researchers noted that the victims were more likely than the nonvictims to leave trash behind on the desks and to steal the experimenters’ pens.
  • Does this mean that we should reject all claims that people are victims? Of course not. Some people are indeed victims in America — of crime, discrimination or deprivation. They deserve our empathy and require justice.
  • The problem is that the line is fuzzy between fighting for victimized people and promoting a victimhood culture.
  • look at the role of free speech in the debate. Victims and their advocates always rely on free speech and open dialogue to articulate unpopular truths. They rely on free speech to assert their right to speak. Victimhood culture, by contrast, generally seeks to restrict expression in order to protect the sensibilities of its advocates
  • look at a movement’s leadership. The fight for victims is led by aspirational leaders who challenge us to cultivate higher values. They insist that everyone is capable of — and has a right to — earned success. They articulate visions of human dignity. But the organizations and people who ascend in a victimhood culture are very different. Some set themselves up as saviors; others focus on a common enemy. In all cases, they treat people less as individuals and more as aggrieved masses.
Javier E

Jonathan Franzen Is Fine With All of It - The New York Times - 0 views

  • If you’re in a state of perpetual fear of losing market share for you as a person, it’s just the wrong mind-set to move through the world with.” Meaning that if your goal is to get liked and retweeted, then you are perhaps molding yourself into the kind of person you believe will get those things, whether or not that person resembles the actual you. The writer’s job is to say things that are uncomfortable and hard to reduce. Why would a writer mold himself into a product?
  • And why couldn’t people hear him about the social effects this would have? “The internet is all about destroying the elite, destroying the gatekeepers,” he said. “The people know best. You take that to its conclusion, and you get Donald Trump. What do those Washington insiders know? What does the elite know?
  • So he decided to withdraw from it all. After publicity for “The Corrections” ended, he decided he would no longer read about himself — not reviews, not think pieces, not stories, and then, as they came, not status updates and not tweets. He didn’t want to hear reaction to his work. He didn’t want to see the myriad ways he was being misunderstood. He didn’t want to know what the hashtags were.
  • I stopped reading reviews because I noticed all I remember is the negatives. Whatever fleeting pleasure you have in someone applying a laudatory adjective to your book is totally washed away by the unpleasantness of remembering the negative things for the rest of your life verbatim.
  • Franzen thinks that there’s no way for a writer to do good work — to write something that can be called “consuming and extraordinarily moving” — without putting a fence around yourself so that you can control the input you encounter. So that you could have a thought that isn’t subject to pushback all the time from anyone who has ever met you or heard of you or expressed interest in hearing from you. Without allowing yourself to think for a minute.
  • It’s not just writers. It’s everyone. The writer is just an extreme case of something everyone struggles with. “On the one hand, to function well, you have to believe in yourself and your abilities and summon enormous confidence from somewhere. On the other hand, to write well, or just to be a good person, you need to be able to doubt yourself — to entertain the possibility that you’re wrong about everything, that you don’t know everything, and to have sympathy with people whose lives and beliefs and perspectives are very different from yours.”
  • “This balancing act” — the confidence that you know everything plus the ability to believe that you don’t — “only works, or works best, if you reserve a private space for it.”
  • Can you write clearly about something that you don’t yourself swim in? Don’t you have to endure it and hate it most of the time like the rest of us?
  • his answer was no. No. No, you absolutely don’t. You can miss a meme, and nothing really changes. You can be called fragile, and you will live. “I’m pretty much the opposite of fragile. I don’t need internet engagement to make me vulnerable. Real writing makes me — makes anyone doing it — vulnerable.”
  • Has anyone considered that the interaction is the fragility? Has anyone considered that letting other people define how you fill your day and what they fill your head with — a passive, postmodern stream of other people’s thoughts — is the fragility?
Javier E

Listening to Michael Jackson After 'Leaving Neverland' - The Atlantic - 1 views

  • The ancient question: What moral stain awaits us if we cannot abandon the art of a monster? None.
  • Michael Jackson’s art matters. It matters not because of any sociopolitical significance, although many of his songs bear uplifting messages. It matters not for its implications about race in America. It matters because of the simple fact that it is, in every sense, the gift revealed.
  • A generation ago, young people read Lewis Hyde’s The Gift to understand how to live meaningful lives by cultivating within themselves the ability to receive art: “An essential portion of any artist’s labor is not creation so much as invocation. Part of the work cannot be made, it must be received; and we cannot have this gift except, perhaps, by supplication, by courting, by creating within ourselves that ‘begging bowl’ to which the gift is drawn.”
  • You can cast away Picasso because Hannah Gadsby told you he was cruel to women. But can you cast away Guernica?
  • Art isn’t something mere; it doesn’t exist as the moral bona fides of the person who made it. That person is a supernumerary. Separate yourself from any art—even popular art; even art created simply as entertainment—and you separate yourself from all of it
Javier E

'Our minds can be hijacked': the tech insiders who fear a smartphone dystopia | Technology | The Guardian - 0 views

  • Rosenstein belongs to a small but growing band of Silicon Valley heretics who complain about the rise of the so-called “attention economy”: an internet shaped around the demands of an advertising economy.
  • “It is very common,” Rosenstein says, “for humans to develop things with the best of intentions and for them to have unintended, negative consequences.”
  • Rosenstein is most concerned about the psychological effects on people who, research shows, touch, swipe or tap their phone 2,617 times a day.
  • ...43 more annotations...
  • There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity – even when the device is turned off. “Everyone is distracted,” Rosenstein says. “All of the time.”
  • Drawing a straight line between addiction to social media and political earthquakes like Brexit and the rise of Donald Trump, they contend that digital forces have completely upended the political system and, left unchecked, could even render democracy as we know it obsolete.
  • Without irony, Eyal finished his talk with some personal tips for resisting the lure of technology. He told his audience he uses a Chrome extension, called DF YouTube, “which scrubs out a lot of those external triggers” he writes about in his book, and recommended an app called Pocket Points that “rewards you for staying off your phone when you need to focus”.
  • “One reason I think it is particularly important for us to talk about this now is that we may be the last generation that can remember life before,” Rosenstein says. It may or may not be relevant that Rosenstein, Pearlman and most of the tech insiders questioning today’s attention economy are in their 30s, members of the last generation that can remember a world in which telephones were plugged into walls.
  • One morning in April this year, designers, programmers and tech entrepreneurs from across the world gathered at a conference centre on the shore of the San Francisco Bay. They had each paid up to $1,700 to learn how to manipulate people into habitual use of their products, on a course curated by conference organiser Nir Eyal.
  • Eyal, 39, the author of Hooked: How to Build Habit-Forming Products, has spent several years consulting for the tech industry, teaching techniques he developed by closely studying how the Silicon Valley giants operate.
  • “The technologies we use have turned into compulsions, if not full-fledged addictions,” Eyal writes. “It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.” None of this is an accident, he writes. It is all “just as their designers intended”.
  • He explains the subtle psychological tricks that can be used to make people develop habits, such as varying the rewards people receive to create “a craving”, or exploiting negative emotions that can act as “triggers”. “Feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation,” Eyal writes.
  • The most seductive design, Harris explains, exploits the same psychological susceptibility that makes gambling so compulsive: variable rewards. When we tap those apps with red icons, we don’t know whether we’ll discover an interesting email, an avalanche of “likes”, or nothing at all. It is the possibility of disappointment that makes it so compulsive.
  • Finally, Eyal confided the lengths he goes to protect his own family. He has installed in his house an outlet timer connected to a router that cuts off access to the internet at a set time every day. “The idea is to remember that we are not powerless,” he said. “We are in control.”
  • But are we? If the people who built these technologies are taking such radical steps to wean themselves free, can the rest of us reasonably be expected to exercise our free will?
  • Not according to Tristan Harris, a 33-year-old former Google employee turned vocal critic of the tech industry. “All of us are jacked into this system,” he says. “All of our minds can be hijacked. Our choices are not as free as we think they are.”
  • Harris, who has been branded “the closest thing Silicon Valley has to a conscience”, insists that billions of people have little choice over whether they use these now ubiquitous technologies, and are largely unaware of the invisible ways in which a small number of people in Silicon Valley are shaping their lives.
  • “I don’t know a more urgent problem than this,” Harris says. “It’s changing our democracy, and it’s changing our ability to have the conversations and relationships that we want with each other.” Harris went public – giving talks, writing papers, meeting lawmakers and campaigning for reform – after three years struggling to effect change inside Google’s Mountain View headquarters.
  • He explored how LinkedIn exploits a need for social reciprocity to widen its network; how YouTube and Netflix autoplay videos and next episodes, depriving users of a choice about whether or not they want to keep watching; how Snapchat created its addictive Snapstreaks feature, encouraging near-constant communication between its mostly teenage users.
  • The techniques these companies use are not always generic: they can be algorithmically tailored to each person. An internal Facebook report leaked this year, for example, revealed that the company can identify when teens feel “insecure”, “worthless” and “need a confidence boost”. Such granular information, Harris adds, is “a perfect model of what buttons you can push in a particular person”.
  • Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. “There’s no ethics,” he says. A company paying Facebook to use its levers of persuasion could be a car business targeting tailored advertisements to different types of users who want a new vehicle. Or it could be a Moscow-based troll farm seeking to turn voters in a swing county in Wisconsin.
  • It was Rosenstein’s colleague, Leah Pearlman, then a product manager at Facebook and on the team that created the Facebook “like”, who announced the feature in a 2009 blogpost. Now 35 and an illustrator, Pearlman confirmed via email that she, too, has grown disaffected with Facebook “likes” and other addictive feedback loops. She has installed a web browser plug-in to eradicate her Facebook news feed, and hired a social media manager to monitor her Facebook page so that she doesn’t have to.
  • Harris believes that tech companies never deliberately set out to make their products addictive. They were responding to the incentives of an advertising economy, experimenting with techniques that might capture people’s attention, even stumbling across highly effective design by accident.
  • It’s this that explains how the pull-to-refresh mechanism, whereby users swipe down, pause and wait to see what content appears, rapidly became one of the most addictive and ubiquitous design features in modern technology. “Each time you’re swiping down, it’s like a slot machine,” Harris says. “You don’t know what’s coming next. Sometimes it’s a beautiful photo. Sometimes it’s just an ad.”
  • The reality TV star’s campaign, he said, had heralded a watershed in which “the new, digitally supercharged dynamics of the attention economy have finally crossed a threshold and become manifest in the political realm”.
  • “Smartphones are useful tools,” he says. “But they’re addictive. Pull-to-refresh is addictive. Twitter is addictive. These are not good things. When I was working on them, it was not something I was mature enough to think about. I’m not saying I’m mature now, but I’m a little bit more mature, and I regret the downsides.”
  • All of it, he says, is reward-based behaviour that activates the brain’s dopamine pathways. He sometimes finds himself clicking on the red icons beside his apps “to make them go away”, but is conflicted about the ethics of exploiting people’s psychological vulnerabilities. “It is not inherently evil to bring people back to your product,” he says. “It’s capitalism.”
  • He identifies the advent of the smartphone as a turning point, raising the stakes in an arms race for people’s attention. “Facebook and Google assert with merit that they are giving users what they want,” McNamee says. “The same can be said about tobacco companies and drug dealers.”
  • McNamee chooses his words carefully. “The people who run Facebook and Google are good people, whose well-intentioned strategies have led to horrific unintended consequences,” he says. “The problem is that there is nothing the companies can do to address the harm unless they abandon their current advertising models.”
  • But how can Google and Facebook be forced to abandon the business models that have transformed them into two of the most profitable companies on the planet?
  • McNamee believes the companies he invested in should be subjected to greater regulation, including new anti-monopoly rules. In Washington, there is growing appetite, on both sides of the political divide, to rein in Silicon Valley. But McNamee worries the behemoths he helped build may already be too big to curtail.
  • Rosenstein, the Facebook “like” co-creator, believes there may be a case for state regulation of “psychologically manipulative advertising”, saying the moral impetus is comparable to taking action against fossil fuel or tobacco companies. “If we only care about profit maximisation,” he says, “we will go rapidly into dystopia.”
  • James Williams does not believe talk of dystopia is far-fetched. An ex-Google strategist who built the metrics system for the company’s global search advertising business, he has had a front-row view of an industry he describes as the “largest, most standardised and most centralised form of attentional control in human history”.
  • It is a journey that has led him to question whether democracy can survive the new technological age.
  • He says his epiphany came a few years ago, when he noticed he was surrounded by technology that was inhibiting him from concentrating on the things he wanted to focus on. “It was that kind of individual, existential realisation: what’s going on?” he says. “Isn’t technology supposed to be doing the complete opposite of this?”
  • That discomfort was compounded during a moment at work, when he glanced at one of Google’s dashboards, a multicoloured display showing how much of people’s attention the company had commandeered for advertisers. “I realised: this is literally a million people that we’ve sort of nudged or persuaded to do this thing that they weren’t going to otherwise do,” he recalls.
  • Williams and Harris left Google around the same time, and co-founded an advocacy group, Time Well Spent, that seeks to build public momentum for a change in the way big tech companies think about design. Williams finds it hard to comprehend why this issue is not “on the front page of every newspaper every day.”
  • “Eighty-seven percent of people wake up and go to sleep with their smartphones,” he says. The entire world now has a new prism through which to understand politics, and Williams worries the consequences are profound.
  • “The attention economy incentivises the design of technologies that grab our attention,” he says. “In so doing, it privileges our impulses over our intentions.”
  • That means privileging what is sensational over what is nuanced, appealing to emotion, anger and outrage. The news media is increasingly working in service to tech companies, Williams adds, and must play by the rules of the attention economy to “sensationalise, bait and entertain in order to survive”.
  • It is not just shady or bad actors who are exploiting the internet to change public opinion. The attention economy itself is set up to promote a phenomenon like Trump, who is masterly at grabbing and retaining the attention of supporters and critics alike, often by exploiting or creating outrage.
  • All of which has left Brichter, who has put his design work on the backburner while he focuses on building a house in New Jersey, questioning his legacy. “I’ve spent many hours and weeks and months and years thinking about whether anything I’ve done has made a net positive impact on society or humanity at all,” he says. He has blocked certain websites, turned off push notifications, restricted his use of the Telegram app to message only with his wife and two close friends, and tried to wean himself off Twitter. “I still waste time on it,” he confesses, “just reading stupid news I already know about.” He charges his phone in the kitchen, plugging it in at 7pm and not touching it until the next morning.
  • He stresses these dynamics are by no means isolated to the political right: they also play a role, he believes, in the unexpected popularity of leftwing politicians such as Bernie Sanders and Jeremy Corbyn, and the frequent outbreaks of internet outrage over issues that ignite fury among progressives.
  • All of which, Williams says, is not only distorting the way we view politics but, over time, may be changing the way we think, making us less rational and more impulsive. “We’ve habituated ourselves into a perpetual cognitive style of outrage, by internalising the dynamics of the medium,” he says.
  • It was another English science fiction writer, Aldous Huxley, who provided the more prescient observation when he warned that Orwellian-style coercion was less of a threat to democracy than the more subtle power of psychological manipulation, and “man’s almost infinite appetite for distractions”.
  • If the attention economy erodes our ability to remember, to reason, to make decisions for ourselves – faculties that are essential to self-governance – what hope is there for democracy itself?
  • “The dynamics of the attention economy are structurally set up to undermine the human will,” he says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.”
Javier E

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • ...39 more annotations...
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores.
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it.
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning.
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers.
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.