TOK Friends / Group items tagged: prosperity

Javier E

Worldly Philosophers Wanted - NYTimes.com - 0 views

  • Keynes himself was driven by a powerful vision of capitalism. He believed it was the only system that could create prosperity, but it was also inherently unstable and so in need of constant reform. This vision caught the imagination of a generation that had experienced the Great Depression and World War II and helped drive policy for nearly half a century.
  • Friedrich Hayek and Milton Friedman envisioned an ideal economy of isolated individuals bargaining with one another in free markets. Government, they contended, usually messes things up. Overtaking a Keynesianism that many found inadequate to the task of tackling the stagflation of the 1970s, this vision fueled the neoliberal and free-market conservative agendas of governments around the world.
  • It took extensive government action to prevent another Great Depression, while the enormous rewards received by bankers at the heart of the meltdown have led many to ask whether unfettered capitalism produced an equitable distribution of wealth. We clearly need a new, alternative vision of capitalism. But thanks to decades of academic training in the “dentistry” approach to economics, today’s Keynes or Friedman is nowhere to be found.
  • To refuse to discuss ideas such as types of capitalism deprives us of language with which to think about these problems. It makes it easier to stop thinking about what the economic system is for and in whose interests it is working.
  • Perhaps the protesters occupying Wall Street are not so misguided after all. The questions they raise — how do we deal with the local costs of global downturns? Is it fair that those who suffer the most from such downturns have their safety net cut, while those who generate the volatility are bailed out by the government? — are the same ones that a big-picture economic vision should address. If economists want to help create a better world, they first have to ask, and try to answer, the hard questions that can shape a new vision of capitalism’s potential.
Sophia C

In China, 'Once the Villages Are Gone, the Culture Is Gone' - NYTimes.com - 0 views

  • Across China, cultural traditions like the Lei family’s music are under threat. Rapid urbanization means village life, the bedrock of Chinese culture, is rapidly disappearing, and with it, traditions and history.
  • By 2010, that figure had dropped to 2.6 million, a loss of about 300 villages a day
  • In the past few years, the shift has accelerated as governments have pushed urbanization, often leaving villagers with no choice but to move.
  • Numerous local officials are under investigation for corruption linked to rural land sales.
  • “The goal is to make sure these cultural heritages don’t get lost,” she said. “It would be a great pity if they are lost just as our country is on the road to prosperity.”
Javier E

The World According to Team Walt - NYTimes.com - 0 views

  • “Breaking Bad” implicitly challenges audiences to get down to bedrock and actually justify those norms. Why is it so wrong to kill strangers — often dangerous strangers! — so that your own family can survive and prosper? Why is it wrong to exploit people you don’t see or care about for the sake of those inside your circle? Why is Walter White’s empire-building — carried out with boldness, brilliance and guile — not an achievement to be admired?
  • The allure for Team Walt is not ultimately the pull of nihilism, or the harmless thrill of rooting for a supervillain. It’s the pull of an alternative moral code, neither liberal nor Judeo-Christian, with an internal logic all its own. As James Bowman wrote in The New Atlantis, embracing Walt doesn’t require embracing “individual savagery” and a world without moral rules. It just requires a return to “old rules” — to “the tribal, family-oriented society and the honor culture that actually did precede the Enlightenment’s commitment to universal values.”
  • Those rules seem cruel by the lights of both cosmopolitanism and Christianity, but they are not irrational or necessarily false. Their Darwinian logic is clear enough, and where the show takes place — in the shadow of cancer, the shadow of death — the kindlier alternatives can seem softheaded, pointless, naïve.
  • It’s comforting to dismiss Walt’s admirers as sickos, idiots, “bad fans.” But they, too, can be moralists — drawn by their sympathy for Walter White into a worldview that still lies percolating, like one of his reactions, just below the surface of every human heart.
Emily Freilich

What Is Education For? - 2 views

  • The truth is that many things on which your future health and prosperity depend are in dire jeopardy: climate stability, the resilience and productivity of natural systems, the beauty of the natural world, and biological diversity.
  • this is not the work of ignorant people. It is, rather, largely the result of work by people with BAs, BSs, LLBs, MBAs, and PhDs.
  • Ignorance is not a solvable problem, but rather an inescapable part of the human condition. The advance of knowledge always carries with it the advance of some form of ignorance.
  • What was wrong with their education? In Wiesel’s words: "It emphasized theories instead of values, concepts rather than human beings, abstraction rather than consciousness, answers instead of questions, ideology and efficiency rather than conscience."
  • In the modern curriculum we have fragmented the world into bits and pieces called disciplines and subdisciplines. As a result, after 12 or 16 or 20 years of education, most students graduate without any broad integrated sense of the unity of things. The consequences for their personhood and for the planet are large. For example, we routinely produce economists who lack the most rudimentary knowledge of ecology. This explains why our national accounting systems do not subtract the costs of biotic impoverishment, soil erosion, poisons in the air or water, and resource depletion from gross national product. We add the price of the sale of a bushel of wheat to GNP while forgetting to subtract the three bushels of topsoil lost in its production. (A toy version of this accounting gap appears in the sketch after this list.)
  • There is an information explosion going on, by which I mean a rapid increase of data, words, and paper. But this explosion should not be taken for an increase in knowledge and wisdom, which cannot so easily be measured. What can be said truthfully is that some knowledge is increasing while other kinds of knowledge are being lost. David Ehrenfeld has pointed out that biology departments no longer hire faculty in such areas as systematics, taxonomy, or ornithology. In other words, important knowledge is being lost because of the recent overemphasis on molecular biology and genetic engineering, which are more lucrative, but not more important, areas of inquiry.
  • The plain fact is that the planet does not need more "successful" people. But it does desperately need more peacemakers, healers, restorers, storytellers, and lovers of every shape and form. It needs people who live well in their places.
  • The goal of education is not mastery of subject matter, but of one’s person. Subject matter is simply the tool. Much as one would use a hammer and chisel to carve a block of marble, one uses ideas and knowledge to forge one’s own personhood.
  • knowledge carries with it the responsibility to see that it is well used in the world.
  • we cannot say that we know something until we understand the effects of this knowledge on real people and their communities
  • Indoor classes create the illusion that learning only occurs inside four walls isolated from what students call without apparent irony the "real world."
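A note on the wheat-and-topsoil example above: it is, at bottom, a bookkeeping point, so a short sketch may make it concrete. The Python below only illustrates the gap the excerpt describes between conventional GNP accounting and a "green" adjustment that also subtracts depleted natural capital; the function names, prices, and soil-cost figures are invented for illustration and do not come from any official accounting standard.

```python
# Conventional accounting counts the wheat sale as pure gain; a "green"
# adjustment also subtracts the natural capital consumed in producing it.
# All figures below are hypothetical.

def gnp_contribution(sale_price: float) -> float:
    """Conventional accounting: only the market transaction is counted."""
    return sale_price

def adjusted_contribution(sale_price: float, environmental_costs: float) -> float:
    """Green accounting: subtract depletion of natural capital (e.g., topsoil)."""
    return sale_price - environmental_costs

wheat_sale = 7.00        # hypothetical price of one bushel of wheat, in dollars
topsoil_lost = 3 * 2.50  # hypothetical cost of the three bushels of topsoil eroded

print(gnp_contribution(wheat_sale))                     # 7.0  -> looks like growth
print(adjusted_contribution(wheat_sale, topsoil_lost))  # -0.5 -> a net loss
```

Run as written, the adjusted figure comes out negative, which is the excerpt's point: counting the sale while ignoring the erosion overstates the gain.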
charlottedonoho

Smarter Every Year? Mystery of the Rising IQs - WSJ - 0 views

  • That’s because absolute performance on IQ tests—the actual number of questions people get right—has greatly improved over the last 100 years. It’s called the Flynn effect, after James Flynn, a social scientist at New Zealand’s University of Otago who first noticed it in the 1980s.
  • They found that the Flynn effect is real—and large. The absolute scores consistently improved for children and adults, for developed and developing countries. People scored about three points more every decade, so the average score is 30 points higher than it was 100 years ago.
  • The pace jumped in the 1920s and slowed down during World War II. The scores shot up again in the postwar boom and then slowed down again in the ’70s. They’re still rising, but even more slowly. Adult scores climbed more than children’s.
  • Genes couldn’t change that swiftly, but better nutrition and health probably played a role. Still, that can’t explain why the change affected adults’ scores more than children’s. Economic prosperity helped, too—IQ increases correlate significantly with higher gross domestic product.
  • The fact that more people go to school for longer likely played the most important role—more education also correlates with IQ increases. That could explain why adults, who have more schooling, benefited most.
  • The best explanation probably depends on some combination of factors. Dr. Flynn himself argues for a “social multiplier” theory. An initially small change can set off a benign circle that leads to big effects. Slightly better education, health, income or nutrition might make a child do better at school and appreciate learning more. That would motivate her to read more books and try to go to college, which would make her even smarter and more eager for education, and so on. (A toy version of this feedback loop appears in the sketch after this list.)
  • “Life history” is another promising theory. A longer period of childhood correlates with better learning abilities across many species.
  • The thing that really makes humans so smart, throughout our history, may be that we can invent new kinds of intelligence to suit our changing environments.
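Two claims in this entry are quantitative enough to merit a toy illustration: the roughly three-points-per-decade rise that defines the Flynn effect, and Flynn's "social multiplier," in which a small initial advantage and a richer environment reinforce one another. The sketch below uses invented parameters and is not a model from Flynn's published work; it only shows the shape of the argument.

```python
# (1) The arithmetic behind "30 points higher than it was 100 years ago".
points_per_decade = 3
decades = 10
print("Projected gain over a century:", points_per_decade * decades)  # 30

# (2) A minimal feedback loop: ability nudges the environment up, and a richer
# environment nudges measured ability back up. Coefficients are arbitrary.
ability = 100.0      # hypothetical starting score
environment = 1.00   # hypothetical index of schooling/health/nutrition quality

for generation in range(5):
    environment += 0.02 * (ability - 100) / 15 + 0.02  # better-off cohorts enrich the environment
    ability += 5 * (environment - 1.0)                 # richer environment raises measured ability
    print(f"generation {generation + 1}: ability = {ability:.1f}, environment = {environment:.2f}")
```

Even with tiny coefficients the gains compound from one generation to the next, which is the intuition behind the "benign circle."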
Ellie McGinnis

Social Connection Makes a Better Brain - Emily Esfahani Smith - The Atlantic - 0 views

  • Recent trends show that people increasingly value material goods over relationships—but neuroscience and evolution say this goes against our nature.
  • Do you prioritize your career or your relationships?
  • “Man is by nature a social animal … Anyone who either cannot lead the common life or is so self-sufficient as not to need to, and therefore does not partake of society, is either a beast or a god.”
  • Just as human beings have a basic need for food and shelter, we also have a basic need to belong to a group and form relationships.
  • Lieberman sees the brain as the center of the social self. Its primary purpose is social thinking.
  • Every time we are not engaged in an active task—like when we take a break between two math problems—the brain falls into a neural configuration called the “default network.” When you have down time, even if it’s just for a second, this brain system comes on automatically.
  • “The default network directs us to think about other people’s minds—their thoughts, feelings, and goals.”
  • “Evolution has made a bet,” Lieberman tells me, “that the best thing for our brain to do in any spare moment is to get ready for what comes next in social terms.”
  • When economists put a price tag on our relationships, we get a concrete sense of just how valuable our social connections are—and how devastating it is when they are broken.
  • If you have a friend that you see on most days, it’s like earning $100,000 more each year.
  • To the brain, social pain feels a lot like physical pain—a broken heart can feel like a broken leg, as Lieberman puts it in his book.
  • But over the last fifty years, while society has been growing more and more prosperous and individualistic, our social connections have been dissolving.
  • Over the same period of time that social isolation has increased, our levels of happiness have gone down, while rates of suicide and depression have multiplied. 
  • Across the board, people are increasingly sacrificing their personal relationships for the pursuit of wealth.
Emilio Ergueta

A New Role for Japan's Military - NYTimes.com - 0 views

  • Japanese people have been divided over whether to revise the Constitution since almost as soon as it was promulgated in 1946. The debate has centered on Article 9, the so-called peace clause. And it has been fundamentally miscast.
  • Article 9 comprises two paragraphs. In the first, Japan renounces “war as a sovereign right of the nation and the threat or use of force as means of settling international disputes.”
  • In the second paragraph of Article 9, Japan renounces maintaining any “land, sea, and air forces, as well as other war potential.” No other country has imposed such a restriction on itself.
  • In 1954, the Japanese government created the Self-Defense Forces (SDF) in order to alleviate the United States’ burden of ensuring Japan’s security. At the time it argued for interpreting Article 9 (2) as recognizing Japan’s sovereign right to have a small military force. (The Supreme Court supported this reading in a 1959 ruling.) This construction — which has come to be known as the “minimum necessary level” — allowed the establishment of a force to defend Japan within its territory.
  • amending the Constitution is an onerous process, requiring at least a two-thirds majority in both houses of the Diet and a simple majority in a national referendum. It will also require overcoming the misguided objections of the reflexively antiwar set
  • Japan now understands that its prosperity and stability depend on global trade and on the peaceful resolution of any disputes. As one of the main beneficiaries of the international liberal order today, Japan is committed to the system — and it is committed to defending it, particularly against rising states like China, which are challenging the status quo
  • A moderate, sensible revision of the Constitution would be a modest step toward making Japan both a normal country and a more effective protector of the international order — and no less peace-loving.
charlottedonoho

Weekend Roundup: Preparing to Be Disrupted | Nathan Gardels - 0 views

  • Discussion around the theme "prepare to be disrupted" ranged from how the emergent sharing economy, along with 3D desktop manufacturing, would take work back into the home to worries that automation could eliminate as much as 47 percent of current jobs in the United States.
  • In The WorldPost, Ian Goldin of the Oxford Martin School writes that technological advance can lead to greater inequality or inclusive prosperity depending on how we govern ourselves.
  • Speaking at the London conference, MIT's Andrew McAfee argues that digital technology is "the best economic news in human history" but says that it poses many challenges to job creation in the future.
Javier E

The Obama Boom - The New York Times - 1 views

  • What did Mr. Obama do that was supposed to kill jobs? Quite a lot, actually. He signed the 2010 Dodd-Frank financial reform, which critics claimed would crush employment by starving businesses of capital.
  • He raised taxes on high incomes, especially at the very top, where average tax rates rose by about six and a half percentage points after 2012, a step that critics claimed would destroy incentives.
  • Yet none of the dire predicted consequences of these policies have materialized.
  • And he enacted a health reform that went into full effect in 2014, amid claims that it would have catastrophic effects on employment.
  • what do we learn from this impressive failure to fail? That the conservative economic orthodoxy dominating the Republican Party is very, very wrong.
  • conservative orthodoxy has a curiously inconsistent view of the abilities and motivations of corporations and wealthy individuals — I mean, job creators.
  • On one side, this elite is presumed to be a bunch of economic superheroes, able to deliver universal prosperity by summoning the magic of the marketplace. On the other side, they’re depicted as incredibly sensitive flowers who wilt in the face of adversity — raise their taxes a bit, subject them to a few regulations, or for that matter hurt their feelings in a speech or two, and they’ll stop creating jobs and go sulk in their tents, or more likely their mansions.
  • It’s a doctrine that doesn’t make much sense, but it conveys a clear message that, whaddya know, turns out to be very convenient for the elite: namely, that injustice is a law of nature, that we’d better not do anything to make our society less unequal or protect ordinary families from financial risks. Because if we do, the usual suspects insist, we’ll be severely punished by the invisible hand, which will collapse the economy.
  • From a conservative point of view, Mr. Obama did everything wrong, afflicting the comfortable (slightly) and comforting the afflicted (a lot), and nothing bad happened. We can, it turns out, make our society better after all.
Javier E

Anxiety and Depression Are on an 80-Year Upswing -- Science of Us - 1 views

  • Ever since the 1930s, young people in America have reported feeling increasingly anxious and depressed. And no one knows exactly why. One of the researchers who has done the most work on this subject is Dr. Jean Twenge, a social psychologist at San Diego State University who is the author of Generation Me: Why Today’s Young Americans Are More Confident, Assertive, Entitled—and More Miserable Than Ever Before. She’s published a handful of articles on this trajectory, and the underlying story, she thinks, is a rather negative one. “I think the research tells us that modern life is not good for mental health,” she said.
  • The words “depression” and “anxiety” themselves, after all, mean very different things to someone asked about them in 1935 as compared to 1995, so surveys that invoke these concepts directly only have limited utility for longitudinal study. To get around this, Twenge prefers to rely on surveys and inventories in which respondents are asked about specific symptoms which are frequently correlated with anxiety and depression
  • Much of the richest data on this question, then, comes from the Minnesota Multiphasic Personality Inventory (MMPI), which has been administered to high school and college students since the 1930s — and which includes many questions about symptoms. Specifically, it asks — among many other things — whether respondents feel well-rested when they wake up, whether they have trouble thinking, and whether they have experienced dizzy spells, headaches, shortness of breath, a racing heart, and so on.
  • The trendlines are obvious: Asked the same questions at about the same points in their lives, Americans are, over time, experiencing worse and worse symptoms associated with anxiety and depression.
  • there’s an interesting recent wrinkle to this trajectory. In a paper published in 2014 in Social Indicators Research, Twenge tracked the results of the Monitoring the Future (MtF) survey, “a nationally representative sample of U.S. 12th graders [administered] every year since 1976,” between 1982 and 2013. Like the MMPI, the MtF asks students about symptoms in a manner that should be generally resistant to cultural change: The somatic items Twenge examined asked about trouble sleeping, remembering things, thinking/concentrating, and learning, as well as shortness of breath. An interesting recent pattern emerged on these measures:
  • All the items end up significantly higher than where they started, but for many of them most of the increase happens over the first half of the time period in question. From the late 1990s or so until 2013, many of the items bounce around a bit but ultimately remain flat, or flat-ish.
  • drugs — Prozac and Lexapro, among others — have been prescribed to millions of people who experience these symptoms, many of whom presumably saw some improvement once the drugs kicked in, so this explanation at least makes intuitive sense.
  • there are likely other factors leading to the plateau as well, said Twenge. For one thing, the “crime rate is lower [today] than it was when it peaked in the early 1990s,” and dealing with crime can lead to anxiety and depression symptoms. Other indicators of youth well-being, like teen pregnancy, were also significantly higher back then, and could have accounted for the trajectory visible on the graphs. “For whatever reason,” said Twenge, “if you look at what was going on back then, the early 1990s were not a good time, particularly for young people.”
  • “Obviously there’s a lot of good things about societal and technological progress,” she said, “and in a lot of ways our lives are much easier than, say, our grandparents’ or great-grandparents’ lives. But there’s a paradox here that we seem to have so much ease and relative economic prosperity compared to previous centuries, yet there’s this dissatisfaction, there’s this unhappiness, there are these mental health issues in terms of depression and anxiety.”
  • She thinks the primary problem is that “modern life doesn’t give us as many opportunities to spend time with people and connect with them, at least in person, compared to, say, 80 years ago or 100 years ago. Families are smaller, the divorce rate is higher, people get married much later in life.”
  • it may simply be the case that many people who lived in less equal, more “traditional” times were forced into close companionship with a lot of other people, and that this shielded them from certain psychological problems, whatever else was going on in their lives.
  • She was virtually never alone — and that can be a bad thing, clearly, but from a mental health perspective being surrounded by people is a good thing.”
  • the shift away from this sort of life has also brought with it a shift in values, and Twenge thinks that this, too, can account for the increase in anxiety and depression. “There’s clear evidence that the focus on money, fame, and image has gone up,
  • “and there’s also clear evidence that people who focus on money, fame, and image are more likely to be depressed and anxious.”
  • “It’s so tempting to say the world is going to hell in a handbasket and everything’s bad, but there are so many good things about modern life,” she said. So maybe the key message here is that while there’s no way to go back to family farms and young marriage and parenthood — and, from an equality standpoint, we wouldn’t want to anyway — modern life needs to do a better job of connecting people to one another, and encouraging them to adopt the sorts of goals and outlooks that will make them happy.
Javier E

Dan Crenshaw: I made amends with Pete Davidson on SNL. But that's only the beginning. -... - 0 views

  • As a country, we still have a lot of work to do. We need to agree on some basic rules for civil discourse.
  • many of the ultimate goals — economic prosperity, better health care and education, etc. — are the same. We just don’t share the same vision of how to achieve them.
  • How, then, do we live together in this world of differing ideas? For starters, let’s agree that the ideas are fair game. If you think my idea is awful, you should say as much
  • But there is a difference between attacking an idea and attacking the person behind that idea. Labeling someone as an “-ist” who believes in an “-ism” because of the person’s policy preference is just a shortcut to playground-style name-calling, cloaked in political terminology
  • Similarly, people too often attack not just an idea but also the supposed intent behind an idea. That raises the emotional level of the debate and might seem like it strengthens the attacker’s side, but it’s a terrible way to make a point.
  • Assuming the worst about your opponents’ intentions has the effect of demonizing their ideas, removing the need for sound counter-reasoning and fact-based argument. That’s not a good environment for the exchange of ideas.
  • When all else fails, try asking for forgiveness, or granting it.
Javier E

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased. (A toy simulation of this three-group comparison appears in the sketch after this list.)
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
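The UCSD phone-proximity experiment described earlier in this list lends itself to a small simulation of the design: three conditions, one cognitive score per participant, compared by group mean. The numbers below are generated with invented means and spread; they are not the study's data, and the effect size is arbitrary.

```python
# Simulate the three conditions described in the excerpt: phone on the desk,
# phone in a pocket or bag, phone in another room. Means are made up, chosen
# only to mirror the reported ordering (closer phone -> lower score).
import random
import statistics

random.seed(0)

condition_means = {
    "phone on desk":       50,  # hypothetical mean working-memory score
    "phone in pocket/bag": 55,
    "phone in other room": 60,
}

scores = {
    name: [random.gauss(mu, 8) for _ in range(40)]  # 40 simulated participants per group
    for name, mu in condition_means.items()
}

for name, values in scores.items():
    print(f"{name:22s} mean = {statistics.mean(values):.1f}")
```

With the reported ordering built into the simulated means, the averages come out desk < pocket/bag < other room, mirroring the finding that measured brainpower fell as the phone got closer.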
Javier E

Seven Lessons In Economic Leadership From Ancient Egypt - 0 views

  • Although there are plenty of grounds for rage against the big banks, the challenge is to sort out which are the activities that grow the real economy of goods and services, and which are the activities that are essentially a zero-sum game of socially useless gambling?
  • The situation today is that the zero-sum games of the financial sector aren’t just a tiny sideshow. They have grown exponentially and have become almost the main game of the financial sector.
  • When finance becomes the end, not the means, then the result is what analyst Gautam Mukunda calls “excessive financialization” of the economy, as his excellent article “The Price of Wall Street Power” in the June 2014 issue of Harvard Business Review makes clear.
  • Quite apart from the “unbalanced power” of the financial sector, and the tendency of a super-sized financial sector to cause increasingly bad global financial crashes, excessive financialization leads to resources being misallocated. “In many of the financial sector’s segments that have grown fastest since deregulation—like investment banks—the transactions are primarily zero-sum.”
  • However, in times of rapid technological transformation like today, the economic priesthood’s protection of its own interests can become massively destabilizing.
  • Thus we know from the history of the last couple of hundred years that in times of rapid technological transformation, the financial sector tends to become disconnected from the real economy
  • This has occurred a number of times in the last few hundred years, including the Canal Mania (England—1790s), the Rail Mania (England—1840s), the Gilded Age (US: 1880s—early 1900s) the Roaring Twenties (US—1920s) and the Big Banks of today.
  • Getting to safety is not made any easier by the fact that the modern economic priesthood—the managers of large firms and the banks—has, like its ancient Egyptian forbears, found ways to participate in the casino economy and benefit from “making money out of money”, even as the economy as a whole suffers. As Upton Sinclair wrote, “It is difficult to get a man to understand something, when his salary depends upon his not understanding it.”
  • Just as the ancient Egyptian economic priesthood clung to power as the economy stagnated, so today the economic priesthood shows no signs of relinquishing their gains or their power. The appetite and expectation of extraordinary returns is still there.
  • “Corporate chieftains rationally choose financial engineering—debt-financed share buybacks, for example—over capital investment in property, plants and equipment. Financial markets reward shareholder activism. Institutional investors extend their risk parameters to beat their benchmarks… But real economic growth—averaging just a bit above 2 percent for the fifth year in a row—remains sorely lacking.”
  • As a result, the economy remains in the “Great Stagnation” (Tyler Cowen), also known as the “Secular Stagnation” (Larry Summers). It is running on continuing life support from the Federal Reserve. Large enterprises still appear to be profitable. The appearance, though not the reality, of economic well-being has been sufficient to make the stock market soar.
  • Just as no change was possible in ancient Egyptian society so long as the economic priesthood colluded to preserve the status quo, so the excesses and prevarications of the Financial Sector will continue so long as the regulators remain its cheerleaders.
  • Just listen to the chair of the Securities and Exchange Commission (SEC), Mary Jo White, speaking to directors at the Stanford University Rock Center for Corporate Governance. In her speech, she makes no secret of her view that the overall corporate arrangements are sound. The job of the SEC, as outlined in the speech, is to find the odd individual who might be doing something wrong. The idea that the large-scale activities of the major banks might be socially corrosive is not even alluded to.
  • Thus in times of transformational technology, there is a huge expansion of investment, driven by the financial sector. Wealthy investors begin to expect outsized returns and so there is over-investment. The resulting bubbles in due course burst
  • Just as in ancient Egypt, no progress was possible so long as the myths and rituals of the economic priesthood and their offerings to the gods were widely accepted as real indicators of what was going on, so today no progress is possible so long as the myths and rituals of the modern economic priesthood still have a pervasive hold on people’s minds
  • In the modern economy, the myths and rituals of the economic priesthood are built on the notion that the purpose of a firm is to maximize shareholder value and the notion that if the share price is increasing, things are going well. These ideas are the intellectual underpinnings of the zero-sum activities of the financial sector for “making money out of money”, by whatever means possible
  • Like the myths and rituals of the priests of ancient Egypt, shareholder value theory is espoused with religious overtones. Shareholder value, which even Jack Welch has called “the dumbest idea in the world,” remains pervasive in business, even though it is responsible for massive offshoring of manufacturing, thereby destroying major segments of the US economy, undermining US capacity to compete in international markets and killing the economic recovery.
  • If instead society decides that the financial sector should concentrate on its socially important function of financing the real economy and providing financial security for an ever wider circle of citizens and enterprises, we could enjoy an era of growth and lasting prosperity.
sandrine_h

How to Defeat Those Who Are Waging War on Science - Scientific American Blog Network - 0 views

  • new language of this war—a subtle, yet potentially damaging form of science skepticism
  • The systematic use of so-called “uncertainty” surrounding well-established scientific ideas has proven to be a reliable method for manipulating public perception and stalling political action.
  • Make no mistake: the War on Science is going to affect you, whether you are a scientist or not. It is going to affect everything—ranging from the safety of the food we eat, the water we drink, the air we breathe, and the kind of planet we live on.
  • The reality is that science touches everything we do, and everyone we love
  • Do we want to be the America that embraces science and the pursuit of knowledge to advance our health, safety, prosperity, and security, making America the leader of the civilized world? Or do we want America to mimic failed regimes of the past, where knowledge and science were deliberately suppressed to benefit a few, to funnel more profits into dying industries, and to placate the prejudices of a mob
  • Traditionally, scientists have been coached to steer clear of the political fray. But if the past few weeks have taught us anything, it’s that now is the time for a quantum leap of political relevance.
  • You cannot isolate science from politics, or politics from science
  • That is precisely why scientists shouldn’t shy away from engaging in political conversations. Now more than ever, it is necessary to be participating in them
  • At the very least, we all share a deeply-held fascination with our natural world. The search for meaning, the understanding of something bigger than ourselves, is of universal significance.
  • In today’s world, facts alone are not enough to win debates, let alone people’s hearts and minds. Research shows that increasing scientific knowledge can often deepen the divide between people on polarizing issues. “Individuals subconsciously resist factual information that threatens their defining values,” a recent study points out
  • America has a choice to make. A choice between advancing civilization or bringing it down. A choice between knowledge and chaos. Now, everyone must choose which side they are on.
Javier E

They're Watching You at Work - Don Peck - The Atlantic - 2 views

  • Predictive statistical analysis, harnessed to big data, appears poised to alter the way millions of people are hired and assessed.
  • By one estimate, more than 98 percent of the world’s information is now stored digitally, and the volume of that data has quadrupled since 2007.
  • The application of predictive analytics to people’s careers—an emerging field sometimes called “people analytics”—is enormously challenging, not to mention ethically fraught
  • By the end of World War II, however, American corporations were facing severe talent shortages. Their senior executives were growing old, and a dearth of hiring from the Depression through the war had resulted in a shortfall of able, well-trained managers. Finding people who had the potential to rise quickly through the ranks became an overriding preoccupation of American businesses. They began to devise a formal hiring-and-management system based in part on new studies of human behavior, and in part on military techniques developed during both world wars, when huge mobilization efforts and mass casualties created the need to get the right people into the right roles as efficiently as possible. By the 1950s, it was not unusual for companies to spend days with young applicants for professional jobs, conducting a battery of tests, all with an eye toward corner-office potential.
  • But companies abandoned their hard-edged practices for another important reason: many of their methods of evaluation turned out not to be very scientific.
  • this regime, so widespread in corporate America at mid-century, had almost disappeared by 1990. “I think an HR person from the late 1970s would be stunned to see how casually companies hire now,”
  • Many factors explain the change, he said, and then he ticked off a number of them: Increased job-switching has made it less important and less economical for companies to test so thoroughly. A heightened focus on short-term financial results has led to deep cuts in corporate functions that bear fruit only in the long term. The Civil Rights Act of 1964, which exposed companies to legal liability for discriminatory hiring practices, has made HR departments wary of any broadly applied and clearly scored test that might later be shown to be systematically biased.
  • about a quarter of the country’s corporations were using similar tests to evaluate managers and junior executives, usually to assess whether they were ready for bigger roles.
  • He has encouraged the company’s HR executives to think about applying the games to the recruitment and evaluation of all professional workers.
  • Knack makes app-based video games, among them Dungeon Scrawl, a quest game requiring the player to navigate a maze and solve puzzles, and Wasabi Waiter, which involves delivering the right sushi to the right customer at an increasingly crowded happy hour. These games aren’t just for play: they’ve been designed by a team of neuroscientists, psychologists, and data scientists to suss out human potential. Play one of them for just 20 minutes, says Guy Halfteck, Knack’s founder, and you’ll generate several megabytes of data, exponentially more than what’s collected by the SAT or a personality test. How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality. The end result, Halfteck says, is a high-resolution portrait of your psyche and intellect, and an assessment of your potential as a leader or an innovator.
  • When the results came back, Haringa recalled, his heart began to beat a little faster. Without ever seeing the ideas, without meeting or interviewing the people who’d proposed them, without knowing their title or background or academic pedigree, Knack’s algorithm had identified the people whose ideas had panned out. The top 10 percent of the idea generators as predicted by Knack were in fact those who’d gone furthest in the process.
  • What Knack is doing, Haringa told me, “is almost like a paradigm shift.” It offers a way for his GameChanger unit to avoid wasting time on the 80 people out of 100—nearly all of whom look smart, well-trained, and plausible on paper—whose ideas just aren’t likely to work out.
  • Aptitude, skills, personal history, psychological stability, discretion, loyalty—companies at the time felt they had a need (and the right) to look into them all. That ambit is expanding once again, and this is undeniably unsettling. Should the ideas of scientists be dismissed because of the way they play a game? Should job candidates be ranked by what their Web habits say about them? Should the “data signature” of natural leaders play a role in promotion? These are all live questions today, and they prompt heavy concerns: that we will cede one of the most subtle and human of skills, the evaluation of the gifts and promise of other people, to machines; that the models will get it wrong; that some people will never get a shot in the new workforce.
  • scoring distance from work could violate equal-employment-opportunity standards. Marital status? Motherhood? Church membership? “Stuff like that,” Meyerle said, “we just don’t touch”—at least not in the U.S., where the legal environment is strict. Meyerle told me that Evolv has looked into these sorts of factors in its work for clients abroad, and that some of them produce “startling results.”
  • consider the alternative. A mountain of scholarly literature has shown that the intuitive way we now judge professional potential is rife with snap judgments and hidden biases, rooted in our upbringing or in deep neurological connections that doubtless served us well on the savanna but would seem to have less bearing on the world of work.
  • We may like to think that society has become more enlightened since those days, and in many ways it has, but our biases are mostly unconscious, and they can run surprisingly deep. Consider race. For a 2004 study called “Are Emily and Greg More Employable Than Lakisha and Jamal?,” the economists Sendhil Mullainathan and Marianne Bertrand put white-sounding names (Emily Walsh, Greg Baker) or black-sounding names (Lakisha Washington, Jamal Jones) on similar fictitious résumés, which they then sent out to a variety of companies in Boston and Chicago. To get the same number of callbacks, they learned, they needed to either send out half again as many résumés with black names as those with white names, or add eight extra years of relevant work experience to the résumés with black names.
  • Rivera, a sociologist at Northwestern, spent parts of the three years from 2006 to 2008 interviewing professionals from elite investment banks, consultancies, and law firms about how they recruited, interviewed, and evaluated candidates, and concluded that among the most important factors driving their hiring recommendations were—wait for it—shared leisure interests.
  • Lacking “reliable predictors of future performance,” Rivera writes, “assessors purposefully used their own experiences as models of merit.” Former college athletes “typically prized participation in varsity sports above all other types of involvement.” People who’d majored in engineering gave engineers a leg up, believing they were better prepared.
  • the prevailing system of hiring and management in this country involves a level of dysfunction that should be inconceivable in an economy as sophisticated as ours. Recent survey data collected by the Corporate Executive Board, for example, indicate that nearly a quarter of all new hires leave their company within a year of their start date, and that hiring managers wish they’d never extended an offer to one out of every five members on their team
  • In the late 1990s, as these assessments shifted from paper to digital formats and proliferated, data scientists started doing massive tests of what makes for a successful customer-support technician or salesperson. This has unquestionably improved the quality of the workers at many firms.
  • In 2010, however, Xerox switched to an online evaluation that incorporates personality testing, cognitive-skill assessment, and multiple-choice questions about how the applicant would handle specific scenarios that he or she might encounter on the job. An algorithm behind the evaluation analyzes the responses, along with factual information gleaned from the candidate’s application, and spits out a color-coded rating: red (poor candidate), yellow (middling), or green (hire away). Those candidates who score best, I learned, tend to exhibit a creative but not overly inquisitive personality, and participate in at least one but not more than four social networks, among many other factors. (Previous experience, one of the few criteria that Xerox had explicitly screened for in the past, turns out to have no bearing on either productivity or retention
  • When Xerox started using the score in its hiring decisions, the quality of its hires immediately improved. The rate of attrition fell by 20 percent in the initial pilot period, and over time, the number of promotions rose. Xerox still interviews all candidates in person before deciding to hire them, Morse told me, but, she added, “We’re getting to the point where some of our hiring managers don’t even want to interview anymore.” (A toy version of such a color-coded scoring rule appears in the sketch after this list.)
  • Gone are the days, Ostberg told me, when, say, a small survey of college students would be used to predict the statistical validity of an evaluation tool. “We’ve got a data set of 347,000 actual employees who have gone through these different types of assessments or tools,” he told me, “and now we have performance-outcome data, and we can split those and slice and dice by industry and location.”
  • Evolv’s tests allow companies to capture data about everybody who applies for work, and everybody who gets hired—a complete data set from which sample bias, long a major vexation for industrial-organization psychologists, simply disappears. The sheer number of observations that this approach makes possible allows Evolv to say with precision which attributes matter more to the success of retail-sales workers (decisiveness, spatial orientation, persuasiveness) or customer-service personnel at call centers (rapport-building)
  • There are some data that Evolv simply won’t use, out of a concern that the information might lead to systematic bias against whole classes of people
  • the idea that hiring was a science fell out of favor. But now it’s coming back, thanks to new technologies and methods of analysis that are cheaper, faster, and much-wider-ranging than what we had before
  • what most excites him are the possibilities that arise from monitoring the entire life cycle of a worker at any given company.
  • Now the two companies are working together to marry pre-hire assessments to an increasing array of post-hire data: about not only performance and duration of service but also who trained the employees; who has managed them; whether they were promoted to a supervisory role, and how quickly; how they performed in that role; and why they eventually left.
  • What begins with an online screening test for entry-level workers ends with the transformation of nearly every aspect of hiring, performance assessment, and management.
  • I turned to Sandy Pentland, the director of the Human Dynamics Laboratory at MIT. In recent years, Pentland has pioneered the use of specialized electronic “badges” that transmit data about employees’ interactions as they go about their days. The badges capture all sorts of information about formal and informal conversations: their length; the tone of voice and gestures of the people involved; how much those people talk, listen, and interrupt; the degree to which they demonstrate empathy and extroversion; and more. Each badge generates about 100 data points a minute.
  • he tried the badges out on about 2,500 people, in 21 different organizations, and learned a number of interesting lessons. About a third of team performance, he discovered, can usually be predicted merely by the number of face-to-face exchanges among team members. (Too many is as much of a problem as too few.) Using data gathered by the badges, he was able to predict which teams would win a business-plan contest, and which workers would (rightly) say they’d had a “productive” or “creative” day. Not only that, but he claimed that his researchers had discovered the “data signature” of natural leaders, whom he called “charismatic connectors” and all of whom, he reported, circulate actively, give their time democratically to others, engage in brief but energetic conversations, and listen at least as much as they talk.
  • His group is developing apps to allow team members to view their own metrics more or less in real time, so that they can see, relative to the benchmarks of highly successful employees, whether they’re getting out of their offices enough, or listening enough, or spending enough time with people outside their own team.
  • Torrents of data are routinely collected by American companies and now sit on corporate servers, or in the cloud, awaiting analysis. Bloomberg reportedly logs every keystroke of every employee, along with their comings and goings in the office. The Las Vegas casino Harrah’s tracks the smiles of the card dealers and waitstaff on the floor (its analytics team has quantified the impact of smiling on customer satisfaction). E‑mail, of course, presents an especially rich vein to be mined for insights about our productivity, our treatment of co-workers, our willingness to collaborate or lend a hand, our patterns of written language, and what those patterns reveal about our intelligence, social skills, and behavior.
  • people analytics will ultimately have a vastly larger impact on the economy than the algorithms that now trade on Wall Street or figure out which ads to show us. He reminded me that we’ve witnessed this kind of transformation before in the history of management science. Near the turn of the 20th century, both Frederick Taylor and Henry Ford famously paced the factory floor with stopwatches, to improve worker efficiency.
  • “The quantities of data that those earlier generations were working with,” he said, “were infinitesimal compared to what’s available now. There’s been a real sea change in the past five years, where the quantities have just grown so large—petabytes, exabytes, zetta—that you start to be able to do things you never could before.”
  • People analytics will unquestionably provide many workers with more options and more power. Gild, for example, helps companies find undervalued software programmers, working indirectly to raise those people’s pay. Other companies are doing similar work. One called Entelo, for instance, specializes in using algorithms to identify potentially unhappy programmers who might be receptive to a phone call.
  • He sees it not only as a boon to a business’s productivity and overall health but also as an important new tool that individual employees can use for self-improvement: a sort of radically expanded The 7 Habits of Highly Effective People, custom-written for each of us, or at least each type of job, in the workforce.
  • the most exotic development in people analytics today is the creation of algorithms to assess the potential of all workers, across all companies, all the time.
  • The way Gild arrives at these scores is not simple. The company’s algorithms begin by scouring the Web for any and all open-source code, and for the coders who wrote it. They evaluate the code for its simplicity, elegance, documentation, and several other factors, including the frequency with which it’s been adopted by other programmers. For code that was written for paid projects, they look at completion times and other measures of productivity. Then they look at questions and answers on social forums such as Stack Overflow, a popular destination for programmers seeking advice on challenging projects. They consider how popular a given coder’s advice is, and how widely that advice ranges.
  • The algorithms go further still. They assess the way coders use language on social networks from LinkedIn to Twitter; the company has determined that certain phrases and words used in association with one another can distinguish expert programmers from less skilled ones. Gild knows these phrases and words are associated with good coding because it can correlate them with its evaluation of open-source code, and with the language and online behavior of programmers in good positions at prestigious companies.
  • having made those correlations, Gild can then score programmers who haven’t written open-source code at all, by analyzing the host of clues embedded in their online histories. They’re not all obvious, or easy to explain. Vivienne Ming, Gild’s chief scientist, told me that one solid predictor of strong coding is an affinity for a particular Japanese manga site.
  • Gild’s CEO, Sheeroy Desai, told me he believes his company’s approach can be applied to any occupation characterized by large, active online communities, where people post and cite individual work, ask and answer professional questions, and get feedback on projects. Graphic design is one field that the company is now looking at, and many scientific, technical, and engineering roles might also fit the bill. Regardless of their occupation, most people leave “data exhaust” in their wake, a kind of digital aura that can reveal a lot about a potential hire.
  • professionally relevant personality traits can be judged effectively merely by scanning Facebook feeds and photos. LinkedIn, of course, captures an enormous amount of professional data and network information, across just about every profession. A controversial start-up called Klout has made its mission the measurement and public scoring of people’s online social influence.
  • Mullainathan expressed amazement at how little most creative and professional workers (himself included) know about what makes them effective or ineffective in the office. Most of us can’t even say with any certainty how long we’ve spent gathering information for a given project, or our pattern of information-gathering, never mind know which parts of the pattern should be reinforced, and which jettisoned. As Mullainathan put it, we don’t know our own “production function.”
  • Over time, better job-matching technologies are likely to begin serving people directly, helping them see more clearly which jobs might suit them and which companies could use their skills. In the future, Gild plans to let programmers see their own profiles and take skills challenges to try to improve their scores. It intends to show them its estimates of their market value, too, and to recommend coursework that might allow them to raise their scores even more. Not least, it plans to make accessible the scores of typical hires at specific companies, so that software engineers can better see the profile they’d need to land a particular job
  • Knack, for its part, is making some of its video games available to anyone with a smartphone, so people can get a better sense of their strengths, and of the fields in which their strengths would be most valued. (Palo Alto High School recently adopted the games to help students assess careers.) Ultimately, the company hopes to act as matchmaker between a large network of people who play its games (or have ever played its games) and a widening roster of corporate clients, each with its own specific profile for any given type of job.
  • When I began my reporting for this story, I was worried that people analytics, if it worked at all, would only widen the divergent arcs of our professional lives, further gilding the path of the meritocratic elite from cradle to grave, and shutting out some workers more definitively. But I now believe the opposite is likely to happen, and that we’re headed toward a labor market that’s fairer to people at every stage of their careers
  • For decades, as we’ve assessed people’s potential in the professional workforce, the most important piece of data—the one that launches careers or keeps them grounded—has been educational background: typically, whether and where people went to college, and how they did there. Over the past couple of generations, colleges and universities have become the gatekeepers to a prosperous life. A degree has become a signal of intelligence and conscientiousness, one that grows stronger the more selective the school and the higher a student’s GPA, that is easily understood by employers, and that, until the advent of people analytics, was probably unrivaled in its predictive powers.
  • the limitations of that signal—the way it degrades with age, its overall imprecision, its many inherent biases, its extraordinary cost—are obvious. “Academic environments are artificial environments,” Laszlo Bock, Google’s senior vice president of people operations, told The New York Times in June. “People who succeed there are sort of finely trained, they’re conditioned to succeed in that environment,” which is often quite different from the workplace.
  • because one’s college history is such a crucial signal in our labor market, perfectly able people who simply couldn’t sit still in a classroom at the age of 16, or who didn’t have their act together at 18, or who chose not to go to graduate school at 22, routinely get left behind for good. That such early factors so profoundly affect career arcs and hiring decisions made two or three decades later is, on its face, absurd.
  • I spoke with managers at a lot of companies who are using advanced analytics to reevaluate and reshape their hiring, and nearly all of them told me that their research is leading them toward pools of candidates who didn’t attend college—for tech jobs, for high-end sales positions, for some managerial roles. In some limited cases, this is because their analytics revealed no benefit whatsoever to hiring people with college degrees; in other cases, and more often, it’s because they revealed signals that function far better than college history,
  • Google, too, is hiring a growing number of nongraduates. Many of the people I talked with reported that when it comes to high-paying and fast-track jobs, they’re reducing their preference for Ivy Leaguers and graduates of other highly selective schools.
  • This process is just beginning. Online courses are proliferating, and so are online markets that involve crowd-sourcing. Both arenas offer new opportunities for workers to build skills and showcase competence. Neither produces the kind of instantly recognizable signals of potential that a degree from a selective college, or a first job at a prestigious firm, might. That’s a problem for traditional hiring managers, because sifting through lots of small signals is so difficult and time-consuming.
  • all of these new developments raise philosophical questions. As professional performance becomes easier to measure and see, will we become slaves to our own status and potential, ever-focused on the metrics that tell us how and whether we are measuring up? Will too much knowledge about our limitations hinder achievement and stifle our dreams? All I can offer in response to these questions, ironically, is my own gut sense, which leads me to feel cautiously optimistic.
  • Google’s understanding of the promise of analytics is probably better than anybody else’s, and the company has been changing its hiring and management practices as a result of its ongoing analyses. (Brainteasers are no longer used in interviews, because they do not correlate with job success; GPA is not considered for anyone more than two years out of school, for the same reason—the list goes on.) But for all of Google’s technological enthusiasm, these same practices are still deeply human. A real, live person looks at every résumé the company receives. Hiring decisions are made by committee and are based in no small part on opinions formed during structured interviews.
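To make the screening mechanics above a little more concrete, here is a minimal sketch, in Python, of the kind of feature-based red/yellow/green rating the Xerox excerpt describes. Every feature name, weight, and cutoff below is invented for illustration; the actual models are proprietary and, as the Evolv excerpts note, are fit statistically on hundreds of thousands of employee records rather than hand-weighted.

    # Minimal sketch of a feature-based applicant rating, loosely modeled on the
    # red/yellow/green screening described above. All feature names, weights,
    # and cutoffs are hypothetical.

    def score_applicant(features: dict) -> str:
        """Return 'green', 'yellow', or 'red' for an applicant's feature dict."""
        score = 0.0

        # Personality and cognitive signals (illustrative weights, each on a 0-1 scale).
        score += 2.0 * features.get("creativity", 0.0)
        score -= 1.0 * features.get("inquisitiveness", 0.0)  # "creative but not overly inquisitive"
        score += 1.5 * features.get("cognitive_test", 0.0)

        # Social-network membership: at least one but not more than four counts as a plus.
        n_networks = features.get("social_networks", 0)
        score += 1.0 if 1 <= n_networks <= 4 else -1.0

        # Prior experience is deliberately absent: per the excerpt, it had no
        # bearing on productivity or retention.

        if score >= 3.0:
            return "green"   # hire away
        if score >= 1.5:
            return "yellow"  # middling
        return "red"         # poor candidate

    if __name__ == "__main__":
        candidate = {"creativity": 0.8, "inquisitiveness": 0.3,
                     "cognitive_test": 0.7, "social_networks": 2}
        print(score_applicant(candidate))  # -> green

The point of the sketch is only to show the shape of the pipeline: application answers become a feature vector, the vector becomes a score, and the score becomes a coarse hiring signal that a manager may or may not override with an interview.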
Javier E

For Trump and G.O.P., the Welfare State Shouldn't Be the Enemy - The New York Times - 0 views

  • Historically, however, the level of government spending and the level of regulation have been packaged together and treated as a single variable. This has forced a choice between two options: the “liberal” package of big government and heavy regulation or the “conservative” package of small government and light regulation.
  • But this is a false choice. Regulatory policy and fiscal policy are independent dimensions, and they can be rebundled in different packages. Mr. Trump’s gestures toward a big-government, low-regulation package — rooted more in instinct than intellect — proved popular with Republican voters
  • Government spending reliably rises as economies grow. When countries get richer, one of the first things their people do is vote for more generous government social services. This pattern, which economists have labeled Wagner’s Law, has held more or less steady for a century in dozens of developed democratic countries.
  • Republicans need to recognize finally that secure property rights, openness to global trade and a relatively low regulatory burden are much more important than fiscal policy for innovation, job creation and rising standards of living.
  • not only are sound safety nets popular, but they also increase the public’s tolerance for the dislocations of a dynamic free-market economy
  • Third, the idea that reducing taxpayer-financed government spending is the key to giving people more freedom and revving up the economy encourages conservative hostility to government as such
  • The Republican legislative agenda is stalled because party members have boxed themselves in with their own bad ideas about what freedom and rising prosperity require. A new pro-growth economic platform that sets aside small-government monomania and focuses instead on protecting citizens’ basic rights to commit “capitalist acts between consenting adults,” as the libertarian philosopher Robert Nozick put it, has both practical and political advantages
  • a generous and effective safety net can be embraced as a tool to promote and sustain a culture of freedom, innovation and risk taking. Politically, repairing and improving the slipshod infrastructure of the safety net would liberate Republicans from the bad faith of attacking the welfare state in one breath, halfheartedly promising not to cut entitlements in the next and then breaking that promise once in power.
ilanaprincilus06

Rate Of Gun Violence Deaths In U.S. Is Higher Than Much Of The World : Goats and Soda :... - 1 views

  • The horrific mass shooting events in the Atlanta area and Boulder, Colo., just days apart have once again shown a spotlight on how frequent this type of violence is in the United States compared with other wealthy countries.
  • The U.S. has the 32nd-highest rate of deaths from gun violence in the world: 3.96 deaths per 100,000 people in 2019.
  • In the District of Columbia, the rate is 18.5 per 100,000 — the highest in the United States.
  • "If you compare us to other well-off countries, we really stand out."
  • with deaths due to gun violence rare even in many low-income countries — such as Tajikistan and Gambia, which saw 0.18 deaths and 0.22 deaths, respectively, per 100,000 people.
  • "It is a little surprising that a country like ours should have this level of gun violence,"
  • Prosperous Asian countries such as Singapore (0.01), Japan (0.02) and South Korea (0.02) boast the absolute lowest rates — along with China, also at 0.02.
  • Even with the casualties of armed conflicts factored out, the U.S. rate is worse than the rates in conflict-ridden regions such as the Middle East.
  • The U.S. gun violence death rate is also higher than in nearly all countries in sub-Saharan Africa, including many that are among the world's poorest.
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived. (A back-of-the-envelope sketch of this pay-per-click arithmetic appears after this list.)
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts.
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way.
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency.
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
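As a rough illustration of why "even tiny increases in click rates would bring big gains in income" under a pay-per-click auction, here is a back-of-the-envelope sketch in Python. The impression volume and cost-per-click figures are hypothetical; they are chosen only to show that revenue scales linearly with predicted click-through rate, so small prediction improvements compound into large sums at scale.

    # Pay-per-click economics: the advertiser pays only when a user clicks,
    # so expected revenue = impressions x click-through rate x cost per click.
    # All numbers below are hypothetical.

    def ad_revenue(impressions: int, ctr: float, cost_per_click: float) -> float:
        """Expected revenue when the advertiser pays only for clicks."""
        return impressions * ctr * cost_per_click

    daily_impressions = 5_000_000_000   # hypothetical daily ad impressions
    cost_per_click = 0.50               # hypothetical average cost per click, in dollars

    baseline = ad_revenue(daily_impressions, ctr=0.020, cost_per_click=cost_per_click)
    improved = ad_revenue(daily_impressions, ctr=0.021, cost_per_click=cost_per_click)

    print(f"baseline: ${baseline:,.0f} per day")
    print(f"improved: ${improved:,.0f} per day")
    print(f"gain:     ${improved - baseline:,.0f} per day from a 0.1-point CTR increase")

Under these made-up numbers, moving from a 2.0 percent to a 2.1 percent click-through rate is worth about $2.5 million a day, which is the financial logic behind the "extraction imperative" the excerpts describe.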
caelengrubb

Why it's time to stop worrying about the decline of the English language | Language | T... - 0 views

  • Now imagine that something even more fundamental than electricity or money is at risk: a tool we have relied on since the dawn of human history, enabling the very foundations of civilisation to be laid
  • I’m talking about our ability to communicate – to put our thoughts into words, and to use those words to forge bonds, to deliver vital information, to learn from our mistakes and build on the work done by others.
  • “Their language is deteriorating. They are lowering the bar. Our language is flying off at all tangents, without the anchor of a solid foundation.”
  • Although it is at pains to point out that it does not believe language can be preserved unchanged, it worries that communication is at risk of becoming far less effective. “Some changes would be wholly unacceptable, as they would cause confusion and the language would lose shades of meaning
  • “Without grammar, we lose the agreed-upon standards about what means what. We lose the ability to communicate when respondents are not actually in the same room speaking to one another. Without grammar, we lose the precision required to be effective and purposeful in writing.”
  • At the same time, our laziness and imprecision are leading to unnecessary bloating of the language – “language obesity,”
  • That’s five writers, across a span of 400 years, all moaning about the same erosion of standards. And yet the period also encompasses some of the greatest works of English literature.
  • Since then, the English-speaking world has grown more prosperous, better educated and more efficiently governed, despite an increase in population. Most democratic freedoms have been preserved and intellectual achievement intensified.
  • Linguistic decline is the cultural equivalent of the boy who cried wolf, except the wolf never turns up
  • Our language will always be as flexible and sophisticated as it has been up to now. Those who warn about the deterioration of English haven’t learned about the history of the language, and don’t understand the nature of their own complaints – which are simply statements of preference for the way of doing things they have become used to.
  • But the problem is that writers at that time also felt they were speaking a degraded, faltering tongue
  • Seventy-odd years ago, people knew their grammar and knew how to talk clearly. And, if we follow the logic, they must also have been better at organising, finding things out and making things work.
  • Hand-wringing about standards is not restricted to English. The fate of every language in the world has been lamented by its speakers at some point or another.
  • “For more than 2,000 years, complaints about the decay of respective languages have been documented in literature, but no one has yet been able to name an example of a ‘decayed language’.” He has a point.
  • One common driver of linguistic change is a process called reanalysis.
  • Another form that linguistic change often takes is grammaticalisation: a process in which a common phrase is bleached of its independent meaning and made into a word with a solely grammatical function
  • One instance of this is the verb “to go”, when used for an action in the near future or an intention.
  • Human anatomy makes some changes to language more likely than others. The simple mechanics of moving from a nasal sound (m or n) to a non-nasal one can make a consonant pop up in between
  • The way our brain divides up words also drives change. We split them into phonemes (building blocks of sound that have special perceptual significance) and syllables (groups of phonemes).
  • Sound changes can come about as a result of social pressures: certain ways of saying things are seen as having prestige, while others are stigmatised. We gravitate towards the prestigious, and make efforts to avoid saying things in a way that is associated with undesirable qualities – often just below the level of consciousness.
  • The problem arises when deciding what might be good or bad. There are, despite what many people feel, no objective criteria by which to judge what is better or worse in communication
  • Though we are all capable of adaptation, many aspects of the way we use language, including stylistic preferences, have solidified by our 20s. If you are in your 50s, you may identify with many aspects of the way people spoke 30-45 years ago.
  • The irony is, of course, that the pedants are the ones making the mistakes. To people who know how language works, pundits such as Douglas Rushkoff only end up sounding ignorant, having failed to really interrogate their views
cvanderloo

Britain, dubbed 'plague island', wants tourists to return | CNN Travel - 0 views

  • Boris Johnson plunged the country into harsh new restrictions, blaming a new variant of the disease that had been spreading in London and the southeast of England since September.
  • Country after country closed their borders to flights from the UK, in a bid to keep the new variant confined to "plague island,"
  • UK travelers are still banned from much of the world -- including EU countries -- because of the homegrown variant.
  • In the end, 2020 saw a 76% decline in visitors and an 80% drop.
  • "A lot of our multi-country trips including England used to fly round-trip to London, and now we're looking to see if from a traveler's perspective that will be the most convenient."
  • DaSilva said that potential Brexit complications were on the radar of travelers' concerns last year, but, with a no-deal averted and the pandemic taking center stage, it's no longer an issue for her guests. In fact, three of the top five most searched trips on their website involve Great Britain.
  • "Early on in the pandemic, people were searching for places that had more open green spaces, like New Zealand and Ireland," she says. "But as news of the vaccine came out and people became more confident about trips for this year, England popped back up to the top."
  • And there's one big bonus for those traveling to the UK this year --- the tanking pound.
  • "The UK made a great play that it was an international and welcoming destination over the 2012 Olympics, but that message was withdrawn with Brexit. The posturing of the government -- especially the threat to put gunboats in the Channel -- didn't play well with a lot of origin markets," he says.
  • "There may be differences with the import of goods and transmission of services that means London isn't as prosperous as it was."
  • "Suddenly, using the UK as gateway to Europe becomes enormously less attractive. Travelers will have to think about whether it's sensible to come to the UK as part of a European destination. They may wish to look at the UK as a single destination, but that isn't nearly as attractive as the UK being part of a European vacation."
  • "The UK won't be ignored, but it's unlikely to recover as strongly as Europe.
  • About 20% of the Intercontinental's staff left the UK before Brexit, says Ouseph; but while in normal times that would be a crisis, he thinks that Covid-induced job losses will mean hotels can fill these positions for now -- at least, the customer-facing ones. Instead, it's the less visible, but crucial roles, where they'll struggle.
  • Not everyone thinks Brexit will make a big difference to the inbound UK travel industry.
  • Maine -- who hasn't run tours since October -- says that he thinks the vaccine "will get us out of it -- it's a matter of when, not if." And he predicts that "when" could be as early as Easter.
  • "Rolling out the vaccine is the acid test of being a coherent holiday destination, and the UK looks like it's doing a reasonably good job in comparison to everyone else."