
TOK Friends: Group items matching "open-source" in title, tags, annotations or url


What to Do About 'Coming Apart' - NYTimes.com - 0 views

  • Murray has produced a book-length argument placing responsibility for rising inequality and declining mobility on widespread decay in the moral fiber of white, lower-status, less well-educated Americans, putting relatively less emphasis on a similar social breakdown among low-status, less-educated Americans of all races
  • Murray’s strength lies in his ability to raise issues that center-left policy makers and academics prefer, for the most part, to shy away from. His research methods, his statistical analyses and the conclusions he draws are subject to passionate debate. But by forcing taboo issues into the public arena, Murray has opened up for discussion politically salient issues that lurk at a subterranean level in the back-brains of many voters, issues that are rarely examined with the rigor necessary to affirm or deny their legitimacy.
  • The National Review and the Conservative Monitor cited “Losing Ground” as one of the ten books that most changed America. Murray’s book seemed like a bolt of lightning in the middle of the night revealing what should have been plain as the light of day. The welfare state so carefully built up in the 1960s and 1970s created a system of disincentives for people to better their own lives. By paying welfare mothers to have children out of wedlock into a poor home, the system encouraged more of these births. By doling out dollars at a rate that could not be matched by the economy, the system encouraged the poor to stay home.
  • ...9 more annotations...
  • He contends in “Coming Apart” that there was far greater social cohesion across class lines 50 years ago because “the powerful norms of social and economic behavior in 1960 swept virtually everyone into their embrace,” adding in a Jan. 21 op-ed in the Wall Street Journal that: “Over the past 50 years, that common civic culture has unraveled. We have developed a new upper class with advanced educations, often obtained at elite schools, sharing tastes and preferences that set them apart from mainstream America. At the same time, we have developed a new lower class, characterized not by poverty but by withdrawal from America’s core cultural institutions.” According to Murray, higher education has now become a proxy for higher IQ, as elite colleges become sorting mechanisms for finding, training and introducing to each other the most intellectually gifted young people. Fifty years into the education revolution, members of this elite are likely to be themselves the offspring of cognitively gifted parents, and to ultimately bear cognitively gifted children.
  • “Industriousness: The norms for work and women were revolutionized after 1960, but the norm for men putatively has remained the same: Healthy men are supposed to work. In practice, though, that norm has eroded everywhere.”
  • Murray makes the case that cognitive ability is worth ever more in modern advanced, technologically complex hypercompetitive market economies. As an example, Murray quotes Bill Gates: “Software is an IQ business. Microsoft must win the IQ war or we won’t have a future.”
  • Murray alleges that those with higher IQs now exhibit personal and social behavioral choices in areas like marriage, industriousness, honesty and religiosity that allow them to enjoy secure and privileged lives. Whites in the lower socioeconomic strata are less cognitively able – in Murray’s view – and thus less well-equipped to resist the lure of the sexual revolution and doctrines of self-actualization, so they succumb to higher rates of family dissolution, non-marital births, worklessness and criminality. This interaction between IQ and behavioral choice, in Murray’s framework, is what has led to the widening income and cultural gap.
  • Despised by the left, Murray has arguably done liberals a service by requiring them to deal with those whose values may seem alien, to examine the unintended consequences of their policies and to grapple with the political impact of assertions made by the right. He has also amassed substantial evidence to bolster his claims and at the same time elicited a formidable academic counter-attack.
  • To Murray, the overarching problem is that liberal elites, while themselves living lives of probity, have refused to proselytize for the bourgeois virtues to which they subscribe, thus leaving their less discerning fellow-citizens to flounder in the anti-bourgeois legacy of the counter-cultural 1960s.
  • “Great Civic Awakening” among the new upper class – an awakening that will lead to the kind of “moral rearmament” and paternalism characteristic of anti-poverty drives in the 19th century. To achieve this, Murray believes, the “new upper class must once again fall in love with what makes America different.”
  • The cognitive elites Murray cites are deeply committed to liberal norms of cultural tolerance and permissiveness. The antipathy to the moralism of the religious right has, in fact, been a major force driving these upscale, secular voters into the Democratic party.
  • changes in the world economy may be destructive in terms of the old social model, but they are profoundly liberating and benign in and of themselves. The family farm wasn’t dying because capitalism had failed or a Malthusian crisis was driving the world to starvation. The family farm died of abundance; it died of the rapidly rising productivity that meant that fewer and fewer people had to work to produce the food on which humanity depended. Mead continues: “Revolutions in manufacturing and, above all, in communications and information technology create the potential for unprecedented abundance and a further liberation of humanity from meaningless and repetitive work. Our problem isn’t that the sources of prosperity have dried up in a long drought; our problem is that we don’t know how to swim. It is raining soup, and we are stuck holding a fork.” The 21st century, Mead adds, “must reinvent the American Dream. It must recast our economic, social, familial, educational and political systems for new challenges and new opportunities. Some hallowed practices and institutions will have to go under the bus. But in the end, the changes will make us richer, more free and more secure than we are now.” Mead’s predictions may or may not prove prescient, but it is his thinking, more than Murray’s, that reflects the underlying optimism that has sustained the United States for more than two centuries — a refusal to believe that anything about human nature is essentially “intractable.” Mead’s way of looking at things is not only more inviting than Murray’s, it is also more on target.

Beating History: Why Today's Rising Powers Can't Copy the West - Heather Horn - Interna... - 0 views

  • For the BRIC rising economies -- Brazil, Russia, India, and China -- what can be learned by looking at the rise of powers throughout history?
  • production in "all organic economies was set by the annual cycle of plant growth" -- it limits food, fuel, fiber, and building materials. Coal changed all that. By digging into the earth to get minerals instead of growing fuel on the earth, you get a vastly more efficient source of fuel and completely change the rules of the game. You've shifted from an organic economy, as he calls it, to an energy-rich economy. But the economic reprieve the fossil fuels offered could be nearing an end, as global supply becomes more competitive.
  • Historians still debate the nature and causes of the Industrial Revolution, but one thing they seem to agree on is that it wasn’t just industrial -- it was demographic and agricultural as well. Prior to the Industrial Revolution, populations all over the globe had non-negotiable checks on their growth: too many people and you get ecological crises and famines to push the number back down. In the 18th and 19th centuries, England managed to solve this problem, with tremendous leaps in population and urbanization as a result.
  • ...3 more annotations...
  • What the rise of the BRICs symbolizes to both panicked individuals in the West and optimistic ones elsewhere is a radical shift in the geography of power -- a catch-up or reversal of the Western global dominance that was established with the Industrial Revolution.
  • developing countries won't be able to follow the West's path to becoming rich, because that path required certain things that were largely particular to that one period in history.
  • The challenge ahead for the BRICs, then, is to figure out how to maintain growth in a world where the vast new frontier opened up by the Industrial Revolution appears to be closing. The BRICs can play the West's game better than the West, both through technological innovation and population growth, but only for so long. The whole world has to figure out a way of dealing with energy and agriculture.

AGNI Online: Open to Influence: Jonathan Lethem on Reading, Writing and Concepts of Ori... - 0 views

  • I think originality is a word of praise for things that have been expressed in a marvelous way and that make points of origin for any particular element beside the point.  When you read Saul Bellow or listen to Bob Dylan sing, you can have someone point to various cribbings and it won’t matter, because something has been arrived at which subsumes and incorporates and transcends these matters.  In that way, sourcing and originality are two sides of the same coin, they’re a nested partnership.  I don’t think originality has any value as a description of process.  In that regard it’s as meaningless a process word as beauty is.  No artist says, “Let me sit down and do some beauty now.”

Anger for Path Social Network After Privacy Breach - NYTimes.com - 0 views

  • bloggers in Egypt and Tunisia are often approached online by people who are state security in disguise.
  • The most sought-after bounty for state officials: dissidents’ address books, to figure out who they are in cahoots with, where they live and information about their family. In some cases, this information leads to roundups and arrests.
  • A person’s contacts are so sensitive that Alec Ross, a senior adviser on innovation to Secretary of State Hillary Rodham Clinton, said the State Department was supporting the development of an application that would act as a “panic button” on a smartphone, enabling people to erase all contacts with one click if they are arrested during a protest.
  • ...2 more annotations...
  • The big deal is that privacy and security is not a big deal in Silicon Valley. While technorati tripped over themselves to congratulate Mr. Morin on finessing the bad publicity, a number of concerned engineers e-mailed me noting that the data collection was not an accident. It would have taken programmers weeks to write the code necessary to copy and organize someone’s address book. Many said Apple was at fault, too, for approving Path for its App Store when it appears to violate its rules.
  • Lawyers I spoke with said that my address book — which contains my reporting sources at companies and in government — is protected under the First Amendment. On Path’s servers, it is frightfully open for anyone to see and use, because the company did not encrypt the data.

And Suddenly, The Door Just Gives Way « The Dish - 0 views

  • The poll above (source here), conducted last December, is arguably the critical one. It’s of Americans living in the states that ban marriage rights for gay couples. Commissioned by Freedom To Marry, it reveals the seismic shift of the last few years.
  • Something has fundamentally changed since the late 1980s when I first made this argument. Gay people have become human in the eyes of most straights. Not perfect and not identical – but human in our capacity for love and commitment.
  • that is not, in the end, a political gain. It is a moral one. And it reveals, once again, that those who despair of persuading resistant majorities of core moral arguments in America are wrong. Americans, in the end, are open to persuasion.

George Packer: Is Amazon Bad for Books? : The New Yorker - 0 views

  • Amazon is a global superstore, like Walmart. It’s also a hardware manufacturer, like Apple, and a utility, like Con Edison, and a video distributor, like Netflix, and a book publisher, like Random House, and a production studio, like Paramount, and a literary magazine, like The Paris Review, and a grocery deliverer, like FreshDirect, and someday it might be a package service, like U.P.S. Its founder and chief executive, Jeff Bezos, also owns a major newspaper, the Washington Post. All these streams and tributaries make Amazon something radically new in the history of American business
  • Amazon is not just the “Everything Store,” to quote the title of Brad Stone’s rich chronicle of Bezos and his company; it’s more like the Everything. What remains constant is ambition, and the search for new things to be ambitious about.
  • It wasn’t a love of books that led him to start an online bookstore. “It was totally based on the property of books as a product,” Shel Kaphan, Bezos’s former deputy, says. Books are easy to ship and hard to break, and there was a major distribution warehouse in Oregon. Crucially, there are far too many books, in and out of print, to sell even a fraction of them at a physical store. The vast selection made possible by the Internet gave Amazon its initial advantage, and a wedge into selling everything else.
  • ...38 more annotations...
  • it’s impossible to know for sure, but, according to one publisher’s estimate, book sales in the U.S. now make up no more than seven per cent of the company’s roughly seventy-five billion dollars in annual revenue.
  • A monopoly is dangerous because it concentrates so much economic power, but in the book business the prospect of a single owner of both the means of production and the modes of distribution is especially worrisome: it would give Amazon more control over the exchange of ideas than any company in U.S. history.
  • “The key to understanding Amazon is the hiring process,” one former employee said. “You’re not hired to do a particular job—you’re hired to be an Amazonian. Lots of managers had to take the Myers-Briggs personality tests. Eighty per cent of them came in two or three similar categories, and Bezos is the same: introverted, detail-oriented, engineer-type personality. Not musicians, designers, salesmen. The vast majority fall within the same personality type—people who graduate at the top of their class at M.I.T. and have no idea what to say to a woman in a bar.”
  • According to Marcus, Amazon executives considered publishing people “antediluvian losers with rotary phones and inventory systems designed in 1968 and warehouses full of crap.” Publishers kept no data on customers, making their bets on books a matter of instinct rather than metrics. They were full of inefficiencies, starting with overpriced Manhattan offices.
  • For a smaller house, Amazon’s total discount can go as high as sixty per cent, which cuts deeply into already slim profit margins. Because Amazon manages its inventory so well, it often buys books from small publishers with the understanding that it can’t return them, for an even deeper discount
  • According to one insider, around 2008—when the company was selling far more than books, and was making twenty billion dollars a year in revenue, more than the combined sales of all other American bookstores—Amazon began thinking of content as central to its business. Authors started to be considered among the company’s most important customers. By then, Amazon had lost much of the market in selling music and videos to Apple and Netflix, and its relations with publishers were deteriorating
  • In its drive for profitability, Amazon did not raise retail prices; it simply squeezed its suppliers harder, much as Walmart had done with manufacturers. Amazon demanded ever-larger co-op fees and better shipping terms; publishers knew that they would stop being favored by the site’s recommendation algorithms if they didn’t comply. Eventually, they all did.
  • Brad Stone describes one campaign to pressure the most vulnerable publishers for better terms: internally, it was known as the Gazelle Project, after Bezos suggested “that Amazon should approach these small publishers the way a cheetah would pursue a sickly gazelle.”
  • Without dropping co-op fees entirely, Amazon simplified its system: publishers were asked to hand over a percentage of their previous year’s sales on the site, as “marketing development funds.”
  • The figure keeps rising, though less for the giant pachyderms than for the sickly gazelles. According to the marketing executive, the larger houses, which used to pay two or three per cent of their net sales through Amazon, now relinquish five to seven per cent of gross sales, pushing Amazon’s percentage discount on books into the mid-fifties. Random House currently gives Amazon an effective discount of around fifty-three per cent.
  • In December, 1999, at the height of the dot-com mania, Time named Bezos its Person of the Year. “Amazon isn’t about technology or even commerce,” the breathless cover article announced. “Amazon is, like every other site on the Web, a content play.” Yet this was the moment, Marcus said, when “content” people were “on the way out.”
  • In 2004, he set up a lab in Silicon Valley that would build Amazon’s first piece of consumer hardware: a device for reading digital books. According to Stone’s book, Bezos told the executive running the project, “Proceed as if your goal is to put everyone selling physical books out of a job.”
  • By 2010, Amazon controlled ninety per cent of the market in digital books—a dominance that almost no company, in any industry, could claim. Its prohibitively low prices warded off competition
  • Lately, digital titles have levelled off at about thirty per cent of book sales.
  • The literary agent Andrew Wylie (whose firm represents me) says, “What Bezos wants is to drag the retail price down as low as he can get it—a dollar-ninety-nine, even ninety-nine cents. That’s the Apple play—‘What we want is traffic through our device, and we’ll do anything to get there.’ ” If customers grew used to paying just a few dollars for an e-book, how long before publishers would have to slash the cover price of all their titles?
  • As Apple and the publishers see it, the ruling ignored the context of the case: when the key events occurred, Amazon effectively had a monopoly in digital books and was selling them so cheaply that it resembled predatory pricing—a barrier to entry for potential competitors. Since then, Amazon’s share of the e-book market has dropped, levelling off at about sixty-five per cent, with the rest going largely to Apple and to Barnes & Noble, which sells the Nook e-reader. In other words, before the feds stepped in, the agency model introduced competition to the market
  • But the court’s decision reflected a trend in legal thinking among liberals and conservatives alike, going back to the seventies, that looks at antitrust cases from the perspective of consumers, not producers: what matters is lowering prices, even if that goal comes at the expense of competition. Barry Lynn, a market-policy expert at the New America Foundation, said, “It’s one of the main factors that’s led to massive consolidation.”
  • The combination of ceaseless innovation and low-wage drudgery makes Amazon the epitome of a successful New Economy company. It’s hiring as fast as it can—nearly thirty thousand employees last year.
  • brick-and-mortar retailers employ forty-seven people for every ten million dollars in revenue earned; Amazon employs fourteen.
  • Since the arrival of the Kindle, the tension between Amazon and the publishers has become an open battle. The conflict reflects not only business antagonism amid technological change but a division between the two coasts, with different cultural styles and a philosophical disagreement about what techies call “disruption.”
  • Bezos told Charlie Rose, “Amazon is not happening to bookselling. The future is happening to bookselling.”
  • In Grandinetti’s view, the Kindle “has helped the book business make a more orderly transition to a mixed print and digital world than perhaps any other medium.” Compared with people who work in music, movies, and newspapers, he said, authors are well positioned to thrive. The old print world of scarcity—with a limited number of publishers and editors selecting which manuscripts to publish, and a limited number of bookstores selecting which titles to carry—is yielding to a world of digital abundance. Grandinetti told me that, in these new circumstances, a publisher’s job “is to build a megaphone.”
  • it offers an extremely popular self-publishing platform. Authors become Amazon partners, earning up to seventy per cent in royalties, as opposed to the fifteen per cent that authors typically make on hardcovers. Bezos touts the biggest successes, such as Theresa Ragan, whose self-published thrillers and romances have been downloaded hundreds of thousands of times. But one survey found that half of all self-published authors make less than five hundred dollars a year.
  • The business term for all this clear-cutting is “disintermediation”: the elimination of the “gatekeepers,” as Bezos calls the professionals who get in the customer’s way. There’s a populist inflection to Amazon’s propaganda, an argument against élitist institutions and for “the democratization of the means of production”—a common line of thought in the West Coast tech world
  • “Book publishing is a very human business, and Amazon is driven by algorithms and scale,” Sargent told me. When a house gets behind a new book, “well over two hundred people are pushing your book all over the place, handing it to people, talking about it. A mass of humans, all in one place, generating tremendous energy—that’s the magic potion of publishing. . . . That’s pretty hard to replicate in Amazon’s publishing world, where they have hundreds of thousands of titles.”
  • By producing its own original work, Amazon can sell more devices and sign up more Prime members—a major source of revenue.
  • Like the publishing venture, Amazon Studios set out to make the old “gatekeepers”—in this case, Hollywood agents and executives—obsolete. “We let the data drive what to put in front of customers,” Carr told the Wall Street Journal. “We don’t have tastemakers deciding what our customers should read, listen to, and watch.”
  • book publishers have been consolidating for several decades, under the ownership of media conglomerates like News Corporation, which squeeze them for profits, or holding companies such as Rivergroup, which strip them to service debt. The effect of all this corporatization, as with the replacement of independent booksellers by superstores, has been to privilege the blockbuster.
  • Publishers sometimes pass on this cost to authors, by redefining royalties as a percentage of the publisher’s receipts, not of the book’s list price. Recently, publishers say, Amazon began demanding an additional payment, amounting to approximately one per cent of net sales
  • the long-term outlook is discouraging. This is partly because Americans don’t read as many books as they used to—they are too busy doing other things with their devices—but also because of the relentless downward pressure on prices that Amazon enforces.
  • The digital market is awash with millions of barely edited titles, most of it dreck.
  • Amazon believes that its approach encourages ever more people to tell their stories to ever more people, and turns writers into entrepreneurs; the price per unit might be cheap, but the higher number of units sold, and the accompanying royalties, will make authors wealthier
  • In Friedman’s view, selling digital books at low prices will democratize reading: “What do you want as an author—to sell books to as few people as possible for as much as possible, or for as little as possible to as many readers as possible?”
  • The real talent, the people who are writers because they happen to be really good at writing—they aren’t going to be able to afford to do it.”
  • Seven-figure bidding wars still break out over potential blockbusters, even though these battles often turn out to be follies. The quest for publishing profits in an economy of scarcity drives the money toward a few big books. So does the gradual disappearance of book reviewers and knowledgeable booksellers, whose enthusiasm might have rescued a book from drowning in obscurity. When consumers are overwhelmed with choices, some experts argue, they all tend to buy the same well-known thing.
  • These trends point toward what the literary agent called “the rich getting richer, the poor getting poorer.” A few brand names at the top, a mass of unwashed titles down below, the middle hollowed out: the book business in the age of Amazon mirrors the widening inequality of the broader economy.
  • “If they did, in my opinion they would save the industry. They’d lose thirty per cent of their sales, but they would have an additional thirty per cent for every copy they sold, because they’d be selling directly to consumers. The industry thinks of itself as Procter & Gamble. What gave publishers the idea that this was some big goddam business? It’s not—it’s a tiny little business, selling to a bunch of odd people who read.”
  • Bezos is right: gatekeepers are inherently élitist, and some of them have been weakened, in no small part, because of their complacency and short-term thinking. But gatekeepers are also barriers against the complete commercialization of ideas, allowing new talent the time to develop and learn to tell difficult truths. When the last gatekeeper but one is gone, will Amazon care whether a book is any good? ♦

Beyond Billboards - The Daily Dish | By Andrew Sullivan - 0 views

  • that the force behind all that exists actually intervened in the consciousness of humankind in the form of a man so saturated in godliness that merely being near him healed people of the weight of the world's sins.

If It Feels Right - NYTimes.com - 3 views

  • What’s disheartening is how bad they are at thinking and talking about moral issues.
  • you see the young people groping to say anything sensible on these matters. But they just don’t have the categories or vocabulary to do so.
  • “Not many of them have previously given much or any thought to many of the kinds of questions about morality that we asked,” Smith and his co-authors write. When asked about wrong or evil, they could generally agree that rape and murder are wrong. But, aside from these extreme cases, moral thinking didn’t enter the picture, even when considering things like drunken driving, cheating in school or cheating on a partner.
  • ...8 more annotations...
  • The default position, which most of them came back to again and again, is that moral choices are just a matter of individual taste. “It’s personal,” the respondents typically said. “It’s up to the individual. Who am I to say?”
  • “I would do what I thought made me happy or how I felt. I have no other way of knowing what to do but how I internally feel.”
  • their attitudes at the start of their adult lives do reveal something about American culture. For decades, writers from different perspectives have been warning about the erosion of shared moral frameworks and the rise of an easygoing moral individualism. Allan Bloom and Gertrude Himmelfarb warned that sturdy virtues are being diluted into shallow values. Alasdair MacIntyre has written about emotivism, the idea that it’s impossible to secure moral agreement in our culture because all judgments are based on how we feel at the moment. Charles Taylor has argued that morals have become separated from moral sources. People are less likely to feel embedded on a moral landscape that transcends self. James Davison Hunter wrote a book called “The Death of Character.” Smith’s interviewees are living, breathing examples of the trends these writers have described.
  • Smith and company found an atmosphere of extreme moral individualism — of relativism and nonjudgmentalism.
  • they have not been given the resources — by schools, institutions and families — to cultivate their moral intuitions, to think more broadly about moral obligations, to check behaviors that may be degrading.
  • the interviewees were so completely untroubled by rabid consumerism.
  • Many were quick to talk about their moral feelings but hesitant to link these feelings to any broader thinking about a shared moral framework or obligation. As one put it, “I mean, I guess what makes something right is how I feel about it. But different people feel different ways, so I couldn’t speak on behalf of anyone else as to what’s right and wrong.”
  • In most times and in most places, the group was seen to be the essential moral unit. A shared religion defined rules and practices. Cultures structured people’s imaginations and imposed moral disciplines. But now more people are led to assume that the free-floating individual is the essential moral unit. Morality was once revealed, inherited and shared, but now it’s thought of as something that emerges in the privacy of your own heart.
  • Goodness, I went through a bit of emotion reading that. Whew. Gotta center. Anyhoo, I feel certainly conflicted over the author's idea of "shallow values." Personally, I don't necessarily see the need to have a shared moral framework to connect to. What is this framework if not a system to instill shame and obligation into its members? While I do think it's important to have an articulate moral opinion on relevant subjects, I also think the world cannot be divided into realms of right or wrong when we can barely see even an infinitely small part of it at one time. What's wrong with open-mindedness?

Joichi Ito Named Head of M.I.T. Media Lab - NYTimes.com - 0 views

  • Raised in both Tokyo and Silicon Valley, Mr. Ito was part of the first generation to grow up with the Internet. His career includes serving as a board member of Icann, the Internet’s governance organization; becoming a “guild master” in the World of Warcraft online fantasy game; and more than a dozen investments in start-ups like Flickr, Last.fm and Twitter. In 1994 he helped establish the first commercial Internet service provider in Japan.
  • He was also an early participant in the open-source software movement and is a board member of the Mozilla Foundation, which oversees the development of the Firefox Web browser, as well as being the co-founder and chairman of Creative Commons, a nonprofit organization that has sought to create a middle ground to promote the sharing of digital information.
  • “You embrace serendipity and you pivot as you go along this longer term arc. That’s the way I have lived my life. I’ve jumped around in terms of career and geography,” he said. Mr. Ito, who maintains a home outside of Tokyo, became a resident of Dubai at the end of 2008 to gain a better understanding of the Middle East. He said that was part of his desire to understand intellectual property issues internationally and to become what he described as a “global citizen.”

Enlightenment's Evil Twin - The Atlantic - 0 views

  • The first time I can remember feeling like I didn’t exist, I was 15. I was sitting on a train and all of a sudden I felt like I’d been dropped into someone else’s body. My memories, experiences, and feelings—the things that make up my intrinsic sense of “me-ness”—projected across my mind like phantasmagoria, but I felt like they belonged to someone else. Like I was experiencing life in the third person.
  • It’s characterized by a pervasive and disturbing sense of unreality in both the experience of self (called “depersonalization”) and one’s surroundings (known as “derealization”); accounts of it are surreal, obscure, shrouded in terms like “unreality” and “dream,” but they’re often conveyed with an almost incongruous lucidity.
  • It’s not a psychotic condition; the sufferers are aware that what they’re perceiving is unusual. “We call it an ‘as if’ disorder. People say they feel as if they’re in a movie, as if they’re a robot,” Medford says.
  • ...13 more annotations...
  • Studies carried out with college students have found that brief episodes are common in young people, with a prevalence ranging from 30 to 70 percent. It can happen when you’re jet-lagged, hungover, or stressed. But for roughly 1 to 2 percent of the population, it becomes persistent, and distressing
  • Research suggests that areas of the brain that are key to emotional and physical sensations, such as the amygdala and the insula, appear to be less responsive in chronic depersonalization sufferers. You might become less empathetic; your pain threshold might increase. These numbing effects mean that it’s commonly conceived as a defense mechanism; Hunter calls it a “psychological trip switch” which can be triggered in times of stress.
  • Have you ever played that game when you repeat a word over and over again until it loses all meaning? It’s called semantic satiation. Like words, can a sense of self be broken down into arbitrary, socially-constructed components?
  • That question may be why the phenomenon has attracted a lot of interest from philosophers. In a sense, the experience presupposes certain notions of how the self is meant to feel. We think of a self as an essential thing—a soul or an ego that everyone has and is aware of—but scientists and philosophers have been telling us for a while now that the self isn’t quite as it seems
  • there is no center in the brain where the self is generated. “What we experience is a powerful depiction generated by our brains for our benefit,” he writes. Brains make sense of data that would otherwise be overwhelming. “Experiences are fragmented episodes unless they are woven together in a meaningful narrative,” he writes, with the self being the story that “pulls it all together.”
  • “The unity [of self that] we experience, which allows us legitimately to talk of ‘I,’ is a result of the Ego Trick—the remarkable way in which a complicated bundle of mental events, made possible by the brain, creates a singular self, without there being a singular thing underlying it,”
  • “depersonalization is both a burden, a horrible burden—but it’s in some strange way a blessing, to reach some depths, some meaning which somehow comes only in the broken mirror,” Bezzubova says. “It’s a Dostoyevsky-style illumination—where clarity cannot be distinguished from pain.”
  • for her, the experience is pleasant. “It’s helped me in my life,” she says. Over the past few years, she has learned to interpret her experiences in a Buddhist context, and she describes depersonalization as a “deconditioning” of sorts: “The significance I place on the world is all in my mind,”
  • “I believe I am on the path to enlightenment,” she says.
  • The crossover between dark mental states and Buddhist practices is being investigated
  • Mindfulness has become increasingly popular in the West over the past few years, but as Britton told The Atlantic, the practice in its original form isn’t just about relaxation: It’s about the often painstaking process of coming to terms with three specific insights of the Theravadin Buddhist tradition, which are anicca, or impermanence; dukkha, or dissatisfaction; and anatta, or not-self.
  • depersonalization must cause the patient distress and have an impact on her daily functioning for it to be classified as clinically significant. In this sense, it seems inappropriate to call Alice’s experiences pathological. “We have ways of measuring disorders, but you have to ask if it’s meaningful. It’s an open question,”
  • “I think calling it a loss of self is maybe a convenient shorthand for something that’s hard to capture,” he says. “I prefer to talk about experience—because that’s what’s important in psychiatry.”

On Pi Day, Celebrate Math's Enigmas - NYTimes.com - 0 views

  • a better way to commemorate the day is by trying to grasp what pi truly is, and why it remains so significant.
  • Pi is irrational, meaning it cannot be expressed as the ratio of two whole numbers. There is no way to write it down exactly: Its decimals continue endlessly without ever settling into a repeating pattern
  • pi, being the ratio of a circle’s circumference to its diameter, is manifested all around us. For instance, the meandering length of a gently sloping river between source and mouth approaches, on average, pi times its straight-line distance. Pi reminds us that the universe is what it is, that it doesn’t subscribe to our ideas of mathematical convenience.
  • ...3 more annotations...
  • pi’s infinite randomness can also be seen more as richness. What amazes, then, is the possibility that such profusion can come from a rule so simple: circumference divided by diameter. This is characteristic of mathematics, whereby elementary formulas can give rise to surprisingly varied phenomena. For instance, the humble quadratic can be used to model everything from the growth of bacterial populations to the manifestation of chaos. Pi makes us wonder if our universe’s complexity emerges from similarly simple mathematical building blocks.
  • Pi also opens a window into a more uncharted universe, the one consisting of transcendental numbers, which exclude such common irrationals as square and cube roots. Pi is one of the few transcendentals we ever encounter. One may suspect that such numbers would be quite rare, but actually, the opposite is true. Out of the totality of numbers, almost all are transcendental. Pi reveals how limited human knowledge is, how there exist teeming realms we might never explore.
  • But pi, on cue, reminds us that it is an abstraction, like all else in mathematics. The perfect flat circle is impossible to realize in practice. An area calculated using pi will never exactly match the same area measured physically. This is to be expected whenever we approximate reality using the idealizations of math.
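
The “humble quadratic” mentioned in the annotations above is, in its best-known form, the logistic map, x_{n+1} = r·x_n·(1 − x_n). Below is a minimal sketch (in Python; the parameter values are illustrative, not taken from the article) of how this single quadratic rule can model both settled population growth and the onset of chaos:

```python
# Logistic map: x_{n+1} = r * x_n * (1 - x_n) -- one quadratic rule
# that models constrained population growth and, at higher r, chaos.
def logistic_orbit(r, x0=0.2, steps=12):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

print(logistic_orbit(2.8))  # converges toward the fixed point 1 - 1/r ≈ 0.643
print(logistic_orbit(4.0))  # wanders unpredictably through (0, 1): chaos
```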

How the Intersection of Art and Science Made History | Patrick Daniel - 1 views

  • Leonardo Da Vinci lived in a time of cultural transition known as the Renaissance, an era of philosophical, scientific and religious "rebirth," where the masses no longer accepted beliefs at face value and questioned the reasoning behind given theories. This new mindset gave rise to a curiosity about the origins of science and how art could demonstrate them.
  • "perception is the origin of all knowledge" and that "science is the observation of things possible, whether present or past."
  • He was the first of his kind to combine the keen eye of an artist to study the detail of his findings, and the curiosity of a scientist to approach his subjects with an open mind.
  • ...3 more annotations...
  • His findings became a source of inspiration and a base for study that we continue to refer back to, due not only to the artistic detail of the machinery but also to the aerodynamic theories, the description of wind currents, the mathematical calculations and measurements, and the engineering of each contraption
  • When we pay respect to artists, we often credit contemporary idols with discoveries and defiant actions that tear away from the norm, but we forget those artists living at the crossroads of the arts and sciences, who contribute to society in various unique ways.
  • common ground where science and art meet, provided a degree of mental stimulation and a sense of independent exploration that moved society forward.

Addicted to Distraction - The New York Times - 0 views

  • ONE evening early this summer, I opened a book and found myself reading the same paragraph over and over, a half dozen times before concluding that it was hopeless to continue. I simply couldn’t marshal the necessary focus.
  • All my life, reading books has been a deep and consistent source of pleasure, learning and solace. Now the books I regularly purchased were piling up ever higher on my bedside table, staring at me in silent rebuke.
  • Instead of reading them, I was spending too many hours online,
  • ...15 more annotations...
  • “The net is designed to be an interruption system, a machine geared to dividing attention,” Nicholas Carr explains in his book “The Shallows: What the Internet Is Doing to Our Brains.” “We willingly accept the loss of concentration and focus, the division of our attention and the fragmentation of our thoughts, in return for the wealth of compelling or at least diverting information we receive.”
  • Addiction is the relentless pull to a substance or an activity that becomes so compulsive it ultimately interferes with everyday life
  • Denial is any addict’s first defense. No obstacle to recovery is greater than the infinite capacity to rationalize our compulsive behaviors
  • According to one recent survey, the average white-collar worker spends about six hours a day on email.
  • The brain’s craving for novelty, constant stimulation and immediate gratification creates something called a “compulsion loop.” Like lab rats and drug addicts, we need more and more to get the same effect.
  • Endless access to new information also easily overloads our working memory. When we reach cognitive overload, our ability to transfer learning to long-term memory significantly deteriorates.
  • By that definition, nearly everyone I know is addicted in some measure to the Internet. It has arguably replaced work itself as our most socially sanctioned addiction.
  • that we humans have a very limited reservoir of will and discipline. We’re far more likely to succeed by trying to change one behavior at a time, ideally at the same time each day, so that it becomes a habit, requiring less and less energy to sustain.
  • Now it was time to detox. I interpreted the traditional second step — belief that a higher power could help restore my sanity — in a more secular way. The higher power became my 30-year-old daughter, who disconnected my phone and laptop from both my email and the Web.
  • During those first few days, I did suffer withdrawal pangs, most of all the hunger to call up Google and search for an answer to some question that arose. But with each passing day offline, I felt more relaxed, less anxious, more able to focus and less hungry for the next shot of instant but short-lived stimulation. What happened to my brain is exactly what I hoped would happen: It began to quiet down.
  • I had brought more than a dozen books of varying difficulty and length on my vacation. I started with short nonfiction, and then moved to longer nonfiction as I began to feel calmer and my focus got stronger. I eventually worked my way up to “The Emperor of All Maladies
  • I am back at work now, and of course I am back online. The Internet isn’t going away, and it will continue to consume a lot of my attention. My aim now is to find the best possible balance between time online and time off
  • I also make it my business now to take on more fully absorbing activities as part of my days. Above all, I’ve kept up reading books, not just because I love them, but also as a continuing attention-building practice.
  • I’ve retained my longtime ritual of deciding the night before on the most important thing I can accomplish the next morning. That’s my first work activity most days, for 60 to 90 minutes without interruption. Afterward, I take a 10- to 15-minute break to quiet my mind and renew my energy.
  • If I have other work during the day that requires sustained focus, I go completely offline for designated periods, repeating my morning ritual. In the evening, when I go up to my bedroom, I nearly always leave my digital devices downstairs.

Can truth survive this president? An honest investigation. - The Washington Post - 0 views

  • in the summer of 2002, long before “fake news” or “post-truth” infected the vernacular, one of President George W. Bush’s top advisers mocked a journalist for being part of the “reality-based community.” Seeking answers in reality was for suckers, the unnamed adviser explained. “We’re an empire now, and when we act, we create our own reality.”
  • This was the hubris and idealism of a post-Cold War, pre-Iraq War superpower: If you exert enough pressure, events will bend to your will.
  • the deceit emanating from the White House today is lazier, more cynical. It is not born of grand strategy or ideology; it is impulsive and self-serving. It is not arrogant, but shameless.
  • ...26 more annotations...
  • Bush wanted to remake the world. President Trump, by contrast, just wants to make it up as he goes along
  • Through all their debates over who is to blame for imperiling truth (whether Trump, postmodernism, social media or Fox News), as well as the consequences (invariably dire) and the solutions (usually vague), a few conclusions materialize, should you choose to believe them.
  • There is a pattern and logic behind the dishonesty of Trump and his surrogates; however, it’s less multidimensional chess than the simple subordination of reality to political and personal ambition
  • Trump’s untruth sells best precisely when feelings and instincts overpower facts, when America becomes a safe space for fabrication.
  • Rand Corp. scholars Jennifer Kavanagh and Michael D. Rich point to the Gilded Age, the Roaring Twenties and the rise of television in the mid-20th century as recent periods of what they call “Truth Decay” — marked by growing disagreement over facts and interpretation of data; a blurring of lines between opinion, fact and personal experience; and diminishing trust in once-respected sources of information.
  • In eras of truth decay, “competing narratives emerge, tribalism within the U.S. electorate increases, and political paralysis and dysfunction grow,”
  • intelligent-design proponents and later climate deniers drew from postmodernism to undermine public perceptions of evolution and climate change. “Even if right-wing politicians and other science deniers were not reading Derrida and Foucault, the germ of the idea made its way to them: science does not have a monopoly on the truth,
  • To interpret our era’s debasement of language, Kakutani reflects perceptively on the World War II-era works of Victor Klemperer, who showed how the Nazis used “words as ‘tiny doses of arsenic’ to poison and subvert the German culture,” and of Stefan Zweig, whose memoir “The World of Yesterday” highlights how ordinary Germans failed to grasp the sudden erosion of their freedoms.
  • Kakutani calls out lefty academics who for decades preached postmodernism and social constructivism, which argued that truth is not universal but a reflection of relative power, structural forces and personal vantage points.
  • postmodernists rejected Enlightenment ideals as “vestiges of old patriarchal and imperialist thinking,” Kakutani writes, paving the way for today’s violence against fact in politics and science.
  • “dumbed-down corollaries” of postmodernist thought have been hijacked by Trump’s defenders, who use them to explain away his lies, inconsistencies and broken promises.
  • Once you add the silos of social media as well as deeply polarized politics and deteriorating civic education, it becomes “nearly impossible to have the types of meaningful policy debates that form the foundation of democracy.”
  • McIntyre quotes at length from mea culpas by postmodernist and social constructivist writers agonizing over what their theories have wrought, shocked that conservatives would use them for nefarious purposes
  • pro-Trump troll and conspiracy theorist Mike Cernovich, who helped popularize the “Pizzagate” lie, has forthrightly cited his unlikely influences. “Look, I read postmodernist theory in college,” Cernovich told the New Yorker in 2016. “If everything is a narrative, then we need alternatives to the dominant narrative. I don’t seem like a guy who reads [Jacques] Lacan, do I?”
  • When truth becomes malleable and contestable regardless of evidence, a mere tussle of manufactured narratives, it becomes less about conveying facts than about picking sides, particularly in politics.
  • In “On Truth,” Cambridge University philosopher Simon Blackburn writes that truth is attainable, if at all, “only at the vanishing end points of enquiry,” adding that, “instead of ‘facts first’ we may do better if we think of ‘enquiry first,’ with the notion of fact modestly waiting to be invited to the feast afterward.”
  • He is concerned, but not overwhelmingly so, about the survival of truth under Trump. “Outside the fevered world of politics, truth has a secure enough foothold,” Blackburn writes. “Perjury is still a serious crime, and we still hope that our pilots and surgeons know their way about.”
  • Kavanagh and Rich offer similar consolation: “Facts and data have become more important in most other fields, with political and civil discourse being striking exceptions. Thus, it is hard to argue that the world is truly ‘post-fact.’ ”
  • McIntyre argues persuasively that our methods of ascertaining truth — not just the facts themselves — are under attack, too, and that this assault is especially dangerous.
  • Ideologues don’t just disregard facts they disagree with, he explains, but willingly embrace any information, however dubious, that fits their agenda. “This is not the abandonment of facts, but a corruption of the process by which facts are credibly gathered and reliably used to shape one’s beliefs about reality. Indeed, the rejection of this undermines the idea that some things are true irrespective of how we feel about them.”
  • “It is hardly a depressing new phenomenon that people’s beliefs are capable of being moved by their hopes, grievances and fears,” Blackburn writes. “In order to move people, objective facts must become personal beliefs.” But it can’t work — or shouldn’t work — in reverse.
  • More than fearing a post-truth world, Blackburn is concerned by a “post-shame environment,” in which politicians easily brush off their open disregard for truth.
  • it is human nature to rationalize away the dissonance. “Why get upset by his lies, when all politicians lie?” Kakutani asks, distilling the mind-set. “Why get upset by his venality, when the law of the jungle rules?”
  • So any opposition is deemed a witch hunt, or fake news, rigged or just so unfair. Trump is not killing the truth. But he is vandalizing it, constantly and indiscriminately, diminishing its prestige and appeal, coaxing us to look away from it.
  • the collateral damage includes the American experiment.
  • “One of the most important ways to fight back against post-truth is to fight it within ourselves,” he writes, whatever our particular politics may be. “It is easy to identify a truth that someone else does not want to see. But how many of us are prepared to do this with our own beliefs? To doubt something that we want to believe, even though a little piece of us whispers that we do not have all the facts?”

How, and why, a journalist tricked news outlets into thinking chocolate makes you thin ... - 1 views

  • This spring, the journal “International Archives of Medicine” published a delicious new study: According to researchers at Germany’s Institute of Diet and Health, people who ate dark chocolate while dieting lost more weight
  • It turns out that the Institute of Diet and Health is just a Web site with no institute attached. Johannes Bohannon, health researcher and lead author of the study, is really John Bohannon, a science journalist. And the study, while based on real results of an actual clinical trial, wasn’t aimed at testing the health benefits of chocolate. It was aimed at testing health reporters, to see if they could distinguish a bad science story from a good one.
  • “demonstrate just how easy it is to turn bad science into the big headlines behind diet fads.”
  • ...6 more annotations...
  • Bohannon had done similar work before — in 2013 he submitted a fake research paper to more than 300 open-access journals as part of a sting operation for the journal Science.
  • Studies like his are called “underpowered,” meaning that they aren’t designed to distinguish between a real effect and pure luck. A study with thousands of participants being measured for just a few effects is “powerful.” But one like Bohannon’s, with just five people per group being measured according to any of 18 different variables? Any number of factors unrelated to the study could cause one of the variables to fluctuate, allowing researchers to irresponsibly — but not untruthfully — state that eating chocolate while dieting helps you lose more weight.
  • A responsible scientist shouldn’t conduct a trial like this, Bohannon said, and a responsible scientific journal shouldn’t publish it. But Bohannon is not a nutrition scientist (he does have a PhD in molecular biology) and the International Archives of Medicine, he says, is not the most responsible journal.
  • According to Bohannon, the journal didn’t peer review his study or even edit it (and the study could have used an edit — “chocolate” is misspelled more than once).
  • “It’s the reporters,” he told The Post. “The reporters and ultimately the editors. … People who are on the health science beat need to treat it like science, and that has to come from the editors. And if you’re reporting on a scientific study you need to actually look at the paper, you need to talk to a source who has real scientific expertise.”
  • Bohannon said he didn’t have any ethical qualms about tricking his fellow journalists this way. “I didn’t lie to reporters, except about my name. And whenever they asked me a scientific question about the study I gave them a completely honest answer,” he said. “The whole point is that this was as bad as a lot of science that is considered ‘real’ science. It gets reported without people asking the right questions.”
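
To make the “underpowered” description above concrete: with two groups of five and 18 unrelated noise variables, the chance that at least one variable clears p < 0.05 by luck alone is about 1 − 0.95^18, roughly 60 percent. Here is a minimal simulation sketch (assuming Python with NumPy and SciPy; the group sizes and variable count follow the article, everything else is illustrative):

```python
# Simulate "underpowered" trials like the chocolate study: two groups
# of 5 participants, 18 outcome variables of pure noise, no real effect.
# Count how often at least one variable looks "significant" by chance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
trials = 2000
hits = 0
for _ in range(trials):
    a = rng.normal(size=(18, 5))  # 18 outcomes x 5 people, group A
    b = rng.normal(size=(18, 5))  # same shape for group B
    pvals = stats.ttest_ind(a, b, axis=1).pvalue
    hits += bool((pvals < 0.05).any())

print(f"Runs with a spurious 'finding': {hits / trials:.0%}")  # ~60%
```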

It's True: False News Spreads Faster and Wider. And Humans Are to Blame. - The New York... - 0 views

  • What if the scourge of false news on the internet is not the result of Russian operatives or partisan zealots or computer-controlled bots? What if the main problem is us?
  • People are the principal culprits
  • people, the study’s authors also say, prefer false news.
  • ...18 more annotations...
  • As a result, false news travels faster, farther and deeper through the social network than true news.
  • those patterns applied to every subject they studied, not only politics and urban legends, but also business, science and technology.
  • The stories were classified as true or false, using information from six independent fact-checking organizations including Snopes, PolitiFact and FactCheck.org
  • with or without the bots, the results were essentially the same.
  • “It’s not really the robots that are to blame.”
  • “News” and “stories” were defined broadly — as claims of fact — regardless of the source. And the study explicitly avoided the term “fake news,” which, the authors write, has become “irredeemably polarized in our current political and media climate.”
  • False claims were 70 percent more likely than the truth to be shared on Twitter. True stories were rarely retweeted by more than 1,000 people, but the top 1 percent of false stories were routinely shared by 1,000 to 100,000 people. And it took true stories about six times as long as false ones to reach 1,500 people.
  • the researchers enlisted students to annotate as true or false more than 13,000 other stories that circulated on Twitter.
  • “The comprehensiveness is important here, spanning the entire history of Twitter,” said Jon Kleinberg, a computer scientist at Cornell University. “And this study shines a spotlight on the open question of the success of false information online.”
  • The M.I.T. researchers pointed to factors that contribute to the appeal of false news. Applying standard text-analysis tools, they found that false claims were significantly more novel than true ones — maybe not a surprise, since falsehoods are made up.
  • The goal, said Soroush Vosoughi, a postdoctoral researcher at the M.I.T. Media Lab and the lead author, was to find clues about what is “in the nature of humans that makes them like to share false news.”
  • The study analyzed the sentiment expressed by users in replies to claims posted on Twitter. As a measurement tool, the researchers used a system created by Canada’s National Research Council that associates English words with eight emotions (see the sketch after this list).
  • False claims elicited replies expressing greater surprise and disgust. True news inspired more anticipation, sadness and joy, depending on the nature of the stories.
  • The M.I.T. researchers said that understanding how false news spreads is a first step toward curbing it. They concluded that human behavior plays a large role in explaining the phenomenon, and mention possible interventions, like better labeling, to alter behavior.
  • For all the concern about false news, there is little certainty about its influence on people’s beliefs and actions. A recent study of the browsing histories of thousands of American adults in the months before the 2016 election found that false news accounted for only a small portion of the total news people consumed.
  • In fall 2016, Mr. Roy, an associate professor at the M.I.T. Media Lab, became a founder and the chairman of Cortico, a nonprofit that is developing tools to measure public conversations online to gauge attributes like shared attention, variety of opinion and receptivity. The idea is that improving the ability to measure such attributes would lead to better decision-making that would counteract misinformation.
  • Mr. Roy acknowledged the challenge in trying to not only alter individual behavior but also in enlisting the support of big internet platforms like Facebook, Google, YouTube and Twitter, and media companies
  • “Polarization,” he said, “has turned out to be a great business model.”
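
For readers curious how the emotion measurement works mechanically, here is a minimal sketch. The tiny LEXICON below is a made-up stand-in for the real NRC Word-Emotion Association Lexicon (which maps thousands of English words to these eight emotion categories), and the scoring loop is an assumed simplification, not the study’s actual pipeline.

```python
# Scoring reply text against a word-emotion lexicon (NRC-style).
# LEXICON entries here are hypothetical, for illustration only.
from collections import Counter

EMOTIONS = ["anger", "anticipation", "disgust", "fear",
            "joy", "sadness", "surprise", "trust"]

LEXICON = {
    "shocking":      {"surprise", "fear"},
    "gross":         {"disgust"},
    "wonderful":     {"joy", "trust"},
    "unbelievable":  {"surprise"},
    "heartbreaking": {"sadness"},
}

def emotion_profile(replies):
    """Tally lexicon emotions across a list of reply strings."""
    counts = Counter()
    for reply in replies:
        for word in reply.lower().split():
            counts.update(LEXICON.get(word.strip(".,!?"), ()))
    return {e: counts[e] for e in EMOTIONS}

print(emotion_profile(["Unbelievable!", "This is gross and shocking."]))
# {'anger': 0, 'anticipation': 0, 'disgust': 1, 'fear': 1, 'joy': 0,
#  'sadness': 0, 'surprise': 2, 'trust': 0}
# The study found replies to false claims skewed toward surprise and
# disgust, while true news drew more anticipation, sadness and joy.
```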

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • ...72 more annotations...
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income (a back-of-the-envelope sketch follows this list). And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts.
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency.
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
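
The pay-per-click incentive described earlier in this list is easy to put numbers on. The sketch below uses wholly assumed figures (query volume, ad price, and click-through rates are illustrative, not Google’s) to show why even a tenth-of-a-point gain in predicted click-through rate justifies heavy investment in behavioral data.

```python
# Back-of-the-envelope revenue arithmetic for a pay-per-click model.
# All numbers are assumptions, for illustration only.
daily_ad_queries = 1_000_000_000   # assumed query volume
price_per_click = 0.50             # assumed average price, in dollars

for ctr in (0.010, 0.011):         # 1.0% vs. 1.1% click-through rate
    revenue = daily_ad_queries * ctr * price_per_click
    print(f"CTR {ctr:.1%}: ${revenue:,.0f} per day")
# CTR 1.0%: $5,000,000 per day
# CTR 1.1%: $5,500,000 per day  (an extra $500,000 per day from
# slightly better predictions about user behavior)
```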

How the Language We Speak Affects the Way We Think | Psychology Today - 0 views

  • The story begins with the first American linguists who described (scientifically) some of the languages spoken by Native Americans. They discovered many awkward differences compared to the languages they had learned in school (ancient Greek, Latin, English, German, and the like).
  • They found sounds never heard in European languages (like ejective consonants), strange meanings encoded in the grammar (like parts of the verb referring to shapes of the objects), or new grammatical categories (like evidentiality, that is, the source of knowledge about the facts in a sentence).
  • Not surprisingly, some of these linguists concluded that such strange linguistic systems should have an effect on the mind of their speakers
  • ...10 more annotations...
  • Edward Sapir, one of the most influential American linguists, wrote: “The worlds in which different societies live are distinct worlds, not merely the same worlds with different labels attached” (Sapir, 1949: 162).
  • Now it was suggested that the world might be perceived differently by people speaking different languages.
  • This effect of framing or filtering is the main effect we can expect language to have on perception and thought. Languages do not limit our ability to perceive the world or to think about the world, but they focus our perception, attention, and thought on specific aspects of the world.
  • Chinese-speaking children learn to count earlier than English-speaking children because Chinese numbers are more regular and transparent than English numbers (in Chinese, “eleven” is “ten one”; see the sketch after this list).
  • So, different languages focus the attention of their speakers on different aspects of the environment—either physical or cultural.
  • We linguists say that these salient aspects are either lexicalized or grammaticalised. Lexicalizing means that you have words for concepts, which work as shorthands for those concepts. This is useful because you don't need to explain (or paraphrase) the meaning you want to convey.
  • The lexicon is like a big, open bag: Some words are coined or borrowed because you need them for referring to new objects, and they are put into the bag. Conversely, some objects are not used anymore, and then the words for them are removed from the bag.
  • Dyirbal, a language spoken in Northern Australia, for example, has four noun classes (like English genders).
  • This grammatical classification of nouns involves a coherent view of the world, including an original mythology.
  • In summary, language functions as a filter of perception, memory, and attention. Whenever we construct or interpret a linguistic statement, we need to focus on specific aspects of the situation that the statement describes
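
The counting claim above rests on the transparency of Chinese number names, which a few lines of code can make concrete. This sketch is an illustration of my own (the function name and gloss formatting are assumptions); it composes names for 0-99 the way Mandarin structures them, with no analogue of English’s irregular “eleven,” “twelve,” or “twenty.”

```python
# Composing Mandarin-style number names for 0-99: every name is built
# from the ten digit words plus "ten", with no irregular forms.
DIGITS = ["zero", "one", "two", "three", "four",
          "five", "six", "seven", "eight", "nine"]

def chinese_style(n):
    """Gloss 0-99 as Mandarin does: 11 -> 'ten one', 24 -> 'two ten four'."""
    tens, ones = divmod(n, 10)
    parts = []
    if tens == 1:
        parts.append("ten")                  # 10-19: "ten", "ten one", ...
    elif tens > 1:
        parts.append(f"{DIGITS[tens]} ten")  # 20+: "two ten", ...
    if ones or not parts:
        parts.append(DIGITS[ones])
    return " ".join(parts)

for n in (11, 12, 20, 24):
    print(n, "->", chinese_style(n))
# 11 -> ten one ; 12 -> ten two ; 20 -> two ten ; 24 -> two ten four
# English speakers must memorize opaque extra words for the same range,
# which is the regularity gap the counting studies point to.
```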

Tell all the truth slant - Philosophy and Life - 1 views

  • “Tell all the truth, but tell it slant,” wrote the poet Emily Dickinson: “Success in circuit lies.” The advice is itself a truth, a commendation in the art of looking sideways.
  • Dickinson lived in an age when it was becoming impossible to find truth straightforwardly,
  • What is striking about Dickinson, though, is that she both experienced the darkness of that doubt, and found a way to transform it into an experience that produced meaning. It’s all about the pursuit of the circuitous.
  • ...4 more annotations...
  • That her medium was poetry is no mere detail. It is almost the whole story. Poetry not only allows her to express herself – her desire for consolation, her anxiety about what’s disappearing. It is also the form of writing par excellence that can keep an eye open for what is peripheral. It can discern truths that words otherwise struggle to articulate. It glimpses, and hopes.
  • to know Socrates was to know someone who sought all the truth, and in so doing, realised it mostly lies out of sight.
  • Moses too does not see anything directly. He apparently doesn’t see anything at all. Instead, an oblique experience is granted to him. It is better described as a kind of unknowing, rather than knowing. He must leave behind what he has previously observed because this seeing consists in not seeing. That which is sought transcends all knowledge.
  • What she realises is that the truth which is beyond us, which is discerned only indirectly, is the only truth that is truly worth seeking. That which we can readily grasp and manipulate is too easy for us. It’s humdrum. It leaves life too small for us, the creature with an eye for the transcendent. But look further, and what you are offered is what she calls truth’s ‘superb surprise’. That’s why success lies in circuit. Our humanity is spoken to, from a direction – a source – that we had not expected. And our humanity expands as a result.

How the web distorts reality and impairs our judgement skills | Media Network | The Gua... - 0 views

  • IBM estimates that 90% of the world's online data has been created just in the past two years. What's more, it has made information more accessible than ever before.
  • However, rather than enhancing knowledge, the internet has produced an information glut or "infoxication".
  • Furthermore, since online content is often curated to fit our preferences, interests and personality, the internet can even enhance our existing biases and undermine our motivation to learn new things.
    • ilanaprincilus06
       
      When we see our preferences constantly being displayed, we are more likely to go back to that source or website and use it more often.
  • ...14 more annotations...
  • these filters will isolate people in information bubbles only partly of their own choosing, and the inaccurate beliefs they form as a result may be difficult to correct."
  • the proliferation of search engines, news aggregators and feed-ranking algorithms is more likely to perpetuate ignorance than knowledge.
  • It would seem that excessive social media use may intensify not only feelings of loneliness, but also ideological isolation.
    • ilanaprincilus06
       
      Would social media networks need to stop exploiting these preferences in order for us to limit ideological isolation?
  • "What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact."
  • Recent studies show that although most people consume information that matches their opinions, being exposed to conflicting views tends to reduce prejudice and enhance creative thinking.
  • the desire to prove ourselves right and maintain our current beliefs trumps any attempt to be creative or more open-minded.
  • "our objects of inquiry are not 'truth' or 'meaning' but rather configurations of consciousness. These are figures or patterns of knowledge, cognitive and practical attitudes, which emerge within a definite historical and cultural context."
  • the internet is best understood as a cultural lens through which we construct – or distort – reality.
  • we can only deal with this overwhelming range of choices by ignoring most of them.
  • trolling is so effective for enticing readers' comments, but so ineffective for changing their viewpoints.
  • Will accumulating facts help you understand the world?
    • ilanaprincilus06
       
      We must take an extra step past just reading/learning about facts and develop second order thinking about the claims/facts to truly gain a better sense of what is going on.
  • we have developed a dependency on technology, which has eclipsed our reliance on logic, critical thinking and common sense: if you can find the answer online, why bother thinking?
  • it is conceivable that individuals' capacity to evaluate and produce original knowledge will matter more than the actual acquisition of knowledge.
  • Good judgment and decision-making will be more in demand than sheer expertise or domain-specific knowledge.