TOK Friends: Group items matching "electron" in title, tags, annotations or url

Does Thinking Really Hard Burn More Calories?: Scientific American - 0 views

  • Just as vigorous exercise tires our bodies, intellectual exertion should drain the brain. What the latest science reveals, however, is that the popular notion of mental exhaustion is too simplistic. The brain continuously slurps up huge amounts of energy for an organ of its size, regardless of whether we are tackling integral calculus or clicking through the week's top 10 LOLcats. Although firing neurons summon extra blood, oxygen and glucose, any local increases in energy consumption are tiny compared with the brain's gluttonous baseline intake. So, in most cases, short periods of additional mental effort require a little more brainpower than usual, but not much more.
  • something must explain the feeling of mental exhaustion, even if its physiology differs from physical fatigue. Simply believing that our brains have expended a lot of effort might be enough to make us lethargic.
  • a typical adult human brain runs on around 12 watts—a fifth of the power required by a standard 60-watt lightbulb. Compared with most other organs, the brain is greedy; pitted against man-made electronics, it is astoundingly efficient. IBM's Watson, the supercomputer that defeated Jeopardy! champions, depends on ninety IBM Power 750 servers, each of which requires around one thousand watts. (A back-of-the-envelope check of these figures follows this list.)
  • people routinely enjoy intellectually invigorating activities without suffering mental exhaustion.
  • Such fatigue seems much more likely to follow sustained mental effort that we do not seek for pleasure—such as the obligatory SAT—especially when we expect that the ordeal will drain our brains. If we think an exam or puzzle will be difficult, it often will be.
  • Studies have shown that something similar happens when people exercise and play sports: a large component of physical exhaustion is in our heads. In related research, volunteers who cycled on an exercise bike following a 90-minute computerized test of sustained attention quit pedaling from exhaustion sooner than participants who watched emotionally neutral documentaries before exercising.
  • In the specific case of the SAT, something beyond pure mental effort likely contributes to post-exam stupor: stress. After all, the brain does not function in a vacuum. Other organs burn up energy, too. Taking an exam that partially determines where one will spend the next four years is nerve-racking enough to send stress hormones swimming through the blood stream, induce sweating, quicken heart rates and encourage fidgeting and contorted body postures. The SAT and similar trials are not just mentally taxing—they are physically exhausting, too.
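The power figures quoted above are easy to sanity-check. Below is a minimal back-of-the-envelope sketch in Python; it uses only the numbers as cited in the annotations (12 W brain, 60 W bulb, 90 servers at roughly 1,000 W each), not independent measurements.

```python
# Back-of-the-envelope check of the power figures cited in the annotations.
brain_watts = 12             # typical adult brain, as quoted
bulb_watts = 60              # standard incandescent lightbulb
watson_servers = 90          # IBM Power 750 servers behind Watson, as quoted
watts_per_server = 1000      # approximate draw per server, as quoted

watson_watts = watson_servers * watts_per_server
print(f"Brain vs. bulb: {brain_watts / bulb_watts:.0%} of a 60 W bulb")    # 20%
print(f"Watson's total draw: {watson_watts / 1000:.0f} kW")                # 90 kW
print(f"Watson-to-brain power ratio: {watson_watts / brain_watts:,.0f}x")  # 7,500x
```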

New Statesman - All machine and no ghost? - 0 views

  • More subtly, there are many who insist that consciousness just reduces to brain states - a pang of regret, say, is just a surge of chemicals across a synapse. They are collapsers rather than deniers. Though not avowedly eliminative, this kind of view is tacitly a rejection of the very existence of consciousness
  • it occurred to me that the problem might lie not in nature but in ourselves: we just don't have the faculties of comprehension that would enable us to remove the sense of mystery. Ontologically, matter and consciousness are woven intelligibly together but epistemologically we are precluded from seeing how. I used Noam Chomsky's notion of "mysteries of nature" to describe the situation as I saw it. Soon, I was being labelled (by Owen Flanagan) a "mysterian"
  • Dualism makes the mind too separate, thereby precluding intelligible interaction and dependence.
  • At this point the idealist swooshes in: ladies and gentlemen, there is nothing but mind! There is no problem of interaction with matter because matter is mere illusion
  • idealism has its charms but taking it seriously requires an antipathy to matter bordering on the maniacal. Are we to suppose that material reality is just a dream, a baseless fantasy, and that the Big Bang was nothing but the cosmic spirit having a mental sneezing fit?
  • panpsychism: even the lowliest of material things has a streak of sentience running through it, like veins in marble. Not just parcels of organic matter, such as lizards and worms, but also plants and bacteria and water molecules and even electrons. Everything has its primitive feelings and minute allotment of sensation.
  • The trouble with panpsychism is that there just isn't any evidence of the universal distribution of consciousness in the material world.
  • The dualist, by contrast, freely admits that consciousness exists, as well as matter, holding that reality falls into two giant spheres. There is the physical brain, on the one hand, and the conscious mind, on the other: the twain may meet at some point but they remain distinct entities.
  • The more we know of the brain, the less it looks like a device for creating consciousness: it's just a big collection of biological cells and a blur of electrical activity - all machine and no ghost.
  • mystery is quite pervasive, even in the hardest of sciences. Physics is a hotbed of mystery: space, time, matter and motion - none of it is free of mysterious elements. The puzzles of quantum theory are just a symptom of this widespread lack of understanding
  • The human intellect grasps the natural world obliquely and glancingly, using mathematics to construct abstract representations of concrete phenomena, but what the ultimate nature of things really is remains obscure and hidden. How everything fits together is particularly elusive, perhaps reflecting the disparate cognitive faculties we bring to bear on the world (the senses, introspection, mathematical description). We are far from obtaining a unified theory of all being and there is no guarantee that such a theory is accessible by finite human intelligence.
  • real naturalism begins with a proper perspective on our specifically human intelligence. Palaeoanthropologists have taught us that the human brain gradually evolved from ancestral brains, particularly in concert with practical toolmaking, centring on the anatomy of the human hand. This history shaped and constrained the form of intelligence now housed in our skulls (as the lifestyle of other species forms their set of cognitive skills). What chance is there that an intelligence geared to making stone tools and grounded in the contingent peculiarities of the human hand can aspire to uncover all the mysteries of the universe? Can omniscience spring from an opposable thumb? It seems unlikely, so why presume that the mysteries of consciousness will be revealed to a thumb-shaped brain like ours?
  • The "mysterianism" I advocate is really nothing more than the acknowledgment that human intelligence is a local, contingent, temporal, practical and expendable feature of life on earth - an incremental adaptation based on earlier forms of intelligence that no one would regard as faintly omniscient. The current state of the philosophy of mind, from my point of view, is just a reflection of one evolutionary time-slice of a particular bipedal species on a particular humid planet at this fleeting moment in cosmic history - as is everything else about the human animal. There is more ignorance in it than knowledge.

The American Scholar: The Disadvantages of an Elite Education - William Deresiewicz - 1 views

  • the last thing an elite education will teach you is its own inadequacy
  • I’m talking about the whole system in which these skirmishes play out. Not just the Ivy League and its peer institutions, but also the mechanisms that get you there in the first place: the private and affluent public “feeder” schools, the ever-growing parastructure of tutors and test-prep courses and enrichment programs, the whole admissions frenzy and everything that leads up to and away from it. The message, as always, is the medium. Before, after, and around the elite college classroom, a constellation of values is ceaselessly inculcated.
  • The first disadvantage of an elite education, as I learned in my kitchen that day, is that it makes you incapable of talking to people who aren’t like you. Elite schools pride themselves on their diversity, but that diversity is almost entirely a matter of ethnicity and race. With respect to class, these schools are largely—indeed increasingly—homogeneous. Visit any elite campus in our great nation and you can thrill to the heartwarming spectacle of the children of white businesspeople and professionals studying and playing alongside the children of black, Asian, and Latino businesspeople and professionals.
  • My education taught me to believe that people who didn’t go to an Ivy League or equivalent school weren’t worth talking to, regardless of their class. I was given the unmistakable message that such people were beneath me.
  • The existence of multiple forms of intelligence has become a commonplace, but however much elite universities like to sprinkle their incoming classes with a few actors or violinists, they select for and develop one form of intelligence: the analytic.
  • Students at places like Cleveland State, unlike those at places like Yale, don’t have a platoon of advisers and tutors and deans to write out excuses for late work, give them extra help when they need it, pick them up when they fall down.
  • When people say that students at elite schools have a strong sense of entitlement, they mean that those students think they deserve more than other people because their SAT scores are higher.
  • The political implications should be clear. As John Ruskin told an older elite, grabbing what you can get isn’t any less wicked when you grab it with the power of your brains than with the power of your fists.
  • students at places like Yale get an endless string of second chances. Not so at places like Cleveland State.
  • The second disadvantage, implicit in what I’ve been saying, is that an elite education inculcates a false sense of self-worth. Getting to an elite college, being at an elite college, and going on from an elite college—all involve numerical rankings: SAT, GPA, GRE. You learn to think of yourself in terms of those numbers. They come to signify not only your fate, but your identity; not only your identity, but your value.
  • For the elite, there’s always another extension—a bailout, a pardon, a stint in rehab—always plenty of contacts and special stipends—the country club, the conference, the year-end bonus, the dividend.
  • In short, the way students are treated in college trains them for the social position they will occupy once they get out. At schools like Cleveland State, they’re being trained for positions somewhere in the middle of the class system, in the depths of one bureaucracy or another. They’re being conditioned for lives with few second chances, no extensions, little support, narrow opportunity—lives of subordination, supervision, and control, lives of deadlines, not guidelines. At places like Yale, of course, it’s the reverse.
  • Elite schools nurture excellence, but they also nurture what a former Yale graduate student I know calls “entitled mediocrity.”
  • An elite education gives you the chance to be rich—which is, after all, what we’re talking about—but it takes away the chance not to be. Yet the opportunity not to be rich is one of the greatest opportunities with which young Americans have been blessed. We live in a society that is itself so wealthy that it can afford to provide a decent living to whole classes of people who in other countries exist (or in earlier times existed) on the brink of poverty or, at least, of indignity. You can live comfortably in the United States as a schoolteacher, or a community organizer, or a civil rights lawyer, or an artist
  • The liberal arts university is becoming the corporate university, its center of gravity shifting to technical fields where scholarly expertise can be parlayed into lucrative business opportunities.
  • You have to live in an ordinary house instead of an apartment in Manhattan or a mansion in L.A.; you have to drive a Honda instead of a BMW or a Hummer; you have to vacation in Florida instead of Barbados or Paris, but what are such losses when set against the opportunity to do work you believe in, work you’re suited for, work you love, every day of your life? Yet it is precisely that opportunity that an elite education takes away. How can I be a schoolteacher—wouldn’t that be a waste of my expensive education?
  • Isn’t it beneath me? So a whole universe of possibility closes, and you miss your true calling.
  • This is not to say that students from elite colleges never pursue a riskier or less lucrative course after graduation, but even when they do, they tend to give up more quickly than others.
  • But if you’re afraid to fail, you’re afraid to take risks, which begins to explain the final and most damning disadvantage of an elite education: that it is profoundly anti-intellectual.
  • being an intellectual is not the same as being smart. Being an intellectual means more than doing your homework.
  • The system forgot to teach them, along the way to the prestige admissions and the lucrative jobs, that the most important achievements can’t be measured by a letter or a number or a name. It forgot that the true purpose of education is to make minds, not careers.
  • Being an intellectual means, first of all, being passionate about ideas—and not just for the duration of a semester, for the sake of pleasing the teacher, or for getting a good grade.
  • Only a small minority have seen their education as part of a larger intellectual journey, have approached the work of the mind with a pilgrim soul. These few have tended to feel like freaks, not least because they get so little support from the university itself. Places like Yale, as one of them put it to me, are not conducive to searchers. Places like Yale are simply not set up to help students ask the big questions.
  • Professors at top research institutions are valued exclusively for the quality of their scholarly work; time spent on teaching is time lost. If students want a conversion experience, they’re better off at a liberal arts college.
  • When elite universities boast that they teach their students how to think, they mean that they teach them the analytic and rhetorical skills necessary for success in law or medicine or science or business.
  • Although the notion of breadth is implicit in the very idea of a liberal arts education, the admissions process increasingly selects for kids who have already begun to think of themselves in specialized terms—the junior journalist, the budding astronomer, the language prodigy. We are slouching, even at elite schools, toward a glorified form of vocational training.
  • There’s a reason elite schools speak of training leaders, not thinkers—holders of power, not its critics. An independent mind is independent of all allegiances, and elite schools, which get a large percentage of their budget from alumni giving, are strongly invested in fostering institutional loyalty.
  • At a school like Yale, students who come to class and work hard expect nothing less than an A-. And most of the time, they get it.
  • Yet there is a dimension of the intellectual life that lies above the passion for ideas, though so thoroughly has our culture been sanitized of it that it is hardly surprising if it was beyond the reach of even my most alert students. Since the idea of the intellectual emerged in the 18th century, it has had, at its core, a commitment to social transformation. Being an intellectual means thinking your way toward a vision of the good society and then trying to realize that vision by speaking truth to power.
  • It takes more than just intellect; it takes imagination and courage.
  • Being an intellectual begins with thinking your way outside of your assumptions and the system that enforces them. But students who get into elite schools are precisely the ones who have best learned to work within the system, so it’s almost impossible for them to see outside it, to see that it’s even there.
  • Paradoxically, the situation may be better at second-tier schools and, in particular, again, at liberal arts colleges than at the most prestigious universities. Some students end up at second-tier schools because they’re exactly like students at Harvard or Yale, only less gifted or driven. But others end up there because they have a more independent spirit. They didn’t get straight A’s because they couldn’t be bothered to give everything in every class. They concentrated on the ones that meant the most to them or on a single strong extracurricular passion or on projects that had nothing to do with school
  • I’ve been struck, during my time at Yale, by how similar everyone looks. You hardly see any hippies or punks or art-school types, and at a college that was known in the ’80s as the Gay Ivy, few out lesbians and no gender queers. The geeks don’t look all that geeky; the fashionable kids go in for understated elegance. Thirty-two flavors, all of them vanilla.
  • The most elite schools have become places of a narrow and suffocating normalcy. Everyone feels pressure to maintain the kind of appearance—and affect—that go with achievement
  • Now that students are in constant electronic contact, they never have trouble finding each other. But it’s not as if their compulsive sociability is enabling them to develop deep friendships.
  • What happens when busyness and sociability leave no room for solitude? The ability to engage in introspection, I put it to my students that day, is the essential precondition for living an intellectual life, and the essential precondition for introspection is solitude
  • the life of the mind is lived one mind at a time: one solitary, skeptical, resistant mind at a time. The best place to cultivate it is not within an educational system whose real purpose is to reproduce the class system.

The Age of 'Infopolitics' - NYTimes.com - 0 views

  • we need a new way of thinking about our informational milieu. What we need is a concept of infopolitics that would help us understand the increasingly dense ties between politics and information
  • Infopolitics encompasses not only traditional state surveillance and data surveillance, but also “data analytics” (the techniques that enable marketers at companies like Target to detect, for instance, if you are pregnant), digital rights movements (promoted by organizations like the Electronic Frontier Foundation), online-only crypto-currencies (like Bitcoin or Litecoin), algorithmic finance (like automated micro-trading) and digital property disputes (from peer-to-peer file sharing to property claims in the virtual world of Second Life)
  • Surveying this iceberg is crucial because atop it sits a new kind of person: the informational person. Politically and culturally, we are increasingly defined through an array of information architectures: highly designed environments of data, like our social media profiles, into which we often have to squeeze ourselves
  • We have become what the privacy theorist Daniel Solove calls “digital persons.” As such we are subject to infopolitics (or what the philosopher Grégoire Chamayou calls “datapower,” the political theorist Davide Panagia “datapolitik” and the pioneering thinker Donna Haraway “informatics of domination”).
  • Once fingerprints, biometrics, birth certificates and standardized names were operational, it became possible to implement an international passport system, a social security number and all other manner of paperwork that tells us who someone is. When all that paper ultimately went digital, the reams of data about us became radically more accessible and subject to manipulation.
  • We like to think of ourselves as somehow apart from all this information. We are real — the information is merely about us.
  • But what is it that is real? What would be left of you if someone took away all your numbers, cards, accounts, dossiers and other informational prostheses? Information is not just about you — it also constitutes who you are.
  • We understandably do not want to see ourselves as bits and bytes. But unless we begin conceptualizing ourselves in this way, we leave it to others to do it for us
  • agencies and corporations will continue producing new visions of you and me, and they will do so without our input if we remain stubbornly attached to antiquated conceptions of selfhood that keep us from admitting how informational we already are.
  • What should we do about our Internet and phone patterns’ being fastidiously harvested and stored away in remote databanks where they await inspection by future algorithms developed at the National Security Agency, Facebook, credit reporting firms like Experian and other new institutions of information and control that will come into existence in future decades?
  • What bits of the informational you will fall under scrutiny? The political you? The sexual you? What next-generation McCarthyisms await your informational self? And will those excesses of oversight be found in some Senate subcommittee against which we democratic citizens might hope to rise up in revolt — or will they lurk among algorithmic automatons that silently seal our fates in digital filing systems?
  • Despite their decidedly different political sensibilities, what links together the likes of Senator Wyden and the international hacker network known as Anonymous is that they respect the severity of what is at stake in our information.
  • information is a site for the call of justice today, alongside more quintessential battlefields like liberty of thought and equality of opportunity.
  • we lack the intellectual framework to grasp the new kinds of political injustices characteristic of today’s information society.
  • though nearly all of us have a vague sense that something is wrong with the new regimes of data surveillance, it is difficult for us to specify exactly what is happening and why it raises serious concern

An Adaptation From 'Flash Boys: A Wall Street Revolt,' by Michael Lewis - NYTimes.com - 0 views

  • Ryan was making hundreds of thousands of dollars a year building systems to make stock-market trades faster. He was struck, over and over again, by how little those he helped understood the technology they were using
  • Ryan described what he witnessed inside the exchanges: The frantic competition for nanoseconds, clients’ trying to get their machines closer to the servers within the exchanges, the tens of millions being spent by high-frequency traders for tiny increments of speed. The U.S. stock market was now a class system of haves and have-nots, only what was had was not money but speed (which led to money).
  • A salesman at RBC who marketed Thor recalls one big investor calling to say, “You know, I thought I knew what I did for a living, but apparently not, because I had no idea this was going on.”
  • Eventually Brad Katsuyama came to realize that the most sophisticated investors didn’t know what was going on in their own market. Not the big mutual funds, Fidelity and Vanguard. Not the big money-management firms like T. Rowe Price and Capital Group. Not even the most sophisticated hedge funds.
  • The deep problem with the system was a kind of moral inertia. So long as it served the narrow self-interests of everyone inside it, no one on the inside would ever seek to change it, no matter how corrupt or sinister it became
  • Technology had collided with Wall Street in a peculiar way. It had been used to increase efficiency. But it had also been used to introduce a peculiar sort of market inefficiency. Taking advantage of loopholes in some well-meaning regulation introduced in the mid-2000s, some large amount of what Wall Street had been doing with technology was simply so someone inside the financial markets would know something that the outside world did not. The same system that once gave us subprime-mortgage collateralized debt obligations no investor could possibly truly understand now gave us stock-market trades involving fractions of a penny that occurred at unsafe speeds using order types that no investor could possibly truly understand.
  • The trouble with the stock market — with all of the public and private exchanges — was that they were fantastically gameable, and had been gamed: first by clever guys in small shops, and then by prop traders who moved inside the big Wall Street banks. That was the problem, Puz thought. From the point of view of the most sophisticated traders, the stock market wasn’t a mechanism for channeling capital to productive enterprise but a puzzle to be solved.
  • As they worked through the order types, the Puzzle Masters created a taxonomy of predatory behavior in the stock market. Broadly speaking, it appeared as if there were three activities that led to a vast amount of grotesquely unfair trading. The first they called electronic front-running — seeing an investor trying to do something in one place and racing ahead of him to the next (what had happened to Katsuyama when he traded at RBC). The second they called rebate arbitrage — using the new complexity to game the seizing of whatever legal kickbacks, called rebates within the industry, the exchange offered without actually providing the liquidity that the rebate was presumably meant to entice. The third, and probably by far the most widespread, they called slow-market arbitrage. This occurred when a high-frequency trader was able to see the price of a stock change on one exchange and pick off orders sitting on other exchanges before those exchanges were able to react. This happened all day, every day, and very likely generated more billions of dollars a year than the other strategies combined. (A toy sketch of slow-market arbitrage follows this list.)
  • IEX had made its point: That to function properly, a financial market didn’t need to be rigged in someone’s favor. It didn’t need payment for order flow and co-location and all sorts of unfair advantages possessed by a small handful of traders. All it needed was for investors to take responsibility for understanding it, and then to seize its controls. “The backbone of the market,” Katsuyama says, “is investors coming together to trade.”
  • If an investor as large as T. Rowe Price, which acted on behalf of millions of investors, had trouble obtaining the information it needed to determine if its brokers had acted in their interest, what chance did the little guy have?
  • The stock market really was rigged. Katsuyama often wondered how enterprising politicians and plaintiffs’ lawyers and state attorneys general would respond to that realization. (This March, the New York attorney general, Eric Schneiderman, announced a new investigation of the stock exchanges and the dark pools, and their relationships with high-frequency traders.)
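The mechanics of slow-market arbitrage, the third strategy in the taxonomy above, are easiest to see in a toy model. The sketch below is a deliberately crude simulation with invented venue names, prices, and latencies; it illustrates the latency race the annotation describes, not the actual order types or matching logic of any real exchange (or of IEX's defense against it).

```python
# Toy model of slow-market arbitrage: a fast trader sees a price move on one
# exchange and picks off a stale quote on a slower exchange before it reacts.
# All names, prices, and latencies are hypothetical.
from dataclasses import dataclass

@dataclass
class Exchange:
    name: str
    best_ask: float      # lowest advertised sell price
    update_lag_us: int   # microseconds it takes to react to outside moves

fast_venue = Exchange("VENUE-A", best_ask=10.00, update_lag_us=50)
slow_venue = Exchange("VENUE-B", best_ask=10.00, update_lag_us=500)

# News moves the price up on the fast venue first.
new_price = 10.05
fast_venue.best_ask = new_price

# A trader whose one-way latency beats the slow venue's update window can
# buy the stale quote there and immediately resell at the new price.
trader_latency_us = 100
if trader_latency_us < slow_venue.update_lag_us:
    stale = slow_venue.best_ask
    print(f"Bought on {slow_venue.name} at {stale:.2f}, "
          f"resold at {new_price:.2f}: +{new_price - stale:.2f}/share")
else:
    print("Too slow: the quote updated before the order arrived.")
```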

Upending Anonymity, These Days the Web Unmasks Everyone - NYTimes.com - 0 views

  • The collective intelligence of the Internet’s two billion users, and the digital fingerprints that so many users leave on Web sites, combine to make it more and more likely that every embarrassing video, every intimate photo, and every indelicate e-mail is attributed to its source, whether that source wants it to be or not. This intelligence makes the public sphere more public than ever before and sometimes forces personal lives into public view.
  • the positive effects can be numerous: criminality can be ferreted out, falsehoods can be disproved and individuals can become Internet icons.
  • “Humans want nothing more than to connect, and the companies that are connecting us electronically want to know who’s saying what, where,” said Susan Crawford, a professor at the Benjamin N. Cardozo School of Law. “As a result, we’re more known than ever before.”
  • This growing “publicness,” as it is sometimes called, comes with significant consequences for commerce, for political speech and for ordinary people’s right to privacy. There are efforts by governments and corporations to set up online identity systems.
  • He posited that because the Internet “can’t be made to forget” images and moments from the past, like an outburst on a train or a kiss during a riot, “the reality of an inescapable public world is an issue we are all going to hear a lot more about.”

Face It, Your Brain Is a Computer - The New York Times - 0 views

  • all the standard arguments about why the brain might not be a computer are pretty weak.
  • Take the argument that “brains are parallel, but computers are serial.” Critics are right to note that virtually every time a human does anything, many different parts of the brain are engaged; that’s parallel, not serial.
  • the trend over time in the hardware business has been to make computers more and more parallel, using new approaches like multicore processors and graphics processing units.
  • The real payoff in subscribing to the idea of a brain as a computer would come from using that idea to profitably guide research. In an article last fall in the journal Science, two of my colleagues (Adam Marblestone of M.I.T. and Thomas Dean of Google) and I endeavored to do just that, suggesting that a particular kind of computer, known as the field programmable gate array, might offer a preliminary starting point for thinking about how the brain works.
  • Field programmable gate arrays consist of a large number of “logic block” programs that can be configured, and reconfigured, individually, to do a wide range of tasks. One logic block might do arithmetic, another signal processing, and yet another look things up in a table. The computation of the whole is a function of how the individual parts are configured. Much of the logic can be executed in parallel, much like what happens in a brain.
  • our suggestion is that the brain might similarly consist of highly orchestrated sets of fundamental building blocks, such as “computational primitives” for constructing sequences, retrieving information from memory, and routing information between different locations in the brain. Identifying those building blocks, we believe, could be the Rosetta stone that unlocks the brain. (A toy rendering of this idea follows this list.)
  • it is unlikely that we will ever be able to directly connect the language of neurons and synapses to the diversity of human behavior, as many neuroscientists seem to hope. The chasm between brains and behavior is just too vast.
  • Our best shot may come instead from dividing and conquering. Fundamentally, that may involve two steps: finding some way to connect the scientific language of neurons and the scientific language of computational primitives (which would be comparable in computer science to connecting the physics of electrons and the workings of microprocessors); and finding some way to connect the scientific language of computational primitives and that of human behavior (which would be comparable to understanding how computer programs are built out of more basic microprocessor instructions).
  • If neurons are akin to computer hardware, and behaviors are akin to the actions that a computer performs, computation is likely to be the glue that binds the two.
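The FPGA analogy above can be made concrete with a toy sketch: a few configurable "blocks", each a computational primitive, wired together so the whole computes something no single block does. This is only an illustration of the analogy, with invented primitives; it makes no claim about how the brain, or the cited Science article, actually decomposes computation.

```python
# Toy rendering of the FPGA analogy: small configurable "blocks", each a
# computational primitive, composed into a larger computation.

def arithmetic_block(x, y):
    """Primitive: combine two quantities."""
    return x + y

def lookup_block(key, table):
    """Primitive: retrieve a stored value from 'memory'."""
    return table[key]

def routing_block(value, destinations):
    """Primitive: send one value to several downstream consumers."""
    return {dest: value for dest in destinations}

# "Configure" the blocks into a pipeline: fetch two stored quantities,
# combine them, and route the result onward.
memory = {"a": 3, "b": 4}
total = arithmetic_block(lookup_block("a", memory), lookup_block("b", memory))
print(routing_block(total, ["motor_output", "working_memory"]))
# {'motor_output': 7, 'working_memory': 7}
```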

Teaching a Different Shakespeare From the One I Love - The New York Times - 0 views

  • Even the highly gifted students in my Shakespeare classes at Harvard are less likely to be touched by the subtle magic of his words than I was so many years ago or than my students were in the 1980s in Berkeley, Calif. What has happened? It is not that my students now lack verbal facility. In fact, they write with ease, particularly if the format is casual and resembles the texting and blogging that they do so constantly. The problem is that their engagement with language, their own or Shakespeare’s, often seems surprisingly shallow or tepid.
  • There are many well-rehearsed reasons for the change: the rise of television followed by the triumph of digital technology, the sending of instant messages instead of letters, the ‘‘visual turn’’ in our culture, the pervasive use of social media. In their wake, the whole notion of a linguistic birthright could be called quaint, the artifact of particular circumstances that have now vanished
  • For my parents, born in Boston, the English language was a treasured sign of arrival and rootedness; for me, a mastery of Shakespeare, the supreme master of that language, was like a purchased coat of arms, a title of gentility tracing me back to Stratford-upon-Avon.
  • It is not that the English language has ceased to be a precious possession; on the contrary, it is far more important now than it ever was in my childhood. But its importance has little or nothing to do any longer with the dream of rootedness. English is the premier international language, the global medium of communication and exchange.
  • as I have discovered in my teaching, it is a different Shakespeare from the one with whom I first fell in love. Many of my students may have less verbal acuity than in years past, but they often possess highly developed visual, musical and performative skills. They intuitively grasp, in a way I came to understand only slowly, the pervasiveness of songs in Shakespeare’s plays, the strange ways that his scenes flow one into another or the cunning alternation of close-ups and long views
  • When I ask them to write a 10-page paper analyzing a particular web of metaphors, exploring a complex theme or amassing evidence to support an argument, the results are often wooden; when I ask them to analyze a film clip, perform a scene or make a video, I stand a better chance of receiving something extraordinary.
  • This does not mean that I should abandon the paper assignment; it is an important form of training for a range of very different challenges that lie in their future. But I see that their deep imaginative engagement with Shakespeare, their intoxication, lies elsewhere.
  • The M.I.T. Global Shakespeare Project features an electronic archive that includes images of every page of the First Folio of 1623. In the Norton Shakespeare, which I edit, the texts of his plays are now available not only in the massive printed book with which I was initiated but also on a digital platform. One click and you can hear each song as it might have sounded on the Elizabethan stage; another click and you listen to key scenes read by a troupe of professional actors. It is a kind of magic unimagined even a few years ago or rather imaginable only as the book of a wizard like Prospero in ‘‘The Tempest.’’
  • But it is not the new technology alone that attracts students to Shakespeare; it is still more his presence throughout the world as the common currency of humanity. In Taiwan, Tokyo and Nanjing, in a verdant corner of the Villa Borghese gardens in Rome and in an ancient garden in Kabul, in Berlin and Bangkok and Bangalore, his plays continue to find new and unexpected ways to enchant.

Sexting Enters the Mainstream - NYTimes.com - 1 views

  • flirtatious texts have replaced phone calls, and Web sites like Facebook have replaced high school reunions as a way to reconnect with an old flame.
  • “We use new technologies in romantic relationships all the time,” said Dr. Baym. “When two people meet and they’re interested in developing the relationship, they go to text messages really fast as a way to safely negotiate the relationship.”
  • slight shifts in infidelity rates among young people and women suggest that digital media may be playing a role. Anecdotally, therapists report that electronic contact via Facebook, e-mail and text messages has allowed women in particular to form more intimate relationships.
  • The Internet dramatically expands the scope of potential people that we can meet.
  • the widespread availability of pornography via the Internet has also led to an insidious change in attitudes about sex. One study found that more than a third of Americans had visited an online porn site at least once a month

Getting From the Internet What It Knows About You - NYTimes.com - 0 views

  • “No one knows what I like better than I do.”
  • This statement may seem self-evident, but the revolution in information technology has created a growing list of exceptions. Your grocery store knows what you like to eat and can probably make educated guesses about other foods you might enjoy. Your wireless carrier knows whom you call, and your phone may know where you’ve been. And your search engine can finish many of your thoughts before you are even done typing them.
  • Here is a guiding principle: If a business collects data on consumers electronically, it should provide them with a version of that data that is easy to download and export to another Web site. Think of it this way: you have lent the company your data, and you’d like a copy for your own use.
  • If personal data is accompanied by detailed pricing information, as I discussed in my last column, consumers will be more aware of how they really use products and how much fees really cost them. And transparent pricing will give honest, high-quality providers a leg up on competitors who rely on obfuscation. All of this will help stimulate the best kind of economic growth.

"Divine Inspiration" by Jeet Heer | The Walrus | July 2011 - 0 views

  • In The Mechanical Bride, published in 1951, he established himself in the emerging field of cultural studies by offering a caustic survey of the dehumanizing impact of popular magazines, advertising, and comic strips.
  • In The Gutenberg Galaxy, he offered a map of modern history by highlighting the hitherto-unexplored effect of print in shaping how we think. This was followed by Understanding Media, which prophesied that new electronic media would rewire human consciousness just as effectively as print once did, giving birth to a “global village”

Specs that see right through you - tech - 05 July 2011 - New Scientist - 0 views

  • a number of "social X-ray specs" that are set to transform how we interact with each other. By sensing emotions that we would otherwise miss, these technologies can thwart disastrous social gaffes and help us understand each other better.
  • In conversation, we pantomime certain emotions that act as social lubricants. We unconsciously nod to signal that we are following the other person's train of thought, for example, or squint a bit to indicate that we are losing track. Many of these signals can be misinterpreted - sometimes because different cultures have their own specific signals.
  • In 2005, she enlisted Simon Baron-Cohen, also at Cambridge, to help her identify a set of more relevant emotional facial states. They settled on six: thinking, agreeing, concentrating, interested - and, of course, the confused and disagreeing expressions.
  • More often, we fail to spot them altogether.
  • To create this lexicon, they hired actors to mime the expressions, then asked volunteers to describe their meaning, taking the majority response as the accurate one.
  • The camera tracks 24 "feature points" on your conversation partner's face, and software developed by Picard analyses their myriad micro-expressions, how often they appear and for how long. It then compares that data with its bank of known expressions (see diagram). (A minimal nearest-neighbour sketch of this matching step follows this list.)
  • Eventually, she thinks the system could be incorporated into a pair of augmented-reality glasses, which would overlay computer graphics onto the scene in front of the wearer.
  • the average person only managed to interpret, correctly, 54 per cent of Baron-Cohen's expressions on real, non-acted faces. This suggested to them that most people - not just those with autism - could use some help sensing the mood of people they are talking to.
  • set up a company called Affectiva, based in Waltham, Massachusetts, which is selling their expression recognition software. Their customers include companies that, for example, want to measure how people feel about their adverts or movies.
  • it's hard to fool the machine for long
  • In addition to facial expressions, we radiate a panoply of involuntary "honest signals", a term identified by MIT Media Lab researcher Alex Pentland in the early 2000s to describe the social signals that we use to augment our language. They include body language such as gesture mirroring, and cues such as variations in the tone and pitch of the voice. We do respond to these cues, but often not consciously. If we were more aware of them in others and ourselves, then we would have a fuller picture of the social reality around us, and be able to react more deliberately.
  • develop a small electronic badge that hangs around the neck. Its audio sensors record how aggressive the wearer is being, the pitch, volume and clip of their voice, and other factors. They called it the "jerk-o-meter".
  • it helped people realise when they were being either obnoxious or unduly self-effacing.
  • By the end of the experiment, all the dots had gravitated towards more or less the same size and colour. Simply being able to see their role in a group made people behave differently, and caused the group dynamics to become more even. The entire group's emotional intelligence had increased.
  • Some of our body's responses during a conversation are not designed for broadcast to another person - but it's possible to monitor those too. Your temperature and skin conductance can also reveal secrets about your emotional state, and Picard can tap them with a glove-like device called the Q Sensor. In response to stresses, good or bad, our skin becomes clammy, increasing its conductance, and the Q Sensor picks this up.
  • Physiological responses can now even be tracked remotely, in principle without your consent. Last year, Picard and one of her graduate students showed that it was possible to measure heart rate without any surface contact with the body. They used software linked to an ordinary webcam to read information about heart rate, blood pressure and skin temperature based on, among other things, colour changes in the subject's face
  • In Rio de Janeiro and São Paulo, police officers can decide whether someone is a criminal just by looking at them. Their glasses scan the features of a face, and match them against a database of criminal mugshots. A red light blinks if there's a match.
  • Thad Starner at Georgia Institute of Technology in Atlanta wears a small device he has built that looks like a monocle. It can retrieve video, audio or text snippets of past conversations with people he has spoken with, and even provide real-time links between past chats and topics he is currently discussing.
  • The US military has built a radar-imaging device that can see through walls to capture 3D images of people and objects beyond.
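At its core, the expression-matching step described above (tracking facial feature points, then comparing them against a bank of labelled expressions) is nearest-neighbour classification. Below is a minimal sketch with made-up two-dimensional stand-ins for the 24 tracked feature points; real systems such as Affectiva's use far richer features and more sophisticated models.

```python
# Toy nearest-neighbour "expression matcher". The 2-D feature vectors below
# are invented stand-ins for the 24 tracked facial feature points.
import math

known_expressions = {
    "agreeing":      (0.8, 0.2),
    "confused":      (0.1, 0.9),
    "concentrating": (0.5, 0.5),
}

def classify(observed):
    """Return the labelled expression whose stored vector is closest."""
    return min(known_expressions,
               key=lambda label: math.dist(observed, known_expressions[label]))

print(classify((0.2, 0.8)))  # -> 'confused'
```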

Why Science Majors Change Their Minds (It's Just So Darn Hard) - NYTimes.com - 1 views

  • roughly 40 percent of students planning engineering and science majors end up switching to other subjects or failing to get any degree. That increases to as much as 60 percent when pre-medical students, who typically have the strongest SAT scores and high school science preparation, are included
  • the attrition rate can be higher at the most selective schools, where he believes the competition overwhelms even well-qualified students.
  • the main majors are difficult and growing more complex. Some students still lack math preparation or aren’t willing to work hard enough.
  • there could be more subtle problems at work, like the proliferation of grade inflation in the humanities and social sciences, which provides another incentive for students to leave STEM majors. It is no surprise that grades are lower in math and science, where the answers are clear-cut and there are no bonus points for flair. Professors also say they are strict because science and engineering courses build on one another, and a student who fails to absorb the key lessons in one class will flounder in the next.
  • The National Science Board, a public advisory body, warned in the mid-1980s that students were losing sight of why they wanted to be scientists and engineers in the first place. Research confirmed in the 1990s that students learn more by grappling with open-ended problems, like creating a computer game or designing an alternative energy system, than listening to lectures. While the National Science Foundation went on to finance pilot courses that employed interactive projects, when the money dried up, so did most of the courses. Lecture classes are far cheaper to produce, and top professors are focused on bringing in research grants, not teaching undergraduates.
  • Since becoming Notre Dame’s dean in 2008, Dr. Kilpatrick has revamped and expanded a freshman design course that had gotten “a little bit stale.” The students now do four projects. They build Lego robots and design bridges capable of carrying heavy loads at minimal cost. They also create electronic circuit boards and dream up a project of their own.
  • Some new students do not have a good feel for how deeply technical engineering is. Other bright students may have breezed through high school without developing disciplined habits. By contrast, students in China and India focus relentlessly on math and science from an early age.

Actually, Some Material Goods Can Make You Happy - The Atlantic - 1 views

  • In many studies, participants are asked to think about material items as purchases made "in order to have," in contrast with experiences—purchases made "in order to do." This, they say, neglects a category of goods: those made in order to have experiences,  such as electronics, musical instruments, and sports and outdoors gear.
  • Do such "experiential goods," as Guevarra and Howell call them, leave our well-being unimproved, as is the case with most goods, or do they contribute positively to our happiness?
  • In a series of experiments, Guevarra and Howell find that the latter is the case: experiential goods made people happier, just like the experiences themselves.
  • What is it about experiences? It's not the fact of having an experience per se but that experiences can "satisf[y] the psychological needs of autonomy, competence, and relatedness." Talking to friends, mastering a skill, expressing oneself through art or writing—all of these provide a measure of fulfillment that merely owning a thing cannot.
  • Experiential goods fit in under this framework because they likewise can satisfy those same psychological needs. A musical instrument, for example, makes possible a sort of human happiness hat trick: Finely tune your skills, get the happiness of mastery (competence); play your heart out, get the happiness of self-expression (autonomy); jam with friends, get the happiness of connecting with others (relatedness).
  • "Spend your money on experiences, not things" remains a good basic rule. But it's possible to tweak it slightly to better reflect the drivers of human happiness: "Spend your money on competence, autonomy, and relatedness." That doesn't quite have the same ring to it, but it'll guide you wisely.

Among the Disrupted - NYTimes.com - 0 views

  • Writers hover between a decent poverty and an indecent one; they are expected to render the fruits of their labors for little and even for nothing, and all the miracles of electronic dissemination somehow do not suffice for compensation, either of the fiscal or the spiritual kind.
  • Journalistic institutions slowly transform themselves into silent sweatshops in which words cannot wait for thoughts, and first responses are promoted into best responses, and patience is a professional liability.
  • the discussion of culture is being steadily absorbed into the discussion of business. There are “metrics” for phenomena that cannot be metrically measured. Numerical values are assigned to things that cannot be captured by numbers. Economic concepts go rampaging through noneconomic realms:
  • Quantification is the most overwhelming influence upon the contemporary American understanding of, well, everything. It is enabled by the idolatry of data, which has itself been enabled by the almost unimaginable data-generating capabilities of the new technology
  • The distinction between knowledge and information is a thing of the past, and there is no greater disgrace than to be a thing of the past.
  • even as technologism, which is not the same as technology, asserts itself over more and more precincts of human life, so too does scientism, which is not the same as science.
  • The notion that the nonmaterial dimensions of life must be explained in terms of the material dimensions, and that nonscientific understandings must be translated into scientific understandings if they are to qualify as knowledge, is increasingly popular inside and outside the university
  • The contrary insistence that the glories of art and thought are not evolutionary adaptations, or that the mind is not the brain, or that love is not just biology’s bait for sex, now amounts to a kind of heresy.
  • So, too, does the view that the strongest defense of the humanities lies not in the appeal to their utility — that literature majors may find good jobs, that theaters may economically revitalize neighborhoods — but rather in the appeal to their defiantly nonutilitarian character, so that individuals can know more than how things work, and develop their powers of discernment and judgment, their competence in matters of truth and goodness and beauty, to equip themselves adequately for the choices and the crucibles of private and public life.
  • are we becoming posthumanists?
  • In American culture right now, as I say, the worldview that is ascendant may be described as posthumanism.
  • The posthumanism of the 1970s and 1980s was more insular, an academic affair of “theory,” an insurgency of professors; our posthumanism is a way of life, a social fate.
  • In “The Age of the Crisis of Man: Thought and Fiction in America, 1933-1973,” the gifted essayist Mark Greif, who reveals himself to be also a skillful historian of ideas, charts the history of the 20th-century reckonings with the definition of “man.”

Physicists in Europe Find Tantalizing Hints of a Mysterious New Particle - The New York... - 1 views

  • seen traces of what could be a new fundamental particle of nature.
  • One possibility, out of a gaggle of wild and not-so-wild ideas springing to life as the day went on, is that the particle — assuming it is real — is a heavier version of the Higgs boson, a particle that explains why other particles have mass. Another is that it is a graviton, the supposed quantum carrier of gravity, whose discovery could imply the existence of extra dimensions of space-time.
  • At the end of a long chain of “ifs” could be a revolution, the first clues to a theory of nature that goes beyond the so-called Standard Model, which has ruled physics for the last quarter-century.
  • noting that the history of particle physics is rife with statistical flukes and anomalies that disappeared when more data was compiled
  • A coincidence is the most probable explanation for the surprising bumps in data from the collider, physicists from the experiments cautioned
  • Physicists could not help wondering if history was about to repeat itself. It was four years ago this week that the same two teams’ detection of matching bumps in Large Hadron Collider data set the clock ticking for the discovery of the Higgs boson six months later.
  • If the particle is real, Dr. Lykken said, physicists should know by this summer, when they will have 10 times as much data to present to scientists from around the world who will convene in Chicago
  • The Higgs boson was the last missing piece of the Standard Model, which explains all we know about subatomic particles and forces. But there are questions this model does not answer, such as what happens at the bottom of a black hole, the identity of the dark matter and dark energy that rule the cosmos, or why the universe is matter and not antimatter.
  • CERN physicists have been running their collider at nearly twice the energy with which they discovered the Higgs, firing twin beams of protons with 6.5 trillion electron volts of energy at each other in search of new particles to help point them to deeper laws. The main news since then has been that there is no news yet, only tantalizing hints, bumps in the data, that might be new particles and signposts of new theories, or statistical demons. (A quick unit-conversion sketch of the quoted beam energy follows this list.)
  • Or it could be a more massive particle that has decayed in steps down to a pair of photons. Nobody knows. No model predicted this, which is how some scientists like it.
  • “The more nonstandard the better,” said Joe Lykken, the director of research at the Fermi National Accelerator Laboratory and a member of one of the CERN teams. “It will give people a lot to think about. We get paid to speculate.”
  • When physicists announced in 2012 that they had indeed discovered the Higgs boson, it was not the end of physics. It was not even, to paraphrase Winston Churchill, the beginning of the end. It might, they hoped, be the end of the beginning.
  • Such a discovery would augur a fruitful future for cosmological wanderings and for the CERN collider, which will be running for the next 20 years.
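The quoted beam energy is simple to restate in everyday units. A quick sketch, using only the 6.5 TeV figure from the annotation and the defined value of the electronvolt; two counter-rotating beams colliding head-on give 13 TeV in total.

```python
# Convert the quoted LHC beam energy to joules.
EV_TO_JOULES = 1.602176634e-19             # definition of the electronvolt

beam_energy_ev = 6.5e12                    # 6.5 TeV per proton, as quoted
collision_energy_ev = 2 * beam_energy_ev   # two head-on beams: 13 TeV total

print(f"Per proton: {beam_energy_ev * EV_TO_JOULES:.2e} J")    # ~1.04e-06 J
print(f"Per collision: {collision_energy_ev / 1e12:.0f} TeV "
      f"= {collision_energy_ev * EV_TO_JOULES:.2e} J")         # ~2.08e-06 J
```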

After the Fact - The New Yorker - 1 views

  • newish is the rhetoric of unreality, the insistence, chiefly by Democrats, that some politicians are incapable of perceiving the truth because they have an epistemological deficit: they no longer believe in evidence, or even in objective reality.
  • the past of proof is strange and, on its uncertain future, much in public life turns. In the end, it comes down to this: the history of truth is cockamamie, and lately it’s been getting cockamamier.
  • Michael P. Lynch is a philosopher of truth. His fascinating new book, “The Internet of Us: Knowing More and Understanding Less in the Age of Big Data,” begins with a thought experiment: “Imagine a society where smartphones are miniaturized and hooked directly into a person’s brain.” As thought experiments go, this one isn’t much of a stretch. (“Eventually, you’ll have an implant,” Google’s Larry Page has promised, “where if you think about a fact it will just tell you the answer.”) Now imagine that, after living with these implants for generations, people grow to rely on them, to know what they know and forget how people used to learn—by observation, inquiry, and reason. Then picture this: overnight, an environmental disaster destroys so much of the planet’s electronic-communications grid that everyone’s implant crashes. It would be, Lynch says, as if the whole world had suddenly gone blind. There would be no immediate basis on which to establish the truth of a fact. No one would really know anything anymore, because no one would know how to know. I Google, therefore I am not.
  • In England, the abolition of trial by ordeal led to the adoption of trial by jury for criminal cases. This required a new doctrine of evidence and a new method of inquiry, and led to what the historian Barbara Shapiro has called “the culture of fact”: the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth and the only kind of evidence that’s admissible not only in court but also in other realms where truth is arbitrated. Between the thirteenth century and the nineteenth, the fact spread from law outward to science, history, and journalism.
  • Lynch isn’t terribly interested in how we got here. He begins at the arrival gate. But altering the flight plan would seem to require going back to the gate of departure.
  • Lynch thinks we are frighteningly close to this point: blind to proof, no longer able to know. After all, we’re already no longer able to agree about how to know. (See: climate change, above.)
  • Empiricists believed they had deduced a method by which they could discover a universe of truth: impartial, verifiable knowledge. But the movement of judgment from God to man wreaked epistemological havoc.
  • For the length of the eighteenth century and much of the nineteenth, truth seemed more knowable, but after that it got murkier. Somewhere in the middle of the twentieth century, fundamentalism and postmodernism, the religious right and the academic left, met up: either the only truth is the truth of the divine or there is no truth; for both, empiricism is an error.
  • That epistemological havoc has never ended: much of contemporary discourse and pretty much all of American politics is a dispute over evidence. An American Presidential debate has a lot more in common with trial by combat than with trial by jury,
  • came the Internet. The era of the fact is coming to an end: the place once held by “facts” is being taken over by “data.” This is making for more epistemological mayhem, not least because the collection and weighing of facts require investigation, discernment, and judgment, while the collection and analysis of data are outsourced to machines
  • “Most knowing now is Google-knowing—knowledge acquired online,”
  • We now only rarely discover facts, Lynch observes; instead, we download them.
  • “The Internet didn’t create this problem, but it is exaggerating it,”
  • nothing could be less well settled in the twenty-first century than whether people know what they know from faith or from facts, or whether anything, in the end, can really be said to be fully proved.
  • In his 2012 book, “In Praise of Reason,” Lynch identified three sources of skepticism about reason: the suspicion that all reasoning is rationalization, the idea that science is just another faith, and the notion that objectivity is an illusion. These ideas have a specific intellectual history, and none of them are on the wane.
  • Their consequences, he believes, are dire: “Without a common background of standards against which we measure what counts as a reliable source of information, or a reliable method of inquiry, and what doesn’t, we won’t be able to agree on the facts, let alone values.”
  • When we Google-know, Lynch argues, we no longer take responsibility for our own beliefs, and we lack the capacity to see how bits of facts fit into a larger whole
  • Essentially, we forfeit our reason and, in a republic, our citizenship. You can see how this works every time you try to get to the bottom of a story by reading the news on your smartphone.
  • what you see when you Google “Polish workers” is a function of, among other things, your language, your location, and your personal Web history. Reason can’t defend itself. Neither can Google.
  • Trump doesn’t reason. He’s a lot like that kid who stole my bat. He wants combat. Cruz’s appeal is to the judgment of God. “Father God, please . . . awaken the body of Christ, that we might pull back from the abyss,” he preached on the campaign trail. Rubio’s appeal is to Google.
  • Is there another appeal? People who care about civil society have two choices: find some epistemic principles other than empiricism on which everyone can agree or else find some method other than reason with which to defend empiricism
  • Lynch suspects that doing the first of these things is not possible, but that the second might be. He thinks the best defense of reason is a common practical and ethical commitment.
  • That, anyway, is what Alexander Hamilton meant in the Federalist Papers, when he explained that the United States is an act of empirical inquiry: “It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.”

Was There a Civilization On Earth Before Humans? - The Atlantic - 0 views

  • When it comes to direct evidence of an industrial civilization—things like cities, factories, and roads—the geologic record doesn’t go back past what’s called the Quaternary period 2.6 million years ago
  • if we’re going back this far, we’re not talking about human civilizations anymore. Homo sapiens didn’t make their appearance on the planet until just 300,000 years or so ago. That means the question shifts to other species, which is why Gavin called the idea the Silurian hypothesis
  • could researchers find clear evidence that an ancient species built a relatively short-lived industrial civilization long before our own? Perhaps, for example, some early mammal rose briefly to civilization building during the Paleocene epoch about 60 million years ago. There are fossils, of course. But the fraction of life that gets fossilized is always minuscule and varies a lot depending on time and habitat. It would be easy, therefore, to miss an industrial civilization that only lasted 100,000 years—which would be 500 times longer than our industrial civilization has made it so far.
  • ...11 more annotations...
  • Given that all direct evidence would be long gone after many millions of years, what kinds of evidence might then still exist? The best way to answer this question is to figure out what evidence we’d leave behind if human civilization collapsed at its current stage of development.
  • Now that our industrial civilization has truly gone global, humanity’s collective activity is laying down a variety of traces that will be detectable by scientists 100 million years in the future. The extensive use of fertilizer, for example
  • And then there’s all that plastic. Studies have shown increasing amounts of plastic “marine litter” are being deposited on the seafloor everywhere from coastal areas to deep basins and even in the Arctic. Wind, sun, and waves grind down large-scale plastic artifacts, leaving the seas full of microscopic plastic particles that will eventually rain down on the ocean floor, creating a layer that could persist for geological timescales.
  • Likewise our relentless hunger for the rare-earth elements used in electronic gizmos. Far more of these atoms are now wandering around the planet’s surface because of us than would otherwise be the case. They might show up in future sediments, too.
  • Once a civilization realizes, through climate change, that it needs to find lower-impact energy sources, it starts leaving a lighter mark. So the more sustainable your civilization becomes, the smaller the signal you’ll leave for future generations.
  • The more fossil fuels we burn, the more the balance of these carbon isotopes (the ratio of carbon-13 to carbon-12) shifts; the standard notation for this shift is sketched after this list. Atmospheric scientists call this shift the Suess effect, and the change in isotopic ratios of carbon due to fossil-fuel use is easy to see over the last century. Increases in temperature also leave isotopic signals. These shifts should be apparent to any future scientist who chemically analyzes exposed layers of rock from our era. Along with these spikes, this Anthropocene layer might also hold brief peaks in nitrogen, plastic nanoparticles, and even synthetic steroids
  • Fifty-six million years ago, Earth passed through the Paleocene-Eocene Thermal Maximum (PETM). During the PETM, the planet’s average temperature climbed as high as 15 degrees Fahrenheit above what we experience today. It was a world almost without ice, as typical summer temperatures at the poles reached close to a balmy 70 degrees Fahrenheit.
  • While there is evidence that the PETM may have been driven by a massive release of buried fossil carbon into the air, it’s the timescale of these changes that matter. The PETM’s isotope spikes rise and fall over a few hundred thousand years. But what makes the Anthropocene so remarkable in terms of Earth’s history is the speed at which we’re dumping fossil carbon into the atmosphere. There have been geological periods where Earth’s CO2 has been as high or higher than today, but never before in the planet’s multibillion-year history has so much buried carbon been dumped back into the atmosphere so quickly
  • So the isotopic spikes we do see in the geologic record may not be spiky enough to fit the Silurian hypothesis’s bill.
  • Ironically, however, the most promising marker of humanity’s presence as an advanced civilization is a by-product of one activity that may threaten it most.
  • “How do you know we’re the only time there’s been a civilization on our own planet?”
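A notation gloss on the isotope shifts above (standard geochemistry convention, assumed here rather than taken from the article): the shift is reported in per-mil delta units relative to a reference standard,

\[ \delta^{13}\mathrm{C} = \left( \frac{\left(^{13}\mathrm{C}/^{12}\mathrm{C}\right)_{\text{sample}}}{\left(^{13}\mathrm{C}/^{12}\mathrm{C}\right)_{\text{standard}}} - 1 \right) \times 1000\ \text{‰} \]

Because photosynthesis favors the lighter carbon-12, fossil carbon is depleted in carbon-13, so burning it pushes atmospheric \(\delta^{13}\mathrm{C}\) steadily downward; that rapid fall is the kind of signature a future geologist could read in our strata.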

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • ...39 more annotations...
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.