
TOK Friends / Group items tagged exhaustion


Javier E

Does Thinking Really Hard Burn More Calories?: Scientific American - 0 views

  • Just as vigorous exercise tires our bodies, intellectual exertion should drain the brain. What the latest science reveals, however, is that the popular notion of mental exhaustion is too simplistic. The brain continuously slurps up huge amounts of energy for an organ of its size, regardless of whether we are tackling integral calculus or clicking through the week's top 10 LOLcats. Although firing neurons summon extra blood, oxygen and glucose, any local increases in energy consumption are tiny compared with the brain's gluttonous baseline intake. So, in most cases, short periods of additional mental effort require a little more brainpower than usual, but not much more.
  • something must explain the feeling of mental exhaustion, even if its physiology differs from physical fatigue. Simply believing that our brains have expended a lot of effort might be enough to make us lethargic.
  • a typical adult human brain runs on around 12 watts—a fifth of the power required by a standard 60-watt lightbulb. Compared with most other organs, the brain is greedy; pitted against man-made electronics, it is astoundingly efficient. IBM's Watson, the supercomputer that defeated Jeopardy! champions, depends on ninety IBM Power 750 servers, each of which requires around one thousand watts.
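The wattage comparison in the annotation above works out as follows. A quick back-of-envelope check, using only the figures quoted in the excerpt:

```python
# Figures from the excerpt: a brain at ~12 W, a 60 W bulb,
# and Watson's ninety servers at roughly 1,000 W each.
brain_watts = 12
bulb_watts = 60
watson_watts = 90 * 1000

print(brain_watts / bulb_watts)    # 0.2 -> a fifth of a lightbulb
print(watson_watts / brain_watts)  # 7500.0 -> Watson draws the power of ~7,500 brains
```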
  • Such fatigue seems much more likely to follow sustained mental effort that we do not seek for pleasure—such as the obligatory SAT—especially when we expect that the ordeal will drain our brains. If we think an exam or puzzle will be difficult, it often will be.
  • people routinely enjoy intellectually invigorating activities without suffering mental exhaustion.
  • Studies have shown that something similar happens when people exercise and play sports: a large component of physical exhaustion is in our heads. In related research, volunteers who cycled on an exercise bike following a 90-minute computerized test of sustained attention quit pedaling from exhaustion sooner than participants who watched emotionally neutral documentaries before exercising.
  • In the specific case of the SAT, something beyond pure mental effort likely contributes to post-exam stupor: stress. After all, the brain does not function in a vacuum. Other organs burn up energy, too. Taking an exam that partially determines where one will spend the next four years is nerve-racking enough to send stress hormones swimming through the blood stream, induce sweating, quicken heart rates and encourage fidgeting and contorted body postures. The SAT and similar trials are not just mentally taxing—they are physically exhausting, too.
Javier E

Today's Exhausted Superkids - The New York Times - 1 views

  • Sleep deprivation is just a part of the craziness, but it’s a perfect shorthand for childhoods bereft of spontaneity, stripped of real play and haunted by the “pressure of perfection,” to quote the headline on a story by Julie Scelfo in The Times this week.
  • In a study in the medical journal Pediatrics this year, about 55 percent of American teenagers from the ages of 14 to 17 reported that they were getting less than seven hours a night, though the National Sleep Foundation counsels 8 to 10.
  • Smartphones and tablets aggravate the problem, keeping kids connected and distracted long after lights out. But in communities where academic expectations run highest, the real culprit is panic: about acing the exam, burnishing the transcript, keeping up with high-achieving peers.
  • “No one is arguing for a generation of mediocre or underachieving kids — but plenty of people have begun arguing for a redefinition of what it means to achieve at all,” wrote Jeffrey Kluger in Time magazine last week. He noted, rightly, that “somewhere between the self-esteem building of going for the gold and the self-esteem crushing of the Ivy-or-die ethos, there has to be a place where kids can breathe.”
Javier E

The Coming Software Apocalypse - The Atlantic - 1 views

  • Our standard framework for thinking about engineering failures—reflected, for instance, in regulations for medical devices—was developed shortly after World War II, before the advent of software, for electromechanical systems. The idea was that you make something reliable by making its parts reliable (say, you build your engine to withstand 40,000 takeoff-and-landing cycles) and by planning for the breakdown of those parts (you have two engines). But software doesn’t break. Intrado’s faulty threshold is not like the faulty rivet that leads to the crash of an airliner. The software did exactly what it was told to do. In fact it did it perfectly. The reason it failed is that it was told to do the wrong thing.
  • Software failures are failures of understanding, and of imagination. Intrado actually had a backup router, which, had it been switched to automatically, would have restored 911 service almost immediately. But, as described in a report to the FCC, “the situation occurred at a point in the application logic that was not designed to perform any automated corrective actions.”
  • The introduction of programming languages like Fortran and C, which resemble English, and tools, known as “integrated development environments,” or IDEs, that help correct simple mistakes (like Microsoft Word’s grammar checker but for code), obscured, though did little to actually change, this basic alienation—the fact that the programmer didn’t work on a problem directly, but rather spent their days writing out instructions for a machine.
  • Code is too hard to think about. Before trying to understand the attempts themselves, then, it’s worth understanding why this might be: what it is about code that makes it so foreign to the mind, and so unlike anything that came before it.
  • Technological progress used to change the way the world looked—you could watch the roads getting paved; you could see the skylines rise. Today you can hardly tell when something is remade, because so often it is remade by code.
  • Software has enabled us to make the most intricate machines that have ever existed. And yet we have hardly noticed, because all of that complexity is packed into tiny silicon chips as millions and millions of lines of code.
  • The programmer, the renowned Dutch computer scientist Edsger Dijkstra wrote in 1988, “has to be able to think in terms of conceptual hierarchies that are much deeper than a single mind ever needed to face before.” Dijkstra meant this as a warning.
  • As programmers eagerly poured software into critical systems, they became, more and more, the linchpins of the built world—and Dijkstra thought they had perhaps overestimated themselves.
  • What made programming so difficult was that it required you to think like a computer.
  • “The problem is that software engineers don’t understand the problem they’re trying to solve, and don’t care to,” says Leveson, the MIT software-safety expert. The reason is that they’re too wrapped up in getting their code to work.
  • Though he runs a lab that studies the future of computing, he seems less interested in technology per se than in the minds of the people who use it. Like any good toolmaker, he has a way of looking at the world that is equal parts technical and humane. He graduated at the top of his class in electrical engineering at the California Institute of Technology.
  • “The serious problems that have happened with software have to do with requirements, not coding errors.” When you’re writing code that controls a car’s throttle, for instance, what’s important is the rules about when and how and by how much to open it. But these systems have become so complicated that hardly anyone can keep them straight in their head. “There’s 100 million lines of code in cars now,” Leveson says. “You just cannot anticipate all these things.”
  • a nearly decade-long investigation into claims of so-called unintended acceleration in Toyota cars. Toyota blamed the incidents on poorly designed floor mats, “sticky” pedals, and driver error, but outsiders suspected that faulty software might be responsible
  • software experts spent 18 months with the Toyota code, picking up where NASA left off. Barr described what they found as “spaghetti code,” programmer lingo for software that has become a tangled mess. Code turns to spaghetti when it accretes over many years, with feature after feature piling on top of, and being woven around, what’s already there.
  • Using the same model as the Camry involved in the accident, Barr’s team demonstrated that there were actually more than 10 million ways for the onboard computer to cause unintended acceleration. They showed that as little as a single bit flip—a one in the computer’s memory becoming a zero or vice versa—could make a car run out of control. The fail-safe code that Toyota had put in place wasn’t enough to stop it
  • In all, Toyota recalled more than 9 million cars, and paid nearly $3 billion in settlements and fines related to unintended acceleration.
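The "single bit flip" described in the annotation above can be made concrete. A minimal, hypothetical illustration (this is not Toyota's actual code, and the 0-255 throttle scale is an assumption for the example): one inverted bit in memory turns a small command into a large one.

```python
# Illustrative only: how a single memory upset (a one becoming a zero
# or vice versa) can drastically change a stored value.
def flip_bit(value: int, bit: int) -> int:
    """Return `value` with one bit inverted, as a hardware fault might."""
    return value ^ (1 << bit)

throttle = 3                        # a modest command on an assumed 0-255 scale
corrupted = flip_bit(throttle, 7)   # a single upset in the high bit
print(throttle, corrupted)          # 3 -> 131
```

This is why fail-safe code matters: the software around the value behaved exactly as written; the value itself was silently wrong.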
  • The problem is that programmers are having a hard time keeping up with their own creations. Since the 1980s, the way programmers work and the tools they use have changed remarkably little.
  • “Visual Studio is one of the single largest pieces of software in the world,” he said. “It’s over 55 million lines of code. And one of the things that I found out in this study is more than 98 percent of it is completely irrelevant. All this work had been put into this thing, but it missed the fundamental problems that people faced. And the biggest one that I took away from it was that basically people are playing computer inside their head.” Programmers were like chess players trying to play with a blindfold on—so much of their mental energy is spent just trying to picture where the pieces are that there’s hardly any left over to think about the game itself.
  • The fact that the two of them were thinking about the same problem in the same terms, at the same time, was not a coincidence. They had both just seen the same remarkable talk, given to a group of software-engineering students in a Montreal hotel by a computer researcher named Bret Victor. The talk, which went viral when it was posted online in February 2012, seemed to be making two bold claims. The first was that the way we make software is fundamentally broken. The second was that Victor knew how to fix it.
  • This is the trouble with making things out of code, as opposed to something physical. “The complexity,” as Leveson puts it, “is invisible to the eye.”
  • in early 2012, Victor had finally landed upon the principle that seemed to thread through all of his work. (He actually called the talk “Inventing on Principle.”) The principle was this: “Creators need an immediate connection to what they’re creating.” The problem with programming was that it violated the principle. That’s why software systems were so hard to think about, and so rife with bugs: The programmer, staring at a page of text, was abstracted from whatever it was they were actually making.
  • “Our current conception of what a computer program is,” he said, is “derived straight from Fortran and ALGOL in the late ’50s. Those languages were designed for punch cards.”
  • WYSIWYG (pronounced “wizzywig”) came along. It stood for “What You See Is What You Get.”
  • Victor’s point was that programming itself should be like that. For him, the idea that people were doing important work, like designing adaptive cruise-control systems or trying to understand cancer, by staring at a text editor, was appalling.
  • With the right interface, it was almost as if you weren’t working with code at all; you were manipulating the game’s behavior directly.
  • When the audience first saw this in action, they literally gasped. They knew they weren’t looking at a kid’s game, but rather the future of their industry. Most software involved behavior that unfolded, in complex ways, over time, and Victor had shown that if you were imaginative enough, you could develop ways to see that behavior and change it, as if playing with it in your hands. One programmer who saw the talk wrote later: “Suddenly all of my tools feel obsolete.”
  • When John Resig saw the “Inventing on Principle” talk, he scrapped his plans for the Khan Academy programming curriculum. He wanted the site’s programming exercises to work just like Victor’s demos. On the left-hand side you’d have the code, and on the right, the running program: a picture or game or simulation. If you changed the code, it’d instantly change the picture. “In an environment that is truly responsive,” Resig wrote about the approach, “you can completely change the model of how a student learns ... [They] can now immediately see the result and intuit how underlying systems inherently work without ever following an explicit explanation.” Khan Academy has become perhaps the largest computer-programming class in the world, with a million students, on average, actively using the program each month.
  • The ideas spread. The notion of liveness, of being able to see data flowing through your program instantly, made its way into flagship programming tools offered by Google and Apple. The default language for making new iPhone and Mac apps, called Swift, was developed by Apple from the ground up to support an environment, called Playgrounds, that was directly inspired by Light Table.
  • “Typically the main problem with software coding—and I’m a coder myself,” Bantegnie says, “is not the skills of the coders. The people know how to code. The problem is what to code. Because most of the requirements are kind of natural language, ambiguous, and a requirement is never extremely precise, it’s often understood differently by the guy who’s supposed to code.”
  • In a pair of later talks, “Stop Drawing Dead Fish” and “Drawing Dynamic Visualizations,” Victor went one further. He demoed two programs he’d built—the first for animators, the second for scientists trying to visualize their data—each of which took a process that used to involve writing lots of custom code and reduced it to playing around in a WYSIWYG interface.
  • Victor suggested that the same trick could be pulled for nearly every problem where code was being written today. “I’m not sure that programming has to exist at all,” he told me. “Or at least software developers.” In his mind, a software developer’s proper role was to create tools that removed the need for software developers. Only then would people with the most urgent computational problems be able to grasp those problems directly, without the intermediate muck of code.
  • Victor implored professional software developers to stop pouring their talent into tools for building apps like Snapchat and Uber. “The inconveniences of daily life are not the significant problems,” he wrote. Instead, they should focus on scientists and engineers—as he put it to me, “these people that are doing work that actually matters, and critically matters, and using really, really bad tools.”
  • Bantegnie’s company is one of the pioneers in the industrial use of model-based design, in which you no longer write code directly. Instead, you create a kind of flowchart that describes the rules your program should follow (the “model”), and the computer generates code for you based on those rules
  • In a model-based design tool, you’d represent this rule with a small diagram, as though drawing the logic out on a whiteboard, made of boxes that represent different states—like “door open,” “moving,” and “door closed”—and lines that define how you can get from one state to the other. The diagrams make the system’s rules obvious: Just by looking, you can see that the only way to get the elevator moving is to close the door, or that the only way to get the door open is to stop.
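The elevator rules above can be sketched in ordinary code, with the transitions written out as data so the structure stays visible. The state names follow the annotation; everything else is an illustrative sketch, not a real model-based design tool:

```python
# The elevator's legal moves, written as a table rather than scattered
# through control logic. Reading the table is reading the rules.
TRANSITIONS = {
    "door_open":   {"door_closed"},          # must close the door first
    "door_closed": {"door_open", "moving"},  # may reopen, or start moving
    "moving":      {"door_closed"},          # must stop before the door opens
}

def step(state: str, target: str) -> str:
    """Advance to `target` only if the rules allow it."""
    if target not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition: {state} -> {target}")
    return target
```

Just by scanning the table you can see that the only path from "door_open" to "moving" passes through "door_closed" — the property a model-based tool makes obvious by drawing it.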
  • In traditional programming, your task is to take complex rules and translate them into code; most of your energy is spent doing the translating, rather than thinking about the rules themselves. In the model-based approach, all you have is the rules. So that’s what you spend your time thinking about. It’s a way of focusing less on the machine and more on the problem you’re trying to get it to solve.
  • “Everyone thought I was interested in programming environments,” he said. Really he was interested in how people see and understand systems—as he puts it, in the “visual representation of dynamic behavior.” Although code had increasingly become the tool of choice for creating dynamic behavior, it remained one of the worst tools for understanding it. The point of “Inventing on Principle” was to show that you could mitigate that problem by making the connection between a system’s behavior and its code immediate.
  • On this view, software becomes unruly because the media for describing what software should do—conversations, prose descriptions, drawings on a sheet of paper—are too different from the media describing what software does do, namely, code itself.
  • for this approach to succeed, much of the work has to be done well before the project even begins. Someone first has to build a tool for developing models that are natural for people—that feel just like the notes and drawings they’d make on their own—while still being unambiguous enough for a computer to understand. They have to make a program that turns these models into real code. And finally they have to prove that the generated code will always do what it’s supposed to.
  • This practice brings order and accountability to large codebases. But, Shivappa says, “it’s a very labor-intensive process.” He estimates that before they used model-based design, on a two-year-long project only two to three months was spent writing code—the rest was spent working on the documentation.
  • Much of the benefit of the model-based approach comes from being able to add requirements on the fly while still ensuring that existing ones are met; with every change, the computer can verify that your program still works. You’re free to tweak your blueprint without fear of introducing new bugs. Your code is, in FAA parlance, “correct by construction.”
  • “people are not so easily transitioning to model-based software development: They perceive it as another opportunity to lose control, even more than they have already.”
  • The bias against model-based design, sometimes known as model-driven engineering, or MDE, is in fact so ingrained that according to a recent paper, “Some even argue that there is a stronger need to investigate people’s perception of MDE than to research new MDE technologies.”
  • “Human intuition is poor at estimating the true probability of supposedly ‘extremely rare’ combinations of events in systems operating at a scale of millions of requests per second,” he wrote in a paper. “That human fallibility means that some of the more subtle, dangerous bugs turn out to be errors in design; the code faithfully implements the intended design, but the design fails to correctly handle a particular ‘rare’ scenario.”
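The arithmetic behind that warning is worth making explicit. A back-of-envelope calculation (the rates here are assumed for illustration, not taken from Amazon's systems): an event that is "one in a billion" per request is routine at web scale.

```python
# A "one in a billion" per-request event, at an assumed load of a
# million requests per second, sustained for one day.
requests_per_second = 1_000_000
per_request_probability = 1e-9                  # "extremely rare"
seconds_per_day = 86_400

expected_per_day = requests_per_second * seconds_per_day * per_request_probability
print(expected_per_day)   # ~86.4 expected occurrences every single day
```

At that scale, "supposedly rare" combinations of events are not edge cases; they are a daily certainty, which is why the subtle design bugs Newcombe describes eventually surface.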
  • Newcombe was convinced that the algorithms behind truly critical systems—systems storing a significant portion of the web’s data, for instance—ought to be not just good, but perfect. A single subtle bug could be catastrophic. But he knew how hard bugs were to find, especially as an algorithm grew more complex. You could do all the testing you wanted and you’d never find them all.
  • An algorithm written in TLA+ could in principle be proven correct. In practice, it allowed you to create a realistic model of your problem and test it not just thoroughly, but exhaustively. This was exactly what he’d been looking for: a language for writing perfect algorithms.
  • TLA+, which stands for “Temporal Logic of Actions,” is similar in spirit to model-based design: It’s a language for writing down the requirements—TLA+ calls them “specifications”—of computer programs. These specifications can then be completely verified by a computer. That is, before you write any code, you write a concise outline of your program’s logic, along with the constraints you need it to satisfy
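The "exhaustive" checking described above can be evoked with a toy analogue. This is a sketch of the idea in Python, not TLA+ itself, and the lost-update scenario is a standard illustrative example rather than anything from the article: instead of running one execution, enumerate every interleaving of two processes that each do a read-increment-write on a shared counter, and check the invariant that the final value is 2.

```python
# A toy explicit-state check: enumerate all interleavings, run each,
# and look for violations of the invariant (final counter == 2).
from itertools import permutations

def run(schedule):
    """Execute one interleaving of read/write ops on a shared counter."""
    shared, local = 0, {}
    for proc, op in schedule:
        if op == "read":
            local[proc] = shared            # process copies the shared value
        else:                               # "write"
            shared = local[proc] + 1        # writes back its stale copy plus one
    return shared

ops = [("A", "read"), ("A", "write"), ("B", "read"), ("B", "write")]
# Keep only schedules where each process reads before it writes.
schedules = [s for s in permutations(ops)
             if all(s.index((p, "read")) < s.index((p, "write")) for p in "AB")]
violations = [s for s in schedules if run(s) != 2]
print(len(schedules), len(violations))  # 6 interleavings; 4 break the invariant
```

Ordinary testing might run one or two of these interleavings and pass; exhaustive enumeration finds that most of them lose an update. Real model checkers do this over state spaces with billions of states, which is why a specification language like TLA+ pays off where testing cannot.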
  • Programmers are drawn to the nitty-gritty of coding because code is what makes programs go; spending time on anything else can seem like a distraction. And there is a patient joy, a meditative kind of satisfaction, to be had from puzzling out the micro-mechanics of code. But code, Lamport argues, was never meant to be a medium for thought. “It really does constrain your ability to think when you’re thinking in terms of a programming language,”
  • Code makes you miss the forest for the trees: It draws your attention to the working of individual pieces, rather than to the bigger picture of how your program fits together, or what it’s supposed to do—and whether it actually does what you think. This is why Lamport created TLA+. As with model-based design, TLA+ draws your focus to the high-level structure of a system, its essential logic, rather than to the code that implements it.
  • But TLA+ occupies just a small, far corner of the mainstream, if it can be said to take up any space there at all. Even to a seasoned engineer like Newcombe, the language read at first as bizarre and esoteric—a zoo of symbols.
  • this is a failure of education. Though programming was born in mathematics, it has since largely been divorced from it. Most programmers aren’t very fluent in the kind of math—logic and set theory, mostly—that you need to work with TLA+. “Very few programmers—and including very few teachers of programming—understand the very basic concepts and how they’re applied in practice. And they seem to think that all they need is code,” Lamport says. “The idea that there’s some higher level than the code in which you need to be able to think precisely, and that mathematics actually allows you to think precisely about it, is just completely foreign. Because they never learned it.”
  • “In the 15th century,” he said, “people used to build cathedrals without knowing calculus, and nowadays I don’t think you’d allow anyone to build a cathedral without knowing calculus. And I would hope that after some suitably long period of time, people won’t be allowed to write programs if they don’t understand these simple things.”
  • Programmers, as a species, are relentlessly pragmatic. Tools like TLA+ reek of the ivory tower. When programmers encounter “formal methods” (so called because they involve mathematical, “formally” precise descriptions of programs), their deep-seated instinct is to recoil.
  • Formal methods had an image problem. And the way to fix it wasn’t to implore programmers to change—it was to change yourself. Newcombe realized that to bring tools like TLA+ to the programming mainstream, you had to start speaking their language.
  • he presented TLA+ as a new kind of “pseudocode,” a stepping-stone to real code that allowed you to exhaustively test your algorithms—and that got you thinking precisely early on in the design process. “Engineers think in terms of debugging rather than ‘verification,’” he wrote, so he titled his internal talk on the subject to fellow Amazon engineers “Debugging Designs.” Rather than bemoan the fact that programmers see the world in code, Newcombe embraced it. He knew he’d lose them otherwise. “I’ve had a bunch of people say, ‘Now I get it,’” Newcombe says.
  • In the world of the self-driving car, software can’t be an afterthought. It can’t be built like today’s airline-reservation systems or 911 systems or stock-trading systems. Code will be put in charge of hundreds of millions of lives on the road and it has to work. That is no small task.
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts.
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency.
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
Javier E

Opinion | Two visions of 'normal' collided in our abnormal pandemic year - The Washingt... - 0 views

  • The date was Sept. 17, 2001. The rubble was still smoking. As silly as this sounds, I was hoping it would make me cry.
  • That didn’t happen. The truth is, it still looked like something on television, a surreal shot from a disaster movie. I was stunned but unmoved.
  • Later, trying to understand the difference between those two moments, I told people, “The rubble still didn’t feel real.”
  • ...11 more annotations...
  • now, after a year of pandemic, I realize that wasn’t the problem. The rubble was real, all right. It just wasn’t normal.
  • it always, somehow, came back to that essential human craving for things to be normal, and our inability to believe that they are not, even when presented with compelling evidence.
  • This phenomenon is well-known to cognitive scientists, who have dubbed it “normalcy bias.”
  • the greater risk is more often the opposite: People can’t quite believe. They ignore the fire alarm, defy the order to evacuate ahead of the hurricane, or pause to grab their luggage when exiting the crashed plane. Too often, they die.
  • Calling the quest for normalcy a bias makes it sound bad, but most of the time this tendency is a good thing. The world is full of aberrations, most of them meaningless. If we aimed for maximal reaction to every anomaly we encountered, we’d break down from sheer nervous exhaustion.
  • But when things go disastrously wrong, our optimal response is at war with the part of our brain that insists things are fine. We try to reoccupy the old normal even if it’s become radioactive and salted with mines. We still resist the new normal — even when it’s staring us in the face.
  • Nine months into our current disaster, I now see that our bitter divides over pandemic response were most fundamentally a contest between two ideas of what it meant to get “back to normal.”
  • One group wanted to feel as safe as they had before a virus invaded our shores; the other wanted to feel as unfettered
  • The disputes that followed weren’t just a fight to determine whose idea of normal would prevail. They were a battle against an unthinkable reality, which was that neither kind of normalcy was fully possible anymore.
  • I suspect we all might have been less willing to make war on our opponents if only we’d believed that we were fighting people not very different from how we were — exhausted by the whole thing and frantic to feel like themselves again
  • Some catastrophes are simply too big to be understood except in the smallest way, through their most ordinary human details
Javier E

Pandemic Advice From Athletes - The New York Times - 0 views

  • There’s a special kind of exhaustion that the world’s best endurance athletes embrace. Some call it masochistic, others may call it brave. When fatigue sends legs and lungs to their limits, they are able to push through to a gear beyond their pain threshold. These athletes approach fatigue not with fear but as a challenge, an opportunity.
  • It’s a quality that allows an ultramarathoner to endure what could be an unexpectedly rough segment of a 100-mile race, or a sailor to push ahead when she’s in the middle of the ocean, racing through hurricane winds alone.
  • The drive to persevere is something some are born with, but it’s also a muscle everyone can learn to flex. In a way, everyone has become an endurance athlete of sorts during this pandemic, running a race with no finish line that tests the limits of their exhaustion.
  • ...31 more annotations...
  • One message they all had: You are stronger than you think you are, and everyone is able to adapt in ways they didn’t think possible.
  • there are a few techniques to help you along — 100-mile race not required.
  • There’s a pacing in living day to day, just as there’s pacing in climbing.
  • Training to become an elite endurance athlete means learning to embrace discomfort. Instead of hiding from pain, athletes must learn to work with it. A lot of that comes down to pacing
  • Similarly, as you muscle through an ongoing pandemic, you must look for ways to make peace with unknowns and new, uncomfortable realities. “When we think about the coronavirus, we are in it for the long run; so how do you pace yourself?”
  • She recommends thinking about your routines, practicing positive self-talk and focusing on processes instead of outcomes
  • You don’t know when the pandemic will end, but you can take control of your daily habits
  • “always have a little in reserve.”
  • Deplete your resources early and you’ll be in trouble. Focusing on day-to-day activities will pay off in the long run.
  • If you burn out all your mental energy in one day or week, you may find it more difficult to adapt when things don’t return to normal as quickly as you would hope.
  • Pace Yourself
  • “Don’t play all your cards at once and keep a little something in reserve.”
  • Create Mini-Goals
  • Sports psychologists frequently recommend creating mini milestones en route to a big goal. There are many steps on the path from base camp to a mountain’s summit. Likewise, there are smaller, more achievable milestones to reach and celebrate as you venture ahead into the unknown.
  • Focus on Something New
  • “I’m really good at breaking things down into small increments and setting micro-goals,” he said. How micro?
  • “I break things down to 10 seconds at a time,” Mr. Woltering continued. “You just have to be present in what you are doing and you have to know that it may not be the most fun — or super painful — now, but that could change in 10 seconds down the road.”
  • And it may not change quickly. Mr. Woltering said he has spent six-hour stretches counting to 10 over and over again. “You just keep moving and keep counting,” he said. “And you have to have faith that it will change at some point.”
  • Create Structure
  • “Part of expedition life is having a routine that you’re comfortable with. When I’m on expedition, I always start the day with a basin of warm water and soap. I wash my hands, face, neck and ears and get the sand out of my eyes,” he said. “It’s something that’s repeated that gets you a sense of comfort and normalcy.”
  • During the pandemic, he has found comfort and normalcy by getting outdoors, and climbing whenever possible to “run the engine.”
  • Dee Caffari, a British sailor and the first woman to sail solo, nonstop, around the world in both directions, said structure is imperative to fight back loneliness and monotony.
  • “In your day you need structure,” Ms. Caffari said. “You need to get up in the morning knowing you’re going to make something happen.”
  • “Setting goals that are controllable makes it easier to adapt,” Dr. Meijen said. “If you set goals that are controlled by other people, goals that aren’t realistic or are tough or boring, those are much harder to adapt to.”
  • When all else fails, look to something new: a new hobby, a new goal, a new experience
  • During a particularly hard patch of a competition, some athletes say they focus on a different sense, one that perhaps is not at the forefront of their mind when the pain sets in. A runner could note the smells around her and a climber could note the way his hair is blowing in the wind.
  • When athletes are injured, sports psychologists and coaches frequently encourage them to find a new activity to engage their mind and body. The key is to adapt, adapt and then adapt again.
  • “We all want mental toughness, it’s an important part of dealing with difficult things,”
  • “The current definition of mental toughness is the ability to pivot and to be nimble and flexible.”
  • “The next moment is always completely uncertain, and it’s always been that way,” Dr. Gervais said. But adapting, adjusting expectations and discovering new goals or hobbies can allow you to continue to build the muscle that is mental toughness.
  • Bottom line? “Optimism is an antidote to anxiety,”
Javier E

Why the Past 10 Years of American Life Have Been Uniquely Stupid - The Atlantic - 0 views

  • Social scientists have identified at least three major forces that collectively bind together successful democracies: social capital (extensive social networks with high levels of trust), strong institutions, and shared stories.
  • Social media has weakened all three.
  • gradually, social-media users became more comfortable sharing intimate details of their lives with strangers and corporations. As I wrote in a 2019 Atlantic article with Tobias Rose-Stockwell, they became more adept at putting on performances and managing their personal brand—activities that might impress others but that do not deepen friendships in the way that a private phone conversation will.
  • ...118 more annotations...
  • the stage was set for the major transformation, which began in 2009: the intensification of viral dynamics.
  • Before 2009, Facebook had given users a simple timeline––a never-ending stream of content generated by their friends and connections, with the newest posts at the top and the oldest ones at the bottom
  • That began to change in 2009, when Facebook offered users a way to publicly “like” posts with the click of a button. That same year, Twitter introduced something even more powerful: the “Retweet” button, which allowed users to publicly endorse a post while also sharing it with all of their followers.
  • “Like” and “Share” buttons quickly became standard features of most other platforms.
  • Facebook developed algorithms to bring each user the content most likely to generate a “like” or some other interaction, eventually including the “share” as well.
  • Later research showed that posts that trigger emotions––especially anger at out-groups––are the most likely to be shared.
  • By 2013, social media had become a new game, with dynamics unlike those in 2008. If you were skillful or lucky, you might create a post that would “go viral” and make you “internet famous”
  • If you blundered, you could find yourself buried in hateful comments. Your posts rode to fame or ignominy based on the clicks of thousands of strangers, and you in turn contributed thousands of clicks to the game.
  • This new game encouraged dishonesty and mob dynamics: Users were guided not just by their true preferences but by their past experiences of reward and punishment,
  • As a social psychologist who studies emotion, morality, and politics, I saw this happening too. The newly tweaked platforms were almost perfectly designed to bring out our most moralistic and least reflective selves. The volume of outrage was shocking.
  • It was just this kind of twitchy and explosive spread of anger that James Madison had tried to protect us from as he was drafting the U.S. Constitution.
  • The Framers of the Constitution were excellent social psychologists. They knew that democracy had an Achilles’ heel because it depended on the collective judgment of the people, and democratic communities are subject to “the turbulency and weakness of unruly passions.”
  • The key to designing a sustainable republic, therefore, was to build in mechanisms to slow things down, cool passions, require compromise, and give leaders some insulation from the mania of the moment while still holding them accountable to the people periodically, on Election Day.
  • The tech companies that enhanced virality from 2009 to 2012 brought us deep into Madison’s nightmare.
  • a less quoted yet equally important insight, about democracy’s vulnerability to triviality.
  • Madison notes that people are so prone to factionalism that “where no substantial occasion presents itself, the most frivolous and fanciful distinctions have been sufficient to kindle their unfriendly passions and excite their most violent conflicts.”
  • Social media has both magnified and weaponized the frivolous.
  • It’s not just the waste of time and scarce attention that matters; it’s the continual chipping-away of trust.
  • a democracy depends on widely internalized acceptance of the legitimacy of rules, norms, and institutions.
  • when citizens lose trust in elected leaders, health authorities, the courts, the police, universities, and the integrity of elections, then every decision becomes contested; every election becomes a life-and-death struggle to save the country from the other side
  • The most recent Edelman Trust Barometer (an international measure of citizens’ trust in government, business, media, and nongovernmental organizations) showed stable and competent autocracies (China and the United Arab Emirates) at the top of the list, while contentious democracies such as the United States, the United Kingdom, Spain, and South Korea scored near the bottom (albeit above Russia).
  • The literature is complex—some studies show benefits, particularly in less developed democracies—but the review found that, on balance, social media amplifies political polarization; foments populism, especially right-wing populism; and is associated with the spread of misinformation.
  • When people lose trust in institutions, they lose trust in the stories told by those institutions. That’s particularly true of the institutions entrusted with the education of children.
  • Facebook and Twitter make it possible for parents to become outraged every day over a new snippet from their children’s history lessons––and math lessons and literature selections, and any new pedagogical shifts anywhere in the country
  • The motives of teachers and administrators come into question, and overreaching laws or curricular reforms sometimes follow, dumbing down education and reducing trust in it further.
  • young people educated in the post-Babel era are less likely to arrive at a coherent story of who we are as a people, and less likely to share any such story with those who attended different schools or who were educated in a different decade.
  • former CIA analyst Martin Gurri predicted these fracturing effects in his 2014 book, The Revolt of the Public. Gurri’s analysis focused on the authority-subverting effects of information’s exponential growth, beginning with the internet in the 1990s. Writing nearly a decade ago, Gurri could already see the power of social media as a universal solvent, breaking down bonds and weakening institutions everywhere it reached.
  • he notes a constructive feature of the pre-digital era: a single “mass audience,” all consuming the same content, as if they were all looking into the same gigantic mirror at the reflection of their own society.
  • The digital revolution has shattered that mirror, and now the public inhabits those broken pieces of glass. So the public isn’t one thing; it’s highly fragmented, and it’s basically mutually hostile
  • Facebook, Twitter, YouTube, and a few other large platforms unwittingly dissolved the mortar of trust, belief in institutions, and shared stories that had held a large and diverse secular democracy together.
  • I think we can date the fall of the tower to the years between 2011 (Gurri’s focal year of “nihilistic” protests) and 2015, a year marked by the “great awokening” on the left and the ascendancy of Donald Trump on the right.
  • Twitter can overpower all the newspapers in the country, and stories cannot be shared (or at least trusted) across more than a few adjacent fragments—so truth cannot achieve widespread adherence.
  • After Babel, nothing really means anything anymore––at least not in a way that is durable and on which people widely agree.
  • Politics After Babel
  • “Politics is the art of the possible,” the German statesman Otto von Bismarck said in 1867. In a post-Babel democracy, not much may be possible.
  • The ideological distance between the two parties began increasing faster in the 1990s. Fox News and the 1994 “Republican Revolution” converted the GOP into a more combative party.
  • So cross-party relationships were already strained before 2009. But the enhanced virality of social media thereafter made it more hazardous to be seen fraternizing with the enemy or even failing to attack the enemy with sufficient vigor.
  • What changed in the 2010s? Let’s revisit that Twitter engineer’s metaphor of handing a loaded gun to a 4-year-old. A mean tweet doesn’t kill anyone; it is an attempt to shame or punish someone publicly while broadcasting one’s own virtue, brilliance, or tribal loyalties. It’s more a dart than a bullet
  • from 2009 to 2012, Facebook and Twitter passed out roughly 1 billion dart guns globally. We’ve been shooting one another ever since.
  • “devoted conservatives,” comprised 6 percent of the U.S. population.
  • the warped “accountability” of social media has also brought injustice—and political dysfunction—in three ways.
  • First, the dart guns of social media give more power to trolls and provocateurs while silencing good citizens.
  • a small subset of people on social-media platforms are highly concerned with gaining status and are willing to use aggression to do so.
  • Across eight studies, Bor and Petersen found that being online did not make most people more aggressive or hostile; rather, it allowed a small number of aggressive people to attack a much larger set of victims. Even a small number of jerks were able to dominate discussion forums,
  • Additional research finds that women and Black people are harassed disproportionately, so the digital public square is less welcoming to their voices.
  • Second, the dart guns of social media give more power and voice to the political extremes while reducing the power and voice of the moderate majority.
  • The “Hidden Tribes” study, by the pro-democracy group More in Common, surveyed 8,000 Americans in 2017 and 2018 and identified seven groups that shared beliefs and behaviors.
  • Social media has given voice to some people who had little previously, and it has made it easier to hold powerful people accountable for their misdeeds
  • The group furthest to the left, the “progressive activists,” comprised 8 percent of the population. The progressive activists were by far the most prolific group on social media: 70 percent had shared political content over the previous year. The devoted conservatives followed, at 56 percent.
  • These two extreme groups are similar in surprising ways. They are the whitest and richest of the seven groups, which suggests that America is being torn apart by a battle between two subsets of the elite who are not representative of the broader society.
  • they are the two groups that show the greatest homogeneity in their moral and political attitudes.
  • likely a result of thought-policing on social media:
  • political extremists don’t just shoot darts at their enemies; they spend a lot of their ammunition targeting dissenters or nuanced thinkers on their own team.
  • Finally, by giving everyone a dart gun, social media deputizes everyone to administer justice with no due process. Platforms like Twitter devolve into the Wild West, with no accountability for vigilantes.
  • Enhanced-virality platforms thereby facilitate massive collective punishment for small or imagined offenses, with real-world consequences, including innocent people losing their jobs and being shamed into suicide
  • we don’t get justice and inclusion; we get a society that ignores context, proportionality, mercy, and truth.
  • Since the tower fell, debates of all kinds have grown more and more confused. The most pervasive obstacle to good thinking is confirmation bias, which refers to the human tendency to search only for evidence that confirms our preferred beliefs
  • search engines were supercharging confirmation bias, making it far easier for people to find evidence for absurd beliefs and conspiracy theories
  • The most reliable cure for confirmation bias is interaction with people who don’t share your beliefs. They confront you with counterevidence and counterargument.
  • In his book The Constitution of Knowledge, Jonathan Rauch describes the historical breakthrough in which Western societies developed an “epistemic operating system”—that is, a set of institutions for generating knowledge from the interactions of biased and cognitively flawed individuals
  • English law developed the adversarial system so that biased advocates could present both sides of a case to an impartial jury.
  • Newspapers full of lies evolved into professional journalistic enterprises, with norms that required seeking out multiple sides of a story, followed by editorial review, followed by fact-checking.
  • Universities evolved from cloistered medieval institutions into research powerhouses, creating a structure in which scholars put forth evidence-backed claims with the knowledge that other scholars around the world would be motivated to gain prestige by finding contrary evidence.
  • Part of America’s greatness in the 20th century came from having developed the most capable, vibrant, and productive network of knowledge-producing institutions in all of human history
  • But this arrangement, Rauch notes, “is not self-maintaining; it relies on an array of sometimes delicate social settings and understandings, and those need to be understood, affirmed, and protected.”
  • This, I believe, is what happened to many of America’s key institutions in the mid-to-late 2010s. They got stupider en masse because social media instilled in their members a chronic fear of getting darted
  • it was so pervasive that it established new behavioral norms backed by new policies seemingly overnight
  • Participants in our key institutions began self-censoring to an unhealthy degree, holding back critiques of policies and ideas—even those presented in class by their students—that they believed to be ill-supported or wrong.
  • The stupefying process plays out differently on the right and the left because their activist wings subscribe to different narratives with different sacred values.
  • The “Hidden Tribes” study tells us that the “devoted conservatives” score highest on beliefs related to authoritarianism. They share a narrative in which America is eternally under threat from enemies outside and subversives within; they see life as a battle between patriots and traitors.
  • they are psychologically different from the larger group of “traditional conservatives” (19 percent of the population), who emphasize order, decorum, and slow rather than radical change.
  • The traditional punishment for treason is death, hence the battle cry on January 6: “Hang Mike Pence.”
  • Right-wing death threats, many delivered by anonymous accounts, are proving effective in cowing traditional conservatives
  • The wave of threats delivered to dissenting Republican members of Congress has similarly pushed many of the remaining moderates to quit or go silent, giving us a party ever more divorced from the conservative tradition, constitutional responsibility, and reality.
  • The stupidity on the right is most visible in the many conspiracy theories spreading across right-wing media and now into Congress.
  • The Democrats have also been hit hard by structural stupidity, though in a different way. In the Democratic Party, the struggle between the progressive wing and the more moderate factions is open and ongoing, and often the moderates win.
  • The problem is that the left controls the commanding heights of the culture: universities, news organizations, Hollywood, art museums, advertising, much of Silicon Valley, and the teachers’ unions and teaching colleges that shape K–12 education. And in many of those institutions, dissent has been stifled:
  • Liberals in the late 20th century shared a belief that the sociologist Christian Smith called the “liberal progress” narrative, in which America used to be horrifically unjust and repressive, but, thanks to the struggles of activists and heroes, has made (and continues to make) progress toward realizing the noble promise of its founding.
  • It is also the view of the “traditional liberals” in the “Hidden Tribes” study (11 percent of the population), who have strong humanitarian values, are older than average, and are largely the people leading America’s cultural and intellectual institutions.
  • when the newly viralized social-media platforms gave everyone a dart gun, it was younger progressive activists who did the most shooting, and they aimed a disproportionate number of their darts at these older liberal leaders.
  • Confused and fearful, the leaders rarely challenged the activists or their nonliberal narrative in which life at every institution is an eternal battle among identity groups over a zero-sum pie, and the people on top got there by oppressing the people on the bottom. This new narrative is rigidly egalitarian––focused on equality of outcomes, not of rights or opportunities. It is unconcerned with individual rights.
  • The universal charge against people who disagree with this narrative is not “traitor”; it is “racist,” “transphobe,” “Karen,” or some related scarlet letter marking the perpetrator as one who hates or harms a marginalized group.
  • The punishment that feels right for such crimes is not execution; it is public shaming and social death.
  • anyone on Twitter had already seen dozens of examples teaching the basic lesson: Don’t question your own side’s beliefs, policies, or actions. And when traditional liberals go silent, as so many did in the summer of 2020, the progressive activists’ more radical narrative takes over as the governing narrative of an organization.
  • This is why so many epistemic institutions seemed to “go woke” in rapid succession that year and the next, beginning with a wave of controversies and resignations at The New York Times and other newspapers, and continuing on to social-justice pronouncements by groups of doctors and medical associations
  • The problem is structural. Thanks to enhanced-virality social media, dissent is punished within many of our institutions, which means that bad ideas get elevated into official policy.
  • In a 2018 interview, Steve Bannon, the former adviser to Donald Trump, said that the way to deal with the media is “to flood the zone with shit.” He was describing the “firehose of falsehood” tactic pioneered by Russian disinformation programs to keep Americans confused, disoriented, and angry.
  • artificial intelligence is close to enabling the limitless spread of highly believable disinformation. The AI program GPT-3 is already so good that you can give it a topic and a tone and it will spit out as many essays as you like, typically with perfect grammar and a surprising level of coherence.
  • Renée DiResta, the research manager at the Stanford Internet Observatory, explained that spreading falsehoods—whether through text, images, or deep-fake videos—will quickly become inconceivably easy. (She co-wrote the essay with GPT-3.)
  • American factions won’t be the only ones using AI and social media to generate attack content; our adversaries will too.
  • In the 20th century, America’s shared identity as the country leading the fight to make the world safe for democracy was a strong force that helped keep the culture and the polity together.
  • In the 21st century, America’s tech companies have rewired the world and created products that now appear to be corrosive to democracy, obstacles to shared understanding, and destroyers of the modern tower.
  • What changes are needed?
  • I can suggest three categories of reforms––three goals that must be achieved if democracy is to remain viable in the post-Babel era.
  • We must harden democratic institutions so that they can withstand chronic anger and mistrust, reform social media so that it becomes less socially corrosive, and better prepare the next generation for democratic citizenship in this new age.
  • Harden Democratic Institutions
  • we must reform key institutions so that they can continue to function even if levels of anger, misinformation, and violence increase far above those we have today.
  • Reforms should reduce the outsize influence of angry extremists and make legislators more responsive to the average voter in their district.
  • One example of such a reform is to end closed party primaries, replacing them with a single, nonpartisan, open primary from which the top several candidates advance to a general election that also uses ranked-choice voting
  • A second way to harden democratic institutions is to reduce the power of either political party to game the system in its favor, for example by drawing its preferred electoral districts or selecting the officials who will supervise elections
  • These jobs should all be done in a nonpartisan way.
  • Reform Social Media
  • Social media’s empowerment of the far left, the far right, domestic trolls, and foreign agents is creating a system that looks less like democracy and more like rule by the most aggressive.
  • it is within our power to reduce social media’s ability to dissolve trust and foment structural stupidity. Reforms should limit the platforms’ amplification of the aggressive fringes while giving more voice to what More in Common calls “the exhausted majority.”
  • the main problem with social media is not that some people post fake or toxic stuff; it’s that fake and outrage-inducing content can now attain a level of reach and influence that was not possible before
  • Perhaps the biggest single change that would reduce the toxicity of existing platforms would be user verification as a precondition for gaining the algorithmic amplification that social media offers.
  • One of the first orders of business should be compelling the platforms to share their data and their algorithms with academic researchers.
  • Prepare the Next Generation
  • Childhood has become more tightly circumscribed in recent generations––with less opportunity for free, unstructured play; less unsupervised time outside; more time online. Whatever else the effects of these shifts, they have likely impeded the development of abilities needed for effective self-governance for many young adults
  • Depression makes people less likely to want to engage with new people, ideas, and experiences. Anxiety makes new things seem more threatening. As these conditions have risen and as the lessons on nuanced social behavior learned through free play have been delayed, tolerance for diverse viewpoints and the ability to work out disputes have diminished among many young people
  • Students did not just say that they disagreed with visiting speakers; some said that those lectures would be dangerous, emotionally devastating, a form of violence. Because rates of teen depression and anxiety have continued to rise into the 2020s, we should expect these views to continue in the generations to follow, and indeed to become more severe.
  • The most important change we can make to reduce the damaging effects of social media on children is to delay entry until they have passed through puberty.
  • The age should be raised to at least 16, and companies should be held responsible for enforcing it.
  • Let them out to play. Stop starving children of the experiences they most need to become good citizens: free play in mixed-age groups of children with minimal adult supervision
  • while social media has eroded the art of association throughout society, it may be leaving its deepest and most enduring marks on adolescents. A surge in rates of anxiety, depression, and self-harm among American teens began suddenly in the early 2010s. (The same thing happened to Canadian and British teens, at the same time.) The cause is not known, but the timing points to social media as a substantial contributor—the surge began just as the large majority of American teens became daily users of the major platforms.
  • What would it be like to live in Babel in the days after its destruction? We know. It is a time of confusion and loss. But it is also a time to reflect, listen, and build.
  • In recent years, Americans have started hundreds of groups and organizations dedicated to building trust and friendship across the political divide, including BridgeUSA, Braver Angels (on whose board I serve), and many others listed at BridgeAlliance.us. We cannot expect Congress and the tech companies to save us. We must change ourselves and our communities.
  • when we look away from our dysfunctional federal government, disconnect from social media, and talk with our neighbors directly, things seem more hopeful. Most Americans in the More in Common report are members of the “exhausted majority,” which is tired of the fighting and is willing to listen to the other side and compromise. Most Americans now see that social media is having a negative impact on the country, and are becoming more aware of its damaging effects on children.
Javier E

Overstimulation Nation - Slack Tide by Matt Labash - 0 views

  • The local radio jock said to me, “You must think all of this is pretty silly”. He motioned towards the crowd and then to a rollercoaster directly beside us that came screeching at our heads every 95 seconds. But I said, “No. In a century people are going to look back on right now as a sort of magic era, a charmed time of peace and prosperity and freedom from fear, as something that can never happen again, no matter how much they wish it would.”
  • telling the truth always liberates us, even if it scares the hell out of us simultaneously
  • Bad things have always happened in this world. That’s nothing new. And bad things will continue to have their uninterrupted run, right until the end of time.  But the “freedom from fear” Coupland speaks of is largely a function of not wallowing in it all the live-long day, which  our trusty bad-news delivery systems are pretty good about making us do. They give us the illusion of constant movement, even if our only destination is backwards, prompting us to forever double down on fear, and agitation, and mutual suspicion, while steeping us in our own soul sickness.
  • ...7 more annotations...
  • It’s a trap, which maybe seeking out a little more deliberate boredom – also known as stillness - could help us avoid
  • Thomas Merton, whose praises I have sung in these pages before, framed it:
  • being bored might be a good start for healing what ails us.
  • But the purity of our conscience has a natural proportion with the depth of our being and the quality of our acts: and when our activity is habitually disordered, our malformed conscience can think of nothing better to tell us than to multiply the *quantity* of our acts, without perfecting their quality. And so we go from bad to worse, exhaust ourselves, empty our whole life of all content, and fall into despair
  • There are times, then, when in order to keep ourselves in existence at all we simply have to sit back for a while and do nothing. And for a man who has let himself be drawn completely out of himself by his activity, nothing is more difficult than to sit still and rest, doing nothing at all. The very act of resting is the hardest and most courageous act he can perform: and often it is quite beyond his power.
  • Our being is not to be enriched merely by activity or experience as such. Everything depends on the *quality* of our acts and our experiences. A multitude of badly performed actions and of experiences only half lived exhausts and depletes our being. By doing things badly we make ourselves less real. This growing unreality cannot help but make us unhappy and fill us with a sense of guilt
  • even with all the excitement, I couldn’t sustain any. I was bored by the excitement. Or rather, I craved boredom, finding all the excitement dull in a not-this-shitshow-again sort of way. For the last decade or so, we’ve been too over-excited, over-provoked, and overstimulated.
Javier E

What Gamergate should have taught us about the 'alt-right' | Technology | The Guardian - 0 views

  • Gamergate
  • The 2014 hashtag campaign, ostensibly founded to protest about perceived ethical failures in games journalism, clearly thrived on hate – even though many of those who aligned themselves with the movement either denied there was a problem with harassment, or wrote it off as an unfortunate side effect
  • Sure, women, minorities and progressive voices within the industry were suddenly living in fear. Sure, those who spoke out in their defence were quickly silenced through exhausting bursts of online abuse. But that wasn’t why people supported it, right? They were disenfranchised, felt ignored, and wanted to see a systematic change.
  • ...23 more annotations...
  • Is this all sounding rather familiar now? Does it remind you of something?
  • it quickly became clear that the GamerGate movement was a mess – an undefined mission to Make Video Games Great Again via undecided means.
  • After all, the culture war that began in games now has a senior representative in The White House. As a founder member and former executive chair of Breitbart News, Steve Bannon had a hand in creating media monster Milo Yiannopoulos, who built his fame and Twitter following by supporting and cheerleading Gamergate. This hashtag was the canary in the coalmine, and we ignored it.
  • Gamergate was an online movement that effectively began because a man wanted to punish his ex girlfriend. Its most notable achievement was harassing a large number of progressive figures - mostly women – to the point where they felt unsafe or considered leaving the industry
  • The similarities between Gamergate and the far-right online movement, the “alt-right”, are huge, startling and in no way a coincidence
  • These figures gave Gamergate a new sense of direction – generalising the rhetoric: this was now a wider war between “Social Justice Warriors” (SJWs) and everyday, normal, decent people. Games were simply the tip of the iceberg – progressive values, went the argument, were destroying everything
  • In 2016, new wave conservative media outlets like Breitbart have gained trust with their audience by painting traditional news sources as snooty and aloof. In 2014, video game YouTube stars, seeking to appear in touch with online gaming communities, unscrupulously proclaimed that traditional old-media sources were corrupt. Everything we’re seeing now, had its precedent two years ago.
  • With 2014’s Gamergate, Breitbart seized the opportunity to harness the pre-existing ignorance and anger among disaffected young white dudes. With Trump’s movement in 2016, the outlet was effectively running his campaign: Steve Bannon took leave of his role at the company in August 2016 when he was hired as chief executive of Trump’s presidential campaign
  • young men converted via 2014’s Gamergate, are being more widely courted now. By leveraging distrust and resentment towards women, minorities and progressives, many of Gamergate’s most prominent voices – characters like Mike Cernovich, Adam Baldwin, and Milo Yiannopoulos – drew power and influence from its chaos
  • no one in the movement was willing to be associated with the abuse being carried out in its name. Prominent supporters on Twitter, in subreddits and on forums like 8Chan, developed a range of pernicious rhetorical devices and defences to distance themselves from threats to women and minorities in the industry: the targets were lying or exaggerating, they were too precious; a language of dismissal and belittlement was formed against them. Safe spaces, snowflakes, unicorns, cry bullies. Even when abuse was proven, the usual response was that people on their side were being abused too. These techniques, forged in Gamergate, have become the standard toolset of far-right voices online
  • The majority of people who voted for Trump will never take responsibility for his racist, totalitarian policies, but they’ll provide useful cover and legitimacy for those who demand the very worst from the President Elect. Trump himself may have disavowed the “alt-right”, but his rhetoric has led to them feeling legitimised. As with Gamergate, the press risks being manipulated into a position where it has to tread a respectful middle ground that doesn’t really exist.
  • Using 4chan (and then the more sympathetic offshoot 8Chan) to plan their subversions and attacks made Gamergate a terribly sloppy operation, leaving a trail of evidence that made it quite clear the whole thing was purposefully, plainly nasty. But the video game industry didn’t have the spine to react, and allowed the movement to coagulate – forming a mass of spiteful disappointment that Breitbart was more than happy to coddle
  • Historically, that seems to be Breitbart’s trick - strongly represent a single issue in order to earn trust, and then gradually indoctrinate to suit wider purposes. With Gamergate, they purposefully went fishing for anti-feminists. 2016’s batch of fresh converts – the white extremists – came from enticing conspiracy theories about the global neoliberal elite secretly controlling the world.
  • The greatest strength of Gamergate, though, was that it actually appeared to represent many left-leaning ideals: stamping out corruption in the press, pushing for better ethical practices, battling for openness.
  • There are similarities here with many who support Trump because of his promises to put an end to broken neo-liberalism, to “drain the swamp” of establishment corruption. Many left-leaning supporters of Gamergate sought to intellectualise their alignment with the hashtag, adopting familiar and acceptable labels of dissent – identifying as libertarian, egalitarian, humanist.
  • At best they unknowingly facilitated abuse, defending their own freedom of expression while those who actually needed support were threatened and attacked.
  • Genuine discussions over criticism, identity and censorship were paralysed and waylaid by Twitter voices obsessed with rhetorical fallacies and pedantic debating practices. While the core of these movements make people’s lives hell, the outer shell – knowingly or otherwise – protect abusers by insisting that the real problem is that you don’t want to talk, or won’t provide the ever-shifting evidence they politely require.
  • In 2017, the tactics used to discredit progressive game critics and developers will be used to discredit Trump and Bannon’s critics. There will be gaslighting, there will be attempts to make victims look as though they are losing their grip on reality, to the point that they gradually even start to believe it. The “post-truth” reality is not simply an accident – it is a concerted assault on the rational psyche.
  • The strangest aspect of Gamergate is that it consistently didn’t make any sense: people chose to align with it, and yet refused responsibility. It was constantly demanded that we debate the issues, but explanations and facts were treated with scorn. Attempts to find common ground saw the specifics of the demands being shifted: we want you to listen to us; we want you to change your ways; we want you to close your publication down. This movement that ostensibly wanted to protect free speech from cry bully SJWs simultaneously did what it could to endanger sites it disagreed with, encouraging advertisers to abandon support for media outlets that published stories critical of the hashtag. The petulance of that movement is disturbingly echoed in Trump’s own Twitter feed.
  • Looking back, Gamergate really only made sense in one way: as an exemplar of what Umberto Eco called “eternal fascism”, a form of extremism he believed could flourish at any point in time, in any place – a fascism that would extol traditional values, rally against diversity and cultural critics, believe in the value of action above thought and encourage a distrust of intellectuals or experts – a fascism built on frustration and machismo. The requirement of this formless fascism would – above all else – be to remain in an endless state of conflict, a fight against a foe who must always be portrayed as impossibly strong and laughably weak
  • 2016 has presented us with a world in which our reality is being wilfully manipulated. Fake news, divisive algorithms, misleading social media campaigns.
  • The same voices moved into other geek communities, especially comics, where Marvel and DC were criticised for progressive storylines and decisions. They moved into science fiction with the controversy over the Hugo awards. They moved into cinema with the revolting kickback against the all-female Ghostbusters reboot.
  • Perhaps the true lesson of Gamergate was that the media is culturally unequipped to deal with the forces actively driving these online movements. The situation was horrifying enough two years ago, it is many times more dangerous now.
sissij

Depression is as bad for your heart as high cholesterol | Fox News - 0 views

  • When you think of heart attacks, you might assume the most common causes are smoking, high cholesterol, or obesity. Mental health issues probably don't spring to mind.
  • Depression—which for this study, was determined by a checklist of mood symptoms, including anxiety and fatigue
  • “depressed mood and exhaustion holds a solid middle position within the concert of major cardiovascular risk factors.”
  •  
    I think it is really interesting that even mental health issues have a positive relationship with cardiovascular disease. Our mind can affect how our body works. As we learn in the sense and perception unit, we know that the brain will give us a shot of certain chemicals that make us feel good when we make certain decisions. I think how we feel can reflect how our body feels. We all know that we feel pain because it is a warning that the injured part of our body sends to our brain. So I think the feeling of depression can be a warning sent by some part of our body. The scientific method mentioned in this article is population research, which is a typical scientific method in biology. --Sissi (1/29/2017)
sissij

Is Empathy Overrated? | Big Think - 0 views

  • Empathy seems to be a quality you can never overdo. It’s like a megavitamin of emotionally relating: the more you display, the better a human you are.
  • In his last book, Just Babies, he argued humans are born moral, no religion required.
  • Telling someone empathy is overrated is akin to stating puppies are useless and ugly.
  • ...6 more annotations...
  • Empathy is the act of coming to experience the world as you think someone else does … If your suffering makes me suffer, if I feel what you feel, that’s empathy in the sense that I’m interested in here.
  • For example, donating to foreign charities ups our dopamine intake—we feel better because we’re making a difference (which, of course, can make it more about how we feel than who we’re helping).
  • Yet it’s not in our biological inheritance to offer unchecked empathy. Bloom points to our tribal nature as evidence. We’re going to care more for those closest to us, such as family and friends, than Cambodian orphans.
  • Anyone who thinks that it’s important for a therapist to feel depressed or anxious while dealing with depressed or anxious people is missing the point of therapy.
  • Bloom then discusses the difference between what Binghamton professor and Asian Studies scholar Charles Goodman describes as “sentimental compassion” and “great compassion.” The first is similar to empathy, which leads to imbalances in relationships and one’s own psychological state. Simply put, it’s exhausting.
  • Empathy is going to be a buzzword for some time to come. It feeds into our social nature, which Bloom sees nothing wrong with.
  •  
    I found this article very interesting as it talks about how empathy as an emotion is sometimes bad for us. I really like the point where the author mentions that empathy is not in our biological inheritance because our tribal nature is to care more for those closest to us. It is very interesting to think about how our modern society shapes our emotions and behavior, and how empathy is gradually becoming our nature. --Sissi (2/22/2017)
Duncan H

Living in the Material World - NYTimes.com - 0 views

  • on a visit to the Academy of Sciences in Almaty some years ago I was presented with a souvenir meant to assure me that Central Asia was indeed still producing philosophy worthy of note. It was a collectively authored book entitled “The Development of Materialist Dialectics in Kazakhstan,” and I still display it proudly on my shelf. Its rough binding and paper bespeak economic hardship. It is packed with the traces of ideas, yet everything about the book announces its materiality. I had arrived in the Kazakh capital in 1994, just in time to encounter the last of a dying breed: the philosopher as party functionary (they are all by now retired, dead or defenestrated, or have simply given up on what they learned in school). The book, written by committee, was a collection of official talking points, and what passed for conversation there was something much closer to recitation.
  • The philosophical meaning of materialism may in the final analysis be traced back to a religious view of the world. On this view, to focus on the material side of existence is to turn away from the eternal and divine. Here, the category of the material is assimilated to that of sin or evil.
  • Yet in fact this feature of Marxist philosophical classification is one that, with some variations, continues to be shared by all philosophers, even in the West, even today
  • ...9 more annotations...
  • materialism is not the greedy desire for material goods, but rather the belief that the fundamental reality of the world is material;
  • idealism is not the aspiration toward lofty and laudable goals, but rather the belief that the fundamental reality of the world is mental or idea-like. English-speaking philosophers today tend to speak of “physicalism” or “naturalism” rather than materialism (perhaps to avoid confusion with the Wall Street sense of the term). At the same time, Anglo-American historians of philosophy continue to find the distinction between materialism and idealism a useful one in our attempts at categorizing past schools of thought. Democritus and La Mettrie were materialists; Hobbes was pretty close. Berkeley and Kant were idealists; Leibniz may have been.
  • And it was these paradoxes that led the Irish philosopher to conclude that talk of matter was but a case of multiplying entities beyond necessity. For Berkeley, all we can know are ideas, and for this reason it made sense to suppose that the world itself consists in ideas.
  • Central to this performance was the concept of  “materialism.” The entire history of philosophy, in fact, was portrayed in Soviet historiography as a series of matches between the materialist home-team and its “idealist” opponents, beginning roughly with Democritus (good) and Plato (bad), and culminating in the opposition between official party philosophy and logical positivism, the latter of which was portrayed as a shrouded variety of idealism. Thus from the “Short Philosophical Dictionary,” published in Moscow in 1951, we learn that the school of logical empiricism represented by Rudolf Carnap, Otto Neurath and others, “is a form of subjective idealism, characteristic of degenerating bourgeois philosophy in the epoch of the decline of capitalism.” Now the Soviet usage of this pair of terms appears to fly in the face of our ordinary, non-philosophical understanding of them (that, for example, Wall Street values are “materialist,” while the Occupy movement is “idealist”). One might have thought that the communists should be flinging the “materialist” label at their capitalist enemies, rather than claiming it for themselves. One might also have thought that the Bolshevik Revolution and the subsequent failed project of building a workers’ utopia was nothing if not idealistic.
  • one great problem with the concept of materialism is that it says very little in itself. What is required in addition is an elaboration of what a given thinker takes matter, or ideas, to be. It may not be just the Marxist aftertaste, but also the fact that the old common-sense idea about matter as brute, given stuff has turned out to have so little to do with the way the physical world actually is, that has led Anglo-American philosophers to prefer to associate themselves with the “physical” or the “natural” rather than with the material.  Reality, they want to say, is just what is natural, while everything else is in turn “supernatural” (this distinction has its clarity going for it, but it also seems uncomfortably close to tautology). Not every philosopher has a solid grasp of subatomic physics, but most know enough to grasp that, even if reality is eventually exhaustively accounted for through an enumeration of the kinds of particles and a few basic forces, this reality will still look nothing like what your average person-in-the-street takes reality to be.
  • The 18th-century idealist philosopher George Berkeley strongly believed that matter was only a fiction contrived by philosophers in the first place, for which the real people had no need. For Berkeley, there was never anything common-sensical about matter. We did not need to arrive at the era of atom-splitting and wave-particle duality, then, in order for the paradoxes inherent in matter to make themselves known (is it infinitely divisible or isn’t it?
  • Soviet and Western Marxists alike, by stark contrast, and before them the French “vulgar” (i.e., non-dialectical) materialists of the 18th century, saw and see the material world as the base and cause of all mental activity, as both bringing ideas into existence, and also determining the form and character of a society’s ideas in accordance with the state of its technology, its methods of resource extraction and its organization of labor. So here to focus on the material is not to become distracted from the true source of being, but rather to zero right in on it.
  • Consider money. Though it might sometimes be represented by bank notes or coins, money is an immaterial thing par excellence, and to seek to acquire it is to move on the plane of ideas. Of course, money can also be converted into material things, yet it seems simplistic to suppose that we want money only in order to convert it into the material things we really want, since even these material things aren’t just material either: they are symbolically dense artifacts, and they convey to others certain ideas about their owners. This, principally, is why their owners want them, which is to say that materialists (in the everyday sense) are trading in ideas just as much as anyone else.
  • In the end no one really cares about stuff itself. Material acquisitions — even, or perhaps especially, material acquisitions of things like Rolls Royces and Rolexes — are maneuvers within a universe of materially instantiated ideas. This is human reality, and it is within this reality that mystics, scientists, and philosophers alike are constrained to pursue their various ends, no matter what they might take the ultimate nature of the external world to be.
  •  
    A very interesting article on the contrast between materialism and idealism.
Duncan H

Other People's Suffering - NYTimes.com - 0 views

  • members of the upper class are more likely than others to behave unethically, to lie during negotiations, to drive illegally and to cheat when competing for a prize. “Greed is a robust determinant of unethical behavior,” the authors conclude. “Relative to lower-class individuals, individuals from upper-class backgrounds behaved more unethically in both naturalistic and laboratory settings.”
  • Our findings suggest that when a person is suffering, upper-class individuals perceive these signals less well on average, consistent with other findings documenting reduced empathic accuracy in upper-class individuals (Kraus et al., 2010). Taken together, these findings suggest that upper-class individuals may underestimate the distress and suffering in their social environments.
  • each participant was assigned to listen, face to face, from two feet away, to someone else describing real personal experiences of suffering and distress. The listeners’ responses were measured two ways, first by self-reported levels of compassion and second by electrocardiogram readings to determine the intensity of their emotional response. The participants all took a test known as the “sense of power” scale, ranking themselves on such personal strengths and weaknesses as ‘‘I can get people to listen to what I say’’ and ‘‘I can get others to do what I want,” as well as ‘‘My wishes do not carry much weight’’ and ‘‘Even if I voice them, my views have little sway,’’ which are reverse scored. The findings were noteworthy, to say the least. For “low-power” listeners, compassion levels shot up as the person describing suffering became more distressed. Exactly the opposite happened for “high-power” listeners: their compassion dropped as distress rose.
  • ...7 more annotations...
  • Who fits the stereotype of the rich and powerful described in this research? Mitt Romney. Empathy: “I’m not concerned about the very poor.” Compassion: “I like being able to fire people who provide services to me.” Sympathy for the disadvantaged: My wife “drives a couple of Cadillacs.” Willingness to lie in negotiations: “I was a severely conservative Republican governor.”
  • 48 percent described the Democratic Party as “weak,” compared to 28 percent who described the Republican Party that way. Conversely, 50 percent said the Republican Party is “cold hearted,” compared to 30 percent who said that was true of the Democrats.
  • This is the war that is raging throughout America. It is between conservatives, who emphasize personal responsibility and achievement, against liberals, who say the government must take from the wealthy and give to the poor. So it will be interesting this week to see if President Obama can rally the country to support his vision of a strong social compact. He has compassion on his side. Few Americans want to see their fellow citizens suffer. But the president does have that fiscal responsibility issue haunting him because the country remains in dire trouble.
  • For power holders, the world is viewed through an instrumental lens, and approach is directed toward those individuals who populate the useful parts of the landscape. Our results suggest that power not only channels its possessor’s energy toward goal completion but also targets and attempts to harness the energy of useful others. Thus, power appears to be a great facilitator of goal pursuit through a combination of intrapersonal and interpersonal processes. The nature of the power holder’s goals and interpersonal relationships ultimately determine how power is harnessed and what is accomplished in the end.
  • Republicans recognize the political usefulness of objectification, capitalizing on “compassion fatigue,” or the exhaustion of empathy, among large swathes of the electorate who are already stressed by the economic collapse of 2008, high levels of unemployment, an epidemic of foreclosures, stagnant wages and a hyper-competitive business arena.
  • Republican debates provided further evidence of compassion fatigue when audiences cheered the record-setting use of the death penalty in Texas and applauded the prospect of a gravely ill pauper who, unable to pay medical fees, was allowed to die. Even Rick Santorum, who has been described by the National Review as holding “unstinting devotion to human dignity” and as fluent in “the struggles of the working class,” wants to slash aid to the poor. At a Feb. 21 gathering of 500 voters in Maricopa County, Ariz., Santorum brought the audience to its feet as he declared: We need to take everything from food stamps to Medicaid to the housing programs to education and training programs, we need to cut them, cap them, freeze them, send them to the states, say that there has to be a time limit and a work requirement, and be able to give them the flexibility to do those programs here at the state level.
  • President Obama has a substantial advantage this year because he does not have a primary challenger, which frees him from the need to emphasize his advocacy for the disempowered — increasing benefits or raising wages for the poor. This allows him to pick and choose the issues he wants to address. At the same time, compassion fatigue may make it easier for the Republican nominee to overcome the liabilities stemming from his own primary rhetoric, to reach beyond the core of the party to white centrist voters less openly drawn to hard-edged conservatism. With their capacity for empathy frayed by a pervasive sense of diminishing opportunity and encroaching shortfall, will these voters once again become dependable Republicans in 2012?
  •  
    Do you agree with Edsall? I think he is definitely taking an anti-Republican stance, but the findings are interesting.
Javier E

The American Scholar: The Decline of the English Department - William M. Chace - 1 views

  • The number of young men and women majoring in English has dropped dramatically; the same is true of philosophy, foreign languages, art history, and kindred fields, including history. As someone who has taught in four university English departments over the last 40 years, I am dismayed by this shift, as are my colleagues here and there across the land. And because it is probably irreversible, it is important to attempt to sort out the reasons—the many reasons—for what has happened.
  • English: from 7.6 percent of the majors to 3.9 percent
  • In one generation, then, the numbers of those majoring in the humanities dropped from a total of 30 percent to a total of less than 16 percent; during that same generation, business majors climbed from 14 percent to 22 percent.
  • ...23 more annotations...
  • History: from 18.5 percent to 10.7 percent
  • But the deeper explanation resides not in something that has happened to it, but in what it has done to itself. English has become less and less coherent as a discipline and, worse, has come near exhaustion as a scholarly pursuit.
  • The twin focus, then, was on the philological nature of the enterprise and the canon of great works to be studied in their historical evolution.
  • Studying English taught us how to write and think better, and to make articulate many of the inchoate impulses and confusions of our post-adolescent minds. We began to see, as we had not before, how such books could shape and refine our thinking. We began to understand why generations of people coming before us had kept them in libraries and bookstores and in classes such as ours. There was, we got to know, a tradition, a historical culture, that had been assembled around these books. Shakespeare had indeed made a difference—to people before us, now to us, and forever to the language of English-speaking people.
  • today there are stunning changes in the student population: there are more and more gifted and enterprising students coming from immigrant backgrounds, students with only slender connections to Western culture and to the assumption that the “great books” of England and the United States should enjoy a fixed centrality in the world. What was once the heart of the matter now seems provincial. Why throw yourself into a study of something not emblematic of the world but representative of a special national interest? As the campus reflects the cultural, racial, and religious complexities of the world around it, reading British and American literature looks more and more marginal. From a global perspective, the books look smaller.
  • With the cost of a college degree surging upward during the last quarter century—tuition itself increasing far beyond any measure of inflation—and with consequent growth in loan debt after graduation, parents have become anxious about the relative earning power of a humanities degree. Their college-age children doubtless share such anxiety. When college costs were lower, anxiety could be kept at bay. (Berkeley in the early ’60s cost me about $100 a year, about $700 in today’s dollars.)
  • Economists, chemists, biologists, psychologists, computer scientists, and almost everyone in the medical sciences win sponsored research, grants, and federal dollars. By and large, humanists don’t, and so they find themselves as direct employees of the institution, consuming money in salaries, pensions, and operating needs—not external money but institutional money.
  • These, then, are some of the external causes of the decline of English: the rise of public education; the relative youth and instability (despite its apparent mature solidity) of English as a discipline; the impact of money; and the pressures upon departments within the modern university to attract financial resources rather than simply use them up.
  • several of my colleagues around the country have called for a return to the aesthetic wellsprings of literature, the rock-solid fact, often neglected, that it can indeed amuse, delight, and educate. They urge the teaching of English, or French, or Russian literature, and the like, in terms of the intrinsic value of the works themselves, in all their range and multiplicity, as well-crafted and appealing artifacts of human wisdom. Second, we should redefine our own standards for granting tenure, placing more emphasis on the classroom and less on published research, and we should prepare to contest our decisions with administrators whose science-based model is not an appropriate means of evaluation.
  • “It may be that what has happened to the profession is not the consequence of social or philosophical changes, but simply the consequence of a tank now empty.” His homely metaphor pointed to the absence of genuinely new frontiers of knowledge and understanding for English professors to explore.
  • In this country and in England, the study of English literature began in the latter part of the 19th century as an exercise in the scientific pursuit of philological research, and those who taught it subscribed to the notion that literature was best understood as a product of language.
  • no one has come forward in years to assert that the study of English (or comparative literature or similar undertakings in other languages) is coherent, does have self-limiting boundaries, and can be described as this but not that.
  • to teach English today is to do, intellectually, what one pleases. No sense of duty remains toward works of English or American literature; amateur sociology or anthropology or philosophy or comic books or studies of trauma among soldiers or survivors of the Holocaust will do. You need not even believe that works of literature have intelligible meaning; you can announce that they bear no relationship at all to the world beyond the text.
  • With everything on the table, and with foundational principles abandoned, everyone is free, in the classroom or in prose, to exercise intellectual laissez-faire in the largest possible way—I won’t interfere with what you do and am happy to see that you will return the favor
  • Consider the English department at Harvard University. It has now agreed to remove its survey of English literature for undergraduates, replacing it and much else with four new “affinity groups”
  • there would be no one book, or family of books, that every English major at Harvard would have read by the time he or she graduates. The direction to which Harvard would lead its students in this “clean slate” or “trickle down” experiment is to suspend literary history, thrusting into the hands of undergraduates the job of cobbling together intellectual coherence for themselves
  • Those who once strove to give order to the curriculum will have learned, from Harvard, that terms like core knowledge and foundational experience only trigger acrimony, turf protection, and faculty mutinies. No one has the stomach anymore to refight the Western culture wars. Let the students find their own way to knowledge.
  • In English, the average number of years spent earning a doctoral degree is almost 11. After passing that milestone, only half of new Ph.D.’s find teaching jobs, the number of new positions having declined over the last year by more than 20 percent; many of those jobs are part-time or come with no possibility of tenure. News like that, moving through student networks, can be matched against, at least until recently, the reputed earning power of recent graduates of business schools, law schools, and medical schools. The comparison is akin to what young people growing up in Rust Belt cities are forced to see: the work isn’t here anymore; our technology is obsolete.
  • unlike other members of the university community, they might well have been plying their trade without proper credentials: “Whereas economists or physicists, geologists or climatologists, physicians or lawyers must master a body of knowledge before they can even think of being licensed to practice,” she said, “we literary scholars, it is tacitly assumed, have no definable expertise.”
  • English departments need not refight the Western culture wars. But they need to fight their own book wars. They must agree on which texts to teach and argue out the choices and the principles of making them if they are to claim the respect due a department of study.
  • They can teach their students to write well, to use rhetoric. They should place their courses in composition and rhetoric at the forefront of their activities. They should announce that the teaching of composition is a skill their instructors have mastered and that students majoring in English will be certified, upon graduation, as possessing rigorously tested competence in prose expression.
  • The study of literature will then take on the profile now held, with moderate dignity, by the study of the classics, Greek and Latin.
  • But we can, we must, do better. At stake are the books themselves and what they can mean to the young. Yes, it is just a literary tradition. That’s all. But without such traditions, civil societies have no compass to guide them.
Javier E

Valley of the Blahs: How Justin Bieber's Troubles Exposed Twitter's Achilles' Heel - NY... - 0 views

  • I think the number of followers you have is often irrelevant. What does matter, however, is how many people notice you, either through retweets, favorites or the holy grail, a retweet by someone extremely well known, like a celebrity. That validation that your contribution is important, interesting or worthy is enough social proof to encourage repetition. Many times, that results in one-upmanship, straining to be the loudest or the most retweeted and referred to as the person who captured the splashiest event of the day in the pithiest way.
  • It feels as if we’re all trying to be a cheeky guest on a late-night show, a reality show contestant or a toddler with a tiara on Twitter — delivering the performance of a lifetime, via a hot, rapid-fire string of commentary, GIFs or responses that help us stand out from the crowd. We’re sold on the idea that if we’re good enough, it could be our ticket to success
  • more often than not, it translates to standing on a collective soapbox, elbowing each other for room, in the hopes of being credited with delivering the cleverest one-liner or reaction
  • ...3 more annotations...
  • Increasingly, I’ve found myself retreating to smaller groups, where the stakes are lower and people are more honest and less determined to prove a point, freer to joke and experiment, more trusting in one another and open to real conversation.
  • Twitter is starting to feel calcified, slowed down by the weight of its own users, cumbersome, less exciting than exhausting. It may be why less public forms of communication — messaging applications like Snapchat, GroupMe, Instagram Direct and even old-fashioned e-mail threads and Google groups — are playing a bigger and bigger role in the most meaningful interactions during my day online.
  • even if the company were to snap to attention and give its community something other than Twitter lists and block or unfollow buttons to help users tailor their feeds, it most likely wouldn’t be enough. We, the users, the producers, the consumers — all our manic energy, yearning to be noticed, recognized for an important contribution to the conversation — are the problem. It is fueled by our own increasing need for attention, validation, through likes, favorites, responses, interactions. It is a feedback loop that can’t be closed, at least not for now.
Javier E

"Breaking Bad" By Niccolo Machiavelli « The Dish - 0 views

  • If a man is truly a man through force and fraud and nerve, then Walter becomes the man he always wanted to be. He trounces every foe; he gains a huge fortune; he dies a natural death. Compared with being a high school chemistry teacher? Niccolo would scoff at the comparison. “I did it for me.”
  • Walt is consumed all along by justified resentment of the success others stole from him, and by a rage that his superior mind was out-foxed by unscrupulous colleagues. He therefore lived and died his final years for human honor – for what Machiavelli calls virtu, a caustic, brutal inversion of Christian virtue
  • his skills were eventually proven beyond any measure in ways that would never have happened if he had never broken bad. And breaking bad cannot mean putting a limit on what you are capable of doing. What Machiavelli insisted upon was that a successful power-broker know how to be “altogether bad.”
  • ...8 more annotations...
  • the cost-benefit analysis of “breaking bad” when the alternative is imminently “dying alone” is rigged in favor of the very short term, i.e. zero-sum evil. If Walt had had to weigh a long, unpredictable lifetime of unending fear and constant danger for his family and himself, he would have stopped cooking meth.
  • was he happy? Yes, but in a way that never really reflects any inner peace. He is happy in a way that all millionaires and tyrants are happy.
  • Breaking Bad should be taught alongside Machiavelli – as a riveting companion piece.
  • It should be taught because it really does convey the egoist appeal of evil, of acting ruthlessly in the world
  • The benefits only work if your life is nasty, brutish and short. The costs are seen in the exhausted, broken eyes of Skyler, the betrayal of an only painfully faithful son, the murder of a brother-in-law, the grisly massacre of dozens, the endless nervous need to be on the alert, to run and hide and lie and lie and lie again, until life itself becomes merely a means to achieve temporary security.
  • Machiavelli differs from later realists like Hobbes—and more contemporary “neorealists” like the late Kenneth Waltz—in recognizing that human agency matters as much as the structural fact of international anarchy in determining both foreign policy behavior and ultimate outcomes in world politics.
  • a leader’s choices can have a pivotal impact on politics, both domestic and international.
  • Though fortune be capricious and history contingent, the able leader may shape his fate and that of his state through the exercise of virtu. This is not to be mistaken for “virtue”, as defined by Christian moral teaching (implying integrity, charity, humility, and the like). Rather, it denotes the human qualities prized in classical antiquity, including knowledge, courage, cunning, pride, and strength.
Javier E

How to Make Your Own Luck | Brain Pickings - 0 views

  • editor Jocelyn Glei and her team at Behance’s 99U pull together another package of practical wisdom from 21 celebrated creative entrepreneurs. Despite the somewhat self-helpy, SEO-skewing title, this compendium of advice is anything but contrived. Rather, it’s a no-nonsense, experience-tested, life-approved cookbook for creative intelligence, exploring everything from harnessing the power of habit to cultivating meaningful relationships that enrich your work to overcoming the fear of failure.
  • If the twentieth-century career was a ladder that we climbed from one predictable rung to the next, the twenty-first-century career is more like a broad rock face that we are all free-climbing. There’s no defined route, and we must use our own ingenuity, training, and strength to rise to the top. We must make our own luck.
  • Lucky people take advantage of chance occurrences that come their way. Instead of going through life on cruise control, they pay attention to what’s happening around them and, therefore, are able to extract greater value from each situation… Lucky people are also open to novel opportunities and willing to try things outside of their usual experiences. They’re more inclined to pick up a book on an unfamiliar subject, to travel to less familiar destinations, and to interact with people who are different than themselves.
  • ...14 more annotations...
  • the primary benefit of a diary as a purely pragmatic record of your workday productivity and progress — while most dedicated diarists would counter that the core benefits are spiritual and psychoemotional — it does offer some valuable insight into the psychology of how journaling elevates our experience of everyday life:
  • We can’t, however, simply will ourselves into better habits. Since willpower is a limited resource, whenever we’ve overexerted our self-discipline in one domain, a concept known as “ego depletion” kicks in and renders us mindless automata in another
  • the key to changing a habit is to invest heavily in the early stages of habit-formation so that the behavior becomes automated and we later default into it rather than exhausting our willpower wrestling with it. Young also cautions that it’s a self-defeating strategy to try changing several habits at once. Rather, he advises, spend one month on each habit alone before moving on to the next
  • a diary boosts your creativity
  • This is one of the most important reasons to keep a diary: it can make you more aware of your own progress, thus becoming a wellspring of joy in your workday.
  • The second reason is focalism. When we contemplate failure from afar, according to Gilbert and Wilson, we tend to overemphasize the focal event (i.e., failure) and overlook all the other episodic details of daily life that help us move on and feel better. The threat of failure is so vivid that it consumes our attention
  • the authors point to a pattern that reveals the single most important motivator: palpable progress on meaningful work: On the days when these professionals saw themselves moving forward on something they cared about — even if the progress was a seemingly incremental “small win” — they were more likely to be happy and deeply engaged in their work. And, being happier and more deeply engaged, they were more likely to come up with new ideas and solve problems creatively.
  • Although the act of reflecting and writing, in itself, can be beneficial, you’ll multiply the power of your diary if you review it regularly — if you listen to what your life has been telling you. Periodically, maybe once a month, set aside time to get comfortable and read back through your entries. And, on New Year’s Day, make an annual ritual of reading through the previous year.
  • This, they suggest, can yield profound insights into the inner workings of your own mind — especially if you look for specific clues and patterns, trying to identify the richest sources of meaning in your work and the types of projects that truly make your heart sing. Once you understand what motivates you most powerfully, you’ll be able to prioritize this type of work in going forward. Just as important, however, is cultivating a gratitude practice and acknowledging your own accomplishments in the diary:
  • Fields argues that if we move along the Uncertainty Curve either too fast or too slowly, we risk robbing the project of its creative potential and ending up in mediocrity. Instead, becoming mindful of the psychology of that process allows us to pace ourselves better and master that vital osmosis between freedom and constraint.
  • Schwalbe reminds us of the “impact bias” — our tendency to greatly overestimate the intensity and extent of our emotional reactions, which causes us to expect failures to be more painful than they actually are and thus to fear them more than we should.
  • When we think about taking a risk, we rarely consider how good we will be at reframing a disappointing outcome. In short, we underestimate our resilience.
  • what you do every day is best seen as an iceberg, with a small fraction of conscious decision sitting atop a much larger foundation of habits and behaviors.
  • don’t let yourself forget that the good life, the meaningful life, the truly fulfilling life, is the life of presence, not of productivity.
Javier E

Scholarship and Politics - The Case of Noam Chomsky - NYTimes.com - 0 views

  • (1) The academy is a world of its own, complete with rules, protocols, systems of evaluation, recognized achievements, agreed-on goals, a roster of heroes and a list of tasks yet to be done.
  • (2) Academic work proceeds within the confines of that world, within, that is, a professional, not a public, space, although its performance may be, and often is, public.
  • (3) academic work is only tangentially, not essentially, political; politics may attend the formation of academic units and the selection of academic personnel, but political concerns and pressures have no place in the unfolding of academic argument, except as objects of its distinctive forms of attention
  • ...16 more annotations...
  • This is as good as it gets. There is “no evolution in our capacity for language.”
  • The answer given in the first lecture — “What is Language?” — is that we are creatures with language, and that language as a uniquely human biological capacity appeared suddenly and quite late in the evolutionary story, perhaps 75,000 years ago.
  • Chomsky gave three lectures under the general title “What Kind of Creatures are We?”
  • Language, then, does not arise from the social/cultural environment, although the environment provides the stuff or input it works on. That input is “impoverished”; it can’t account for the creativity of language performance, which has its source not in the empirical world, but in an innate ability that is more powerful than the stimuli it utilizes and plays with. It follows that if you want to understand language, you shouldn’t look to linguistic behavior but to the internal mechanism — the Universal Grammar — of which particular linguistic behaviors are a non-exhaustive expression. (The capacity exceeds the empirical resources it might deploy.)
  • In his second lecture (“What Can We Understand?”), Chomsky took up the question of what humans are capable of understanding, and his answer, generally, was that we can understand what we can understand, and that means that we can’t understand what is beyond our innate mental capacities.
  • This does not mean, he said, that what we can’t understand is not real: “What is mysterious to me is not an argument that it does not exist.” It’s just that while language is powerful and creative, its power and creativity have limits; and since language is thought rather than an addition to or clothing of thought, the limits of language are the limits of what we can fruitfully think about
  • (4) The academic views of a professor are independent of his or her real-world political views; academic disputes don’t track partisan disputes or vice versa; you can’t reason from an academic’s disciplinary views to the positions he or she would take in the public sphere; they are independent variables.
  • These assertions are offered as a counter to what Chomsky sees as the over-optimistic Enlightenment belief — common to many empiricist philosophies — that ours is a “limitless explanatory power” and that “we can do anything.”
  • In the third lecture (“What is the Common Good?”) Chomsky turned from the philosophy of mind and language to political philosophy and the question of what constitutes a truly democratic society
  • He likened dogmatic intellectual structures that interfere with free inquiry to coercive political structures that stifle the individual’s creative independence and fail to encourage humanity’s “richest diversity.”
  • He asserted that any institution marked by domination and hierarchy must rise to the challenge of justifying itself, and if it cannot meet the challenge, it should be dismantled.
  • He contrasted two accounts of democracy: one — associated by him with James Madison — distrusts the “unwashed” populace and puts its faith in representative government where those doing the representing (and the voting and the distributing of goods) constitute a moneyed and propertied elite
  • the other — associated by him with Adam Smith (in one of his moods), J. S. Mill, the 1960s and a tradition of anarchist writing — seeks to expand the franchise and multiply choices in the realms of thought, politics and economics. The impulse of this second, libertarian, strain of democracy, is “to free society from economic or theological guardianship,” and by “theological” Chomsky meant not formal religion as such but any assumed and frozen ideology that blocked inquiry and limited participation. There can’t, in short, be “too much democracy.”
  • It was thought of the highest order performed by a thinker, now 85 years old, who by and large eschewed rhetorical flourishes (he has called his own speaking style “boring” and says he likes it that way) and just did it, where “it” was the patient exploration of deep issues that had been explored before him by a succession of predecessors, fully acknowledged, in a conversation that is forever being continued and forever being replenished.
  • Yes, I said to myself, this is what we — those of us who bought a ticket on this particular train — do; we think about problems and puzzles and try to advance the understanding of them; and we do that kind of thinking because its pleasures are, in a strong sense, athletic and provide for us, at least on occasion, the experience of fully realizing whatever capabilities we might have. And we do it in order to have that experience, and to share it with colleagues and students of like mind, and not to make a moral or political point.
  • The term “master class” is a bit overused, but I feel no hesitation in using it here. It was a master class taught by a master, and if someone were to ask me what exactly is it that academics do, I would point to these lectures and say, simply, here it is, the thing itself.
Javier E

Untier Of Knots « The Dish - 0 views

  • Benedict XVI and John Paul II focused on restoring dogmatic certainty as the counterpart to papal authority. Francis is arguing that both, if taken too far, can be sirens leading us away from God, not ensuring our orthodoxy but sealing us off in calcified positions and rituals that can come to mean nothing outside themselves
  • In this quest to seek and find God in all things there is still an area of uncertainty. There must be. If a person says that he met God with total certainty and is not touched by a margin of uncertainty, then this is not good. For me, this is an important key. If one has the answers to all the questions – that is the proof that God is not with him. It means that he is a false prophet using religion for himself. The great leaders of the people of God, like Moses, have always left room for doubt. You must leave room for the Lord, not for our certainties; we must be humble.
  • If the Christian is a restorationist, a legalist, if he wants everything clear and safe, then he will find nothing. Tradition and memory of the past must help us to have the courage to open up new areas to God.
  • In the end, you realize your only real option – against almost every fiber in your irate being – is to take each knot in turn, patiently and gently undo it, loosen a little, see what happens, and move on to the next. You will never know exactly when all the knots will resolve themselves – it can happen quite quickly after a while or seemingly never. But you do know that patience, and concern with the here and now, is the only way to “solve” the “problem.” You don’t look forward with a plan; you look down with a practice.
  • we can say what God is not, we can speak of his attributes, but we cannot say what He is. That apophatic dimension, which reveals how I speak about God, is critical to our theology
  • I would also classify as arrogant those theologies that not only attempted to define with certainty and exactness God’s attributes, but also had the pretense of saying who He was.
  • It is only in living that we achieve hints and guesses – and only hints and guesses – of what the Divine truly is. And because the Divine is found and lost by humans in time and history, there is no reachable truth for humans outside that time and history.
  • We are part of an unfolding drama in which the Christian, far from clinging to some distant, pristine Truth he cannot fully understand, will seek to understand and discern the “signs of the times” as one clue as to how to live now, in the footsteps of Jesus. Or in the words of T.S. Eliot, There is only the fight to recover what has been lost And found and lost again and again: and now, under conditions That seem unpropitious. But perhaps neither gain nor loss. For us, there is only the trying. The rest is not our business.
  • Ratzinger’s Augustinian notion of divine revelation: it is always a radical gift; it must always be accepted without question; it comes from above to those utterly unworthy below; and we are too flawed, too sinful, too human to question it in even the slightest respect. And if we ever compromise an iota on that absolute, authentic, top-down truth, then we can know nothing as true. We are, in fact, lost for ever.
  • A Christian life is about patience, about the present and about trust that God is there for us. It does not seek certainty or finality to life’s endless ordeals and puzzles. It seeks through prayer and action in the world to listen to God’s plan and follow its always-unfolding intimations. It requires waiting. It requires diligence
  • We may never know why exactly Benedict resigned as he did. But I suspect mere exhaustion of the body and mind was not the whole of it. He had to see, because his remains such a first-rate mind, that his project had failed, that the levers he continued to pull – more and more insistent doctrinal orthodoxy, more political conflict with almost every aspect of the modern world, more fastidious control of liturgy – simply had no impact any more.
  • The Pope must accompany those challenging existing ways of doing things! Others may know better than he does. Or, to feminize away the patriarchy: I dream of a church that is a mother and shepherdess. The church’s ministers must be merciful, take responsibility for the people, and accompany them like the good Samaritan, who washes, cleans, and raises up his neighbor. This is pure Gospel.
  • the key to Francis’ expression of faith is an openness to the future, a firm place in the present, and a willingness to entertain doubt, to discern new truths and directions, and to grow. Think of Benedict’s insistence on submission of intellect and will to the only authentic truth (the Pope’s), and then read this: Within the Church countless issues are being studied and reflected upon with great freedom. Differing currents of thought in philosophy, theology, and pastoral practice, if open to being reconciled by the Spirit in respect and love, can enable the Church to grow, since all of them help to express more clearly the immense riches of God’s word. For those who long for a monolithic body of doctrine guarded by all and leaving no room for nuance, this might appear as undesirable and leading to confusion. But in fact such variety serves to bring out and develop different facets of the inexhaustible riches of the Gospel.
  • Francis, like Jesus, has had such an impact in such a short period of time simply because of the way he seems to be. His being does not rely on any claims to inherited, ecclesiastical authority; his very way of life is the only moral authority he wants to claim.
  • faith is, for Francis, a way of life, not a set of propositions. It is a way of life in community with others, lived in the present yet always, deeply, insistently aware of eternity.
  • Father Howard Gray S.J. has put it simply enough: Ultimately, Ignatian spirituality trusts the world as a place where God dwells and labors and gathers all to himself in an act of forgiveness where that is needed, and in an act of blessing where that is prayed for.
  • Underlying all this is a profound shift away from an idea of religion as doctrine and toward an idea of religion as a way of life. Faith is a constantly growing garden, not a permanently finished masterpiece
  • Some have suggested that much of what Francis did is compatible with PTSD. He disowned his father and family business, and he chose to live homeless, and close to naked, in the neighboring countryside, among the sick and the animals. From being the dashing man of society he had once been, he became a homeless person with what many of us today would call, at first blush, obvious mental illness.
  • these actions – of humility, of kindness, of compassion, and of service – are integral to Francis’ resuscitation of Christian moral authority. He is telling us that Christianity, before it is anything else, is a way of life, an orientation toward the whole, a living commitment to God through others. And he is telling us that nothing – nothing – is more powerful than this.
  • I would not speak about, not even for those who believe, an “absolute” truth, in the sense that absolute is something detached, something lacking any relationship. Now, the truth is a relationship! This is so true that each of us sees the truth and expresses it, starting from oneself: from one’s history and culture, from the situation in which one lives, etc. This does not mean that the truth is variable and subjective. It means that it is given to us only as a way and a life. Was it not Jesus himself who said: “I am the way, the truth, the life”? In other words, the truth is one with love, it requires humbleness and the willingness to be sought, listened to and expressed.
  • “proselytism is solemn nonsense.” That phrase – deployed by the Pope in dialogue with the Italian atheist Eugenio Scalfari (as reported by Scalfari) – may seem shocking at first. But it is not about denying the revelation of Jesus. It is about how that revelation is expressed and lived. Evangelism, for Francis, is emphatically not about informing others about the superiority of your own worldview and converting them to it. That kind of proselytism rests on a form of disrespect for another human being. Something else is needed:
  • Instead of seeming to impose new obligations, Christians should appear as people who wish to share their joy, who point to a horizon of beauty and who invite others to a delicious banquet. It is not by proselytizing that the Church grows, but “by attraction.”
  • what you see in the life of Saint Francis is a turn from extreme violence to extreme poverty, as if only the latter could fully compensate for the reality of the former. This was not merely an injunction to serve the poor. It is the belief that it is only by being poor or becoming poor that we can come close to God
  • Pope Francis insists – and has insisted throughout his long career in the church – that poverty is a key to salvation. And in choosing the name Francis, he explained last March in Assisi, this was the central reason why:
  • Saint Francis. His conversion came after he had gone off to war in defense of his hometown, and, after witnessing horrifying carnage, became a prisoner of war. After his release from captivity, his strange, mystical journey began.
  • the priority of practice over theory, of life over dogma. Evangelization is about sitting down with anyone anywhere and listening and sharing and being together. A Christian need not be afraid of this encounter. Neither should an atheist. We are in this together, in the same journey of life, with the same ultimate mystery beyond us. When we start from that place – of radical humility and radical epistemological doubt – proselytism does indeed seem like nonsense, a form of arrogance and detachment, reaching for power, not freedom. And evangelization is not about getting others to submit their intellect and will to some new set of truths; it is about an infectious joy for a new way of living in the world. All it requires – apart from joy and faith – is patience.
  • “Preach the Gospel always. If necessary, with words.”
  • But there is little sense that a political or economic system can somehow end the problem of poverty in Francis’ worldview. And there is the discomfiting idea that poverty itself is not an unmitigated evil. There is, indeed, a deep and mysterious view, enunciated by Jesus, and held most tenaciously by Saint Francis, that all wealth, all comfort, and all material goods are suspect and that poverty itself is a kind of holy state to which we should all aspire.
  • Not only was Saint Francis to become homeless and give up his patrimony, he was to travel on foot, wearing nothing but a rough tunic held together with rope. Whatever else it is, this is not progressivism. It sees no structural, human-devised system as a permanent improver of our material lot. It does not envision a world without poverty, but instead a church of the poor and for the poor. The only material thing it asks of the world, or of God, is daily bread – and only for today, never for tomorrow.
  • From this perspective, the idea that a society should be judged by the amount of things it can distribute to as many people as possible is anathema. The idea that there is a serious social and political crisis if we cannot keep our wealth growing every year above a certain rate is an absurdity.
  • this is a 21st-century heresy. Which means, I think, that this Pope is already emerging and will likely only further emerge as the most potent critic of the newly empowered global capitalist project.
  • Now, the only dominant ideology in the world is the ideology of material gain – either through the relatively free markets of the West or the state-controlled markets of the East. And so the church’s message is now harder to obscure. It stands squarely against the entire dominant ethos of our age. It is the final resistance.
  • For Francis, history has not come to an end, and capitalism, in as much as it is a global ideology that reduces all of human activity to the cold currency of wealth, is simply another “ism” to be toppled in humankind’s unfolding journey toward salvation on earth.
  • Francis will grow as the church reacts to him; it will be a dynamic, not a dogma; and it will be marked less by the revelation of new things than by the new recognition of old things, in a new language. It will be, if its propitious beginnings are any sign, a patient untying of our collective, life-denying knots.
Javier E

The Problem With Confidence - NYTimes.com - 2 views

  • I almost never see problems caused by underconfidence, but I see (and create) problems related to overconfidence every day.
  • Much of the recent psychological research also suggests that overconfidence is our main cognitive problem, not the reverse. Daniel Kahneman’s book “Thinking, Fast and Slow” describes an exhaustive collection of experiments demonstrating how often people come to conclusions confidently and wrongly
  • my second reaction is to notice that people are phenomenally terrible at estimating their own self-worth. Some Americans seem to value themselves ridiculously too little while others value themselves ridiculously too highly.
  • If you want to talk about something real, it’s probably a mistake to use a suspect concept like self-confidence, which is self-oriented. It’s probably a better idea to think about competence, which is task-oriented. If you ask, “Am I competent?” at least you are measuring yourself according to the standards of a specific domain.
  • The person with the confidence mind-set is like the painfully self-conscious person at a dinner party who asks, “How am I coming across?” The person with an instrumentalist mind-set is serving a craft and asks “What does this specific job require?” The person with a confidence mind-set is told “Believe in yourself.” This arouses all sorts of historical prejudices and social stereotypes. The person with an instrumentalist mind-set is told “Look accurately at what you have done.”
  • One of the hard things in life is learning to ask questions that you can actually answer. For example, if you are thinking about taking a job, it’s probably foolish to ask, “What future opportunities will this lead to?” You can’t know. It’s probably better to ask, “Will going to this workplace be rewarding day to day?” which is more concrete.