TOK Friends: Group items matching "Logical" in title, tags, annotations or url

Julian Assange on Living in a Surveillance Society - NYTimes.com - 0 views

  • Describing the atomic bomb (which had only two months before been used to flatten Hiroshima and Nagasaki) as an “inherently tyrannical weapon,” he predicts that it will concentrate power in the hands of the “two or three monstrous super-states” that have the advanced industrial and research bases necessary to produce it. Suppose, he asks, “that the surviving great nations make a tacit agreement never to use the atomic bomb against one another? Suppose they only use it, or the threat of it, against people who are unable to retaliate?”
  • The likely result, he concludes, will be “an epoch as horribly stable as the slave empires of antiquity.” Inventing the term, he predicts “a permanent state of ‘cold war,’” a “peace that is no peace,” in which “the outlook for subject peoples and oppressed classes is still more hopeless.”
  • the destruction of privacy widens the existing power imbalance between the ruling factions and everyone else, leaving “the outlook for subject peoples and oppressed classes,” as Orwell wrote, “still more hopeless.”
  • At present even those leading the charge against the surveillance state continue to treat the issue as if it were a political scandal that can be blamed on the corrupt policies of a few bad men who must be held accountable. It is widely hoped that all our societies need to do to fix our problems is to pass a few laws.
  • The cancer is much deeper than this. We live not only in a surveillance state, but in a surveillance society. Totalitarian surveillance is not only embodied in our governments; it is embedded in our economy, in our mundane uses of technology and in our everyday interactions.
  • The very concept of the Internet — a single, global, homogenous network that enmeshes the world — is the essence of a surveillance state. The Internet was built in a surveillance-friendly way because governments and serious players in the commercial Internet wanted it that way. There were alternatives at every step of the way. They were ignored.
  • Unlike intelligence agencies, which eavesdrop on international telecommunications lines, the commercial surveillance complex lures billions of human beings with the promise of “free services.” Their business model is the industrial destruction of privacy. And yet even the more strident critics of NSA surveillance do not appear to be calling for an end to Google and Facebook
  • At their core, companies like Google and Facebook are in the same business as the U.S. government’s National Security Agency. They collect a vast amount of information about people, store it, integrate it and use it to predict individual and group behavior, which they then sell to advertisers and others. This similarity made them natural partners for the NSA
  • there is an undeniable “tyrannical” side to the Internet. But the Internet is too complex to be unequivocally categorized as a “tyrannical” or a “democratic” phenomenon.
  • It is possible for more people to communicate and trade with others in more places in a single instant than it ever has been in history. The same developments that make our civilization easier to surveil make it harder to predict. They have made it easier for the larger part of humanity to educate itself, to race to consensus, and to compete with entrenched power groups.
  • If there is a modern analogue to Orwell’s “simple” and “democratic weapon,” which “gives claws to the weak,” it is cryptography, the basis for the mathematics behind Bitcoin and the best secure communications programs. It is cheap to produce: cryptographic software can be written on a home computer. It is even cheaper to spread: software can be copied in a way that physical objects cannot. But it is also insuperable — the mathematics at the heart of modern cryptography are sound, and can withstand the might of a superpower. The same technologies that allowed the Allies to encrypt their radio communications against Axis intercepts can now be downloaded over a dial-up Internet connection and deployed with a cheap laptop. (A minimal code sketch follows these annotations.)
  • It is too early to say whether the “democratizing” or the “tyrannical” side of the Internet will eventually win out. But acknowledging them — and perceiving them as the field of struggle — is the first step toward acting effectively
  • Humanity cannot now reject the Internet, but clearly we cannot surrender it either. Instead, we have to fight for it. Just as the dawn of atomic weapons inaugurated the Cold War, the manifold logic of the Internet is the key to understanding the approaching war for the intellectual center of our civilization
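One of the annotations above claims that strong cryptography is cheap to produce ("cryptographic software can be written on a home computer") and cheap to deploy. As a minimal illustration of that claim only — my own sketch, not Assange's, and it assumes the third-party Python package "cryptography" is installed — authenticated encryption takes a few lines on an ordinary laptop:

    # Authenticated symmetric encryption in a handful of lines.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()              # 32-byte secret, generated locally
    box = Fernet(key)

    token = box.encrypt(b"meet at dawn")     # ciphertext plus integrity check
    assert box.decrypt(token) == b"meet at dawn"   # recoverable only with the key

Without the key, an eavesdropper is left attacking the underlying cipher itself, which is the sense in which the mathematics "can withstand the might of a superpower."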

Uber's Business Model Could Change Your Work - NYTimes.com - 0 views

  • Just as Uber is doing for taxis, new technologies have the potential to chop up a broad array of traditional jobs into discrete tasks that can be assigned to people just when they’re needed, with wages set by a dynamic measurement of supply and demand, and every worker’s performance constantly tracked, reviewed and subject to the sometimes harsh light of customer satisfaction.
  • Uber and its ride-sharing competitors, including Lyft and Sidecar, are the boldest examples of this breed, which many in the tech industry see as a new kind of start-up — one whose primary mission is to efficiently allocate human beings and their possessions, rather than information.
  • “I do think we are defining a new category of work that isn’t full-time employment but is not running your own business either,”
  • Various companies are now trying to emulate Uber’s business model in other fields, from daily chores like grocery shopping and laundry to more upmarket products like legal services and even medicine.
  • Proponents of on-demand work point out that many of the tech giants that sprang up over the last decade minted billions in profits without hiring very many people; Facebook, for instance, serves more than a billion users, but employs only a few thousand highly skilled workers, most of them in California.
  • But the rise of such work could also make your income less predictable and your long-term employment less secure. And it may relegate the idea of establishing a lifelong career to a distant memory.
  • “This on-demand economy means a work life that is unpredictable, doesn’t pay very well and is terribly insecure.” After interviewing many workers in the on-demand world, Dr. Reich said he has concluded that “most would much rather have good, well-paying, regular jobs.”
  • “We may end up with a future in which a fraction of the work force would do a portfolio of things to generate an income — you could be an Uber driver, an Instacart shopper, an Airbnb host and a Taskrabbit,”
  • at the end of 2014, Uber had 160,000 drivers regularly working for it in the United States. About 40,000 new drivers signed up in December alone, and the number of sign-ups was doubling every six months. (A worked version of this arithmetic follows these annotations.)
  • The report found that on average, Uber’s drivers worked fewer hours and earned more per hour than traditional taxi drivers, even when you account for their expenses. That conclusion, though, has raised fierce debate among economists, because it’s not clear how much Uber drivers really are paying in expenses. Drivers on the service use their own cars and pay for their gas; taxi drivers generally do not.
  • A survey of Uber drivers contained in the report found that most were already employed full or part time when they found Uber, and that earning an additional income on the side was a primary benefit of driving for Uber.
  • The larger worry about on-demand jobs is not about benefits, but about a lack of agency — a future in which computers, rather than humans, determine what you do, when and for how much. The rise of Uber-like jobs is the logical culmination of an economic and tech system that holds efficiency as its paramount virtue.
  • “These services are successful because they are tapping into people’s available time more efficiently,” Dr. Sundararajan said. “You could say that people are monetizing their own downtime.” Think about that for a second; isn’t “monetizing downtime” a hellish vision of the future of work?
  • “I’m glad if people like working for Uber, but those subjective feelings have got to be understood in the context of there being very few alternatives,” Dr. Reich said. “Can you imagine if this turns into a Mechanical Turk economy, where everyone is doing piecework at all odd hours, and no one knows when the next job will come, and how much it will pay? What kind of private lives can we possibly have, what kind of relationships, what kind of families?”
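To make the sign-up figure quoted above concrete (my own back-of-the-envelope arithmetic, not the report's): "doubling every six months" corresponds to a monthly growth factor of

    r = 2^{1/6} \approx 1.12,

that is, roughly 12 percent more new sign-ups each month. If the trend had held, the roughly 40,000 drivers who signed up in December 2014 would have become about 80,000 new sign-ups per month by mid-2015 and about 160,000 per month by the end of 2015.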

The Moral Bucket List - NYTimes.com - 0 views

  • ABOUT once a month I run across a person who radiates an inner light. These people can be in any walk of life. They seem deeply good. They listen well. They make you feel funny and valued. You often catch them looking after other people and as they do so their laugh is musical and their manner is infused with gratitude. They are not thinking about what wonderful work they are doing. They are not thinking about themselves at all.
  • two sets of virtues, the résumé virtues and the eulogy virtues. The résumé virtues are the skills you bring to the marketplace. The eulogy virtues are the ones that are talked about at your funeral — whether you were kind, brave, honest or faithful. Were you capable of deep love?
  • our culture and our educational systems spend more time teaching the skills and strategies you need for career success than the qualities you need to radiate that sort of inner light.
  • But if you live for external achievement, years pass and the deepest parts of you go unexplored and unstructured. You lack a moral vocabulary. It is easy to slip into a self-satisfied moral mediocrity. You grade yourself on a forgiving curve. You figure as long as you are not obviously hurting anybody and people seem to like you, you must be O.K
  • I set out to discover how those deeply good people got that way.
  • I came to the conclusion that wonderful people are made, not born — that the people I admired had achieved an unfakeable inner virtue, built slowly from specific moral and spiritual accomplishments.
  • THE HUMILITY SHIFT We live in the culture of the Big Me. The meritocracy wants you to promote yourself. Social media wants you to broadcast a highlight reel of your life
  • But all the people I’ve ever deeply admired are profoundly honest about their own weaknesses. They have identified their core sin, whether it is selfishness, the desperate need for approval, cowardice, hardheartedness or whatever. They have traced how that core sin leads to the behavior that makes them feel ashamed. They have achieved a profound humility, which has best been defined as an intense self-awareness from a position of other-centeredness.
  • SELF-DEFEAT External success is achieved through competition with others. But character is built during the confrontation with your own weakness. Dwight Eisenhower, for example, realized early on that his core sin was his temper. He developed a moderate, cheerful exterior because he knew he needed to project optimism and confidence to lead.
  • THE DEPENDENCY LEAP Many people give away the book “Oh, the Places You’ll Go!” as a graduation gift. This book suggests that life is an autonomous journey
  • people on the road to character understand that no person can achieve self-mastery on his or her own. Individual will, reason and compassion are not strong enough to consistently defeat selfishness, pride and self-deception. We all need redemptive assistance from outside.
  • People on this road see life as a process of commitment making. Character is defined by how deeply rooted you are. Have you developed deep connections that hold you up in times of challenge and push you toward the good? In the realm of the intellect, a person of character has achieved a settled philosophy about fundamental things. In the realm of emotion, she is embedded in a web of unconditional loves. In the realm of action, she is committed to tasks that can’t be completed in a single lifetime.
  • The stumbler doesn’t build her life by being better than others, but by being better than she used to be. Unexpectedly, there are transcendent moments of deep tranquillity. For most of their lives their inner and outer ambitions are strong and in balance. But eventually, at moments of rare joy, career ambitions pause, the ego rests, the stumbler looks out at a picnic or dinner or a valley and is overwhelmed by a feeling of limitless gratitude, and an acceptance of the fact that life has treated her much better than she deserves.
  • That kind of love decenters the self. It reminds you that your true riches are in another. Most of all, this love electrifies. It puts you in a state of need and makes it delightful to serve what you love. Day’s love for her daughter spilled outward and upward. As she wrote, “No human creature could receive or contain so vast a flood of love and joy as I often felt after the birth of my child. With this came the need to worship, to adore.”
  • She made unshakable commitments in all directions. She became a Catholic, started a radical newspaper, opened settlement houses for the poor and lived among the poor, embracing shared poverty as a way to build community, to not only do good, but be good
  • THE CALL WITHIN THE CALL We all go into professions for many reasons: money, status, security. But some people have experiences that turn a career into a calling. These experiences quiet the self. All that matters is living up to the standard of excellence inherent in their craft.
  • THE CONSCIENCE LEAP In most lives there’s a moment when people strip away all the branding and status symbols, all the prestige that goes with having gone to a certain school or been born into a certain family. They leap out beyond the utilitarian logic and crash through the barriers of their fears.
  • Commencement speakers are always telling young people to follow their passions. Be true to yourself. This is a vision of life that begins with self and ends with self. But people on the road to inner light do not find their vocations by asking, what do I want from life? They ask, what is life asking of me? How can I match my intrinsic talent with one of the world’s deep needs?
  • Their lives often follow a pattern of defeat, recognition, redemption. They have moments of pain and suffering. But they turn those moments into occasions of radical self-understanding — by keeping a journal or making art. As Paul Tillich put it, suffering introduces you to yourself and reminds you that you are not the person you thought you were
  • The people on this road see the moments of suffering as pieces of a larger narrative. They are not really living for happiness, as it is conventionally defined. They see life as a moral drama and feel fulfilled only when they are enmeshed in a struggle on behalf of some ideal.
  • This is a philosophy for stumblers. The stumbler scuffs through life, a little off balance. But the stumbler faces her imperfect nature with unvarnished honesty, with the opposite of squeamishness. Recognizing her limitations, the stumbler at least has a serious foe to overcome and transcend. The stumbler has an outstretched arm, ready to receive and offer assistance. Her friends are there for deep conversation, comfort and advice.
  • External ambitions are never satisfied because there’s always something more to achieve. But the stumblers occasionally experience moments of joy. There’s joy in freely chosen obedience to organizations, ideas and people. There’s joy in mutual stumbling. There’s an aesthetic joy we feel when we see morally good action, when we run across someone who is quiet and humble and good, when we see that however old we are, there’s lots to do ahead.
  • ENERGIZING LOVE
  • Those are the people we want to be.

Are scientists blocking their own progress? - The Washington Post - 1 views

  • Max Planck won a Nobel prize for his revolutionary work in quantum mechanics, but it was his interest in the philosophy of science that led to what is now called “Planck’s Principle.” Planck argued that science was an evolving system of thought which changes slowly over time, fueled by the deaths of old ideas. As he wrote in his 1968 autobiography: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”
  • Is our understanding of the world based in pure objective reason, or are the theories that underpin it shaped by generational biases? Do our most famous thinkers actually block new ideas from gaining ground?
  • A new paper published by the National Bureau of Economic Research suggests that fame does play a significant role in deciding when and whether new scientific ideas can gain traction. When a prominent scientist dies, the paper’s authors found, the number of articles published by his or her collaborators tends to fall “precipitously” in the years following the death — those supporters tend not to continue advocating for a once-famous scientist’s ideas once the scientist is gone.
  • the number of research articles written by other scientists — including those with opposing ideas — increases by 8 percent on average, implying that the work of these scientists had been stifled before, but that after the death of a ubiquitous figure, the field becomes more open to new ideas. The study also found that these new articles are less likely to cite previous research and are more likely to be cited by others in the field. Death signifies a changing of the guard
  • Our instinct is often to view science as a concrete tower, growing ever upward and built upon the immovable foundations of earlier pioneers.  Sir Isaac Newton famously characterized this as “standing on the shoulders of giants.”
  • Mid-20th century philosopher Thomas Kuhn was among the first to come to this conclusion, in his 1962 book “The Structure of Scientific Revolutions.” He argued that scientific theories appeared in punctuated “paradigm shifts,” in which the underlying assumptions of a field are questioned and eventually overthrown
  • Kuhn’s book was, to some extent, a paradigm shift in its own right. According to his logic, commonly held notions in science were bound to change and become outdated. What we believe today will tomorrow be revised, rewritten — and in the most extreme cases ridiculed.
  • the journal Nature earlier this year said scientific data is prone to bias because researchers design experiments and make observations in ways that support hypotheses
  • equally as important are simple shifts in perspective. It only takes one researcher seeing an accepted scientific model in a new light for a solidified paradigm to enter what Kuhn called a “crisis phase” and beg for alternative explanations
  • The NBER study shows that those who questioned consensus ought to be given the opportunity to make their case, not ignored, silenced or pushed to the back of the line.
  • We’re likely to see these “paradigm shifts” happen at a much faster rate as data and research become easier to share worldwide. For some, this reality might seem chaotic; for the truly curious, it is exhilarating. The result may be a more democratic version of science — one in which the progress of ideas doesn’t have to wait until the funeral of a great mind.

Where were Republican moderates 20 years ago? - The Washington Post - 0 views

  • There have always been radicals on both sides of the political spectrum. But what is different about the conservative movement is that, since the 1990s, some of its most distinguished mainstream members have embraced the rhetoric and tactics of the extremes.
  • It is gratifying to see the National Review mobilize against Trump, decrying his “free-floating populism” and disdain for the details of public policy. But where were the magazine’s editors when Sarah Palin put these same forces on full display eight years ago? Loudly cheering her on.
  • Palin knew next to nothing about national or international public policy, but she almost celebrated that ignorance, playing to the anti-intellectualism and anti-elitism of parts of the conservative base.
  • But over the past decade, I can recall conversations with some of these individuals in which they refused to accept that there was any problem within the Republican Party, attributing such criticism to media bias.
  • We still see this denial, with the truly bizarre claim by some in the media that the rise of Trump is really all the fault of . . . Obama. The logic is varied.
  • Here is a much simpler explanation for Donald Trump: Republicans have fed the country ideas about decline, betrayal and treason. They have encouraged the forces of anti-intellectualism, obstructionism and populism. They have flirted with bigotry and racism. Trump merely chose to unashamedly embrace all of it, saying plainly what they were hinting at for years. In doing so, he hit a jackpot.
  • The problem is not that Republican leaders should have begun to condemn Trump last year. It is that they should have condemned the ideas and tactics that led to his rise when they began to flourish 20 years ago.

Physicists in Europe Find Tantalizing Hints of a Mysterious New Particle - The New York... - 0 views

  • Two teams of physicists working independently at the Large Hadron Collider at CERN, the European Organization for Nuclear Research, reported on Tuesday that they had seen traces of what could be a new fundamental particle of nature.
  • One possibility, out of a gaggle of wild and not-so-wild ideas springing to life as the day went on, is that the particle — assuming it is real — is a heavier version of the Higgs boson, a particle that explains why other particles have mass. Another is that it is a graviton, the supposed quantum carrier of gravity, whose discovery could imply the existence of extra dimensions of space-time.
  • At the end of a long chain of “ifs” could be a revolution, the first clues to a theory of nature that goes beyond the so-called Standard Model, which has ruled physics for the last quarter-century.
  • The Higgs boson was the last missing piece of the Standard Model, which explains all we know about subatomic particles and forces. But there are questions this model does not answer, such as what happens at the bottom of a black hole, the identity of the dark matter and dark energy that rule the cosmos, or why the universe is matter and not antimatter.
  • When physicists announced in 2012 that they had indeed discovered the Higgs boson, it was not the end of physics. It was not even, to paraphrase Winston Churchill, the beginning of the end.
  • A coincidence is the most probable explanation for the surprising bumps in data from the collider, physicists from the experiments cautioned, saying that a lot more data was needed and would in fact soon be available
  • The Large Hadron Collider was built at a cost of some $10 billion, to speed protons around an 18-mile underground track at more than 99 percent of the speed of light and smash them together in search of new particles and forces of nature. By virtue of Einstein’s equivalence of mass and energy, the more energy poured into these collisions, the more massive particles can come out of them. And by the logic of quantum microscopy, the more energy they have to spend, the smaller and more intimate details of nature physicists can see. (These relations are sketched in rough form after these annotations.)
  • Since June, after a two-year shutdown, CERN physicists have been running their collider at nearly twice the energy with which they discovered the Higgs, firing twin beams of protons with 6.5 trillion electron volts of energy at each other in search of new particles to help point them to deeper laws.
  • The most intriguing result so far, reported on Tuesday, is an excess of pairs of gamma rays corresponding to an energy of about 750 billion electron volts. The gamma rays, the physicists said, could be produced by the radioactive decay of a new particle, in this case perhaps a cousin of the Higgs boson, which itself was first noticed because it decayed into an abundance of gamma rays.
  • Or it could be a more massive particle that has decayed in steps down to a pair of photons. Nobody knows. No model predicted this, which is how some scientists like it.
  • “We are barely coming to terms with the power and the glory” of the CERN collider’s ability to operate at 13 trillion electron volts, Dr. Spiropulu said in a text message. “We are now entering the era of taking a shot in the dark!”
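For readers who want the logic in the annotations above spelled out, the relevant relations are roughly these (my own summary, not taken from the article):

    E = m c^2                                   % more collision energy allows heavier particles to be created
    \lambda \approx \frac{\hbar c}{E}           % higher energy probes smaller distances
    m_{\gamma\gamma} c^2 = \sqrt{2 E_1 E_2 (1 - \cos\theta)}   % invariant mass of a photon pair

The "750 billion electron volts" of the reported excess is the invariant mass reconstructed from pairs of photons with energies E_1 and E_2 separated by an angle theta; a genuine new particle decaying to two photons would show up as a bump at one fixed value of that quantity, which is why more data can confirm or dissolve the hint.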

Let's call them all lunatics: Fearful "balanced" "journalists" let wingnuts run wild - ... - 0 views

  • In their 2012 book, “It’s Even Worse Than it Looks: How the American Constitutional System Collided With the New Politics of Extremism,” Thomas Mann and Norm Ornstein argued that America’s political dysfunction had two causes: First, the mismatch between our constitutional system, requiring compromise, and our increasingly polarized, parliamentary-style politics.
  • Second, the fact that polarization has been asymmetric, turning the GOP into an insurrectionary anti-government party, even when in power.
  • Despite overwhelming historical data showing asymmetrical polarization in Congress (more recent additions here), their argument did not convince the anecdote-obsessed Beltway pundit class, with its deep belief that “both sides do it,” no matter what “it” may be.
  • It’s true there are “extremists on both sides,” but as this Wonk Blog post showed, the percentage of non-centrist Republicans skyrocketed from under 10 percent in the Ford years (less than Democrats) to almost 90 percent today, while the Democratic percentage has stayed basically flat.
  • What’s more, in the last session (2013-2014), the data shows that 147 House Republicans — more than half the caucus — were more ideologically extreme than the most extreme Democrat in the House. There is simply no comparison between the two parties.
  • But it’s a fact that “balanced” journalism has to ignore. To admit that the political world isn’t balanced would shake their whole belief system to its core. And yet, the shaking seems to have begun
  • The GOP’s strategic logic is simple and straightforward: If the media is going to split the difference between what Democrats and Republicans say, then if Republicans simply double their demands, suddenly the media, embracing the “sensible center,” will now articulate the old GOP position as the “sensible center,” the “common sense” place to be
  • Their stubborn adherence to a false balance narrative has, ironically, become an integral part of the GOP’s relentless rightward push. By talking about “government dysfunction” instead of “Republican obstruction,” the media actively helps the most extreme anti-government Republicans thwart any efforts at competent governance and it helps promote their “government is horrible” worldview
  • There was once a penalty for becoming too politically extreme: one’s actions would be characterized as unrealistic, destructive, heedless of past experience, etc. Sometimes this was justified, sometimes not (as with the Civil Rights movement). But right or wrong, this media practice inhibited radical movements in either direction.
  • For quite some time now, however, conservative Republicans have realized that by moving right and attacking the media for any criticism, they can turn the media into a tacit ally, forcing them to treat preposterous claims as serious ideas, or even proven facts
  • Norm’s response underscores the reality of asymmetric polarization, which the mainstream media and most good government groups have avoided discussing — at great costs to the country
  • Thus, when they were planning to force a government shutdown, a key part of their strategy was spinning the media with a preposterous argument that it was the Democrats who were shutting down the government, even though, as the New York Times reported, the shutdown plan traced back to a meeting early in President Obama’s second term, led by former Attorney General Edwin R. Meese.
  • What’s more, once the media plays along, it’s a trick that can be used over and over again. One can keep moving farther and farther right indefinitely, pulling the “objective” media along for the ride, every step of the way. (Conservatives even developed an operational model to describe the process, known as the “Overton Window,” explained by a conservative activist here.)
  • The basis for all this is a cultural illusion that the “nonpartisan” media is somehow objective, philosophically in tune with science.
  • historically, this is far from true. Up until the late 19th century, American journalism was quite partisan, serving substantial “niche” audiences, sustained by subscriptions.
  • When advertising exploded as a revenue source in the early 20th century, a new journalistic model emerged, trying to appeal across parties, while taking care not to anger large advertisers. The broader story is well told by Paul Starr in “The Creation of the Media.”
  • Jeremy Iggers incorporates this history into his account of how journalism ethics confuses the purposes of journalism in “Good News, Bad News: Journalism Ethics and the Public Interest.”
  • Such is the basis for the media’s claims of “objectivity.” Starr’s history explains the forces leading to why this happened.
  • the blogosphere’s origins were not just Usenet, email lists and the like; they were also the underground press tracing back to I.F. Stone’s Weekly and George Seldes’ In Fact; the black press, both commercial and movement-based; political journals of the left and right; and so on
  • These underappreciated traditions provide largely untapped examples of how to do quality political journalism outside of the artificial construct in which false balance is rooted. They point the way forward for us, beyond our current state of asymmetrical dysfunction.

How to Cultivate the Art of Serendipity - The New York Times - 0 views

  • A surprising number of the conveniences of modern life were invented when someone stumbled upon a discovery or capitalized on an accident
  • wonder whether we can train ourselves to become more serendipitous. How do we cultivate the art of finding what we’re not seeking?
  • Croatian has no word to capture the thrill of the unexpected discovery, so she was delighted when — after moving to the United States on a Fulbright scholarship in the 1980s — she learned the English word “serendipity.”
  • Today we think of serendipity as something like dumb luck. But its original meaning was very different.
  • suggested that this old tale contained a crucial idea about human genius: “As their highnesses travelled, they were always making discoveries, by accident and sagacity, of things which they were not in quest of.” And he proposed a new word — “serendipity” — to describe this princely talent for detective work. At its birth, serendipity meant a skill rather than a random stroke of good fortune.
  • sees serendipity as something people do. In the mid-1990s, she began a study of about 100 people to find out how they created their own serendipity, or failed to do so.
  • As people dredge the unknown, they are engaging in a highly creative act. What an inventor “finds” is always an expression of him- or herself.
  • You become a super-encounterer, according to Dr. Erdelez, in part because you believe that you are one — it helps to assume that you possess special powers of perception
  • “gathering string” is just another way of talking about super-encountering. After all, “string” is the stuff that accumulates in a journalist’s pocket. It’s the note you jot down in your car after the interview, the knickknack you notice on someone’s shelf, or the anomaly that jumps out at you in Appendix B of an otherwise boring research study.
  • came up with the term super-encounterer to give us a way to talk about the people rather than just the discoveries. Without such words, we tend to become dazzled by the happy accident itself, to think of it as something that exists independent of an observer.
  • We can slip into a twisted logic in which we half-believe the penicillin picked Alexander Fleming to be its emissary, or that the moons of Jupiter wanted to be seen by Galileo. But discoveries are products of the human mind.
  • subjects fell into three distinct groups. Some she called “non-encounterers”; they saw through a tight focus, a kind of chink hole, and they tended to stick to their to-do lists when searching for information rather than wandering off into the margins. Other people were “occasional encounterers,” who stumbled into moments of serendipity now and then. Most interesting were the “super-encounterers,” who reported that happy surprises popped up wherever they looked.
  • One survey of patent holders (the PatVal study of European inventors, published in 2005) found that an incredible 50 percent of patents resulted from what could be described as a serendipitous process. Thousands of survey respondents reported that their idea evolved when they were working on an unrelated project — and often when they weren’t even trying to invent anything.
  • need to develop a new, interdisciplinary field — call it serendipity studies — that can help us create a taxonomy of discoveries in the chemistry lab, the newsroom, the forest, the classroom, the particle accelerator and the hospital. By observing and documenting the many different “species” of super-encounterers, we might begin to understand their minds.
  • Of course, even if we do organize the study of serendipity, it will always be a whimsical undertaking, given that the phenomenon is difficult to define, amazingly variable and hard to capture in data. The clues will no doubt emerge where we least expect them

BBC - Future - What Sherlock Holmes taught us about the mind - 0 views

  • The century-old detective stories are being studied by today’s neurologists – but why? As it turns out, not even modern technology can replace their lessons in rational thinking.
  • Arthur Conan Doyle was a physician himself, and there is evidence that he modelled the character of Holmes on one of the leading doctors of the day, Joseph Bell of the Royal Edinburgh Infirmary. “I thought I would try my hand at writing a story where the hero would treat crime as Dr Bell treated disease,”
  • Conan Doyle may have also drawn some inspiration from other doctors, such as William Gowers, who wrote the Bible of Neurology
  • Gowers often taught his students to begin their diagnosis from the moment a patient walked through the door
  • “Did you notice him as he came into the room? If you did not then you should have done so. One of the habits to be acquired and never omitted is to observe a patient as he enters the room; to note his aspect and his gait. If you did so, you would have seen that he seemed lame, and you may have been struck by that which must strike you now – an unusual tint of his face.”
  • the importance of the seemingly inconsequential that seems to inspire both men. “It has long been an axiom of mine that the little things are infinitely the most important,” Conan Doyle wrote
  • Both Gowers and Holmes also warned against letting your preconceptions fog your judgement. For both men, cool, unprejudiced observation was the order of the day. It is for this reason that Holmes chastises Watson in “A Scandal in Bohemia”: “You see, but you do not observe. The distinction is clear.”
  • Gowers: “The method you should adopt is this: Whenever you find yourself in the presence of a case that is not familiar to you in all its detail forget for a time all your types and all your names. Deal with the case as one that has never been seen before, and work it out as a new problem sui generis, to be investigated as such.”
  • both men “reasoned backwards”, for instance, dissecting all the possible paths that may have led to a particular disease (in Gowers’ case) or murder (in Holmes’)
  • Holmes’ most famous aphorism: “When you have eliminated the impossible, whatever remains, however improbable, must be the truth.”
  • the most important lesson to be learned, from both Gowers and Holmes, is the value of recognising your errors. “Gentlemen – It is always pleasant to be right, but it is generally a much more useful thing to be wrong,” wrote Gowers
  • This humility is key in beating the ‘curse of expertise’ that afflicts so many talented and intelligent people.
  • University College London has documented many instances in which apparent experts in both medicine and forensic science have allowed their own biases to cloud their judgements – sometimes even in life or death situations.
  • Even the most advanced technology can never replace the powers of simple observation and rational deduction.

Can truth survive this president? An honest investigation. - The Washington Post - 0 views

  • in the summer of 2002, long before “fake news” or “post-truth” infected the vernacular, one of President George W. Bush’s top advisers mocked a journalist for being part of the “reality-based community.” Seeking answers in reality was for suckers, the unnamed adviser explained. “We’re an empire now, and when we act, we create our own reality.”
  • This was the hubris and idealism of a post-Cold War, pre-Iraq War superpower: If you exert enough pressure, events will bend to your will.
  • the deceit emanating from the White House today is lazier, more cynical. It is not born of grand strategy or ideology; it is impulsive and self-serving. It is not arrogant, but shameless.
  • Bush wanted to remake the world. President Trump, by contrast, just wants to make it up as he goes along
  • Through all their debates over who is to blame for imperiling truth (whether Trump, postmodernism, social media or Fox News), as well as the consequences (invariably dire) and the solutions (usually vague), a few conclusions materialize, should you choose to believe them.
  • There is a pattern and logic behind the dishonesty of Trump and his surrogates; however, it’s less multidimensional chess than the simple subordination of reality to political and personal ambition
  • Trump’s untruth sells best precisely when feelings and instincts overpower facts, when America becomes a safe space for fabrication.
  • Rand Corp. scholars Jennifer Kavanagh and Michael D. Rich point to the Gilded Age, the Roaring Twenties and the rise of television in the mid-20th century as recent periods of what they call “Truth Decay” — marked by growing disagreement over facts and interpretation of data; a blurring of lines between opinion, fact and personal experience; and diminishing trust in once-respected sources of information.
  • In eras of truth decay, “competing narratives emerge, tribalism within the U.S. electorate increases, and political paralysis and dysfunction grow,”
  • Once you add the silos of social media as well as deeply polarized politics and deteriorating civic education, it becomes “nearly impossible to have the types of meaningful policy debates that form the foundation of democracy.”
  • To interpret our era’s debasement of language, Kakutani reflects perceptively on the World War II-era works of Victor Klemperer, who showed how the Nazis used “words as ‘tiny doses of arsenic’ to poison and subvert the German culture,” and of Stefan Zweig, whose memoir “The World of Yesterday” highlights how ordinary Germans failed to grasp the sudden erosion of their freedoms.
  • Kakutani calls out lefty academics who for decades preached postmodernism and social constructivism, which argued that truth is not universal but a reflection of relative power, structural forces and personal vantage points.
  • postmodernists rejected Enlightenment ideals as “vestiges of old patriarchal and imperialist thinking,” Kakutani writes, paving the way for today’s violence against fact in politics and science.
  • “dumbed-down corollaries” of postmodernist thought have been hijacked by Trump’s defenders, who use them to explain away his lies, inconsistencies and broken promises.
  • intelligent-design proponents and later climate deniers drew from postmodernism to undermine public perceptions of evolution and climate change. “Even if right-wing politicians and other science deniers were not reading Derrida and Foucault, the germ of the idea made its way to them: science does not have a monopoly on the truth,
  • McIntyre quotes at length from mea culpas by postmodernist and social constructivist writers agonizing over what their theories have wrought, shocked that conservatives would use them for nefarious purposes
  • pro-Trump troll and conspiracy theorist Mike Cernovich, who helped popularize the “Pizzagate” lie, has forthrightly cited his unlikely influences. “Look, I read postmodernist theory in college,” Cernovich told the New Yorker in 2016. “If everything is a narrative, then we need alternatives to the dominant narrative. I don’t seem like a guy who reads [Jacques] Lacan, do I?”
  • When truth becomes malleable and contestable regardless of evidence, a mere tussle of manufactured narratives, it becomes less about conveying facts than about picking sides, particularly in politics.
  • In “On Truth,” Cambridge University philosopher Simon Blackburn writes that truth is attainable, if at all, “only at the vanishing end points of enquiry,” adding that, “instead of ‘facts first’ we may do better if we think of ‘enquiry first,’ with the notion of fact modestly waiting to be invited to the feast afterward.”
  • He is concerned, but not overwhelmingly so, about the survival of truth under Trump. “Outside the fevered world of politics, truth has a secure enough foothold,” Blackburn writes. “Perjury is still a serious crime, and we still hope that our pilots and surgeons know their way about.”
  • Kavanagh and Rich offer similar consolation: “Facts and data have become more important in most other fields, with political and civil discourse being striking exceptions. Thus, it is hard to argue that the world is truly ‘post-fact.’”
  • McIntyre argues persuasively that our methods of ascertaining truth — not just the facts themselves — are under attack, too, and that this assault is especially dangerous.
  • Ideologues don’t just disregard facts they disagree with, he explains, but willingly embrace any information, however dubious, that fits their agenda. “This is not the abandonment of facts, but a corruption of the process by which facts are credibly gathered and reliably used to shape one’s beliefs about reality. Indeed, the rejection of this undermines the idea that some things are true irrespective of how we feel about them.”
  • “It is hardly a depressing new phenomenon that people’s beliefs are capable of being moved by their hopes, grievances and fears,” Blackburn writes. “In order to move people, objective facts must become personal beliefs.” But it can’t work — or shouldn’t work — in reverse.
  • More than fearing a post-truth world, Blackburn is concerned by a “post-shame environment,” in which politicians easily brush off their open disregard for truth.
  • it is human nature to rationalize away the dissonance. “Why get upset by his lies, when all politicians lie?” Kakutani asks, distilling the mind-set. “Why get upset by his venality, when the law of the jungle rules?”
  • So any opposition is deemed a witch hunt, or fake news, rigged or just so unfair. Trump is not killing the truth. But he is vandalizing it, constantly and indiscriminately, diminishing its prestige and appeal, coaxing us to look away from it.
  • the collateral damage includes the American experiment.
  • “One of the most important ways to fight back against post-truth is to fight it within ourselves,” he writes, whatever our particular politics may be. “It is easy to identify a truth that someone else does not want to see. But how many of us are prepared to do this with our own beliefs? To doubt something that we want to believe, even though a little piece of us whispers that we do not have all the facts?”

Hearing Bilingual - How Babies Tell Languages Apart - NYTimes.com - 4 views

  • In one recent study, Dr. Werker and her collaborators showed that babies born to bilingual mothers not only prefer both of those languages over others — but are also able to register that the two languages are different. In addition to this ability to use rhythmic sound to discriminate between languages, Dr. Werker has studied other strategies that infants use as they grow, showing how their brains use different kinds of perception to learn languages, and also to keep them separate.
  • Over the past decade, Ellen Bialystok, a distinguished research professor of psychology at York University in Toronto, has shown that bilingual children develop crucial skills in addition to their double vocabularies, learning different ways to solve logic problems or to handle multitasking, skills that are often considered part of the brain’s so-called executive function. These higher-level cognitive abilities are localized to the frontal and prefrontal cortex in the brain. “Overwhelmingly, children who are bilingual from early on have precocious development of executive function,” Dr. Bialystok said. Dr. Kuhl calls bilingual babies “more cognitively flexible” than monolingual infants.
  •  
    I had no idea that language could play such a huge role in the development of an infant! This makes me wonder what other external social factors might come into play, like music or visual perception.

'ContraPoints' Is Political Philosophy Made for YouTube - The Atlantic - 1 views

  • While Wynn positions herself on the left, she is no dogmatic ideologue, readily admitting to points on the right and criticizing leftist arguments when warranted
  • She has described her work as “edutainment” and “propaganda,” and it’s both
  • But what makes her videos unique is the way Wynn combines those two elements: high standards of rational argument and not-quite-rational persuasion. ContraPoints offers compelling speech aimed at truth, rendered in the raucous, meme-laden idiom of the internet.
  • In 2014, Wynn noticed a trend on YouTube that disturbed her: Videos with hyperbolic titles like “why feminism ruins everything,” “SJW cringe compilation,” and “Ben Shapiro DESTROYS Every College Snowflake” were attracting millions of views and spawning long, jeering comment threads. Wynn felt she was watching the growth of a community of outrage that believes feminists, Marxists, and multiculturalists are conspiring to destroy freedom of speech, liquidate gender norms, and demolish Western civilization
  • Wynn created ContraPoints to offer entertaining, coherent rebuttals to these kinds of ideas. Her videos also explain left-wing talking points—like rape culture and cultural appropriation—and use philosophy to explore topics that are important to Wynn, such as the meaning of gender for trans people.
  • Wynn thinks it’s a mistake to assume that viewers of angry, right-wing videos are beyond redemption. “It’s quite difficult to get through to the people who are really committed to these anti-progressive beliefs,” Wynn told me recently. However, she said, she believes that many viewers find such ideas “psychologically resonant” without being hardened reactionaries. This broad, not fully committed center—comprising people whose minds can still be changed—is Wynn’s target audience.
  • Usually, the videos to which Wynn is responding take the stance of dogged reason cutting through the emotional excesses of so-called “political correctness.” For example, the American conservative commentator Ben Shapiro, who is a target of a recent ContraPoints video, has made “facts don’t care about your feelings” his motto. Wynn’s first step in trying to win over those who find anti-progressive views appealing is to show that these ideas often rest on a flimsy foundation. To do so, she fully adopts the rational standards of argument that her rivals pride themselves on following, and demonstrates how they fail to achieve them
  • Wynn dissects her opponents’ positions, holding up fallacies, evasions, and other rhetorical tricks for examination, all the while providing a running commentary on good argumentative method.
  • The host defends her own positions according to the same principles. Wynn takes on the strongest version of her opponent’s argument, acknowledges when she thinks her opponents are right and when she has been wrong, clarifies when misunderstood, and provides plenty of evidence for her claims
  • Wynn is a former Ph.D. student in philosophy, and though her videos are too rich with dick jokes for official settings, her argumentative practice would pass muster in any grad seminar.
  • she critiques many of her leftist allies for being bad at persuasion.
  • Socrates persuaded by both the logic of argument and the dynamic of fandom. Wynn is beginning to grow a dedicated following of her own: Members of online discussion groups refer to her as “mother” and “the queen,” produce fan art, and post photos of themselves dressed as characters from her videos.
  • she shares Socrates’s view that philosophy is more an erotic art than a martial one
  • As she puts it, she’s not trying to destroy the people she addresses, but seduce them
  • for Wynn, the true key to persuasion is to engage her audience on an emotional level.
  • One thing she has come across repeatedly is a disdain for the left’s perceived moral superiority. Anti-progressives of all stripes, Wynn told me, show an “intense defensiveness against being told what to do” and a “repulsion in response to moralizing.”
  • Matching her speech to the audience’s tastes presents a prickly rhetorical challenge. In an early video, Contra complains: “The problem is this medium. These goddamn savages demand a circus, and I intend to give them one, but behind the curtain, I really just want to have a conversation.”
  • Philosophical conversation requires empathy and good-faith engagement. But the native tongue of political YouTube is ironic antagonism. It’s Wynn’s inimitable way of combining these two ingredients that gives ContraPoints its distinctive mouthfeel.
  • Wynn spends weeks in the online communities of her opponents—whether they’re climate skeptics or trans-exclusionary feminists—trying to understand what they believe and why they believe it. In Socrates’s words, she’s studying the souls of her audience.

How an Argument Over Zombies Helps Explain What Makes Us Human | Big Think - 1 views

  • Zombies are a big part of our pop culture. They are both a cathartic exploration of what it means to be human and a vehicle for social commentary. The word “zombie” comes from Haitian folklore and refers to a corpse animated by witchcraft.
  • The concept is kind of a mind trick. Imagine a being that looks and even talks like a human. It goes through all the normal motions of a human and yet has no consciousness. And you would have no idea that it is not like you.
  • If a p-zombie that is exactly like us, except for the sense of self and consciousness, is logically conceivable, then it could support dualism, an alternative view that sees the world as consisting not just of the physical but also the mental.
  •  
    What is human? In past paradigms, we thought what separated humans from all other things was spirit. Then we thought it was reason that distinguished humans from other animals. However, in the modern paradigm we discussed, we say that there is no spirit or reason in humans, and that the decisions we make are basically not our own: they are the result of numerous subconscious suggestions. So is there really a difference between a zombie and a human? I think there are people who live like zombies, and "zombies" that live like humans. --Sissi (4/24/2017)

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we're not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.

What's behind the confidence of the incompetent? This suddenly popular psychological ph... - 0 views

  • Someone who has very little knowledge in a subject claims to know a lot. That person might even boast about being an expert.
  • This phenomenon has a name: the Dunning-Kruger effect. It’s not a disease, syndrome or mental illness; it is present in everybody to some extent, and it’s been around as long as human cognition, though only recently has it been studied and documented in social psychology.
  • Charles Darwin followed that up in 1871 with “ignorance more frequently begets confidence than does knowledge.”
  • ...10 more annotations...
  • Put simply, incompetent people think they know more than they really do, and they tend to be more boastful about it.
  • To test Darwin’s theory, the researchers quizzed people on several topics, such as grammar, logical reasoning and humor. After each test, they asked the participants how they thought they did. Specifically, participants were asked how many of the other quiz-takers they beat.
  • Time after time, no matter the subject, the people who did poorly on the tests ranked their competence much higher
  • On average, test takers who scored as low as the 10th percentile ranked themselves near the 70th percentile. Those least likely to know what they were talking about believed they knew as much as the experts. (A rough numerical sketch of this gap appears after this list.)
  • Dunning and Kruger’s results have been replicated in at least a dozen different domains: math skills, wine tasting, chess, medical knowledge among surgeons and firearm safety among hunters.
  • Even though President Trump’s statements are rife with errors, falsehoods or inaccuracies, he expresses great confidence in his aptitude. He says he does not read extensively because he solves problems “with very little knowledge other than the knowledge I [already] had.” He has said in interviews he doesn’t read lengthy reports because “I already know exactly what it is.”
  • He has “the best words” and cites his “high levels of intelligence” in rejecting the scientific consensus on climate change. Decades ago, he said he could end the Cold War: “It would take an hour and a half to learn everything there is to learn about missiles,” Trump told The Washington Post’s Lois Romano over dinner in 1984. “I think I know most of it anyway.”
  • Whether people want to understand “the other side” or they’re just looking for an epithet, the Dunning-Kruger effect works as both, Dunning said, which he believes explains the rise of interest.
  • Dunning says the effect is particularly dangerous when someone with influence or the means to do harm doesn’t have anyone who can speak honestly about their mistakes.
  • Not surprisingly (though no less concerning), Dunning’s follow-up research shows the poorest performers are also the least likely to accept criticism or show interest in self-improvement.
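
A rough illustration of the self-assessment gap described in the annotations above. The quartile figures below are hypothetical, chosen only to mirror the reported pattern (bottom performers near the 10th percentile rating themselves near the 70th); they are not the study's actual data.

```python
# Hypothetical figures illustrating the Dunning-Kruger pattern described above:
# low scorers sharply overestimate their standing, top scorers slightly
# underestimate theirs. Invented for illustration; not data from the study.
quartiles = {
    "bottom": {"actual_percentile": 12, "self_estimate": 68},
    "second": {"actual_percentile": 37, "self_estimate": 60},
    "third":  {"actual_percentile": 63, "self_estimate": 65},
    "top":    {"actual_percentile": 88, "self_estimate": 75},
}

for name, scores in quartiles.items():
    # Positive gap = overconfidence; negative gap = underestimation.
    gap = scores["self_estimate"] - scores["actual_percentile"]
    print(f"{name:>6} quartile: actual {scores['actual_percentile']:2d}, "
          f"self-estimate {scores['self_estimate']:2d}, gap {gap:+d}")
```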

'Ebooks are stupid', says head of one of world's biggest publishers | Books | The Guardian - 1 views

  • According to Nourry, the “plateau, or rather slight decline”, that ebook sales have seen in the US and the UK in recent years is “not going to reverse”.
  • “It’s the limit of the ebook format. The ebook is a stupid product. It is exactly the same as print, except it’s electronic. There is no creativity, no enhancement, no real digital experience,”
  • “We, as publishers, have not done a great job going digital. We’ve tried. We’ve tried enhanced or enriched ebooks – didn’t work. We’ve tried apps, websites with our content – we have one or two successes among a hundred failures. I’m talking about the entire industry. We’ve not done very well,”
  • ...2 more annotations...
  • “I’m convinced there is something we can invent using our content and digital properties beyond ebooks, but I reached the conclusion that we don’t really have the skills and talents in our companies, because publishers and editors are accustomed to picking a manuscript and creating a design on a flat page. They don’t really know the full potential of 3-D and digital,”
  • “This wasn’t just coming from thinking of our revenues. If you let the price of ebooks go down to say $2 or $3 in western markets, you are going to kill all infrastructure, you’re going to kill booksellers, you’re going to kill supermarkets, and you are going to kill the author’s revenues,” he said. “You have to defend the logic of your market against the interest of the big technology companies and their business models. The battle in 2014 was all about that. We had to do it.”

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P... - 0 views

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • ...97 more annotations...
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. — By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. — They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free — — and therefore I’m responsible for everything I do, a dizzying fact which causes — an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. — Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. — Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating. —
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • way of this interpretation. Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their view to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.

Opinion | The Zombie Style in American Politics - The New York Times - 0 views

  • This was an awkward observation for a party that, then as now, wanted to slash taxes for the rich and dismantle the social safety net. How would conservatives respond?
  • The answer was multilayered denial. Inequality wasn’t rising. O.K., it was rising, but that wasn’t a problem. O.K., rising inequality was unfortunate, but there was nothing that could be done about it without crippling economic growth.
  • You might think that the right would have to choose one of those positions, or at least that once you’d managed to refute one layer of the argument, say by showing that inequality was indeed rising, you could put that argument behind you and move on to the next one. But no: Old arguments, like the wights in “Game of Thrones,” would just keep rising up after you thought you had killed them.
  • ...6 more annotations...
  • You see the same thing on climate change. Global warming is a myth — a hoax concocted by a vast conspiracy of scientists around the world. O.K., the climate is changing, but it’s a natural phenomenon that has nothing to do with human activity. O.K., man-made climate change is real, but we can’t do anything about it without destroying the economy.
  • What the right’s positioning on inequality, climate and now Russian election interference have in common is that in each case the people pretending to be making a serious argument are actually apparatchiks operating in bad faith.
  • in each case those making denialist arguments, while they may invoke evidence, don’t actually care what the evidence says; at a fundamental level, they aren’t interested in the truth. Their goal, instead, is to serve a predetermined agenda.
  • Reporting about these debates typically frames them as disputes about the facts and what they mean, when the reality is that one side isn’t interested in the facts.
  • the pressures that often lead to false equivalence. Calling out dishonesty and bad faith can seem like partisan bias when, to put it bluntly, one side of the political spectrum lies all the time, while the other side doesn’t.
  • pretending that good faith exists when it doesn’t is unfair to readers. The public deserves to know that the big debates in modern U.S. politics aren’t a conventional clash of rival ideas.

Is our world a simulation? Why some scientists say it's more likely than not | Technolo... - 3 views

  • Musk is just one of the people in Silicon Valley to take a keen interest in the “simulation hypothesis”, which argues that what we experience as reality is actually a giant computer simulation created by a more sophisticated intelligence
  • Oxford University’s Nick Bostrom in 2003 (although the idea dates back as far as the 17th-century philosopher René Descartes). In a paper titled “Are You Living In a Simulation?”, Bostrom suggested that members of an advanced “posthuman” civilization with vast computing power might choose to run simulations of their ancestors in the universe.
  • If we believe that there is nothing supernatural about what causes consciousness and it’s merely the product of a very complex architecture in the human brain, we’ll be able to reproduce it. “Soon there will be nothing technical standing in the way to making machines that have their own consciousness,
  • ...14 more annotations...
  • At the same time, videogames are becoming more and more sophisticated and in the future we’ll be able to have simulations of conscious entities inside them.
  • “Forty years ago we had Pong – two rectangles and a dot. That’s where we were. Now 40 years later, we have photorealistic, 3D simulations with millions of people playing simultaneously and it’s getting better every year. And soon we’ll have virtual reality, we’ll have augmented reality,” said Musk. “If you assume any rate of improvement at all, then the games will become indistinguishable from reality.”
  • “If one progresses at the current rate of technology a few decades into the future, very quickly we will be a society where there are artificial entities living in simulations that are much more abundant than human beings.
  • If there are many more simulated minds than organic ones, then the chances of us being among the real minds start to look slimmer and slimmer. As Terrile puts it: “If in the future there are more digital people living in simulated environments than there are today, then what is to say we are not part of that already?” (A brief numerical sketch of this proportion argument follows this list.)
  • Reasons to believe that the universe is a simulation include the fact that it behaves mathematically and is broken up into pieces (subatomic particles) like a pixelated video game. “Even things that we think of as continuous – time, energy, space, volume – all have a finite limit to their size. If that’s the case, then our universe is both computable and finite. Those properties allow the universe to be simulated,” Terrile said
  • “Is it logically possible that we are in a simulation? Yes. Are we probably in a simulation? I would say no,” said Max Tegmark, a professor of physics at MIT.
  • “In order to make the argument in the first place, we need to know what the fundamental laws of physics are where the simulations are being made. And if we are in a simulation then we have no clue what the laws of physics are. What I teach at MIT would be the simulated laws of physics,”
  • Terrile believes that recognizing that we are probably living in a simulation is as game-changing as Copernicus realizing that the Earth was not the center of the universe. “It was such a profound idea that it wasn’t even thought of as an assumption,”
  • That we might be in a simulation is, Terrile argues, a simpler explanation for our existence than the idea that we are the first generation to rise up from primordial ooze and evolve into molecules, biology and eventually intelligence and self-awareness. The simulation hypothesis also accounts for peculiarities in quantum mechanics, particularly the measurement problem, whereby things only become defined when they are observed.
  • “For decades it’s been a problem. Scientists have bent over backwards to eliminate the idea that we need a conscious observer. Maybe the real solution is you do need a conscious entity like a conscious player of a video game,
  • How can the hypothesis be put to the test?
  • scientists can look for hallmarks of simulation. “Suppose someone is simulating our universe – it would be very tempting to cut corners in ways that makes the simulation cheaper to run. You could look for evidence of that in an experiment,” said Tegmark
  • First, it provides a scientific basis for some kind of afterlife or larger domain of reality above our world. “You don’t need a miracle, faith or anything special to believe it. It comes naturally out of the laws of physics,”
  • it means we will soon have the same ability to create our own simulations. “We will have the power of mind and matter to be able to create whatever we want and occupy those worlds.”
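
A minimal numerical sketch of the proportion argument highlighted above. The mind counts and the function name are illustrative assumptions, not figures from the article; the point is only how the share of unsimulated minds shrinks as simulated minds multiply.

# Python sketch of the "many more simulated minds than organic ones" argument.
# All counts below are made-up placeholders for illustration.
def chance_of_being_unsimulated(real_minds: int, simulated_minds: int) -> float:
    # Under a simple indifference assumption, the probability of being one of
    # the non-simulated minds is just their share of all minds.
    return real_minds / (real_minds + simulated_minds)

if __name__ == "__main__":
    real = 10**10  # assumed count of organic human minds
    for simulated in (0, 10**10, 10**12, 10**15):
        p = chance_of_being_unsimulated(real, simulated)
        print(f"simulated minds = {simulated:>16,}  ->  P(unsimulated) = {p:.6f}")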

Can Social Networks Do Better? We Don't Know Because They Haven't Tried - Talking Point... - 0 views

  • it’s not fair to say it’s Facebook or a Facebook problem. Facebook is just the latest media and communications medium. We hardly blame the technology of the book for spreading anti-Semitism via the notorious Protocols of the Elders of Zion
  • But of course, it’s not that simple. Social media platforms have distinct features that earlier communications media did not. The interactive nature of the media and the collection of data, which is then run through algorithms and artificial intelligence, create something different.
  • All social media platforms are engineered with two basic goals: maximize the time you spend on the platform and make advertising as effective and thus as lucrative as possible. This means that social media can never be simply a common carrier, a distribution technology that has no substantial influence over the nature of the communication that travels over it.
  • ...5 more annotations...
  • it’s a substantial difference which deprives social media platforms of the kind of hands-off logic that would make it ridiculous to say phones are bad or the phone company is responsible if planning for a mass murder was carried out over the phone.
  • the Internet doesn’t ‘do’ anything more than make the distribution of information more efficient and radically lower the formal, informal and financial barriers to entry that used to stand in the way of various marginalized ideas.
  • Social media can never plead innocence like this because the platforms are designed to addict you and convince you of things.
  • If the question is: what can social media platforms do to protect against government-backed subversion campaigns like the one we saw in the 2016 campaign the best answer is, we don’t know. And we don’t know for a simple reason: they haven’t tried.
  • The point is straightforward: the mass collection of data, harnessed to modern computing power and the chance to amass unimaginable wealth has spurred vast technological innovation.