
TOK Friends / Group items tagged: flow


Javier E

The Age of Social Media Is Ending - The Atlantic - 0 views

  • Slowly and without fanfare, around the end of the aughts, social media took its place. The change was almost invisible, but it had enormous consequences. Instead of facilitating the modest use of existing connections—largely for offline life (to organize a birthday party, say)—social software turned those connections into a latent broadcast channel. All at once, billions of people saw themselves as celebrities, pundits, and tastemakers.
  • A global broadcast network where anyone can say anything to anyone else as often as possible, and where such people have come to think they deserve such a capacity, or even that withholding it amounts to censorship or suppression—that’s just a terrible idea from the outset. And it’s a terrible idea that is entirely and completely bound up with the concept of social media itself: systems erected and used exclusively to deliver an endless stream of content.
  • “social media,” a name so familiar that it has ceased to bear meaning. But two decades ago, that term didn’t exist
  • ...35 more annotations...
  • a “web 2.0” revolution in “user-generated content,” offering easy-to-use, easily adopted tools on websites and then mobile apps. They were built for creating and sharing “content,”
  • As the original name suggested, social networking involved connecting, not publishing. By connecting your personal network of trusted contacts (or “strong ties,” as sociologists call them) to others’ such networks (via “weak ties”), you could surface a larger network of trusted contacts
  • The whole idea of social networks was networking: building or deepening relationships, mostly with people you knew. How and why that deepening happened was largely left to the users to decide.
  • That changed when social networking became social media around 2009, between the introduction of the smartphone and the launch of Instagram. Instead of connection—forging latent ties to people and organizations we would mostly ignore—social media offered platforms through which people could publish content as widely as possible, well beyond their networks of immediate contacts.
  • Social media turned you, me, and everyone into broadcasters (if aspirational ones). The results have been disastrous but also highly pleasurable, not to mention massively profitable—a catastrophic combination.
  • A social network is an idle, inactive system—a Rolodex of contacts, a notebook of sales targets, a yearbook of possible soul mates. But social media is active—hyperactive, really—spewing material across those networks instead of leaving them alone until needed.
  • The authors propose social media as a system in which users participate in “information exchange.” The network, which had previously been used to establish and maintain relationships, becomes reinterpreted as a channel through which to broadcast.
  • The toxicity of social media makes it easy to forget how truly magical this innovation felt when it was new. From 2004 to 2009, you could join Facebook and everyone you’d ever known—including people you’d definitely lost track of—was right there, ready to connect or reconnect. The posts and photos I saw characterized my friends’ changing lives, not the conspiracy theories that their unhinged friends had shared with them
  • Twitter, which launched in 2006, was probably the first true social-media site, even if nobody called it that at the time. Instead of focusing on connecting people, the site amounted to a giant, asynchronous chat room for the world. Twitter was for talking to everyone—which is perhaps one of the reasons journalists have flocked to it
  • on Twitter, anything anybody posted could be seen instantly by anyone else. And furthermore, unlike posts on blogs or images on Flickr or videos on YouTube, tweets were short and low-effort, making it easy to post many of them a week or even a day.
  • soon enough, all social networks became social media first and foremost. When groups, pages, and the News Feed launched, Facebook began encouraging users to share content published by others in order to increase engagement on the service, rather than to provide updates to friends. LinkedIn launched a program to publish content across the platform, too. Twitter, already principally a publishing platform, added a dedicated “retweet” feature, making it far easier to spread content virally across user networks.
  • When we look back at this moment, social media had already arrived in spirit if not by name. RSS readers offered a feed of blog posts to catch up on, complete with unread counts. MySpace fused music and chatter; YouTube did it with video (“Broadcast Yourself”)
  • From being asked to review every product you buy to believing that every tweet or Instagram image warrants likes or comments or follows, social media produced a positively unhinged, sociopathic rendition of human sociality.
  • Other services arrived or evolved in this vein, among them Reddit, Snapchat, and WhatsApp, all far more popular than Twitter. Social networks, once latent routes for possible contact, became superhighways of constant content
  • Although you can connect the app to your contacts and follow specific users, on TikTok, you are more likely to simply plug into a continuous flow of video content that has oozed to the surface via algorithm.
  • In the social-networking era, the connections were essential, driving both content creation and consumption. But the social-media era seeks the thinnest, most soluble connections possible, just enough to allow the content to flow.
  • This is also why journalists became so dependent on Twitter: It’s a constant stream of sources, events, and reactions—a reporting automat, not to mention an outbound vector for media tastemakers to make tastes.
  • “influencer” became an aspirational role, especially for young people for whom Instagram fame seemed more achievable than traditional celebrity—or perhaps employment of any kind.
  • social-media operators discovered that the more emotionally charged the content, the better it spread across its users’ networks. Polarizing, offensive, or just plain fraudulent information was optimized for distribution. By the time the platforms realized and the public revolted, it was too late to turn off these feedback loops.
  • The ensuing disaster was multipart.
  • Rounding up friends or business contacts into a pen in your online profile for possible future use was never a healthy way to understand social relationships.
  • when social networking evolved into social media, user expectations escalated. Driven by venture capitalists’ expectations and then Wall Street’s demands, the tech companies—Google and Facebook and all the rest—became addicted to massive scale
  • Social media showed that everyone has the potential to reach a massive audience at low cost and high gain—and that potential gave many people the impression that they deserve such an audience.
  • On social media, everyone believes that anyone to whom they have access owes them an audience: a writer who posted a take, a celebrity who announced a project, a pretty girl just trying to live her life, that anon who said something afflictive
  • When network connections become activated for any reason or no reason, then every connection seems worthy of traversing.
  • people just aren’t meant to talk to one another this much. They shouldn’t have that much to say, they shouldn’t expect to receive such a large audience for that expression, and they shouldn’t suppose a right to comment or rejoinder for every thought or notion either.
  • Facebook and all the rest enjoyed a massive rise in engagement and the associated data-driven advertising profits that the attention-driven content economy created. The same phenomenon also created the influencer economy, in which individual social-media users became valuable as channels for distributing marketing messages or product sponsorships by means of their posts’ real or imagined reach
  • That’s no surprise, I guess, given that the model was forged in the fires of Big Tech companies such as Facebook, where sociopathy is a design philosophy.
  • If change is possible, carrying it out will be difficult, because we have adapted our lives to conform to social media’s pleasures and torments. It’s seemingly as hard to give up on social media as it was to give up smoking en masse
  • Quitting that habit took decades of regulatory intervention, public-relations campaigning, social shaming, and aesthetic shifts. At a cultural level, we didn’t stop smoking just because the habit was unpleasant or uncool or even because it might kill us. We did so slowly and over time, by forcing social life to suffocate the practice. That process must now begin in earnest for social media.
  • Something may yet survive the fire that would burn it down: social networks, the services’ overlooked, molten core. It was never a terrible idea, at least, to use computers to connect to one another on occasion, for justified reasons, and in moderation
  • The problem came from doing so all the time, as a lifestyle, an aspiration, an obsession. The offer was always too good to be true, but it’s taken us two decades to realize the Faustian nature of the bargain.
  • when I first wrote about downscale, the ambition seemed necessary but impossible. It still feels unlikely—but perhaps newly plausible.
  • To win the soul of social life, we must learn to muzzle it again, across the globe, among billions of people. To speak less, to fewer people and less often–and for them to do the same to you, and everyone else as well
  • We cannot make social media good, because it is fundamentally bad, deep in its very structure. All we can do is hope that it withers away, and play our small part in helping abandon it.
Javier E

The "missing law" of nature was here all along | Salon.com - 0 views

  • recently published scientific article proposes a sweeping new law of nature, approaching the matter with dry, clinical efficiency that still reads like poetry.
  • “Evolving systems are asymmetrical with respect to time; they display temporal increases in diversity, distribution, and/or patterned behavior,” they continue, mounting their case from the shoulders of Charles Darwin, extending it toward all things living and not. 
  • To join the known physics laws of thermodynamics, electromagnetism and Newton’s laws of motion and gravity, the nine scientists and philosophers behind the paper propose their “law of increasing functional information.”
  • ...27 more annotations...
  • In short, a complex and evolving system — whether that’s a flock of goldfinches or a nebula or the English language — will produce ever more diverse and intricately detailed states and configurations of itself.
  • Some of these more diverse and intricate configurations, the scientists write, are shed and forgotten over time. The configurations that persist are ones that find some utility or novel function in a process akin to natural selection, but a selection process driven by the passing-on of information rather than just the sowing of biological genes (a toy version of this selection-for-function dynamic is sketched in code after this list).
  • Have they finally glimpsed, I wonder, the connectedness and symbiotic co-evolution of their own scientific ideas with those of the world’s writers
  • Have they learned to describe in their own quantifying language that cradle from which both our disciplines have emerged and the firmament on which they both stand — the hearing and telling of stories in order to exist?
  • Have they quantified the quality of all existent matter, living and not: that all things inherit a story in data to tell, and that our stories are told by the very forms we take to tell them? 
  • “Is there a universal basis for selection? Is there a more quantitative formalism underlying this conjectured conceptual equivalence—a formalism rooted in the transfer of information?,” they ask of the world’s disparate phenomena. “The answer to both questions is yes.”
  • Yes. They’ve glimpsed it, whether they know it or not. Sing to me, O Muse, of functional information and its complex diversity.
  • The principle of complexity evolving at its own pace when left to its own devices, independent of time but certainly in a dance with it, is nothing new. Not in science, nor in its closest humanities kin, science and nature writing. Give things time and nourishing environs, protect them from your own intrusions and — living organisms or not — they will produce abundant enlacement of forms.
  • This is how poetry was born from the same larynxes and phalanges that tendered nuclear equations: We featherless bipeds gave language our time and delighted attendance until its forms were so multivariate that they overflowed with inevitable utility.
  • In her Pulitzer-winning “Pilgrim at Tinker Creek,” nature writer Annie Dillard explains plainly that evolution is the vehicle of such intricacy in the natural world, as much as it is in our own thoughts and actions. 
  • “The stability of simple forms is the sturdy base from which more complex, stable forms might arise, forming in turn more complex forms,” she explains, drawing on the undercap frills of mushrooms and filament-fine filtering tubes inside human kidneys to illustrate her point. 
  • “Utility to the creature is evolution’s only aesthetic consideration. Form follows function in the created world, so far as I know, and the creature that functions, however bizarre, survives to perpetuate its form,” writes Dillard.
  • “Of the multiplicity of forms, I know nothing. Except that, apparently, anything goes. This holds for forms of behavior as well as design — the mantis munching her mate, the frog wintering in mud.” 
  • She notes that, of all forms of life we’ve ever known to exist, only about 10% are still alive. What extravagant multiplicity. 
  • “Intricacy is that which is given from the beginning, the birthright, and in the intricacy is the hardiness of complexity that ensures against the failures of all life,” Dillard writes. “The wonder is — given the errant nature of freedom and the burgeoning texture of time — the wonder is that all the forms are not monsters, that there is beauty at all, grace gratuitous.”
  • “This paper, and the reason why I'm so proud of it, is because it really represents a connection between science and the philosophy of science that perhaps offers a new lens into why we see everything that we see in the universe,” lead scientist Michael Wong told Motherboard in a recent interview. 
  • Wong is an astrobiologist and planetary scientist at the Carnegie Institute for Science. In his team’s paper, that bridge toward scientific philosophy is not only preceded by a long history of literary creativity but directly theorizes about the creative act itself.  
  • “The creation of art and music may seem to have very little to do with the maintenance of society, but their origins may stem from the need to transmit information and create bonds among communities, and to this day, they enrich life in innumerable ways,” Wong’s team writes.  
  • “Perhaps, like eddies swirling off of a primary flow field, selection pressures for ancillary functions can become so distant from the core functions of their host systems that they can effectively be treated as independently evolving systems,” the authors add, pointing toward the elaborate mating dance culture observed in birds of paradise.
  • “Perhaps it will be humanity’s ability to learn, invent, and adopt new collective modes of being that will lead to its long-term persistence as a planetary phenomenon. In light of these considerations, we suspect that the general principles of selection and function discussed here may also apply to the evolution of symbolic and social systems.”
  • The Mekhilta teaches that all Ten Commandments were pronounced in a single utterance. Similarly, the Maharsha says the Torah’s 613 mitzvoth are only perceived as a plurality because we’re time-bound humans, even though they together form a singular truth which is indivisible from He who expressed it. 
  • Or, as the Mishna would have it, “the creations were all made in generic form, and they gradually expanded.” 
  • Like swirling eddies off of a primary flow field.
  • “O Lord, how manifold are thy works!,” cried out David in his psalm. “In wisdom hast thou made them all: the earth is full of thy riches. So is this great and wide sea, wherein are things creeping innumerable, both small and great beasts.” 
  • In all things, then — from poetic inventions, to rare biodiverse ecosystems, to the charted history of our interstellar equations — it is best if we conserve our world’s intellectual and physical diversity, for both the study and testimony of its immeasurable multiplicity.
  • Because, whether wittingly or not, science is singing the tune of the humanities. And whether expressed in algebraic logic or ancient Greek hymn, its chorus is the same throughout the universe: Be fruitful and multiply. 
  • Both intricate configurations of art and matter arise and fade according to their shared characteristic, long-known by students of the humanities: each have been graced with enough time to attend to the necessary affairs of their most enduring pleasures. 
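
A side note on the mechanism the annotations above describe: the proposed "law of increasing functional information" amounts to configurations being winnowed by whether they perform some function, with the survivors copied forward and varied. The Python toy below makes that dynamic concrete; the bit-string "configurations," the scoring rule, and the population size are all invented for illustration and are not the paper's formalism.

```python
import random

random.seed(1)

# Toy "selection for function": configurations are bit strings, and a
# configuration's "function" is how well it performs one arbitrary job
# (here, holding a balanced mix of 0s and 1s). All choices are invented.
LENGTH = 16

def function_score(config):
    ones = sum(config)
    return 1.0 - abs(ones - LENGTH // 2) / (LENGTH // 2)   # 1.0 = perfectly balanced

def mutate(config, rate=0.05):
    return [bit ^ (random.random() < rate) for bit in config]

population = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(200)]
discovered = set()   # every distinct high-function configuration seen so far

for generation in range(51):
    # Keep the more functional half, refill with mutated copies of survivors.
    population.sort(key=function_score, reverse=True)
    survivors = population[: len(population) // 2]
    population = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    discovered.update(tuple(c) for c in population if function_score(c) == 1.0)
    if generation % 10 == 0:
        mean = sum(function_score(c) for c in population) / len(population)
        print(f"gen {generation:2d}: mean function {mean:.2f}, "
              f"distinct functional configurations seen {len(discovered)}")
```

Run it and the mean function climbs while the set of distinct functional configurations keeps growing, which is the flavor of the time-asymmetry the authors describe; their actual proposal replaces this toy score with a formal measure of functional information.
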
Javier E

Linguists identify 15,000-year-old 'ultraconserved words' - The Washington Post - 0 views

  • You, hear me! Give this fire to that old man. Pull the black worm off the bark and give it to the mother. And no spitting in the ashes! It’s an odd little speech. But if you went back 15,000 years and spoke these words to hunter-gatherers in Asia in any one of hundreds of modern languages, there is a chance they would understand at least some of what you were saying.
  • That’s because all of the nouns, verbs, adjectives and adverbs in the four sentences are words that have descended largely unchanged from a language that died out as the glaciers retreated at the end of the last Ice Age. Those few words mean the same thing, and sound almost the same, as they did then.
  • A team of researchers has come up with a list of two dozen “ultraconserved words” that have survived 150 centuries. It includes some predictable entries: “mother,” “not,” “what,” “to hear” and “man.” It also contains surprises: “to flow,” “ashes” and “worm.”
  • ...2 more annotations...
  • The existence of the long-lived words suggests there was a “proto-Eurasiatic” language that was the common ancestor to about 700 contemporary languages that are the native tongues of more than half the world’s people.
  • In all, “proto-Eurasiatic” gave birth to seven language families. Several of the world’s important language families, however, fall outside that lineage, such as the one that includes Chinese and Tibetan; several African language families, and those of American Indians and Australian aborigines.
Javier E

A Million First Dates - Dan Slater - The Atlantic - 0 views

  • The positive aspects of online dating are clear: the Internet makes it easier for single people to meet other single people with whom they might be compatible, raising the bar for what they consider a good relationship. But what if online dating makes it too easy to meet someone new? What if it raises the bar for a good relationship too high? What if the prospect of finding an ever-more-compatible mate with the click of a mouse means a future of relationship instability, in which we keep chasing the elusive rabbit around the dating track?
  • the rise of online dating will mean an overall decrease in commitment.
  • I often wonder whether matching you up with great people is getting so efficient, and the process so enjoyable, that marriage will become obsolete.”
  • ...5 more annotations...
  • “Historically,” says Greg Blatt, the CEO of Match.com’s parent company, “relationships have been billed as ‘hard’ because, historically, commitment has been the goal. You could say online dating is simply changing people’s ideas about whether commitment itself is a life value.” Mate scarcity also plays an important role in people’s relationship decisions. “Look, if I lived in Iowa, I’d be married with four children by now,” says Blatt, a 40‑something bachelor in Manhattan. “That’s just how it is.”
  • “I think divorce rates will increase as life in general becomes more real-time,” says Niccolò Formai, the head of social-media marketing at Badoo, a meeting-and-dating app with about 25 million active users worldwide. “Think about the evolution of other kinds of content on the Web—stock quotes, news. The goal has always been to make it faster. The same thing will happen with meeting. It’s exhilarating to connect with new people, not to mention beneficial for reasons having nothing to do with romance. You network for a job. You find a flatmate. Over time you’ll expect that constant flow. People always said that the need for stability would keep commitment alive. But that thinking was based on a world in which you didn’t meet that many people.”
  • “You could say online dating allows people to get into relationships, learn things, and ultimately make a better selection,” says Gonzaga. “But you could also easily see a world in which online dating leads to people leaving relationships the moment they’re not working—an overall weakening of commitment.”
  • Explaining the mentality of a typical dating-site executive, Justin Parfitt, a dating entrepreneur based in San Francisco, puts the matter bluntly: “They’re thinking, Let’s keep this fucker coming back to the site as often as we can.” For instance, long after their accounts become inactive on Match.com and some other sites, lapsed users receive notifications informing them that wonderful people are browsing their profiles and are eager to chat. “Most of our users are return customers,” says Match.com’s Blatt.
  • The market is hugely more efficient … People expect to—and this will be increasingly the case over time—access people anywhere, anytime, based on complex search requests … Such a feeling of access affects our pursuit of love … the whole world (versus, say, the city we live in) will, increasingly, feel like the market for our partner(s). Our pickiness will probably increase.” “Above all, Internet dating has helped people of all ages realize that there’s no need to settle for a mediocre relationship.”
Javier E

Yelp and the Wisdom of 'The Lonely Crowd' : The New Yorker - 1 views

  • David Riesman spent the first half of his career writing one of the most important books of the twentieth century. He spent the second half correcting its pervasive misprision. “The Lonely Crowd,” an analysis of the varieties of social character that examined the new American middle class
  • the “profound misinterpretation” of the book as a simplistic critique of epidemic American postwar conformity via its description of the contours of the “other-directed character,” whose identity and behavior is shaped by its relationships.
  • he never meant to suggest that Americans now were any more conformist than they ever had been, or that there’s even such a thing as social structure without conformist consensus.
  • ...17 more annotations...
  • In this past weekend’s Styles section of the New York Times, Siegel uses “The Lonely Crowd” to analyze the putative “Yelpification” of contemporary life: according to Siegel, Riesman’s view was that “people went from being ‘inner-directed’ to ‘outer-directed,’ from heeding their own instincts and judgment to depending on the judgments and opinions of tastemakers and trendsetters.” The “conformist power of the crowd” and its delighted ability to write online reviews led Siegel down a sad path to a lackluster expensive dinner.
  • What Riesman actually suggested was that we think of social organization in terms of a series of “ideal types” along a spectrum of increasingly loose authority
  • On one end of the spectrum is a “tradition-directed” community, where we all understand that what we’re supposed to do is what we’re supposed to do because it’s just the thing that one does; authority is unequivocal, and there’s neither the room nor the desire for autonomous action
  • In the middle of the spectrum, as one moves toward a freer distribution of, and response to, authority, is “inner-direction.” The inner-directed character is concerned not with “what one does” but with “what people like us do.” Which is to say that she looks to her own internalizations of past authorities to get a sense for how to conduct her affairs.
  • Contemporary society, Riesman thought, was best understood as chiefly “other-directed,” where the inculcated authority of the vertical (one’s lineage) gives way to the muddled authority of the horizontal (one’s peers).
  • The inner-directed person orients herself by an internal “gyroscope,” while the other-directed person orients herself by “radar.”
  • It’s not that the inner-directed person consults some deep, subjective, romantically sui generis oracle. It’s that the inner-directed person consults the internalized voices of a mostly dead lineage, while her other-directed counterpart heeds the external voices of her living contemporaries.
  • “the gyroscopic mechanism allows the inner-directed person to appear far more independent than he really is: he is no less a conformist to others than the other-directed person, but the voices to which he listens are more distant, of an older generation, their cues internalized in his childhood.” The inner-directed person is, simply, “somewhat less concerned than the other-directed person with continuously obtaining from contemporaries (or their stand-ins: the mass media) a flow of guidance, expectation, and approbation.
  • Riesman drew no moral from the transition from a community of primarily inner-directed people to a community of the other-directed. Instead, he saw that each ideal type had different advantages and faced different problems
  • As Riesman understood it, the primary disciplining emotion under tradition direction is shame, the threat of ostracism and exile that enforces traditional action. Inner-directed people experience not shame but guilt, or the fear that one’s behavior won’t be commensurate with the imago within. And, finally, other-directed folks experience not guilt but a “contagious, highly diffuse” anxiety—the possibility that, now that authority itself is diffuse and ambiguous, we might be doing the wrong thing all the time.
  • Siegel is right to make the inference, if wayward in his conclusions. It makes sense to associate the anxiety of how to relate to livingly diffuse authorities with the Internet, which presents the greatest signal-to-noise-ratio problem in human history.
  • The problem with Yelp is not the role it plays, for Siegel, in the proliferation of monoculture; most people of my generation have learned to ignore Yelp entirely. It’s the fact that, after about a year of usefulness, Yelp very quickly became a terrible source of information.
  • There are several reasons for this. The first is the nature of an algorithmic response to the world. As Jaron Lanier points out in “Who Owns the Future?,” the hubris behind each new algorithm is the idea that its predictive and evaluatory structure is game-proof; but the minute any given algorithm gains real currency, all the smart and devious people devote themselves to gaming it. On Yelp, the obvious case would be garnering positive reviews by any means necessary.
  • A second problem with Yelp’s algorithmic ranking is in the very idea of using online reviews; as anybody with a book on Amazon knows, they tend to draw more contributions from people who feel very strongly about something, positively or negatively. This undermines the statistical relevance of their recommendations. (A toy simulation of this self-selection effect appears in code after this list.)
  • the biggest problem with Yelp is not that it’s a popularity contest. It’s not even that it’s an exploitable popularity contest.
  • it’s the fact that Yelp makes money by selling ads and prime placements to the very businesses it lists under ostensibly neutral third-party review
  • But Yelp’s valuations are always possibly in bad faith, even if its authority is dressed up as the distilled algorithmic wisdom of a crowd. For Riesman, that’s the worst of all possible worlds: a manipulated consumer certainty that only shores up the authority of an unchosen, hidden source. In that world, cold monkfish is the least of our problems.
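
A quick way to see why self-selected reviews lose statistical relevance, as promised in the annotation above: simulate a population of mostly lukewarm customers and let only the strongly opinionated ones post. A minimal Python sketch; every number in it is invented.

```python
import random

random.seed(0)

# Hypothetical diners with a "true" satisfaction score on a 1-5 scale;
# most are lukewarm. All numbers here are made up for illustration.
population = [min(5.0, max(1.0, random.gauss(3.5, 1.0))) for _ in range(100_000)]

# Assumption: only people who feel strongly (delighted or furious) tend
# to post a review; middling customers mostly stay silent.
def posts_a_review(satisfaction):
    strength = abs(satisfaction - 3.0)              # distance from "meh"
    return random.random() < 0.02 + 0.25 * strength

reviews = [s for s in population if posts_a_review(s)]

def extreme_share(scores):
    return sum(1 for s in scores if s <= 1.5 or s >= 4.5) / len(scores)

print(f"mean satisfaction, everyone:       {sum(population) / len(population):.2f}")
print(f"mean satisfaction, reviewers only: {sum(reviews) / len(reviews):.2f}")
print(f"extreme opinions, everyone:        {extreme_share(population):.1%}")
print(f"extreme opinions, reviewers only:  {extreme_share(reviews):.1%}")
```

The posted reviews over-represent the one- and five-star extremes relative to the full population, which is the sense in which the aggregate score stops describing the typical experience.
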
sophie mester

David Lynch Is Back … as a Guru of Transcendental Meditation - NYTimes.com - 0 views

  • As the car hummed along and we relived his spiritual journey, I asked Lynch what he really believed. Did he see Transcendental Meditation as simply a technique for relaxation, perfect for young Hollywood actresses, or rather as an all-encompassing way of life, as Maharishi had encouraged — one with peace palaces and an army of meditators fomenting world peace? Lynch paused, and then spoke for more than five minutes, explaining that T.M. was the answer for all seeking true inner happiness. He ended with this thought: “Things like traumatic stress and anxiety and tension and sorrow and depression and hate and bitter, selfish anger and fear start to lift away. And that’s a huge sense of freedom when that heavy weight of negativity begins to lift. So it’s like gold flowing in from within and garbage going out. The things in life that used to almost kill you, stress you, depress you, make you sad, make you afraid — they have less and less power. It’s like you’re building up a flak jacket of protection. You’re starting to glow with this from within.”
    • sophie mester
       
      belief that TM allows a person to consciously influence their emotions, and the power those emotions have over their lives.
  • I still meditate. For 20 minutes or more, twice a day, I’m able to step back from the news scroll of thoughts and be truly quiet. I use T.M. to deal with anxiety and fatigue and to stave off occasional despair. But that’s because, in my head, I’ve managed to excise the weird flotsam of spirituality that engulfed T.M. for the first part of my life. Now, for me, it is something very simple, like doing yoga or avoiding dairy. Objectively speaking, meditation has been shown to decrease the incidence of heart attacks and strokes and increase longevity. The Department of Veterans Affairs and the Department of Defense commissioned studies to determine whether T.M. can help veterans alleviate post-traumatic stress disorder. Thanks to the David Lynch foundation, low-performing public schools have instituted “Quiet Time,” an elective 10 minutes, twice a day, during which students meditate, with some encouraging results.
    • sophie mester
       
      Objective support for the power of TM - decrease incidence of heart attacks/strokes, increase longevity, help those suffering from PTSD.
  • The office of the David Lynch Foundation for Consciousness-Based Education and World Peace in New York is filled with young adults, many of whom grew up practicing Transcendental Meditation. Since Lynch started spreading the good news about T.M., the number of people learning the technique has increased tenfold. Close to Lynch’s heart are those suffering from PTSD, it seems, but it is in his own industry that he has made a more visible impact. Roth, who runs the foundation, spends much of his time flying around the world as well as initiating a long list of public figures: Gwyneth Paltrow, Ellen DeGeneres, Russell Simmons, Katy Perry, Susan Sarandon, Candy Crowley, Soledad O’Brien, George Stephanopoulos and Paul McCartney’s grandchildren.
    • sophie mester
       
      large following of TM suggests its potential to have a positive mental impact.
Javier E

Dogs, Cats and Leadership - The New York Times - 1 views

  • the performance of presidents, especially on foreign policy, is shaped by how leaders attach to problems. Some leaders are like dogs: They want to bound right in and make things happen. Some are more like cats: They want to detach and maybe look for a pressure point here or there.
  • we should be asking them a different set of questions:
  • How much do you think a president can change the flow of world events? President Obama, for example, has a limited or, if you want to put it that way, realistic view of the extent of American influence. He subscribes to a series of propositions that frequently push him toward nonintervention: The world “is a tough, complicated, messy, mean place and full of hardship and tragedy,” he told Goldberg. You can’t fix everything. Sometimes you can only shine a spotlight.
  • ...8 more annotations...
  • Furthermore, Obama argues, because of our history, American military efforts are looked at with suspicion. Allies are unreliable. Ukraine is always going to be in Russia’s sphere of influence, so its efforts there will always trump ours. The Middle East is a morass and no longer that important to U.S. interests.
  • Do you think out loud in tandem with a community, or do you process internally? Throughout the Goldberg article, Obama is seen thinking deeply and subtly, but apart from the group around him. In catlike fashion, he is a man who knows his own mind and trusts his own judgment. His decision not to bomb Syria after it crossed the chemical weapons red line was made almost entirely alone.
  • More generally, Obama expresses disdain with the foreign policy community. He is critical of most of his fellow world leaders — impatient with most European ones, fed up with most Middle Eastern ones.
  • When seeking a description of a situation, does your mind leap for the clarifying single truth or do you step back to see the complex web of factors? Ronald Reagan typified the single clarifying truth habit of mind, both when he was describing an enemy (Evil Empire) and when he was calling for change (tear down this wall).
  • Obama leans to the other side of the spectrum. He is continually stepping back, starting with analyses of human nature, how people behave when social order breaks down, the roots and nature of tribalism.
  • Do you see international affairs as a passionate struggle or a conversation and negotiation? Obama shows a continual distrust of passion. He doesn’t see much value in macho bluffing or chest-thumping, or in lofty Churchillian rhetoric, or in bombings done in the name of “credibility.”
  • He may be critical, but he is not a hater. He doesn’t even let anger interfere with his appraisal of Vladimir Putin
  • it’s striking how many Americans have responded by going for Donald Trump and Bernie Sanders, who are bad versions of the bounding in/we-can-change-everything doggy type.
Javier E

This is what it's like to grow up in the age of likes, lols and longing | The Washingto... - 1 views

  • She slides into the car, and even before she buckles her seat belt, her phone is alight in her hands. A 13-year-old girl after a day of eighth grade.
  • She doesn’t respond, her thumb on Instagram. A Barbara Walters meme is on the screen. She scrolls, and another meme appears. Then another meme, and she closes the app. She opens BuzzFeed. There’s a story about Florida Gov. Rick Scott, which she scrolls past to get to a story about Janet Jackson, then “28 Things You’ll Understand If You’re Both British and American.” She closes it. She opens Instagram. She opens the NBA app. She shuts the screen off. She turns it back on. She opens Spotify. Opens Fitbit. She has 7,427 steps. Opens Instagram again. Opens Snapchat. She watches a sparkly rainbow flow from her friend’s mouth. She watches a YouTube star make pouty faces at the camera. She watches a tutorial on nail art. She feels the bump of the driveway and looks up. They’re home. Twelve minutes have passed.
  • Katherine Pommerening’s iPhone is the place where all of her friends are always hanging out. So it’s the place where she is, too.
  • ...19 more annotations...
  • “Over 100 likes is good, for me. And comments. You just comment to make a joke or tag someone.”
  • The best thing is the little notification box, which means someone liked, tagged or followed her on Instagram. She has 604 followers. There are only 25 photos on her page because she deletes most of what she posts. The ones that don’t get enough likes, don’t have good enough lighting or don’t show the coolest moments in her life must be deleted.
  • Sociologists, advertisers, stock market analysts – everyone wants to know what happens when the generation born glued to screens has to look up and interact with the world.
  • “It kind of, almost, promotes you as a good person. If someone says, ‘tbh you’re nice and pretty,’ that kind of, like, validates you in the comments. Then people can look at it and say ‘Oh, she’s nice and pretty.’ ”
  • School is where she thrives: She is beloved by her teachers, will soon star as young Simba in the eighth-grade performance of “The Lion King” musical, and gets straight A’s. Her school doesn’t offer a math course challenging enough for her, so she takes honors algebra online through Johns Hopkins University.
  • “Happy birthday posts are a pretty big deal,” she says. “It really shows who cares enough to put you on their page.”
  • He checks the phone bill to see who she’s called and how much she’s been texting, but she barely calls anyone and chats mostly through Snapchat, where her messages disappear.
  • Some of Katherine’s very best friends have never been to her house, or she to theirs. To Dave, it seems like they rarely hang out, but he knows that to her, it seems like they’re together all the time.
  • Dave Pommerening wants to figure out how to get her to use it less. One month, she ate up 18 gigabytes of data. Most large plans max out at 10. He intervened and capped her at four GB. “I don’t want to crimp it too much,” he says. “That’s something, from my perspective, I’m going to have to figure out, how to get my arms around that.”
  • Even if her dad tried snooping around her apps, the true dramas of teenage girl life are not written in the comments. Like how sometimes, Katherine’s friends will borrow her phone just to un-like all the Instagram photos of girls they don’t like. Katherine can’t go back to those girls’ pages and re-like the photos because that would be stalking, which is forbidden.
  • Or how last week, at the middle school dance, her friends got the phone numbers of 10 boys, but then they had to delete five of them because they were seventh-graders. And before she could add the boys on Snapchat, she realized she had to change her username because it was her childhood nickname and that was totally embarrassing.
  • Then, because she changed her username, her Snapchat score reverted to zero. The app awards about one point for every snap you send and receive. It’s also totally embarrassing and stressful to have a low Snapchat score. So in one day, she sent enough snaps to earn 1,000 points.
  • Snapchat is where flirting happens. She doesn’t know anyone who has sent a naked picture to a boy, but she knows it happens with older girls, who know they have met the right guy.
  • Nothing her dad could find on her phone shows that for as good as Katherine is at math, basketball and singing, she wants to get better at her phone. To be one of the girls who knows what to post, how to caption it, when to like, what to comment.
  • Katherine doesn’t need magazines or billboards to see computer-perfect women. They’re right on her phone, all the time, in between photos of her normal-looking friends. There’s Aisha, there’s Kendall Jenner’s butt. There’s Olivia, there’s YouTube star Jenna Marbles in lingerie.
  • The whole world is at her fingertips and has been for years. This, Katherine offers as a theory one day, is why she doesn’t feel like she’s 13 years old at all. She’s probably, like, 16.
  • “I don’t feel like a child anymore” she says. “I’m not doing anything childish. At the end of sixth grade” — when all her friends got phones and downloaded Snapchat, Instagram and Twitter — “I just stopped doing everything I normally did. Playing games at recess, playing with toys, all of it, done.”
  • Her scooter sat in the garage, covered in dust. Her stuffed animals were passed down to Lila. The wooden playground in the back yard stood empty. She kept her skateboard with neon yellow wheels, because riding it is still cool to her friends.
  • On the morning of her 14th birthday, Katherine wakes up to an alarm ringing on her phone. It’s 6:30 a.m. She rolls over and shuts it off in the dark. Her grandparents, here to celebrate the end of her first year of teenagehood, are sleeping in the guest room down the hall. She can hear the dogs shuffling across the hardwood downstairs, waiting to be fed. Propping herself up on her peace-sign-covered pillow, she opens Instagram. Later, Lila will give her a Starbucks gift card. Her dad will bring doughnuts to her class. Her grandparents will take her to the Melting Pot for dinner. But first, her friends will decide whether to post pictures of Katherine for her birthday. Whether they like her enough to put a picture of her on their page. Those pictures, if they come, will get likes and maybe tbhs. They should be posted in the morning, any minute now. She scrolls past a friend posing in a bikini on the beach. Then a picture posted by Kendall Jenner. A selfie with coffee. A basketball Vine. A selfie with a girl’s tongue out. She scrolls, she waits. For that little notification box to appear.
Javier E

Why You Will Marry the Wrong Person - The New York Times - 1 views

  • IT’S one of the things we are most afraid might happen to us. We go to great lengths to avoid it. And yet we do it all the same: We marry the wrong person.
  • Partly, it’s because we have a bewildering array of problems that emerge when we try to get close to others. We seem normal only to those who don’t know us very well. In a wiser, more self-aware society than our own, a standard question on any early dinner date would be: “And how are you crazy?
  • Marriage ends up as a hopeful, generous, infinitely kind gamble taken by two people who don’t know yet who they are or who the other might be, binding themselves to a future they cannot conceive of and have carefully avoided investigating.
  • ...18 more annotations...
  • For most of recorded history, people married for logical sorts of reasons:
  • And from such reasonable marriages, there flowed loneliness, infidelity, abuse, hardness of heart and screams heard through the nursery doors
  • The marriage of reason was not, in hindsight, reasonable at all; it was often expedient, narrow-minded, snobbish and exploitative. That is why what has replaced it — the marriage of feeling — has largely been spared the need to account for itself
  • Finally, we marry to make a nice feeling permanent. We imagine that marriage will help us to bottle the joy we felt when the thought of proposing first came to us: Perhaps we were in Venice, on the lagoon, in a motorboat
  • But though we believe ourselves to be seeking happiness in marriage, it isn’t that simple. What we really seek is familiarity
  • We are looking to recreate, within our adult relationships, the feelings we knew so well in childhood. The love most of us will have tasted early on was often confused with other, more destructive dynamics: feelings of wanting to help an adult who was out of control, of being deprived of a parent’s warmth or scared of his anger, of not feeling secure enough to communicate our wishes.
  • How logical, then, that we should as grown-ups find ourselves rejecting certain candidates for marriage not because they are wrong but because they are too right — too balanced, mature, understanding and reliable — given that in our hearts, such rightness feels foreign. We marry the wrong people because we don’t associate being loved with feeling happy.
  • We make mistakes, too, because we are so lonely. No one can be in an optimal frame of mind to choose a partner when remaining single feels unbearable. We have to be wholly at peace with the prospect of many years of solitude in order to be appropriately picky
  • What matters in the marriage of feeling is that two people are drawn to each other by an overwhelming instinct and know in their hearts that it is right
  • marriage tends decisively to move us onto another, very different and more administrative plane, which perhaps unfolds in a suburban house, with a long commute and maddening children who kill the passion from which they emerged. The only ingredient in common is the partner. And that might have been the wrong ingredient to bottle.
  • The good news is that it doesn’t matter if we find we have married the wrong person.
  • We mustn’t abandon him or her, only the founding Romantic idea upon which the Western understanding of marriage has been based the last 250 years: that a perfect being exists who can meet all our needs and satisfy our every yearning.
  • WE need to swap the Romantic view for a tragic (and at points comedic) awareness that every human will frustrate, anger, annoy, madden and disappoint us — and we will (without any malice) do the same to them.
  • But none of this is unusual or grounds for divorce. Choosing whom to commit ourselves to is merely a case of identifying which particular variety of suffering we would most like to sacrifice ourselves for.
  • pessimism relieves the excessive imaginative pressure that our romantic culture places upon marriage. The failure of one particular partner to save us from our grief and melancholy is not an argument against that person and no sign that a union deserves to fail or be upgraded.
  • The person who is best suited to us is not the person who shares our every taste (he or she doesn’t exist), but the person who can negotiate differences in taste intelligently — the person who is good at disagreement.
  • Rather than some notional idea of perfect complementarity, it is the capacity to tolerate differences with generosity that is the true marker of the “not overly wrong” person
  • We should learn to accommodate ourselves to “wrongness,” striving always to adopt a more forgiving, humorous and kindly perspective on its multiple examples in ourselves and in our partners.
kirkpatrickry

Wiretapping the senses: Scientists monitor conversation between sensory perception, beh... - 0 views

  • Many types of sensory information enter the brain at a structure called the primary sensory cortex, where they are processed by different layers of cells in ways that ultimately influence an animal's perception and behavioral response.
  • An ultimate goal of neurobiological research is to understand how a brain integrates a constant flow of various types of stimuli, makes sense of it, and helps coordinate an appropriate behavioral response.
  • Understanding the most basic principles of this system will require careful studies of regions of animal brains that are simple enough to keep track of nerve impulses as they enter, and yet complex enough to follow different types of signals as they exit along different routes.
Javier E

How Did Consciousness Evolve? - The Atlantic - 0 views

  • Theories of consciousness come from religion, from philosophy, from cognitive science, but not so much from evolutionary biology. Maybe that’s why so few theories have been able to tackle basic questions such as: What is the adaptive value of consciousness? When did it evolve and what animals have it?
  • The Attention Schema Theory (AST), developed over the past five years, may be able to answer those questions.
  • The theory suggests that consciousness arises as a solution to one of the most fundamental problems facing any nervous system: Too much information constantly flows in to be fully processed. The brain evolved increasingly sophisticated mechanisms for deeply processing a few select signals at the expense of others, and in the AST, consciousness is the ultimate result of that evolutionary sequence
  • ...23 more annotations...
  • Even before the evolution of a central brain, nervous systems took advantage of a simple computing trick: competition.
  • At any moment only a few neurons win that intense competition, their signals rising up above the noise and impacting the animal’s behavior. This process is called selective signal enhancement, and without it, a nervous system can do almost nothing. (A toy version of this winner-take-all competition is sketched in code after this list.)
  • Selective enhancement therefore probably evolved sometime between hydras and arthropods—between about 700 and 600 million years ago, close to the beginning of complex, multicellular life
  • The next evolutionary advance was a centralized controller for attention that could coordinate among all senses. In many animals, that central controller is a brain area called the tectum
  • It coordinates something called overt attention – aiming the satellite dishes of the eyes, ears, and nose toward anything important.
  • All vertebrates—fish, reptiles, birds, and mammals—have a tectum. Even lampreys have one, and they appeared so early in evolution that they don’t even have a lower jaw. But as far as anyone knows, the tectum is absent from all invertebrates
  • According to fossil and genetic evidence, vertebrates evolved around 520 million years ago. The tectum and the central control of attention probably evolved around then, during the so-called Cambrian Explosion when vertebrates were tiny wriggling creatures competing with a vast range of invertebrates in the sea.
  • The tectum is a beautiful piece of engineering. To control the head and the eyes efficiently, it constructs something called an internal model, a feature well known to engineers. An internal model is a simulation that keeps track of whatever is being controlled and allows for predictions and planning.
  • The tectum’s internal model is a set of information encoded in the complex pattern of activity of the neurons. That information simulates the current state of the eyes, head, and other major body parts, making predictions about how these body parts will move next and about the consequences of their movement
  • In fish and amphibians, the tectum is the pinnacle of sophistication and the largest part of the brain. A frog has a pretty good simulation of itself.
  • With the evolution of reptiles around 350 to 300 million years ago, a new brain structure began to emerge – the wulst. Birds inherited a wulst from their reptile ancestors. Mammals did too, but our version is usually called the cerebral cortex and has expanded enormously
  • The cortex also takes in sensory signals and coordinates movement, but it has a more flexible repertoire. Depending on context, you might look toward, look away, make a sound, do a dance, or simply store the sensory event in memory in case the information is useful for the future.
  • The most important difference between the cortex and the tectum may be the kind of attention they control. The tectum is the master of overt attention—pointing the sensory apparatus toward anything important. The cortex ups the ante with something called covert attention. You don’t need to look directly at something to covertly attend to it. Even if you’ve turned your back on an object, your cortex can still focus its processing resources on it
  • The cortex needs to control that virtual movement, and therefore like any efficient controller it needs an internal model. Unlike the tectum, which models concrete objects like the eyes and the head, the cortex must model something much more abstract. According to the AST, it does so by constructing an attention schema—a constantly updated set of information that describes what covert attention is doing moment-by-moment and what its consequences are
  • Covert attention isn’t intangible. It has a physical basis, but that physical basis lies in the microscopic details of neurons, synapses, and signals. The brain has no need to know those details. The attention schema is therefore strategically vague. It depicts covert attention in a physically incoherent way, as a non-physical essence
  • this, according to the theory, is the origin of consciousness. We say we have consciousness because deep in the brain, something quite primitive is computing that semi-magical self-description.
  • I’m reminded of Teddy Roosevelt’s famous quote, “Do what you can with what you have where you are.” Evolution is the master of that kind of opportunism. Fins become feet. Gill arches become jaws. And self-models become models of others. In the AST, the attention schema first evolved as a model of one’s own covert attention. But once the basic mechanism was in place, according to the theory, it was further adapted to model the attentional states of others, to allow for social prediction. Not only could the brain attribute consciousness to itself, it began to attribute consciousness to others.
  • In the AST’s evolutionary story, social cognition begins to ramp up shortly after the reptilian wulst evolved. Crocodiles may not be the most socially complex creatures on earth, but they live in large communities, care for their young, and can make loyal if somewhat dangerous pets.
  • If AST is correct, 300 million years of reptilian, avian, and mammalian evolution have allowed the self-model and the social model to evolve in tandem, each influencing the other. We understand other people by projecting ourselves onto them. But we also understand ourselves by considering the way other people might see us.
  • The cortical networks in the human brain that allow us to attribute consciousness to others overlap extensively with the networks that construct our own sense of consciousness.
  • Language is perhaps the most recent big leap in the evolution of consciousness. Nobody knows when human language first evolved. Certainly we had it by 70 thousand years ago when people began to disperse around the world, since all dispersed groups have a sophisticated language. The relationship between language and consciousness is often debated, but we can be sure of at least this much: once we developed language, we could talk about consciousness and compare notes
  • Maybe partly because of language and culture, humans have a hair-trigger tendency to attribute consciousness to everything around us. We attribute consciousness to characters in a story, puppets and dolls, storms, rivers, empty spaces, ghosts and gods. Justin Barrett called it the Hyperactive Agency Detection Device, or HADD
  • the HADD goes way beyond detecting predators. It’s a consequence of our hyper-social nature. Evolution turned up the amplitude on our tendency to model others and now we’re supremely attuned to each other’s mind states. It gives us our adaptive edge. The inevitable side effect is the detection of false positives, or ghosts.
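
The sketch promised in the selective-signal-enhancement annotation above: the computational core of that idea is a winner-take-all competition in which each signal is suppressed in proportion to the activity of all the others, so only the strongest few survive to shape behavior. The Python toy below illustrates that idea only; it is not the model from the Attention Schema Theory papers, and the lateral-inhibition scheme and all parameters are assumptions made for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

def selective_enhancement(inputs, steps=200, inhibition=0.2, dt=0.1):
    """Toy winner-take-all competition via damped lateral inhibition.

    Each 'neuron' is driven by its own input and suppressed in proportion
    to the summed activity of all the others; activity relaxes toward the
    rectified result. Parameters are arbitrary demo values.
    """
    activity = inputs.copy()
    for _ in range(steps):
        lateral = inhibition * (activity.sum() - activity)   # pressure from everyone else
        target = np.maximum(0.0, inputs - lateral)           # rectified drive
        activity += dt * (target - activity)                 # damped update
    return activity

# Ten competing sensory signals: mostly low-level noise, two strong ones.
signals = rng.uniform(0.0, 0.4, size=10)
signals[3] = 1.0    # e.g., a looming shadow
signals[7] = 0.9    # e.g., a sudden sound

print("raw inputs:       ", np.round(signals, 2))
print("after competition:", np.round(selective_enhancement(signals), 2))
```

After the competition settles, only the two strong inputs keep substantial activity while the rest sit at or near zero, which is the sense in which a few signals rise above the noise and win deep processing.
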
anonymous

Symmetry in the universe: Physics says you shouldn't exist. - 0 views

  • You, me, and even the most calming manatee are nothing but impurities in an otherwise beautifully simple universe.
  • Your existence wasn’t just predicated on amorousness and luck of your ancestors, but on an almost absurdly finely tuned universe. Had the universe opted to turn up the strength of the electromagnetic force by even a small factor, poof
  • if the universe were only minutely denser than the one we inhabit, it would have collapsed before it began.
  • ...12 more annotations...
  • Worse still, the laws of physics themselves seem to be working against us. Ours isn’t just a randomly hostile universe, it's an actively hostile universe
  • The history of physics, in fact, is a marvel of using simple symmetry principles to construct complicated laws of the universe
  • if the entire universe were made symmetric, then all of the good features (e.g., you) are decidedly asymmetric lumps that ruin the otherwise perfect beauty of the cosmos
  • it would be a mistake to be comforted by the symmetries of the universe. In truth, they are your worst enemies. Everything we know about those rational, predictable arrangements dictates that you shouldn't be here at all.
  • How hostile is the universe to your fundamental existence? Very. Even the simplest assumptions about our place in the universe seem to lead inexorably to devastating results
  • The symmetry of the universe would bake us in no time at all, but an asymmetry rescues us
  • In literally every experiment and observation that we’ve ever done, matter and antimatter get created (or annihilated) in perfect concert. That is, every experiment except for one: us.
  • Matter and antimatter should have completely annihilated one another in the first nanoseconds after the Big Bang. You should not even exist. But you do, and there’s lots more matter where you came from.
  • if the perfect symmetry between matter and antimatter remained perfect, you wouldn’t be here to think about it.
  • The flow of time (as near as we can tell) is completely arbitrary. Does entropy increase with time or does it make time? Are our memories the thing that ultimately breaks the symmetry of time?
  • It seems only a matter of luck (and some fairly arbitrary-looking math) that a symmetric universe would end up being remotely hospitable to complex creatures like us
  • Without electrons binding to protons, there would be no chemistry, no molecules, and nothing more complicated than a cloud of charged gas. And you’re not a sentient cloud of gas, are you?
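The “strength of the electromagnetic force” invoked above is conventionally summarized by the dimensionless fine-structure constant. As a point of reference (this expression is standard physics, not something quoted from the article):

    \[
      \alpha \;=\; \frac{e^{2}}{4\pi\varepsilon_{0}\hbar c} \;\approx\; \frac{1}{137}
    \]

Atomic binding energies scale with powers of α, so even a modest change in its value would alter or eliminate the chemistry the annotations above depend on.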
kushnerha

A new atlas maps word meanings in the brain | PBS NewsHour - 0 views

  • like Google Maps for your cerebral cortex: A new interactive atlas, developed with the help of such unlikely tools as public radio podcasts and Wikipedia, purports to show which bits of your brain help you understand which types of concepts.
  • Hear a word relating to family, loss, or the passing of time — such as “wife,” “month,” or “remarried”— and a ridge called the right angular gyrus may be working overtime. Listening to your contractor talking about the design of your new front porch? Thank a pea-sized spot of brain behind your left ear.
  • The research on the “brain dictionary” has the hallmarks of a big scientific splash: Published on Wednesday in Nature, it’s accompanied by both a video and an interactive website where you can click your way from brain region to brain region, seeing what kinds of words are processed in each. Yet neuroscientists aren’t uniformly impressed.
  • ...9 more annotations...
  • invoked an old metaphor to explain why he isn’t convinced by the analysis: He compared it to establishing a theory of how weather works by pointing a video camera out the window for 7 hours.
  • Indeed, among neuroscientists, the new “comprehensive atlas” of the cerebral cortex is almost as controversial as a historical atlas of the Middle East. That’s because every word has a constellation of meanings and associations — and it’s hard for scientists to agree about how best to study them in the lab.
  • For this study, neuroscientist Jack Gallant and his team at the University of California, Berkeley played more than two hours’ worth of stories from the Moth Radio Hour for seven grad students and postdocs while measuring their cerebral blood flow using functional magnetic resonance imaging. Then, they linked the activity in some 50,000 pea-sized regions of the cortex to the “meaning” of the words being heard at that moment.
  • How, you might ask, did they establish the meaning of words? The neuroscientists pulled all the nouns and verbs from the podcasts. With a computer program, they then looked across millions of pages of text to see how often the words from the podcasts are used near 985 common words taken from Wikipedia’s List of 1,000 Basic Words. “Wolf,” for instance, would presumably be used more often in proximity to “dog” than to, say, “eggplant.” Using that data, the program assigned numbers that approximated the meaning of each individual word from the podcasts — and, with some fancy number crunching, they figured out what areas of the brain were activated when their research subjects heard words with certain meanings. (A toy version of this co-occurrence step is sketched after this list.)
  • Everyone agrees that the research is innovative in its method. After all, linking up the meanings of thousands of words to the second-by-second brain activity in thousands of tiny brain regions is no mean feat. “That’s way more data than any human being can possibly think about,” said Gallant.
  • What they can’t agree on is what it means. “In this study, our goal was not to ask a specific question. Our goal was to map everything so that we can ask questions after that,” said Gallant. “One of the most frequent questions we get is, ‘What does it mean?’ If I gave you a globe, you wouldn’t ask what it means, you’d start using it for stuff. You can look for the smallest ocean or how long it will take to get to San Francisco.”
  • This “data-driven approach” still involves assumptions about how to break up language into different categories of meaning
  • “Of course it’s a very simplified version of how meaning is captured in our minds, but it seems to be a pretty good proxy,” she said.
  • hordes of unanswered questions: “We can map where your brain represents the meaning of a narrative text that is associated with family, but we don’t know why the brain is responding to family at that location. Is it the word ‘father’ itself? Is it your memories of your own father? Is it your own thinking about being a parent yourself?” He hopes that it’s just those types of questions that researchers will ask, using his brain map as a guide.
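A minimal sketch of the co-occurrence step described above (the “wolf”/“dog” example): each target word gets one count per basis word, tallying how often the two appear near each other in a corpus. The window size, the toy corpus, and the use of raw counts are illustrative assumptions, not the study’s actual parameters, and the regression from these features onto fMRI activity is omitted entirely.

    from collections import Counter

    def cooccurrence_features(word, corpus_tokens, basis_words, window=5):
        """Count how often `word` occurs within `window` tokens of each basis word.

        The resulting vector of counts is a crude stand-in for the word's
        'meaning,' in the spirit of the method the article describes.
        """
        basis = set(basis_words)
        counts = Counter()
        for i, tok in enumerate(corpus_tokens):
            if tok != word:
                continue
            for neighbor in corpus_tokens[max(0, i - window): i + window + 1]:
                if neighbor in basis and neighbor != word:
                    counts[neighbor] += 1
        return [counts[w] for w in basis_words]

    # Toy check: "wolf" should land closer to "dog" than to "eggplant."
    corpus = "the wolf chased the dog while the dog barked at the wolf".split()
    print(cooccurrence_features("wolf", corpus, ["dog", "eggplant"]))  # [2, 0]

The study itself then fit a model per tiny cortical region relating such semantic features to the blood-flow signal; this sketch stops at the featurization.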
oliviaodon

A scientific revolution? - 0 views

  • Puzzle-solving science, according to Kuhn, can therefore trigger a scientific revolution as scientists struggle to explain these anomalies and develop a novel basic theory to incorporate them into the existing body of knowledge. After an extended period of upheaval, in which followers of the new theory storm the bastions of accepted dogma, the old paradigm is gradually replaced.
  • biology is heading towards a similar scientific revolution that may shatter one of its most central paradigms. The discovery of a few small proteins with anomalous behaviour is about to overcome a central tenet of molecular biology: that information flows unidirectionally from the gene to the protein to the phenotype. It started with the discovery that prions, a class of small proteins that can exist in different forms, cause a range of highly debilitating diseases. This sparked further research
  • Scientific revolutions are still rare in biology, given that the field, unlike astronomy or physics, is relatively young.
  • ...1 more annotation...
  • The idea that all living beings stem from a primordial cell dating back two billion years is, in my opinion, a true paradigm. It does not have a heuristic value, unlike paradigms in physics such as gravitation or Einstein's famous equation, but it has a fundamental aspect.
Javier E

Why It's OK to Let Apps Make You a Better Person - Evan Selinger - Technology - The Atl... - 0 views

  • one theme emerges from the media coverage of people's relationships with our current set of technologies: Consumers want digital willpower. App designers in touch with the latest trends in behavioral modification--nudging, the quantified self, and gamification--and good old-fashioned financial incentive manipulation, are tackling weakness of will. They're harnessing the power of payouts, cognitive biases, social networking, and biofeedback. The quantified self becomes the programmable self.
  • the trend still has multiple interesting dimensions
  • Individuals are turning ever more aspects of their lives into managerial problems that require technological solutions. We have access to an ever-increasing array of free and inexpensive technologies that harness incredible computational power that effectively allows us to self-police behavior everywhere we go. As pervasiveness expands, so does trust.
  • ...20 more annotations...
  • Some embrace networked, data-driven lives and are comfortable volunteering embarrassing, real time information about what we're doing, whom we're doing it with, and how we feel about our monitored activities.
  • Put it all together and we can see that our conception of what it means to be human has become "design space." We're now Humanity 2.0, primed for optimization through commercial upgrades. And today's apps are more harbinger than endpoint.
  • philosophers have had much to say about the enticing and seemingly inevitable dispersion of technological mental prosthetics that promise to substitute or enhance some of our motivational powers.
  • beyond the practical issues lies a constellation of central ethical concerns.
  • they should cause us to pause as we think about a possible future that significantly increases the scale and effectiveness of willpower-enhancing apps. Let's call this hypothetical future Digital Willpower World and characterize the ethical traps we're about to discuss as potential general pitfalls
  • it is antithetical to the ideal of "resolute choice." Some may find the norm overly perfectionist, Spartan, or puritanical. However, it is not uncommon for folks to defend the idea that mature adults should strive to develop internal willpower strong enough to avoid external temptations, whatever they are, and wherever they are encountered.
  • In part, resolute choosing is prized out of concern for consistency, as some worry that lapse of willpower in any context indicates a generally weak character.
  • Fragmented selves behave one way while under the influence of digital willpower, but another when making decisions without such assistance. In these instances, inconsistent preferences are exhibited and we risk underestimating the extent of our technological dependency.
  • It simply means that when it comes to digital willpower, we should be on our guard to avoid confusing situational with integrated behaviors.
  • the problem of inauthenticity, a staple of the neuroethics debates, might arise. People might start asking themselves: Has the problem of fragmentation gone away only because devices are choreographing our behavior so powerfully that we are no longer in touch with our so-called real selves -- the selves who used to exist before Digital Willpower World was formed?
  • Infantilized subjects are morally lazy, quick to have others take responsibility for their welfare. They do not view the capacity to assume personal responsibility for selecting means and ends as a fundamental life goal that validates the effort required to remain committed to the ongoing project of maintaining willpower and self-control.
  • Michael Sandel's Atlantic essay, "The Case Against Perfection." He notes that technological enhancement can diminish people's sense of achievement when their accomplishments become attributable to human-technology systems and not an individual's use of human agency.
  • Borgmann worries that this environment, which habituates us to be on auto-pilot and delegate deliberation, threatens to harm the powers of reason, the most central component of willpower (according to the rationalist tradition).
  • In several books, including Technology and the Character of Contemporary Life, he expresses concern about technologies that seem to enhance willpower but only do so through distraction. Borgmann's paradigmatic example of the non-distracted, focally centered person is a serious runner. This person finds the practice of running maximally fulfilling, replete with the rewarding "flow" that can only come when mind/body and means/ends are unified, while skill gets pushed to the limit.
  • Perhaps the very conception of a resolute self was flawed. What if, as psychologist Roy Baumeister suggests, willpower is more a "staple of folk psychology" than a real way of thinking about our brain processes?
  • novel approaches suggest the will is a flexible mesh of different capacities and cognitive mechanisms that can expand and contract, depending on the agent's particular setting and needs. Contrary to the traditional view that identifies the unified and cognitively transparent self as the source of willed actions, the new picture embraces a rather diffused, extended, and opaque self who is often guided by irrational trains of thought. What actually keeps the self and its will together are the given boundaries offered by biology, a coherent self narrative created by shared memories and experiences, and society. If this view of the will as an expanding and contracting system with porous and dynamic boundaries is correct, then it might seem that the new motivating technologies and devices can only increase our reach and further empower our willing selves.
  • "It's a mistake to think of the will as some interior faculty that belongs to an individual--the thing that pushes the motor control processes that cause my action," Gallagher says. "Rather, the will is both embodied and embedded: social and physical environment enhance or impoverish our ability to decide and carry out our intentions; often our intentions themselves are shaped by social and physical aspects of the environment."
  • It makes perfect sense to think of the will as something that can be supported or assisted by technology. Technologies, like environments and institutions can facilitate action or block it. Imagine I have the inclination to go to a concert. If I can get my ticket by pressing some buttons on my iPhone, I find myself going to the concert. If I have to fill out an application form and carry it to a location several miles away and wait in line to pick up my ticket, then forget it.
  • Perhaps the best way forward is to put a digital spin on the Socratic dictum of knowing myself and submit to the new freedom: the freedom of consuming digital willpower to guide me past the sirens.
Javier E

The Philosopher Whose Fingerprints Are All Over the FTC's New Approach to Privacy - Ale... - 0 views

  • The standard explanation for privacy freakouts is that people get upset because they've "lost control" of data about themselves or there is simply too much data available. Nissenbaum argues that the real problem "is the inappropriateness of the flow of information due to the mediation of technology." In her scheme, there are senders and receivers of messages, who communicate different types of information with very specific expectations of how it will be used. Privacy violations occur not when too much data accumulates or people can't direct it, but when one of the receivers or transmission principles changes. The key academic term is "context-relative informational norms." Bust a norm and people get upset. (A toy data-structure rendering of this idea follows this list.)
  • Nissenbaum gets us past thinking about privacy as a binary: either something is private or something is public. Nissenbaum puts the context -- or social situation -- back into the equation. What you tell your bank, you might not tell your doctor.
  • Furthermore, these differences in information sharing are not bad or good; they are just the norms.
  • ...8 more annotations...
  • any privacy regulation that's going to make it through Congress has to provide clear ways for companies to continue profiting from data tracking. The key is coming up with an ethical framework in which they can do so, and Nissenbaum may have done just that. 
  • The traditional model of how this works says that your information is something like a currency and when you visit a website that collects data on you for one reason or another, you enter into a contract with that site. As long as the site gives you "notice" that data collection occurs -- usually via a privacy policy located through a link at the bottom of the page -- and you give "consent" by continuing to use the site, then no harm has been done. No matter how much data a site collects, if all they do is use it to show you advertising they hope is more relevant to you, then they've done nothing wrong.
  • let companies do standard data collection but require them to tell people when they are doing things with data that are inconsistent with the "context of the interaction" between a company and a person.
  • How can anyone make a reasonable determination of how their information might be used when there are more than 50 or 100 or 200 tools in play on a single website in a single month?
  • Nissenbaum doesn't think it's possible to explain the current online advertising ecosystem in a useful way without resorting to a lot of detail. She calls this the "transparency paradox," and considers it insoluble.
  • she wants to import the norms from the offline world into the online world. When you go to a bank, she says, you have expectations of what might happen to your communications with that bank. That should be true whether you're online, on the phone, or at the teller.  Companies can use your data to do bank stuff, but they can't sell your data to car dealers looking for people with a lot of cash on hand.
  • Never mind that if you actually read all the privacy policies you encounter in a year, it would take 76 work days. And that calculation doesn't even account for all the third parties that drain data from your visits to other websites. Even more to the point: there is no obvious way to discriminate between two separate webpages on the basis of their data collection policies. While tools have emerged to tell you how many data trackers are being deployed at any site at a given moment, the dynamic nature of Internet advertising means that it is nearly impossible to know the story through time
  • here's the big downside: it rests on the "norms" that people expect. While that may be socially optimal, it's actually quite difficult to figure out what the norms for a given situation might be. After all, there is someone else who depends on norms for his thinking about privacy.
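Read as a data structure, a context-relative informational norm names a context, a sender, a receiver, an information type, and a transmission principle; a flow is a privacy violation when it matches no accepted norm for its context. The sketch below is a toy rendering of that idea only; the field names and the bank/car-dealer example are illustrative assumptions, not anything drawn from Nissenbaum's formal framework or the FTC's report.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Flow:
        context: str     # the social situation, e.g. "banking"
        sender: str      # who discloses the information
        receiver: str    # who obtains it
        info_type: str   # what kind of information moves
        principle: str   # the transmission principle governing its use

    def violates(flow: Flow, accepted_norms: list[Flow]) -> bool:
        """A flow is appropriate only if some accepted norm matches it exactly."""
        return flow not in accepted_norms

    norms = [Flow("banking", "customer", "bank", "account balance",
                  "use only to provide the service")]

    # Same information, same context, but a new receiver and a new principle.
    resale = Flow("banking", "customer", "car dealer", "account balance",
                  "marketing")
    print(violates(resale, norms))  # True: the norm is busted, people get upset

On this reading, the "transparency paradox" is simply that real contexts carry far more fields and far more norms than any notice-and-consent dialog can enumerate.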
Javier E

Does Facebook Turn People Into Narcissists? - NYTimes.com - 0 views

  • Those who frequently updated their Facebook status, tagged themselves in photos and had large numbers of virtual friends, were more likely to exhibit narcissistic traits, the study found. Another study found that people with high levels of narcissism were more likely to spend more than an hour a day on Facebook, and they were also more likely to post digitally enhanced personal photos. But what the research doesn’t answer is whether Facebook attracts narcissists or turns us into them.
  • researchers found, to their surprise, that frequency of Facebook use, whether it was for personal status updates or to connect with friends, was not associated with narcissism. Narcissism per se was associated with only one type of Facebook user — those who amassed unrealistically large numbers of Facebook friends.
  • frequent Facebook users were more likely to score high on “openness” and were less concerned about privacy. So what seems like self-promoting behavior may just reflect a generation growing up in the digital age, where information — including details about personal lives — flows freely and connects us.
  • ...1 more annotation...
  • The social medium of choice for the self-absorbed appears to be Twitter. The researchers found an association between tweeting about oneself and high narcissism scores.
Javier E

SOPA Boycotts and the False Ideals of the Web - NYTimes.com - 1 views

  • Those rare tech companies that have come out in support of SOPA are not merely criticized but barred from industry events and subject to boycotts. We, the keepers of the flame of free speech, are banishing people for their speech. The result is a chilling atmosphere, with people afraid to speak their minds.
  • Our melodrama is driven by a vision of an open Internet that has already been distorted, though not by the old industries that fear piracy. For instance, until a year ago, I enjoyed a certain kind of user-generated content very much: I participated in forums in which musicians talked about musical instruments.
  • proprietary social networking — is ending my freedom to participate in the forums I used to love, at least on terms I accept. Like many other forms of contact, the musical conversations are moving into private sites, particularly Facebook. To continue to participate, I’d have to accept Facebook’s philosophy, under which it analyzes me, and is searching for new ways to charge third parties for the use of that analysis.
  • ...6 more annotations...
  • You might object that it’s all based on individual choice. That argument ignores the consequences of networks, and the way they function. After a certain point choice is reduced.
  • What if ordinary users routinely earned micropayments for their contributions? If all content were valued instead of only mogul content, perhaps an information economy would elevate success for all. But under the current terms of debate that idea can barely be whispered.
  • Once networks are established, it is hard to reduce their power. Google’s advertisers, for instance, know what will happen if they move away. The next-highest bidder for each position in Google’s auction-based model for selling ads will inherit that position if the top bidder goes elsewhere. So Google’s advertisers tend to stay put because the consequences of leaving are obvious to them (a sketch of this position-auction dynamic follows this list)
  • The obvious strategy in the fight for a piece of the advertising pie is to close off substantial parts of the Internet so Google doesn’t see it all anymore. That’s how Facebook hopes to make money, by sealing off a huge amount of user-generated information into a separate, non-Google world.
  • it’s not Facebook’s fault! We, the idealists, insisted that information be able to flow freely online, which meant that services relating to information, instead of the information itself, would be the main profit centers. Some businesses do sell content, but that doesn’t address the business side of everyday user-generated content. The adulation of “free content” inevitably meant that “advertising” would become the biggest business in the open part of the information economy
  • We in Silicon Valley undermined copyright to make commerce become more about services instead of content — more about our code instead of their files. The inevitable endgame was always that we would lose control of our own personal content, our own files.
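The lock-in point above follows from how a position auction allocates slots: rank the bidders, hand out positions top-down, and whoever leaves is simply replaced by the next bid in line. This is a generic bid-ranked sketch, not Google's actual mechanism, which also weights bids by quality scores and uses second-price-style payments.

    def assign_positions(bids: dict[str, float], slots: int) -> list[str]:
        """Rank advertisers by bid and hand out ad positions top-down."""
        ranked = sorted(bids, key=bids.get, reverse=True)
        return ranked[:slots]

    bids = {"top_bidder": 3.00, "runner_up": 2.50, "third": 1.75}
    print(assign_positions(bids, slots=2))   # ['top_bidder', 'runner_up']

    # If the top bidder walks away, the next-highest bidder inherits the slot,
    # which is why every advertiser can see the consequences of leaving.
    bids.pop("top_bidder")
    print(assign_positions(bids, slots=2))   # ['runner_up', 'third']

Because that replacement is fully predictable, leaving confers no bargaining power, which is the sense in which the annotation says advertisers "tend to stay put."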
Duncan H

Rick Santorum Campaigning Against the Modern World - NYTimes.com - 0 views

  • As a journalist who covered Rick Santorum in Pennsylvania for years, I can understand the Tea Party’s infatuation with him. It’s his anger. It is in perfect synch with the constituency he is wooing.
  • Even at the height of his political success, when he had a lot to be happy about, Santorum was an angry man. I found it odd. I was used to covering politicians who had good dispositions — or were good at pretending they had good dispositions.
  • You could easily get him revved by bringing up the wrong topic or taking an opposing point of view. His nostrils would flare, his eyes would glare and he would launch into a disquisition on how, deep down, you were a shallow guy who could not grasp the truth and rightness of his positions.
  • ...11 more annotations...
  • “It’s just a curious bias of the media around here. It’s wonderful. One person says something negative and the media rushes and covers that. The wonderful balanced media that I love in this community.”
  • Santorum had reason to be peeved. He was running against the Democrat Bob Casey. He was trailing by double digits and knew he was going to lose. He was not a happy camper, but then he rarely is.
  • As he has shown in the Republican debates, Santorum can be equable. The anger usually flares on matters closest to his heart: faith, family and morals. And if, by chance, you get him started on the role of religion in American life, get ready for a Vesuvius moment.
  • Outside of these areas, he was more pragmatic. Then and now, Santorum held predictably conservative views, but he was astute enough to bend on some issues and be — as he put it in the Arizona debate — “a team player.”
  • In the Senate, he represented a state with a relentlessly moderate-to-centrist electorate so when campaigning he emphasized the good deeds he did in Washington. Editorial board meetings with Santorum usually began with him listing federal money he had brought in for local projects. People who don’t know him — and just see the angry Rick — don’t realize what a clever politician Santorum is. He didn’t rise to become a Washington insider through the power of prayer. He may say the Rosary, but he knows his Machiavelli.
  • That said, Santorum’s anger is not an act. It is genuine. It has its roots in the fact that he had the misfortune to be born in the second half of the 20th century. In his view, it was an era when moral relativism and anti-religious feeling held sway, where traditional values were ignored or mocked, where heretics ruled civic and political life. If anything, it’s gotten worse in the 21st, with the election of Barack Obama. Leave it to Santorum to attack Obama on his theology, of all things. He sees the president as an exemplar of mushy, feel-good Christianity that emphasizes tolerance over rectitude, and the love of Jesus over the wrath of God.
  • Like many American Catholics, I struggle with the church’s teachings as they apply to the modern world. Santorum does not.
  • I once wrote that Santorum has one of the finest minds of the 13th century. It was meant to elicit a laugh, but there’s truth behind the remark. No Vatican II for Santorum. His belief system is the fixed and firm Catholicism of the Council of Trent in the mid-16th century. And Santorum is a warrior for those beliefs.
  • During the campaign, he has regularly criticized the media for harping on his public statements on homosexuality, contraception, abortion, the decline in American morals. Still, he can’t resist talking about them. These are the issues that get his juices flowing, not the deficit or federal energy policy.
  • Santorum went to Houston not to praise Kennedy but to bash him. To Santorum, the Kennedy speech did permanent damage because it led to secularization of American politics. He said it laid the foundation for attacks on religion by the secular left that has led to denial of free speech rights to religious people. “John F. Kennedy chose not to just dispel fear,” Santorum said, “he chose to expel faith.”
  • Ultimately Kennedy’s attempt to reassure Protestants that the Catholic Church would not control the government and suborn its independence advanced a philosophy of strict separation that would create a purely secular public square cleansed of all religious wisdom and the voice of religious people of all faiths. He laid the foundation for attacks on religious freedom and freedom of speech by the secular left and its political arms like the A.C.L.U. and the People for the American Way. This has and will continue to create dissension and division in this country as people of faith increasingly feel like second-class citizens. One consequence of Kennedy’s speech, Santorum said, is the debasement of our First Amendment right of religious freedom. Of all the great and necessary freedoms listed in the First Amendment, freedom to exercise religion (not just to believe, but to live out that belief) is the most important; before freedom of speech, before freedom of the press, before freedom of assembly, before freedom to petition the government for redress of grievances, before all others. This freedom of religion, freedom of conscience, is the trunk from which all other branches of freedom on our great tree of liberty get their life. And so it went for 5,000 words. It is a revelatory critique of the modern world and Santorum quoted G.K. Chesterton, Edmund Burke, St. Thomas Aquinas and Martin Luther King to give heft to his assertions. That said, it was an angry speech, conjuring up images of people of faith cowering before leftist thought police. Who could rescue us from this predicament? Who could banish the secularists and restore religious morality to its throne?
  • An interesting critique of Santorum and his religious beliefs.
Javier E

The Dangers of Certainty: A Lesson From Auschwitz - NYTimes.com - 0 views

  • in 1973, the BBC aired an extraordinary documentary series called “The Ascent of Man,” hosted by one Dr. Jacob Bronowski
  • It was not an account of human biological evolution, but cultural evolution — from the origins of human life in the Rift Valley to the shifts from hunter/gatherer societies,  to nomadism and then settlement and civilization, from agriculture and metallurgy to the rise and fall of empires: Assyria, Egypt, Rome.
  • The tone of the programs was rigorous yet permissive, playful yet precise, and always urgent, open and exploratory. I remember in particular the programs on the trial of Galileo, Darwin’s hesitancy about publishing his theory of evolution and the dizzying consequences of Einstein’s theory of relativity.
  • ...11 more annotations...
  • For Bronowski, science and art were two neighboring mighty rivers that flowed from a common source: the human imagination.
  • For Dr. Bronowski, there was no absolute knowledge and anyone who claims it — whether a scientist, a politician or a religious believer — opens the door to tragedy. All scientific information is imperfect and we have to treat it with humility. Such, for him, was the human condition.
  • This is the condition for what we can know, but it is also, crucially, a moral lesson. It is the lesson of 20th-century painting from Cubism onwards, but also that of quantum physics. All we can do is to push deeper and deeper into better approximations of an ever-evasive reality
  • Errors are inextricably bound up with pursuit of human knowledge, which requires not just mathematical calculation but insight, interpretation and a personal act of judgment for which we are responsible.
  • Dr. Bronowski insisted that the principle of uncertainty was a misnomer, because it gives the impression that in science (and outside of it) we are always uncertain. But this is wrong. Knowledge is precise, but that precision is confined within a certain toleration of uncertainty.
  • The emphasis on the moral responsibility of knowledge was essential for all of Dr. Bronowski’s work. The acquisition of knowledge entails a responsibility for the integrity of what we are as ethical creatures.
  • Pursuing knowledge means accepting uncertainty. Heisenberg’s principle has the consequence that no physical events can ultimately be described with absolute certainty or with “zero tolerance,” as it were. The more we know, the less certain we are. (The standard form of the relation is given after this list.)
  • Our relations with others also require a principle of tolerance. We encounter other people across a gray area of negotiation and approximation. Such is the business of listening and the back and forth of conversation and social interaction.
  • For Dr. Bronowski, the moral consequence of knowledge is that we must never judge others on the basis of some absolute, God-like conception of certainty. All knowledge, all information that passes between human beings, can be exchanged only within what we might call “a play of tolerance,” whether in science, literature, politics or religion.
  • The play of tolerance opposes the principle of monstrous certainty that is endemic to fascism and, sadly, not just fascism but all the various faces of fundamentalism. When we think we have certainty, when we aspire to the knowledge of the gods, then Auschwitz can happen and can repeat itself.
  • The pursuit of scientific knowledge is as personal an act as lifting a paintbrush or writing a poem, and they are both profoundly human. If the human condition is defined by limitedness, then this is a glorious fact because it is a moral limitedness rooted in a faith in the power of the imagination, our sense of responsibility and our acceptance of our fallibility. We always have to acknowledge that we might be mistaken.
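The "toleration of uncertainty" in the annotations above is quantitative. For position and momentum, Heisenberg's relation in its standard form reads:

    \[
      \Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
    \]

Either quantity alone can be pinned down as finely as instruments allow; it is the product that is bounded from below. That is the sense in which the annotation above calls "uncertainty" a misnomer: knowledge is precise, but only within a stated tolerance.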