
TOK Friends: Group items tagged "than"


alliefulginiti1

The Water in Your Glass Might Be Older Than the Sun - The New York Times - 0 views

  • Earth is old. The sun is old. But do you know what may be even older than both? Water.
  • more than 4.6 billion years ago.
  • That means the same liquid we drink and that fills the oceans may be millions of years older than the solar system itself.
  • researchers analyzed water molecules in oceans for indicators of their ancient past.
  • “heavy water.” Water, as you know, is made up of two hydrogen atoms and one oxygen atom. But some water molecules contain hydrogen’s chunky twin, deuterium. (It contains a neutron in its nucleus, whereas regular hydrogen does not.)
  • They concluded that remnants of that ancient ice remain scattered across the solar system: on the moon, in comets, at Mercury’s poles, in the remains of Mars’ melts, on Jupiter’s moon Europa — and even in your water bottle.
sissij

Woman Drivers: Worse Than Men? Yes... and No | Reader's Digest - 0 views

  • In studies, men as a whole display less cautious behavior than women, such as driving at higher speeds and closer to other cars, not wearing seat belts, and driving while intoxicated more often.
  • However, this slight edge in ability doesn’t translate into better driving records.
  • According to one study, men are more than three times as likely as women to be ticketed for “aggressive driving,” and more than 25 percent more likely to be at fault in an accident.
  • Similarly, the stereotype that women are weaker drivers may negatively affect their performance behind the wheel.
  • I found it very interesting that although there is a stereotype that women drive poorly, men actually cause more accidents in studies. I largely agree with the author because he makes the term "better" clear in the passage, which most other claims leave vague. In his writing, "better" has two meanings: how safely one drives and how well one drives. Looking at the problem through these two very different meanings of "better" can lead to very different conclusions. I think that's why different articles hold opposite views on this issue. They don't speak the same language; they are speaking of different "better"s. --Sissi (11/10/2016)
Javier E

Study: Your friends really are happier, more popular than you - 1 views

  • it turns out social networks are not at fault: Your friends really are richer, happier and more popular than you, according to a depressing new study from researchers in Finland and France.
  • This little Möbius strip of a phenomenon is called the “generalized friendship paradox,” and at first glance it makes no sense. Everyone’s friends can’t be richer and more popular — that would just escalate until everyone’s a socialite billionaire.
  • The paradox arises because numbers of friends people have are distributed in a way that follows a power law rather than an ordinary linear relationship. So most people have a few friends while a small number of people have lots of friends. It’s this second small group that causes the paradox. People with lots of friends are more likely to number among your friends in the first place. And when they do, they significantly raise the average number of friends that your friends have. That’s the reason that, on average, your friends have more friends than you do. [A short simulation after this entry’s annotations makes the arithmetic concrete.]
  • Essentially, the “generalized friendship paradox” applies to all interpersonal networks, regardless of whether they’re set in real life or online.
  • Whenever we interact with other people, we glimpse lives far more glamorous than our own.
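The power-law mechanism quoted above is easy to check numerically. Below is a minimal simulation (an editorial aside, not part of the article): it assumes the networkx library is available and uses a Barabási–Albert graph as a stand-in for a heavy-tailed friendship network; the size and seed are arbitrary choices.

```python
# Friendship paradox in a heavy-tailed network: most people have few
# friends, a few "hubs" have very many. Assumes networkx is installed.
import networkx as nx

G = nx.barabasi_albert_graph(n=10_000, m=3, seed=42)
deg = dict(G.degree())

# Average number of friends a person has.
mean_friends = sum(deg.values()) / len(deg)

# For each person, the average friend count among their friends;
# then average that quantity over everyone.
per_person = [sum(deg[f] for f in G.neighbors(v)) / deg[v] for v in G]
mean_friends_of_friends = sum(per_person) / len(per_person)

print(f"average friends per person:        {mean_friends:.1f}")
print(f"average friends of one's friends:  {mean_friends_of_friends:.1f}")
# The second number comes out reliably larger: hubs appear in many
# people's friend lists, so they are overcounted, as the excerpt says.
```
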
Adam Clark

The 12 cognitive biases that prevent you from being rational - 0 views

  • "The human brain is capable of 10^16 processes per second, which makes it far more powerful than any computer currently in existence. But that doesn't mean our brains don't have major limitations. The lowly calculator can do math thousands of times better than we can, and our memories are often less than useless - plus, we're subject to cognitive biases, those annoying glitches in our thinking that cause us to make questionable decisions and reach erroneous conclusions. Here are a dozen of the most common and pernicious cognitive biases that you need to know about."
Javier E

Love People, Not Pleasure - NYTimes.com - 0 views

  • Fame, riches and pleasure beyond imagination. Sound great? He went on to write: “I have diligently numbered the days of pure and genuine happiness which have fallen to my lot: They amount to 14.” Abd al-Rahman’s problem wasn’t happiness, as he believed — it was unhappiness.
  • Happiness and unhappiness are certainly related, but they are not actually opposites.
  • Circumstances are certainly important. No doubt Abd al-Rahman could point to a few in his life. But paradoxically, a better explanation for his unhappiness may have been his own search for well-being. And the same might go for you.
  • As strange as it seems, being happier than average does not mean that one can’t also be unhappier than average.
  • In 2009, researchers from the University of Rochester conducted a study tracking the success of 147 recent graduates in reaching their stated goals after graduation. Some had “intrinsic” goals, such as deep, enduring relationships. Others had “extrinsic” goals, such as achieving reputation or fame. The scholars found that intrinsic goals were associated with happier lives. But the people who pursued extrinsic goals experienced more negative emotions, such as shame and fear. They even suffered more physical maladies.
  • the paradox of fame. Just like drugs and alcohol, once you become addicted, you can’t live without it. But you can’t live with it, either.
  • That impulse to fame by everyday people has generated some astonishing innovations.
  • Today, each of us can build a personal little fan base, thanks to Facebook, YouTube, Twitter and the like. We can broadcast the details of our lives to friends and strangers in an astonishingly efficient way. That’s good for staying in touch with friends, but it also puts a minor form of fame-seeking within each person’s reach. And several studies show that it can make us unhappy.
  • It makes sense. What do you post to Facebook? Pictures of yourself yelling at your kids, or having a hard time at work? No, you post smiling photos of a hiking trip with friends. You build a fake life — or at least an incomplete one — and share it. Furthermore, you consume almost exclusively the fake lives of your social media “friends.” Unless you are extraordinarily self-aware, how could it not make you feel worse to spend part of your time pretending to be happier than you are, and the other part of your time seeing how much happier others seem to be than you?
  • the bulk of the studies point toward the same important conclusion: People who rate materialistic goals like wealth as top personal priorities are significantly likelier to be more anxious, more depressed and more frequent drug users, and even to have more physical ailments than those who set their sights on more intrinsic values.
  • as the Dalai Lama pithily suggests, it is better to want what you have than to have what you want.
  • In 2004, two economists looked into whether more sexual variety led to greater well-being. They looked at data from about 16,000 adult Americans who were asked confidentially how many sex partners they had had in the preceding year, and about their happiness. Across men and women alike, the data show that the optimal number of partners is one.
  • This might seem totally counterintuitive. After all, we are unambiguously driven to accumulate material goods, to seek fame, to look for pleasure. How can it be that these very things can give us unhappiness instead of happiness? There are two explanations, one biological and the other philosophical.
  • From an evolutionary perspective, it makes sense that we are wired to seek fame, wealth and sexual variety. These things make us more likely to pass on our DNA.
  • here’s where the evolutionary cables have crossed: We assume that things we are attracted to will relieve our suffering and raise our happiness.
  • that is Mother Nature’s cruel hoax. She doesn’t really care either way whether you are unhappy — she just wants you to want to pass on your genetic material. If you conflate intergenerational survival with well-being, that’s your problem, not nature’s.
  • More philosophically, the problem stems from dissatisfaction — the sense that nothing has full flavor, and we want more. We can’t quite pin down what it is that we seek. Without a great deal of reflection and spiritual hard work, the likely candidates seem to be material things, physical pleasures or favor among friends and strangers.
  • We look for these things to fill an inner emptiness. They may bring a brief satisfaction, but it never lasts, and it is never enough. And so we crave more.
  • This search for fame, the lust for material things and the objectification of others — that is, the cycle of grasping and craving — follows a formula that is elegant, simple and deadly: Love things, use people.
  • This was Abd al-Rahman’s formula as he sleepwalked through life. It is the worldly snake oil peddled by the culture makers from Hollywood to Madison Avenue.
  • Simply invert the deadly formula and render it virtuous: Love people, use things.
  • It requires the courage to repudiate pride and the strength to love others — family, friends, colleagues, acquaintances, God and even strangers and enemies. Only deny love to things that actually are objects. The practice that achieves this is charity. Few things are as liberating as giving away to others that which we hold dear.
  • This also requires a condemnation of materialism.
  • Finally, it requires a deep skepticism of our own basic desires. Of course you are driven to seek admiration, splendor and physical license.
  • Declaring war on these destructive impulses is not about asceticism or Puritanism. It is about being a prudent person who seeks to avoid unnecessary suffering.
Javier E

Ta-Nehisi Coates's 'Letter to My Son' - The Atlantic - 0 views

  • The question is not whether Lincoln truly meant “government of the people” but what our country has, throughout its history, taken the political term “people” to actually mean. In 1863 it did not mean your mother or your grandmother, and it did not mean you and me.
  • When the journalist asked me about my body, it was like she was asking me to awaken her from the most gorgeous dream. I have seen that dream all my life. It is perfect houses with nice lawns. It is Memorial Day cookouts, block associations, and driveways. The Dream is tree houses and the Cub Scouts. And for so long I have wanted to escape into the Dream, to fold my country over my head like a blanket. But this has never been an option, because the Dream rests on our backs, the bedding made from our bodies.
  • The destroyers will rarely be held accountable. Mostly they will receive pensions.
  • you know now, if you did not before, that the police departments of your country have been endowed with the authority to destroy your body. It does not matter if the destruction is the result of an unfortunate overreaction. It does not matter if it originates in a misunderstanding. It does not matter if the destruction springs from a foolish policy
  • But a society that protects some people through a safety net of schools, government-backed home loans, and ancestral wealth but can only protect you with the club of criminal justice has either failed at enforcing its good intentions or has succeeded at something much darker.
  • It is hard to face this. But all our phrasing—race relations, racial chasm, racial justice, racial profiling, white privilege, even white supremacy—serves to obscure that racism is a visceral experience, that it dislodges brains, blocks airways, rips muscle, extracts organs, cracks bones, breaks teeth
  • You must never look away from this. You must always remember that the sociology, the history, the economics, the graphs, the charts, the regressions all land, with great violence, upon the body.
  • And should one live in such a body? What should be our aim beyond meager survival of constant, generational, ongoing battery and assault? I have asked this question all my life.
  • The question is unanswerable, which is not to say futile. The greatest reward of this constant interrogation, of confrontation with the brutality of my country, is that it has freed me from ghosts and myths.
  • I was afraid long before you, and in this I was unoriginal. When I was your age the only people I knew were black, and all of them were powerfully, adamantly, dangerously afraid. It was always right in front of me. The fear was there in the extravagant boys of my West Baltimore neighborhood
  • The fear lived on in their practiced bop, their slouching denim, their big T- shirts, the calculated angle of their baseball caps, a catalog of behaviors and garments enlisted to inspire the belief that these boys were in firm possession of everything they desired.
  • To be black in the Baltimore of my youth was to be naked before the elements of the world, before all the guns, fists, knives, crack, rape, and disease. The law did not protect us. And now, in your time, the law has become an excuse for stopping and frisking you, which is to say, for furthering the assault on your body
  • I remember being amazed that death could so easily rise up from the nothing of a boyish afternoon, billow up like fog. I knew that West Baltimore, where I lived; that the north side of Philadelphia, where my cousins lived; that the South Side of Chicago, where friends of my father lived, comprised a world apart. Somewhere out there beyond the firmament, past the asteroid belt, there were other worlds where children did not regularly fear for their bodies
  • There will surely always be people with straight hair and blue eyes, as there have been for all history. But some of these straight-haired people with blue eyes have been “black,” and this points to the great difference between their world and ours. We did not choose our fences. They were imposed on us by Virginia planters obsessed with enslaving as many Americans as possible. Now I saw that we had made something down here, in slavery, in Jim Crow, in ghettoes. At The Mecca I saw how we had taken their one-drop rule and flipped it. They made us into a race. We made ourselves into a people.
  • I came to understand that my country was a galaxy, and this galaxy stretched from the pandemonium of West Baltimore to the happy hunting grounds of Mr. Belvedere. I obsessed over the distance between that other sector of space and my own. I knew that my portion of the American galaxy, where bodies were enslaved by a tenacious gravity, was black and that the other, liberated portion was not. I knew that some inscrutable energy preserved the breach. I felt, but did not yet understand, the relation between that other world and me. And I felt in this a cosmic injustice, a profound cruelty, which infused an abiding, irrepressible desire to unshackle my body and achieve the velocity of escape.
  • Before I could escape, I had to survive, and this could only mean a clash with the streets, by which I mean not just physical blocks, nor simply the people packed into them, but the array of lethal puzzles and strange perils which seem to rise up from the asphalt itself. The streets transform every ordinary day into a series of trick questions, and every incorrect answer risks a beat-down, a shooting, or a pregnancy. No one survives unscathed
  • When I was your age, fully one-third of my brain was concerned with who I was walking to school with, our precise number, the manner of our walk, the number of times I smiled, who or what I smiled at, who offered a pound and who did not—all of which is to say that I practiced the culture of the streets, a culture concerned chiefly with securing the body.
  • Why were only our heroes nonviolent? Back then all I could do was measure these freedom-lovers by what I knew. Which is to say, I measured them against children pulling out in the 7-Eleven parking lot, against parents wielding extension cords, and the threatening intonations of armed black gangs saying, “Yeah, nigger, what’s up now?” I judged them against the country I knew, which had acquired the land through murder and tamed it under slavery, against the country whose armies fanned out across the world to extend their dominion. The world, the real one, was civilization secured and ruled by savage means. How could the schools valorize men and women whose values society actively scorned? How could they send us out into the streets of Baltimore, knowing all that they were, and then speak of nonviolence?
  • the beauty of the black body was never celebrated in movies, in television, or in the textbooks I’d seen as a child. Everyone of any import, from Jesus to George Washington, was white. This was why your grandparents banned Tarzan and the Lone Ranger and toys with white faces from the house. They were rebelling against the history books that spoke of black people only as sentimental “firsts”—first black four-star general, first black congressman, first black mayor—always presented in the bemused manner of a category of Trivial Pursuit.
  • Serious history was the West, and the West was white. This was all distilled for me in a quote I once read, from the novelist Saul Bellow. I can’t remember where I read it, or when—only that I was already at Howard. “Who is the Tolstoy of the Zulus?,” Bellow quipped
  • this view of things was connected to the fear that passed through the generations, to the sense of dispossession. We were black, beyond the visible spectrum, beyond civilization. Our history was inferior because we were inferior, which is to say our bodies were inferior. And our inferior bodies could not possibly be accorded the same respect as those that built the West. Would it not be better, then, if our bodies were civilized, improved, and put to some legitimate Christian use?
  • now I looked back on my need for a trophy case, on the desire to live by the standards of Saul Bellow, and I felt that this need was not an escape but fear again—fear that “they,” the alleged authors and heirs of the universe, were right. And this fear ran so deep that we accepted their standards of civilization and humanity.
  • “Tolstoy is the Tolstoy of the Zulus,” wrote Wiley. “Unless you find a profit in fencing off universal properties of mankind into exclusive tribal ownership.” And there it was. I had accepted Bellow’s premise. In fact, Bellow was no closer to Tolstoy than I was to Nzinga. And if I were closer it would be because I chose to be, not because of destiny written in DNA. My great error was not that I had accepted someone else’s dream but that I had accepted the fact of dreams, the need for escape, and the invention of racecraft.
  • still and all I knew that we were something, that we were a tribe—on one hand, invented, and on the other, no less real. The reality was out there on the Yard, on the first warm day of spring when it seemed that every sector, borough, affiliation, county, and corner of the broad diaspora had sent a delegate to the great world party
  • I could see now that that world was more than a photonegative of that of the people who believe they are white. “White America” is a syndicate arrayed to protect its exclusive power to dominate and control our bodies. Sometimes this power is direct (lynching), and sometimes it is insidious (redlining). But however it appears, the power of domination and exclusion is central to the belief in being white, and without it, “white people” would cease to exist for want of reasons
  • There is nothing uniquely evil in these destroyers or even in this moment. The destroyers are merely men enforcing the whims of our country, correctly interpreting its heritage and legacy. This legacy aspires to the shackling of black bodies
  • Think of all the embraces, all the private jokes, customs, greetings, names, dreams, all the shared knowledge and capacity of a black family injected into that vessel of flesh and bone. And think of how that vessel was taken, shattered on the concrete, and all its holy contents, all that had gone into each of them, was sent flowing back to the earth. It is terrible to truly see our particular beauty, Samori, because then you see the scope of the loss. But you must push even further. You must see that this loss is mandated by the history of your country, by the Dream of living white.
  • I don’t know if you remember how the film we saw at the Petersburg Battlefield ended as though the fall of the Confederacy were the onset of a tragedy, not jubilee. I doubt you remember the man on our tour dressed in the gray wool of the Confederacy, or how every visitor seemed most interested in flanking maneuvers, hardtack, smoothbore rifles, grapeshot, and ironclads, but virtually no one was interested in what all of this engineering, invention, and design had been marshaled to achieve. You were only 10 years old. But even then I knew that I must trouble you, and this meant taking you into rooms where people would insult your intelligence, where thieves would try to enlist you in your own robbery and disguise their burning and looting as Christian charity. But robbery is what this is, what it always was.
  • American reunion was built on a comfortable narrative that made enslavement into benevolence, white knights of body snatchers, and the mass slaughter of the war into a kind of sport in which one could conclude that both sides conducted their affairs with courage, honor, and élan. This lie of the Civil War is the lie of innocence, is the Dream.
  • I, like every kid I knew, loved The Dukes of Hazzard. But I would have done well to think more about why two outlaws, driving a car named the General Lee, must necessarily be portrayed as “just some good ole boys, never meanin’ no harm”—a mantra for the Dreamers if there ever was one. But what one “means” is neither important nor relevant. It is not necessary that you believe that the officer who choked Eric Garner set out that day to destroy a body. All you need to understand is that the officer carries with him the power of the American state and the weight of an American legacy, and they necessitate that of the bodies destroyed every year, some wild and disproportionate number of them will be black.
  • Here is what I would like for you to know: In America, it is traditional to destroy the black body—it is heritage. Enslavement was not merely the antiseptic borrowing of labor—it is not so easy to get a human being to commit their body against its own elemental interest. And so enslavement must be casual wrath and random manglings, the gashing of heads and brains blown out over the river as the body seeks to escape. It must be rape so regular as to be industrial. There is no uplifting way to say this.
  • It had to be blood. It had to be the thrashing of kitchen hands for the crime of churning butter at a leisurely clip. It had to be some woman “chear’d ... with thirty lashes a Saturday last and as many more a Tuesday again.” It could only be the employment of carriage whips, tongs, iron pokers, handsaws, stones, paperweights, or whatever might be handy to break the black body, the black family, the black community, the black nation. The bodies were pulverized into stock and marked with insurance. And the bodies were an aspiration, lucrative as Indian land, a veranda, a beautiful wife, or a summer home in the mountains. For the men who needed to believe themselves white, the bodies were the key to a social club, and the right to break the bodies was the mark of civilization.
  • “The two great divisions of society are not the rich and poor, but white and black,” said the great South Carolina senator John C. Calhoun. “And all the former, the poor as well as the rich, belong to the upper class, and are respected and treated as equals.” And there it is—the right to break the black body as the meaning of their sacred equality. And that right has always given them meaning, has always meant that there was someone down in the valley because a mountain is not a mountain if there is nothing below.
  • There is no them without you, and without the right to break you they must necessarily fall from the mountain, lose their divinity, and tumble out of the Dream. And then they would have to determine how to build their suburbs on something other than human bones, how to angle their jails toward something other than a human stockyard, how to erect a democracy independent of cannibalism. I would like to tell you that such a day approaches when the people who believe themselves to be white renounce this demon religion and begin to think of themselves as human. But I can see no real promise of such a day. We are captured, brother, surrounded by the majoritarian bandits of America. And this has happened here, in our only home, and the terrible truth is that we cannot will ourselves to an escape on our own.
  • I think now of the old rule that held that should a boy be set upon in someone else’s chancy hood, his friends must stand with him, and they must all take their beating together. I now know that within this edict lay the key to all living. None of us were promised to end the fight on our feet, fists raised to the sky. We could not control our enemies’ number, strength, or weaponry. Sometimes you just caught a bad one. But whether you fought or ran, you did it together, because that is the part that was in our control. What we must never do is willingly hand over our own bodies or the bodies of our friends. That was the wisdom: We knew we did not lay down the direction of the street, but despite that, we could—and must—fashion the way of our walk. And that is the deeper meaning of your name—that the struggle, in and of itself, has meaning.
  • I have raised you to respect every human being as singular, and you must extend that same respect into the past. Slavery is not an indefinable mass of flesh. It is a particular, specific enslaved woman, whose mind is as active as your own, whose range of feeling is as vast as your own; who prefers the way the light falls in one particular spot in the woods, who enjoys fishing where the water eddies in a nearby stream, who loves her mother in her own complicated way, thinks her sister talks too loud, has a favorite cousin, a favorite season, who excels at dressmaking and knows, inside herself, that she is as intelligent and capable as anyone. “Slavery” is this same woman born in a world that loudly proclaims its love of freedom and inscribes this love in its essential texts, a world in which these same professors hold this woman a slave, hold her mother a slave, her father a slave, her daughter a slave, and when this woman peers back into the generations all she sees is the enslaved. She can hope for more. She can imagine some future for her grandchildren. But when she dies, the world—which is really the only world she can ever know—ends. For this woman, enslavement is not a parable. It is damnation. It is the never-ending night. And the length of that night is most of our history. Never forget that we were enslaved in this country longer than we have been free. Never forget that for 250 years black people were born into chains—whole generations followed by more generations who knew nothing but chains.
  • You must resist the common urge toward the comforting narrative of divine law, toward fairy tales that imply some irrepressible justice. The enslaved were not bricks in your road, and their lives were not chapters in your redemptive history. They were people turned to fuel for the American machine. Enslavement was not destined to end, and it is wrong to claim our present circumstance—no matter how improved—as the redemption for the lives of people who never asked for the posthumous, untouchable glory of dying for their children. Our triumphs can never redeem this. Perhaps our triumphs are not even the point. Perhaps struggle is all we have
  • I am not a cynic. I love you, and I love the world, and I love it more with every new inch I discover. But you are a black boy, and you must be responsible for your body in a way that other boys cannot know. Indeed, you must be responsible for the worst actions of other black bodies, which, somehow, will always be assigned to you. And you must be responsible for the bodies of the powerful—the policeman who cracks you with a nightstick will quickly find his excuse in your furtive movements. You have to make your peace with the chaos, but you cannot lie.
  • “I could have you arrested,” he said. Which is to say: “One of your son’s earliest memories will be watching the men who sodomized Abner Louima and choked Anthony Baez cuff, club, tase, and break you.” I had forgotten the rules, an error as dangerous on the Upper West Side of Manhattan as on the West Side of Baltimore. One must be without error out here. Walk in single file. Work quietly. Pack an extra No. 2 pencil. Make no mistakes.
  • the price of error is higher for you than it is for your countrymen, and so that America might justify itself, the story of a black body’s destruction must always begin with his or her error, real or imagined—with Eric Garner’s anger, with Trayvon Martin’s mythical words (“You are gonna die tonight”), with Sean Bell’s mistake of running with the wrong crowd, with me standing too close to the small-eyed boy pulling out.
  • You are called to struggle, not because it assures you victory but because it assures you an honorable and sane life
  • I am sorry that I cannot save you—but not that sorry. Part of me thinks that your very vulnerability brings you closer to the meaning of life, just as for others, the quest to believe oneself white divides them from it. The fact is that despite their dreams, their lives are also not inviolable. When their own vulnerability becomes real—when the police decide that tactics intended for the ghetto should enjoy wider usage, when their armed society shoots down their children, when nature sends hurricanes against their cities—they are shocked by the rages of logic and the natural world in a way that those of us who were born and bred to understand cause and effect can never be.
  • I would not have you live like them. You have been cast into a race in which the wind is always at your face and the hounds are always at your heels. And to varying degrees this is true of all life. The difference is that you do not have the privilege of living in ignorance of this essential fact.
  • I never wanted you to be twice as good as them, so much as I have always wanted you to attack every day of your brief bright life determined to struggle. The people who must believe they are white can never be your measuring stick. I would not have you descend into your own dream. I would have you be a conscious citizen of this terrible and beautiful world.
Javier E

The Moral Ill Effects of Teaching Economics | Amitai Etzioni - 1 views

  • the hypothesis that teaching economics is debasing people's morality
  • They designed a game where participants were given an allotment of tokens to divide between a private account and a public fund
  • the game was designed to promote free-riding: the socially optimal behavior would be to contribute to the public fund, but the personal advantage was in investing everything in the private fund (as long as the others did not catch on or make the same move). [A toy payoff calculation after this entry’s annotations makes this incentive concrete.]
  • most subjects divided their tokens nearly equally between the public and private accounts
  • Economics students, by contrast, invested only 20 percent of their tokens in the public fund, on average.
  • Three quarters of non-economists reported that a "fair" investment of tokens would necessitate putting at least half of their tokens in the public fund. A third of economists didn't answer the question or gave "complex, uncodable responses." The remaining economics students were much more likely than their non-economist peers to say that "little or no contribution was 'fair'."
  • Other studies have found economics students to exhibit a stronger tendency towards anti-social positions compared to their peers.
  • Carter and Irons had both economics students and non-economics students play the "ultimatum" game -- a two-player game where one player is given a sum of money to divide between the two. The other player is then given a chance to accept or reject the offer; if she accepts it, then each player receives the portion of money proposed by the offerer. If she declines, then neither player gets any money. Carter and Irons found that, relative to non-economics students, economics students were much more likely to offer their partners small sums, and, thus, deviate from a "fair" 50/50 split.
  • Finally, researchers had both economics and non-economics students fill out two "honesty surveys" -- one at the start of the semester and one at the conclusion -- regarding how likely they were to either report being undercharged for a purchase or return found money to its owner. The authors found that, after taking an economics class, students' responses to the end-of-the-semester survey were more likely to reflect a decline in honest behavior than those of students who studied astronomy.
  • Other studies supported these key findings. They found that economics students are less likely to consider a vendor who increases the price of bottled water on a hot day to be acting "unfairly." Economics students who played a lottery game were willing to commit less of their potential winnings to fund a consolation prize for losers than were their peers. And such students were significantly more willing to accept bribes than other students. Moreover, economics students valued personal achievement and power more than their peers while attributing less importance to social justice and equality.
  • results show that it is not just selection that is responsible for the reported increase in immoral attitudes
  • A later study supports this conclusion. It found ideological differences between lower-level and upper-level economics students that are similar in kind to the measured differences between the ideology of economics students as a whole and their peers: upper-level students are even less likely to support egalitarian solutions to distribution problems than lower-level students, suggesting that time spent studying economics does have an indoctrination effect.
  • The problem is not only that students are exposed to such views, but that there are no "balancing" courses taught in typical American colleges, in which a different view of economics is presented. Moreover, while practically all economics classes are taught from the "neoclassical" (libertarian, self-centered) viewpoint, in classes by non-economists -- e.g., in social philosophy, political science, and sociology -- a thousand flowers bloom, such that a great variety of approaches are advanced, thereby leaving students with a cacophony of conflicting pro-social views. What is needed is a systematic pro-social economics that combines appreciation for the common good and for others as well as for the service of self.
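To make the free-riding logic concrete, here is a toy payoff calculation for the token game described in the annotations above. The endowment, multiplier, and group size are invented for illustration; the excerpt does not give the study's actual parameters. The essential structure is that the public fund's multiplier is greater than 1 but smaller than the number of players.

```python
# Toy public-goods game: tokens kept privately pay off 1:1; tokens in
# the public fund are scaled up and split evenly among all players.
# Parameters are illustrative assumptions, not the study's values.
ENDOWMENT = 10     # tokens per player
MULTIPLIER = 1.5   # public fund scale-up (1 < MULTIPLIER < N_PLAYERS)
N_PLAYERS = 4

def payoff(mine: int, others: list[int]) -> float:
    """Tokens I kept plus my equal share of the scaled public fund."""
    fund = mine + sum(others)
    return (ENDOWMENT - mine) + MULTIPLIER * fund / N_PLAYERS

print(payoff(10, [10, 10, 10]))  # 15.00: everyone contributes fully
print(payoff(0,  [10, 10, 10]))  # 21.25: free-riding beats contributing
print(payoff(0,  [0, 0, 0]))     # 10.00: universal free-riding is worst
# Whatever the others do, each token you contribute costs you 1 and
# returns only MULTIPLIER / N_PLAYERS = 0.375 -- so free-riding always
# pays individually, even though full contribution is best for the group.
```
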
qkirkpatrick

Stanford psychologist: People from different cultures express sympathy differently - 0 views

  • Sympathy is influenced by cultural differences, new Stanford research shows.
  • The research showed that how much people wanted to avoid negative emotion influenced their expressions of sympathy more than how negative they actually felt, wrote Stanford psychology researchers Tsai and Koopmann-Holm.
  • American sympathy cards contain less negative and more positive content than German sympathy cards.
  • Americans want to avoid negative states of mind more than Germans do.
  • Cultural differences in how much people want to avoid negative emotions play a key role in how Americans and Germans feel about focusing on the negative rather than the positive when expressing sympathy.
  • When people desire to avoid negative emotions, they focus less on the negative and more on the positive when responding to another person's suffering.
  • However, until now, Tsai said, no studies have specifically examined how culture shapes "different ways in which sympathy, compassion or other feelings of concern for another's suffering might be expressed."
  • Unlike when Americans talk about illness, Germans primarily focus on the negative, Tsai and Koopmann-Holm wrote. For example, the "Sturm und Drang" ("Storm and Drive") literary and musical movement in 18th-century Germany went beyond merely accepting negative emotions to actually glorifying them.
  • How culture affects someone's willingness to express the negatives in a situation rather than the positives.
Javier E

How Humans Ended Up With Freakishly Huge Brains | WIRED - 0 views

  • paleontologists documented one of the most dramatic transitions in human evolution. We might call it the Brain Boom. Humans, chimps and bonobos split from their last common ancestor between 6 and 8 million years ago.
  • Starting around 3 million years ago, however, the hominin brain began a massive expansion. By the time our species, Homo sapiens, emerged about 200,000 years ago, the human brain had swelled from about 350 grams to more than 1,300 grams.
  • In that 3-million-year sprint, the human brain almost quadrupled the size its predecessors had attained over the previous 60 million years of primate evolution.
  • There are plenty of theories, of course, especially regarding why: increasingly complex social networks, a culture built around tool use and collaboration, the challenge of adapting to a mercurial and often harsh climate
  • Although these possibilities are fascinating, they are extremely difficult to test.
  • Although it makes up only 2 percent of body weight, the human brain consumes a whopping 20 percent of the body’s total energy at rest. In contrast, the chimpanzee brain needs only half that.
  • contrary to long-standing assumptions, larger mammalian brains do not always have more neurons, and the ones they do have are not always distributed in the same way.
  • The human brain has 86 billion neurons in all: 69 billion in the cerebellum, a dense lump at the back of the brain that helps orchestrate basic bodily functions and movement; 16 billion in the cerebral cortex, the brain’s thick corona and the seat of our most sophisticated mental talents, such as self-awareness, language, problem solving and abstract thought; and 1 billion in the brain stem and its extensions into the core of the brain
  • In contrast, the elephant brain, which is three times the size of our own, has 251 billion neurons in its cerebellum, which helps manage a giant, versatile trunk, and only 5.6 billion in its cortex
  • primates evolved a way to pack far more neurons into the cerebral cortex than other mammals did
  • The great apes are tiny compared to elephants and whales, yet their cortices are far denser: Orangutans and gorillas have 9 billion cortical neurons, and chimps have 6 billion. Of all the great apes, we have the largest brains, so we come out on top with our 16 billion neurons in the cortex. [A short tally after this entry’s annotations collects these figures.]
  • “What kinds of mutations occurred, and what did they do? We’re starting to get answers and a deeper appreciation for just how complicated this process was.”
  • there was a strong evolutionary pressure to modify the human regulatory regions in a way that sapped energy from muscle and channeled it to the brain.
  • Accounting for body size and weight, the chimps and macaques were twice as strong as the humans. It’s not entirely clear why, but it is possible that our primate cousins get more power out of their muscles than we get out of ours because they feed their muscles more energy. “Compared to other primates, we lost muscle power in favor of sparing energy for our brains,” Bozek said. “It doesn’t mean that our muscles are inherently weaker. We might just have a different metabolism.”
  • a pioneering experiment. Not only were they going to identify relevant genetic mutations from our brain’s evolutionary past, they were also going to weave those mutations into the genomes of lab mice and observe the consequences.
  • Silver and Wray introduced the chimpanzee copy of HARE5 into one group of mice and the human edition into a separate group. They then observed how the embryonic mice brains grew.
  • After nine days of development, mice embryos begin to form a cortex, the outer wrinkly layer of the brain associated with the most sophisticated mental talents. On day 10, the human version of HARE5 was much more active in the budding mice brains than the chimp copy, ultimately producing a brain that was 12 percent larger
  • “It wasn’t just a couple mutations and—bam!—you get a bigger brain. As we learn more about the changes between human and chimp brains, we realize there will be lots and lots of genes involved, each contributing a piece to that. The door is now open to get in there and really start understanding. The brain is modified in so many subtle and nonobvious ways.”
  • As recent research on whale and elephant brains makes clear, size is not everything, but it certainly counts for something. The reason we have so many more cortical neurons than our great-ape cousins is not that we have denser brains, but rather that we evolved ways to support brains that are large enough to accommodate all those extra cells.
  • There’s a danger, though, in becoming too enamored with our own big heads. Yes, a large brain packed with neurons is essential to what we consider high intelligence. But it’s not sufficient
  • No matter how large the human brain grew, or how much energy we lavished upon it, it would have been useless without the right body. Three particularly crucial adaptations worked in tandem with our burgeoning brain to dramatically increase our overall intelligence: bipedalism, which freed up our hands for tool making, fire building and hunting; manual dexterity surpassing that of any other animal; and a vocal tract that allowed us to speak and sing.
  • Human intelligence, then, cannot be traced to a single organ, no matter how large; it emerged from a serendipitous confluence of adaptations throughout the body. Despite our ongoing obsession with the size of our noggins, the fact is that our intelligence has always been so much bigger than our brain.
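As a quick editorial aside, the numbers quoted in this entry can be tallied to make two claims explicit: the mass growth behind "almost quadrupled," and the cortical neuron counts behind "size is not everything." The snippet below only restates the article's figures; it adds no new data.

```python
# Brain-mass growth quoted above: ~350 g to ~1,300 g in 3 million years.
early_g, sapiens_g = 350, 1300
print(f"growth factor: {sapiens_g / early_g:.2f}x")  # ~3.71x, "almost quadrupled"

# Cortical neuron counts (billions), as given in the annotations.
cortex_bn = {"human": 16, "orangutan": 9, "gorilla": 9,
             "chimp": 6, "elephant": 5.6}
for species, n in sorted(cortex_bn.items(), key=lambda kv: -kv[1]):
    print(f"{species:>9}: {n} billion cortical neurons")
# The elephant brain is ~3x the mass of ours yet holds about a third
# as many cortical neurons: neuron packing, not sheer bulk, differs.
```
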
kushnerha

The Joy of Psyching Myself Out­ - The New York Times - 0 views

  • IS it possible to think scientifically and creatively at once? Can you be both a psychologist and a writer?
  • “A writer must be as objective as a chemist,” Anton Chekhov wrote in 1887. “He must abandon the subjective line; he must know that dung heaps play a very reasonable part in a landscape.”Chekhov’s chemist is a naturalist — someone who sees reality for what it is, rather than what it should be. In that sense, the starting point of the psychologist and the writer is the same: a curiosity that leads you to observe life in all its dimensions.
  • Without verification, we can’t always trust what we see — or rather, what we think we see. Whether we’re psychologists or writers (or anything else), our eyes are never the impartial eyes of Chekhov’s chemist. Our expectations, our wants and shoulds, get in the way. Take, once again, lying. Why do we think we know how liars behave? Liars should divert their eyes. They should feel ashamed and guilty and show the signs of discomfort that such feelings engender. And because they should, we think they do.
  • The desire for the world to be what it ought to be and not what it is permeates experimental psychology as much as writing, though. There’s experimental bias and the problem known in the field as “demand characteristics” — when researchers end up finding what they want to find by cuing participants to act a certain way. It’s also visible when psychologists choose to study one thing rather than another, dismiss evidence that doesn’t mesh with their worldview while embracing that which does. The subjectivity we tend to associate with the writerly way of looking may simply be more visible in that realm rather than exclusive to it.
  • “There is no other source of knowledge of the universe but the intellectual manipulation of carefully verified observations,” he said.
  • Intuition and inspiration, he went on, “can safely be counted as illusions, as fulfillments of wishes.” They are not to be relied on as evidence of any sort. “Science takes account of the fact that the mind of man creates such demands and is ready to trace their source, but it has not the slightest ground for thinking them justified.”
  • That is what both the psychologist and the writer should strive for: a self-knowledge that allows you to look in order to discover, without agenda, without preconception, without knowing or caring if what you’re seeing is wrong or right in your scheme of the world. It’s harder than it sounds. For one thing, you have to possess the self-knowledge that will allow you to admit when you’re wrong.
  • Most new inquiries never happened — in a sense, objectivity was more an ideal than a reality: each study was selected for a reason other than intrinsic interest.
  • Isolation precludes objectivity. It’s in the merging not simply of ways of seeing but also of modes of thought that a truly whole perception of reality may eventually emerge. Or at least that way we can realize its ultimate impossibility — and that’s not nothing, either.
Javier E

Dating Study: At What Age Are Men, Women Most Desirable? - The Atlantic - 0 views

  • “Three-quarters, or more, of people are dating aspirationally,” she says. And according to a new study, users of online-dating sites spend most of their time trying to contact people “out of their league.” In fact, most online-dating users tend to message people exactly 25 percent more desirable than they are.
  • “There’s so much folk wisdom about dating and courtship, and very little scientific evidence,” she told me recently. “My research comes out of realizing that with these large-scale data sets, we can shed light on a lot of these old dating aphorisms
  • Bruch and her colleagues analyzed thousands of messages exchanged on a “popular, free online-dating service” between more than 186,000 straight men and women. They looked only at four metro areas—New York, Boston, Chicago, and Seattle—and only at messages from January 2014.
  • imagine that you are a very desirable user. Your specific desirability rank would have been generated by two figures: whether other desirable people contacted you, and whether other desirable people responded when you contacted them. If you contacted a much less desirable person, their desirability score would rise; if they contacted you and you replied, then your score would fall. [A toy sketch after this entry’s annotations shows this kind of recursive scoring.]
  • The team had to analyze both first messages and first replies, because, well, men usually make the first move. “A defining feature of heterosexual online dating is that, in the vast majority of cases, it is men who establish the first contact—more than 80 percent of first messages are from men in our data set,” the study says. But “women reply very selectively to the messages they receive from men—their average reply rate is less than 20 percent—so women’s replies … can give us significant insight about who they are interested in.”
  • It found that—insofar as dating “leagues” are not different tiers of hotness, but a single ascending hierarchy of desirability—then they do seem to exist in the data. But people do not seem universally locked into them—and they can occasionally find success escaping from theirs.
  • “I mean, everybody knows—and as a sociologist, it’s been shown—that older women have a harder time in the dating market. But I hadn’t expected to see their desirability drop off from the time they’re 18 to the time they’re 65,”
  • Across the four cities and the thousands of users, consistent patterns around age, race, and education level emerge. White men and Asian women are consistently more desired than other users, while black women rank anomalously lower.
  • “what we are seeing is overwhelmingly the effect of white preferences,” she cautioned. “This site is predominantly white, 70 percent white. If this was a site that was 20 percent white, we may see a totally different desirability hierarchy.”
  • And Bruch emphasized that the hierarchy did not just depend on race, age, and education level: Because it is derived from user behavior, it “captures whatever traits people are responding to when they pursue partners. This will include traits like wittiness, genetic factors, or whatever else drives people to message,” she said.
  • Here are seven other not entirely happy takeaways from Bruch’s study: In the study, men’s desirability peaks at age 50. But women’s desirability starts high at age 18 and falls throughout their lifespan.
  • The key, Bruch said, is that “persistence pays off.”“Reply rates [to the average message] are between zero percent and 10 percent,” she told me. Her advice: People should note those extremely low reply rates and send out more greetings.
  • I was also surprised to see how flat men’s desirability was over the age distribution,” she said. “For men, it peaks around age 40 or 50. Especially in New York.”
  • “New York is a special case for men,” Bruch told me. “It’s the market with the highest fraction of women. But it’s also about it being an incredibly dense market.”
  • “Seattle presents the most unfavorable dating climate for men, with as many as two men for every woman in some segments,” the study says
  • Across all four cities, men and women generally tended to send longer messages to people who were more desirable than them. Women, especially, deployed this strategy.
  • But the only place it paid off—and the only people for whom it worked with statistically significant success—were men in Seattle. The longest messages in the study were sent by Seattle men, the study says,“and only Seattle men experience a payoff to writing longer messages.”
  • A more educated man is almost always more desirable, on average: Men with postgraduate degrees outperform men with bachelor’s degrees; men with bachelor’s degrees beat high-school graduates.
  • “But for women, an undergraduate degree is most desirable,” the study says. “Postgraduate education is associated with decreased desirability among women.”
  • men tended to use less positive language when messaging more desirable women. They may have stumbled upon this strategy through trial and error because “in all four cities, men experience slightly lower reply rates when they write more positively worded messages.”
  • “The most common behavior for both men and women is to contact members of the opposite sex who on average have roughly the same ranking as themselves,
  • “The most popular individual in our four cities, a 30-year-old woman living in New York, received 1504 messages during the period of observation,” the study says. This is “equivalent to one message every 30 min, day and night, for the entire month.”
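The scoring rule described in the annotation above (your score depends on whether much-contacted, desirable people contact you and reply to you) is the recursive logic of PageRank-style network centrality. Below is a minimal sketch on a made-up message graph; the toy data, the use of networkx's pagerank, and the damping value are illustrative assumptions, not the study's actual pipeline.

```python
# A PageRank-style desirability score on a toy "who sent a first
# message to whom" graph. Data and parameters are invented for
# illustration; assumes networkx is installed.
import networkx as nx

first_messages = [            # (sender, recipient)
    ("alice", "bob"), ("carol", "bob"), ("dave", "bob"),
    ("bob", "carol"), ("dave", "carol"),
    ("bob", "alice"), ("erin", "dave"),
]
G = nx.DiGraph(first_messages)

# PageRank weighs an incoming message by the score of its sender, so
# contact from a much-contacted user raises your score more than
# contact from an ignored one -- the recursion described above.
scores = nx.pagerank(G, alpha=0.85)
for user, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{user}: {s:.3f}")   # bob, contacted by three users, ranks highest
```
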
Javier E

Look At Me by Patricia Snow | Articles | First Things - 0 views

  • Maurice stumbles upon what is still the gold standard for the treatment of infantile autism: an intensive course of behavioral therapy called applied behavioral analysis that was developed by psychologist O. Ivar Lovaas at UCLA in the 1970s
  • in a little over a year’s time she recovers her daughter to the point that she is indistinguishable from her peers.
  • Let Me Hear Your Voice is not a particularly religious or pious work. It is not the story of a miracle or a faith healing
  • Maurice discloses her Catholicism, and the reader is aware that prayer undergirds the therapy, but the book is about the therapy, not the prayer. Specifically, it is about the importance of choosing methods of treatment that are supported by scientific data. Applied behavioral analysis is all about data: its daily collection and interpretation. The method is empirical, hard-headed, and results-oriented.
  • on a deeper level, the book is profoundly religious, more religious perhaps than its author intended. In this reading of the book, autism is not only a developmental disorder afflicting particular individuals, but a metaphor for the spiritual condition of fallen man.
  • Maurice’s autistic daughter is indifferent to her mother
  • In this reading of the book, the mother is God, watching a child of his wander away from him into darkness: a heartbroken but also a determined God, determined at any cost to bring the child back
  • the mother doesn’t turn back, concedes nothing to the condition that has overtaken her daughter. There is no political correctness in Maurice’s attitude to autism; no nod to “neurodiversity.” Like the God in Donne’s sonnet, “Batter my heart, three-personed God,” she storms the walls of her daughter’s condition
  • Like God, she sets her sights high, commits both herself and her child to a demanding, sometimes painful therapy (life!), and receives back in the end a fully alive, loving, talking, and laughing child
  • the reader realizes that for God, the harrowing drama of recovery is never a singular, or even a twice-told tale, but a perennial one. Every child of his, every child of Adam and Eve, wanders away from him into darkness
  • we have an epidemic of autism, or “autism spectrum disorder,” which includes classic autism (Maurice’s children’s diagnosis); atypical autism, which exhibits some but not all of the defects of autism; and Asperger’s syndrome, which is much more common in boys than in girls and is characterized by average or above average language skills but impaired social skills.
  • At the same time, all around us, we have an epidemic of something else. On the street and in the office, at the dinner table and on a remote hiking trail, in line at the deli and pushing a stroller through the park, people go about their business bent over a small glowing screen, as if praying.
  • This latter epidemic, or experiment, has been going on long enough that people are beginning to worry about its effects.
  • for a comprehensive survey of the emerging situation on the ground, the interested reader might look at Sherry Turkle’s recent book, Reclaiming Conversation: The Power of Talk in a Digital Age.
  • she also describes in exhaustive, chilling detail the mostly horrifying effects recent technology has had on families and workplaces, educational institutions, friendships and romance.
  • many of the promises of technology have not only not been realized, they have backfired. If technology promised greater connection, it has delivered greater alienation. If it promised greater cohesion, it has led to greater fragmentation, both on a communal and individual level.
  • If thinking that the grass is always greener somewhere else used to be a marker of human foolishness and a temptation to be resisted, today it is simply a possibility to be checked out. The new phones, especially, turn out to be portable Pied Pipers, irresistibly pulling people away from the people in front of them and the tasks at hand.
  • all it takes is a single phone on a table, even if that phone is turned off, for the conversations in the room to fade in number, duration, and emotional depth.
  • an infinitely malleable screen isn’t an invitation to stability, but to restlessness
  • Current media, and the fear of missing out that they foster (a motivator now so common it has its own acronym, FOMO), drive lives of continual interruption and distraction, of virtual rather than real relationships, and of “little” rather than “big” talk
  • if you may be interrupted at any time, it makes sense, as a student explains to Turkle, to “keep things light.”
  • we are reaping deficits in emotional intelligence and empathy; loneliness, but also fears of unrehearsed conversations and intimacy; difficulties forming attachments but also difficulties tolerating solitude and boredom
  • consider the testimony of the faculty at a reputable middle school where Turkle is called in as a consultant
  • The teachers tell Turkle that their students don’t make eye contact or read body language, have trouble listening, and don’t seem interested in each other, all markers of autism spectrum disorder
  • Like much younger children, they engage in parallel play, usually on their phones. Like autistic savants, they can call up endless information on their phones, but have no larger context or overarching narrative in which to situate it
  • Students are so caught up in their phones, one teacher says, “they don’t know how to pay attention to class or to themselves or to another person or to look in each other’s eyes and see what is going on.”
  • “It is as though they all have some signs of being on an Asperger’s spectrum. But that’s impossible. We are talking about a schoolwide problem.”
  • Can technology cause Asperger’s?
  • “It is not necessary to settle this debate to state the obvious. If we don’t look at our children and engage them in conversation, it is not surprising if they grow up awkward and withdrawn.”
  • In the protocols developed by Ivar Lovaas for treating autism spectrum disorder, every discrete trial in the therapy, every drill, every interaction with the child, however seemingly innocuous, is prefaced by this clear command: “Look at me!”
  • If absence of relationship is a defining feature of autism, connecting with the child is both the means and the whole goal of the therapy. Applied behavioral analysis does not concern itself with when exactly, how, or why a child becomes autistic, but tries instead to correct, do over, and even perhaps actually rewire what went wrong, by going back to the beginning
  • Eye contact—which we know is essential for brain development, emotional stability, and social fluency—is the indispensable prerequisite of the therapy, the sine qua non of everything that happens.
  • There are no shortcuts to this method; no medications or apps to speed things up; no machines that can do the work for us. This is work that only human beings can do
  • it must not only be started early and be sufficiently intensive, but it must also be carried out in large part by parents themselves. Parents must be trained and involved, so that the treatment carries over into the home and continues for most of the child’s waking hours.
  • there are foundational relationships that are templates for all other relationships, and for learning itself.
  • Maurice’s book, in other words, is not fundamentally the story of a child acquiring skills, though she acquires them perforce. It is the story of the restoration of a child’s relationship with her parents
  • it is also impossible to overstate the time and commitment that were required to bring it about, especially today, when we have so little time, and such a faltering, diminished capacity for sustained engagement with small children
  • The very qualities that such engagement requires, whether our children are sick or well, are the same qualities being bred out of us by technologies that condition us to crave stimulation and distraction, and by a culture that, through a perverse alchemy, has changed what was supposed to be the freedom to work anywhere into an obligation to work everywhere.
  • In this world of total work (the phrase is Josef Pieper’s), the work of helping another person become fully human may be work that is passing beyond our reach, as our priorities, and the technologies that enable and reinforce them, steadily unfit us for the work of raising our own young.
  • in Turkle’s book, as often as not, it is young people who are distressed because their parents are unreachable. Some of the most painful testimony in Reclaiming Conversation is the testimony of teenagers who hope to do things differently when they have children, who hope someday to learn to have a real conversation, and so on.
  • it was an older generation that first fell under technology’s spell. At the middle school Turkle visits, as at many other schools across the country, it is the grown-ups who decide to give every child a computer and deliver all course content electronically, meaning that they require their students to work from the very medium that distracts them, a decision the grown-ups are unwilling to reverse, even as they lament its consequences.
  • we have approached what Turkle calls the robotic moment, when we will have made ourselves into the kind of people who are ready for what robots have to offer. When people give each other less, machines seem less inhuman.
  • robot babysitters may not seem so bad. The robots, at least, will be reliable!
  • If human conversations are endangered, what of prayer, a conversation like no other? All of the qualities that human conversation requires—patience and commitment, an ability to listen and a tolerance for aridity—prayer requires in greater measure.
  • this conversation—the Church exists to restore. Everything in the traditional Church is there to facilitate and nourish this relationship. Everything breathes, “Look at me!”
  • there is a second path to God, equally enjoined by the Church, and that is the way of charity to the neighbor, but not the neighbor in the abstract.
  • “Who is my neighbor?” a lawyer asks Jesus in the Gospel of Luke. Jesus’s answer is, the one you encounter on the way.
  • Virtue is either concrete or it is nothing. Man’s path to God, like Jesus’s path on the earth, always passes through what the Jesuit Jean Pierre de Caussade called “the sacrament of the present moment,” which we could equally call “the sacrament of the present person,” the way of the Incarnation, the way of humility, or the Way of the Cross.
  • The tradition of Zen Buddhism expresses the same idea in positive terms: Be here now.
  • Both of these privileged paths to God, equally dependent on a quality of undivided attention and real presence, are vulnerable to the distracting eye-candy of our technologies
  • Turkle is at pains to show that multitasking is a myth, that anyone trying to do more than one thing at a time is doing nothing well. We could also call what she was doing multi-relating, another temptation or illusion widespread in the digital age. Turkle’s book is full of people who are online at the same time that they are with friends, who are texting other potential partners while they are on dates, and so on.
  • This is the situation in which many people find themselves today: thinking that they are special to someone because of something that transpired, only to discover that the other person is spread so thin, the interaction was meaningless. There is a new kind of promiscuity in the world, in other words, that turns out to be as hurtful as the old kind.
  • Who can actually multitask and multi-relate? Who can love everyone without diluting or cheapening the quality of love given to each individual? Who can love everyone without fomenting insecurity and jealousy? Only God can do this.
  • When an individual needs to be healed of the effects of screens and machines, it is real presence that he needs: real people in a real world, ideally a world of God’s own making
  • Nature is restorative, but it is conversation itself, unfolding in real time, that strikes these boys with the force of revelation. More even than the physical vistas surrounding them on a wilderness hike, unrehearsed conversation opens up for them new territory, open-ended adventures. “It was like a stream,” one boy says, “very ongoing. It wouldn’t break apart.”
  • in the waters of baptism, the new man is born, restored to his true parent, and a conversation begins that over the course of his whole life reminds man of who he is, that he is loved, and that someone watches over him always.
  • Even if the Church could keep screens out of her sanctuaries, people strongly attached to them would still be people poorly positioned to take advantage of what the Church has to offer. Anxious people, unable to sit alone with their thoughts. Compulsive people, accustomed to checking their phones, on average, every five and a half minutes. As these behaviors increase in the Church, what is at stake is man’s relationship with truth itself.
Javier E

The Virtues of Reality - The New York Times - 1 views

  • SINCE the 1990s, we’ve seen two broad social changes that few observers would have expected to happen together.
  • First, youth culture has become less violent, less promiscuous and more responsible. American childhood is safer than ever before. Teenagers drink and smoke less than previous generations. The millennial generation has fewer sexual partners than its parents, and the teen birthrate has traced a two-decade decline. Violent crime — a young person’s temptation — fell for 25 years before the recent post-Ferguson homicide spike. Young people are half as likely to have been in a fight as a generation ago. Teen suicides, binge drinking, hard drug use — all are down.
  • But over the same period, adulthood has become less responsible, less obviously adult. For the first time in over a century, more 20-somethings live with their parents than in any other arrangement. The marriage rate is way down, and despite a high out-of-wedlock birthrate American fertility just hit an all-time low. More and more prime-age workers are dropping out of the work force — men especially, and younger men more so than older men, though female work force participation has dipped as well.
  • ...6 more annotations...
  • I want to advance a technology-driven hypothesis: This mix of youthful safety and adult immaturity may be a feature of life in a society increasingly shaped by the internet’s virtual realities.
  • It is easy to see how online culture would make adolescent life less dangerous. Pornography to take the edge off teenage sexual appetite. Video games instead of fisticuffs or contact sports as an outlet for hormonal aggression. (Once it was feared that porn and violent media would encourage real-world aggression; instead they seem to be replacing it.) Sexting and selfie-enabled masturbation as a safer alternative to hooking up. Online hangouts instead of keggers in the field. More texting and driving, but less driving — one of the most dangerous teen activities — overall.
  • The question is whether this substitution is habit-forming and soul-shaping, and whether it extends beyond dangerous teen behavior to include things essential to long-term human flourishing — marriage, work, family, all that old-fashioned “meatspace” stuff.
  • It wasn’t so long ago that people worried about a digital divide, in which online access would be a luxury good that left the bottom half behind. But if anything, the virtual world looks more like an opiate for the masses
  • trends in the marketplace — ever-more-customized pornography, virtual realities that feel more and more immersive, devices and apps customized for addictive behavior — seem likely to overwhelm most attempts to enjoy the virtual only within limits.
  • Patricia Snow (yes, even columnists have mothers), in an essay for First Things earlier this year, suggested that any effective resistance to virtual reality’s encroachments would need to be moral and religious, not just pragmatic and managerial
Javier E

The Panic of 2020? Oh, I Made a Ton of Money-and So Did You - WSJ - 0 views

  • In a classic experiment in 1972, researchers asked people to estimate the likelihood that various positive and negative outcomes might result from President Richard Nixon’s upcoming trips to China and Russia that year. We now call those visits “historic” because they thawed decades of hostility between the U.S. and the communist powers. In advance, no one knew whether the trips would accomplish anything.
  • About two weeks after Nixon’s visits, 71% of people recalled putting better odds on his success than they had at the time. Four months on, 81% remembered being more sure Nixon would succeed than they had said beforehand.
  • One week after the verdict in the 1995 murder trial of O.J. Simpson, 58% of people in a study recalled predicting he would be found not guilty; a year afterward, 68% remembered saying he would be acquitted. In fact, only 48% of them had said so before the verdict.
  • ...5 more annotations...
  • In short, learning what did happen impedes you from retrieving what you thought would happen.
  • In 2002, psychologists asked nearly 1,000 Americans to recall how likely they had expected terrorism-related incidents—and other risky events—to be in the immediate aftermath of Sept. 11, 2001. After a year in which fears had mostly subsided, they remembered being much less pessimistic than they had been at the time.
  • “We’re biased to see ourselves in a positive light,” says Deborah Small, a psychologist at the Wharton School at the University of Pennsylvania. “We want to believe that we’re rational and smart. We’ll recall our past actions as more sensible than they were. We also give ourselves too much credit and don’t remember our mistakes as well as we do our successes.”
  • take what psychologist Daniel Kahneman calls “the outside view.” Rather than try to figure out exactly how bad this crisis will be, look at the broader set of historical precedents.
  • Since 1929, the S&P 500 has suffered 14 bear markets, defined by S&P Dow Jones Indices as losses of at least 20%. The shortest and shallowest was the 20% drop that lasted less than three months in late 1990. The deepest was the 86.2% collapse from September 1929 to June 1932; the longest, the 60% plunge from March 1937 to April 1942. On average, bear markets lasted 19 months and dealt a 39% loss.
Javier E

The meaning of life in a world without work | Technology | The Guardian - 0 views

  • As artificial intelligence outperforms humans in more and more tasks, it will replace humans in more and more jobs.
  • Many new professions are likely to appear: virtual-world designers, for example. But such professions will probably require more creativity and flexibility, and it is unclear whether 40-year-old unemployed taxi drivers or insurance agents will be able to reinvent themselves as virtual-world designers
  • The crucial problem isn’t creating new jobs. The crucial problem is creating new jobs that humans perform better than algorithms. Consequently, by 2050 a new class of people might emerge – the useless class. People who are not just unemployed, but unemployable.
  • ...15 more annotations...
  • The same technology that renders humans useless might also make it feasible to feed and support the unemployable masses through some scheme of universal basic income.
  • The real problem will then be to keep the masses occupied and content. People must engage in purposeful activities, or they go crazy. So what will the useless class do all day?
  • One answer might be computer games. Economically redundant people might spend increasing amounts of time within 3D virtual reality worlds, which would provide them with far more excitement and emotional engagement than the “real world” outside.
  • This, in fact, is a very old solution. For thousands of years, billions of people have found meaning in playing virtual reality games. In the past, we have called these virtual reality games “religions”.
  • Muslims and Christians go through life trying to gain points in their favorite virtual reality game. If you pray every day, you get points. If you forget to pray, you lose points. If by the end of your life you gain enough points, then after you die you go to the next level of the game (aka heaven).
  • As religions show us, the virtual reality need not be encased inside an isolated box. Rather, it can be superimposed on the physical reality. In the past this was done with the human imagination and with sacred books, and in the 21st century it can be done with smartphones.
  • Consumerism too is a virtual reality game. You gain points by acquiring new cars, buying expensive brands and taking vacations abroad, and if you have more points than everybody else, you tell yourself you won the game.
  • we saw two other kids on the street who were hunting the same Pokémon, and we almost got into a fight with them. It struck me how similar the situation was to the conflict between Jews and Muslims about the holy city of Jerusalem. When you look at the objective reality of Jerusalem, all you see are stones and buildings. There is no holiness anywhere. But when you look through the medium of smartbooks (such as the Bible and the Qur’an), you see holy places and angels everywhere.
  • In the end, the real action always takes place inside the human brain. Does it matter whether the neurons are stimulated by observing pixels on a computer screen, by looking outside the windows of a Caribbean resort, or by seeing heaven in our mind’s eyes?
  • Indeed, one particularly interesting section of Israeli society provides a unique laboratory for how to live a contented life in a post-work world. In Israel, a significant percentage of ultra-orthodox Jewish men never work. They spend their entire lives studying holy scriptures and performing religious rituals. They and their families don’t starve to death partly because the wives often work, and partly because the government provides them with generous subsidies. Though they usually live in poverty, government support means that they never lack for the basic necessities of life.
  • That’s universal basic income in action. Though they are poor and never work, in survey after survey these ultra-orthodox Jewish men report higher levels of life-satisfaction than any other section of Israeli society.
  • Hence virtual realities are likely to be key to providing meaning to the useless class of the post-work world. Maybe these virtual realities will be generated inside computers. Maybe they will be generated outside computers, in the shape of new religions and ideologies. Maybe it will be a combination of the two. The possibilities are endless
  • In any case, the end of work will not necessarily mean the end of meaning, because meaning is generated by imagining rather than by working.
  • People in 2050 will probably be able to play deeper games and to construct more complex virtual worlds than in any previous time in history.
  • But what about truth? What about reality? Do we really want to live in a world in which billions of people are immersed in fantasies, pursuing make-believe goals and obeying imaginary laws? Well, like it or not, that’s the world we have been living in for thousands of years already.
Javier E

In Defense of Facts - The Atlantic - 1 views

  • over 13 years, he has published a series of anthologies—of the contemporary American essay, of the world essay, and now of the historical American essay—that misrepresents what the essay is and does, that falsifies its history, and that contains, among its numerous selections, very little one would reasonably classify within the genre. And all of this to wide attention and substantial acclaim
  • D’Agata’s rationale for his “new history,” to the extent that one can piece it together from the headnotes that preface each selection, goes something like this. The conventional essay, nonfiction as it is, is nothing more than a delivery system for facts. The genre, as a consequence, has suffered from a chronic lack of critical esteem, and thus of popular attention. The true essay, however, deals not in knowing but in “unknowing”: in uncertainty, imagination, rumination; in wandering and wondering; in openness and inconclusion
  • Every piece of this is false in one way or another.
  • ...31 more annotations...
  • There are genres whose principal business is fact—journalism, history, popular science—but the essay has never been one of them. If the form possesses a defining characteristic, it is that the essay makes an argument
  • That argument can rest on fact, but it can also rest on anecdote, or introspection, or cultural interpretation, or some combination of all these and more
  • what makes a personal essay an essay and not just an autobiographical narrative is precisely that it uses personal material to develop, however speculatively or intuitively, a larger conclusion.
  • Nonfiction is the source of the narcissistic injury that seems to drive him. “Nonfiction,” he suggests, is like saying “not art,” and if D’Agata, who has himself published several volumes of what he refers to as essays, desires a single thing above all, it is to be known as a maker of art.
  • D’Agata tells us that the term has been in use since about 1950. In fact, it was coined in 1867 by the staff of the Boston Public Library and entered widespread circulation after the turn of the 20th century. The concept’s birth and growth, in other words, did coincide with the rise of the novel to literary preeminence, and nonfiction did long carry an odor of disesteem. But that began to change at least as long ago as the 1960s, with the New Journalism and the “nonfiction novel.”
  • What we really seem to get in D’Agata’s trilogy, in other words, is a compendium of writing that the man himself just happens to like, or that he wants to appropriate as a lineage for his own work.
  • What it’s like is abysmal: partial to trivial formal experimentation, hackneyed artistic rebellion, opaque expressions of private meaning, and modish political posturing
  • If I bought a bag of chickpeas and opened it to find that it contained some chickpeas, some green peas, some pebbles, and some bits of goat poop, I would take it back to the store. And if the shopkeeper said, “Well, they’re ‘lyric’ chickpeas,” I would be entitled to say, “You should’ve told me that before I bought them.”
  • when he isn’t cooking quotes or otherwise fudging the record, he is simply indifferent to issues of factual accuracy, content to rely on a mixture of guesswork, hearsay, and his own rather faulty memory.
  • His rejoinders are more commonly a lot more hostile—not to mention juvenile (“Wow, Jim, your penis must be so much bigger than mine”), defensive, and in their overarching logic, deeply specious. He’s not a journalist, he insists; he’s an essayist. He isn’t dealing in anything as mundane as the facts; he’s dealing in “art, dickhead,” in “poetry,” and there are no rules in art.
  • D’Agata replies that there is something between history and fiction. “We all believe in emotional truths that could never hold water, but we still cling to them and insist on their relevance.” The “emotional truths” here, of course, are D’Agata’s, not Presley’s. If it feels right to say that tae kwon do was invented in ancient India (not modern Korea, as Fingal discovers it was), then that is when it was invented. The term for this is truthiness.
  • D’Agata clearly wants to have it both ways. He wants the imaginative freedom of fiction without relinquishing the credibility (and for some readers, the significance) of nonfiction. He has his fingers crossed, and he’s holding them behind his back. “John’s a different kind of writer,” an editor explains to Fingal early in the book. Indeed he is. But the word for such a writer isn’t essayist. It’s liar.
  • The point of all this nonsense, and a great deal more just like it, is to advance an argument about the essay and its history. The form, D’Agata’s story seems to go, was neglected during the long ages that worshiped “information” but slowly emerged during the 19th and 20th centuries as artists learned to defy convention and untrammel their imaginations, coming fully into its own over the past several decades with the dawning recognition of the illusory nature of knowledge.
  • Most delectable is when he speaks about “the essay’s traditional ‘five-paragraph’ form.” I almost fell off my chair when I got to that one. The five-paragraph essay—introduction, three body paragraphs, conclusion; stultifying, formulaic, repetitive—is the province of high-school English teachers. I have never met one outside of a classroom, and like any decent college writing instructor, I never failed to try to wean my students away from them. The five-paragraph essay isn’t an essay; it’s a paper.
  • What he fails to understand is that facts and the essay are not antagonists but siblings, offspring of the same historical moment
  • By ignoring the actual contexts of his selections, and thus their actual intentions, D’Agata makes the familiar contemporary move of imposing his own conceits and concerns upon the past. That is how ethnography turns into “song,” Socrates into an essayist, and the whole of literary history into a single man’s “emotional truth.”
  • The history of the essay is indeed intertwined with “facts,” but in a very different way than D’Agata imagines. D’Agata’s mind is Manichaean. Facts bad, imagination good
  • When he refers to his selections as essays, he does more than falsify the essay as a genre. He also effaces all the genres that they do belong to: not only poetry, fiction, journalism, and travel, but, among his older choices, history, parable, satire, the sermon, and more—genres that possess their own particular traditions, conventions, and expectations.
  • one needs to recognize that facts themselves have a history.
  • Facts are not just any sort of knowledge, such as also existed in the ancient and medieval worlds. A fact is a unit of information that has been established through uniquely modern methods
  • Fact, etymologically, means “something done”—that is, an act or deed
  • It was only in the 16th century—an age that saw the dawning of a new empirical spirit, one that would issue not only in modern science, but also in modern historiography, journalism, and scholarship—that the word began to signify our current sense of “real state of things.”
  • It was at this exact time, and in this exact spirit, that the essay was born. What distinguished Montaigne’s new form—his “essays” or attempts to discover and publish the truth about himself—was not that it was personal (precursors like Seneca also wrote personally), but that it was scrupulously investigative. Montaigne was conducting research into his soul, and he was determined to get it right.
  • His famous motto, Que sais-je?—“What do I know?”—was an expression not of radical doubt but of the kind of skepticism that fueled the modern revolution in knowledge.
  • It is no coincidence that the first English essayist, Galileo’s contemporary Francis Bacon, was also the first great theorist of science.
  • That knowledge is problematic—difficult to establish, labile once created, often imprecise and always subject to the limitations of the human mind—is not the discovery of postmodernism. It is a foundational insight of the age of science, of fact and information, itself.
  • The point is not that facts do not exist, but that they are unstable (and are becoming more so as the pace of science quickens). Knowledge is always an attempt. Every fact was established by an argument—by observation and interpretation—and is susceptible to being overturned by a different one
  • A fact, you might say, is nothing more than a frozen argument, the place where a given line of investigation has come temporarily to rest.
  • Sometimes those arguments are scientific papers. Sometimes they are news reports, which are arguments with everything except the conclusions left out (the legwork, the notes, the triangulation of sources—the research and the reasoning).
  • When it comes to essays, though, we don’t refer to those conclusions as facts. We refer to them as wisdom, or ideas
  • the essay draws its strength not from separating reason and imagination but from putting them in conversation. A good essay moves fluidly between thought and feeling. It subjects the personal to the rigors of the intellect and the discipline of external reality. The truths it finds are more than just emotional.
Javier E

The best time of day - and year - to work most effectively - The Washington Post - 0 views

  • Some of us are larks -- some of us are owls. But if you look at distribution, most of us are a little bit of both — what I call “third birds.”
  • There's a period of the day when we’re at our peak, and that's best for doing analytic tasks, things like writing a report or auditing a financial statement. There's the trough, which is the dip -- that’s not good for anything. And then there’s recovery, which is less optimal, but we do better at insight and creativity tasks.
  • the bigger issue here is that we have thought of "when" as a second order question. We take questions of how we do things, what we do, and who I do it with very seriously, but we stick the "when" questions over at the kids’ table.
  • ...14 more annotations...
  • What is it about a new year? How does our psychology influence how we think about that and making fresh starts? We do what social psychologists call temporal accounting -- that is, we have a ledger in our head of how we are spending our time. What we’re trying to do, in some cases, is relegate our previous selves to the past: This year we’re going to do a lot better.
  • breaks are much more important than we realize.
  • Many hard-core workplaces think of breaks as a deviation from performance, when in fact the science of breaks tells us they’re a part of performance.
  • Research shows us that social breaks are better than solo breaks -- taking a break with somebody else is more restorative than doing it on your own. A break that involves movement is better than a stationary one. And then there's the restorative power in nature. Simply going outside rather than being inside is better; even just being able to look out a window during a break helps. And there's the importance of being fully detached.
  • Every day I write down two breaks that I’m going to take. I make a 'break list,' and I try to treat them with the same reverence with which I’d treat scheduled meetings. We would never skip a meeting.
  • When you're giving feedback to employees, should you give good news or bad news first?
  • Here’s where you should go first: if you’re not the default choice.
  • If you are the default choice, you’re better off not going first. What happens is that early in a process, people are more likely to be open-minded, to challenge assumptions. But over time, they wear out, and they’re more likely to go with the default choice.
  • Also, if you’re operating in an uncertain environment -- and this is actually really important -- where the criteria for selections are not fully sharp, you’re better off going at the end. In the beginning, the judges are still trying to figure out what they want.
  • In fact, what researchers have found is that at the beginning, project teams pretty much do nothing. They bicker, they dicker. Yet astonishingly, many project teams she followed ended up really getting started in earnest at the exact midpoint. If you give a team 34 days, they’ll get started in earnest on day 17. This is actually a big shift in the way organizational scholars thought about how teams work.
  • There are two key things a leader can do at a midpoint. One is to identify it to make it salient: Say "ok guys, it’s day 17 of this 35 day project. We better get going."
  • The second comes from research on basketball. It shows that when teams are ahead at the midpoint, they get complacent. When they’re way behind at the midpoint, they get demoralized. But when they’re a little behind, it can be galvanizing. So what leaders can do is suggest hey, we’re a little bit behind.
  • One of the issues you explore is when it pays to go first — whether you’re up for a competitive pitch or trying to get a job. When is it good to go first?
  • If you ask people what they prefer, four out of five prefer getting the bad news first. The reason has to do with endings. Given the choice, human beings prefer endings that elevate. We prefer endings that go up, that have a rising sequence rather than a declining sequence.
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • ...72 more annotations...
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts.
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way.
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency.
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
caelengrubb

11 mind-blowing facts about the US economy | Markets Insider - 1 views

  • For more than a century, the United States has been the world's economic powerhouse.
  • The US is on the verge of its longest economic expansion on record
  • Last May, the US economy's streak of more than eight years of economic growth became the nation's second longest on record. It's been a slow climb following the Great Recession, but it's growth nonetheless.
  • ...22 more annotations...
  • But the US also just hit a record 13 straight years without 3% real GDP growth
  • While the US has had a record period of economic expansion, it's not setting the world on fire. It's been a record 13 straight years without reaching 3% real gross domestic product growth. The US has come close, hitting 2.9% growth in 2018, but America hasn't hit real GDP growth of 3% since 2005, when it grew 3.5%.
  • The decade-long expansion has generated 20 million jobs
  • With economic growth stretching the past decade, key figures continue to get better. A 3.4% year-over-year wage growth is the strongest in more than a decade, a good sign as stagnant wages have kept the US middle class at bay
  • Still, the jobless rate fell to 3.8%
  • Sleep deprivation costs the US economy billions of dollars
  • More than a third of the US adult population doesn't get enough sleep, and that costs the US $411 billion through the loss of 1.2 million work days each year.
  • The lack of sleep can come from a variety of factors, whether it's overworking, poor health habits, or even the horrid blue light from electronics
  • About $100,000 separates the middle class from the upper class
  • In 2011, 51% of Americans were considered middle class, and that number grew slightly to 52% in 2016
  • Generation Z might spend as much as $143 billion next year
  • Generation Z, the population born between 1997 and 2012, will make up 40% of US consumers by next year.
  • The average car part crosses into Mexico and Canada eight times in production
  • Mexico is the top trade partner, with the US exporting $21.9 billion worth of products to its southern neighbor and importing $27.7 billion, making up 14.8% of all US trade. Canada, meanwhile, makes up 13.8% of US trade as it imports $22.6 billion worth of American goods and sends in $23.4 billion
  • If California were a country, it would have the fifth highest GDP in the world
  • With a gross domestic product of $2.747 trillion, California would only trail Germany, Japan, China, and the US as a whole.
  • The US spends more on defense than the next seven nations combined
  • That $610 billion is good for 15% of all federal spending
  • The US national debt is at an all-time high
  • In February, US government debt hit an all-time high of $22 trillion
  • The sports industry is worth nearly $75 billion
  • A sports-industry report back in 2015 predicted the market in North America would be worth more than $73.5 billion by this year.
caelengrubb

The Linguistic Evolution of 'Like' - The Atlantic - 0 views

  • In our mouths or in print, in villages or in cities, in buildings or in caves, a language doesn’t sit still. It can’t. Language change has proceeded apace even in places known for preserving a language in amber.
  • Because we think of like as meaning “akin to” or “similar to,” kids decorating every sentence or two with it seems like overuse. After all, how often should a coherently minded person need to note that something is similar to something rather than just being that something?
  • First, let’s take like in just its traditional, accepted forms. Even in its dictionary definition, like is the product of stark changes in meaning that no one would ever guess.
  • ...19 more annotations...
  • To an Old English speaker, the word that later became like was the word for, of all things, “body.”
  • The word was lic, and lic was part of a word, gelic, that meant “with the body,” as in “with the body of,” which was a way of saying “similar to”—as in like
  • It was just that, step by step, the syllable lic, which to an Old English speaker meant “body,” came to mean, when uttered by people centuries later, “similar to”—and life went on.
  • Like has become a piece of grammar: It is the source of the suffix -ly.
  • Like has become a part of compounds. Likewise began as like plus a word, wise, which was different from the one meaning “smart when either a child or getting old.”
  • Dictionaries tell us it’s pronounced “like-MINE-did,” but I, for one, say “LIKE-minded” and have heard many others do so
  • Therefore, like is ever so much more than some isolated thing clinically described in a dictionary with a definition like “(preposition) ‘having the same characteristics or qualities as; similar to.’”
  • What we are seeing in like’s transformations today are just the latest chapters in a story that began with an ancient word that was supposed to mean “body.”
  • It’s under this view of language—as something becoming rather than being, a film rather than a photo, in motion rather than at rest—that we should consider the way young people use (drum roll, please) like
  • The new like, then, is associated with hesitation.
  • So today’s like did not spring mysteriously from a crowd on the margins of unusual mind-set and then somehow jump the rails from them into the general population.
  • The problem with the hesitation analysis is that this was a thoroughly confident speaker.
  • It’s real-life usage of this kind—to linguists it is data, just like climate patterns are to meteorologists—that suggests that the idea of like as the linguistic equivalent to slumped shoulders is off.
  • Understandably so, of course—the meaning of like suggests that people are claiming that everything is “like” itself rather than itself.
  • The new like acknowledges unspoken objection while underlining one’s own point (the factuality). Like grandparents translates here as “There were, despite what you might think, actually grandparents.”
  • Then there is a second new like, which is closer to what people tend to think of all its new uses: it is indeed a hedge.
  • Then, the two likes I have mentioned must be distinguished from yet a third usage, the quotative like—as in “And she was like, ‘I didn’t even invite him.’”
  • This is yet another way that like has become grammar. The meaning “similar to” is as natural a source here as it was for -ly: mimicking people’s utterances is talking similarly to, as in “like,” them.
  • Thus the modern American English speaker has mastered not just two, but actually three different new usages of like.