TOK Friends: Group items tagged habits

Emily Horwitz

Could A 'Brain Pacemaker' Someday Treat Severe Anorexia? : Shots - Health News : NPR - 0 views

  • Many people who get anorexia recover after therapy and counseling. But in about 20 to 30 percent of cases, the disease becomes a chronic condition that gets tougher and tougher to treat.
  • Neurosurgeons from the University of Toronto tried a technique called deep brain stimulation to see if it might help patients with severe anorexia.
  • The results didn't meet the statistical tests for significance.
  • ...7 more annotations...
  • "But since we don't have anything that works well for these individuals — that have a high risk of mortality — it warrants cautious optimism and further study."
  • doctors implant tiny electrodes next to a region of the brain thought to be dysfunctional. A device, similar to a heart pacemaker, then sends waves of electricity through a wire to the electrodes.
  • In the latest study, neurosurgeons in Toronto implanted the electrodes in the brains of six women with chronic anorexia. Five of them had been struggling with the disease for over a decade. All of them had experienced serious health problems from it, including heart attacks in some cases.
  • "My symptoms were so severe. I would wake up in the middle of night and run up and down the stairs for hours or go for a five-hour run," she tells Shots. "I became very isolated. I didn't want to be around anyone because they kept me from exercising."
  • "It was brain surgery! But I had had a heart attack at 28 and two strokes," she says. "My mom was in the midst of planning my funeral. If I didn't take this chance, I knew my path would probably lead to death."
  • Rollins admits that the deep brain stimulation wasn't a magic bullet. She's had to continue her anorexia treatment to get where she is. "I still see a psychiatrist regularly and a dietitian. It [the deep brain stimulation] enables me to do the work that I need to do a lot easier."
  • Deep brain stimulation can cause serious side effects, Lipsman says, like seizures, and milder ones, like pain and nausea. "This is a brain surgery – there's no sugarcoating that," he says. "The primary objective of this study was to establish that this is a safe procedure for these patients who have been quite ill before the surgery. That's all we can say right now."
  •  
    An interesting article that seems to pose the question: can our habits/perceptions be changed by brain stimulation?
Javier E

Covering politics in a "post-truth" America | Brookings Institution - 0 views

  • The media scandal of 2016 isn’t so much about what reporters failed to tell the American public; it’s about what they did report on, and the fact that it didn’t seem to matter.
  • Facebook and Snapchat and the other social media sites should rightfully be doing a lot of soul-searching about their role as the most efficient distribution network for conspiracy theories, hatred, and outright falsehoods ever invented.
  • I’ve been obsessively looking back over our coverage, too, trying to figure out what we missed along the way to the upset of the century
  • ...28 more annotations...
  • (An early conclusion: while we were late to understand how angry white voters were, a perhaps even more serious lapse was in failing to recognize how many disaffected Democrats there were who would stay home rather than support their party’s flawed candidate.)
  • Stories that would have killed any other politician—truly worrisome revelations about everything from the federal taxes Trump dodged to the charitable donations he lied about, the women he insulted and allegedly assaulted, and the mob ties that have long dogged him—did not stop Trump from thriving in this election year
  • the Oxford Dictionaries announced that “post-truth” had been chosen as the 2016 word of the year, defining it as a condition “in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.”
  • Meantime, Trump personally blacklisted news organizations like Politico and The Washington Post when they published articles he didn’t like during the campaign, has openly mused about rolling back press freedoms enshrined by the U.S. Supreme Court, and has now named Stephen Bannon, until recently the executive chairman of Breitbart—a right-wing fringe website with a penchant for conspiracy theories and anti-Semitic tropes—to serve as one of his top White House advisers.
  • none of this has any modern precedent. And what makes it unique has nothing to do with the outcome of the election. This time, the victor was a right-wing demagogue; next time, it may be a left-wing populist who learns the lessons of Trump’s win.
  • This is no mere academic argument. The election of 2016 showed us that Americans are increasingly choosing to live in a cloud of like-minded spin, surrounded by the partisan political hackery and fake news that poisons their Facebook feeds.
  • To help us understand it all, there were choices, but not that many: three TV networks that mattered, ABC, CBS, and NBC; two papers for serious journalism, The New York Times and The Washington Post; and two giant-circulation weekly newsmagazines, Time and Newsweek. That, plus whatever was your local daily newspaper, pretty much constituted the news.
  • Fake news is thriving. In the final three months of the presidential campaign, the 20 top-performing fake election news stories generated more engagement on Facebook than the top stories from major news outlets such as The New York Times.
  • Eventually, I came to think of the major media outlets of that era as something very similar to the big suburban shopping malls we flocked to in the age of shoulder pads and supply-side economics: We could choose among Kmart and Macy’s and Saks Fifth Avenue as our budgets and tastes allowed, but in the end the media were all essentially department stores, selling us sports and stock tables and foreign news alongside our politics, whether we wanted them or not. It may not have been a monopoly, but it was something pretty close.
  • This was still journalism in the scarcity era, and it affected everything from what stories we wrote to how fast we could produce them. Presidents could launch global thermonuclear war with the Russians in a matter of minutes, but news from the American hinterlands often took weeks to reach their sleepy capital. Even information within that capital was virtually unobtainable without a major investment of time and effort. Want to know how much a campaign was raising and spending from the new special-interest PACs that had proliferated? Prepare to spend a day holed up at the Federal Election Commission’s headquarters down on E Street across from the hulking concrete FBI building, and be sure to bring a bunch of quarters for the copy machine.
  • I am writing this in the immediate, shocking aftermath of a 2016 presidential election in which the Pew Research Center found that a higher percentage of Americans got their information about the campaign from late-night TV comedy shows than from a national newspaper. Don Graham sold the Post three years ago and though its online audience has been skyrocketing with new investments from Amazon.com founder Jeff Bezos, it will never be what it was in the ‘80s. That same Pew survey reported that a mere 2 percent of Americans today turned to such newspapers as the “most helpful” guides to the presidential campaign.
  • In 2013, Mark Leibovich wrote a bestselling book called This Town about the party-hopping, lobbyist-enabling nexus between Washington journalists and the political world they cover. A key character was Politico’s Mike Allen, whose morning email newsletter “Playbook” had become a Washington ritual, offering all the news and tidbits a power player might want to read before breakfast—and Politico’s most successful ad franchise to boot. In many ways, even that world of just a few years ago now seems quaint: the notion that anyone could be a single, once-a-day town crier in This Town (or any other) has been utterly exploded by the move to Twitter, Facebook, and all the rest. We are living, as Mark put it to me recently, “in a 24-hour scrolling version of what ‘Playbook’ was.”
  • Whether it was Walter Cronkite or The New York Times, they preached journalistic “objectivity” and spoke with authority when they pronounced on the day’s developments—but not always with the depth and expertise that real competition or deep specialization might have provided. They were great—but they were generalists.
  • I remained convinced that reporting would hold its value, especially as our other advantages—like access to information and the expensive means to distribute it—dwindled. It was all well and good to root for your political team, but when it mattered to your business (or the country, for that matter), I reasoned, you wouldn’t want cheerleading but real reporting about real facts. Besides, the new tools might be coming at us with dizzying speed—remember when that radical new video app Meerkat was going to change absolutely everything about how we cover elections?—but we would still need reporters to find a way inside Washington’s closed doors and back rooms, to figure out what was happening when the cameras weren’t rolling.
  • And if the world was suffering from information overload—well, so much the better for us editors; we would be all the more needed to figure out what to listen to amid the noise.
  • Trump turned out to be more correct than we editors were: the more relevant point of the Access Hollywood tape was not about the censure Trump would now face but the political reality that he, like Bill Clinton, could survive this—or perhaps any scandal. Yes, we were wrong about the Access Hollywood tape, and so much else.
  • These days, Politico has a newsroom of 200-odd journalists, a glossy award-winning magazine, dozens of daily email newsletters, and 16 subscription policy verticals. It’s a major player in coverage not only of Capitol Hill but many other key parts of the capital, and some months during this election year we had well over 30 million unique visitors to our website, a far cry from the controlled congressional circulation of 35,000 that I remember Roll Call touting in our long-ago sales materials.
  • We journalists were still able to cover the public theater of politics while spending more of our time, resources, and mental energy on really original reporting, on digging up stories you couldn’t read anywhere else. Between Trump’s long and checkered business past, his habit of serial lying, his voluminous and contradictory tweets, and his revision of even his own biography, there was lots to work with. No one can say that Trump was elected without the press telling us all about his checkered past.
  • politics was NEVER more choose-your-own-adventure than in 2016, when entire news ecosystems for partisans existed wholly outside the reach of those who at least aim for truth
  • Pew found that nearly 50 percent of self-described conservatives now rely on a single news source, Fox, for political information they trust.
  • As for the liberals, they trust only that they should never watch Fox, and have MSNBC and Media Matters and the remnants of the big boys to confirm their biases.
  • And then there are the conspiracy-peddling Breitbarts and the overtly fake-news outlets of this overwhelming new world; untethered from even the pretense of fact-based reporting, their version of the campaign got more traffic on Facebook in the race’s final weeks than all the traditional news outlets combined.
  • When we assigned a team of reporters at Politico during the primary season to listen to every single word of Trump’s speeches, we found that he offered a lie, half-truth, or outright exaggeration approximately once every five minutes—for an entire week. And it didn’t hinder him in the least from winning the Republican presidential nomination.
  • when we repeated the exercise this fall, in the midst of the general election campaign, Trump had progressed to fibs of various magnitudes just about once every three minutes!
  • By the time Trump in September issued his half-hearted disavowal of the Obama “birther” whopper he had done so much to create and perpetuate, one national survey found that only 1 in 4 Republicans was sure that Obama was born in the U.S., and various polls found that somewhere between a quarter and a half of Republicans believed he’s Muslim. So not only did Trump think he was entitled to his own facts, so did his supporters. It didn’t stop them at all from voting for him.
  • in part, it’s not just because they disagree with the facts as reporters have presented them but because there’s so damn many reporters, and from such a wide array of outlets, that it’s often impossible to evaluate their standards and practices, biases and preconceptions. Even we journalists are increasingly overwhelmed.
  • So much terrific reporting and writing and digging over the years and … Trump? What happened to consequences? Reporting that matters? Sunlight, they used to tell us, was the best disinfectant for what ails our politics.
  • 2016 suggests a different outcome: We’ve achieved a lot more transparency in today’s Washington—without the accountability that was supposed to come with it.
sissij

Scientists have a theory on why you break eye contact | Fox News - 1 views

  • they write that eye contact actually "disrupts resources available to cognitive control processes during verb generation." In other words, when you need to come up with certain words under certain circumstances, maintaining eye contact depletes the very brain resources you need to find the word.
  • but Scientific American suggests that if looking away while thinking is cross-cultural, "perhaps cultures with less emphasis on eye contact enable deeper thinking during a given conversation."
  •  
    The hypothesis that people from different cultures may have different habits is very interesting. People always avoid eye contact with unfamiliar people; probably by nature, we read making eye contact as an act of provocation. When I am speaking, I seldom make eye contact with others; eye contact makes me nervous, and I sometimes feel that my brain shuts down and I can't come up with a word. I feel like primitive human nature still plays a huge role in our brains and reactions. --Sissi (12/31/2016)
katrinaskibicki

One paragraph that puts the white-black life expectancy gap in (horrifying) context - Vox - 0 views

  • It is generally well-reported that there is a life expectancy gap between white and black Americans of about four years. But it can be hard to visualize exactly what this number means. In a recent conversation, David Williams, a public health researcher at Harvard, described the racial gap to me in stark terms: One of the ways to think of the racial gap in health is to think of how many black people die prematurely every year who wouldn't die if there were no racial differences in health. The answer to that from a carefully done [2001] scientific study is 96,800 black people die prematurely every year. Divide it by 365 [days], that's 265 people dying prematurely every day. Imagine a jumbo jet — with 265 passengers and crew — crashing at Reagan Washington Airport today, and the same thing happening tomorrow and every day next week and every day next month. That's what we're talking about when we say there are racial disparities in health.
  • I asked Williams why there is such a tremendous gap in black and white life expectancy. He said there's no single issue to blame; it instead comes down to many factors, largely related to where people live.
  •  
    Again, this isn't just because of one single variable. It's a mix of issues, including how walkable a neighborhood is, how clean the air, water, and soil are, the availability of healthy foods, public health policies that push people away from bad habits or foods, and so on. Geographic location just reflects the place all those ideas come together - often in a way that affects certain groups more than others. And it shows why it's important to take a comprehensive view toward public health policy, tackling a variety of issues at once, instead of focusing solely on just one or two problems in a community.
sissij

Turning Negative Thinkers Into Positive Ones - The New York Times - 0 views

  • I leave the Y grinning from ear to ear, uplifted not just by my own workout but even more so by my interaction with these darling representatives of the next generation.
  • I lived for half a century with a man who suffered from periodic bouts of depression, so I understand how challenging negativism can be.
  • “micro-moments of positivity,”
  • ...6 more annotations...
  • The research that Dr. Fredrickson and others have done demonstrates that the extent to which we can generate positive emotions from even everyday activities can determine who flourishes and who doesn’t.
  • Clearly, there are times and situations that naturally result in negative feelings in the most upbeat of individuals. Worry, sadness, anger and other such “downers” have their place in any normal life.
  • Negative feelings activate a region of the brain called the amygdala, which is involved in processing fear and anxiety and other emotions.
  • Both he and Dr. Fredrickson and their colleagues have demonstrated that the brain is “plastic,” or capable of generating new cells and pathways, and it is possible to train the circuitry in the brain to promote more positive responses.
  • reinforce positivity
  • Practice mindfulness. Ruminating on past problems or future difficulties drains mental resources and steals attention from current pleasures.
  •  
    The distance between a negative attitude and a positive attitude is not that far. Just by changing a few words in a sentence, we can describe an event in a really positive manner. From my personal experience, attitude is like a habit: if you always think negatively, then your brain tends to give a pessimistic response to events. So sometimes you have to train your brain to be a positive thinker. As we learned in TOK, we tend to see things and think in patterns, so it is very important to create a good pattern for our thinking. --Sissi (4/3/2017)
Javier E

The Flight From Conversation - NYTimes.com - 0 views

  • we have sacrificed conversation for mere connection.
  • the little devices most of us carry around are so powerful that they change not only what we do, but also who we are.
  • A businessman laments that he no longer has colleagues at work. He doesn’t stop by to talk; he doesn’t call. He says that he doesn’t want to interrupt them. He says they’re “too busy on their e-mail.”
  • ...19 more annotations...
  • We want to customize our lives. We want to move in and out of where we are because the thing we value most is control over where we focus our attention. We have gotten used to the idea of being in a tribe of one, loyal to our own party.
  • Human relationships are rich; they’re messy and demanding. We have learned the habit of cleaning them up with technology.
  • “Someday, someday, but certainly not now, I’d like to learn how to have a conversation.”
  • We can’t get enough of one another if we can use technology to keep one another at distances we can control: not too close, not too far, just right. I think of it as a Goldilocks effect. Texting and e-mail and posting let us present the self we want to be. This means we can edit. And if we wish to, we can delete. Or retouch: the voice, the flesh, the face, the body. Not too much, not too little — just right.
  • We are tempted to think that our little “sips” of online connection add up to a big gulp of real conversation. But they don’t.
  • I have often heard the sentiment “No one is listening to me.” I believe this feeling helps explain why it is so appealing to have a Facebook page or a Twitter feed — each provides so many automatic listeners. And it helps explain why — against all reason — so many of us are willing to talk to machines that seem to care about us. Researchers around the world are busy inventing sociable robots, designed to be companions to the elderly, to children, to all of us.
  • Connecting in sips may work for gathering discrete bits of information or for saying, “I am thinking about you.” Or even for saying, “I love you.” But connecting in sips doesn’t work as well when it comes to understanding and knowing one another. In conversation we tend to one another.
  • We can attend to tone and nuance. In conversation, we are called upon to see things from another’s point of view.
  • FACE-TO-FACE conversation unfolds slowly. It teaches patience. When we communicate on our digital devices, we learn different habits. As we ramp up the volume and velocity of online connections, we start to expect faster answers. To get these, we ask one another simpler questions; we dumb down our communications, even on the most important matters.
  • And we use conversation with others to learn to converse with ourselves. So our flight from conversation can mean diminished chances to learn skills of self-reflection
  • we have little motivation to say something truly self-reflective. Self-reflection in conversation requires trust. It’s hard to do anything with 3,000 Facebook friends except connect.
  • we seem almost willing to dispense with people altogether. Serious people muse about the future of computer programs as psychiatrists. A high school sophomore confides to me that he wishes he could talk to an artificial intelligence program instead of his dad about dating; he says the A.I. would have so much more in its database. Indeed, many people tell me they hope that as Siri, the digital assistant on Apple’s iPhone, becomes more advanced, “she” will be more and more like a best friend — one who will listen when others won’t.
  • I’m the one who doesn’t want to be interrupted. I think I should. But I’d rather just do things on my BlackBerry.
  • WE expect more from technology and less from one another and seem increasingly drawn to technologies that provide the illusion of companionship without the demands of relationship. Always-on/always-on-you devices provide three powerful fantasies: that we will always be heard; that we can put our attention wherever we want it to be; and that we never have to be alone. Indeed our new devices have turned being alone into a problem that can be solved.
  • When people are alone, even for a few moments, they fidget and reach for a device. Here connection works like a symptom, not a cure, and our constant, reflexive impulse to connect shapes a new way of being.
  • Think of it as “I share, therefore I am.” We use technology to define ourselves by sharing our thoughts and feelings as we’re having them. We used to think, “I have a feeling; I want to make a call.” Now our impulse is, “I want to have a feeling; I need to send a text.”
  • Lacking the capacity for solitude, we turn to other people but don’t experience them as they are. It is as though we use them, need them as spare parts to support our increasingly fragile selves.
  • If we are unable to be alone, we are far more likely to be lonely. If we don’t teach our children to be alone, they will know only how to be lonely.
  • I am a partisan for conversation. To make room for it, I see some first, deliberate steps. At home, we can create sacred spaces: the kitchen, the dining room. We can make our cars “device-free zones.”
Javier E

The Amygdala Made Me Do It - NYTimes.com - 1 views

  • It’s the invasion of the Can’t-Help-Yourself books. Unlike most pop self-help books, these are about life as we know it — the one you can change, but only a little, and with a ton of work. Professor Kahneman, who won the Nobel Prize in economic science a decade ago, has synthesized a lifetime’s research in neurobiology, economics and psychology. “Thinking, Fast and Slow” goes to the heart of the matter: How aware are we of the invisible forces of brain chemistry, social cues and temperament that determine how we think and act?
  • The choices we make in day-to-day life are prompted by impulses lodged deep within the nervous system. Not only are we not masters of our fate; we are captives of biological determinism. Once we enter the portals of the strange neuronal world known as the brain, we discover that — to put the matter plainly — we have no idea what we’re doing.
  • Mr. Duhigg’s thesis is that we can’t change our habits, we can only acquire new ones. Alcoholics can’t stop drinking through willpower alone: they need to alter behavior — going to A.A. meetings instead of bars, for instance — that triggers the impulse to drink.
  • ...1 more annotation...
  • they’re full of stories about people who accomplished amazing things in life by, in effect, rewiring themselves
Javier E

Breathing In vs. Spacing Out - NYTimes.com - 0 views

  • Although pioneers like Jon Kabat-Zinn, now emeritus professor at the University of Massachusetts Medical Center, began teaching mindfulness meditation as a means of reducing stress as far back as the 1970s, all but a dozen or so of the nearly 100 randomized clinical trials have been published since 2005.
  • Michael Posner, of the University of Oregon, and Yi-Yuan Tang, of Texas Tech University, used functional M.R.I.’s before and after participants spent a combined 11 hours over two weeks practicing a form of mindfulness meditation developed by Tang. They found that it enhanced the integrity and efficiency of the brain’s white matter, the tissue that connects and protects neurons emanating from the anterior cingulate cortex, a region of particular importance for rational decision-making and effortful problem-solving.
  • Perhaps that is why mindfulness has proved beneficial to prospective graduate students. In May, the journal Psychological Science published the results of a randomized trial showing that undergraduates instructed to spend a mere 10 minutes a day for two weeks practicing mindfulness made significant improvement on the verbal portion of the Graduate Record Exam — a gain of 16 percentile points. They also significantly increased their working memory capacity, the ability to maintain and manipulate multiple items of attention.
  • ...7 more annotations...
  • By emphasizing a focus on the here and now, it trains the mind to stay on task and avoid distraction.
  • “Your ability to recognize what your mind is engaging with, and control that, is really a core strength,” said Peter Malinowski, a psychologist and neuroscientist at Liverpool John Moores University in England. “For some people who begin mindfulness training, it’s the first time in their life where they realize that a thought or emotion is not their only reality, that they have the ability to stay focused on something else, for instance their breathing, and let that emotion or thought just pass by.”
  • the higher adults scored on a measurement of mindfulness, the worse they performed on tests of implicit learning — the kind that underlies all sorts of acquired skills and habits but that occurs without conscious awareness.
  • he found that having participants spend a brief period of time on an undemanding task that maximizes mind wandering improved their subsequent performance on a test of creativity. In a follow-up study, he reported that physicists and writers alike came up with their most insightful ideas while spacing out.
  • The trick is knowing when mindfulness is called for and when it’s not.
  • one of the most surprising findings of recent mindfulness studies is that it could have unwanted side effects. Raising roadblocks to the mind’s peregrinations could, after all, prevent the very sort of mental vacations that lead to epiphanies.
  • “There’s so much our brain is doing when we’re not aware of it,” said the study’s leader, Chelsea Stillman, a doctoral candidate. “We know that being mindful is really good for a lot of explicit cognitive functions. But it might not be so useful when you want to form new habits.” Learning to ride a bicycle, speak grammatically or interpret the meaning of people’s facial expressions are three examples of knowledge we acquire through implicit learning
grayton downing

BBC News - Exoplanet tally soars above 1,000 - 0 views

  • The number of observed exoplanets - worlds circling distant stars - has passed 1,000.
  • These new worlds are listed in the Extrasolar Planets Encyclopaedia.
  • The Kepler space telescope, which spotted many of these worlds in recent years, broke down earlier this year. Scientists still have to trawl through more than 3,500 other candidates from this mission so the number could rapidly increase.
  • ...8 more annotations...
  • In January 2013, astronomers used Kepler's data to estimate that there could be at least 17 billion Earth-sized exoplanets in the Milky Way galaxy.
  • The number of confirmed planets frequently increases because as scientists analyse the data they are able to publish their results online immediately. But as the finds are not yet peer reviewed, the total figure remains subject to change.
  • "That's why the other catalogues just lag behind. The review is reliable as it's exactly the same as what the journals do."
  • "no consensus for the definition of a planet"
  • "Some objects, like some Kepler planets, are declared 'confirmed planets' but have not been published in [referenced] articles. It does not mean that they will not be published later on, but it introduces another fuzziness in the tally," he added.
  • "I don't just want to know where the exoplanets are, I want to understand the stars, because they are the hosts for the planets. I want to understand the whole galaxy and the distribution of the stars because everything is connected," he explained.
  • For him, the most exciting discoveries are Earth-like planets which could be habitable.
  • This planet likely has the same mass as Earth but is outside the "habitable zone" as it circles its star far closer than Mercury orbits our Sun.
Javier E

Big Think Interview With Nicholas Carr | Nicholas Carr | Big Think - 0 views

  • Neurologically, how does our brain adapt itself to new technologies? Nicholas Carr: A couple of types of adaptations take place in your brain. One is a strengthening of the synaptical connections between the neurons involved in using that instrument, in using that tool. And basically these are chemical – neurochemical – changes. So you know, cells in our brain communicate by transmitting electrical signals between them, and those electrical signals are actually activated by the exchange of chemicals, neurotransmitters, in our synapses. And so when you begin to use a tool, for instance, you have much stronger electrochemical signals being processed through those synaptical connections. And then the second, and even more interesting, adaptation is in actual physical changes, anatomical changes. Your neurons – you may grow new neurons that are then recruited into these circuits, or your existing neurons may grow new synaptical terminals. And again, that also serves to strengthen the activity in those particular pathways that are being used. On the other hand, you know, the brain likes to be efficient, and so even as it's strengthening the pathways you're exercising, it's weakening the connections between the cells that supported old ways of thinking or working or behaving that you're not exercising so much.
  • And it was only in around the year 800 or 900 that we saw the introduction of word spaces. And suddenly reading became, in a sense, easier, and suddenly you had the arrival of silent reading, which changed the act of reading from just transcription of speech to something that every individual did on their own. And suddenly you had this whole ideal of the silent solitary reader who was improving their mind, expanding their horizons, and so forth. And when Gutenberg invented the printing press around 1450, what that served to do was take this new very attentive, very deep form of reading, which had been limited to just, you know, monasteries and universities, and, by making books much cheaper and much more available, spread that way of reading out to a much larger mass audience. And so we saw, for the last 500 years or so, that one of the central facts of culture was deep solitary reading.
  • What the book does as a technology is shield us from distraction. The only thing going on is the, you know, the progression of words and sentences across page after page, and so suddenly we see this immersive kind of very attentive thinking, whether you are paying attention to a story or to an argument, or whatever. And what we know about the brain is the brain adapts to these types of tools.
  • ...12 more annotations...
  • we adapt to the environment of the internet, which is an environment of kind of constant immersion and information and constant distractions, interruptions, juggling lots of messages, lots of bits of information.
  • Because it’s no longer just a matter of personal choice, of personal discipline, though obviously those things are always important, but what we’re seeing, and we see this over and over again in the history of technology, is that the technology of the web, the technology of digital media, gets entwined very, very deeply into social processes, into expectations. So more and more, for instance, in our work lives: if our boss and all our colleagues are constantly exchanging messages, constantly checking email on their Blackberry or iPhone or their Droid or whatever, then it becomes very difficult to say, I’m not going to be as connected, because you feel like your career is going to take a hit.
  • With the arrival – with the transfer now of text more and more onto screens, we see, I think, a new and in some ways more primitive way of reading. In order to take in information off a screen, when you are also being bombarded with all sorts of other information and when there are links in the text where you have to think, even for just a fraction of a second, do I click on this link or not, suddenly reading again becomes a more cognitively intensive act, the way it was back when there were no spaces between words.
  • If all your friends are planning their social lives through texts and Facebook and Twitter and so forth, then to back away from that means to feel socially isolated. And of course for all people, particularly for young people, there’s kind of nothing worse than feeling socially isolated, that your friends are, you know, having these conversations and you’re not involved. So it’s easy to state the solution, which is to, you know, become a little bit more disconnected. What’s hard is actually doing that.
  • if you want to change your brain, you change your habits. You change your habits of thinking. And that means, you know, setting aside time to engage in more contemplative, more reflective ways of thinking, to be – to screen out distractions. And that means retreating from digital media and from the web and from Smart Phones and texting and Facebook and Tweeting and everything else.
  • The Thinker was, you know, in a contemplative pose and was concentrating deeply, and wasn’t, you know, multi-tasking. And because that is something that, until recently anyway, people always thought was the deepest and most distinctly human way of thinking.
  • we may end up finding that those are actually the most valuable ways of thinking that are available to us as human beings.
  • the ability to pay attention also is very important for our ability to build memories, to transfer information from our short-term memory to our long-term memory. And only when we do that do we weave new information into everything else we have stored in our brains. All the other facts we’ve learned, all the other experiences we’ve had, emotions we’ve felt. And that’s how you build, I think, a rich intellect and a rich intellectual life.
  • On the other hand, there is a cost. We lose – we begin to lose the faculties that we don’t exercise. So adaptation has both a very, very positive side, but also a potentially negative side, because ultimately our brain is qualitatively neutral. It doesn’t care what it’s strengthening or what it’s weakening; it just responds to the way we’re exercising our mind.
  • the book in some ways is the most interesting from our own present standpoint, particularly when we want to think about the way the internet is changing us. It’s interesting to think about how the book changed us.
  • So we become, after the arrival of the printing press, in general more attentive, more attuned to contemplative ways of thinking. And that’s a very unnatural way of using our mind. You know, paying attention, filtering out distractions.
  • what we lose is the ability to pay deep attention to one thing for a sustained period of time, to filter out distractions.
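    The first annotation above describes adaptation as a two-sided process: the pathways you exercise strengthen while the ones you neglect weaken. A minimal toy model of that use-dependent dynamic makes the asymmetry concrete; the pathway names, rates, and update rule below are illustrative assumptions, not anything specified in the interview.

```python
LEARNING_RATE = 0.10  # strengthening per use (assumed value)
DECAY_RATE = 0.02     # decay per step for every unused pathway (assumed value)

def step(weights, used):
    # Strengthen the exercised pathways; let every other pathway decay a little.
    return {
        name: min(1.0, w + LEARNING_RATE) if name in used
        else max(0.0, w - DECAY_RATE)
        for name, w in weights.items()
    }

# Two hypothetical pathways, then a month of skimming instead of deep reading.
pathways = {"deep_reading": 0.8, "skimming": 0.2}
for _ in range(30):
    pathways = step(pathways, used={"skimming"})

print(pathways)  # skimming saturates at 1.0; deep_reading drifts down to ~0.2
```

    Nothing in the update rule cares which pathway is which; that is the point of the annotation above about the brain being "qualitatively neutral": it just responds to whatever we exercise.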
Javier E

The Cost of Relativism - NYTimes.com - 0 views

  • One of America’s leading political scientists, Robert Putnam, has just come out with a book called “Our Kids” about the growing chasm between those who live in college-educated America and those who live in high-school-educated America
  • Reintroducing norms will require, first, a moral vocabulary. These norms weren’t destroyed because of people with bad values. They were destroyed by a plague of nonjudgmentalism, which refused to assert that one way of behaving was better than another. People got out of the habit of setting standards or understanding how they were set.
  • sympathy is not enough. It’s not only money and better policy that are missing in these circles; it’s norms.
  • ...7 more annotations...
  • The health of society is primarily determined by the habits and virtues of its citizens.
  • In many parts of America there are no minimally agreed upon standards for what it means to be a father. There are no basic codes and rules woven into daily life, which people can absorb unconsciously and follow automatically.
  • Roughly 10 percent of the children born to college grads grow up in single-parent households. Nearly 70 percent of children born to high school grads do. There are a bunch of charts that look like open scissors. In the 1960s or 1970s, college-educated and noncollege-educated families behaved roughly the same. But since then, behavior patterns have ever more sharply diverged. High-school-educated parents dine with their children less than college-educated parents, read to them less, talk to them less, take them to church less, encourage them less and spend less time engaging in developmental activity.
  • Next it will require holding people responsible. People born into the most chaotic situations can still be asked the same questions: Are you living for short-term pleasure or long-term good? Are you living for yourself or for your children? Do you have the freedom of self-control or are you in bondage to your desires?
  • Next it will require holding everybody responsible. America is obviously not a country in which the less educated are behaving irresponsibly and the more educated are beacons of virtue. America is a country in which privileged people suffer from their own characteristic forms of self-indulgence: the tendency to self-segregate, the comprehensive failures of leadership in government and industry.
  • People sometimes wonder why I’ve taken this column in a spiritual and moral direction of late. It’s in part because we won’t have social repair unless we are more morally articulate, unless we have clearer definitions of how we should be behaving at all levels.
  • History is full of examples of moral revival, when social chaos was reversed, when behavior was tightened and norms reasserted. It happened in England in the 1830s and in the U.S. amid economic stress in the 1930s.
Javier E

A smarter way to think about willpower - The Washington Post - 0 views

  • in a self-report questionnaire completed by more than 80,000 American adults, self-control ranked lowest among 24 strengths of character.
  • three out of four parents said they thought self-control has declined in the past half-century.
  • Without a time machine that allows us to travel backward and compare Americans from different decades on the same self-control measures, we can’t be sure. Indeed, the scant scientific evidence on the question suggests that if anything, the capacity to delay gratification may be increasing.
  • ...18 more annotations...
  • there are plenty of behaviors that require self-control that have held steady or even improved in recent decades
  • Cigarette smoking has fallen sharply since the Mad Men days.
  • Alcohol consumption peaked in 1980 and has fallen back to the same level as 1960
  • Seat belts are now used by 9 out of 10 motorists.
  • Nevertheless, like every generation before us, we crave more self-control.
  • the ratio of household consumption to household net worth just hit a postwar low: In 2018 consumption was 13.2 percent of net worth, down from 16.3 percent in 1946.
  • it isn’t clear that savings habits have worsened since World War II.
  • science shows that helping people do better in the internal tug-of-war of self-control depends on creating the right external environment.
  • some temptations require hard paternalism
  • some choices are not in our best interest. Taxing, regulating, restricting or even banning especially addictive drugs may lead to more freedom
  • Cellphones and soda
  • the benefits of constraining access may, in some cases, justify the costs
  • we recommend nudges — subtle changes in how choices are framed that make doing what’s in our long-term interest more obvious, easier or more attractive
  • deploy science-backed strategies that make self-control easier.
  • putting temptations out of sight and out of reach:
  • disabling apps that, upon reflection, do more harm than good.
  • Anything you can do to put time and effort between you and indulgence makes self-control easier.
Javier E

'Our minds can be hijacked': the tech insiders who fear a smartphone dystopia | Technol... - 0 views

  • Rosenstein belongs to a small but growing band of Silicon Valley heretics who complain about the rise of the so-called “attention economy”: an internet shaped around the demands of an advertising economy.
  • “It is very common,” Rosenstein says, “for humans to develop things with the best of intentions and for them to have unintended, negative consequences.”
  • most concerned about the psychological effects on people who, research shows, touch, swipe or tap their phone 2,617 times a day.
  • ...43 more annotations...
  • There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity – even when the device is turned off. “Everyone is distracted,” Rosenstein says. “All of the time.”
  • Drawing a straight line between addiction to social media and political earthquakes like Brexit and the rise of Donald Trump, they contend that digital forces have completely upended the political system and, left unchecked, could even render democracy as we know it obsolete.
  • Without irony, Eyal finished his talk with some personal tips for resisting the lure of technology. He told his audience he uses a Chrome extension, called DF YouTube, “which scrubs out a lot of those external triggers” he writes about in his book, and recommended an app called Pocket Points that “rewards you for staying off your phone when you need to focus”.
  • “One reason I think it is particularly important for us to talk about this now is that we may be the last generation that can remember life before,” Rosenstein says. It may or may not be relevant that Rosenstein, Pearlman and most of the tech insiders questioning today’s attention economy are in their 30s, members of the last generation that can remember a world in which telephones were plugged into walls.
  • One morning in April this year, designers, programmers and tech entrepreneurs from across the world gathered at a conference centre on the shore of the San Francisco Bay. They had each paid up to $1,700 to learn how to manipulate people into habitual use of their products, on a course curated by conference organiser Nir Eyal.
  • Eyal, 39, the author of Hooked: How to Build Habit-Forming Products, has spent several years consulting for the tech industry, teaching techniques he developed by closely studying how the Silicon Valley giants operate.
  • “The technologies we use have turned into compulsions, if not full-fledged addictions,” Eyal writes. “It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.” None of this is an accident, he writes. It is all “just as their designers intended”
  • He explains the subtle psychological tricks that can be used to make people develop habits, such as varying the rewards people receive to create “a craving”, or exploiting negative emotions that can act as “triggers”. “Feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation,” Eyal writes.
  • The most seductive design, Harris explains, exploits the same psychological susceptibility that makes gambling so compulsive: variable rewards. When we tap those apps with red icons, we don’t know whether we’ll discover an interesting email, an avalanche of “likes”, or nothing at all. It is the possibility of disappointment that makes it so compulsive.
  • Finally, Eyal confided the lengths he goes to protect his own family. He has installed in his house an outlet timer connected to a router that cuts off access to the internet at a set time every day. “The idea is to remember that we are not powerless,” he said. “We are in control.”
  • But are we? If the people who built these technologies are taking such radical steps to wean themselves free, can the rest of us reasonably be expected to exercise our free will?
  • Not according to Tristan Harris, a 33-year-old former Google employee turned vocal critic of the tech industry. “All of us are jacked into this system,” he says. “All of our minds can be hijacked. Our choices are not as free as we think they are.”
  • Harris, who has been branded “the closest thing Silicon Valley has to a conscience”, insists that billions of people have little choice over whether they use these now ubiquitous technologies, and are largely unaware of the invisible ways in which a small number of people in Silicon Valley are shaping their lives.
  • “I don’t know a more urgent problem than this,” Harris says. “It’s changing our democracy, and it’s changing our ability to have the conversations and relationships that we want with each other.” Harris went public – giving talks, writing papers, meeting lawmakers and campaigning for reform – after three years struggling to effect change inside Google’s Mountain View headquarters.
  • He explored how LinkedIn exploits a need for social reciprocity to widen its network; how YouTube and Netflix autoplay videos and next episodes, depriving users of a choice about whether or not they want to keep watching; how Snapchat created its addictive Snapstreaks feature, encouraging near-constant communication between its mostly teenage users.
  • The techniques these companies use are not always generic: they can be algorithmically tailored to each person. An internal Facebook report leaked this year, for example, revealed that the company can identify when teens feel “insecure”, “worthless” and “need a confidence boost”. Such granular information, Harris adds, is “a perfect model of what buttons you can push in a particular person”.
  • Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. “There’s no ethics,” he says. A company paying Facebook to use its levers of persuasion could be a car business targeting tailored advertisements to different types of users who want a new vehicle. Or it could be a Moscow-based troll farm seeking to turn voters in a swing county in Wisconsin.
  • It was Rosenstein’s colleague, Leah Pearlman, then a product manager at Facebook and on the team that created the Facebook “like”, who announced the feature in a 2009 blogpost. Now 35 and an illustrator, Pearlman confirmed via email that she, too, has grown disaffected with Facebook “likes” and other addictive feedback loops. She has installed a web browser plug-in to eradicate her Facebook news feed, and hired a social media manager to monitor her Facebook page so that she doesn’t have to.
  • Harris believes that tech companies never deliberately set out to make their products addictive. They were responding to the incentives of an advertising economy, experimenting with techniques that might capture people’s attention, even stumbling across highly effective design by accident.
  • It’s this that explains how the pull-to-refresh mechanism, whereby users swipe down, pause and wait to see what content appears, rapidly became one of the most addictive and ubiquitous design features in modern technology. “Each time you’re swiping down, it’s like a slot machine,” Harris says. “You don’t know what’s coming next. Sometimes it’s a beautiful photo. Sometimes it’s just an ad.”
  • The reality TV star’s campaign, he said, had heralded a watershed in which “the new, digitally supercharged dynamics of the attention economy have finally crossed a threshold and become manifest in the political realm”.
  • “Smartphones are useful tools,” he says. “But they’re addictive. Pull-to-refresh is addictive. Twitter is addictive. These are not good things. When I was working on them, it was not something I was mature enough to think about. I’m not saying I’m mature now, but I’m a little bit more mature, and I regret the downsides.”
  • All of it, he says, is reward-based behaviour that activates the brain’s dopamine pathways. He sometimes finds himself clicking on the red icons beside his apps “to make them go away”, but is conflicted about the ethics of exploiting people’s psychological vulnerabilities. “It is not inherently evil to bring people back to your product,” he says. “It’s capitalism.”
  • He identifies the advent of the smartphone as a turning point, raising the stakes in an arms race for people’s attention. “Facebook and Google assert with merit that they are giving users what they want,” McNamee says. “The same can be said about tobacco companies and drug dealers.”
  • McNamee chooses his words carefully. “The people who run Facebook and Google are good people, whose well-intentioned strategies have led to horrific unintended consequences,” he says. “The problem is that there is nothing the companies can do to address the harm unless they abandon their current advertising models.”
  • But how can Google and Facebook be forced to abandon the business models that have transformed them into two of the most profitable companies on the planet?
  • McNamee believes the companies he invested in should be subjected to greater regulation, including new anti-monopoly rules. In Washington, there is growing appetite, on both sides of the political divide, to rein in Silicon Valley. But McNamee worries the behemoths he helped build may already be too big to curtail.
  • Rosenstein, the Facebook “like” co-creator, believes there may be a case for state regulation of “psychologically manipulative advertising”, saying the moral impetus is comparable to taking action against fossil fuel or tobacco companies. “If we only care about profit maximisation,” he says, “we will go rapidly into dystopia.”
  • James Williams does not believe talk of dystopia is far-fetched. The ex-Google strategist who built the metrics system for the company’s global search advertising business, he has had a front-row view of an industry he describes as the “largest, most standardised and most centralised form of attentional control in human history”.
  • It is a journey that has led him to question whether democracy can survive the new technological age.
  • He says his epiphany came a few years ago, when he noticed he was surrounded by technology that was inhibiting him from concentrating on the things he wanted to focus on. “It was that kind of individual, existential realisation: what’s going on?” he says. “Isn’t technology supposed to be doing the complete opposite of this?”
  • That discomfort was compounded during a moment at work, when he glanced at one of Google’s dashboards, a multicoloured display showing how much of people’s attention the company had commandeered for advertisers. “I realised: this is literally a million people that we’ve sort of nudged or persuaded to do this thing that they weren’t going to otherwise do,” he recalls.
  • Williams and Harris left Google around the same time, and co-founded an advocacy group, Time Well Spent, that seeks to build public momentum for a change in the way big tech companies think about design. Williams finds it hard to comprehend why this issue is not “on the front page of every newspaper every day.”
  • “Eighty-seven percent of people wake up and go to sleep with their smartphones,” he says. The entire world now has a new prism through which to understand politics, and Williams worries the consequences are profound.
  • “The attention economy incentivises the design of technologies that grab our attention,” he says. “In so doing, it privileges our impulses over our intentions.”
  • That means privileging what is sensational over what is nuanced, appealing to emotion, anger and outrage. The news media is increasingly working in service to tech companies, Williams adds, and must play by the rules of the attention economy to “sensationalise, bait and entertain in order to survive”.
  • It is not just shady or bad actors who were exploiting the internet to change public opinion. The attention economy itself is set up to promote a phenomenon like Trump, who is masterly at grabbing and retaining the attention of supporters and critics alike, often by exploiting or creating outrage.
  • All of which has left Brichter, who has put his design work on the backburner while he focuses on building a house in New Jersey, questioning his legacy. “I’ve spent many hours and weeks and months and years thinking about whether anything I’ve done has made a net positive impact on society or humanity at all,” he says. He has blocked certain websites, turned off push notifications, restricted his use of the Telegram app to message only with his wife and two close friends, and tried to wean himself off Twitter. “I still waste time on it,” he confesses, “just reading stupid news I already know about.” He charges his phone in the kitchen, plugging it in at 7pm and not touching it until the next morning.
  • He stresses these dynamics are by no means isolated to the political right: they also play a role, he believes, in the unexpected popularity of leftwing politicians such as Bernie Sanders and Jeremy Corbyn, and the frequent outbreaks of internet outrage over issues that ignite fury among progressives.
  • All of which, Williams says, is not only distorting the way we view politics but, over time, may be changing the way we think, making us less rational and more impulsive. “We’ve habituated ourselves into a perpetual cognitive style of outrage, by internalising the dynamics of the medium,” he says.
  • It was another English science fiction writer, Aldous Huxley, who provided the more prescient observation when he warned that Orwellian-style coercion was less of a threat to democracy than the more subtle power of psychological manipulation, and “man’s almost infinite appetite for distractions”.
  • If the attention economy erodes our ability to remember, to reason, to make decisions for ourselves – faculties that are essential to self-governance – what hope is there for democracy itself?
  • “The dynamics of the attention economy are structurally set up to undermine the human will,” he says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.”
Javier E

Ditch the GPS. It's ruining your brain. - The Washington Post - 0 views

  • they also affect perception and judgment. When people are told which way to turn, it relieves them of the need to create their own routes and remember them. They pay less attention to their surroundings. And neuroscientists can now see that brain behavior changes when people rely on turn-by-turn directions.
  • In 2017, researchers asked subjects to navigate a virtual simulation of London’s Soho neighborhood and monitored their brain activity, specifically the hippocampus, which is integral to spatial navigation
  • The hippocampus makes an internal map of the environment and this map becomes active only when you are engaged in navigating and not using GPS,
  • ...5 more annotations...
  • The hippocampus is crucial to many aspects of daily life. It allows us to orient in space and know where we are by creating cognitive maps. It also allows us to recall events from the past, what is known as episodic memory. And, remarkably, it is the part of the brain that neuroscientists believe gives us the ability to imagine ourselves in the future.
  • “when people use tools such as GPS, they tend to engage less with navigation. Therefore, the brain area responsible for navigation is less used, and consequently their brain areas involved in navigation tend to shrink.”
  • Navigation aptitude appears to peak around age 19, and after that, most people slowly stop using spatial memory strategies to find their way, relying on habit instead.
  • “If we are paying attention to our environment, we are stimulating our hippocampus, and a bigger hippocampus seems to be protective against Alzheimer’s disease,” Bohbot told me in an email. “When we get lost, it activates the hippocampus, it gets us completely out of the habit mode. Getting lost is good!”
  • practicing navigation is a powerful form of engagement with the environment that can inspire a greater sense of stewardship
Javier E

Mistakes in the Paleo Diet - The Atlantic - 0 views

  • a high-fiber diet came with a 40 percent lower-than-average risk of heart disease. Fiber also seems to protect against metabolic syndrome.
  • One of the mechanisms behind these benefits appears to be that fiber essentially feeds the microbes in our guts, encouraging diverse populations. Those microbes are implicated in a vast array of illnesses, as well as in overall wellbeing. A diet heavy on meat and dairy is necessarily lower in fiber.
  • The basic idea behind Paleo is that humans evolved under certain circumstances over millennia, and then those circumstances changed tremendously in the last century, and our bodies did not keep pace. We find ourselves sedentary and overfed on amalgamations that distort our body’s expectations of “food.”
  • ...5 more annotations...
  • it’s important not to lose sight of the fact that, for all its problems, our modern food system has us living longer and less deprived than in centuries past. The challenge is striking a balance.
  • whole grains have consistently been shown to be part of the diets of the longest-lived, healthiest people.
  • Changing the way we eat is a major undertaking. It will involve multiple decisions every day. Presumably our old habits existed for reasons—convenience, enjoyment, availability, cost, marketing, etc. Modifying the habits that these conditions created means hard work and requires dedication to a cause. I’m not convinced that concern for the health of our bodies years in the future is sufficient.
  • Viktor Frankl wrote in Man’s Search for Meaning that the key is to avoid the temptation to pursue happiness—like that being sold to us through all of the new-year deals—and instead to pursue meaning. Piles of research have shown that a sense of purpose is central to a long, healthy life.
  • There’s purpose to be had in how we eat—in how conscientious we can be, how minimally we can disrupt the world for those who will come after us and those working to produce and procure our food. I think this is a sustainable and worthy resolution for a healthier way to eat, if you’re intent on making one. It works for the mind and body at once, and, most importantly, not just our own.
Javier E

Less cramming. More Frisbee. At Yale, students learn how to live the good life. - The W... - 0 views

  • Santos designed this class after she realized, as the head of a residential college at Yale, that many students were stressed out and unhappy, grinding through long days that seemed to her far more crushing and joyless than her own college years. Her perception was backed up by statistics
  • a national survey that found nearly half of college students reported overwhelming anxiety and feeling hopeless.
  • “They feel they’re in this crazy rat race, they’re working so hard they can’t take a single hour off — that’s awful.”
  • ...15 more annotations...
  • The idea behind the class is deceptively simple, and many of the lessons — such as gratitude, helping others, getting enough sleep — are familiar.
  • “A lot of people are waking up, realizing that we’re struggling.”
  • All semester, hundreds of students tried to rewire themselves — to exercise more, to thank their mothers, to care less about the final grade and more about the ideas.
  • But in ways small and large, silly and heartbreakingly earnest, simple and profound, this class changed the conversation at Yale. It surfaced in late-night talks in dorms, it was dissected in newspaper columns, it popped up, again and again, in memes.
  • It’s the application that’s difficult, a point Santos made repeatedly: Our brains often lead us to bad choices, and even when we realize the choices are bad, it’s hard to break habits.
  • In a way, the class is the very essence of a liberal-arts education: learning, exploration, insight into oneself and the world. But many students described it as entirely unlike any class they had ever taken — nothing to do with academics, and everything to do with life.
  • There’s no longer the same stigma around mental-health issues, he said. “Now, so many people are admitting they want to lead happier lives.”
  • The impact is not limited to Yale. Stories about PSYC157 spread around the world. Santos created a pared-down version of the class and offered it to anyone on the online education site Coursera.
  • She taught students about cognitive biases.
  • “We called it the ways in which our minds suck,” Forti said. “Our minds make us think that certain things make us happy, but they don’t.”
  • Then, they had to apply the lessons
  • There was a palpable difference on campus, several students said, during the week when they performed random acts of kindness.
  • The biggest misconception people have about the class is that Santos is offering some kind of easy happiness fix. “It’s something you have to work on every day. . . . If I keep using these skills, they’ll, over time, help me develop better habits and be happier.”
  • So many students have told her the class changed their lives. “If you’re really grateful, show me that,” she told them. “Change the culture.”
  • for now students stood and clapped and clapped and clapped, beaming, drowning out even Kanye with their standing ovation. As if they had nothing but time.
Javier E

They're Watching You at Work - Don Peck - The Atlantic - 2 views

  • Predictive statistical analysis, harnessed to big data, appears poised to alter the way millions of people are hired and assessed.
  • By one estimate, more than 98 percent of the world’s information is now stored digitally, and the volume of that data has quadrupled since 2007.
  • The application of predictive analytics to people’s careers—an emerging field sometimes called “people analytics”—is enormously challenging, not to mention ethically fraught
  • ...52 more annotations...
  • By the end of World War II, however, American corporations were facing severe talent shortages. Their senior executives were growing old, and a dearth of hiring from the Depression through the war had resulted in a shortfall of able, well-trained managers. Finding people who had the potential to rise quickly through the ranks became an overriding preoccupation of American businesses. They began to devise a formal hiring-and-management system based in part on new studies of human behavior, and in part on military techniques developed during both world wars, when huge mobilization efforts and mass casualties created the need to get the right people into the right roles as efficiently as possible. By the 1950s, it was not unusual for companies to spend days with young applicants for professional jobs, conducting a battery of tests, all with an eye toward corner-office potential.
  • But companies abandoned their hard-edged practices for another important reason: many of their methods of evaluation turned out not to be very scientific.
  • this regime, so widespread in corporate America at mid-century, had almost disappeared by 1990. “I think an HR person from the late 1970s would be stunned to see how casually companies hire now,”
  • Many factors explain the change, he said, and then he ticked off a number of them: Increased job-switching has made it less important and less economical for companies to test so thoroughly. A heightened focus on short-term financial results has led to deep cuts in corporate functions that bear fruit only in the long term. The Civil Rights Act of 1964, which exposed companies to legal liability for discriminatory hiring practices, has made HR departments wary of any broadly applied and clearly scored test that might later be shown to be systematically biased.
  • about a quarter of the country’s corporations were using similar tests to evaluate managers and junior executives, usually to assess whether they were ready for bigger roles.
  • Haringa has encouraged the company’s HR executives to think about applying the games to the recruitment and evaluation of all professional workers.
  • Knack makes app-based video games, among them Dungeon Scrawl, a quest game requiring the player to navigate a maze and solve puzzles, and Wasabi Waiter, which involves delivering the right sushi to the right customer at an increasingly crowded happy hour. These games aren’t just for play: they’ve been designed by a team of neuroscientists, psychologists, and data scientists to suss out human potential. Play one of them for just 20 minutes, says Guy Halfteck, Knack’s founder, and you’ll generate several megabytes of data, exponentially more than what’s collected by the SAT or a personality test. How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality. The end result, Halfteck says, is a high-resolution portrait of your psyche and intellect, and an assessment of your potential as a leader or an innovator.
  • When the results came back, Haringa recalled, his heart began to beat a little faster. Without ever seeing the ideas, without meeting or interviewing the people who’d proposed them, without knowing their title or background or academic pedigree, Knack’s algorithm had identified the people whose ideas had panned out. The top 10 percent of the idea generators as predicted by Knack were in fact those who’d gone furthest in the process.
  • What Knack is doing, Haringa told me, “is almost like a paradigm shift.” It offers a way for his GameChanger unit to avoid wasting time on the 80 people out of 100—nearly all of whom look smart, well-trained, and plausible on paper—whose ideas just aren’t likely to work out.
  • Aptitude, skills, personal history, psychological stability, discretion, loyalty—companies at the time felt they had a need (and the right) to look into them all. That ambit is expanding once again, and this is undeniably unsettling. Should the ideas of scientists be dismissed because of the way they play a game? Should job candidates be ranked by what their Web habits say about them? Should the “data signature” of natural leaders play a role in promotion? These are all live questions today, and they prompt heavy concerns: that we will cede one of the most subtle and human of skills, the evaluation of the gifts and promise of other people, to machines; that the models will get it wrong; that some people will never get a shot in the new workforce.
  • scoring distance from work could violate equal-employment-opportunity standards. Marital status? Motherhood? Church membership? “Stuff like that,” Meyerle said, “we just don’t touch”—at least not in the U.S., where the legal environment is strict. Meyerle told me that Evolv has looked into these sorts of factors in its work for clients abroad, and that some of them produce “startling results.”
  • consider the alternative. A mountain of scholarly literature has shown that the intuitive way we now judge professional potential is rife with snap judgments and hidden biases, rooted in our upbringing or in deep neurological connections that doubtless served us well on the savanna but would seem to have less bearing on the world of work.
  • We may like to think that society has become more enlightened since those days, and in many ways it has, but our biases are mostly unconscious, and they can run surprisingly deep. Consider race. For a 2004 study called “Are Emily and Greg More Employable Than Lakisha and Jamal?,” the economists Sendhil Mullainathan and Marianne Bertrand put white-sounding names (Emily Walsh, Greg Baker) or black-sounding names (Lakisha Washington, Jamal Jones) on similar fictitious résumés, which they then sent out to a variety of companies in Boston and Chicago. To get the same number of callbacks, they learned, they needed to either send out half again as many résumés with black names as those with white names, or add eight extra years of relevant work experience to the résumés with black names.
  • Lauren Rivera, a sociologist at Northwestern, spent parts of the three years from 2006 to 2008 interviewing professionals from elite investment banks, consultancies, and law firms about how they recruited, interviewed, and evaluated candidates, and concluded that among the most important factors driving their hiring recommendations were—wait for it—shared leisure interests.
  • Lacking “reliable predictors of future performance,” Rivera writes, “assessors purposefully used their own experiences as models of merit.” Former college athletes “typically prized participation in varsity sports above all other types of involvement.” People who’d majored in engineering gave engineers a leg up, believing they were better prepared.
  • the prevailing system of hiring and management in this country involves a level of dysfunction that should be inconceivable in an economy as sophisticated as ours. Recent survey data collected by the Corporate Executive Board, for example, indicate that nearly a quarter of all new hires leave their company within a year of their start date, and that hiring managers wish they’d never extended an offer to one out of every five members on their team
  • In the late 1990s, as these assessments shifted from paper to digital formats and proliferated, data scientists started doing massive tests of what makes for a successful customer-support technician or salesperson. This has unquestionably improved the quality of the workers at many firms.
  • In 2010, however, Xerox switched to an online evaluation that incorporates personality testing, cognitive-skill assessment, and multiple-choice questions about how the applicant would handle specific scenarios that he or she might encounter on the job. An algorithm behind the evaluation analyzes the responses, along with factual information gleaned from the candidate’s application, and spits out a color-coded rating: red (poor candidate), yellow (middling), or green (hire away). Those candidates who score best, I learned, tend to exhibit a creative but not overly inquisitive personality, and participate in at least one but not more than four social networks, among many other factors. (Previous experience, one of the few criteria that Xerox had explicitly screened for in the past, turns out to have no bearing on either productivity or retention.) A minimal sketch of this kind of weighted, color-coded scoring appears after this list of annotations.
  • When Xerox started using the score in its hiring decisions, the quality of its hires immediately improved. The rate of attrition fell by 20 percent in the initial pilot period, and over time, the number of promotions rose. Xerox still interviews all candidates in person before deciding to hire them, Morse told me, but, she added, “We’re getting to the point where some of our hiring managers don’t even want to interview anymore”
  • Gone are the days, Ostberg told me, when, say, a small survey of college students would be used to predict the statistical validity of an evaluation tool. “We’ve got a data set of 347,000 actual employees who have gone through these different types of assessments or tools,” he told me, “and now we have performance-outcome data, and we can split those and slice and dice by industry and location.”
  • Evolv’s tests allow companies to capture data about everybody who applies for work, and everybody who gets hired—a complete data set from which sample bias, long a major vexation for industrial-organization psychologists, simply disappears. The sheer number of observations that this approach makes possible allows Evolv to say with precision which attributes matter more to the success of retail-sales workers (decisiveness, spatial orientation, persuasiveness) or customer-service personnel at call centers (rapport-building)
  • There are some data that Evolv simply won’t use, out of a concern that the information might lead to systematic bias against whole classes of people
  • the idea that hiring was a science fell out of favor. But now it’s coming back, thanks to new technologies and methods of analysis that are cheaper, faster, and much-wider-ranging than what we had before
  • what most excites him are the possibilities that arise from monitoring the entire life cycle of a worker at any given company.
  • Now the two companies are working together to marry pre-hire assessments to an increasing array of post-hire data: about not only performance and duration of service but also who trained the employees; who has managed them; whether they were promoted to a supervisory role, and how quickly; how they performed in that role; and why they eventually left.
  • What begins with an online screening test for entry-level workers ends with the transformation of nearly every aspect of hiring, performance assessment, and management.
  • I turned to Sandy Pentland, the director of the Human Dynamics Laboratory at MIT. In recent years, Pentland has pioneered the use of specialized electronic “badges” that transmit data about employees’ interactions as they go about their days. The badges capture all sorts of information about formal and informal conversations: their length; the tone of voice and gestures of the people involved; how much those people talk, listen, and interrupt; the degree to which they demonstrate empathy and extroversion; and more. Each badge generates about 100 data points a minute.
  • he tried the badges out on about 2,500 people, in 21 different organizations, and learned a number of interesting lessons. About a third of team performance, he discovered, can usually be predicted merely by the number of face-to-face exchanges among team members. (Too many is as much of a problem as too few.) Using data gathered by the badges, he was able to predict which teams would win a business-plan contest, and which workers would (rightly) say they’d had a “productive” or “creative” day. Not only that, but he claimed that his researchers had discovered the “data signature” of natural leaders, whom he called “charismatic connectors” and all of whom, he reported, circulate actively, give their time democratically to others, engage in brief but energetic conversations, and listen at least as much as they talk.
  • His group is developing apps to allow team members to view their own metrics more or less in real time, so that they can see, relative to the benchmarks of highly successful employees, whether they’re getting out of their offices enough, or listening enough, or spending enough time with people outside their own team.
  • Torrents of data are routinely collected by American companies and now sit on corporate servers, or in the cloud, awaiting analysis. Bloomberg reportedly logs every keystroke of every employee, along with their comings and goings in the office. The Las Vegas casino Harrah’s tracks the smiles of the card dealers and waitstaff on the floor (its analytics team has quantified the impact of smiling on customer satisfaction). E‑mail, of course, presents an especially rich vein to be mined for insights about our productivity, our treatment of co-workers, our willingness to collaborate or lend a hand, our patterns of written language, and what those patterns reveal about our intelligence, social skills, and behavior.
  • people analytics will ultimately have a vastly larger impact on the economy than the algorithms that now trade on Wall Street or figure out which ads to show us. He reminded me that we’ve witnessed this kind of transformation before in the history of management science. Near the turn of the 20th century, both Frederick Taylor and Henry Ford famously paced the factory floor with stopwatches, to improve worker efficiency.
  • “The quantities of data that those earlier generations were working with,” he said, “were infinitesimal compared to what’s available now. There’s been a real sea change in the past five years, where the quantities have just grown so large—petabytes, exabytes, zetta—that you start to be able to do things you never could before.”
  • People analytics will unquestionably provide many workers with more options and more power. Gild, for example, helps companies find undervalued software programmers, working indirectly to raise those people’s pay. Other companies are doing similar work. One called Entelo, for instance, specializes in using algorithms to identify potentially unhappy programmers who might be receptive to a phone call.
  • He sees it not only as a boon to a business’s productivity and overall health but also as an important new tool that individual employees can use for self-improvement: a sort of radically expanded The 7 Habits of Highly Effective People, custom-written for each of us, or at least each type of job, in the workforce.
  • the most exotic development in people analytics today is the creation of algorithms to assess the potential of all workers, across all companies, all the time.
  • The way Gild arrives at these scores is not simple. The company’s algorithms begin by scouring the Web for any and all open-source code, and for the coders who wrote it. They evaluate the code for its simplicity, elegance, documentation, and several other factors, including the frequency with which it’s been adopted by other programmers. For code that was written for paid projects, they look at completion times and other measures of productivity. Then they look at questions and answers on social forums such as Stack Overflow, a popular destination for programmers seeking advice on challenging projects. They consider how popular a given coder’s advice is, and how widely that advice ranges.
  • The algorithms go further still. They assess the way coders use language on social networks from LinkedIn to Twitter; the company has determined that certain phrases and words used in association with one another can distinguish expert programmers from less skilled ones. Gild knows these phrases and words are associated with good coding because it can correlate them with its evaluation of open-source code, and with the language and online behavior of programmers in good positions at prestigious companies.
  • having made those correlations, Gild can then score programmers who haven’t written open-source code at all, by analyzing the host of clues embedded in their online histories. They’re not all obvious, or easy to explain. Vivienne Ming, Gild’s chief scientist, told me that one solid predictor of strong coding is an affinity for a particular Japanese manga site.
  • Gild’s CEO, Sheeroy Desai, told me he believes his company’s approach can be applied to any occupation characterized by large, active online communities, where people post and cite individual work, ask and answer professional questions, and get feedback on projects. Graphic design is one field that the company is now looking at, and many scientific, technical, and engineering roles might also fit the bill. Regardless of their occupation, most people leave “data exhaust” in their wake, a kind of digital aura that can reveal a lot about a potential hire.
  • professionally relevant personality traits can be judged effectively merely by scanning Facebook feeds and photos. LinkedIn, of course, captures an enormous amount of professional data and network information, across just about every profession. A controversial start-up called Klout has made its mission the measurement and public scoring of people’s online social influence.
  • Mullainathan expressed amazement at how little most creative and professional workers (himself included) know about what makes them effective or ineffective in the office. Most of us can’t even say with any certainty how long we’ve spent gathering information for a given project, or our pattern of information-gathering, never mind know which parts of the pattern should be reinforced, and which jettisoned. As Mullainathan put it, we don’t know our own “production function.”
  • Over time, better job-matching technologies are likely to begin serving people directly, helping them see more clearly which jobs might suit them and which companies could use their skills. In the future, Gild plans to let programmers see their own profiles and take skills challenges to try to improve their scores. It intends to show them its estimates of their market value, too, and to recommend coursework that might allow them to raise their scores even more. Not least, it plans to make accessible the scores of typical hires at specific companies, so that software engineers can better see the profile they’d need to land a particular job
  • Knack, for its part, is making some of its video games available to anyone with a smartphone, so people can get a better sense of their strengths, and of the fields in which their strengths would be most valued. (Palo Alto High School recently adopted the games to help students assess careers.) Ultimately, the company hopes to act as matchmaker between a large network of people who play its games (or have ever played its games) and a widening roster of corporate clients, each with its own specific profile for any given type of job.
  • When I began my reporting for this story, I was worried that people analytics, if it worked at all, would only widen the divergent arcs of our professional lives, further gilding the path of the meritocratic elite from cradle to grave, and shutting out some workers more definitively. But I now believe the opposite is likely to happen, and that we’re headed toward a labor market that’s fairer to people at every stage of their careers
  • For decades, as we’ve assessed people’s potential in the professional workforce, the most important piece of data—the one that launches careers or keeps them grounded—has been educational background: typically, whether and where people went to college, and how they did there. Over the past couple of generations, colleges and universities have become the gatekeepers to a prosperous life. A degree has become a signal of intelligence and conscientiousness, one that grows stronger the more selective the school and the higher a student’s GPA, that is easily understood by employers, and that, until the advent of people analytics, was probably unrivaled in its predictive powers.
  • the limitations of that signal—the way it degrades with age, its overall imprecision, its many inherent biases, its extraordinary cost—are obvious. “Academic environments are artificial environments,” Laszlo Bock, Google’s senior vice president of people operations, told The New York Times in June. “People who succeed there are sort of finely trained, they’re conditioned to succeed in that environment,” which is often quite different from the workplace.
  • because one’s college history is such a crucial signal in our labor market, perfectly able people who simply couldn’t sit still in a classroom at the age of 16, or who didn’t have their act together at 18, or who chose not to go to graduate school at 22, routinely get left behind for good. That such early factors so profoundly affect career arcs and hiring decisions made two or three decades later is, on its face, absurd.
  • I spoke with managers at a lot of companies who are using advanced analytics to reevaluate and reshape their hiring, and nearly all of them told me that their research is leading them toward pools of candidates who didn’t attend college—for tech jobs, for high-end sales positions, for some managerial roles. In some limited cases, this is because their analytics revealed no benefit whatsoever to hiring people with college degrees; in other cases, and more often, it’s because they revealed signals that function far better than college history,
  • Google, too, is hiring a growing number of nongraduates. Many of the people I talked with reported that when it comes to high-paying and fast-track jobs, they’re reducing their preference for Ivy Leaguers and graduates of other highly selective schools.
  • This process is just beginning. Online courses are proliferating, and so are online markets that involve crowd-sourcing. Both arenas offer new opportunities for workers to build skills and showcase competence. Neither produces the kind of instantly recognizable signals of potential that a degree from a selective college, or a first job at a prestigious firm, might. That’s a problem for traditional hiring managers, because sifting through lots of small signals is so difficult and time-consuming.
  • all of these new developments raise philosophical questions. As professional performance becomes easier to measure and see, will we become slaves to our own status and potential, ever-focused on the metrics that tell us how and whether we are measuring up? Will too much knowledge about our limitations hinder achievement and stifle our dreams? All I can offer in response to these questions, ironically, is my own gut sense, which leads me to feel cautiously optimistic.
  • Google’s understanding of the promise of analytics is probably better than anybody else’s, and the company has been changing its hiring and management practices as a result of its ongoing analyses. (Brainteasers are no longer used in interviews, because they do not correlate with job success; GPA is not considered for anyone more than two years out of school, for the same reason—the list goes on.) But for all of Google’s technological enthusiasm, these same practices are still deeply human. A real, live person looks at every résumé the company receives. Hiring decisions are made by committee and are based in no small part on opinions formed during structured interviews.
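
A note on the scoring described above: the Xerox/Evolv evaluation reduces to a weighted score over assessment features, bucketed into a traffic-light rating. The sketch below is a minimal illustration of that idea in Python. Every feature name, weight, and threshold is invented for the example; the real systems fit their weights to performance and retention outcomes across huge applicant datasets rather than hand-tuning them.

# A minimal, hypothetical sketch of a Xerox/Evolv-style color-coded
# screening score. All feature names, weights, and thresholds here are
# invented for illustration, not taken from the article.

def screening_score(candidate):
    """Map a candidate's assessment features to a red/yellow/green rating."""
    # Hypothetical weights; a real system would learn these from
    # performance and retention outcomes rather than hand-tuning.
    weights = {
        "creativity": 0.4,         # from personality testing
        "inquisitiveness": -0.2,   # "creative but not overly inquisitive"
        "scenario_judgment": 0.3,  # multiple-choice scenario questions
        "cognitive_skill": 0.3,
    }
    score = sum(w * candidate.get(f, 0.0) for f, w in weights.items())

    # Some signals are non-linear: the article's "at least one but not
    # more than four social networks" finding is an inverted U, not a slope.
    if 1 <= candidate.get("social_networks", 0) <= 4:
        score += 0.2

    if score >= 0.7:
        return "green"   # hire away
    if score >= 0.4:
        return "yellow"  # middling
    return "red"         # poor candidate

print(screening_score({
    "creativity": 0.9,
    "inquisitiveness": 0.3,
    "scenario_judgment": 0.8,
    "cognitive_skill": 0.7,
    "social_networks": 2,
}))  # -> green

The social-network rule mirrors the article's "at least one but not more than four" finding: useful signals are not always linear, so the inverted U is handled separately here, where a production model would learn such shapes from the data.
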
Javier E

How to Get Your Mind to Read - The New York Times - 1 views

  • Americans’ trouble with reading predates digital technologies. The problem is not bad reading habits engendered by smartphones, but bad education habits engendered by a misunderstanding of how the mind reads.
  • Just how bad is our reading problem? The last National Assessment of Adult Literacy from 2003 is a bit dated, but it offers a picture of Americans’ ability to read in everyday situations: using an almanac to find a particular fact, for example, or explaining the meaning of a metaphor used in a story. Of those who finished high school but did not continue their education, 13 percent could not perform simple tasks like these.
  • When things got more complex — in comparing two newspaper editorials with different interpretations of scientific evidence or examining a table to evaluate credit card offers — 95 percent failed.
  • ...17 more annotations...
  • poor readers can sound out words from print, so in that sense, they can read. Yet they are functionally illiterate — they comprehend very little of what they can sound out. So what does comprehension require? Broad vocabulary, obviously. Equally important, but more subtle, is the role played by factual knowledge.
  • All prose has factual gaps that must be filled by the reader.
  • Knowledge also provides context.
  • You might think, then, that authors should include all the information needed to understand what they write.
  • But those details would make prose long and tedious for readers who already know the information. “Write for your audience” means, in part, gambling on what they know.
  • Current education practices show that reading comprehension is misunderstood. It’s treated like a general skill that can be applied with equal success to all texts. Rather, comprehension is intimately intertwined with knowledge.
  • students who score well on reading tests are those with broad knowledge; they usually know at least a little about the topics of the passages on the test.
  • One experiment tested 11th graders’ general knowledge with questions from science (“pneumonia affects which part of the body?”), history (“which American president resigned because of the Watergate scandal?”), as well as the arts, civics, geography, athletics and literature. Scores on this general knowledge test were highly associated with reading test scores.
  • That suggests three significant changes in schooling.
  • First, it points to decreasing the time spent on literacy instruction in early grades.
  • Third-graders spend 56 percent of their time on literacy activities but 6 percent each on science and social studies. This disproportionate emphasis on literacy backfires in later grades, when children’s lack of subject matter knowledge impedes comprehension.
  • Another positive step would be to use high-information texts in early elementary grades. Historically, they have been light in content.
  • Second, understanding the importance of knowledge to reading ought to make us think differently about year-end standardized tests. If a child has studied New Zealand, she ought to be good at reading and thinking about passages on New Zealand. Why test her reading with a passage about spiders, or the Titanic?
  • Third, the systematic building of knowledge must be a priority in curriculum design.
  • The Common Core Standards for reading specify nearly nothing by way of content that children are supposed to know — the document valorizes reading skills. State officials should go beyond the Common Core Standards by writing content-rich grade-level standards
  • Don’t blame the internet, or smartphones, or fake news for Americans’ poor reading. Blame ignorance. Turning the tide will require profound changes in how reading is taught, in standardized testing and in school curriculums. Underlying all these changes must be a better understanding of how the mind comprehends what it reads.
  • Daniel T. Willingham (@DTWillingham) is a professor of psychology at the University of Virginia and the author, most recently, of “The Reading Mind: A Cognitive Approach to Understanding How the Mind Reads.”