TOK Friends / Group items tagged i

sissij

I know they've seen my message - so why haven't they replied? | Culture | The Guardian - 0 views

  • Ah, the tyranny of read receipts – enough to put you off digital communication for good.
  • It sounds straightforward enough, even perfunctory, and indeed it is if it’s only a blip in the back-and-forth. But when a message lingers on “seen” without explanation for anything beyond a few minutes, you’ve been “left on read”. It’s enough to make even the most self-assured individuals question their worth.
  • It works both ways, too: if you’ve read a message that you’re either unable or unwilling to respond to immediately, the countdown has already started.
  • You never picture them driving, or in the bath, or with relatives who don’t believe in phones at the table. In the moment, the likelihood of their secretly resenting you, or agonising over a reply that is certain to disappoint, seems far greater than it actually is.
  • The anxiety of being left on read is silly but it is real, and unique to this time. There is no analog equivalent.
  • but in that case I’d like to think you’d give them the benefit of the doubt, and assume they’d fallen over in the shower or something.
  • There’s no such goodwill in web 2.0, when everyone is assumed to be available at all times. And if not – it’s personal.
  • well, is it any wonder anxiety is so rife among Generation Y?
  • People will go to some lengths to avoid being seen to have “seen” a message – on Snapchat and Facebook, downloading the message then turning on flight mode and opening it can stop it from registering as opened.
  • Turning on “previews” that display on the lock screen will, in many cases, show enough to get the gist of a message (“I think we should break ... ”) without opening it.
  • But while some people contort themselves to avoid being seen to have “seen”, others manipulate that anxiety to their own ends.
  • But maybe read receipts and the games people play with them have just ruined my ability to trust.
  • When we’re used to good things happening instantly, time taken to craft a thoughtful reply is considered a bad thing.
  •  
    I totally agree with the author that read receipts should be optional. I personally have some issues with read receipts because I don't like to reply instantly unless it is an urgent message. I like to take some time to think about what I want to comment on or write back. Although society now likes everything fast and instant, I am still a slow person. I feel that read receipts pressure me to be fast and instant.
Javier E

As a Doctor, I Was Skeptical About the Covid Vaccine. Then I Reviewed the Science. - Th... - 0 views

  • Until last week, I wasn’t sure I would get the vaccine. Some media reports highlight that mRNA vaccines have never been approved for use in humans outside clinical trials, making it seem like a new technology that has not been tested before. The vaccines were developed at such speed, I couldn’t be sure that major side effects hadn’t been overlooked. I worried about autoimmunity caused by expressing the coronavirus spike proteins on my own cells.
  • Every day in the emergency department, patients walk away from essential care against medical advice, and we watch them go with a shake of our heads and a rueful smile. Just like them, isolated with my doubts, I was ready to exercise my right to free will and refuse the vaccine.
  • When my non-medical friends asked me about it, I was torn between telling them my concerns and playacting the doctor who recommends the latest proven therapy.
  • The guilt I felt about this compelled me to objectively review the literature on mRNA vaccines. Not being an expert in virology or biochemistry, I realized I had to quickly master unfamiliar words like “transfection” and concepts about gene sequences. Slowly, the information I was devouring started changing my beliefs.
  • I learned that research into using mRNA for vaccinations and cancer therapies has been ongoing for the past 30 years. Trial and error have refined this modality so that it was almost fully fledged by the time Covid hit
  • The mRNA from the vaccine is broken down quickly in our cells, and the coronavirus spike protein is expressed only transiently on the cell surface.
  • Furthermore, this type of vaccine is harnessing a technique that viruses already use.
  • It was humbling to have to change my mind. As I booked my vaccination time slot, I realized how lucky I am to have access to all this research, as well as the training to understand it.
  • As medical professionals, we cannot afford to be paternalistic and trust that people will follow advice without all the facts. This is especially true in Australia, where the vast majority of us have never witnessed firsthand the ravages that this disease can inflict.
  • Like all new converts, I am now a true believer: I’d like everyone to be vaccinated. But autonomy is a precious tenet of a free society, and I’m glad the ethicists have advised against mandating the vaccine
  • just hope that with more robust discussion and the wider dissemination of scientific knowledge, we may sway people like me — who have what may be valid reservations — to get the vaccine.
anonymous

Beverly Cleary, Beloved Children's Book Author, Dies at 104 - The New York Times - 0 views

  • Beverly Cleary, Beloved Children’s Book Author, Dies at 104
  • Her funny stories about Henry Huggins and his dog Ribsy, the sisters Ramona and Beezus Quimby, and a motorcycling mouse named Ralph never talked down to readers.
  • Beverly Cleary, who enthralled tens of millions of young readers with the adventures and mishaps of Henry Huggins and his dog Ribsy, the bratty Ramona Quimby and her older sister Beezus, and other residents of Klickitat Street, died on Thursday in Carmel, Calif
  • She was 104.
  • Always sympathetic, never condescending, she presented her readers with characters they knew and understood, the 20th-century equivalents of Huck Finn or Louisa May Alcott’s little women, and every bit as popular: Her books sold more than 85 million copies
  • “Cleary is funny in a very sophisticated way,
  • At her library job in Yakima, Ms. Cleary had become dissatisfied with the books being offered to her young patrons
  • The protagonists tended to be aristocratic English children who had nannies and pony carts, or poor children whose problems disappeared when a long-lost rich relative turned up in the last chapter.
  • “I wanted to read funny stories about the sort of children I knew,” she wrote, “and I decided that someday when I grew up I would write them.”
  • After marrying Clarence Cleary, a graduate student she had met at Berkeley, she moved to San Francisco and, while her husband served in the military, sold children’s books at the Sather Gate Book Shop in Berkeley and worked as a librarian at Camp John T. Knight in Oakland.
  • “She gets very close to satire, which I think is why adults like her, but she’s still deeply respectful of her characters — nobody gets a laugh at the expense of another. I think kids appreciate that they’re on a level playing field with adults.”
  • She had been particularly touched by the plight of a group of boys who asked her, “Where are the books about us?”
  • “Why didn’t authors write books about everyday problems that children could solve by themselves?”
  • “Why weren’t there more stories about children playing? Why couldn’t I find more books that would make me laugh? These were the books I wanted to read, and the books I was eventually to write.”
  • “When I began ‘Henry Huggins’ I did not know how to write a book, so I mentally told the stories that I remembered and wrote them down as I told them,”
  • Ramona Quimby, introduced in a small role as the annoying younger sister of Henry’s friend Beatrice, better known as Beezus, emerged as a superstar.
  • “I thought like Ramona, but I was a very well-behaved little girl.”
  • By the time “Beezus and Ramona” was published, Ms. Cleary had twins, Malcolm and Marianne, to provide her with fresh material. They survive her, along with three grandchildren and one great-grandchild. Her husband died in 2004.
  • Ramona mounts a campaign to have her father quit smoking, a habit he abuses after losing his job.
  • That book won the Newbery Medal in 1984. A sequel, “Strider,” followed in 1991.
  • “That little girl, who has remained with me, prevents me from writing down to children, from poking fun at my characters, and from writing an adult reminiscence about childhood instead of a book to be enjoyed by children.”
sanderk

Is Pain a Construct of the Mind? - Scientific American - 0 views

  • Clear Lake found that Rogers could recruit an abnormally high number of muscle fibers. But was this ability because of a freak genetic mutation? Another possibility, which Rogers thinks is more likely, is the way he processes pain when he strains those muscles.
  • What if, instead of superpowered muscles, Rogers has a normal—though extremely well exercised—body, and his abilities arise because he can withstand more pain than most mere mortals?
  • Rogers reasons that, unlike in the dentist's office—where he has no control over the pain that is inflicted on him—he has direct executive control over pain that he inflicts on himself. “I know it's coming, I have an idea of what to expect and I can decide to ignore it,” he says. Confronted with severe pain, most people fear that they will damage their body permanently if they persist, so they stop well before they are in real danger, Rogers explains.
  • Maybe Rogers's muscle cells are normal, and he experiences pain as most of us do but chooses to disregard it when he feels in command.
  • An illusion is a perception that does not match the physical reality. Is pain, then, as with illusions, a mind construct that some people can decide to turn off? As you will see in the studies that follow, pain varies as a function of mood, attentiveness and circumstances, lending support to the theory that pain is an emotion.
  •  
    During practice, my coaches always say that I need to overcome pain but I never knew how. I found this article interesting because it says that pain is an emotion. I have never thought that pain could even be close to an emotion. It is interesting how the world's strongest man can just ignore the pain and keep doing these incredible feats. Controlling one's pain must either be a skill or just a natural-born gift. Like in TOK, we must practice skills such as controlling our emotions. If pain is an emotion, then theoretically, I could control it. I am going to test this out in my life and see if I can control my own pain during exercise.
blythewallick

Out-Of-Body Experiences: Mine Is Finally Explained | Psychology Today - 0 views

  • Sleep deprivation had disturbed my vestibular system, making me feel drifting or floating, and had especially interfered with my right TPJ and with it my body schema (Chee & Chua 2007, Quarck et al 2006). Nearly four hours of holding out my arm for the Ouija board had confused my body schema even more. My attention kept wandering and my short term memory was reduced by cannabis (Earleywine 2002).
  • With my hyperexcitable cortex (Braithwaite et al 2013) already disinhibited by the combination of sleep deprivation and cannabis, it went into random firing, producing an illusory central light and the form constants of spirals and tunnels (Cowan 1982). Disinhibited motion detectors produced illusory movement and as the light grew bigger I seemed to move towards it
  • My auditory cortex was similarly hyperactive, producing random low-frequency repetitive sounds that drowned out the music. It sounded to me like the pounding of horses’ hooves. I was galloping fast down the tunnel towards the light.
  • ‘Where are you, Sue?’ I was brought up short. I tried to picture my own body and where it really was, but my prefrontal cortex was deactivated as the brain hovered on the edge of sleep (Muzur et al 2002). With my TPJ disturbed it was impossible to combine a body schema with vestibular and sensory input to give a firm sense of an embodied self (Blanke et al 2002).
  • The roofs, gutters and chimneys I saw were just as I imagined them, not as they were. So were the cities, lakes, oceans and islands I saw. I laughed at the vivid ‘star-shaped island with a hundred trees’, believing it was a thought-form in the astral plane (Besant 1896, Findlay 1931) because that was the only theory I knew.
  • I was too tired to do more than glimpse this new vastness. In exhaustion, I seemed to face a choice, to stay in this marvelous, right-seeming, perfect state, or return to ordinary life. The choice made itself and the struggle began. After more than two hours of serious disturbance, this brain took some time to reinstate both body schema and self-image and even then confused my own body with others. When I opened my eyes I felt and saw greyish body-shapes around the others as well as myself; displaced body schemas that gradually faded until I was (more or less) back to normal. Yet nothing was ever quite the same again.”
  • But that’s the joy of doing science at all. I have not, in these posts, covered the tunnel experience, the silver cord and several other features more commonly found during near-death experiences, but I may return to them in future. For now I hope you have enjoyed this series of OBE stories.
Javier E

Opinion | Elle Mills: Why I Quit YouTube - The New York Times - 0 views

  • The peak of my YouTube career didn’t always match my childhood fantasy of what this sort of fame might look like. Instead, I was constantly terrified of losing my audience and the validation that came with it. My self-worth had become so intertwined with my career that maintaining it genuinely felt life-or-death. I was stuck in a never-ending cycle of constantly trying to top myself to remain relevant.
  • YouTube soon became a game of, “What’s the craziest thing you’d do for attention?”
  • there’s an overwhelming guilt I feel when I look back at all those who naïvely participated in my videos. A part of me feels like I took advantage of their own longing to be seen. I gained fame and success from the exploitation of their lives. They didn’t.
  • I knew that my audience wanted to feel authenticity from me. To give that to them, I revealed pieces of myself that I might have been wiser to keep private.
  • when metrics substitute for self-worth, it’s easy to fall into the trap of giving precious pieces of yourself away to feed an audience that’s always hungry for more and more.
  • In 2018, I impulsively released a video about my struggle with burnout, which featured intimate footage of my emotional breakdowns. Those breakdowns were, in part, a product of severe anxiety and depression brought about by chasing the exact success for which many other teenagers yearn.
  • I was entering adulthood and trying to live my childhood dream, but now, to be “authentic,” I had to be the product I had long been posting online, as opposed to the person I was growing up to be.
  • Online culture encourages young people to turn themselves into a product at an age when they’re only starting to discover who they are. When an audience becomes emotionally invested in a version of you that you outgrow, keeping the product you’ve made aligned with yourself becomes an impossible dilemma.
  • Sometimes, I barely recognize the person I used to be. Although a part of me resents that I’ll never be able to forget her, I’m also grateful to her. My YouTube channel, for all the trouble it brought me, connected me to the people who wanted to hear my stories and prepared me for a real shot at a directing career. In the last year, I’ve directed a short film and am writing a feature, which showed me new ways of creating that aren’t at the expense of my privacy.
Javier E

'The Godfather of AI' Quits Google and Warns of Danger Ahead - The New York Times - 0 views

  • he officially joined a growing chorus of critics who say those companies are racing toward danger with their aggressive campaign to create products based on generative artificial intelligence, the technology that powers popular chatbots like ChatGPT.
  • Dr. Hinton said he has quit his job at Google, where he has worked for more than a decade and became one of the most respected voices in the field, so he can freely speak out about the risks of A.I. A part of him, he said, now regrets his life’s work.
  • “I console myself with the normal excuse: If I hadn’t done it, somebody else would have,”
  • Industry leaders believe the new A.I. systems could be as important as the introduction of the web browser in the early 1990s and could lead to breakthroughs in areas ranging from drug research to education.
  • But gnawing at many industry insiders is a fear that they are releasing something dangerous into the wild. Generative A.I. can already be a tool for misinformation. Soon, it could be a risk to jobs. Somewhere down the line, tech’s biggest worriers say, it could be a risk to humanity.
  • “It is hard to see how you can prevent the bad actors from using it for bad things,” Dr. Hinton said.
  • After the San Francisco start-up OpenAI released a new version of ChatGPT in March, more than 1,000 technology leaders and researchers signed an open letter calling for a six-month moratorium on the development of new systems because A.I. technologies pose “profound risks to society and humanity.”
  • Several days later, 19 current and former leaders of the Association for the Advancement of Artificial Intelligence, a 40-year-old academic society, released their own letter warning of the risks of A.I. That group included Eric Horvitz, chief scientific officer at Microsoft, which has deployed OpenAI’s technology across a wide range of products, including its Bing search engine.
  • Dr. Hinton, often called “the Godfather of A.I.,” did not sign either of those letters and said he did not want to publicly criticize Google or other companies until he had quit his job
  • Dr. Hinton, a 75-year-old British expatriate, is a lifelong academic whose career was driven by his personal convictions about the development and use of A.I. In 1972, as a graduate student at the University of Edinburgh, Dr. Hinton embraced an idea called a neural network. A neural network is a mathematical system that learns skills by analyzing data. At the time, few researchers believed in the idea. But it became his life’s work.
  • Dr. Hinton is deeply opposed to the use of artificial intelligence on the battlefield — what he calls “robot soldiers.”
  • In 2012, Dr. Hinton and two of his students in Toronto, Ilya Sutskever and Alex Krizhevsky, built a neural network that could analyze thousands of photos and teach itself to identify common objects, such as flowers, dogs and cars.
  • In 2018, Dr. Hinton and two other longtime collaborators received the Turing Award, often called “the Nobel Prize of computing,” for their work on neural networks.
  • Around the same time, Google, OpenAI and other companies began building neural networks that learned from huge amounts of digital text. Dr. Hinton thought it was a powerful way for machines to understand and generate language, but it was inferior to the way humans handled language.
  • Then, last year, as Google and OpenAI built systems using much larger amounts of data, his view changed. He still believed the systems were inferior to the human brain in some ways but he thought they were eclipsing human intelligence in others.
  • “Maybe what is going on in these systems,” he said, “is actually a lot better than what is going on in the brain.”
  • As companies improve their A.I. systems, he believes, they become increasingly dangerous. “Look at how it was five years ago and how it is now,” he said of A.I. technology. “Take the difference and propagate it forwards. That’s scary.”
  • Until last year, he said, Google acted as a “proper steward” for the technology, careful not to release something that might cause harm. But now that Microsoft has augmented its Bing search engine with a chatbot — challenging Google’s core business — Google is racing to deploy the same kind of technology. The tech giants are locked in a competition that might be impossible to stop, Dr. Hinton said.
  • His immediate concern is that the internet will be flooded with false photos, videos and text, and the average person will “not be able to know what is true anymore.”
  • He is also worried that A.I. technologies will in time upend the job market. Today, chatbots like ChatGPT tend to complement human workers, but they could replace paralegals, personal assistants, translators and others who handle rote tasks. “It takes away the drudge work,” he said. “It might take away more than that.”
  • Down the road, he is worried that future versions of the technology pose a threat to humanity because they often learn unexpected behavior from the vast amounts of data they analyze. This becomes an issue, he said, as individuals and companies allow A.I. systems not only to generate their own computer code but actually run that code on their own.
  • And he fears a day when truly autonomous weapons — those killer robots — become reality.
  • “The idea that this stuff could actually get smarter than people — a few people believed that,” he said. “But most people thought it was way off. And I thought it was way off. I thought it was 30 to 50 years or even longer away. Obviously, I no longer think that.”
  • Many other experts, including many of his students and colleagues, say this threat is hypothetical. But Dr. Hinton believes that the race between Google and Microsoft and others will escalate into a global race that will not stop without some sort of global regulation.
  • But that may be impossible, he said. Unlike with nuclear weapons, he said, there is no way of knowing whether companies or countries are working on the technology in secret. The best hope is for the world’s leading scientists to collaborate on ways of controlling the technology. “I don’t think they should scale this up more until they have understood whether they can control it,” he said.
  • Dr. Hinton said that when people used to ask him how he could work on technology that was potentially dangerous, he would paraphrase Robert Oppenheimer, who led the U.S. effort to build the atomic bomb: “When you see something that is technically sweet, you go ahead and do it.”
  • He does not say that anymore.
Emily Horwitz

Upside of Distraction - NYTimes.com - 0 views

  • Writing a book consists largely of avoiding distractions. If you can forget your real circumstances and submerge yourself in your subject for hours every day, characters become more human, sentences become clearer and prettier. But utter devotion to the principle that distraction is Satan and writing is paramount can be just as poisonous as an excess of diversion.
  • Monomania is what it sounds like: a pathologically intense focus on one thing.
  • It’s the opposite of the problem you have, in other words, if you are a normal, contemporary, non-agrarian 30-something.
  • There was nothing to do besides read, write, reflect on God and drink. It was a circumstance favorable to writing fiction. But it was also conducive to depravity, the old Calvinist definition thereof: a warping of the spirit.
  • When I socialized, it was often with poets, who confirmed by their very existence that I had landed in a better, vanished time. Even their physical ailments were of the 19th century. One day, in the depths of winter, I came upon one of them picking his way across the snow and ice on crutches, pausing to drag on his cigarette.
  • The disaster unfolded slowly. The professors and students were diplomatic, but a pall of boredom fell over the seminar table when my work was under discussion. I could see everyone struggling to care. And then, trying feverishly to write something that would engage people, I got worse. First my writing became overthought, and then it went rank with the odor of desperation. It got to the point that every chapter, short story, every essay was trash.
  • It took me a long time to realize that the utter domination of my consciousness by the desire to write well was itself the problem.
  • When good writing was my only goal, I made the quality of my work the measure of my worth. For this reason, I wasn’t able to read my own writing well. I couldn’t tell whether something I had just written was good or bad, because I needed it to be good in order to feel sane.
  • I purged myself of monomania — slowly, and somewhat unwittingly. I fell in love, an overpowering diversion, and began to spend more time at my girlfriend’s place, where she had Wi-Fi, a flat-screen TV and a DVD player.
  • One morning, after I diversified my mania, my writing no longer stank of decay.
  • I’m glad I went to 19th-century Russia. But I wish I had been more careful, more humble, and kept one foot in modernity. The thing about 19th-century Russia is that if you race in, heedless of all but conquest and glory, you get stuck.
  •  
    An interesting article about the need for distractions: if we focus too intently on one thing, we lose the capacity to tell whether our work on it is good or not.
sissij

What is Russell's paradox? - Scientific American - 1 views

  • Russell's paradox is based on examples like this: Consider a group of barbers who shave only those men who do not shave themselves. Suppose there is a barber in this collection who does not shave himself; then by the definition of the collection, he must shave himself. But no barber in the collection can shave himself.
  • We write this description of the set formally as x = {n : n is an integer and 3 < n < 7}. The objects in the set don't have to be numbers. We might let y = {x : x is a male resident of the United States}.
  • What became of the effort to develop a logical foundation for all of mathematics? Mathematicians now recognize that the field can be formalized using so-called Zermelo-Fraenkel set theory. The formal language contains symbols such as ∈ to express "is a member of," = for equality and ∅ to denote the set with no elements.
  •  
    I found this very interesting because it shows that even in math there can be paradoxes that defy logic; there is no perfect logic. I think this paradox is a kind of circular logic because the premise's own assumption is used in the argument. Also, the definition and scope of "those men" is too vague. I think that to make the premise valid, we need to state that the barbers themselves are not included in the reference of "those men"; see the formal sketch below. --Sissi (11/12/2016)
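A minimal formal sketch of Russell's paradox, written in the set-builder notation quoted in the annotation above. The set R and the Zermelo-Fraenkel "separation" schema shown here are the standard textbook formulation, not something taken from the Scientific American piece itself:

```latex
\documentclass{article}
\usepackage{amssymb}
\begin{document}
% Russell's set: the collection of all sets that are not members of themselves.
\[ R = \{\, x : x \notin x \,\} \]
% Asking whether R is a member of itself forces a contradiction either way;
% this is exactly the barber who shaves all and only those men who do not shave themselves.
\[ R \in R \iff R \notin R \]
% Zermelo-Fraenkel set theory avoids the paradox by only permitting "separation"
% from an already existing set A, so the troublesome set of all sets is never formed.
\[ \{\, x \in A : \varphi(x) \,\} \]
\end{document}
```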
sissij

A Beginners Guide To Parkinson's Law: How To Do More Stuff By Giving Yourself... - 1 views

  • Parkinson’s Law: “Work expands so as to fill the time available for its completion.”
  • You had all week to finalize a proposal, but waited to do it until 4:30pm on the Friday.
  • He found that even a series of simple tasks increased in complexity to fill up the time allotted to it. As the length of time allocated to a task became shorter, the task became simpler and easier to solve.
  • Interestingly enough, I worked more but got less done. On top of that, I was stressed all the time.
  • I was an addict, not to work, but to thinking that I was working.
  • Specificity and restrictions create freedom and nourish creativity. Add them to your arsenal of tools as you become an uber productive and efficient creator.
  • I’ve referred to this in the past as “shoot first, aim later” or “jump … and then figure it out on the way down.” Pick a big goal, commit to it, and you’ll probably find that you’re able to figure out a way to achieve it.
  •  
    I found this law very interesting. It also reminds me of the TED Talk we watched during advisory on procrastination. Being busy and being busy and efficient are completely different things. I have a personal experience that agrees with this law. Last week, my days were completely filled with cross-country practice, musical rehearsal, and schoolwork, so every day I went to sleep around eleven o'clock. This week, I suddenly have a lot more time since the cross-country season has ended and the musical is over, but I still go to bed at eleven o'clock and the time I fill still doesn't feel like enough. I was busy this week, but obviously my efficiency was much lower than last week. --Sissi (11/18/2016)
Javier E

The Flight From Conversation - NYTimes.com - 0 views

  • we have sacrificed conversation for mere connection.
  • the little devices most of us carry around are so powerful that they change not only what we do, but also who we are.
  • A businessman laments that he no longer has colleagues at work. He doesn’t stop by to talk; he doesn’t call. He says that he doesn’t want to interrupt them. He says they’re “too busy on their e-mail.”
  • We want to customize our lives. We want to move in and out of where we are because the thing we value most is control over where we focus our attention. We have gotten used to the idea of being in a tribe of one, loyal to our own party.
  • We are tempted to think that our little “sips” of online connection add up to a big gulp of real conversation. But they don’t.
  • “Someday, someday, but certainly not now, I’d like to learn how to have a conversation.”
  • We can’t get enough of one another if we can use technology to keep one another at distances we can control: not too close, not too far, just right. I think of it as a Goldilocks effect. Texting and e-mail and posting let us present the self we want to be. This means we can edit. And if we wish to, we can delete. Or retouch: the voice, the flesh, the face, the body. Not too much, not too little — just right.
  • Human relationships are rich; they’re messy and demanding. We have learned the habit of cleaning them up with technology.
  • I have often heard the sentiment “No one is listening to me.” I believe this feeling helps explain why it is so appealing to have a Facebook page or a Twitter feed — each provides so many automatic listeners. And it helps explain why — against all reason — so many of us are willing to talk to machines that seem to care about us. Researchers around the world are busy inventing sociable robots, designed to be companions to the elderly, to children, to all of us.
  • Connecting in sips may work for gathering discrete bits of information or for saying, “I am thinking about you.” Or even for saying, “I love you.” But connecting in sips doesn’t work as well when it comes to understanding and knowing one another. In conversation we tend to one another.
  • We can attend to tone and nuance. In conversation, we are called upon to see things from another’s point of view.
  • I’m the one who doesn’t want to be interrupted. I think I should. But I’d rather just do things on my BlackBerry.
  • And we use conversation with others to learn to converse with ourselves. So our flight from conversation can mean diminished chances to learn skills of self-reflection
  • we have little motivation to say something truly self-reflective. Self-reflection in conversation requires trust. It’s hard to do anything with 3,000 Facebook friends except connect.
  • we seem almost willing to dispense with people altogether. Serious people muse about the future of computer programs as psychiatrists. A high school sophomore confides to me that he wishes he could talk to an artificial intelligence program instead of his dad about dating; he says the A.I. would have so much more in its database. Indeed, many people tell me they hope that as Siri, the digital assistant on Apple’s iPhone, becomes more advanced, “she” will be more and more like a best friend — one who will listen when others won’t.
  • FACE-TO-FACE conversation unfolds slowly. It teaches patience. When we communicate on our digital devices, we learn different habits. As we ramp up the volume and velocity of online connections, we start to expect faster answers. To get these, we ask one another simpler questions; we dumb down our communications, even on the most important matters.
  • WE expect more from technology and less from one another and seem increasingly drawn to technologies that provide the illusion of companionship without the demands of relationship. Always-on/always-on-you devices provide three powerful fantasies: that we will always be heard; that we can put our attention wherever we want it to be; and that we never have to be alone. Indeed our new devices have turned being alone into a problem that can be solved.
  • When people are alone, even for a few moments, they fidget and reach for a device. Here connection works like a symptom, not a cure, and our constant, reflexive impulse to connect shapes a new way of being.
  • Think of it as “I share, therefore I am.” We use technology to define ourselves by sharing our thoughts and feelings as we’re having them. We used to think, “I have a feeling; I want to make a call.” Now our impulse is, “I want to have a feeling; I need to send a text.”
  • Lacking the capacity for solitude, we turn to other people but don’t experience them as they are. It is as though we use them, need them as spare parts to support our increasingly fragile selves.
  • If we are unable to be alone, we are far more likely to be lonely. If we don’t teach our children to be alone, they will know only how to be lonely.
  • I am a partisan for conversation. To make room for it, I see some first, deliberate steps. At home, we can create sacred spaces: the kitchen, the dining room. We can make our cars “device-free zones.”
Ellie McGinnis

The Drugs of Work-Performance Enhancement - Steven Petrow - The Atlantic - 1 views

  • Adderall makes everything easier to understand; it makes you more alert and focused. Some college students scarf them like M&Ms and think they’re more effective at cognitive enhancement than energy drinks and safer than a smoke or a beer.
  • 4.4 percent of the adult U.S. population has ADHD, which if left untreated is associated with significant morbidity, divorce, unemployment, and substance abuse.
  • Nonetheless, for untold healthy adults (those whom researchers refer to as “mentally competent”) the cognitive-enhancing drug has led to positive changes in their lives.
  • “[Adderall] makes me so happy I can be at a family function or out socializing and not get too distracted by other events/conversations around me. I can hear them, but am not taken in by them.”
  • “Since being on Adderall, I have been insanely productive… I have paid all my outstanding bills and parking tickets (and even renewed my car's registration before it was due). I'm not late for things anymore… I have not spent a single day lying around my house doing nothing in the past few months. I have a budget, and a scheduler that I actually use.”
  • When she asked me why I needed it, I replied just as the college kids had on 60 Minutes: “For focus.”  
  • Did it make me smarter? No. Did it make me a faster writer? Yes. Previously, when I’d sit down at my desk, I felt adrift at sea. It was as though my MacBook and research materials, piled high, swayed from left to right and then back again. It was dizzying; I just couldn’t get a grip.
  • My metaphoric double vision snapped to mono and I could see and think as clearly as if I’d stepped out of a fog. I’d never had such concentration and it showed in the number of well-written pages I produced daily
  • But with Adderall, I had knowledge aplenty and knew that once I stopped it, my depression would quickly lift. I also know that not everyone has that kind of previous experience or perspective, which is when folks get into deep trouble.
  • “Under medical supervision, stimulant medications are considered safe.” I’d add, as the Nature authors did, especially for “mentally competent adults.”
Javier E

Psych, Lies, and Audiotape: The Tarnished Legacy of the Milgram Shock Experiments | - 2 views

  • subjects — 780 New Haven residents who volunteered — helped make an untenured assistant professor named Stanley Milgram a national celebrity. Over the next five decades, his obedience experiments provided the inspiration for films, fiction, plays, documentaries, pop music, prime-time dramas, and reality television. Today, the Milgram experiments are considered among the most famous and most controversial experiments of all time. They are also often used in expert testimony in cases where situational obedience leads to crime
  • Perry’s evidence raises larger questions regarding a study that is still firmly entrenched in American scientific and popular culture: if Milgram lied once about his compromised neutrality, to what extent can we trust anything he said? And how could a blatant breach in objectivity in one of the most analyzed experiments in history go undetected for so long?
  • the debate has never addressed this question: to what extent can we trust his raw data in the first place? In her riveting new book, Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments, Australian psychologist Gina Perry tackles this very topic, taking nothing for granted
  • Her chilling investigation of the experiments and their aftereffects suggests that Milgram manipulated results, misled the public, and flat out lied in order to deflect criticism and further the thesis for which he would become famous
  • She contends that serious factual inaccuracies cloud our understanding of Milgram’s work, inaccuracies which she believes arose “partly because of Milgram’s presentation of his findings — his downplaying of contradictions and inconsistencies — and partly because it was the heart-attack variation that was embraced by the popular media
  • Perry reveals that Milgram massaged the facts in order to deliver the outcome he sought. When Milgram presented his finding — namely, high levels of obedience — both in early papers and in his 1974 book, Obedience to Authority, he stated that if the subject refused the lab coat’s commands more than four times, the subject would be classified as disobedient. But Perry finds that this isn’t what really happened. The further Milgram got in his research, the more he pushed participants to obey.
  • only after criticism of his ethics surfaced, and long after the completion of the studies, did Milgram claim that “a careful post-experimental treatment was administered to all subjects,” in which “at the very least all subjects were told that the victim had not received dangerous electric shocks.” This was, quite simply, a lie. Milgram didn’t want word to spread through New Haven that he was duping his subjects, which could taint the results of his future trials.
  • If the Milgram of Obedience to Authority were the narrator in a novel, I wouldn’t have found him terribly reliable. So why had I believed such a narrator in a work of nonfiction?
  • The answer, I found, was disturbingly simple: I trust scientists
  • I do trust them not to lie about the rules or results of their experiments. And if a scientist does lie, especially in such a famous experiment, I trust that another scientist will quickly uncover the deception. Or at least I used to.
  • At the time, Milgram was 27, fresh out of grad school and needing to make a name for himself in a hyper-competitive department, and Perry suggests that his “career depended on [the subjects’] obedience; all his preparations were aimed at making them obey.”
  • Milgram’s studies — which suggest that nearly two-thirds of subjects will, under certain conditions, administer dangerously powerful electrical shocks to a stranger when commanded to do so by an authority figure — have become a staple of psychology departments around the world. They have even helped shape the rules that govern experiments on human subjects. Along with Zimbardo’s 1971 Stanford prison experiment, which showed that college students assigned the role of “prison guard” quickly started abusing college students assigned the role of “prisoner,” Milgram’s experiments are the starting point for any meaningful discussion of the “I was only following orders” defense, and for determining how the relationship between situational factors and obedience can lead seemingly good people to do horrible things.
  • While Milgram’s defenders point to subsequent recreations of his experiments that have replicated his findings, the unethical nature, not to mention the scope and cost, of the original version have not allowed for full duplications.
Javier E

Ta-Nehisi Coates defines a new race beat - Columbia Journalism Review - 0 views

  • “The Case for Reparations,” Coates’ 16,000-word cover story for The Atlantic, where he is a national correspondent. Published online in May, it was a close look at housing discrimination, such as redlining, that was really about the need for America to take a brutally honest look in the mirror and acknowledge its deep racial divisions.
  • The story broke a single-day traffic record for a magazine story on The Atlantic’s website, and in its wake, Politico named him to its list of 50 thinkers changing American politics
  • Coates believes that if there is an answer to contemporary racism, it lies in confronting the past.
  • For Coates, true equality means “black people in this country have the right to be as mediocre as white people,” he says. “Not that individual black people will be as excellent, or more excellent, than other white people.”
  • he came to see black respectability—the idea that, to succeed, African-Americans must stoically prevail against the odds and be “twice as good” as white people to get the same rights—as deeply immoral.
  • He is no soothsayer, telling people what to think from on high, but rather is refreshingly open about what he doesn’t know, inviting readers to learn with him. Coates is not merely an ivory-tower pontificator or a shiny Web 2.0 brand. He is a public intellectual for the digital age.
  • we miss the real question of why there is a systemic, historical difference in the way police treat blacks versus whites.
  • Another term for that road is “white supremacy.” This refers not so much to hate groups, but, as Coates defines it, a system of policies and beliefs that aims to keep African-Americans as “a peon class.”
  • To be “white” in this sense does not refer merely to skin color but to the degree that someone qualifies as “normal,” and thus worthy of the same rights as all Americans
  • The pool where all these ideas eventually arrive is a question: “How big-hearted can democracy be?” he says. “How many people can it actually include and sustain itself? That is the question I’m asking over and over again.”
  • it is a question of empathy. Are humans capable of forming a society where everyone can flourish?
  • there was the coverage of Michael Brown (or Jordan Davis, or Renisha McBride, or Eric Garner): unarmed African-Americans killed by police or others under controversial circumstances. In each case, the storyline was that these horrific encounters were caused either by genuine provocation, or by race-fueled fear or hatred. Either way, they were stories of personal failings.
  • When an event becomes news, there is often an implication that it is an exception—that the world is mostly working as it should and this event is newsworthy because it’s an aberration. If the race-related stories we see most often in the media are about personal bigotry, then our conception of racism is limited to the bigoted remarks or actions—racism becomes little more than uttering the n-word.
  • he cites research that in 1860 slaves were the largest asset in the US economy. “It is almost impossible to think of democracy, as it was formed in America, without the enslavement of African-Americans,” he says. “Not that these things were bumps in the road along the way, but that they were the road.”
  • a lack of historical perspective in the media’s approach to race. “Journalism privileges what’s happening now over the long reasons for things happening,” he says. “And for African-Americans, that has a particular effect.”
  • Even the very existence of racism is questioned: A recent study published by the Association of Psychological Science has shown that whites think they are discriminated against due to race as much if not more than blacks.
  • “So when you’re talking about something like institutional racism and prejudice, how do you talk about that as an objective reality?”
  • Coates’ strength is in connecting contemporary problems to historical scholarship. “I think if I bring anything to the table it’s the ability to synthesize all of that into something that people find emotionally moving,” he says. The irony of the reparations piece, as unoriginal as it may have been to scholars, is that it was news to many people.
  • Reporting on race requires simultaneously understanding multiple, contradictory worlds, with contradictory narratives. Widespread black poverty exists; so do a black middle class and a black president
  • Progress is key to the myth of American Exceptionalism, and the notion that America is built on slavery and on freedom are discordant ideas that threaten any simple storyline. Coates, together with others who join him, is trying to claim the frontier of a new narrative.
  • reading Coates is like building a worldview, piece by piece, on an area of contemporary life that’s otherwise difficult to grasp.
  • “To come and tell someone may not be as effective in convincing them as allowing them to learn on their own. If you believe you come to a conclusion on your own, you’re more likely to agree.”
  • It’s brave to bare yourself intellectually on the Web, and to acknowledge mistakes, especially when the capital that public intellectuals appear to have is their ability to be “right.”
  • Coates is equally demanding of his followers. Online he is blunt, and willing to call people out. He cares enough to be rigorous
  • despite being a master of online engagement, Coates insists he does not write for others, an idea he explained in a recent post: “I have long believed that the best part of writing is not the communication of knowledge to other people, but the acquisition and synthesizing of knowledge for oneself. The best thing I can say about the reparations piece is that I now understand.”
  • To him, it’s an open question whether or not America will ever be capable of fostering true equality. “How big-hearted can democracy be? It points to a very ugly answer: maybe not that big-hearted at all. That in fact America is not exceptional. That it’s just like every other country. That it passes its democracy and it passes all these allegedly big-hearted programs [the New Deal, the G.I. Bill] but still excludes other people,
  • In a 2010 post about antebellum America, Coates mentioned feminist and abolitionist Angelina Grimke. “Suffice to say that much like Abe Lincoln, and Ulysses Grant, Angelina Grimke was a Walker,” he wrote. “What was the Walker reference?” Rosemartian asked in the comments section. “Just someone who spends their life evolving, or, walking,” Coates replied. “Grant and Lincoln fit in there for me. Malcolm X was another Walker. Walkers tend to be sometimes—even often—wrong. But they are rarely bigots, in the sense of nakedly clinging to ignorance.”
Javier E

Opinion | The Spoken Argument Is a Valuable Form of Expression - The New York Times - 0 views

  • I am ever more perplexed by why we make students learn to write the classic five-paragraph essay but have so much less interest in developing their spoken argument skills.
  • As much as I love writing, I wonder if there is something arbitrary in the idea that education must focus more on the written than the spoken word.
  • Back in the day, people would clear their throat and deliver. They weren’t winging it. They would plan their remarks, without writing them out word for word. They knew their topic and, from that, they spoke.
  • Our sense of a spoken presentation is less formal, more personal, looser. But more formal oratory has its uses.
  • I also think, as I read a book about 19th-century England, of the way parliamentarians used to communicate. The men regularly made their points to their colleagues in speeches that could run far beyond what anyone could write out and memorize word for word
  • Black people of letters, such as W.E.B. Du Bois and Maya Angelou, engaged in oratory contests when they were young, competing for prizes according to how gracefully and how convincingly they made a case for some proposition. The tradition of such contests continues in the Black community.
  • When I have given oral presentations, I reach people more directly than if I’d written everything down for them to read. When people can see your face and hear the melody of your voice, your point gets across more vividly. Language evolved, after all, for face-to-face contact, not rendered as glyphs on paper.
  • The question is why oratory of this kind is so much less central to the culture than it once was.
  • Imagine a square divided into four smaller ones. The top left square is casual speech; the top right square is formal speech. The bottom left square is casual writing; the bottom right square is formal writing. We have, as it were, an empty square in our grid.
  • what about that upper right square, formal speech?
  • When we communicate formally, we moderns think first of getting language down on a page in written form, perhaps out of a sense that this is how to deck language out in its Sunday best.
  • Perhaps it seems that to organize our thoughts properly beyond the level of “Want mustard with that?” we need to tie them down with the yoke of writing.
  • But the ancients didn’t think so. Even with a fully developed writing culture, the Greeks and Romans valued the ability to stand and pose and pace in front of an audience and make their point through speaking it — and formally, not colloquially
  • I imagine a different universe in which academics would be expected to present most of their ideas in solid PowerPoint versions, narrated in formal language, getting across the amount of information a person can actually absorb in 20 to 30 minutes.
  • I wish students had the choice of either writing essays or speaking them. We would train them in the ability to speak carefully and coherently with the same goal of making a point that we require in writing.
  • A lot of people really hate writing. It’s an unnatural activity, as humanity goes.
  • If we imagine that speech has existed for 24 hours, then according to all modern estimates, writing came along only sometime around 11:30 p.m. Writing is an artifice, and given a choice, most people would rather talk (or text).
  • For students who prefer it — and most of them likely would — the idea would be to give an oral presentation to the class, going from a memorized outline of planned remarks but expressing its points spontaneously. They would be graded on the quality of both the delivery and the content.
  • It is unclear to me that there is a reason to classify oral suasion as something lesser than the written version, as long as students are instructed that they are to maintain a basic, tempered poise, without relying on volume or colorful rhetoric to stand in for logic.
  • Some will object that students will need to be able to craft arguments in writing in their future endeavors. But to channel the modern kind of skeptical response: Will they, though?
  • An alternate universe would be one in which students who thought of themselves as likely to need such a skill in the future, such as in the law, would be the ones who choose written over oral expression.
  • When I am asked to speak about something, I do some written preparation to organize my thoughts, but I don’t craft sentences. I fashion my ideas into exactly three basic points.
  • In terms of realistic expectations of human attention span, especially in our eternally distracted era, even four points is too many, but two isn’t enough
  • Three points, each expressed with about three subpoints. I consider it my job to be able to hold this much in my memory, along with intentions of an introduction and a conclusion.
  • when it comes to individuals expressing their intelligence for assignments or teaching, I cannot see that writing is the only legitimate and effective vehicle. We are a society that values speaking engagingly but places less of a value on speaking precisely. This is a mere matter of cultural preference; I wish it would change.
sissij

Earning a Degree, and Her Daughters' Admiration - The New York Times - 0 views

  • Now Ms. Hopewell, 37, is ebullient, and full of smiles, hugs and laughs. After spending the last three and a half years studying forensic psychology at the John Jay College of Criminal Justice in Midtown, it was official: She was a college graduate, the first in her family.
  • “But at the end of the day I know it’s beneficial for my family and I want bigger and better things, and I have to do it.”
  • I want to leave a legacy for my kids when I leave this earth, and living paycheck to paycheck is not going to get it.
  • The straight-A student said she hoped to study neuroscience at Harvard one day.
  • She hopes they see from her experiences that education is the best way to avoid repeating her struggles.
  • “This experience showed me that I’m raising well-rounded young ladies who can adapt to any situation and make the best of it,” Ms. Hopewell said. “Their capabilities are endless.”
  •  
    This article is very inspiring. It is an example of how education can get someone to a higher and better place. Education is something that's worth investing in. I really like one thing that she said: "I want to leave a legacy for my kids when I leave this earth, and living paycheck to paycheck is not going to get it." Although for many of us here attending college is a given, for many other people attending college is a dream, an ultimate goal. Many of us go to college and just waste another four years there. But for Ms. Hopewell, a college education polished her and made her a completely new person. I just think it's interesting to ask why we get such different outcomes from the same college education. I think it's because we never put as much effort into getting a college education as Ms. Hopewell did, so the effort doesn't give us the pride of commitment. --Sissi (1/23/2017)
Javier E

A Crush on God | Commonweal magazine - 0 views

  • Ignatius taught the Jesuits to end each day doing something called the Examen. You start by acknowledging that God is there with you; then you give thanks for the good parts of your day (mine usually include food); and finally, you run through the events of the day from morning to the moment you sat down to pray, stopping to consider when you felt consolation, the closeness of God, or desolation, when you ignored God or when you felt like God bailed on you. Then you ask for forgiveness for anything shitty you did, and for guidance tomorrow. I realize I’ve spent most of my life saying “thanks” to people in a perfunctory, whatever kind of way. Now when I say it I really mean it, even if it’s to the guy who makes those lattes I love getting in the morning, because I stopped and appreciated his latte-making skills the night before. If you are lucky and prone to belief, the Examen will also help you start really feeling God in your life.
  • My church hosts a monthly dinner for the homeless. Serious work is involved; volunteers pull multiple shifts shopping, prepping, cooking, serving food, and cleaning. I show up for the first time and am shuttled into the kitchen by a harried young woman with a pen stuck into her ponytail, who asks me if I can lift heavy weights before putting me in front of two bins of potato salad and handing me an ice cream scoop. For three hours, I scoop potato salad onto plates, heft vats of potato salad, and scrape leftover potato salad into the compost cans. I never want to eat potato salad again, but I learn something about the homeless people I’ve been avoiding for years: some are mentally a mess, many—judging from the smell—are drunk off their asses, but on the whole, they are polite, intelligent, and, more than anything else, grateful. As I walk back to my car, I’m stopped several times by many of them who want to thank me, saying how good the food was, how much they enjoyed it. “I didn’t do anything,” I say in return. “You were there,” one of them replies. It’s enough to make me go back the next month, and the month after that. And in between, when I see people I feed on the street, instead of focusing my eyes in the sidewalk and hoping they go away, we have conversations. It’s those conversations that move me from intellectual distance toward a greater sense of gratitude for the work of God.
carolinewren

Sarah Palin dives in poll ratings as Tina Fey impersonates her on Saturday Night Live -... - 0 views

  • Palin's poll ratings are telling a more devastating story.
  • engage with the process much earlier on – not least with their Sunday morning political talk shows
  • It currently commands 10 million viewers – a creditable figure for a primetime drama, let alone a late-night sketch show.
  • ...13 more annotations...
  • Other satirical shows, such as The Daily Show with Jon Stewart and The Colbert Report, are also enjoying record ratings, as well as influence far beyond their own viewers.
  • Even bigger than Saturday Night Live have been the presidential and vice-presidential debates. Sarah Palin's set-to with Joe Biden on October 2 attracted nearly 70 million viewers – a record for a vice-presidential debate and the highest-rated election debate since 1992
  • It is impossible to imagine a similar level of engagement with political television in this country. Gordon Brown and David Cameron would not only have to debate each other on TV – an unlikely scenario in itself – but pull in an audience bigger than the finals of Britain's Got Talent and Strictly Come Dancing put together
  • American networks do have some advantages over the BBC and ITV in planning and executing their political coverage
  • four-year timetable, avoiding the unholy scramble when a British general election is called at a month's notice.
  • In a Newsweek poll in September, voters were asked whether Palin was qualified or unqualified to be president. The result was a near dead-heat. In the same poll this month, those saying she was "unqualified" outnumbered those saying she was "qualified" by a massive 16 points
  • "I think we're learning what it means to have opinion journalism in this country on such a grand scale," says Stelter. "It's only in the last six to 12 months that those lines have hardened between Fox and MSNBC. I think the [ratings] numbers for cable have surprised people.
  • I think that shows that people are looking for different stripes of political news."
  • American political TV certainly is polarised. When Governor Palin attacked the media in her speech at the Republican convention last month, the crowd chanted "NBC"
  • Gwen Ifill, a respected anchor on the non-commercial channel PBS, who moderated the vice-presidential debate, saw her impartiality attacked because she is writing a book about African-American politics that mentions Obama in its title
  • America's networks comprehensively outstrip this country in both volume and quality of political coverage.
  • All three major US networks – ABC, CBS and NBC – offer a large amount of serious (and unbiased) political coverage, both in their evening network newscasts and in their morning equivalents of GMTV
  • Impartiality and the public service ethos hardly characterise Tina Fey's performances. Tonight's presidential debate forms part of a series driven largely by commercial networks, not publicly funded channels. Neither Fox News nor MSNBC was set up as a sop to a regulator
Javier E

If It Feels Right - NYTimes.com - 3 views

  • What’s disheartening is how bad they are at thinking and talking about moral issues.
  • you see the young people groping to say anything sensible on these matters. But they just don’t have the categories or vocabulary to do so.
  • “Not many of them have previously given much or any thought to many of the kinds of questions about morality that we asked,” Smith and his co-authors write. When asked about wrong or evil, they could generally agree that rape and murder are wrong. But, aside from these extreme cases, moral thinking didn’t enter the picture, even when considering things like drunken driving, cheating in school or cheating on a partner.
  • ...8 more annotations...
  • The default position, which most of them came back to again and again, is that moral choices are just a matter of individual taste. “It’s personal,” the respondents typically said. “It’s up to the individual. Who am I to say?”
  • “I would do what I thought made me happy or how I felt. I have no other way of knowing what to do but how I internally feel.”
  • their attitudes at the start of their adult lives do reveal something about American culture. For decades, writers from different perspectives have been warning about the erosion of shared moral frameworks and the rise of an easygoing moral individualism. Allan Bloom and Gertrude Himmelfarb warned that sturdy virtues are being diluted into shallow values. Alasdair MacIntyre has written about emotivism, the idea that it’s impossible to secure moral agreement in our culture because all judgments are based on how we feel at the moment. Charles Taylor has argued that morals have become separated from moral sources. People are less likely to feel embedded on a moral landscape that transcends self. James Davison Hunter wrote a book called “The Death of Character.” Smith’s interviewees are living, breathing examples of the trends these writers have described.
  • Smith and company found an atmosphere of extreme moral individualism — of relativism and nonjudgmentalism.
  • they have not been given the resources — by schools, institutions and families — to cultivate their moral intuitions, to think more broadly about moral obligations, to check behaviors that may be degrading.
  • the interviewees were so completely untroubled by rabid consumerism.
  • Many were quick to talk about their moral feelings but hesitant to link these feelings to any broader thinking about a shared moral framework or obligation. As one put it, “I mean, I guess what makes something right is how I feel about it. But different people feel different ways, so I couldn’t speak on behalf of anyone else as to what’s right and wrong.”
  • In most times and in most places, the group was seen to be the essential moral unit. A shared religion defined rules and practices. Cultures structured people’s imaginations and imposed moral disciplines. But now more people are led to assume that the free-floating individual is the essential moral unit. Morality was once revealed, inherited and shared, but now it’s thought of as something that emerges in the privacy of your own heart.
  •  
    Goodness, I went through a bit of emotion reading that. Whew. Gotta center. Anyhoo, I certainly feel conflicted about the author's idea of "shallow values." Personally, I don't necessarily see the need for a shared moral framework to connect to. What is such a framework if not a system for instilling shame and obligation in its members? While I do think it's important to hold an articulate moral opinion on relevant subjects, I also think the world cannot be divided into realms of right and wrong when we can see only an infinitesimal part of it at any one time. What's wrong with open-mindedness?
Javier E

They're Watching You at Work - Don Peck - The Atlantic - 2 views

  • Predictive statistical analysis, harnessed to big data, appears poised to alter the way millions of people are hired and assessed.
  • By one estimate, more than 98 percent of the world’s information is now stored digitally, and the volume of that data has quadrupled since 2007.
  • The application of predictive analytics to people’s careers—an emerging field sometimes called “people analytics”—is enormously challenging, not to mention ethically fraught
  • ...52 more annotations...
  • By the end of World War II, however, American corporations were facing severe talent shortages. Their senior executives were growing old, and a dearth of hiring from the Depression through the war had resulted in a shortfall of able, well-trained managers. Finding people who had the potential to rise quickly through the ranks became an overriding preoccupation of American businesses. They began to devise a formal hiring-and-management system based in part on new studies of human behavior, and in part on military techniques developed during both world wars, when huge mobilization efforts and mass casualties created the need to get the right people into the right roles as efficiently as possible. By the 1950s, it was not unusual for companies to spend days with young applicants for professional jobs, conducting a battery of tests, all with an eye toward corner-office potential.
  • But companies abandoned their hard-edged practices for another important reason: many of their methods of evaluation turned out not to be very scientific.
  • this regime, so widespread in corporate America at mid-century, had almost disappeared by 1990. “I think an HR person from the late 1970s would be stunned to see how casually companies hire now,”
  • Many factors explain the change, he said, and then he ticked off a number of them: Increased job-switching has made it less important and less economical for companies to test so thoroughly. A heightened focus on short-term financial results has led to deep cuts in corporate functions that bear fruit only in the long term. The Civil Rights Act of 1964, which exposed companies to legal liability for discriminatory hiring practices, has made HR departments wary of any broadly applied and clearly scored test that might later be shown to be systematically biased.
  • about a quarter of the country’s corporations were using similar tests to evaluate managers and junior executives, usually to assess whether they were ready for bigger roles.
  • He has encouraged the company’s HR executives to think about applying the games to the recruitment and evaluation of all professional workers.
  • Knack makes app-based video games, among them Dungeon Scrawl, a quest game requiring the player to navigate a maze and solve puzzles, and Wasabi Waiter, which involves delivering the right sushi to the right customer at an increasingly crowded happy hour. These games aren’t just for play: they’ve been designed by a team of neuroscientists, psychologists, and data scientists to suss out human potential. Play one of them for just 20 minutes, says Guy Halfteck, Knack’s founder, and you’ll generate several megabytes of data, exponentially more than what’s collected by the SAT or a personality test. How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality. The end result, Halfteck says, is a high-resolution portrait of your psyche and intellect, and an assessment of your potential as a leader or an innovator.
  • When the results came back, Haringa recalled, his heart began to beat a little faster. Without ever seeing the ideas, without meeting or interviewing the people who’d proposed them, without knowing their title or background or academic pedigree, Knack’s algorithm had identified the people whose ideas had panned out. The top 10 percent of the idea generators as predicted by Knack were in fact those who’d gone furthest in the process.
  • What Knack is doing, Haringa told me, “is almost like a paradigm shift.” It offers a way for his GameChanger unit to avoid wasting time on the 80 people out of 100—nearly all of whom look smart, well-trained, and plausible on paper—whose ideas just aren’t likely to work out.
  • Aptitude, skills, personal history, psychological stability, discretion, loyalty—companies at the time felt they had a need (and the right) to look into them all. That ambit is expanding once again, and this is undeniably unsettling. Should the ideas of scientists be dismissed because of the way they play a game? Should job candidates be ranked by what their Web habits say about them? Should the “data signature” of natural leaders play a role in promotion? These are all live questions today, and they prompt heavy concerns: that we will cede one of the most subtle and human of skills, the evaluation of the gifts and promise of other people, to machines; that the models will get it wrong; that some people will never get a shot in the new workforce.
  • scoring distance from work could violate equal-employment-opportunity standards. Marital status? Motherhood? Church membership? “Stuff like that,” Meyerle said, “we just don’t touch”—at least not in the U.S., where the legal environment is strict. Meyerle told me that Evolv has looked into these sorts of factors in its work for clients abroad, and that some of them produce “startling results.”
  • consider the alternative. A mountain of scholarly literature has shown that the intuitive way we now judge professional potential is rife with snap judgments and hidden biases, rooted in our upbringing or in deep neurological connections that doubtless served us well on the savanna but would seem to have less bearing on the world of work.
  • We may like to think that society has become more enlightened since those days, and in many ways it has, but our biases are mostly unconscious, and they can run surprisingly deep. Consider race. For a 2004 study called “Are Emily and Greg More Employable Than Lakisha and Jamal?,” the economists Sendhil Mullainathan and Marianne Bertrand put white-sounding names (Emily Walsh, Greg Baker) or black-sounding names (Lakisha Washington, Jamal Jones) on similar fictitious résumés, which they then sent out to a variety of companies in Boston and Chicago. To get the same number of callbacks, they learned, they needed to either send out half again as many résumés with black names as those with white names, or add eight extra years of relevant work experience to the résumés with black names.
  • a sociologist at Northwestern, spent parts of the three years from 2006 to 2008 interviewing professionals from elite investment banks, consultancies, and law firms about how they recruited, interviewed, and evaluated candidates, and concluded that among the most important factors driving their hiring recommendations were—wait for it—shared leisure interests.
  • Lacking “reliable predictors of future performance,” Rivera writes, “assessors purposefully used their own experiences as models of merit.” Former college athletes “typically prized participation in varsity sports above all other types of involvement.” People who’d majored in engineering gave engineers a leg up, believing they were better prepared.
  • the prevailing system of hiring and management in this country involves a level of dysfunction that should be inconceivable in an economy as sophisticated as ours. Recent survey data collected by the Corporate Executive Board, for example, indicate that nearly a quarter of all new hires leave their company within a year of their start date, and that hiring managers wish they’d never extended an offer to one out of every five members on their team
  • In the late 1990s, as these assessments shifted from paper to digital formats and proliferated, data scientists started doing massive tests of what makes for a successful customer-support technician or salesperson. This has unquestionably improved the quality of the workers at many firms.
  • In 2010, however, Xerox switched to an online evaluation that incorporates personality testing, cognitive-skill assessment, and multiple-choice questions about how the applicant would handle specific scenarios that he or she might encounter on the job. An algorithm behind the evaluation analyzes the responses, along with factual information gleaned from the candidate’s application, and spits out a color-coded rating: red (poor candidate), yellow (middling), or green (hire away). Those candidates who score best, I learned, tend to exhibit a creative but not overly inquisitive personality, and participate in at least one but not more than four social networks, among many other factors. (Previous experience, one of the few criteria that Xerox had explicitly screened for in the past, turns out to have no bearing on either productivity or retention
  • When Xerox started using the score in its hiring decisions, the quality of its hires immediately improved. The rate of attrition fell by 20 percent in the initial pilot period, and over time, the number of promotions rose. Xerox still interviews all candidates in person before deciding to hire them, Morse told me, but, she added, “We’re getting to the point where some of our hiring managers don’t even want to interview anymore”
  • Gone are the days, Ostberg told me, when, say, a small survey of college students would be used to predict the statistical validity of an evaluation tool. “We’ve got a data set of 347,000 actual employees who have gone through these different types of assessments or tools,” he told me, “and now we have performance-outcome data, and we can split those and slice and dice by industry and location.”
  • Evolv’s tests allow companies to capture data about everybody who applies for work, and everybody who gets hired—a complete data set from which sample bias, long a major vexation for industrial-organization psychologists, simply disappears. The sheer number of observations that this approach makes possible allows Evolv to say with precision which attributes matter more to the success of retail-sales workers (decisiveness, spatial orientation, persuasiveness) or customer-service personnel at call centers (rapport-building)
  • There are some data that Evolv simply won’t use, out of a concern that the information might lead to systematic bias against whole classes of people
  • the idea that hiring was a science fell out of favor. But now it’s coming back, thanks to new technologies and methods of analysis that are cheaper, faster, and much-wider-ranging than what we had before
  • what most excites him are the possibilities that arise from monitoring the entire life cycle of a worker at any given company.
  • Now the two companies are working together to marry pre-hire assessments to an increasing array of post-hire data: about not only performance and duration of service but also who trained the employees; who has managed them; whether they were promoted to a supervisory role, and how quickly; how they performed in that role; and why they eventually left.
  • What begins with an online screening test for entry-level workers ends with the transformation of nearly every aspect of hiring, performance assessment, and management.
  • I turned to Sandy Pentland, the director of the Human Dynamics Laboratory at MIT. In recent years, Pentland has pioneered the use of specialized electronic “badges” that transmit data about employees’ interactions as they go about their days. The badges capture all sorts of information about formal and informal conversations: their length; the tone of voice and gestures of the people involved; how much those people talk, listen, and interrupt; the degree to which they demonstrate empathy and extroversion; and more. Each badge generates about 100 data points a minute.
  • he tried the badges out on about 2,500 people, in 21 different organizations, and learned a number of interesting lessons. About a third of team performance, he discovered, can usually be predicted merely by the number of face-to-face exchanges among team members. (Too many is as much of a problem as too few.) Using data gathered by the badges, he was able to predict which teams would win a business-plan contest, and which workers would (rightly) say they’d had a “productive” or “creative” day. Not only that, but he claimed that his researchers had discovered the “data signature” of natural leaders, whom he called “charismatic connectors” and all of whom, he reported, circulate actively, give their time democratically to others, engage in brief but energetic conversations, and listen at least as much as they talk.
  • His group is developing apps to allow team members to view their own metrics more or less in real time, so that they can see, relative to the benchmarks of highly successful employees, whether they’re getting out of their offices enough, or listening enough, or spending enough time with people outside their own team.
  • Torrents of data are routinely collected by American companies and now sit on corporate servers, or in the cloud, awaiting analysis. Bloomberg reportedly logs every keystroke of every employee, along with their comings and goings in the office. The Las Vegas casino Harrah’s tracks the smiles of the card dealers and waitstaff on the floor (its analytics team has quantified the impact of smiling on customer satisfaction). E‑mail, of course, presents an especially rich vein to be mined for insights about our productivity, our treatment of co-workers, our willingness to collaborate or lend a hand, our patterns of written language, and what those patterns reveal about our intelligence, social skills, and behavior.
  • people analytics will ultimately have a vastly larger impact on the economy than the algorithms that now trade on Wall Street or figure out which ads to show us. He reminded me that we’ve witnessed this kind of transformation before in the history of management science. Near the turn of the 20th century, both Frederick Taylor and Henry Ford famously paced the factory floor with stopwatches, to improve worker efficiency.
  • “The quantities of data that those earlier generations were working with,” he said, “were infinitesimal compared to what’s available now. There’s been a real sea change in the past five years, where the quantities have just grown so large—petabytes, exabytes, zetta—that you start to be able to do things you never could before.”
  • People analytics will unquestionably provide many workers with more options and more power. Gild, for example, helps companies find undervalued software programmers, working indirectly to raise those people’s pay. Other companies are doing similar work. One called Entelo, for instance, specializes in using algorithms to identify potentially unhappy programmers who might be receptive to a phone call
  • He sees it not only as a boon to a business’s productivity and overall health but also as an important new tool that individual employees can use for self-improvement: a sort of radically expanded The 7 Habits of Highly Effective People, custom-written for each of us, or at least each type of job, in the workforce.
  • the most exotic development in people analytics today is the creation of algorithms to assess the potential of all workers, across all companies, all the time.
  • The way Gild arrives at these scores is not simple. The company’s algorithms begin by scouring the Web for any and all open-source code, and for the coders who wrote it. They evaluate the code for its simplicity, elegance, documentation, and several other factors, including the frequency with which it’s been adopted by other programmers. For code that was written for paid projects, they look at completion times and other measures of productivity. Then they look at questions and answers on social forums such as Stack Overflow, a popular destination for programmers seeking advice on challenging projects. They consider how popular a given coder’s advice is, and how widely that advice ranges.
  • The algorithms go further still. They assess the way coders use language on social networks from LinkedIn to Twitter; the company has determined that certain phrases and words used in association with one another can distinguish expert programmers from less skilled ones. Gild knows these phrases and words are associated with good coding because it can correlate them with its evaluation of open-source code, and with the language and online behavior of programmers in good positions at prestigious companies.
  • having made those correlations, Gild can then score programmers who haven’t written open-source code at all, by analyzing the host of clues embedded in their online histories. They’re not all obvious, or easy to explain. Vivienne Ming, Gild’s chief scientist, told me that one solid predictor of strong coding is an affinity for a particular Japanese manga site.
  • Gild’s CEO, Sheeroy Desai, told me he believes his company’s approach can be applied to any occupation characterized by large, active online communities, where people post and cite individual work, ask and answer professional questions, and get feedback on projects. Graphic design is one field that the company is now looking at, and many scientific, technical, and engineering roles might also fit the bill. Regardless of their occupation, most people leave “data exhaust” in their wake, a kind of digital aura that can reveal a lot about a potential hire.
  • professionally relevant personality traits can be judged effectively merely by scanning Facebook feeds and photos. LinkedIn, of course, captures an enormous amount of professional data and network information, across just about every profession. A controversial start-up called Klout has made its mission the measurement and public scoring of people’s online social influence.
  • Mullainathan expressed amazement at how little most creative and professional workers (himself included) know about what makes them effective or ineffective in the office. Most of us can’t even say with any certainty how long we’ve spent gathering information for a given project, or our pattern of information-gathering, never mind know which parts of the pattern should be reinforced, and which jettisoned. As Mullainathan put it, we don’t know our own “production function.”
  • Over time, better job-matching technologies are likely to begin serving people directly, helping them see more clearly which jobs might suit them and which companies could use their skills. In the future, Gild plans to let programmers see their own profiles and take skills challenges to try to improve their scores. It intends to show them its estimates of their market value, too, and to recommend coursework that might allow them to raise their scores even more. Not least, it plans to make accessible the scores of typical hires at specific companies, so that software engineers can better see the profile they’d need to land a particular job
  • Knack, for its part, is making some of its video games available to anyone with a smartphone, so people can get a better sense of their strengths, and of the fields in which their strengths would be most valued. (Palo Alto High School recently adopted the games to help students assess careers.) Ultimately, the company hopes to act as matchmaker between a large network of people who play its games (or have ever played its games) and a widening roster of corporate clients, each with its own specific profile for any given type of job.
  • When I began my reporting for this story, I was worried that people analytics, if it worked at all, would only widen the divergent arcs of our professional lives, further gilding the path of the meritocratic elite from cradle to grave, and shutting out some workers more definitively. But I now believe the opposite is likely to happen, and that we’re headed toward a labor market that’s fairer to people at every stage of their careers
  • For decades, as we’ve assessed people’s potential in the professional workforce, the most important piece of data—the one that launches careers or keeps them grounded—has been educational background: typically, whether and where people went to college, and how they did there. Over the past couple of generations, colleges and universities have become the gatekeepers to a prosperous life. A degree has become a signal of intelligence and conscientiousness, one that grows stronger the more selective the school and the higher a student’s GPA, that is easily understood by employers, and that, until the advent of people analytics, was probably unrivaled in its predictive powers.
  • the limitations of that signal—the way it degrades with age, its overall imprecision, its many inherent biases, its extraordinary cost—are obvious. “Academic environments are artificial environments,” Laszlo Bock, Google’s senior vice president of people operations, told The New York Times in June. “People who succeed there are sort of finely trained, they’re conditioned to succeed in that environment,” which is often quite different from the workplace.
  • because one’s college history is such a crucial signal in our labor market, perfectly able people who simply couldn’t sit still in a classroom at the age of 16, or who didn’t have their act together at 18, or who chose not to go to graduate school at 22, routinely get left behind for good. That such early factors so profoundly affect career arcs and hiring decisions made two or three decades later is, on its face, absurd.
  • I spoke with managers at a lot of companies who are using advanced analytics to reevaluate and reshape their hiring, and nearly all of them told me that their research is leading them toward pools of candidates who didn’t attend college—for tech jobs, for high-end sales positions, for some managerial roles. In some limited cases, this is because their analytics revealed no benefit whatsoever to hiring people with college degrees; in other cases, and more often, it’s because they revealed signals that function far better than college history,
  • Google, too, is hiring a growing number of nongraduates. Many of the people I talked with reported that when it comes to high-paying and fast-track jobs, they’re reducing their preference for Ivy Leaguers and graduates of other highly selective schools.
  • This process is just beginning. Online courses are proliferating, and so are online markets that involve crowd-sourcing. Both arenas offer new opportunities for workers to build skills and showcase competence. Neither produces the kind of instantly recognizable signals of potential that a degree from a selective college, or a first job at a prestigious firm, might. That’s a problem for traditional hiring managers, because sifting through lots of small signals is so difficult and time-consuming.
  • all of these new developments raise philosophical questions. As professional performance becomes easier to measure and see, will we become slaves to our own status and potential, ever-focused on the metrics that tell us how and whether we are measuring up? Will too much knowledge about our limitations hinder achievement and stifle our dreams? All I can offer in response to these questions, ironically, is my own gut sense, which leads me to feel cautiously optimistic.
  • Google’s understanding of the promise of analytics is probably better than anybody else’s, and the company has been changing its hiring and management practices as a result of its ongoing analyses. (Brainteasers are no longer used in interviews, because they do not correlate with job success; GPA is not considered for anyone more than two years out of school, for the same reason—the list goes on.) But for all of Google’s technological enthusiasm, these same practices are still deeply human. A real, live person looks at every résumé the company receives. Hiring decisions are made by committee and are based in no small part on opinions formed during structured interviews.