TOK Friends: Group items tagged recruiting


Javier E

Guess Who Doesn't Fit In at Work - NYTimes.com - 0 views

  • ACROSS cultures and industries, managers strongly prize “cultural fit” — the idea that the best employees are like-minded.
  • One recent survey found that more than 80 percent of employers worldwide named cultural fit as a top hiring priority.
  • When done carefully, selecting new workers this way can make organizations more productive and profitable.
  • In the process, fit has become a catchall used to justify hiring people who are similar to decision makers and rejecting people who are not.
  • The concept of fit first gained traction in the 1980s. The original idea was that if companies hired individuals whose personalities and values — and not just their skills — meshed with an organization’s strategy, workers would feel more attached to their jobs, work harder and stay longer.
  • in many organizations, fit has gone rogue. I saw this firsthand while researching the hiring practices of the country’s top investment banks, management consultancies and law firms. I interviewed 120 decision makers and spent nine months observing
  • While résumés (and connections) influenced which applicants made it into the interview room, interviewers’ perceptions of fit strongly shaped who walked out with job offers.
  • Crucially, though, for these gatekeepers, fit was not about a match with organizational values. It was about personal fit. In these time- and team-intensive jobs, professionals at all levels of seniority reported wanting to hire people with whom they enjoyed hanging out and could foresee developing close relationships with
  • To judge fit, interviewers commonly relied on chemistry.
  • Many used the “airport test.” As a managing director at an investment bank put it, “Would I want to be stuck in an airport in Minneapolis in a snowstorm with them?”
  • interviewers were primarily interested in new hires whose hobbies, hometowns and biographies matched their own. Bonding over rowing college crew, getting certified in scuba, sipping single-malt Scotches in the Highlands or dining at Michelin-starred restaurants was evidence of fit; sharing a love of teamwork or a passion for pleasing clients was not
  • it has become a common feature of American corporate culture. Employers routinely ask job applicants about their hobbies and what they like to do for fun, while a complementary self-help industry informs white-collar job seekers that chemistry, not qualifications, will win them an offer.
  • Selection based on personal fit can keep demographic and cultural diversity low
  • In the elite firms I studied, the types of shared experiences associated with fit typically required large investments of time and money.
  • Class-biased definitions of fit are one reason investment banks, management consulting firms and law firms are dominated by people from the highest socioeconomic backgrounds
  • Also, whether the industry is finance, high-tech or fashion, a good fit in most American corporations still tends to be stereotypically masculine.
  • Perhaps most important, it is easy to mistake rapport for skill. Just as they erroneously believe that they can accurately tell when someone is lying, people tend to be overly confident in their ability to spot talent. Unstructured interviews, which are the most popular hiring tools for American managers and the primary way they judge fit, are notoriously poor predictors of job performance.
  • Organizations that use cultural fit for competitive advantage tend to favor concrete tools like surveys and structured interviews that systematically test behaviors associated with increased performance and employee retention.
  • For managers who want to use cultural fit in a more productive way, I have several suggestions.
  • First, communicate a clear and consistent idea of what the organization’s culture is (and is not) to potential employees. Second, make sure the definition of cultural fit is closely aligned with business goals. Ideally, fit should be based on data-driven analysis of what types of values, traits and behaviors actually predict on-the-job success. Third, create formal procedures like checklists for measuring fit, so that assessment is not left up to the eyes (and extracurriculars) of the beholder.
  • But cultural fit has become a new form of discrimination that keeps demographic and cultural diversity down
Javier E

How Memory Works: Interview with Psychologist Daniel L. Schacter | History News Network - 2 views

  • knowledge from a scientific perspective of how human memory works can be instructive to historians.
  • Memory is much more than a simple retrieval system, as Dr. Schacter has demonstrated in his research. Rather, the nature of memory is constructive and influenced by a person’s current state as well as intervening emotions, beliefs, events and other factors since a recalled event.
  • Dr. Schacter is William R. Kenan, Jr. Professor of Psychology at Harvard University. His books include Searching for Memory: The Brain, The Mind, and The Past, and The Seven Sins of Memory: How the Mind Forgets and Remembers, both winners of the American Psychological Association’s William James Book Award, and Forgotten Ideas, Neglected Pioneers: Richard Semon and the Story of Memory. He also has written hundreds of articles on memory and related matters. He was elected a Fellow of the American Academy of Arts and Sciences in 1996 and the National Academy of Sciences in 2013.
  • that memory is not a video recorder [but that] it’s a constructive activity that is in many ways accurate but prone to interesting errors and distortions. It’s the constructive side of memory that is most relevant to historians.
  • Is it the case then that our memories constantly change every time we access them?
  • That certainly can happen depending on how you recount a memory. What you emphasize. What you exaggerate. What you don’t talk about. All of those things will shape and sculpt the memory for future use. Certainly the potential is there.
  • Research on memory shows that the more distant in time the event, the more prone to inaccuracy the memory. There are several experiments in which subjects recorded impressions of an event soon afterward, then a year later, and then a few years later, and the memory changed. Yes. It’s not that the information is lost but, as the memory weakens, you become more prone to incorporating other kinds of information or mixing up elements of other events. This has been seen, for example, in the study of flashbulb memories: Where were you when Kennedy was shot? Where were you when you heard about 9/11?
  • Isn’t there a tendency to add details or information that may make the story more convincing or interesting later? Yes. That’s more a social function of memory. It may be that you draw on your general knowledge and probable information from your memory in a social context where there may be social demands that lead you to distort the memory.
  • What are the different memory systems?
  • What is the difference between working memory and permanent memory? Working memory is really a temporary memory buffer where you hold onto information, manipulate it and use it; it’s partly a gateway to long-term memory, and also a buffer that you use when you’re retrieving information from long-term memory, since that retrieved information temporarily resides in working memory, so to speak.
  • Your discussion of the testimony of White House Counsel John Dean about Watergate is illuminating. There was a perception that Dean had a photographic memory and he testified in rich detail about events. Yet later studies of White House tape recordings revealed that he was often inaccurate.
  • Because of all the detail with which he reported events, and the great confidence he displayed, he was perceived as something analogous to a human tape recorder. Yet there was interesting work done by psychologist Ulric Neisser, who went back and analyzed what Dean said at the hearings against the available information on the White House taping system, and basically found many significant discrepancies between what Dean remembered and what was actually said. He usually had the gist, the meaning and the overall significance right, but the exact details in his memory were often quite different from what was actually said.
  • That seems to get into the area of false memories and how they present problems in the legal system. We know from DNA exonerations of people wrongfully convicted of crimes that a large majority of those cases rested on faulty memory -- by one of the more recent estimates, among the first 250 DNA exonerations as of 2011, roughly 70 to 75 percent of those individuals were convicted on the basis of faulty eyewitness memory.
  • One of the interesting recent lines of research that my lab has been involved in over the past few years has been looking at similarities between what goes on in the brain and mind when we remember past events, on the one hand, and when we imagine events that might occur in the future or might have occurred in the past, on the other. What we have found, particularly with brain scanning studies, is that you get very similar brain networks coming online when you remember past events and imagine future events, for example. Many of the same brain regions or network of structures come online, and this has helped us understand more why, for example, imagining events that might have occurred can be so harmful to memory accuracy: when you imagine, you’re recruiting many of the same brain regions as accessed when you actually remember. So it’s not surprising that some of these imagined events can actually turn into false memories under the right circumstances.
  • One reasonably well accepted distinction involves episodic memory, the memory for personal experience; semantic memory, the memory for general knowledge; and procedural memory, the memory for skills and unconscious forms of memory. Those are three of the major kinds of memory, and they all have different neural substrates.
  • One of the points from that Ross Perot study is that his supporters often misremembered what they felt like at the time he reported he had dropped out of the race. The nature of that misremembering depended on their state at the time they were remembering, and the decisions they had made about Perot in the interim affected how they reconstructed their earlier memories. Again, that makes nicely the point that our current emotions and current appraisals of a situation can feed back into our reconstruction of the past, and sometimes lead us to distort our memories so that they better support our current emotions and our current selves. We’re often using memories to justify what we currently know, believe and feel.
  • memory doesn’t work like a video camera or tape recorder. That is the main point. Our latest thinking on this is the idea that one of the major functions of memory is to support our ability to plan for the future, to imagine the future, and to use our past experiences in a flexible way to simulate different outcomes of events.
  • flexibility of memory is something that makes it useful to support this very important ability to run simulations of future events. But that very flexibility might be something that contributes to some of the memory distortion we talked about. That has been prominent in the last few years in my thinking about the constructive nature of memory.
  • The historian Daniel Aaron told his students “we remember what’s important.” What do you think of that comment? I think that generally holds true. Certainly, again, more important memories tend to be more significant, with more emotional arousal, and may elicit “deeper processing,” as we call it in cognitive psychology.
qkirkpatrick

How your eyes betray your thoughts | Science | The Guardian - 0 views

  • According to the old saying, the eyes are windows into the soul, revealing deep emotions that we might otherwise want to hide.
  • Although modern science precludes the existence of the soul, it does suggest that there is a kernel of truth in this saying: it turns out the eyes not only reflect what is happening in the brain but may also influence how we remember things and make decisions.
  • Our eyes are constantly moving, and while some of those movements are under conscious control, many of them occur subconsciously.
  • And, of course, our eyes dart around during the ‘rapid eye movement’ (REM) phase of sleep.
  • Research published last year shows that pupil dilation is linked to the degree of uncertainty during decision-making: if somebody is less sure about their decision, they feel heightened arousal, which causes the pupils to dilate
  • Watching the eyes can even help predict what number a person has in mind. Tobias Loetscher and his colleagues at the University of Zurich recruited 12 volunteers and tracked their eye movements while they reeled off a list of 40 numbers.
  • They found that the direction and size of the participants’ eye movements accurately predicted whether the number they were about to say was bigger or smaller than the previous one – and by how much.
  • Each volunteer’s gaze shifted up and to the right just before they said a bigger number, and down and to the left before a smaller one. The bigger the shift from one side to the other, the bigger the difference between the numbers.
  • The ubiquity of eye-tracking apps for smartphones and other hand-held devices raises the possibility of altering people’s decision-making process remotely. “If you’re shopping online, they might bias your decision by offering f
  • Eyes and how they can affect and show what we are thinking
Javier E

What do the best classrooms in the world look like? - By Amanda Ripley - Slate Magazine - 0 views

  • school systems in Singapore, Finland, and Korea recruit 100 percent of their teachers from the top one-third of their academic cohort
  • In the United States, about 23 percent of new teachers—and only 14 percent in high-poverty schools—come from the top one-third. "It is a remarkably large difference in approach, and in results,
  • Suben, like most great teachers, is in a hurry. She said computers can be useful, but mostly because they save her time—by assessing what her kids know more efficiently than she can.
  • When I ask Suben which gadget she would bring with her if she had to teach on a desert island, she chooses the overhead projector, without hesitation. "I wouldn't be able to give up the overhead, because then I'd have to turn my back to the class," she said. The oldest technology in the room is the one that helps her the most with a fundamental human skill—presenting material while staying connected to every student in the room, watching who is getting it and who is not, without having to turn to write on a chalkboard.
  • One of the most reliable indicators of academic achievement among English speakers in Canada is working knowledge of a second language; if it was one learned at school instead of natively, even better. Learning to think, write and speak in two or more languages has been proven repeatedly to boost intellectual development across a range of subjects
Javier E

Lies, Damned Lies, and Medical Science - Magazine - The Atlantic - 0 views

  • He and his team have shown, again and again, and in many different ways, that much of what biomedical researchers conclude in published studies—conclusions that doctors keep in mind when they prescribe antibiotics or blood-pressure medication, or when they advise us to consume more fiber or less meat, or when they recommend surgery for heart disease or back pain—is misleading, exaggerated, and often flat-out wrong. He charges that as much as 90 percent of the published medical information that doctors rely on is flawed. His work has been widely accepted by the medical community
  • for all his influence, he worries that the field of medical research is so pervasively flawed, and so riddled with conflicts of interest, that it might be chronically resistant to change—or even to publicly admitting that there’s a problem
  • he discovered that the range of errors being committed was astonishing: from what questions researchers posed, to how they set up the studies, to which patients they recruited for the studies, to which measurements they took, to how they analyzed the data, to how they presented their results, to how particular studies came to be published in medical journals
  • “The studies were biased,” he says. “Sometimes they were overtly biased. Sometimes it was difficult to see the bias, but it was there.” Researchers headed into their studies wanting certain results—and, lo and behold, they were getting them. We think of the scientific process as being objective, rigorous, and even ruthless in separating out what is true from what we merely wish to be true, but in fact it’s easy to manipulate results, even unintentionally or unconsciously. “At every step in the process, there is room to distort results, a way to make a stronger claim or to select what is going to be concluded,” says Ioannidis. “There is an intellectual conflict of interest that pressures researchers to find whatever it is that is most likely to get them funded.”
  • Ioannidis laid out a detailed mathematical proof that, assuming modest levels of researcher bias, typically imperfect research techniques, and the well-known tendency to focus on exciting rather than highly plausible theories, researchers will come up with wrong findings most of the time.
  • if you’re attracted to ideas that have a good chance of being wrong, and if you’re motivated to prove them right, and if you have a little wiggle room in how you assemble the evidence, you’ll probably succeed in proving wrong theories right. His model predicted, in different fields of medical research, rates of wrongness roughly corresponding to the observed rates at which findings were later convincingly refuted: 80 percent of non-randomized studies (by far the most common type) turn out to be wrong, as do 25 percent of supposedly gold-standard randomized trials, and as much as 10 percent of the platinum-standard large randomized trials. (A short numeric sketch of this logic follows this list.)
  • He zoomed in on 49 of the most highly regarded research findings in medicine over the previous 13 years, as judged by the science community’s two standard measures: the papers had appeared in the journals most widely cited in research articles, and the 49 articles themselves were the most widely cited articles in these journals
  • Ioannidis was putting his contentions to the test not against run-of-the-mill research, or even merely well-accepted research, but against the absolute tip of the research pyramid. Of the 49 articles, 45 claimed to have uncovered effective interventions. Thirty-four of these claims had been retested, and 14 of these, or 41 percent, had been convincingly shown to be wrong or significantly exaggerated. If between a third and a half of the most acclaimed research in medicine was proving untrustworthy, the scope and impact of the problem were undeniable.
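To make the quoted wrongness rates concrete, here is a minimal sketch in Python of the positive-predictive-value calculation at the heart of Ioannidis’s 2005 paper “Why Most Published Research Findings Are False.” The formula is the standard one from that paper; the parameter values are illustrative assumptions, not figures reported in the article, and the paper’s bias term is omitted for simplicity.

```python
# Probability that a claimed ("statistically significant") finding is true,
# given the pre-study odds R that a tested hypothesis is true, the
# significance threshold alpha, and the study's power (1 - beta):
#     PPV = power * R / (power * R + alpha)

def ppv(prior_odds: float, alpha: float = 0.05, power: float = 0.8) -> float:
    """Positive predictive value of a significant finding (no bias term)."""
    true_positives = power * prior_odds   # true hypotheses that test positive
    false_positives = alpha               # false hypotheses that slip through
    return true_positives / (true_positives + false_positives)

# Illustrative assumption: an exploratory field where pre-study odds are
# roughly 1:10 that any tested hypothesis is true.
print(ppv(prior_odds=0.1))              # ~0.62: over a third of claims false
print(ppv(prior_odds=0.1, power=0.2))   # ~0.29: underpowered, mostly false
```

Adding the paper’s researcher-bias term only pushes these numbers lower, which is how modest bias plus unlikely hypotheses yields the high failure rates quoted above.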
Javier E

Big Think Interview With Nicholas Carr | Nicholas Carr | Big Think - 0 views

  • Neurologically, how does our brain adapt itself to new technologies? Nicholas Carr: A couple of types of adaptations take place in your brain. One is a strengthening of the synaptical connections between the neurons involved in using that instrument, in using that tool. And basically these are chemical – neural chemical – changes. Cells in our brain communicate by transmitting electrical signals between them, and those electrical signals are actually activated by the exchange of chemicals, neurotransmitters, in our synapses. And so when you begin to use a tool, for instance, you have much stronger electrochemical signals being processed through those synaptical connections. The second, and even more interesting, adaptation is actual physical change, anatomical change. You may grow new neurons that are then recruited into these circuits, or your existing neurons may grow new synaptical terminals. And again, that also serves to strengthen the activity in those particular pathways that are being used. On the other hand, you know, the brain likes to be efficient, and so even as it’s strengthening the pathways you’re exercising, it’s weakening the connections between the cells that supported old ways of thinking or working or behaving that you’re not exercising so much.
  • And it was only in around the year 800 or 900 that we saw the introduction of word spaces. Suddenly reading became, in a sense, easier, and you had the arrival of silent reading, which changed the act of reading from just transcription of speech to something that every individual did on their own. Suddenly you had this whole ideal of the silent solitary reader who was improving their mind, expanding their horizons, and so forth. And when Gutenberg invented the printing press around 1450, what that served to do was take this new very attentive, very deep form of reading, which had been limited to monasteries and universities, and, by making books much cheaper and much more available, spread that way of reading out to a much larger audience. And so for the last 500 years or so, one of the central facts of culture was deep solitary reading.
  • What the book does as a technology is shield us from distraction. The only thing going on is the progression of words and sentences across page after page, and so suddenly we see this immersive, very attentive kind of thinking, whether you are paying attention to a story or to an argument, or whatever. And what we know about the brain is that the brain adapts to these types of tools.
  • we adapt to the environment of the internet, which is an environment of kind of constant immersion and information and constant distractions, interruptions, juggling lots of messages, lots of bits of information.
  • Because it’s no longer just a matter of personal choice, of personal discipline, though obviously those things are always important. What we’re seeing, and we see this over and over again in the history of technology, is that the technology of the web, the technology of digital media, gets entwined very, very deeply into social processes, into expectations. More and more so, for instance, in our work lives: if our boss and all our colleagues are constantly exchanging messages, constantly checking email on their Blackberry or iPhone or their Droid or whatever, then it becomes very difficult to say, I’m not going to be as connected, because you feel like your career is going to take a hit.
  • With the arrival – with the transfer now of text more and more onto screens, we see, I think, a new and in some ways more primitive way of reading. In order to take in information off a screen, when you are also being bombarded with all sorts of other information, and when there are links in the text where you have to think, even for just a fraction of a second, do I click on this link or not, reading again becomes a more cognitively intensive act, the way it was back when there were no spaces between words.
  • If all your friends are planning their social lives through texts and Facebook and Twitter and so forth, then to back away from that means to feel socially isolated. And of course for all people, particularly for young people, there’s kind of nothing worse than feeling socially isolated, that your friends are, you know, having these conversations and you’re not involved. So it’s easy to state the solution, which is to become a little bit more disconnected. What’s hard is actually doing that.
  • if you want to change your brain, you change your habits. You change your habits of thinking. And that means, you know, setting aside time to engage in more contemplative, more reflective ways of thinking, to be – to screen out distractions. And that means retreating from digital media and from the web and from Smart Phones and texting and Facebook and Tweeting and everything else.
  • The Thinker was, you know, in a contemplative pose and was concentrating deeply, and wasn’t you know, multi-tasking. And because that is something that, until recently anyway, people always thought was the deepest and most distinctly human way of thinking.
  • we may end up finding that those are actually the most valuable ways of thinking that are available to us as human beings.
  • the ability to pay attention also is very important for our ability to build memories, to transfer information from our short-term memory to our long-term memory. And only when we do that do we weave new information into everything else we have stored in our brains. All the other facts we’ve learned, all the other experiences we’ve had, emotions we’ve felt. And that’s how you build, I think, a rich intellect and a rich intellectual life.
  • On the other hand, there is a cost. We lose – we begin to lose the faculties that we don’t exercise. So adaptation has a very, very positive side, but also a potentially negative side, because ultimately our brain is qualitatively neutral. It doesn’t care what it’s strengthening or what it’s weakening; it just responds to the way we’re exercising our mind.
  • the book in some ways is the most interesting from our own present standpoint, particularly when we want to think about the way the internet is changing us. It’s interesting to think about how the book changed us.
  • So we become, after the arrival of the printing press in general, more attentive, more attuned to contemplative ways of thinking. And that’s a very unnatural way of using our mind. You know, paying attention, filtering out distractions.
  • what we lose is the ability to pay deep attention to one thing for a sustained period of time, to filter out distractions.
Javier E

Start-Up Handles Social Media Background Checks - NYTimes.com - 0 views

  • 75 percent of recruiters are required by their companies to do online research of candidates.
Javier E

A Christian Nation? Since When? - NYTimes.com - 0 views

  • For all our talk about separation of church and state, religious language has been written into our political culture in countless ways. It is inscribed in our pledge of patriotism, marked on our money, carved into the walls of our courts and our Capitol. Perhaps because it is everywhere, we assume it has been from the beginning.
  • the founding fathers didn’t create the ceremonies and slogans that come to mind when we consider whether this is a Christian nation. Our grandfathers did.
  • Back in the 1930s, business leaders found themselves on the defensive. Their public prestige had plummeted with the Great Crash; their private businesses were under attack by Franklin D. Roosevelt’s New Deal from above and labor from below. To regain the upper hand, corporate leaders fought back on all fronts. They waged a figurative war in statehouses and, occasionally, a literal one in the streets; their campaigns extended from courts of law to the court of public opinion.
  • But nothing worked particularly well until they began an inspired public relations offensive that cast capitalism as the handmaiden of Christianity. The two had been described as soul mates before, but in this campaign they were wedded in pointed opposition to the “creeping socialism” of the New Deal
  • Accordingly, throughout the 1930s and ’40s, corporate leaders marketed a new ideology that combined elements of Christianity with an anti-federal libertarianism.
  • Powerful business lobbies like the United States Chamber of Commerce and the National Association of Manufacturers led the way, promoting this ideology’s appeal in conferences and P.R. campaigns. Generous funding came from prominent businessmen
  • In a shrewd decision, these executives made clergymen their spokesmen.
  • businessmen worked to recruit clergy through private meetings and public appeals. Many answered the call
  • The most important clergyman for Christian libertarianism, though, was the Rev. Billy Graham.
  • In his initial ministry, in the early 1950s, Mr. Graham supported corporate interests so zealously that a London paper called him “the Big Business evangelist.” The Garden of Eden, he informed revival attendees, was a paradise with “no union dues, no labor leaders, no snakes, no disease.” In the same spirit, he denounced all “government restrictions” in economic affairs, which he invariably attacked as “socialism.”
  • Dwight D. Eisenhower fulfilled that prediction. With Mr. Graham offering Scripture for Ike’s speeches, the Republican nominee campaigned in what he called a “great crusade for freedom.”
  • Elected in a landslide, Eisenhower told Mr. Graham that he had a mandate for a “spiritual renewal.”
  • Although Eisenhower relied on Christian libertarian groups in the campaign, he parted ways with their agenda once elected. The movement’s corporate sponsors had seen religious rhetoric as a way to dismantle the New Deal state.
  • But the newly elected president thought that a fool’s errand. “Should any political party attempt to abolish Social Security, unemployment insurance, and eliminate labor laws and farm programs,” he noted privately, “you would not hear of that party again in our political history.”
  • Unlike those who held public spirituality as a means to an end, Eisenhower embraced it as an end unto itself.
  • Uncoupling the language of “freedom under God” from its Christian libertarian roots, Eisenhower erected a bigger revival tent, welcoming Jews and Catholics alongside Protestants, and Democrats as well as Republicans. Rallying the country, he advanced a revolutionary array of new religious ceremonies and slogans.
  • The rest of Washington consecrated itself, too. The Pentagon, State Department and other executive agencies quickly instituted prayer services of their own. In 1954, Congress added “under God” to the previously secular Pledge of Allegiance. It placed a similar slogan, “In God We Trust,” on postage that year and voted the following year to add it to paper money; in 1956, it became the nation’s official motto.
  • During these years, Americans were told, time and time again, not just that the country should be a Christian nation, but that it always had been one. They soon came to think of the United States as “one nation under God.” They’ve believed it ever since.
Javier E

The Rich Have Higher Level of Narcissism, Study Shows | TIME.com - 1 views

  • The rich really are different — and, apparently, more self-absorbed, according to the latest research.
  • Recent studies show, for example, that wealthier people are more likely to cut people off in traffic and to behave unethically in simulated business and charity scenarios.
  • Earlier this year, statistics on charitable giving revealed that while the wealthy donate about 1.3% of their income to charity, the poorest actually give more than twice as much as a proportion of their earnings — 3.2%.
  • In five different experiments involving several hundred undergraduates and 100 adults recruited from online communities, the researchers found higher levels of both narcissism and entitlement among those of higher income and social class.
  • when asked to visually depict themselves as circles, with size indicating relative importance, richer people picked larger circles for themselves and smaller ones for others. Another experiment found that they also looked in the mirror more frequently.
  • The wealthier participants were also more likely to agree with statements like “I honestly feel I’m just more deserving than other people
  • But which came first — did gaining wealth increase self-aggrandizement? Were self-infatuated people more likely to seek and then gain riches?
  • To explore that relationship further, the researchers also asked the college students in one experiment to report the educational attainment and annual income of their parents. Those with more highly educated and wealthier parents remained higher in their self-reported entitlement and narcissistic characteristics. “That would suggest that it’s not just [that] people who feel entitled are more likely to become wealthy,” says Piff. Wealth, in other words, may breed narcissistic tendencies — and wealthy people justify their excess by convincing themselves that they are more deserving of it
  • “The strength of the study is that it uses multiple methods for measuring narcissism and entitlement and social class and multiple populations, and that can really increase our confidence in the results,”
  • “This paper should not be read as saying that narcissists are more successful, because we know from lots of other studies that that’s not true.”
  • “entitlement is a facet of narcissism,” says Twenge. “And [it’s the] one most associated with high social class. It’s the idea that you deserve special treatment and that things will come to you without working hard.”
  • Manipulating the sense of entitlement, however, may provide a way to influence narcissism. In the final experiment in the paper, the researchers found that having participants list three benefits of seeing others as equals eliminated class differences in narcissism, while having them simply list three daily activities did not.
  • In the meantime, the connection between wealth and entitlement could have troubling social implications. “You have this bifurcation of rich and poor,” says Levine. “The rich are increasingly entitled, and since they set the cultural tone for advertising and all those kinds of things, I think there’s a pervasive sense of entitlement.”
  • That could perpetuate a deepening lack of empathy that could fuel narcissistic tendencies. “You could imagine negative attitudes toward wealth redistribution as a result of entitlement,” says Piff. “The more severe inequality becomes, the more entitled people may feel and the less likely to share those resources they become.”
anonymous

Daily Report: The Internet Is Full of Mean People - The New York Times - 0 views

  • That the Internet is full of terrible things is not exactly a revelation, but a point worth noting.
  • Terrorist recruiting, flame wars, trolls, hackers and depictions of deviant behavior
  • It’s out there.
  • But in the interest of balance, given all this criticism the Internet has faced lately, let’s list a few great (or at least harmless) things about the global network
  • None of that, of course, even touches on the change-the-world technologies in medicine, commerce, communications, artificial intelligence, education and any number of fields that wouldn’t exist without the Internet.
  • So, Internet, you’ve got an ugly streak for sure. But maybe you’re getting a bum rap.
Javier E

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
anonymous

Holding hands can sync brainwaves, ease pain, study shows -- ScienceDaily - 0 views

  • Reach for the hand of a loved one in pain and not only will your breathing and heart rate synchronize with theirs, your brain wave patterns will couple up too, according to a study published this week in the Proceedings of the National Academy of Sciences (PNAS). The study, by researchers with the University of Colorado Boulder and University of Haifa, also found that the more empathy a comforting partner feels for a partner in pain, the more their brainwaves fall into sync. And the more those brain waves sync, the more the pain goes away.
  • The study is the latest in a growing body of research exploring a phenomenon known as "interpersonal synchronization," in which people physiologically mirror the people they are with. It is the first to look at brain wave synchronization in the context of pain, and offers new insight into the role brain-to-brain coupling may play in touch-induced analgesia, or healing touch.
  • He and his colleagues at University of Haifa recruited 22 heterosexual couples, ages 23 to 32, who had been together for at least one year, and put them through several two-minute scenarios as electroencephalography (EEG) caps measured their brainwave activity. The scenarios included sitting together not touching; sitting together holding hands; and sitting in separate rooms. Then they repeated the scenarios as the woman was subjected to mild heat pain on her arm. Merely being in each other’s presence, with or without touch, was associated with some brain wave synchronicity in the alpha mu band, a wavelength associated with focused attention. If they held hands while she was in pain, the coupling increased the most.
  • The study did not explore whether the same effect would occur with same-sex couples, or what happens in other kinds of relationships. The takeaway for now, Pavel said: Don't underestimate the power of a hand-hold."You may express empathy for a partner's pain, but without touch it may not be fully communicated," he said.
Javier E

How Tech Can Turn Doctors Into Clerical Workers - The New York Times - 0 views

  • what I see in my colleague is disillusionment, and it has come too early, and I am seeing too much of it.
  • In America today, the patient in the hospital bed is just the icon, a place holder for the real patient who is not in the bed but in the computer. That virtual entity gets all our attention. Old-fashioned “bedside” rounds conducted by the attending physician too often take place nowhere near the bed but have become “card flip” rounds
  • My young colleague slumping in the chair in my office survived the student years, then three years of internship and residency and is now a full-time practitioner and teacher. The despair I hear comes from being the highest-paid clerical worker in the hospital: For every one hour we spend cumulatively with patients, studies have shown, we spend nearly two hours on our primitive Electronic Health Records, or “E.H.R.s,” and another hour or two during sacred personal time.
  • The living, breathing source of the data and images we juggle, meanwhile, is in the bed and left wondering: Where is everyone? What are they doing? Hello! It’s my body, you know
  • Our $3.4 trillion health care system is responsible for more than a quarter of a million deaths per year because of medical error, the rough equivalent of, say, a jumbo jet’s crashing every day.
  • I can get cash and account details all over America and beyond. Yet I can’t reliably get a patient record from across town, let alone from a hospital in the same state, even if both places use the same brand of E.H.R.
  • the leading E.H.R.s were never built with any understanding of the rituals of care or the user experience of physicians or nurses. A clinician will make roughly 4,000 keyboard clicks during a busy 10-hour emergency-room shift
  • In the process, our daily progress notes have become bloated cut-and-paste monsters that are inaccurate and hard to wade through. A half-page, handwritten progress note of the paper era might in a few lines tell you what a physician really thought
  • so much of the E.H.R., but particularly the physical exam it encodes, is a marvel of fiction, because we humans don’t want to leave a check box empty or leave gaps in a template.
  • For a study, my colleagues and I at Stanford solicited anecdotes from physicians nationwide about patients for whom an oversight in the exam (a “miss”) had resulted in real consequences, like diagnostic delay, radiation exposure, therapeutic or surgical misadventure, even death. They were the sorts of things that would leave no trace in the E.H.R. because the recorded exam always seems complete — and yet the omission would be glaring and memorable to other physicians involved in the subsequent care. We got more than 200 such anecdotes.
  • The reason for these errors? Most of them resulted from exams that simply weren’t done as claimed. “Food poisoning” was diagnosed because the strangulated hernia in the groin was overlooked, or patients were sent to the catheterization lab for chest pain because no one saw the shingles rash on the left chest.
  • I worry that such mistakes come because we’ve gotten trapped in the bunker of machine medicine. It is a preventable kind of failure
  • How we salivated at the idea of searchable records, of being able to graph fever trends, or white blood counts, or share records at a keystroke with another institution — “interoperability”
  • The seriously ill patient has entered another kingdom, an alternate universe, a place and a process that is frightening, infantilizing; that patient’s greatest need is both scientific state-of-the-art knowledge and genuine caring from another human being. Caring is expressed in listening, in the time-honored ritual of the skilled bedside exam — reading the body — in touching and looking at where it hurts and ultimately in localizing the disease for patients not on a screen, not on an image, not on a biopsy report, but on their bodies.
  • What if the computer gave the nurse the big picture of who he was both medically and as a person?
  • a professor at M.I.T. whose current interest in biomedical engineering is “bedside informatics,” marvels at the fact that in an I.C.U., a blizzard of monitors from disparate manufacturers display EKG, heart rate, respiratory rate, oxygen saturation, blood pressure, temperature and more, and yet none of this is pulled together, summarized and synthesized anywhere for the clinical staff to use
  • What these monitors do exceedingly well is sound alarms, an average of one alarm every eight minutes, or more than 180 per patient per day. What is our most common response to an alarm? We look for the button to silence the nuisance because, unlike those in a Boeing cockpit, say, our alarms are rarely diagnosing genuine danger.
  • By some estimates, more than 50 percent of physicians in the United States have at least one symptom of burnout, defined as a syndrome of emotional exhaustion, cynicism and decreased efficacy at work
  • It is on the increase, up by 9 percent from 2011 to 2014 in one national study. This is clearly not an individual problem but a systemic one, a 4,000-key-clicks-a-day problem.
  • The E.H.R. is only part of the issue: Other factors include rapid patient turnover, decreased autonomy, merging hospital systems, an aging population, the increasing medical complexity of patients. Even if the E.H.R. is not the sole cause of what ails us, believe me, it has become the symbol of burnout.
  • burnout is one of the largest predictors of physician attrition from the work force. The total cost of recruiting a physician can be nearly $90,000, but the lost revenue per physician who leaves is between $500,000 and $1 million, even more in high-paying specialties.
  • I hold out hope that artificial intelligence and machine-learning algorithms will transform our experience, particularly if natural-language processing and video technology allow us to capture what is actually said and done in the exam room.
  • as with any lab test, what A.I. will provide is at best a recommendation that a physician using clinical judgment must decide how to apply.
  • True clinical judgment is more than addressing the avalanche of blood work, imaging and lab tests; it is about using human skills to understand where the patient is in the trajectory of a life and the disease, what the nature of the patient’s family and social circumstances is and how much they want done.
  • Much of that is a result of poorly coordinated care, poor communication, patients falling through the cracks, knowledge not being transferred and so on, but some part of it is surely from failing to listen to the story and diminishing skill in reading the body as a text.
  • As he was nearing death, Avedis Donabedian, a guru of health care metrics, was asked by an interviewer about the commercialization of health care. “The secret of quality,” he replied, “is love.”
Javier E

The Moral Instinct - The New York Times - 2 views

  • Today, a new field is using illusions to unmask a sixth sense, the moral sense. Moral intuitions are being drawn out of people in the lab, on Web sites and in brain scanners, and are being explained with tools from game theory, neuroscience and evolutionary biology.
  • The other hallmark is that people feel that those who commit immoral acts deserve to be punished
  • If morality is a mere trick of the brain, some may fear, our very grounds for being moral could be eroded. Yet as we shall see, the science of the moral sense can instead be seen as a way to strengthen those grounds, by clarifying what morality is and how it should steer our actions.
  • ...13 more annotations...
  • The starting point for appreciating that there is a distinctive part of our psychology for morality is seeing how moral judgments differ from other kinds of opinions we have on how people ought to behave.
  • Moralization is a psychological state that can be turned on and off like a switch, and when it is on, a distinctive mind-set commandeers our thinking. This is the mind-set that makes us deem actions immoral (“killing is wrong”), rather than merely disagreeable (“I hate brussels sprouts”), unfashionable (“bell-bottoms are out”) or imprudent (“don’t scratch mosquito bites”).
  • The first hallmark of moralization is that the rules it invokes are felt to be universal
  • Many of these moralizations, like the assault on smoking, may be understood as practical tactics to reduce some recently identified harm. But whether an activity flips our mental switches to the “moral” setting isn’t just a matter of how much harm it does
  • We all know what it feels like when the moralization switch flips inside us — the righteous glow, the burning dudgeon, the drive to recruit others to the cause.
  • The human moral sense turns out to be an organ of considerable complexity, with quirks that reflect its evolutionary history and its neurobiological foundations.
  • At the same time, many behaviors have been amoralized, switched from moral failings to lifestyle choices. They include divorce, illegitimacy, being a working mother, marijuana use and homosexuality.
  • This wave of amoralization has led the cultural right to lament that morality itself is under assault, as we see in the group that anointed itself the Moral Majority. In fact there seems to be a Law of Conservation of Moralization, so that as old behaviors are taken out of the moralized column, new ones are added to it.
  • Much of our recent social history, including the culture wars between liberals and conservatives, consists of the moralization or amoralization of particular kinds of behavior.
  • People don’t generally engage in moral reasoning, Haidt argues, but moral rationalization: they begin with the conclusion, coughed up by an unconscious emotion, and then work backward to a plausible justification.
  • When psychologists say “most people” they usually mean “most of the two dozen sophomores who filled out a questionnaire for beer money.” But in this case it means most of the 200,000 people from a hundred countries who shared their intuitions on a Web-based experiment conducted by the psychologists Fiery Cushman and Liane Young and the biologist Marc Hauser. A difference between the acceptability of switch-pulling and man-heaving, and an inability to justify the choice, was found in respondents from Europe, Asia and North and South America; among men and women, blacks and whites, teenagers and octogenarians, Hindus, Muslims, Buddhists, Christians, Jews and atheists; people with elementary-school educations and people with Ph.D.’s.
  • Joshua Greene, a philosopher and cognitive neuroscientist, suggests that evolution equipped people with a revulsion to manhandling an innocent person. This instinct, he suggests, tends to overwhelm any utilitarian calculus that would tot up the lives saved and lost
  • the findings corroborate Greene’s theory that our nonutilitarian intuitions come from the victory of an emotional impulse over a cost-benefit analysis.
cvanderloo

Two stereotypes that diminish the humanity of the Atlanta shooting victims - and all As... - 0 views

  • Since the Atlanta spa shootings, the U.S. media has been working harder than usual to describe and understand Asian Americans.
  • One is that of Asian Americans as the “perpetual foreigner” – immigrants who constantly struggle, never assimilate.
  • Both stereotypes have been levied in tandem against Asian immigrants to the U.S. for centuries.
  • ...8 more annotations...
  • In the mid-1800s, Chinese laborers made up the first significant wave of Asian immigration to the United States. Recruited during the Gold Rush and to build the Transcontinental Railroad, the men were described by employers like industrialist Leland Stanford as “quiet, peaceable, patient, industrious and economical.”
  • The model-minority myth emerged later. In 1965 the Hart-Celler Act opened immigration quotas that had previously favored Western Europeans.
  • Out-earning all other racial groups, Asian Americans became the “model minority,” a term coined by sociologist William Petersen in a 1966 New York Times article, “Success Story: Japanese American style.”
  • Without the same educational and professional sponsorships as my father had, many in these later waves founded mom-and-pop businesses and peer lending networks. They gravitated toward blue-collar industries and “pink-collar” jobs in salons, food service or child care.
  • The women who worked and died at Young’s Asian Massage, Gold Spa and Aromatherapy Spa lived in this Asian America – not my father’s or mine.
  • Such workers make up the other side of the high-earning “model minority” statistic. That data masks the fact that Asian Americans have the highest income inequality of any racial group, with the top 10% of this population earning more than 10 times what the bottom 10% earns.
  • When Asian Americans are so easily, and so often, stereotyped, they become categories, not people – not individuals who make lives, raise families and do the best they can in their adopted homeland.
  • “My mother didn’t do anything wrong,” the son of 63-year-old Yong Ae Yue, the fan of Korean soap operas and cooking, told the Atlanta Journal-Constitution. “And she deserves the recognition that she is a human.”
lucieperloff

As Pandemic Upends Teaching, Fewer Students Want to Pursue It - The New York Times - 0 views

  • Kianna Ameni-Melvin’s parents used to tell her that there wasn’t much money to be made in education. But it was easy enough for her to tune them out as she enrolled in an education studies program, with her mind set on teaching high school special education.
  • “I didn’t want to start despising a career I had a passion for because of the salary,”
  • leaving teachers concerned for their health and scrambling to do their jobs effectively.
  • ...12 more annotations...
  • Many program leaders believe enrollment fell because of the perceived hazards posed by in-person teaching and the difficulties of remote learning, combined with longstanding frustrations over low pay compared with professions that require similar levels of education.
  • But for many students, the challenges posed by remote teaching can be just as steep.
  • After months of seeing only her roommates, moving around a classroom brimming with fourth and fifth graders was nerve-racking.
  • new anxieties were most likely scaring away some potential applicants.
  • The number of education degrees conferred by American colleges and universities dropped by 22 percent between 2006 and 2019
  • At Portland State University in Oregon, some students were not able to get classroom placements while schools were operating remotely.
  • Educators have struggled with recruitment to the profession since long before the pandemic.
  • one quarter of respondents said that they were likely to leave the profession before the end of the school year.
  • California State University in Long Beach saw enrollment climb 15 percent this year,
  • Ms. Ameni-Melvin, the Towson student, said she would continue her education program for now because she felt invested after three years there.
  • Earlier in the pandemic, as she watched her parents, both teachers, stumble through the difficulties of preparing for remote class, she wondered: Was it too late to choose law school instead?
  • Ms. Ízunza Barba said she realized then that there was no other career path that could prove as meaningful. “Seeing her make her students laugh made me realize how much a teacher can impact someone’s day,” she said. “I was like, whoa, that’s something I want to do.”
Javier E

America Is Flunking Math - Persuasion - 1 views

  • One can argue that the preeminence of each civilization was, in part, due to its sophisticated understanding and use of mathematics. This is particularly clear in the case of the West, which forged ahead in the 17th century with the discovery of calculus, one of the greatest scientific breakthroughs of all time.
  • The United States became the dominant force in the mathematical sciences in the wake of World War II, largely due to the disastrous genocidal policies of the Third Reich. The Nazis’ obsession with purging German science of what it viewed as nefarious Jewish influence led to a massive exodus of Jewish mathematicians and scientists to America
  • Indeed, academic institutions in the United States have thrived largely because of their ability to attract talented individuals from around the world.
  • ...28 more annotations...
  • The quality of mathematics research in the United States today is the envy of the scientific world. This is a direct result of the openness and inclusivity of the profession.
  • Can Americans maintain this unmatched excellence in the future? There are worrisome signs that suggest not.
  • The Organization for Economic Cooperation and Development compares mathematical proficiency among 15-year-olds by country worldwide. According to its 2018 report, America ranked 37th while China, America’s main competitor for world leadership, came in first.
  • This is despite the fact that the United States is the fifth-highest spender per pupil among the 37 developed OECD nations
  • This massive failure of our K-12 education system trickles through the STEM pipeline.
  • At the undergraduate level, too few American students are prepared for higher-level mathematics courses. These students are then unprepared for rigorous graduate-level work
  • According to our own experiences at the universities where we teach, an overwhelming majority of American students with strong math backgrounds are either foreign-born or first-generation students who have additional support from their education-conscious families. At all levels, STEM disciplines are more and more dependent on a constant flow of foreign talent.
  • There are many reasons for this failure, but the way that we educate and prepare teachers is particularly influential. The vast majority of K-12 math teachers are graduates of teacher-preparation programs that teach very little substantive mathematics
  • This has led to a constant stream of ill-advised and dumbed-down reforms. One of the latest fads is anti-racist mathematics. Promoted in several states, the bizarre doctrine threatens to further degrade the teaching of mathematics.
  • Another major concern is the twisted interpretation of diversity, equity, and inclusion (DEI).
  • Under the banner of DEI, universities are abandoning the use of standardized tests like the SAT and GRE in admissions, and cities are considering scrapping academic tracking and various gifted programs in schools, which they deem “inequitable.”
  • such programs are particularly effective, when properly implemented, at discovering and encouraging talented children from disadvantaged backgrounds.
  • The new 2021 Mathematics Framework, currently under consideration by California’s Department of Education, does away “with all tracking, acceleration, gifted programs, or any instruction that involves clustering by individual differences, without expressing any awareness of the impact these drastic alterations would have in preparing STEM-ready candidates.”
  • These measures will not only hinder the progress of the generations of our future STEM workforce but also contribute to structural inequalities, as they are uniquely detrimental to students whose parents cannot send them to private schools or effective enrichment programs.
  • These are just a few examples of an unprecedented fervor for revolutionary change in the name of Critical Race Theory (CRT), a doctrine that views the world as a fierce battleground for the narratives of various identity groups.
  • This will only lead to a further widening of racial disparities in educational outcomes while lowering American children’s rankings in education internationally.
  • Ill-conceived DEI policies, often informed by CRT, and the declining standards of K-12 math education feed each other in a vicious circle
  • Regarding minorities, in particular, public K-12 education all too often produces students unprepared to compete, thus leading to large disparities in admissions at universities, graduate programs, and faculty positions. This disparity is then condemned as a manifestation of structural racism, resulting in administrative measures to lower the evaluation criteria. Lowering standards at all levels leads eventually to even worse outcomes and larger disparities, and so on in a downward spiral.
  • A case in point is the recent report by the American Mathematical Society that accuses the entire mathematics community, with the thinnest of specific evidence, of systemic racial discrimination. A major justification put forward for such a grave accusation is the lack of sufficient representation of Black mathematicians in the profession.
  • the report, while raising awareness of several ugly facts from the long-ago past, makes little effort to address the real reasons for this, mainly the catastrophic failure of the K-12 mathematical educational system.
  • The National Science Foundation, a federal institution meant to support fundamental research, is now diverting some of its limited funding to various DEI initiatives of questionable benefit.
  • Meanwhile, other countries, especially China, are doing precisely the opposite, following the model of our past dedication to objective measures of excellence. How long before we will see a reverse exodus, away from the United States?
  • The present crisis can still be reversed by focusing on a few concrete actions:
  • Improve schools in urban areas and inner-city neighborhoods by following the most promising education programs. These programs demonstrate that inner-city children benefit if they are challenged by high standards and a nurturing environment.
  • Follow the lead of other highly successful rigorous programs such as BASIS schools and Math for America, which focus on rigorous STEM curricula, combined with 21st-century teaching methods, and recruit talented teachers to help them build on their STEM knowledge and teaching methods.
  • Increase, rather than eliminate, tailored instruction, both for accelerated and remedial math courses.
  • Reject the soft bigotry of low expectations, that Black children cannot do well in competitive mathematics programs and need dumbed-down ethnocentric versions of mathematics.
  • Uphold the objective selection process based on merit at all levels of education and research.
Javier E

Opinion | Bias Is a Big Problem. But So Is 'Noise.' - The New York Times - 1 views

  • The word “bias” commonly appears in conversations about mistaken judgments and unfortunate decisions. We use it when there is discrimination, for instance against women or in favor of Ivy League graduates
  • the meaning of the word is broader: A bias is any predictable error that inclines your judgment in a particular direction. For instance, we speak of bias when forecasts of sales are consistently optimistic or investment decisions overly cautious.
  • Society has devoted a lot of attention to the problem of bias — and rightly so
  • ...26 more annotations...
  • when it comes to mistaken judgments and unfortunate decisions, there is another type of error that attracts far less attention: noise.
  • To see the difference between bias and noise, consider your bathroom scale. If on average the readings it gives are too high (or too low), the scale is biased
  • It is hard to escape the conclusion that sentencing is in part a lottery, because the punishment can vary by many years depending on which judge is assigned to the case and on the judge’s state of mind on that day. The judicial system is unacceptably noisy.
  • While bias is the average of errors, noise is their variability. [A short numerical sketch of this distinction follows this list.]
  • Although it is often ignored, noise is a large source of malfunction in society.
  • The average difference between the sentences that two randomly chosen judges gave for the same crime was more than 3.5 years. Considering that the mean sentence was seven years, that was a disconcerting amount of noise.
  • If it shows different readings when you step on it several times in quick succession, the scale is noisy.
  • How much of a difference would you expect to find between the premium values that two competent underwriters assigned to the same risk?
  • Executives in the insurance company said they expected about a 10 percent difference.
  • But the typical difference we found between two underwriters was an astonishing 55 percent of their average premium — more than five times as large as the executives had expected.
  • Many other studies demonstrate noise in professional judgments. Radiologists disagree on their readings of images and cardiologists on their surgery decisions
  • Wherever there is judgment, there is noise — and more of it than you think.
  • Noise causes error, as does bias, but the two kinds of error are separate and independent.
  • A company’s hiring decisions could be unbiased overall if some of its recruiters favor men and others favor women. However, its hiring decisions would be noisy, and the company would make many bad choices
  • Where does noise come from?
  • There is much evidence that irrelevant circumstances can affect judgments.
  • for instance, a judge’s mood, fatigue and even the weather can all have modest but detectable effects on judicial decisions.
  • people can have different general tendencies. Judges often vary in the severity of the sentences they mete out: There are “hanging” judges and lenient ones.
  • People can have not only different general tendencies (say, whether they are harsh or lenient) but also different patterns of assessment (say, which types of cases they believe merit being harsh or lenient about).
  • Underwriters differ in their views of what is risky, and doctors in their views of which ailments require treatment.
  • Once you become aware of noise, you can look for ways to reduce it.
  • independent judgments from a number of people can be averaged (a frequent practice in forecasting)
  • Guidelines, such as those often used in medicine, can help professionals reach better and more uniform decisions
  • imposing structure and discipline in interviews and other forms of assessment tends to improve judgments of job candidates.
  • No noise-reduction techniques will be deployed, however, if we do not first recognize the existence of noise.
  • Organizations and institutions, public and private, will make better decisions if they take noise seriously.
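A minimal numerical sketch of the bias/noise distinction drawn above. The seven-year mean sentence is the only figure taken from the excerpts; the individual judgments, the relative_difference helper, and the underwriter premiums are all invented for illustration, not data from the studies cited.

```python
import statistics

def relative_difference(a: float, b: float) -> float:
    # |a - b| as a fraction of the pair's average: a plausible reading of
    # the metric behind the "55 percent of their average premium" finding.
    return abs(a - b) / ((a + b) / 2)

# Invented sentences (in years) that five judges might give the same case;
# 7.0 is treated as the "correct" sentence, echoing the mean cited above.
judgments = [4.0, 6.5, 7.0, 9.0, 11.0]
target = 7.0

errors = [j - target for j in judgments]
bias = statistics.mean(errors)    # systematic tilt: the average of the errors
noise = statistics.stdev(errors)  # variability: the spread of the errors

print(f"bias  = {bias:+.2f} years")   # +0.50, slightly harsh on average
print(f"noise =  {noise:.2f} years")  # 2.65, a lottery-sized spread

# Averaging independent judgments leaves bias untouched but shrinks noise
# by roughly 1/sqrt(n), which is why forecasters pool estimates.
print(f"pooled judgment = {statistics.mean(judgments):.2f} years")

# Two hypothetical underwriters quoting 70 and 130 for the same risk:
print(f"relative difference = {relative_difference(70, 130):.0%}")
```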
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • ...72 more annotations...
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived. [A back-of-the-envelope illustration of this click-rate arithmetic follows this list.]
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts.
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency.
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
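A back-of-the-envelope illustration of the click-rate arithmetic flagged earlier in this list. Every figure below is an assumption chosen for round numbers, not a number from the review or from Google.

```python
# Assumed daily ad volume and average pay-per-click price (illustrative only).
daily_impressions = 5_000_000_000
cost_per_click = 0.50  # USD

# Baseline click-through rate versus a 0.1-percentage-point improvement.
for ctr in (0.010, 0.011):
    daily_revenue = daily_impressions * ctr * cost_per_click
    print(f"CTR {ctr:.1%}: ${daily_revenue:,.0f} per day")

# 1.0% yields $25,000,000/day; 1.1% yields $27,500,000/day.
# The "tiny" improvement is worth roughly $912 million a year, which is
# the financial logic behind what Zuboff calls the extraction imperative.
```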
Javier E

A News Organization That Rejects the View From Nowhere - Conor Friedersdorf - The Atlantic - 1 views

  • For many years, Rosen has been a leading critic of what he calls The View From Nowhere, or the conceit that journalists bring no prior commitments to their work. On his long-running blog, PressThink, he's advocated for "The View From Somewhere"—an effort by journalists to be transparent about their priors, whether ideological or otherwise.  Rosen is just one of several voices who'll shape NewCo. Still, the new venture may well be a practical test of his View from Somewhere theory of journalism. I chatted with Rosen about some questions he'll face. 
  • The View from Nowhere won’t be a requirement for our journalists. Nor will a single ideology prevail. NewCo itself will have a view of the world: Accountability journalism, exposing abuses of power, revealing injustices will no doubt be part of it. Under that banner many “views from somewhere” can fit.
  • The way "objectivity" evolves historically is out of something much more defensible and interesting, which is in that phrase "Of No Party or Clique." That's the founders of The Atlantic saying they want to be independent of party politics. They don't claim to have no politics, do they? They simply say: We're not the voice of an existing faction or coalition. But they're also not the Voice of God.
  • ...10 more annotations...
  • NewCo will emulate the founders of The Atlantic. At some point "independent from" turned into "objective about." That was the wrong turn, made long ago, by professional journalism, American-style.
  • You've written that The View From Nowhere is, in part, a defense mechanism against charges of bias originating in partisan politics. If you won't be invoking it, what will your defense be when those charges happen? There are two answers to that. 1) We told you where we're coming from. 2) High standards of verification. You need both.
  • What about ideological diversity? The View from Somewhere obviously permits it. You've said you'll have it. Is that because it is valuable in itself?
  • The basic insight is correct: Since "news judgment" is judgment, the product is improved when there are multiple perspectives at the table ... But, if the people who are recruited to the newsroom because they add perspectives that might otherwise be overlooked are also taught that they should leave their politics at the door, or think like professional journalists rather than representatives or their community, or privilege something called "news values" over the priorities they had when they decided to become journalists, then these people are being given a fatally mixed message, if you see what I mean. They are valued for the perspective they bring, and then told that they should transcend that perspective.
  • When people talk about objectivity in journalism they have many different things in mind. Some of these I have no quarrel with. You could even say I’m a “fan.” For example, if objectivity means trying to ground truth claims in verifiable facts, I am definitely for that. If it means there’s a “hard” reality out there that exists beyond any of our descriptions of it, sign me up. If objectivity is the requirement to acknowledge what is, regardless of whether we want it to be that way, then I want journalists who can be objective in that sense. Don’t you? If it means trying to see things in that fuller perspective Thomas Nagel talked about–pulling the camera back, revealing our previous position as only one of many–I second the motion. If it means the struggle to get beyond the limited perspective that our experience and upbringing afford us… yeah, we need more of that, not less. I think there is value in acts of description that do not attempt to say whether the thing described is good or bad. Is that objectivity? If so, I’m all for it, and I do that myself sometimes. 
  • By "we can do better than that" I mean: We can insist on the struggle to tell it like it is without also insisting on the View from Nowhere. The two are not connected. It was a mistake to think that they necessarily are. But why was this mistake made? To control people in the newsroom from "above." That's a big part of objectivity. Not truth. Control.
  • If it works out as you hope, if things are implemented well, etc., what's the potential payoff for readers? I think it's three things: First, this is a news site that is born into the digital world, but doesn't have to return profits to investors. That's not totally unique
  • Second: It's going to be a technology company as much as a news organization. That should result in better service.
  • a good formula for innovation is to start with something people want to do and eliminate some of the steps required to do it
  • The third upside is news with a human voice restored to it. This is the great lesson that blogging gives to journalism