TOK Friends / Group items tagged Journalism

Javier E

Older Americans Are 'Hooked' on Vitamins - The New York Times - 1 views

  • When she was a young physician, Dr. Martha Gulati noticed that many of her mentors were prescribing vitamin E and folic acid to patients. Preliminary studies in the early 1990s had linked both supplements to a lower risk of heart disease. She urged her father to pop the pills as well: “Dad, you should be on these vitamins, because every cardiologist is taking them or putting their patients on [them],” recalled Dr. Gulati, now chief of cardiology for the University of Arizona College of Medicine-Phoenix.
  • But just a few years later, she found herself reversing course, after rigorous clinical trials found neither vitamin E nor folic acid supplements did anything to protect the heart. Even worse, studies linked high-dose vitamin E to a higher risk of heart failure, prostate cancer and death from any cause.
  • More than half of Americans take vitamin supplements, including 68 percent of those age 65 and older, according to a 2013 Gallup poll. Among older adults, 29 percent take four or more supplements of any kind
  • ...20 more annotations...
  • Often, preliminary studies fuel irrational exuberance about a promising dietary supplement, leading millions of people to buy in to the trend. Many never stop. They continue even though more rigorous studies — which can take many years to complete — almost never find that vitamins prevent disease, and in some cases cause harm
  • There’s no conclusive evidence that dietary supplements prevent chronic disease in the average American, Dr. Manson said. And while a handful of vitamin and mineral studies have had positive results, those findings haven’t been strong enough to recommend supplements to the general American public, she said.
  • The National Institutes of Health has spent more than $2.4 billion since 1999 studying vitamins and minerals. Yet for “all the research we’ve done, we don’t have much to show for it,” said Dr. Barnett Kramer, director of cancer prevention at the National Cancer Institute.
  • A big part of the problem, Dr. Kramer said, could be that much nutrition research has been based on faulty assumptions, including the notion that people need more vitamins and minerals than a typical diet provides; that megadoses are always safe; and that scientists can boil down the benefits of vegetables like broccoli into a daily pill.
  • when researchers tried to deliver the key ingredients of a healthy diet in a capsule, Dr. Kramer said, those efforts nearly always failed.
  • It’s possible that the chemicals in the fruits and vegetables on your plate work together in ways that scientists don’t fully understand — and which can’t be replicated in a tablet.
  • More important, perhaps, is that most Americans get plenty of the essentials, anyway. Although the Western diet has a lot of problems — too much sodium, sugar, saturated fat and calories, in general — it’s not short on vitamins
  • Without even realizing it, someone who eats a typical lunch or breakfast “is essentially eating a multivitamin,”
  • The body naturally regulates the levels of many nutrients, such as vitamin C and many B vitamins, Dr. Kramer said, by excreting what it doesn’t need in urine. He added: “It’s hard to avoid getting the full range of vitamins.”
  • Not all experts agree. Dr. Walter Willett, a professor at the Harvard T.H. Chan School of Public Health, says it’s reasonable to take a daily multivitamin “for insurance.” Dr. Willett said that clinical trials underestimate supplements’ true benefits because they aren’t long enough, often lasting five to 10 years. It could take decades to notice a lower rate of cancer or heart disease in vitamin takers.
  • For Charlsa Bentley, 67, keeping up with the latest nutrition research can be frustrating. She stopped taking calcium, for example, after studies found it doesn’t protect against bone fractures. Additional studies suggest that calcium supplements increase the risk of kidney stones and heart disease.
  • People who take vitamins tend to be healthier, wealthier and better educated than those who don’t, Dr. Kramer said. They are probably less likely to succumb to heart disease or cancer, whether they take supplements or not. That can skew research results, making vitamin pills seem more effective than they really are
  • Because folic acid can lower homocysteine levels, researchers once hoped that folic acid supplements would prevent heart attacks and strokes. In a series of clinical trials, folic acid pills lowered homocysteine levels but had no overall benefit for heart disease, Dr. Lichtenstein said.
  • When studies of large populations showed that people who eat lots of seafood had fewer heart attacks, many assumed that the benefits came from the omega-3 fatty acids in fish oil, Dr. Lichtenstein said. Rigorous studies have failed to show that fish oil supplements prevent heart attacks.
  • But it’s possible the benefits of sardines and salmon have nothing to do with fish oil, Dr. Lichtenstein said. People who have fish for dinner may be healthier as a result of what they don’t eat, such as meatloaf and cheeseburgers.
  • “Eating fish is probably a good thing, but we haven’t been able to show that taking fish oil [supplements] does anything for you.”
  • In the tiny amounts provided by fruits and vegetables, beta carotene and similar substances appear to protect the body from a process called oxidation, which damages healthy cells, said Dr. Edgar Miller, a professor of medicine at Johns Hopkins School of Medicine. Experts were shocked when two large, well-designed studies in the 1990s found that beta carotene pills actually increased lung cancer rates.
  • Likewise, a clinical trial published in 2011 found that vitamin E, also an antioxidant, increased the risk of prostate cancer in men by 17 percent
  • “Vitamins are not inert,” said Dr. Eric Klein, a prostate cancer expert at the Cleveland Clinic who led the vitamin E study. “They are biologically active agents. We have to think of them in the same way as drugs. If you take too high a dose of them, they cause side effects.”
  • “We should be responsible physicians,” she said, “and wait for the data.”
anonymous

JAMA Editor Placed on Leave After Deputy's Comments on Racism - The New York Times - 0 views

  • JAMA Editor Placed on Leave After Deputy’s Comments on Racism
  • After a staff member dismissed racism as a problem in medicine on a podcast, a petition signed by thousands demanded a review of editorial processes at the journal.
  • Following controversial comments on racism in medicine made by a deputy editor at JAMA, the editor in chief of the prominent medical journal was placed on administrative leave on Thursday.
  • ...15 more annotations...
  • “Structural racism is an unfortunate term,” said Dr. Livingston, who is white. “Personally, I think taking racism out of the conversation will help. Many people like myself are offended by the implication that we are somehow racist.”
  • The podcast was promoted with a tweet from the journal that said, “No physician is racist, so how can there be structural racism in health care?”
  • The response to both was swift and angry, prompting the journal to take down the podcast and delete the tweet.
  • Comments made in the podcast were inaccurate, offensive, hurtful, and inconsistent with the standards of JAMA,
  • The A.M.A.’s email to employees promised that the investigation would probe “
  • “We are instituting changes that will address and prevent such failures from happening again.”
  • “It’s not just that this podcast is problematic — it’s that there is a long and documented history of institutional racism at JAMA,”
  • “That podcast should never have happened,”
  • “The fact that the podcast was conceived of, recorded and posted was unconscionable.”
  • “I think it caused an incalculable amount of pain and trauma to Black physicians and patients,” she said. “And I think it’s going to take a long time for the journal to heal that pain.”
  • “staff and leadership are overwhelmingly white and economically privileged,” and he committed to reviewing its editorial process.
  • Dr. Livingston later resigned.
  • how the podcast and associated tweet were developed, reviewed, and ultimately published,” and said that the association had engaged independent investigators to ensure objectivity.
  • The email did not offer a date for conclusion of the investigation.
Javier E

Decline of the WSJ « The Dish - 0 views

  • Dean Starkman charts the number of WSJ pieces longer than 2,500 words:
  • A common trait among Pulitzer projects is that they are ambitious, require extensive reporting and careful writing, carry some significance beyond the normal gathering of news, and/or have some kind of impact on the real world
  • [Rupert] Murdoch’s oft-stated antipathy to the concept of longform narrative public-interest journalism was the main reason some of us opposed his taking over the Journal in the first place.
  • ...1 more annotation...
  • Murdoch’s view: “The entire rationale of modern, objective, arm’s-length, editor-driven journalism—the quasi-religious nature of which had blossomed in no small way as a response to him—he regarded as artifice if not an outright sham.”
sissij

Fake Academe, Looking Much Like the Real Thing - The New York Times - 0 views

  • Academics need to publish in order to advance professionally, get better jobs or secure tenure.
  • Academe is losing its meaning now because society only sees how many journal articles you have published, not what you actually write in them. I think the growing business of academic publication fraud reflects that our society values certificates more than skills. The numerous articles on those "good" colleges also put pressure on teenagers and parents that a title means everything. However, that shouldn't be the core of education. There is never a shortcut to success. --Sissi (12/31/2016)
Javier E

The Republican Horse Race Is Over, and Journalism Lost - The New York Times - 0 views

  • Wrong, wrong, wrong — to the very end, we got it wrong.
  • in the end, you have to point the finger at national political journalism, which has too often lost sight of its primary directives in this election season: to help readers and viewers make sense of the presidential chaos; to reduce the confusion, not add to it; to resist the urge to put ratings, clicks and ad sales above the imperative of getting it right.
  • The first signs that something was amiss in the coverage of the Tea Party era actually surfaced in the 2014 midterms. Oh, you broadcast network newscast viewers didn’t know we had important elections with huge consequences for the governance of your country that year? You can be forgiven because the broadcast networks hardly covered them.
  • ...6 more annotations...
  • the lesson in Virginia, as the Washington Post reporter Paul Farhi wrote at the time, was that nothing exceeds the value of shoe-leather reporting, given that politics is an essentially human endeavor and therefore can defy prediction and reason.
  • Yet when Mr. Trump showed up on the scene, it was as if that had never happened.
  • It was another thing to declare, as The Huffington Post did, that coverage of his campaign could be relegated to the entertainment section (and to add a disclaimer to articles about him) and still another to give Mr. Trump a “2 percent” chance at the nomination despite strong polls in his favor, as FiveThirtyEight did six months before the first votes were cast.
  • Predictions that far out can be viewed as being all in good fun. But in Mr. Trump’s case, they also arguably sapped the journalistic will to scour his record as aggressively as those of his supposedly more serious rivals. In other words, predictions can have consequences.
  • The problems were by no means limited to the reliance on data. Don’t forget those moments that were supposed to have augured Mr. Trump’s collapse: the certainty that once the race narrowed to two or three candidates, Mr. Trump would be through, and what at one point became the likelihood of a contested convention.
  • That’s all the more reason in the coming months to be as sharply focused on the data we don’t have as we are on the data we do have (and maybe watching out for making any big predictions about the fall based on the polling of today). But a good place to start would be to get a good night’s sleep, and then talk to some voters.
Javier E

Why Broadcast Journalism Is Flirting With Jon Stewart - The Atlantic - 0 views

  • The subtext of The Daily Show, The Colbert Report, and Last Week Tonight (the best of the three) is that elected and appointed officials belong to a suspect class of people who've earned intense skepticism and are better mocked than venerated. Even if the shows go easier on Democrats than Republicans, all three are straightforward proponents of the notion that all politicians are somewhat absurd, base characters, often in over their heads, and willing to shamelessly lie and spin.
  • Most broadcast journalists are totally unequipped to confront bad leaders, whether they're malign, inept, or merely buffoons. The reflexive deference gets in the way. The root of the problem is a conception of journalism that is insufficiently adversarial—a confusion that mistakes deference for fairness and epistemic humility.
carolinewren

Brain-to-brain interfaces: the science of telepathy - 0 views

  • Recent advances in brain-computer interfaces are turning the science fantasy of transmitting thoughts directly from one brain to another into reality.
  • Studies published in the last two years have reported direct transmission of brain activity between two animals, between two humans and even between a human and a rat.
  • Cell-to-cell communication occurs via a process known as synaptic transmission, where chemical signals are passed between cells resulting in electrical spikes in the receiving cell.
  • ...11 more annotations...
  • Because cells are connected in a network, brain activity produces a synchronised pulse of electrical activity, which is called a “brain wave”.
  • Brainwaves are detected using a technique known as electroencephalography (EEG),
  • The pattern of activity is then recorded and interpreted using computer software.
  • The electrical nature of the brain allows not only for sending of signals, but also for the receiving of electrical pulses
  • A TMS device creates a magnetic field over the scalp, which then causes an electrical current in the brain.
  • The connection was reinforced by giving both rats a reward when the receiver rat performed the task correctly.
  • By combining EEG and TMS, scientists have transmitted the thought of moving a hand from one person to a separate individual, who actually moved their hand.
  • including EEG, the Internet and TMS – the team of researchers was able to transmit a thought all the way from India to France.
  • Words were first coded into binary notation (a rough sketch of this encoding step follows the list below).
  • Now that these BBI technologies are becoming a reality, they have a huge potential to impact the way we interact with other humans. And maybe even the way we communicate with animals through direct transmission of thought.
  • Such technologies have obvious ethical and legal implications, however. So it is important to note that the success of BBIs depends upon the conscious coupling of the subjects.
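A minimal sketch, purely for illustration, of the word-to-binary coding step mentioned above. It assumes a plain 8-bit ASCII encoding in Python; the actual experiment used its own coding scheme, and none of the EEG or TMS hardware steps are modeled here.

# Illustrative only: turn a word into a bit string and back again.
# The real brain-to-brain study used its own cipher; this is plain ASCII bits.

def word_to_bits(word: str) -> str:
    """Encode each character as 8 ASCII bits, concatenated into one bit string."""
    return "".join(f"{ord(ch):08b}" for ch in word)

def bits_to_word(bits: str) -> str:
    """Decode an 8-bit-per-character bit string back into text."""
    chars = [bits[i:i + 8] for i in range(0, len(bits), 8)]
    return "".join(chr(int(b, 2)) for b in chars)

if __name__ == "__main__":
    message = "hello"                # example word, not taken from the study
    encoded = word_to_bits(message)  # '0110100001100101...'
    print(encoded)
    print(bits_to_word(encoded))     # round-trips back to 'hello'

In the setup described above, a bit string like this would then be sent over the internet and delivered to the receiver bit by bit through the EEG-to-TMS chain.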
Javier E

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • ...39 more annotations...
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
Javier E

In Defense of Facts - The Atlantic - 1 views

  • over 13 years, he has published a series of anthologies—of the contemporary American essay, of the world essay, and now of the historical American essay—that misrepresents what the essay is and does, that falsifies its history, and that contains, among its numerous selections, very little one would reasonably classify within the genre. And all of this to wide attention and substantial acclaim
  • D’Agata’s rationale for his “new history,” to the extent that one can piece it together from the headnotes that preface each selection, goes something like this. The conventional essay, nonfiction as it is, is nothing more than a delivery system for facts. The genre, as a consequence, has suffered from a chronic lack of critical esteem, and thus of popular attention. The true essay, however, deals not in knowing but in “unknowing”: in uncertainty, imagination, rumination; in wandering and wondering; in openness and inconclusion
  • Every piece of this is false in one way or another.
  • ...31 more annotations...
  • There are genres whose principal business is fact—journalism, history, popular science—but the essay has never been one of them. If the form possesses a defining characteristic, it is that the essay makes an argument
  • That argument can rest on fact, but it can also rest on anecdote, or introspection, or cultural interpretation, or some combination of all these and more
  • what makes a personal essay an essay and not just an autobiographical narrative is precisely that it uses personal material to develop, however speculatively or intuitively, a larger conclusion.
  • Nonfiction is the source of the narcissistic injury that seems to drive him. “Nonfiction,” he suggests, is like saying “not art,” and if D’Agata, who has himself published several volumes of what he refers to as essays, desires a single thing above all, it is to be known as a maker of art.
  • D’Agata tells us that the term has been in use since about 1950. In fact, it was coined in 1867 by the staff of the Boston Public Library and entered widespread circulation after the turn of the 20th century. The concept’s birth and growth, in other words, did coincide with the rise of the novel to literary preeminence, and nonfiction did long carry an odor of disesteem. But that began to change at least as long ago as the 1960s, with the New Journalism and the “nonfiction novel.”
  • What we really seem to get in D’Agata’s trilogy, in other words, is a compendium of writing that the man himself just happens to like, or that he wants to appropriate as a lineage for his own work.
  • What it’s like is abysmal: partial to trivial formal experimentation, hackneyed artistic rebellion, opaque expressions of private meaning, and modish political posturing
  • If I bought a bag of chickpeas and opened it to find that it contained some chickpeas, some green peas, some pebbles, and some bits of goat poop, I would take it back to the store. And if the shopkeeper said, “Well, they’re ‘lyric’ chickpeas,” I would be entitled to say, “You should’ve told me that before I bought them.”
  • when he isn’t cooking quotes or otherwise fudging the record, he is simply indifferent to issues of factual accuracy, content to rely on a mixture of guesswork, hearsay, and his own rather faulty memory.
  • His rejoinders are more commonly a lot more hostile—not to mention juvenile (“Wow, Jim, your penis must be so much bigger than mine”), defensive, and in their overarching logic, deeply specious. He’s not a journalist, he insists; he’s an essayist. He isn’t dealing in anything as mundane as the facts; he’s dealing in “art, dickhead,” in “poetry,” and there are no rules in art.
  • D’Agata replies that there is something between history and fiction. “We all believe in emotional truths that could never hold water, but we still cling to them and insist on their relevance.” The “emotional truths” here, of course, are D’Agata’s, not Presley’s. If it feels right to say that tae kwon do was invented in ancient India (not modern Korea, as Fingal discovers it was), then that is when it was invented. The term for this is truthiness.
  • D’Agata clearly wants to have it both ways. He wants the imaginative freedom of fiction without relinquishing the credibility (and for some readers, the significance) of nonfiction. He has his fingers crossed, and he’s holding them behind his back. “John’s a different kind of writer,” an editor explains to Fingal early in the book. Indeed he is. But the word for such a writer isn’t essayist. It’s liar.
  • The point of all this nonsense, and a great deal more just like it, is to advance an argument about the essay and its history. The form, D’Agata’s story seems to go, was neglected during the long ages that worshiped “information” but slowly emerged during the 19th and 20th centuries as artists learned to defy convention and untrammel their imaginations, coming fully into its own over the past several decades with the dawning recognition of the illusory nature of knowledge.
  • Most delectable is when he speaks about “the essay’s traditional ‘five-paragraph’ form.” I almost fell off my chair when I got to that one. The five-paragraph essay—introduction, three body paragraphs, conclusion; stultifying, formulaic, repetitive—is the province of high-school English teachers. I have never met one outside of a classroom, and like any decent college writing instructor, I never failed to try to wean my students away from them. The five-paragraph essay isn’t an essay; it’s a paper.
  • What he fails to understand is that facts and the essay are not antagonists but siblings, offspring of the same historical moment
  • —by ignoring the actual contexts of his selections, and thus their actual intentions—D’Agata makes the familiar contemporary move of imposing his own conceits and concerns upon the past. That is how ethnography turns into “song,” Socrates into an essayist, and the whole of literary history into a single man’s “emotional truth.”
  • The history of the essay is indeed intertwined with “facts,” but in a very different way than D’Agata imagines. D’Agata’s mind is Manichaean. Facts bad, imagination good
  • When he refers to his selections as essays, he does more than falsify the essay as a genre. He also effaces all the genres that they do belong to: not only poetry, fiction, journalism, and travel, but, among his older choices, history, parable, satire, the sermon, and more—genres that possess their own particular traditions, conventions, and expectations.
  • one needs to recognize that facts themselves have a history.
  • Facts are not just any sort of knowledge, such as also existed in the ancient and medieval worlds. A fact is a unit of information that has been established through uniquely modern methods
  • Fact, etymologically, means “something done”—that is, an act or deed
  • It was only in the 16th century—an age that saw the dawning of a new empirical spirit, one that would issue not only in modern science, but also in modern historiography, journalism, and scholarship—that the word began to signify our current sense of “real state of things.”
  • It was at this exact time, and in this exact spirit, that the essay was born. What distinguished Montaigne’s new form—his “essays” or attempts to discover and publish the truth about himself—was not that it was personal (precursors like Seneca also wrote personally), but that it was scrupulously investigative. Montaigne was conducting research into his soul, and he was determined to get it right.
  • His famous motto, Que sais-je?—“What do I know?”—was an expression not of radical doubt but of the kind of skepticism that fueled the modern revolution in knowledge.
  • It is no coincidence that the first English essayist, Galileo’s contemporary Francis Bacon, was also the first great theorist of science.
  • That knowledge is problematic—difficult to establish, labile once created, often imprecise and always subject to the limitations of the human mind—is not the discovery of postmodernism. It is a foundational insight of the age of science, of fact and information, itself.
  • The point is not that facts do not exist, but that they are unstable (and are becoming more so as the pace of science quickens). Knowledge is always an attempt. Every fact was established by an argument—by observation and interpretation—and is susceptible to being overturned by a different one
  • A fact, you might say, is nothing more than a frozen argument, the place where a given line of investigation has come temporarily to rest.
  • Sometimes those arguments are scientific papers. Sometimes they are news reports, which are arguments with everything except the conclusions left out (the legwork, the notes, the triangulation of sources—the research and the reasoning).
  • When it comes to essays, though, we don’t refer to those conclusions as facts. We refer to them as wisdom, or ideas
  • the essay draws its strength not from separating reason and imagination but from putting them in conversation. A good essay moves fluidly between thought and feeling. It subjects the personal to the rigors of the intellect and the discipline of external reality. The truths it finds are more than just emotional.
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • ...72 more annotations...
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience.To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts.
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way.
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency.
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
caelengrubb

6 Ways Social Media Affects Our Mental Health - 0 views

  • But possibly as concerning is the thing that we often do while we're sitting: Mindlessly scrolling through our social media feeds when we have a few spare minutes (or for some, hours)
  • It’s addictive
  • Experts have not been in total agreement on whether internet addiction is a real thing, let alone social media addiction, but there’s some good evidence that both may exist.
  • ...14 more annotations...
  • And studies have confirmed that people tend to undergo a kind of withdrawal: A study a few years ago from Swansea University found that people experienced the psychological symptoms of withdrawal when they stopped using (this went for all internet use, not just social media).
  • It triggers more sadness, less well-being
  • The more we use social media, the less happy we seem to be. One study a few years ago found that Facebook use was linked to both less moment-to-moment happiness and less life satisfaction—the more people used Facebook in a day, the more these two variables dropped off
  • In fact, another study found that social media use is linked to greater feelings of social isolation.
  • Comparing our lives with others is mentally unhealthy
  • One study looked at how we make comparisons to others’ posts, in “upward” or “downward” directions—that is, feeling that we’re either better or worse off than our friends.
  • It can lead to jealousy—and a vicious cycle
  • Studies have certainly shown that social media use triggers feelings of jealousy. The authors of one study, looking at jealousy and other negative feelings while using Facebook, wrote that “This magnitude of envy incidents taking place on FB alone is astounding, providing evidence that FB offers a breeding ground for invidious feelings."
  • We get caught in the delusion of thinking it will help
  • Part of the unhealthy cycle is that we keep coming back to social media, even though it doesn’t make us feel very good.
  • This is probably because of what’s known as a forecasting error: Like a drug, we think getting a fix will help, but it actually makes us feel worse, which comes down to an error in our ability to predict our own response.
  • More friends on social doesn’t mean you’re more social
  • A couple of years ago, a study found that more friends on social media doesn’t necessarily mean you have a better social life—there seems to be a cap on the number of friends a person’s brain can handle, and it takes actual social interaction (not virtual) to keep up these friendships
  • All of this is not to say that there’s no benefit to social media—obviously it keeps us connected across great distances, and helps us find people we’d lost touch with years ago
Javier E

The View from Nowhere: Questions and Answers » Pressthink - 2 views

  • In pro journalism, American style, the View from Nowhere is a bid for trust that advertises the viewlessness of the news producer. Frequently it places the journalist between polarized extremes, and calls that neither-nor position “impartial.” Second, it’s a means of defense against a style of criticism that is fully anticipated: charges of bias originating in partisan politics and the two-party system. Third: it’s an attempt to secure a kind of universal legitimacy that is implicitly denied to those who stake out positions or betray a point of view. American journalists have almost a lust for the View from Nowhere because they think it has more authority than any other possible stance.
  • Who gets credit for the phrase, “view from nowhere?” A. The philosopher Thomas Nagel, who wrote a very important book with that title.
  • Q. What does it say? A. It says that human beings are, in fact, capable of stepping back from their position to gain an enlarged understanding, which includes the more limited view they had before the step back. Think of the cinema: when the camera pulls back to reveal where a character had been standing and shows us a fuller tableau. To Nagel, objectivity is that kind of motion. We try to “transcend our particular viewpoint and develop an expanded consciousness that takes in the world more fully.”
  • ...11 more annotations...
  • But there are limits to this motion. We can’t transcend all our starting points. No matter how far it pulls back the camera is still occupying a position. We can’t actually take the “view from nowhere,” but this doesn’t mean that objectivity is a lie or an illusion. Our ability to step back and the fact that there are limits to it– both are real. And realism demands that we acknowledge both.
  • Q. So is objectivity a myth… or not? A. One of the many interesting things Nagel says in that book is that “objectivity is both underrated and overrated, sometimes by the same persons.” It’s underrated by those who scoff at it as a myth. It is overrated by people who think it can replace the view from somewhere or transcend the human subject. It can’t.
  • When MSNBC suspends Keith Olbermann for donating without company permission to candidates he supports– that’s dumb. When NPR forbids its “news analysts” from expressing a view on matters they are empowered to analyze– that’s dumb. When reporters have to “launder” their views by putting them in the mouths of think tank experts: dumb. When editors at the Washington Post decline even to investigate whether the size of rallies on the Mall can be reliably estimated because they want to avoid charges of “leaning one way or the other,” as one of them recently put it, that is dumb. When CNN thinks that, because it’s not MSNBC and it’s not Fox, it’s the only “real news network” on cable, CNN is being dumb about itself.
  • Let some in the press continue on with the mask of impartiality, which has advantages for cultivating sources and soothing advertisers. Let others experiment with transparency as the basis for trust. When you click on their by-line it takes you to a disclosure page where there is a bio, a kind of mission statement, and a creative attempt to say: here’s where I’m coming from (one example) along with campaign contributions, any affiliations or memberships, and–I’m just speculating now–a list of heroes and villains, or major influences, along with an archive of the work, plus anything else that might assist the user in placing this person on the user’s mattering map.
  • if objectivity means trying to ground truth claims in verifiable facts, I am definitely for that. If it means there’s a “hard” reality out there that exists beyond any of our descriptions of it, sign me up. If objectivity is the requirement to acknowledge what is, regardless of whether we want it to be that way, then I want journalists who can be objective in that sense.
  • If it means trying to see things in that fuller perspective Thomas Nagel talked about–pulling the camera back, revealing our previous position as only one of many–I second the motion. If it means the struggle to get beyond the limited perspective that our experience and upbringing afford us… yeah, we need more of that, not less. I think there is value in acts of description that do not attempt to say whether the thing described is good or bad
  • I think we are in the midst of shift in the system by which trust is sustained in professional journalism. David Weinberger tried to capture it with his phrase: transparency is the new objectivity. My version of that: it’s easier to trust in “here’s where I’m coming from” than the View from Nowhere. These are two different ways of bidding for the confidence of the users.
  • In the newer way, the logic is different. “Look, I’m not going to pretend that I have no view. Instead, I am going to level with you about where I’m coming from on this. So factor that in when you evaluate my report. Because I’ve done the work and this is what I’ve concluded…”
  • it has unearned authority in the American press. If in doing the serious work of journalism–digging, reporting, verification, mastering a beat–you develop a view, expressing that view does not diminish your authority. It may even add to it. The View from Nowhere doesn’t know from this. It also encourages journalists to develop bad habits. Like: criticism from both sides is a sign that you’re doing something right, when you could be doing everything wrong.
blythewallick

People with Depression Use Language Differently | JSTOR Daily - 0 views

  • New research shows that people with depression use absolute words, such as “always,” “nothing,” or “completely,” more often than others.
  • Scientists have long tried to pin down the exact relationship between depression and language, and technology is helping us get closer to a full picture. Our new study, published in Clinical Psychological Science, has now unveiled a class of words that can help accurately predict whether someone is suffering from depression.
  • So far, personal essays and diary entries by depressed people have been useful, as has the work of well-known artists such as Cobain and Plath. For the spoken word, snippets of natural language of people with depression have also provided insight. Taken together, the findings from such research reveal clear and consistent differences in language between those with and without symptoms of depression.
  • ...5 more annotations...
  • We know that rumination (dwelling on personal problems) and social isolation are common features of depression. However, we don’t know whether these findings reflect differences in attention or thinking style. Does depression cause people to focus on themselves, or do people who focus on themselves get symptoms of depression?
  • Crucially, those who have previously had depressive symptoms are more likely to have them again. Therefore, their greater tendency for absolutist thinking, even when there are currently no symptoms of depression, is a sign that it may play a role in causing depressive episodes. The same effect is seen in use of pronouns, but not for negative emotion words.
  • From the outset, we predicted that those with depression will have a more black and white view of the world, and that this would manifest in their style of language.
  • Our lab recently conducted a big data text analysis of 64 different online mental health forums, examining over 6,400 members. “Absolutist words” – which convey absolute magnitudes or probabilities, such as “always”, “nothing” or “completely” – were found to be better markers for mental health forums than either pronouns or negative emotion words. (A rough sketch of this kind of word-frequency tally appears after this list.)
  • But as the World Health Organisation estimates that more than 300m people worldwide are now living with depression, an increase of more than 18% since 2005, having more tools available to spot the condition is certainly important to improve health and prevent tragic suicides such as those of Plath and Cobain.
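
The study’s own pipeline is not reproduced here, but as a rough, hypothetical sketch of the kind of word-frequency tally the excerpt describes — measuring how much of a text consists of absolutist terms — something like the Python below would do. The word set is purely illustrative, not the validated dictionary the researchers used.

    # Hypothetical sketch: share of "absolutist" word tokens in a text.
    # The word set below is illustrative only, not the study's validated lexicon.
    import re

    ABSOLUTIST_WORDS = {
        "always", "never", "nothing", "completely", "totally",
        "entirely", "definitely", "everything", "constantly", "must",
    }

    def absolutist_share(text: str) -> float:
        """Return the fraction of word tokens that are absolutist terms."""
        tokens = re.findall(r"[a-z']+", text.lower())
        if not tokens:
            return 0.0
        return sum(token in ABSOLUTIST_WORDS for token in tokens) / len(tokens)

    # Quick comparison of two made-up posts
    print(absolutist_share("Nothing ever helps. I always ruin everything completely."))
    print(absolutist_share("Some days are harder than others, but this week went okay."))

The higher share for the first made-up post mirrors the pattern the researchers report, though any real marker would rest on a much larger validated lexicon and proper statistical testing.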
krystalxu

Journal of Economic Psychology - Elsevier - 0 views

  • The Journal aims to present research that will improve understanding of behavioral, in particular psychological, aspects of economic phenomena and processes.
Javier E

Dispute Within Art Critics Group Over Diversity Reveals a Widening Rift - The New York ... - 0 views

  • Amussen, 33, is the editor of Burnaway, which focuses on criticism in the American South and often features young Black artists. (The magazine started in 2008 in response to layoffs at the Atlanta Journal-Constitution’s culture section and now runs as a nonprofit with four full-time employees and a budget that mostly consists of grants.)
  • Efforts to revive AICA-USA are continuing. In January, Jasmine Amussen joined the organization’s board to help rethink the meaning of criticism for a younger generation.
  • The organization has yearly dues of $115 and provides free access to many museums. But some members complained that the fee was too expensive for young critics, yet not enough to support significant programming.
  • ...12 more annotations...
  • “It just came down to not having enough money,” said Terence Trouillot, a senior editor at Frieze, a contemporary art magazine. He spent nearly three years on the AICA-USA board, resigning in 2022. He said that initiatives to re-energize the group “were just moving too slowly.”
  • According to Lilly Wei, a longtime AICA-USA board member who recently resigned, the group explored different ways of protecting writers in the industry. There were unrealized plans of turning the organization into a union; others hoped to create a permanent emergency fund to keep financially struggling critics afloat. She said the organization has instead canceled initiatives, including an awards program for the best exhibitions across the country.
  • Large galleries — including Gagosian, Hauser & Wirth, and Pace Gallery — now produce their own publications with interviews and articles sometimes written by the same freelance critics who simultaneously moonlight as curators and marketers. Within its membership, AICA-USA has a number of writers who belong to all three categories.
  • “It’s crazy that the ideal job nowadays is producing catalog essays for galleries, which are basically just sales pitches,” Dillon said in a phone interview. “Critical thinking about art is not valued financially.”
  • Noah Dillon, who was on the AICA-USA board until he resigned last year, has been reluctant to recommend that anyone follow his path to become a critic. Not that they could. The graduate program in art writing that he attended at the School of Visual Arts in Manhattan also closed during the pandemic.
  • David Velasco, editor in chief of Artforum, said in an interview that he hoped the magazine’s acquisition would improve the publication’s financial picture. The magazine runs nearly 700 reviews a year, Velasco said; about half of those run online and pay $50 for roughly 250 words. “Nobody I know who knows about art does it for the money,” Velasco said, “but I would love to arrive at a point where people could.”
  • While most editors recognize the importance of criticism in helping readers decipher contemporary art, and the multibillion-dollar industry it has created, venues for such writing are shrinking. Over the years, newspapers including The Philadelphia Inquirer and The Miami Herald have trimmed critics’ jobs.
  • In December, the Penske Media Corporation announced that it had acquired Artforum, a contemporary art journal, and was bringing the title under the same ownership as its two competitors, ARTnews and Art in America. Its sister publication, Bookforum, was not acquired and ceased operations. Through the pandemic, other outlets have shuttered, including popular blogs run by SFMOMA and the Walker Art Center in Minneapolis as well as smaller magazines called Astra and Elephant.
  • The need for change in museums was pointed out in the 2022 Burns Halperin Report, published by Artnet News in December, that analyzed more than a decade of data from over 30 cultural institutions. It found that just 11 percent of acquisitions at U.S. museums were by female artists and only 2.2 percent were by Black American artists
  • (National newspapers with art critics on staff include The New York Times, The Los Angeles Times, The Boston Globe and The Washington Post.)
  • Julia Halperin, one of the study’s organizers, who recently left her position as Artnet’s executive editor, said that the industry has an asymmetric approach to diversity. “The pool of artists is diversifying somewhat, but the pool of staff critics has not,” she said.
  • the matter of diversity in criticism is compounded by the fact that opportunities for all critics have been diminished.
Javier E

The state of science writing, circa 2012: The summer of our discontent, made glorious b... - 0 views

  • the authors were able to provide empirical evidence for a troubling phenomenon that seems to be all but baked into the way our scientific culture operates: We pay lots of attention to things that are almost assuredly not true.
  • Because it’s sexier to discover something than to show there’s nothing to be discovered, high-impact journals show a marked preference for “initial studies” as opposed to disconfirmations. Unfortunately, as anyone who has ever worked in a research lab knows, initial observations are almost inevitably refuted or heavily attenuated by future studies — and that data tends to get printed in less prestigious journals.  Newspapers, meanwhile, give lots of attention to those first, eye-catching results while spilling very little (if any) ink on the ongoing research that shows why people shouldn’t have gotten all hot and bothered in the first place.
  • The result? “[A]n almost complete amnesia in the newspaper coverage of biomedical findings.”
  • ...1 more annotation...
  • everything we write about will probably end up being wrong anyway — not that we’ll bother to let you know when the time comes.
Javier E

Sticking with the truth : Columbia Journalism Review - 0 views

  • In 1998, The Lancet, one of the most respected medical journals, published a study by lead author Andrew Wakefield, a British physician who claimed there might be a link between the vaccine for measles, mumps, and rubella (MMR) and autism
  • Among scientists, however, there really was never much of a debate; only a small group of researchers ever even entertained the theory about autism. The coverage rarely emphasized this, if it noted it at all, and instead propagated misunderstanding about vaccines and autism and gave credence to what was largely a manufactured controversy
  • Between 1998 and 2006, 60 percent of vaccine-autism articles in British newspapers, and 49 percent in American papers, were “balanced,” in the sense that they either mentioned both pro-link and anti-link perspectives, or neither perspective, according to a 2008 study by Christopher Clarke at Cornell University. The remainder—40 percent in the British press and 51 percent in the American press—mentioned only one perspective or the other, but British journalists were more likely to focus on pro-link claims and the Americans were more likely to focus on anti-link claims.
  • ...3 more annotations...
  • While it’s somewhat reassuring that almost half the US stories (41 percent) tried, to varying degrees, to rebut the vaccine-autism connection, the study raises the problem of “objectivity” in stories for which a preponderance of evidence is on one side of a “debate.” In such cases, “balanced” coverage can be irresponsible, because it suggests a controversy where none really exists. (Think climate change, and how such he-said-she-said coverage helped sustain the illusion of a genuine debate within the science community.)
  • A follow-up study by Clarke and Graham Dixon, published in November 2012, makes this point. The two scholars assigned 320 undergrads to read either a “balanced” article or one that was one-sided for or against a link between vaccines and autism. Those students who read the “balanced” articles were far more likely to believe that a link existed than those who read articles that said no link exists.
  • Today, people who worry that childhood inoculations trigger autism prefer to be described as “vaccine-hesitant,” rather than “anti-vaccine,” and think the CDC’s immunization schedule “overwhelms” kids’ immune systems. This rhetorical shift illustrates how those who claim a link exists keep moving the goalposts.
Javier E

Meeting 'the Other' Face to Face - The New York Times - 0 views

  • Sitting in a conference room at a hotel near the Massachusetts Institute of Technology here, I slip on large headphones and an Oculus Rift virtual reality headset and wriggle into the straps of a backpack, weighed down with a computer and a battery.
  • when I stand, I quickly find myself in a featureless all-white room, a kind of Platonic vestibule. On the walls at either end are striking poster-size black-and-white portraits taken by the noted Belgian-Tunisian photographer Karim Ben Khelifa, one showing a young Israeli soldier and another a Palestinian fighter about the same age, whose face is almost completely hidden by a black hood.
  • Then the portraits disappear, replaced by doors, which open. In walk the two combatants — Abu Khaled, a fighter for the Popular Front for the Liberation of Palestine, and Gilad Peled, an Israeli soldier — seeming, except for a little pixelation and rigid body movement, like flesh-and-blood people who are actually in the room with me.
  • ...11 more annotations...
  • What he saw there was a culture of warfare that often perpetuated itself through misunderstanding and misinformation, with no mechanism for those of opposing sects or political forces to gain a sense of the enemy as a fellow human being.
  • “I began to think, ‘I’m meeting the same people over and over again,’” he said. “I’m seeing people I knew as kids, and now they’re grown-up fighters, in power, fighting the same fight. And you start to think about your work in terms of: ‘Am I helping to change anything? Am I having any impact?’ ”
  • “I thought of myself as a war illustrator. I started calling myself that.”
  • as a visiting artist at the university’s Center for Art, Science and Technology, he transformed what he initially conceived of as an unconventional photo and testimonial project involving fighters into a far more unconventional way of hearing and seeing his subjects, hoping to be able to engender a form of empathy beyond the reach of traditional documentary film
  • Then he and a small crew captured three-dimensional scans of the men and photographed them from multiple angles
  • He interviewed Mr. Khaled in Gaza and Mr. Peled in Tel Aviv, asking them the same six questions — basic ones like “Who’s your enemy and why?”; “What is peace for you?”; “Have you ever killed one of your enemies?”; “Where do you see yourself in 20 years?”
  • he began to build avatars of his interviewees and ways for them to move and respond inside a virtual world so realistic it makes even a 3-D movie seem like an artifact from the distant past. Mr. Harrell describes it as “long-form journalism in a totally new form.”
  • “You have something here you don’t have in any other form of journalism: body language.”
  • indeed, inside the world they have made, the power comes from the feeling of listening to the interviewees speak (you hear Mr. Ben Khelifa’s disembodied voice asking the questions, and the men’s voices answer, overlaid by the voice of an interpreter) as your body viscerally senses a person standing a few feet away from you, his eyes following yours as he talks, his chest rising and falling as he breathes.
  • Sofia Ayala, an M.I.T. sophomore, tested the project after I did and emerged — as I did — with a mesmerized flush on her face, a feeling of meeting someone not really there. “It makes it feel so much more personal than just reading about these things online,” she said. “When someone’s right there talking to you, you want to listen.”
  • “In many places I’ve been, you’re given your enemy when you’re born,” he said. “You grow up with this ‘other’ always out there. The best we can hope is that the ‘other’ will now be able to come into the same room with you for a while, where you can listen to him, and see him face to face.”
proudsa

Journal of the Marine Biological Association of the United Kingdom - The size and compl... - 0 views

  • During evolution, dolphins may have increased the computational performance of their cytoarchitectonically ‘simple’ neocortex by a multiplication of relevant structures (resulting in a hypertrophic surface area) instead of increasing its complexity.
    • proudsa
       
      Interesting addition to our discussion of whether other beings could have the ability to process as highly as humans