
Group items matching "translation" in title, tags, annotations or url

Sensing Gene Therapy | The Scientist Magazine® - 0 views

  • but gene therapy may be coming to the rescue. Gene therapy’s success in treating blindness disorders (many are in late-stage trials) gave hope to a field deterred by early missteps. And now gene therapy researchers are expanding their gaze to focus on all manner of sensory diseases.
  • notable success in using gene therapy techniques to treat a sensory disorder came last year when otolaryngologist
  • In olfactory dysfunction, there are few curative therapies,
  • “working on more broadly applying [the therapy] to other forms of genetic hearing loss,” he said. But in contrast to VGLUT3 mutant mice, which are missing the protein entirely, humans with missense mutations expressed a defective transporter, making it unclear whether Lustig’s strategy could translate to human VGLUT3-linked deafness.
  • Taste and smell are two of the senses that have received less attention from gene therapy researchers—but that’s changing
  • The neurons [in VGLUT3 mutant mice] are waiting for the neurotransmitter to activate them”—but no signal comes, and the mice are profoundly deaf,
  • Treating the mice intra-nasally with gene therapy vectors carrying the wildtype Ift88 gene, researchers saw significant regrowth of nasal cilia, whereas control mice given empty vectors showed no regrowth. Treated mice almost doubled in weight compared to controls.
  • So far, no scientists have designed a gene therapy to target taste buds, but at least one team is tackling an important factor in taste: saliva. If a person’s saliva production drops below 50 percent of normal, “you get tooth decay and trouble swallowing,”
  • Scientists are also developing gene therapies for disorders involving touch—or at least pain-sensing—neurons, with one drug candidate
  • Wolfe envisions that someday pain treatment could be as simple as visiting the doctor every few months for a quick skin prick “wherever it hurts”—choosing between a variety of genes to get the best effect.

Inhibit Mitochondria to Live Longer? | The Scientist Magazine® - 0 views

  • Although previous work had indirectly suggested that changing mitochondrial function affected lifespan, “this is the first clear demonstration [that it] extends mouse lifespan,” Miller added.
  • well known that mitochondria are linked to health. Some evidence suggests that inhibiting mitochondrial function can be harmful—as in the case of diabetes or obesity—but earlier data from nematodes and fruit flies also suggest a link to lifespan increase. The latest findings are a step toward untangling one of the current debates in the field—whether inhibiting mitochondrial function is detrimental or beneficial,
  • The average lifespan of BXD mice ranges from 1 year to almost 2.5 years. The researchers were able to link 3 genes to longevity variability, including mitochondrial ribosomal protein S5 (Mrps5), which encodes a protein integral to mitochondrial protein synthesis. They found that BXD strains with 50 percent less Mrps5 expression lived about 250 days longer than BXD mice with more robust Mrps5 expression.
  • researchers were also able to activate the mitochondrial UPR via pharmacological means. Dosing worms with the antibiotic doxycycline, which inhibits bacterial and mitochondrial protein translation, also activated the mitochondrial UPR and extended worm lifespans. Rapamycin, shown to enhance longevity in mice, also extended worm lifespan and induced mitonuclear protein imbalance and the mitochondrial UPR in mouse hepatocytes.
  • mitochondrial ribosomal proteins are not to be trifled with. “There are a number of well-defined severe disorders in humans, including neonatal lethality, due to defects in those exact proteins,”
  • is beginning to cast a wider net, looking to see whether mitonuclear protein imbalance could explain longevity induced by other means, such as caloric restriction. Auwerx hopes that the work will aid in designing a drug intervention “to pump up this response via pharmacological tools.”
  • he’s optimistic that his team has identified a “common thread” demonstrating that longevity is not affected so much by inhibiting or stimulating mitochondria as by how the organelles “deal with proteins.”

WHICH IS THE BEST LANGUAGE TO LEARN? | More Intelligent Life - 2 views

  • For language lovers, the facts are grim: Anglophones simply aren’t learning them any more. In Britain, despite four decades in the European Union, the number of A-levels taken in French and German has fallen by half in the past 20 years, while what was a growing trend of Spanish-learning has stalled. In America, the numbers are equally sorry.
  • compelling reasons remain for learning other languages.
  • First of all, learning any foreign language helps you understand all language better—many Anglophones first encounter the words “past participle” not in an English class, but in French. Second, there is the cultural broadening. Literature is always best read in the original. Poetry and lyrics suffer particularly badly in translation. And learning another tongue helps the student grasp another way of thinking.
  • is Chinese the language of the future?
  • So which one should you, or your children, learn? If you take a glance at advertisements in New York or A-level options in Britain, an answer seems to leap out: Mandarin.
  • The practical reasons are just as compelling. In business, if the team on the other side of the table knows your language but you don’t know theirs, they almost certainly know more about you and your company than you do about them and theirs—a bad position to negotiate from.
  • This factor is the Chinese writing system (which Japan borrowed and adapted centuries ago). The learner needs to know at least 3,000-4,000 characters to make sense of written Chinese, and thousands more to have a real feel for it. Chinese, with all its tones, is hard enough to speak. But the mammoth feat of memory required to be literate in Mandarin is harder still. It deters most foreigners from ever mastering the system—and increasingly trips up Chinese natives.
  • If you were to learn ten languages ranked by general usefulness, Japanese would probably not make the list. And the key reason for Japanese’s limited spread will also put the brakes on Chinese.
  • A recent survey reported in the People’s Daily found 84% of respondents agreeing that skill in Chinese is declining.
  • Fewer and fewer native speakers learn to produce characters in traditional calligraphy. Instead, they write their language the same way we do—with a computer. And not only that, but they use the Roman alphabet to produce Chinese characters: type in wo and Chinese language-support software will offer a menu of characters pronounced wo; the user selects the one desired. (Or if the user types in wo shi zhongguo ren, “I am Chinese”, the software detects the meaning and picks the right characters.) With less and less need to recall the characters cold, the Chinese are forgetting them. (A toy sketch of this lookup appears after this list.)
  • As long as China keeps the character-based system—which will probably be a long time, thanks to cultural attachment and practical concerns alike—Chinese is very unlikely to become a true world language, an auxiliary language like English, the language a Brazilian chemist will publish papers in, hoping that they will be read in Finland and Canada. By all means, if China is your main interest, for business or pleasure, learn Chinese. It is fascinating, and learnable—though Moser’s online essay, “Why Chinese is so damn hard,” might discourage the faint of heart and the short of time.
  • But if I were asked what foreign language is the most useful, and given no more parameters (where? for what purpose?), my answer would be French. Whatever you think of France, the language is much less limited than many people realise.
  • French ranks only 16th on the list of languages ranked by native speakers. But ranked above it are languages like Telugu and Javanese that no one would call world languages. Hindi does not even unite India. Also in the top 15 are Arabic, Spanish and Portuguese, major languages to be sure, but regionally concentrated. If your interest is the Middle East or Islam, by all means learn Arabic. If your interest is Latin America, Spanish or Portuguese is the way to go. Or both; learning one makes the second quite easy.
  • if you want another truly global language, there are surprisingly few candidates, and for me French is unquestionably top of the list. It can enhance your enjoyment of art, history, literature and food, while giving you an important tool in business and a useful one in diplomacy. It has native speakers in every region on earth. And lest we forget its heartland itself, France attracts more tourists than any other country—76.8m in 2010, according to the World Tourism Organisation, leaving America a distant second with 59.7m.
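The input-method behavior described above (type wo and pick from a menu of characters; type a whole phrase and let the software choose) can be sketched in a few lines. Below is a minimal Python illustration; the tiny candidate dictionary and its frequency ordering are invented for the example, not a real IME lexicon.

```python
# Minimal sketch of the pinyin input method described above. A real IME
# uses a large frequency-weighted lexicon plus a statistical language
# model; the tiny dictionary here is an invented example.

# Candidate characters for each typed syllable, ordered by assumed frequency.
CANDIDATES = {
    "wo": ["我", "窝", "卧"],        # "I", "nest", "to lie down"
    "shi": ["是", "十", "时"],       # "to be", "ten", "time"
    "zhongguo": ["中国"],            # "China"
    "ren": ["人", "仁", "认"],       # "person", "benevolence", "to recognize"
}

def menu(syllable: str) -> list[str]:
    """The menu of characters offered for one typed syllable, e.g. 'wo'."""
    return CANDIDATES.get(syllable, [])

def convert(phrase: str) -> str:
    """Pick the top candidate for each syllable of a spaced pinyin phrase,
    mimicking the software 'detecting the meaning' of a common sentence."""
    return "".join(menu(s)[0] for s in phrase.split() if menu(s))

print(menu("wo"))                      # ['我', '窝', '卧']
print(convert("wo shi zhongguo ren"))  # 我是中国人 ("I am Chinese")
```

The point the annotation makes follows directly from this design: the user only ever recognizes characters from a menu and never has to produce them from memory.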

The Lies of Science Writing - WSJ.com - 0 views

  • Writing about science poses a fundamental problem right at the outset: You have to lie.
  • because math is the language of science, scientists who want to translate their work into popular parlance have to use verbal or pictorial metaphors that are necessarily inexact.
  • Choosing the proper metaphor can make all the difference between distorting science and providing an appropriate context from which nonscientists can appreciate new scientific findings and put them in perspective.
  • Not only is a good picture, even a mental one, worth at least a thousand words, but many scientists themselves think in these terms.
  • Though metaphors are useful in trying to understand complicated scientific ideas, they have their pitfalls.
  • Consider another famous scientific metaphor, the evolutionary biologist Richard Dawkins's idea of the "selfish gene." This is a brilliant and simple way to explain that natural selection relies on the self-perpetuation of genes that promote higher rates of survival. But for some critics, it suggests an intentionality that is absent in the process of evolution. Others worry that it implies an immoral world where selfishness wins out.
  • When used effectively, an apt metaphor can enhance the real purpose in writing about science for the public: provoking interest and a desire to learn more.

Book Review: The Last Lingua Franca - WSJ.com - 0 views

  • After narrating the history of Latin, Persian, Phoenician and other once-dominant languages, all now either dead or consigned to their native communities, Mr. Ostler argues that English too will sputter out relatively soon. Among the factors dooming it is the lack of any institution to demand its survival—no priestly use, as Latin or Sanskrit had, or government that requires its subjects to keep their linguistic skills up to enjoy full citizenship. As English loses cachet, it will become optional, and ultimately its reign will be one of the shortest in the history of lingua francas.
  • But regional languages are gaining enough traction in trade to allow their speakers to discard English, particularly if people can transact their cultural and commercial business with the crutch of computer software and machine translation.
  • The one issue that Mr. Ostler treats insufficiently is what the world might lose after what his subtitle calls "the return of Babel." One needn't be sentimental about English to wonder whether it isn't useful to have one language, rich in literature, that everyone shares in addition to a mother tongue.

Amazon Rewrites the Rules of Book Publishing - NYTimes.com - 0 views

  • “The Hangman’s Daughter” was an e-book hit. Amazon bought the rights to the historical novel by a first-time writer, Oliver Pötzsch, and had it translated from German. It has now sold 250,000 digital copies.

Eduardo Galeano Disavows His Book 'The Open Veins' - NYTimes.com - 0 views

  • For more than 40 years, Eduardo Galeano’s “The Open Veins of Latin America” has been the canonical anti-colonialist, anti-capitalist and anti-American text in that region
  • now Mr. Galeano, a 73-year-old Uruguayan writer, has disavowed the book, saying that he was not qualified to tackle the subject and that it was badly written. Predictably, his remarks have set off a vigorous regional debate, with the right doing some “we told you so” gloating, and the left clinging to a dogged defensiveness.
  • “ ‘Open Veins’ tried to be a book of political economy, but I didn’t yet have the necessary training or preparation,” Mr. Galeano said last month while answering questions at a book fair in Brazil, where he was being honored on the 43rd anniversary of the book’s publication. He added: “I wouldn’t be capable of reading this book again; I’d keel over. For me, this prose of the traditional left is extremely leaden, and my physique can’t tolerate it.”
  • “If I were teaching this in a course,” said Merilee Grindle, president of the Latin American Studies Association and director of the David Rockefeller Center for Latin American Studies at Harvard, “I would take his comments, add them in and use them to generate a far more interesting discussion about how we see and interpret events at different points in time.” And that seems to be exactly what many professors plan to do.
  • “Open Veins” has been translated into more than a dozen languages and has sold more than a million copies. In its heyday, its influence extended throughout what was then called the third world, including Africa and Asia, until the economic rise of China and India and Brazil seemed to undercut parts of its thesis. In the United States, “Open Veins” has been widely taught on university campuses since the 1970s, in courses ranging from history and anthropology to economics and geography. But Mr. Galeano’s unexpected takedown of his own work has left scholars wondering how to deal with the book in class.
  • “Reality has changed a lot, and I have changed a lot,” he said in Brazil, adding: “Reality is much more complex precisely because the human condition is diverse. Some political sectors close to me thought such diversity was a heresy. Even today, there are some survivors of this type who think that all diversity is a threat. Fortunately, it is not.”
  • In the mid-1990s, three advocates of free-market policies — the Colombian writer and diplomat Plinio Apuleyo Mendoza, the exiled Cuban author Carlos Alberto Montaner and the Peruvian journalist and author Álvaro Vargas Llosa — reacted to Mr. Galeano with a polemic of their own, “Guide to the Perfect Latin American Idiot.” They dismissed “Open Veins” as “the idiot’s bible,” and reduced its thesis to a single sentence: “We’re poor; it’s their fault.”
  • Mr. Montaner responded to Mr. Galeano’s recent remarks with a blog post titled “Galeano Corrects Himself and the Idiots Lose Their Bible.” In Brazil, Rodrigo Constantino, the author of “The Caviar Left,” took an even harsher tone, blaming Mr. Galeano’s analysis and prescription for many of Latin America’s ills. “He should feel really guilty for the damage he caused,”

Our Machine Masters - NYTimes.com - 0 views

  • the smart machines of the future won’t be humanlike geniuses like HAL 9000 in the movie “2001: A Space Odyssey.” They will be more modest machines that will drive your car, translate foreign languages, organize your photos, recommend entertainment options and maybe diagnose your illnesses. “Everything that we formerly electrified we will now cognitize,” Kelly writes. Even more than today, we’ll lead our lives enmeshed with machines that do some of our thinking tasks for us.
  • This artificial intelligence breakthrough, he argues, is being driven by cheap parallel computation technologies, big data collection and better algorithms. The upshot is clear: “The business plans of the next 10,000 start-ups are easy to forecast: Take X and add A.I.”
  • Two big implications flow from this. The first is sociological. If knowledge is power, we’re about to see an even greater concentration of power.
  • in 2001, the top 10 websites accounted for 31 percent of all U.S. page views, but, by 2010, they accounted for 75 percent of them.
  • As a result, “our A.I. future is likely to be ruled by an oligarchy of two or three large, general-purpose cloud-based commercial intelligences.”
  • Advances in artificial intelligence will accelerate this centralizing trend. That’s because A.I. companies will be able to reap the rewards of network effects. The bigger their network and the more data they collect, the more effective and attractive they become.
  • The Internet has created a long tail, but almost all the revenue and power is among the small elite at the head.
  • engineers at a few gigantic companies will have vast-though-hidden power to shape how data are collected and framed, to harvest huge amounts of information, to build the frameworks through which the rest of us make decisions and to steer our choices. If you think this power will be used for entirely benign ends, then you have not read enough history.
  • The second implication is philosophical. A.I. will redefine what it means to be human. Our identity as humans is shaped by what machines and other animals can’t do
  • On the other hand, machines cannot beat us at the things we do without conscious thinking: developing tastes and affections, mimicking each other and building emotional attachments, experiencing imaginative breakthroughs, forming moral sentiments.
  • For the last few centuries, reason was seen as the ultimate human faculty. But now machines are better at many of the tasks we associate with thinking — like playing chess, winning at Jeopardy, and doing math.
  • In the age of smart machines, we’re not human because we have big brains. We’re human because we have social skills, emotional capacities and moral intuitions.
  • I could paint two divergent A.I. futures, one deeply humanistic, and one soullessly utilitarian.
  • In the humanistic one, machines liberate us from mental drudgery so we can focus on higher and happier things. In this future, differences in innate I.Q. are less important. Everybody has Google on their phones so having a great memory or the ability to calculate with big numbers doesn’t help as much.
  • In this future, there is increasing emphasis on personal and moral faculties: being likable, industrious, trustworthy and affectionate. People are evaluated more on these traits, which supplement machine thinking, and not the rote ones that duplicate it
  • In the cold, utilitarian future, on the other hand, people become less idiosyncratic. If the choice architecture behind many decisions is based on big data from vast crowds, everybody follows the prompts and chooses to be like each other. The machine prompts us to consume what is popular, the things that are easy and mentally undemanding.
  • In the current issue of Wired, the technology writer Kevin Kelly says that we had all better get used to this level of predictive prowess. Kelly argues that the age of artificial intelligence is finally at hand.

Hearing Is Believing - NYTimes.com - 0 views

  • Listening is more efficient than reading: When we read, we absorb print with our eyes and translate it into “meaning,” a cumbersome process that requires us first to see the words, then to make sense of them, and finally to employ our imaginations to conjure up events and sounds and characters that aren’t there. Reception by aural means is more direct: All you have to do is listen. Not only that, you can multitask, driving to work or walking the dog.
  • you can impulse-buy Hermione Lee’s biography of Penelope Fitzgerald while you’re shivering at a bus stop and have it show up on your Kindle reader almost instantly. But there is something about the act of listening that invigorates the mind.
  • The aural/oral revolution won’t mean the end of the book any more than the e-book did.
  • “In the history of mankind, words were heard before they were seen,” wrote Albert B. Lord,
  • Progress doesn’t always mean going forward.

How the Disney Animated Film 'Frozen' Took Over the World : The New Yorker - 1 views

  • In the end, though, Litman concluded, the findings were complicated: these factors could largely tell a dog from a general success, but they couldn’t predict the true runaway sensations.
  • few things continued to stand out: story and social influence. The most important figure in determining ultimate creative success, Simonton found, was the writer. “We can learn a great deal about what makes a successful film just by focusing on the quality of the screenplay,” he declared. Still, as he’d found earlier, quality did not always translate to quantity
  • And the thing that could potentially be even more, or at least equally, predictive wasn’t easy to quantify: so-called information cascades (basically, a snowball effect) that result from word-of-mouth dynamics.
  • “The character identification is the driving force,” says Wells, whose own research focusses on perception and the visual appeal of film. “It’s why people tend to identify with that medium always—it allows them to be put in those roles and experiment through that.”
  • one theme seemed to resonate: everyone could identify with Elsa. She wasn’t your typical princess. She wasn’t your typical Disney character. Born with magical powers that she couldn’t quite control, she meant well but caused harm, both on a personal scale (hurting her sister, repeatedly) and a global one (cursing her kingdom, by mistake). She was flawed—actually flawed, in a way that resulted in real mistakes and real consequences. Everyone could interpret her in a unique way and find that the arc of her story applied directly to them
  • what does all of this mean for “Frozen”? On the one hand, the movie shares many typical story elements with other Disney films. There are the parents dead within the first ten minutes (a must, it seems, in Disney productions), royalty galore, the quest to meet your one true love, the comic-relief character (Olaf the Snowman) to punctuate the drama. Even the strong female lead isn’t completely new
  • In 2012, he and Simonton conducted a study of two hundred and twenty family films released between 1996 and 2009, to see whether successful children’s movies had certain identifying characteristics. They found that films that dealt with nuanced and complex themes did better than those that played it safe, as measured both by ratings on metacritic.com, rottentomatoes.com, and IMDb and by over-all financial performance.
  • the story keeps the audience engaged because it subverts expected tropes and stereotypes, over and over. “It’s the furthest thing from a typical princess movie,”
  • It also, unlike prior Disney films, aces the Bechdel Test: not only are both leads female, but they certainly talk about things other than men. It is the women, in fact, not the men, who save the day, repeatedly—and a selfless act of sacrifice rather than a “kiss of true love” that ends up winning.
  • She recalls the sheer diversity of the students who joined the discussion: a mixture, split evenly between genders, of representatives of the L.G.B.T. community, artists, scientists.
  • “A good story, issues to think about and wrestle with,”
  • Simonton and Kaufman were able to explain only twenty to twenty-four per cent of variance in critical success and twenty-five in domestic gross earnings.
  • The other element, of course, is that intangible that Litman calls “buzz” and Simonton calls “information cascades,” the word of mouth that makes people embrace the story,
  • Part of the credit goes to Disney’s strategy. In their initial marketing campaign, they made an effort to point out the story’s uniqueness.
  • And their lawyers allowed the music to spread naturally through social media.
  • part of the credit goes to Jennifer Lee’s team, for the choices they consciously made to make the screenplay as complex as it was. Elsa was once evil; Elsa and Anna weren’t originally sisters; the prince wasn’t a sociopath. Their decisions to forego a true villain—something no Disney film had successfully done—and to make the story one driven by sibling love rather than romantic infatuation have made “Frozen” more than simply nuanced and relatable. They’ve made it more universally acceptable.
  • In contrast to other recent Disney films, like “Tangled,” “Frozen” isn’t politically fraught or controversial: you can say it’s good without fear of being accused of being a racist or an apologist or an animal-rights opponent
  • to echo the words of the screenwriting legend William Goldman, “Nobody knows anything.” In the end, it may just be a bit of magic.

How a Simple Spambot Became the Second Most Powerful Member of an Italian Social Network - 0 views

  • Luca Maria Aiello and a few pals from the University of Turin in Italy began studying a social network called aNobii.com in which people exchange information and opinions about the books they love. Each person has a site that anybody can visit. Users can then choose to set up social links with others
  • To map out the structure of the network, Aiello and co created an automated crawler that starts by visiting one person’s profile on the network and then all of the people that connect to this node in turn. It then visits each of the people that link to these nodes and so on. In this way, the bot builds up a map of the network (a breadth-first sketch of this strategy appears after this list).
  • people began to respond to the crawler’s visits. That gave the team an idea. “The unexpected reactions the bot caused by its visits motivated us to set up a social experiment in two parts to answer the question: can an individual with no trust gain popularity and influence?”
  • Aiello and co were careful to ensure that the crawler did not engage with anybody on the network in any way other than to visit his or her node. Their idea was to isolate a single, minimal social activity and test how effective it was in gaining popularity.
  • They began to record the reactions to lajello’s visits including the number of messages it received, their content, the links it received and how they varied over time and so on.
  • By December 2011, lajello’s profile had become one of the most popular on the entire social network. It had received more than 66,000 visits as well as 2435 messages from more than 1200 different people. In terms of the number of different messages received, a well-known writer was the most popular on this network but lajello was second.
  • “Our experiment gives strong support to the thesis that popularity can be gained just with continuous ‘social probing,’” conclude Aiello and co. “We have shown that a very simple spambot can attract great interest even without emulating any aspects of typical human behavior.”
  • Having created all this popularity, Aiello and co wanted to find out how influential the spam bot could be. So they started using the bot to send recommendations to users on who else to connect to. The spam bot could either make a recommendation chosen at random or one that was carefully selected by a recommendation engine. It then made its recommendations to users that had already linked to lajello and to other users chosen at random.
  • “Among the 361 users who created at least one social connection in the 36 hours after the recommendation, 52 per cent followed the suggestion given by the bot,” they say.
  • shows just how easy it is for an automated bot to play a significant role in a social network. Popularity appears easy to buy using nothing more than page visits, at least in this experiment. What is more, this popularity can be easily translated into influence
  • It is not hard to see the significance of this work. Social bots are a fact of life on almost every social network and many have become so sophisticated they are hard to distinguish from humans. If the simplest of bots created by Aiello and co can have this kind of impact, it is anybody’s guess how more advanced bots could influence everything from movie reviews and Wikipedia entries to stock prices and presidential elections.
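The crawl described in the annotations above is a plain breadth-first traversal of the social graph. Here is a minimal Python sketch; fetch_neighbors and the toy graph are hypothetical stand-ins for the HTTP requests the real bot made to public aNobii profile pages.

```python
# Minimal sketch of the crawler described above: visit one profile, then
# everyone linked from it, then their links, and so on, building a map
# of the network. fetch_neighbors() is a hypothetical placeholder for
# fetching a profile page and parsing out its social links.
from collections import deque

TOY_GRAPH = {  # invented example network
    "alice": ["bob", "carol"],
    "bob": ["alice", "dave"],
    "carol": ["dave"],
    "dave": ["alice"],
}

def fetch_neighbors(user: str) -> list[str]:
    """Stand-in for one profile visit; on aNobii, each such visit was
    visible to the profile's owner, which is what drew the reactions."""
    return TOY_GRAPH.get(user, [])

def crawl(start: str) -> dict[str, list[str]]:
    """Breadth-first crawl from one profile, returning the network map."""
    network: dict[str, list[str]] = {}
    queue = deque([start])
    seen = {start}
    while queue:
        user = queue.popleft()
        neighbors = fetch_neighbors(user)
        network[user] = neighbors
        for n in neighbors:
            if n not in seen:  # enqueue newly discovered profiles
                seen.add(n)
                queue.append(n)
    return network

print(crawl("alice"))
```

Note that the bot's only social action here is the visit itself; everything else in the experiment (the messages, the links, the popularity) was a reaction to those visits.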

Among the Disrupted - NYTimes.com - 0 views

  • Writers hover between a decent poverty and an indecent one; they are expected to render the fruits of their labors for little and even for nothing, and all the miracles of electronic dissemination somehow do not suffice for compensation, either of the fiscal or the spiritual kind.
  • Journalistic institutions slowly transform themselves into silent sweatshops in which words cannot wait for thoughts, and first responses are promoted into best responses, and patience is a professional liability.
  • the discussion of culture is being steadily absorbed into the discussion of business. There are “metrics” for phenomena that cannot be metrically measured. Numerical values are assigned to things that cannot be captured by numbers. Economic concepts go rampaging through noneconomic realms:
  • Quantification is the most overwhelming influence upon the contemporary American understanding of, well, everything. It is enabled by the idolatry of data, which has itself been enabled by the almost unimaginable data-generating capabilities of the new technology
  • The distinction between knowledge and information is a thing of the past, and there is no greater disgrace than to be a thing of the past.
  • even as technologism, which is not the same as technology, asserts itself over more and more precincts of human life, so too does scientism, which is not the same as science.
  • The notion that the nonmaterial dimensions of life must be explained in terms of the material dimensions, and that nonscientific understandings must be translated into scientific understandings if they are to qualify as knowledge, is increasingly popular inside and outside the university
  • The contrary insistence that the glories of art and thought are not evolutionary adaptations, or that the mind is not the brain, or that love is not just biology’s bait for sex, now amounts to a kind of heresy.
  • So, too, does the view that the strongest defense of the humanities lies not in the appeal to their utility — that literature majors may find good jobs, that theaters may economically revitalize neighborhoods — but rather in the appeal to their defiantly nonutilitarian character, so that individuals can know more than how things work, and develop their powers of discernment and judgment, their competence in matters of truth and goodness and beauty, to equip themselves adequately for the choices and the crucibles of private and public life.
  • are we becoming posthumanists?
  • In American culture right now, as I say, the worldview that is ascendant may be described as posthumanism.
  • The posthumanism of the 1970s and 1980s was more insular, an academic affair of “theory,” an insurgency of professors; our posthumanism is a way of life, a social fate.
  • In “The Age of the Crisis of Man: Thought and Fiction in America, 1933-1973,” the gifted essayist Mark Greif, who reveals himself to be also a skillful historian of ideas, charts the history of the 20th-century reckonings with the definition of “man.”

André Glucksmann, French Philosopher Who Renounced Marxism, Dies at 78 - The ... - 0 views

  • In 1975, in “The Cook and the Cannibal,” Mr. Glucksmann subjected Marxism to a scalding critique. Two years later, he broadened his attack in his most influential work, “The Master Thinkers,” which drew a direct line from the philosophies of Marx, Hegel, Fichte and Nietzsche to the enormities of Nazism and Soviet Communism. It was they, he wrote in his conclusion, who “erected the mental apparatus which is indispensable for launching the grand final solutions of the 20th century.”
  • An instant best seller, the book put him in the company of several like-minded former radicals, notably Bernard-Henri Lévy and Pascal Bruckner. Known as the nouveaux philosophes, a term coined by Mr. Lévy, they became some of France’s most prominent public intellectuals, somewhat analogous to the neoconservatives in the United States, but with a lingering leftist orientation.
  • Their apostasy sent shock waves through French intellectual life, and onward to Moscow, which depended on the cachet afforded by Jean-Paul Sartre and other leftist philosophers
  • “It was André Glucksmann who dealt the decisive blow to Communism in France,”
  • “In the West, he presented the anti-totalitarian case more starkly and more passionately than anyone else in modern times,
  • “He was a passionate defender of the superoppressed, whether it was the prisoners of the Gulag, the Bosnians and Kosovars, gays during the height of the AIDS crisis, the Chechens under Putin or the Iraqis under Saddam,” he said. “When he turned against Communism, it was because he realized that Communists were not on the same side.”
  • After earning the teaching degree known as an agrégation from the École Normale Supérieure de Saint-Cloud in 1961, Mr. Glucksmann enrolled in the National Center for Scientific Research to pursue a doctorate under Raymond Aron — an odd matchup because Aron was France’s leading anti-Marxist intellectual.
  • His subsequent turn away from Marxism made him a reviled figure on the left, and former comrades looked on aghast as he became one of France’s most outspoken defenders of the United States. He argued for President Ronald Reagan’s policy of nuclear deterrence toward the Soviet Union, intervention in the Balkans and both American invasions of Iraq. In 2007, he supported the candidacy of Nicolas Sarkozy for the French presidency.
  • “There is the Glucksmann who was right and the Glucksmann who could — with the same fervor, the same feeling of being in the right — be wrong,” Mr. Lévy wrote in a posthumous appreciation for Le Monde. “What set him apart from others under such circumstances is that he would admit his error, and when he came around he was fanatical about studying his mistake, mulling it over, understanding it.”
  • In his most recent book, “Voltaire Counterattacks,” published this year, he positioned France’s greatest philosopher, long out of favor, as a penetrating voice perfectly suited to the present moment.
  • “I think thought is an individual action, not one of a party,” Mr. Glucksmann told The Chicago Tribune in 1991. “First you think. And if that corresponds with the Left, then you are of the Left; if Right, then you are of the Right. But this idea of thinking Left or Right is a sin against the spirit and an illusion.”

Words matter in 'ISIS' war, so use 'Daesh' - The Boston Globe - 0 views

  • the Islamic State in Iraq and al-Sham, or ISIS; the Islamic State in Iraq and the Levant, or ISIL; and, more recently, the Islamic State, or IS. French officials recently declared that that country would stop using any of those names and instead refer to the group as “Daesh.”
  • how we talk about this group is central to defeating them.
  • Laurent Fabius said, “This is a terrorist group and not a state. . . the term Islamic State blurs the lines between Islam, Muslims, and Islamists.” President Obama made similar remarks, saying, “ISIL is not Islamic . . . and [is] certainly not a state.”
  • The term “Daesh” is strategically a better choice because it is still accurate in that it spells out the acronym of the group’s full Arabic name, al-Dawla al-Islamiya fi al-Iraq wa al-Sham
  • “Daesh” can also be understood as a play on words — and an insult. Depending on how it is conjugated in Arabic, it can mean anything from “to trample down and crush” to “a bigot who imposes his view on others.”
  • By using the militants’ preferred names, the US government implicitly gives them legitimacy. But referring to the group as Daesh doesn’t just withhold validity. It also might help the United States craft better policy.
  • A number of studies suggest that the language we use affects the way we think and behave. By using a term that references the Arabic name and not an English translation, American policy makers can potentially inoculate themselves from inherent biases that could affect their decision making
  • the United States is weakening the potency of its own messaging if it continues to refer to the group as ISIL.

It's Time for a Real Code of Ethics in Teaching - Noah Berlatsky - The Atlantic - 3 views

  • Earlier this week at The Atlantic, Emily Richmond asked whether high-stakes testing caused the Atlanta schools cheating scandal. The answer, I would argue, is yes... just not in the way you might think. Tests don't cause unethical behavior. But they did cause the Atlanta cheating scandal, and they are doing damage to the teaching profession. The argument that tests do not cause unethical behavior is fairly straightforward, and has been articulated by a number of writers. Jonathan Chait quite correctly points out that unethical behavior occurs in virtually all professions -- and that it occurs particularly when there are clear incentives to succeed. Incentivizing any field increases the impetus to cheat. Suppose journalism worked the way teaching traditionally had. You get hired at a newspaper, and your advancement and pay are dictated almost entirely by your years on the job, with almost no chance of either becoming a star or of getting fired for incompetence. Then imagine journalists changed that and instituted the current system, where you can get really successful if your bosses like you or be fired if they don't. You could look around and see scandal after scandal -- phone hacking! Jayson Blair! NBC's exploding truck! Janet Cooke! Stephen Glass! -- that could plausibly be attributed to this frightening new world in which journalists had an incentive to cheat in order to get ahead. It holds true of any field. If Major League Baseball instituted tenure, and maybe used tee-ball rules where you can't keep score and everybody gets a chance to hit, it could stamp out steroid use. Students have been cheating on tests forever -- massive, systematic cheating, you could say. Why? Because they have an incentive to do well. Give teachers and administrators an incentive for their students to do well, and more of them will cheat. For Chait, then, teaching has just been made more like journalism or baseball; it has gone from an incentiveless occupation to one with incentives.
  • Chait refers to violations of journalistic ethics -- like the phone-hacking scandal -- and suggests they are analogous to Major-League steroid use, and that both are similar to teachers (or students) cheating on tests. But is phone hacking "cheating"?
  • Phone hacking was, then, not an example of cheating. It was a violation of professional ethics. And those ethics are not arbitrarily imposed, but are intrinsic to the practice of journalism as a profession committed to public service and to truth.
  • Behaving ethically matters, but how it matters, and what it means, depends strongly on the context in which it occurs.
  • Ethics for teachers is not, apparently, first and foremost about educating their students, or broadening their minds. Rather, ethics for teachers in our current system consists in following the rules. The implicit, linguistic signal being given is that teachers are not like journalists or doctors, committed to a profession and to the moral code needed to achieve their professional goals. Instead, they are like athletes playing games, or (as Chait says) like children taking tests.
  • Using "cheating" as an ethical lens tends to both trivialize and infantilize teachers' work
  • Professions with social respect and social capital, like doctors and lawyers, collaborate in the creation of their own standards. The assumption is that those standards are intrinsic to the profession's goals, and that, therefore, professionals themselves are best equipped to establish and monitor them. Teachers' standards, though, are imposed from outside -- as if teachers are children, or as if teaching is a game.
  • High-stakes testing, then, does lead to cheating. It does not create unethical behavior -- but it does create the particular unethical behavior of "cheating."
  • We have reached a point where we can only talk about the ethics of the profession in terms of cheating or not cheating, as if teachers' main ethical duty is to make sure that scantron bubbles get filled in correctly. Teachers, like journalists, should have a commitment to truth; like doctors, they have a duty of care. Translating those commitments and duties into a bureaucratized measure of cheating-or-not-cheating diminishes ethics.
  • For teachers it is, literally, demoralizing. It severs the moral experience of teaching from the moral evaluation of teaching, which makes it almost impossible for good teachers (in all the senses of "good") to stay in the system.
  • We need better ethics for teachers -- ethics that treat them as adults and professionals, not like children playing games.

Documenting Sports With Tech, or It Didn't Happen - The New York Times - 0 views

  • The real-life issues now so embedded with the sports world — like debates over racial injustice, brain damage, the ethics of college sports and cheating at the Olympics, plus 100 other things — cannot be parsed to 140 characters.
  • Twitter has turned a lot of sports reporting into play-by-play, hot takes and snarky one-liners. With retweets and replies, the echo can be deafening.
  • The biggest transformation has been the use of social media, and Twitter is the opium of the sports-reporting masses
  • I’m learning I can have nothing but an iPhone and I’m fine.
  • The game changer was the smartphone. It's not only my office phone. I can also use it to record interviews (its microphone is better than the one in my old Olympus, which is important in crowded, noisy places), take pictures and videos to help me remember the details of what I see, and even type or speak notes and interview answers into emails that I send myself.
  • I use it the way other people use their phones. I email, text, tweet, post to Instagram, get directions, set timers and alarms, change flights, check weather, update my calendar, map my jogs, and listen to podcasts and Spotify during long drives or plane rides. On assignment, I’ve had entire conversations with Google Translate, two of us passing my phone back and forth.
  • Besides being an all-in-one communication tool, the iPhone helps my writing. I take photographs of places I know I’ll want to describe in detail later — the inside of someone’s home, a rocky mountain summit, a piece of jewelry that a subject is wearing, the shape of the clouds and the color of the sky. I take videos of places, too, and narrate them as I shoot so that I can watch and listen later.
  • I often do stories overseas, and for the last couple of years, I have constantly connected with sources, interview subjects and my own family on my phone through WhatsApp, a brilliant messaging service that seems to be well known everywhere except the United States.
  • I use it to text, but also to trade photographs, short videos and voice messages, instantly. And you can call from it, even use it for face-to-face video conversations, free if you’re on Wi-Fi.
  • More than anything, technology has brought the sports world into the “now.”
  • Now we can see almost any game on television, in a dozen sports from anywhere in the world, with a computer on our laps and a phone in our hands. We receive and give instant analysis through the world of social media. We can track statistics for our fantasy teams. We can tweet nasty messages to famous athletes and coaches who disappoint us. Like so many other parts of society, we’re probably watching sports more physically alone than ever, but more connected in other ways.

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P... - 0 views

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free, and therefore I’m responsible for everything I do, a dizzying fact which causes an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating.
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • way of this interpretation. Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their views to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.

The Navy's USS Gabrielle Giffords and the Future of Work - The Atlantic - 0 views

  • Minimal manning—and with it, the replacement of specialized workers with problem-solving generalists—isn’t a particularly nautical concept. Indeed, it will sound familiar to anyone in an organization who’s been asked to “do more with less”—which, these days, seems to be just about everyone.
  • Ten years from now, the Deloitte consultant Erica Volini projects, 70 to 90 percent of workers will be in so-called hybrid jobs or superjobs—that is, positions combining tasks once performed by people in two or more traditional roles.
  • If you ask Laszlo Bock, Google’s former culture chief and now the head of the HR start-up Humu, what he looks for in a new hire, he’ll tell you “mental agility.”
  • “What companies are looking for,” says Mary Jo King, the president of the National Résumé Writers’ Association, “is someone who can be all, do all, and pivot on a dime to solve any problem.”
  • The phenomenon is sped by automation, which usurps routine tasks, leaving employees to handle the nonroutine and unanticipated—and the continued advance of which throws the skills employers value into flux
  • Or, for that matter, on the relevance of the question What do you want to be when you grow up?
  • By 2020, a 2016 World Economic Forum report predicted, “more than one-third of the desired core skill sets of most occupations” will not have been seen as crucial to the job when the report was published
  • I asked John Sullivan, a prominent Silicon Valley talent adviser, why should anyone take the time to master anything at all? “You shouldn’t!” he replied.
  • Minimal manning—and the evolution of the economy more generally—requires a different kind of worker, with not only different acquired skills but different inherent abilities
  • It has implications for the nature and utility of a college education, for the path of careers, for inequality and employability—even for the generational divide.
  • Then, in 2001, Donald Rumsfeld arrived at the Pentagon. The new secretary of defense carried with him a briefcase full of ideas from the corporate world: downsizing, reengineering, “transformational” technologies. Almost immediately, what had been an experimental concept became an article of faith
  • But once cadets got into actual command environments, which tend to be fluid and full of surprises, a different picture emerged. “Psychological hardiness”—a construct that includes, among other things, a willingness to explore “multiple possible response alternatives,” a tendency to “see all experience as interesting and meaningful,” and a strong sense of self-confidence—was a better predictor of leadership ability in officers after three years in the field.
  • Because there really is no such thing as multitasking—just a rapid switching of attention—I began to feel overstrained, put upon, and finally irked by the impossible set of concurrent demands. Shouldn’t someone be giving me a hand here? This, Hambrick explained, meant I was hitting the limits of working memory—basically, raw processing power—which is an important aspect of “fluid intelligence” and peaks in your early 20s. This is distinct from “crystallized intelligence”—the accumulated facts and know-how on your hard drive—which peaks in your 50s.
  • Others noticed the change but continued to devote equal attention to all four tasks. Their scores fell. This group, Hambrick found, was high in “conscientiousness”—a trait that’s normally an overwhelming predictor of positive job performance. We like conscientious people because they can be trusted to show up early, double-check the math, fill the gap in the presentation, and return your car gassed up even though the tank was nowhere near empty to begin with. What struck Hambrick as counterintuitive and interesting was that conscientiousness here seemed to correlate with poor performance.
  • he discovered another correlation in his test: The people who did best tended to score high on “openness to new experience”—a personality trait that is normally not a major job-performance predictor and that, in certain contexts, roughly translates to “distractibility.”
  • To borrow the management expert Peter Drucker’s formulation, people with this trait are less focused on doing things right, and more likely to wonder whether they’re doing the right things.
  • High in fluid intelligence, low in experience, not terribly conscientious, open to potential distraction—this is not the classic profile of a winning job candidate. But what if it is the profile of the winning job candidate of the future?
  • One concerns “grit”—a mind-set, much vaunted these days in educational and professional circles, that allows people to commit tenaciously to doing one thing well
  • These ideas are inherently appealing; they suggest that dedication can be more important than raw talent, that the dogged and conscientious will be rewarded in the end.
  • he studied West Point students and graduates.
  • Traditional measures such as SAT scores and high-school class rank “predicted leader performance in the stable, highly regulated environment of West Point” itself.
  • It would be supremely ironic if the advance of the knowledge economy had the effect of devaluing knowledge. But that’s what I heard, recurrently.
  • “Fluid, learning-intensive environments are going to require different traits than classical business environments,” I was told by Frida Polli, a co-founder of an AI-powered hiring platform called Pymetrics. “And they’re going to be things like ability to learn quickly from mistakes, use of trial and error, and comfort with ambiguity.”
  • “We’re starting to see a big shift,” says Guy Halfteck, a people-analytics expert. “Employers are looking less at what you know and more and more at your hidden potential” to learn new things
  • advice to employers? Stop hiring people based on their work experience. Because in these environments, expertise can become an obstacle.
  • “The Curse of Expertise.” The more we invest in building and embellishing a system of knowledge, they found, the more averse we become to unbuilding it.
  • All too often experts, like the mechanic in LePine’s garage, fail to inspect their knowledge structure for signs of decay. “It just didn’t occur to him,” LePine said, “that he was repeating the same mistake over and over.”
  • The devaluation of expertise opens up ample room for different sorts of mistakes—and sometimes creates a kind of helplessness.
  • Aboard littoral combat ships, the crew lacks the expertise to carry out some important tasks, and instead has to rely on civilian help
  • Meanwhile, the modular “plug and fight” configuration was not panning out as hoped. Converting a ship from sub-hunter to minesweeper or minesweeper to surface combatant, it turned out, was a logistical nightmare
  • So in 2016 the concept of interchangeability was scuttled for a “one ship, one mission” approach, in which the extra 20-plus sailors became permanent crew members
  • “As equipment breaks, [sailors] are required to fix it without any training,” a Defense Department Test and Evaluation employee told Congress. “Those are not my words. Those are the words of the sailors who were doing the best they could to try to accomplish the missions we gave them in testing.”
  • These results were, perhaps, predictable given the Navy’s initial, full-throttle approach to minimal manning—and are an object lesson on the dangers of embracing any radical concept without thinking hard enough about the downsides
  • a world in which mental agility and raw cognitive speed eclipse hard-won expertise is a world of greater exclusion: of older workers, slower learners, and the less socially adept.
  • if you keep going down this road, you end up with one really expensive ship with just a few people on it who are geniuses … That’s not a future we want to see, because you need a large enough crew to conduct multiple tasks in combat.
  • What does all this mean for those of us in the workforce, and those of us planning to enter it? It would be wrong to say that the 10,000-hours-of-deliberate-practice idea doesn’t hold up at all. In some situations, it clearly does
  • A spinal surgery will not be performed by a brilliant dermatologist. A criminal-defense team will not be headed by a tax attorney. And in tech, the demand for specialized skills will continue to reward expertise handsomely.
  • But in many fields, the path to success isn’t so clear. The rules keep changing, which means that highly focused practice has a much lower return
  • In uncertain environments, Hambrick told me, “specialization is no longer the coin of the realm.”
  • It leaves us with lifelong learning,
  • I found myself the target of career suggestions. “You need to be a video guy, an audio guy!” the Silicon Valley talent adviser John Sullivan told me, alluding to the demise of print media
  • I found the prospect of starting over just plain exhausting. Building a professional identity takes a lot of resources—money, time, energy. After it’s built, we expect to reap gains from our investment, and—let’s be honest—even do a bit of coasting. Are we equipped to continually return to apprentice mode? Will this burn us out?
  • Everybody I met on the Giffords seemed to share that mentality. They regarded every minute on board—even during a routine transit back to port in San Diego Harbor—as a chance to learn something new.

How scientists fool themselves - and how they can stop : Nature News & Comment - 1 views

  • In 2013, five years after he co-authored a paper showing that Democratic candidates in the United States could get more votes by moving slightly to the right on economic policy1, Andrew Gelman, a statistician at Columbia University in New York City, was chagrined to learn of an error in the data analysis. In trying to replicate the work, an undergraduate student named Yang Yang Hu had discovered that Gelman had got the sign wrong on one of the variables.
  • Gelman immediately published a three-sentence correction, declaring that everything in the paper's crucial section should be considered wrong until proved otherwise.
  • Reflecting today on how it happened, Gelman traces his error back to the natural fallibility of the human brain: “The results seemed perfectly reasonable,” he says. “Lots of times with these kinds of coding errors you get results that are just ridiculous. So you know something's got to be wrong and you go back and search until you find the problem. If nothing seems wrong, it's easier to miss it.”
  • This is the big problem in science that no one is talking about: even an honest person is a master of self-deception. Our brains evolved long ago on the African savannah, where jumping to plausible conclusions about the location of ripe fruit or the presence of a predator was a matter of survival. But a smart strategy for evading lions does not necessarily translate well to a modern laboratory, where tenure may be riding on the analysis of terabytes of multidimensional data. In today's environment, our talent for jumping to conclusions makes it all too easy to find false patterns in randomness, to ignore alternative explanations for a result or to accept 'reasonable' outcomes without question — that is, to ceaselessly lead ourselves astray without realizing it.
  • Failure to understand our own biases has helped to create a crisis of confidence about the reproducibility of published results
  • Although it is impossible to document how often researchers fool themselves in data analysis, says Ioannidis, findings of irreproducibility beg for an explanation. The study of 100 psychology papers is a case in point: if one assumes that the vast majority of the original researchers were honest and diligent, then a large proportion of the problems can be explained only by unconscious biases. “This is a great time for research on research,” he says. “The massive growth of science allows for a massive number of results, and a massive number of errors and biases to study. So there's good reason to hope we can find better ways to deal with these problems.”
  • Although the human brain and its cognitive biases have been the same for as long as we have been doing science, some important things have changed, says psychologist Brian Nosek, executive director of the non-profit Center for Open Science in Charlottesville, Virginia, which works to increase the transparency and reproducibility of scientific research. Today's academic environment is more competitive than ever. There is an emphasis on piling up publications with statistically significant results — that is, with data relationships in which a commonly used measure of statistical certainty, the p-value, is 0.05 or less. “As a researcher, I'm not trying to produce misleading results,” says Nosek. “But I do have a stake in the outcome.” And that gives the mind excellent motivation to find what it is primed to find.
  • Another reason for concern about cognitive bias is the advent of staggeringly large multivariate data sets, often harbouring only a faint signal in a sea of random noise. Statistical methods have barely caught up with such data, and our brain's methods are even worse, says Keith Baggerly, a statistician at the University of Texas MD Anderson Cancer Center in Houston. As he told a conference on challenges in bioinformatics last September in Research Triangle Park, North Carolina, “Our intuition when we start looking at 50, or hundreds of, variables sucks.” (A short simulation after this list illustrates the trap.)
  • One trap that awaits during the early stages of research is what might be called hypothesis myopia: investigators fixate on collecting evidence to support just one hypothesis; neglect to look for evidence against it; and fail to consider other explanations.
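The scale of this trap is easy to demonstrate. Below is a minimal, hypothetical Python sketch (not from the Nature article; the sample size, variable count, and seed are invented for illustration). It screens 200 purely random "predictors" against a purely random outcome and counts how many clear the conventional p < 0.05 bar by chance alone.

```python
# Hypothetical demonstration of false patterns in randomness.
# Sizes, names, and the seed are illustrative assumptions, not from the article.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
n_samples, n_variables = 50, 200        # small sample, many candidate predictors

outcome = rng.normal(size=n_samples)                     # pure-noise "result"
predictors = rng.normal(size=(n_samples, n_variables))   # pure-noise "data set"

# Count correlations whose p-value falls below the conventional 0.05 threshold.
false_hits = sum(
    stats.pearsonr(predictors[:, j], outcome)[1] < 0.05
    for j in range(n_variables)
)

# At a 0.05 threshold, about 5% of null tests pass by chance: expect ~10 of 200.
print(f"{false_hits} of {n_variables} random correlations reached p < 0.05")
```

Rerun it with different seeds, and roughly five percent of the null correlations come out "significant" every time: around ten publishable-looking patterns in pure noise.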

Will a Student Loan Debt Crisis Sink the U.S. Economy? - 1 views

  • Student debt has more than tripled since 2004, reaching $1.52 trillion in the first quarter of 2018, according to the Federal Reserve — second only to mortgage debt in the U.S. College costs have outpaced the Consumer Price Index more than four-fold since 1985, and tuition assistance today is often harder to come by, particularly at schools without large endowments.
  • About 44 million graduates hold student debt, and today’s graduates leave school holding promissory notes worth an average of $37,000, raising concerns that the burden is creating a cascade of pressures compelling many to put off traditional life milestones
  • The storyline, as it has emerged, is that college debt delays buying a house, getting married, having children and saving for retirement, and there is some evidence that this is happening.
  • But the truth is more nuanced, and, statistically at least, the question of how burdensome student debt is and the extent to which it is disrupting major life events depends on a number of factors, including when you graduated from college with debt.
  • For those who graduated with debt as the economy was crashing, it was a double-whammy, said Keys, “so you’re seeing delayed marriage, delayed child-bearing, which are at least in part a function of the ongoing damage from the Great Recession.
  • Before the Great Recession, student debt levels were below auto loans, credit card debt and home-equity lines of credit in the ranking of household debt. Since then, student loan debt has surpassed these other debts
  • A $1,000 increase in student loan debt lowers the homeownership rate by about 1.5 percentage points for public four-year college-goers during their mid 20s, equivalent to an average delay of 2.5 months in attaining homeownership,
  • “Individuals who attain higher education average higher salaries, which translates into a higher tax base. With higher levels of education attainment, there is also less reliance on social welfare programs, as individuals who attain higher education are more likely to be employed, less likely to be unemployed, and less likely to be in poverty. Higher levels of education are also associated with greater civic engagement, as well as lower crime.”
  • In 2014, the largest chunk of student debt — nearly 40% — belonged to people owing between $1 and $10,000.
  • The bigger problem, Webber said, comes when students take out loans and then don’t graduate from college
  • Nationally, 60% of people who start at a four-year institution wind up graduating within the next six years
  • There are other ways in which all debt is not created equal. “Many of the people who have the largest loans and are the most likely to default are also the people who got the worst credentials and poorest quality training when they graduated or potentially didn’t even graduate
  • But although $1.5 trillion is a big number, it may not be an unreasonable amount given the value it is creating
  • In 2002, a bachelor’s degree holder could expect to make 75% more than someone with just a high school diploma, and nearly a decade later that premium had risen to 84%
  • A bachelor’s degree is worth about $2.8 million over a lifetime, the study also found.
  • Australia has a system that links the repayment of loans with the tax system. “Income-driven repayment options have been created in the U.S.,” said Perna, “but these options are more cumbersome and administratively complex than in Australia and some other nations. By linking the amount of the monthly payment to an individual’s income, income-driven repayment options can help to protect borrowers against the risk of non-repayment. But a more seamless system wouldn’t require borrowers to annually report their income to the U.S. Department of Education (A minimal sketch of such an income-contingent rule follows this list.)
  • “Promise” or “free tuition” programs cropping up in some states are also worth examining
  • “Right now there is, frankly, very little accountability that schools have; they practically have no skin in the game. If students default on their loans, there is no bad effect for the school.”
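To make income-driven repayment concrete, here is a minimal, hypothetical Python sketch of an income-contingent rule in the spirit of the Australian system Perna describes. The protected threshold and repayment rate below are invented for illustration and do not come from any actual program; the point is only that the payment derives from income rather than from the loan balance.

```python
# Hypothetical income-contingent repayment rule. The threshold and rate are
# illustrative assumptions, not the parameters of any real program.

PROTECTED_INCOME = 45_000   # assumed: no repayment below this annual income
REPAYMENT_RATE = 0.04       # assumed: pay 4% of income above the threshold

def monthly_payment(annual_income: float) -> float:
    """Monthly amount owed under the assumed income-contingent rule."""
    annual_due = max(0.0, annual_income - PROTECTED_INCOME) * REPAYMENT_RATE
    return annual_due / 12

for income in (30_000, 60_000, 90_000):
    print(f"income ${income:>7,}: ${monthly_payment(income):6.2f}/month")
```

Under this assumed rule, a borrower earning $30,000 owes nothing, one earning $60,000 owes $50 a month, and one earning $90,000 owes $150 a month, which is how such schemes shield low earners from the risk of non-repayment.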