
Home/ TOK Friends/ Group items tagged berlin


Javier E

History News Network | We Traded in One of the Most Self-Disciplined Presidents for the... - 1 views

  • How ironic it is then that President Obama, the bane of conservatives, possessed an abundance of self-discipline, and President Trump, whom most conservatives (including Bennett) favored over Hillary Clinton, possesses almost none.
  • The ancient Greek philosopher Aristotle thought that political leaders should exercise practical wisdom (phronesis) or prudence. He considered temperance (i.e., moderation, self-restraint) and self-discipline two of the most important virtues required for such wisdom. He believed that the two virtues should help us regulate what he called “the appetitive faculty,” which deals with our emotions and desires.
  • One of the twentieth-century’s most prominent commentators on political wisdom, Britain’s Isaiah Berlin (1909-1997), viewed temperance as an important political virtue, and he connected it to humility and tolerance—neither of which Trump displays. And in his “Two Concepts of Liberty,” Berlin wrote, “Freedom is self-mastery.”
  • Barry Schwartz and Kenneth Sharpe in their Practical Wisdom: The Right Way to Do the Right Thing (2010) state that such wisdom is greatly needed
  • The authors quote Aristotle and give the example of a man practicing practical wisdom and mention that “he had the self-control—the emotion-regulating skills—to choose rightly.”
  • Psychologist, futurist, and editor of The Wisdom Page Tom Lombardo also stresses the importance of temperance and self-control. In his new book on Future Consciousness, he includes a whole chapter (of 45 pages) on “Self-Control and Self-Responsibility.” In it he cites favorably two authors who claim that “most human problems are due to a lack of self-control.” He also states that “we cannot flourish without self-responsibility, self-control, and . . . one of the most unethical forms of thinking and behavior in life . . . is to abdicate self-responsibility and self-control in ourselves.”
  • In Inside Obama’s Brain (2009), journalist Sasha Abramsky talked to over a hundred people who knew Obama and reported that “during the election campaign Obama almost never got upset, or panicked, by day-to-day shifts in momentum, by the ups and downs of opinion polls.” Almost a year into his presidency, Abramsky referred to the president as “a voice of moderation in a corrosively shrill, partisan political milieu.”
  • Up until the end of his presidency, Obama maintained his self-control and temperance. As a Huffington Post piece noted in 2016, he “has been the model of temperance in office on all fronts.”
  • Just as many individuals have commented on Obama’s self-discipline and temperance, so too have many remarked on Trump’s lack of these virtues
  • In May 2017, Brooks stated: “At base, Trump is an infantalist. There are three tasks that most mature adults have sort of figured out by the time they hit 25. Trump has mastered none of them. Immaturity is becoming the dominant note of his presidency, lack of self-control his leitmotif.”
  • Two months later, Douthat opined about Trump: “He is nonetheless clearly impaired, gravely deficient somewhere at the intersection of reason and judgment and conscience and self-control. . . . This president should not be the president, and the sooner he is not, the better.”
  • Karl Rove, a former senior adviser to President George W. Bush, insisted that Trump “lacks the focus or self-discipline to do the basic work required of a president.”
  • At about the same time, former Republican senator Tom Coburn (R-OK) declared, “The question is, does he have the self-discipline and some control over his ego to be able to say ‘I’m wrong’ every now and then? I haven’t seen that.”
  • it is Trump’s narcissism and lack of humility that are his chief faults and hinder him most from being even a mediocre president.
Ellie McGinnis

The 50 Greatest Breakthroughs Since the Wheel - James Fallows - The Atlantic - 0 views

  • Some questions you ask because you want the right answer. Others are valuable because no answer is right; the payoff comes from the range of attempts.
  • That is the diversity of views about the types of historical breakthroughs that matter, with a striking consensus on whether the long trail of innovation recorded here is now nearing its end.
  • The clearest example of consensus was the first item on the final compilation, the printing press
  • Leslie Berlin, a historian of business at Stanford, organized her nominations not as an overall list but grouped into functional categories.
  • Innovations that expand the human intellect and its creative, expressive, and even moral possibilities.
  • Innovations that are integral to the physical and operating infrastructure of the modern world
  • Innovations that enabled the Industrial Revolution and its successive waves of expanded material output
  • Innovations extending life, to use Leslie Berlin’s term
  • Innovations that allowed real-time communication beyond the range of a single human voice
  • Innovations in the physical movement of people and goods.
  • Organizational breakthroughs that provide the software for people working and living together in increasingly efficient and modern ways
  • Finally, and less prominently than we might have found in 1950 or 1920—and less prominently than I initially expected—we have innovations in killing,
  • Any collection of 50 breakthroughs must exclude 50,000 more.
  • We learn, finally, why technology breeds optimism, which may be the most significant part of this exercise.
  • Popular culture often lionizes the stars of discovery and invention
  • For our era, the major problems that technology has helped cause, and that faster innovation may or may not correct, are environmental, demographic, and socioeconomic.
  • people who have thought deeply about innovation’s sources and effects, like our panelists, were aware of the harm it has done along with the good.
  • “Does innovation raise the wealth of the planet? I believe it does,” John Doerr, who has helped launch Google, Amazon, and other giants of today’s technology, said. “But technology left to its own devices widens rather than narrows the gap between the rich and the poor.”
  • Are today’s statesmen an improvement over those of our grandparents’ era? Today’s level of public debate? Music, architecture, literature, the fine arts—these and other manifestations of world culture continually change, without necessarily improving. Tolstoy and Dostoyevsky, versus whoever is the best-selling author in Moscow right now?
  • The argument that a slowdown might happen, and that it would be harmful if it did, takes three main forms.
  • Some societies have closed themselves off and stopped inventing altogether:
  • By failing to move forward, they inevitably moved backward relative to their rivals and to the environmental and economic threats they faced. If the social and intellectual climate for innovation sours, what has happened before can happen again.
  • visible slowdown in the pace of solutions that technology offers to fundamental problems.
  • a slowdown in, say, crop yields or travel time is part of a general pattern of what economists call diminishing marginal returns. The easy improvements are, quite naturally, the first to be made; whatever comes later is slower and harder.
  • America’s history as a nation happens to coincide with a rare moment in technological history now nearing its end. “There was virtually no economic growth before 1750,” he writes in a recent paper.
  • “We can be concerned about the last 1 percent of an environment for innovation, but that is because we take everything else for granted,” Leslie Berlin told me.
  • This reduction in cost, he says, means that the next decade should be a time of “amazing advances in understanding the genetic basis of disease, with especially powerful implications for cancer.”
  • the very concept of an end to innovation defied everything they understood about human inquiry. “If you look just at the 20th century, the odds against there being any improvement in living standards are enormous,”
  • “Two catastrophic world wars, the Cold War, the Depression, the rise of totalitarianism—it’s been one disaster after another, a sequence that could have been enough to sink us back into barbarism. And yet this past half century has been the fastest-ever time of technological growth. I see no reason why that should be slowing down.”
  • “I am a technological evolutionist,” he said. “I view the universe as a phase-space of things that are possible, and we’re doing a random walk among them. Eventually we are going to fill the space of everything that is possible.”
Javier E

Obama, With Angela Merkel in Berlin, Assails Spread of Fake News - The New York Times - 0 views

  • “Because in an age where there’s so much active misinformation and it’s packaged very well and it looks the same when you see it on a Facebook page or you turn on your television,” Mr. Obama said. “If everything seems to be the same and no distinctions are made,
  • Bogus news stories appearing online and on social media appear to have had a greater reach in the final months of the campaign than articles by authoritative, mainstream news outlets, according to an analysis of Facebook activity by BuzzFeed
  • In the three months before Election Day, the most popular stories produced by hoax sites and “hyperpartisan blogs” generated more engagement — likes, shares and comments — on Facebook than the most popular articles by major news websites, the analysis found.
  • Among the 20 most popular fake election stories identified by BuzzFeed, all but three favored Mr. Trump or denigrated Hillary Clinton.
  • “If we are not serious about facts and what’s true and what’s not, and particularly in an age of social media when so many people are getting their information in sound bites and off their phones, if we can’t discriminate between serious arguments and propaganda, then we have problems.”
Duncan H

A Treaty to Save Euro May Split Europe - NYTimes.com - 0 views

  • European leaders, meeting until the early hours of Friday, agreed to sign an intergovernmental treaty that would require them to enforce stricter fiscal and financial discipline in their future budgets. But efforts to get unanimity among the 27 members of the European Union, as desired by Germany, failed as Britain refused to go along.
  • all 17 members of the European Union that use the euro agreed to the new treaty, along with six other countries that wish to join the currency union eventually.
  • Twenty years after the Maastricht Treaty, which was designed not just to integrate Europe but to contain the might of a united Germany, Berlin had effectively united Europe under its control, with Britain all but shut out.
  • the leaders agreed to provide an additional 200 billion euros to the International Monetary Fund to help increase a “firewall” of money in European bailout funds to help cover Italy and Spain. He also said a permanent 500 billion euro European Stability Mechanism would be put into effect a year early, by July 2012, and for a year, would run alongside the existing and temporary 440 billion euro European Financial Stability Facility, thus also increasing funds for the firewall. The leaders also agreed that private-sector lenders to euro zone nations would not automatically face losses, as had been the plan in the event of another future bailout. When Greece’s debt was finally restructured, the private sector suffered, making investors more anxious about other vulnerable economies.
Javier E

The Leadership Revival - NYTimes.com - 0 views

  • take a reality bath. Go off and become a stranger in a strange land. Go off to some alien part of this country or the world. Immerse yourself in the habits and daily patterns of that existence and stay there long enough to get acculturated. Stay there long enough so that you forget the herd mentality of our partisan culture.
  • When you return home, you will look at your own place with foreign eyes. You’ll see the contours of your own reality more clearly. When you return to native ground, you’re more likely to possess the sort of perceptiveness that Isaiah Berlin says is the basis of political judgment.
  • This sort of wisdom consists of “a special sensitiveness to the contours of the circumstances in which we happen to be placed; it is a capacity for living without falling foul of some permanent condition or factor which cannot be either altered, or even fully described.” This wisdom is based on a tactile awareness of your country and its people — what they want, how they react. You don’t think this awareness. You feel it. You experience a visceral oneness with culture and circumstance — the smell of the street, tinges of anger and hope and aspiration. The irony is that you are more likely to come into union with your own home culture after you have been away from it.
maddieireland334

How to learn 30 languages - 0 views

  • Out on a sunny Berlin balcony, Tim Keeley and Daniel Krasa are firing words like bullets at each other. First German, then Hindi, Nepali, Polish, Croatian, Mandarin and Thai - they've barely spoken one language before the conversation seamlessly melds into another. Together, they pass through about 20 different languages in total.
Javier E

Teaching a Different Shakespeare From the One I Love - The New York Times - 0 views

  • Even the highly gifted students in my Shakespeare classes at Harvard are less likely to be touched by the subtle magic of his words than I was so many years ago or than my students were in the 1980s in Berkeley, Calif. What has happened? It is not that my students now lack verbal facility. In fact, they write with ease, particularly if the format is casual and resembles the texting and blogging that they do so constantly. The problem is that their engagement with language, their own or Shakespeare’s, often seems surprisingly shallow or tepid.
  • There are many well-rehearsed reasons for the change: the rise of television followed by the triumph of digital technology, the sending of instant messages instead of letters, the ‘‘visual turn’’ in our culture, the pervasive use of social media. In their wake, the whole notion of a linguistic birthright could be called quaint, the artifact of particular circumstances that have now vanished
  • For my parents, born in Boston, the English language was a treasured sign of arrival and rootedness; for me, a mastery of Shakespeare, the supreme master of that language, was like a purchased coat of arms, a title of gentility tracing me back to Stratford-upon-Avon.
  • It is not that the English language has ceased to be a precious possession; on the contrary, it is far more important now than it ever was in my childhood. But its importance has little or nothing to do any longer with the dream of rootedness. English is the premier international language, the global medium of communication and exchange.
  • as I have discovered in my teaching, it is a different Shakespeare from the one with whom I first fell in love. Many of my students may have less verbal acuity than in years past, but they often possess highly developed visual, musical and performative skills. They intuitively grasp, in a way I came to understand only slowly, the pervasiveness of songs in Shakespeare’s plays, the strange ways that his scenes flow one into another or the cunning alternation of close-ups and long views
  • When I ask them to write a 10-page paper analyzing a particular web of metaphors, exploring a complex theme or amassing evidence to support an argument, the results are often wooden; when I ask them to analyze a film clip, perform a scene or make a video, I stand a better chance of receiving something extraordinary.
  • This does not mean that I should abandon the paper assignment; it is an important form of training for a range of very different challenges that lie in their future. But I see that their deep imaginative engagement with Shakespeare, their intoxication, lies elsewhere.
  • The M.I.T. Global Shakespeare Project features an electronic archive that includes images of every page of the First Folio of 1623. In the Norton Shakespeare, which I edit, the texts of his plays are now available not only in the massive printed book with which I was initiated but also on a digital platform. One click and you can hear each song as it might have sounded on the Elizabethan stage; another click and you listen to key scenes read by a troupe of professional actors. It is a kind of magic unimagined even a few years ago or rather imaginable only as the book of a wizard like Prospero in ‘‘The Tempest.’’
  • But it is not the new technology alone that attracts students to Shakespeare; it is still more his presence throughout the world as the common currency of humanity. In Taiwan, Tokyo and Nanjing, in a verdant corner of the Villa Borghese gardens in Rome and in an ancient garden in Kabul, in Berlin and Bangkok and Bangalore, his plays continue to find new and unexpected ways to enchant.
aqconces

Hitler constantly high on crystal meth while leading Nazi Germany: report - NY Daily News - 0 views

  • New research shows that the German Nazi leader was on a constant supply of crystal methamphetamines to stay awake and energized, according to the UK Independent.
  • The intoxicated Fuhrer, a famous hypochondriac, was on more than 74 different medications while he ordered the systematic murders of Jews across Europe
  • It also claims he took nine shots of methamphetamine while living out his last days in his bunker to ease his pain and stress
  • Hitler was on a steady stream of barbiturate tranquilizers, morphine, nasal and eye drops containing cocaine and other drugs — along with bulls’ semen to boost his testosterone — thanks to his Berlin-based personal physician, Theodor Morell, according to the report
  • He was characterized as “a quack and a fraud and a snake oil salesman”
  • Hitler was shown to have signs of Parkinson's disease by the end of World War II in 1945, and the dizzying array of drugs likely contributed to his serious health issues.
  • Studies show that Hitler was constantly high on crystal meth while leading Germany. The Fuhrer was on more than 74 different medications while he ordered murders of Jews.
Javier E

A Curious Midlife Crisis for a Tech Entrepreneur - The New York Times - 0 views

  • as he approached 40, Fabrice Grinda, a French technology entrepreneur with an estimated net worth of $100 million, couldn’t shake the feeling that something was terribly wrong. Somehow the trappings of his success were weighing him down.
  • “People turn 40 and usually buy a shiny sports car,” Mr. Grinda said during an interview in a penthouse suite at Sixty LES, a downtown boutique hotel. “They don’t say, ‘I’m downsizing my life and giving up all my possessions to focus on experiences and friendships.’
  • He dubbed it “the very big downgrade”: He was going to travel the world, working on the fly while staying with friends and family. He was purposely arranging things so that he would have a chance to focus on what was meaningful in life.
  • But that is exactly what Mr. Grinda did. He moved out of the Bedford house in December 2012, ditched the city apartment and got rid of the McLaren. He donated clothes, sports equipment and kitchen utensils to the Church of St. Francis Xavier in Lower Manhattan. He gave his furniture to Housing Works and he packed a Tumi carry-on suitcase with 50 items, including two pairs of jeans, a bathing suit and 10 pairs of socks.
  • Once he realized his days as a roving houseguest were numbered, Mr. Grinda decided to shift his approach: He kept traveling, but now he was renting apartments on Airbnb or staying in luxury hotels.
  • Born in suburban Paris in 1974, Mr. Grinda graduated from Princeton in 1996 with a degree in economics. He worked as a consultant at McKinsey & Company for two years before moving back to France to found an online auction start-up funded by the business magnate Bernard Arnault, which Mr. Grinda sold in 2000. He returned to the United States, where he co-founded Zingy, a mobile phone ringtone and game maker, which fetched $80 million in a 2004 sale. After that, he was a founder of OLX, a Craigslist-like service that has become one of the largest global classified websites. Now he is an entrepreneur and angel investor, with more than 200 investments to date, who visits start-ups in Berlin, Paris, New York, San Francisco and other cities.
  • He looks (and acts) something like Sheldon Cooper, the oddball science geek played by Jim Parsons on “The Big Bang Theory,” an observation Mr. Grinda himself has made. “Friends, who knew me in my late teens and early twenties, would tell you I had exactly the same delusional sense of self-worth and condescending and arrogant self-centered worldview,” he wrote in a blog post that noted his similarities to the sitcom character.
  • In all, Mr. Grinda said, he stayed with about 15 friends and family members in the first months of 2013. “Everyone was, like, ‘It’s a great idea. Come over,’ ” Mr. Grinda said. “The problem is, the idea of ‘Great, come over’ and me there 24 hours a day, seven days a week, is very different. Especially when their lives are not in sync with mine.”
  • “When I looked back at the things that mattered the most to me,” he said, “they were experiences, friendships and family — none of which I had invested much in, partly because I was too busy, and partly because I felt anchored by my possessions.”
  • He hatched a new plan: His friends and family members would come to him. “Rather than me going to them and disrupting their routine,” he said, “getting everyone together in a setting of vacation makes more sense.”
  • He invited his parents, his friends, their partners, children and nannies for a two-week stay in Anguilla, an island east of Puerto Rico, where he rented two conjoining houses, at a cost of $240,000, with chefs and full house service (and a total of 19 bedrooms).
  • Mr. Grinda forgot to consider that not everyone lives as he does. For one thing, he had scheduled the Anguilla vacation during the school year, which meant friends with children couldn’t make it. The island’s remoteness, furthermore, meant some guests were forced to endure a tangle of flight connections, leaving some of them exhausted by the time they arrived. And many of the people he invited, who had jobs and other obligations, could stay only for a long weekend.
  • Mr. Grinda said he has learned a lot from his very big downgrade. He reconnected with old friends, even if it meant annoying them a little, and he rekindled his relationship with his father. “We spent time talking about his life,” he said. And he is no longer against the idea of having a fixed address; he said he is now in negotiations to buy a two-bedroom apartment on the Lower East Side, which he plans to rent out when he is not in town.
  • Still, the experiment has taken its toll. “The philosophy is interesting,” he said. “But how do you put it into practice? How do you make it real?”
  • He recently split up with Otilia Aionesei, a former model who works at a technology start-up, whom he had been dating, off and on, for two years. The sticking point was their lack of a shared home. “If you want to be his girlfriend, this is the life you have to lead,” Ms. Aionesei said. “I like simple things, to watch movies on the same couch.” Mr. Grinda had a different view. “We went to the Galápagos,” he said. “We went to Tulum. To St. Barts. We have these wonderful experiences and memories together.”
  • “My home is where I am,” he said. “And it doesn’t matter if it is a friend’s place or a couch or the middle of the jungle or a hotel room on the Lower East Side. But I realize that most of humanity, especially women, don’t see it that way.”
Javier E

How the Internet Gets Inside Us : The New Yorker - 0 views

  • It isn’t just that we’ve lived one technological revolution among many; it’s that our technological revolution is the big social revolution that we live with
  • The idea, for instance, that the printing press rapidly gave birth to a new order of information, democratic and bottom-up, is a cruel cartoon of the truth. If the printing press did propel the Reformation, one of the biggest ideas it propelled was Luther’s newly invented absolutist anti-Semitism. And what followed the Reformation wasn’t the Enlightenment, a new era of openness and freely disseminated knowledge. What followed the Reformation was, actually, the Counter-Reformation, which used the same means—i.e., printed books—to spread ideas about what jerks the reformers were, and unleashed a hundred years of religious warfare.
  • Robert K. Logan’s “The Sixth Language,” begins with the claim that cognition is not a little processing program that takes place inside your head, Robby the Robot style. It is a constant flow of information, memory, plans, and physical movements, in which as much thinking goes on out there as in here. If television produced the global village, the Internet produces the global psyche: everyone keyed in like a neuron, so that to the eyes of a watching Martian we are really part of a single planetary brain. Contraptions don’t change consciousness; contraptions are part of consciousness.
  • In a practical, immediate way, one sees the limits of the so-called “extended mind” clearly in the mob-made Wikipedia, the perfect product of that new vast, supersized cognition: when there’s easy agreement, it’s fine, and when there’s widespread disagreement on values or facts, as with, say, the origins of capitalism, it’s fine, too; you get both sides. The trouble comes when one side is right and the other side is wrong and doesn’t know it. The Shakespeare authorship page and the Shroud of Turin page are scenes of constant conflict and are packed with unreliable information. Creationists crowd cyberspace every bit as effectively as evolutionists, and extend their minds just as fully. Our trouble is not the over-all absence of smartness but the intractable power of pure stupidity, and no machine, or mind, seems extended enough to cure that.
  • “The medium does matter,” Carr has written. “As a technology, a book focuses our attention, isolates us from the myriad distractions that fill our everyday lives. A networked computer does precisely the opposite. It is designed to scatter our attention. . . . Knowing that the depth of our thought is tied directly to the intensity of our attentiveness, it’s hard not to conclude that as we adapt to the intellectual environment of the Net our thinking becomes shallower.”
  • when people struggle to describe the state that the Internet puts them in they arrive at a remarkably familiar picture of disassociation and fragmentation. Life was once whole, continuous, stable; now it is fragmented, multi-part, shimmering around us, unstable and impossible to fix.
  • The odd thing is that this complaint, though deeply felt by our contemporary Better-Nevers, is identical to Baudelaire’s perception about modern Paris in 1855, or Walter Benjamin’s about Berlin in 1930, or Marshall McLuhan’s in the face of three-channel television (and Canadian television, at that) in 1965.
  • If all you have is a hammer, the saying goes, everything looks like a nail; and, if you think the world is broken, every machine looks like the hammer that broke it.
  • What we live in is not the age of the extended mind but the age of the inverted self. The things that have usually lived in the darker recesses or mad corners of our mind—sexual obsessions and conspiracy theories, paranoid fixations and fetishes—are now out there: you click once and you can read about the Kennedy autopsy or the Nazi salute or hog-tied Swedish flight attendants. But things that were once external and subject to the social rules of caution and embarrassment—above all, our interactions with other people—are now easily internalized, made to feel like mere workings of the id left on its own.
  • Anyway, the crucial revolution was not of print but of paper: “During the later Middle Ages a staggering growth in the production of manuscripts, facilitated by the use of paper, accompanied a great expansion of readers outside the monastic and scholastic contexts.” For that matter, our minds were altered less by books than by index slips. Activities that seem quite twenty-first century, she shows, began when people cut and pasted from one manuscript to another; made aggregated news in compendiums; passed around précis. “Early modern finding devices” were forced into existence: lists of authorities, lists of headings.
  • The book index was the search engine of its era, and needed to be explained at length to puzzled researchers—as, for that matter, did the Hermione-like idea of “looking things up.” That uniquely evil and necessary thing, the comprehensive review of many different books on a related subject, with the necessary oversimplification of their ideas that it demanded, was already around in 1500, and already being accused of missing all the points.
  • at any given moment, our most complicated machine will be taken as a model of human intelligence, and whatever media kids favor will be identified as the cause of our stupidity. When there were automatic looms, the mind was like an automatic loom; and, since young people in the loom period liked novels, it was the cheap novel that was degrading our minds. When there were telephone exchanges, the mind was like a telephone exchange, and, in the same period, since the nickelodeon reigned, moving pictures were making us dumb. When mainframe computers arrived and television was what kids liked, the mind was like a mainframe and television was the engine of our idiocy. Some machine is always showing us Mind; some entertainment derived from the machine is always showing us Non-Mind.
  • Blair argues that the sense of “information overload” was not the consequence of Gutenberg but already in place before printing began.
  • A social network is crucially different from a social circle, since the function of a social circle is to curb our appetites and of a network to extend them.
  • And so the peacefulness, the serenity that we feel away from the Internet, and which all the Better-Nevers rightly testify to, has less to do with being no longer harried by others than with being less oppressed by the force of your own inner life. Shut off your computer, and your self stops raging quite as much or quite as loud.
  • Now television is the harmless little fireplace over in the corner, where the family gathers to watch “Entourage.” TV isn’t just docile; it’s positively benevolent. This makes you think that what made television so evil back when it was evil was not its essence but its omnipresence. Once it is not everything, it can be merely something. The real demon in the machine is the tirelessness of the user.
  • the Internet screen has always been like the palantír in Tolkien’s “Lord of the Rings”—the “seeing stone” that lets the wizards see the entire world. Its gift is great; the wizard can see it all. Its risk is real: evil things will register more vividly than the great mass of dull good. The peril isn’t that users lose their knowledge of the world. It’s that they can lose all sense of proportion. You can come to think that the armies of Mordor are not just vast and scary, which they are, but limitless and undefeatable, which they aren’t.
Javier E

Raymond Tallis Takes Out the 'Neurotrash' - The Chronicle Review - The Chronicle of Hig... - 0 views

  • Tallis informs 60 people gathered in a Kent lecture hall that his talk will demolish two "pillars of unwisdom." The first, "neuromania," is the notion that to understand people you must peer into the "intracranial darkness" of their skulls with brain-scanning technology. The second, "Darwinitis," is the idea that Charles Darwin's evolutionary theory can explain not just the origin of the human species—a claim Tallis enthusiastically accepts—but also the nature of human behavior and institutions.
  • Aping Mankind argues that neuroscientific approaches to things like love, wisdom, and beauty are flawed because you can't reduce the mind to brain activity alone.
  • Stephen Cave, a Berlin-based philosopher and writer who has called Aping Mankind "an important work," points out that most philosophers and scientists do in fact believe "that mind is just the product of certain brain activity, even if we do not currently know quite how." Tallis "does both the reader and these thinkers an injustice" by declaring that view "obviously" wrong,
  • cultural memes. The Darwinesque concept originates in Dawkins's 1976 book, The Selfish Gene. Memes are analogous to genes, Dennett has said, "replicating units of culture" that spread from mind to mind like a virus. Religion, chess, songs, clothing, tolerance for free speech—all have been described as memes. Tallis considers it absurd to talk of a noun-phrase like "tolerance for free speech" as a discrete entity. But Dennett argues that Tallis's objections are based on "a simplistic idea of what one might mean by a unit." Memes aren't units? Well, in that spirit, says Dennett, organisms aren't units of biology, nor are species—they're too complex, with too much variation. "He's got to allow theory to talk about entities which are not simple building blocks," Dennett says.
  • Geraint Rees, director of University College London's Institute of Cognitive Neuroscience, complains that reading Tallis is "a bit like trying to nail jelly to the wall." He "rubbishes every current theory of the relationship between mind and brain, whether philosophical or neuroscientific," while offering "little or no alternative,"
  • How is it that he perceives the glass of water on the table? How is it that he feels a sense of self over time? How is it that he can remember a patient he saw in 1973, and then cast his mind forward to his impending visit to the zoo? There are serious problems with trying to reduce such things to impulses in the brain, he argues. We can explain "how the light gets in," he says, but not "how the gaze looks out." And isn't it astonishing, he adds, that much neural activity seems to have no link to consciousness? Instead, it's associated with things like controlling automatic movements and regulating blood pressure. Sure, we need the brain for consciousness: "Chop my head off, and my IQ descends." But it's not the whole story. There is more to perceptions, memories, and beliefs than neural impulses can explain. The human sphere encompasses a "community of minds," Tallis has written, "woven out of a trillion cognitive handshakes of shared attention, within which our freedom operates and our narrated lives are led." Those views on perception and memory anchor his attack on "neurobollocks." Because if you can't get the basics right, he says, then it's premature to look to neuroscience for clues to complex things like love.
  • Yes, many unanswered questions persist. But these are early days, and neuroscience remains immature, says Churchland, a professor emerita of philosophy at University of California at San Diego and author of the subfield-spawning 1986 book Neurophilosophy. In the 19th century, she points out, people thought we'd never understand light. "Well, by gosh," she says, "by the time the 20th century rolls around, it turns out that light is electromagnetic radiation. ... So the fact that at a certain point in time something seems like a smooth-walled mystery that we can't get a grip on, doesn't tell us anything about whether some real smart graduate student is going to sort it out in the next 10 years or not."
  • Dennett claims he's got much of it sorted out already. He wrote a landmark book on the topic in 1991, Consciousness Explained. (The title "should have landed him in court, charged with breach of the Trade Descriptions Act," writes Tallis.) Dennett uses the vocabulary of computer science to explain how consciousness emerges from the huge volume of things happening in the brain all at once. We're not aware of everything, he tells me, only a "limited window." He describes that stream of consciousness as "the activities of a virtual machine which is running on the parallel hardware of the brain." "You—the fruits of all your experience, not just your genetic background, but everything you've learned and done and all your memories—what ties those all together? What makes a self?" Dennett asks. "The answer is, and has to be, the self is like a software program that organizes the activities of the brain."
Emilio Ergueta

No Consolation For Kalashnikov | Issue 59 | Philosophy Now - 0 views

  • The legendary AK 47 assault rifle was invented in 1946 by Mikhail Kalashnikov. It was issued to the armies of the old Warsaw Pact countries and has been used in many conflicts, e.g. by the North Vietnamese Army (NVA), Soviet soldiers in Afghanistan, and even this year by Al Qaeda operatives in Iraq.
  • Whatever interpretation one puts on those two conflicts, almost no-one sane would condone the use of the AK 47 in killing civilians, for instance Shiites in Iraq.
  • Mikhail Kalashnikov has come to have some doubts about his invention. He told The Times in June 2006, “I don’t worry when my guns are used for national liberation or defence. But when I see how peaceful people are killed and wounded by these weapons, I get very distressed and upset. I calm down by telling myself that I invented this gun 60 years ago to protect the interests of my country.”
  • Weapons research produces in the first place not guns, bombs, bullets and planes and the various command, control and communications hardware and software needed to use these things, but plans, blueprints and designs – knowledge and know-how. Unless these useful plans are lost or destroyed, they can be implemented or instantiated many times over, and thus project unforeseen into the future.
  • If any one person invented the atomic bomb, it was Leo Szilard. It seems he had the idea, and he made great efforts from 1935 until 1942, when the Manhattan Project was set up, to get the research done that would show whether an atomic bomb was possible; how to make one; and if need be, to provide the basis for actually making one.
  • This perception was greatly strengthened when Hahn and Strassmann discovered nuclear fission in Berlin in 1938. So Szilard, worried about the Nazis getting an atomic bomb, thought that the Allies should do the research to see if and how one could be made, in order to deter or otherwise prevent the Nazis from using one.
  • As far as Szilard and a good number of other atomic scientists were concerned there was no longer a rationale for the bomb project. Szilard, Philip Franck and others wrote The Franck Report in June 1945, which among other things advocated a demonstration of the power of the atomic bomb by dropping one on an uninhabited island. The Franck Report was ignored. The project was not abandoned, of course, and two of its products were used on Japanese cities, to kill mostly Japanese civilians.
  • The point of this example is to show how scientists lose control of their work when they take part in weapons research – they lose control of it in other settings besides, but this case is the most problematic.
  • One way out of the dilemma is to refuse to do war research under any circumstances. I’d like to endorse this option, especially as it does not imply that we should judge Kalashnikov, Szilard, Watson-Watt and other well-intentioned researchers harshly, since we can argue that the dilemma has only become evident recently.
  • Another possibility is to deny that weapons research must take place within history, as a good Marxist might put it. That is, as I would put it: Perhaps weapons research is not an activity that must take account of historical contingencies.
  • We must acknowledge that there is no such thing as an inherently defensive weapon, something that can only be used for the morally acceptable purpose of responding against an aggressor. Doing weapons research for defensive systems is therefore not morally acceptable, as any weapons might feasibly be used as part of an unjust war of aggression.
  • Kalashnikov’s preferred description of what he did when he designed the AK 47 is something like “providing the means for liberation,” or “defending my country,” not “providing the means to kill innocents.” However, he acknowledges that the latter description applies to his situation equally well. Nevertheless, J might try to portray her actions as something like “provide the means for deterrence,” the idea being that what she is helping to create is intended to deter, and hence prevent harm rather than cause it.
  • You might say that this is utopian, and it would never work, but then it might console Kalashnikov, who, after all, was a Marxist, and perhaps also a utopian.
Javier E

In Defense of Naïve Reading - NYTimes.com - 1 views

  • Clearly, poems and novels and paintings were not produced as objects for future academic study; there is no a priori reason to think that they could be suitable objects of  “research.” By and large they were produced for the pleasure and enlightenment of those who enjoyed them.
  • But just as clearly, the teaching of literature in universities — especially after the 19th-century research model of Humboldt University of Berlin was widely copied — needed a justification consistent with the aims of that academic setting
  • The main aim was research: the creation, accumulation and transmission of knowledge. And the main model was the natural science model of collaborative research: define problems, break them down into manageable parts, create sub-disciplines and sub-sub-disciplines for the study of these, train students for such research specialties and share everything. With that model, what literature and all the arts needed was something like a general “science of meaning” that could eventually fit that sort of aspiration. Texts or art works could be analyzed as exemplifying and so helping establish such a science. Results could be published in scholarly journals, disputed by others, consensus would eventually emerge and so on.
  • literature study in a university education requires some method of evaluation of whether the student has done well or poorly. Students’ papers must be graded and no faculty member wants to face the inevitable “that’s just your opinion” unarmed, as it were. Learning how to use a research methodology, providing evidence that one has understood and can apply such a method, is understandably an appealing pedagogy
  • Literature and the arts have a dimension unique in the academy, not shared by the objects studied, or “researched” by our scientific brethren. They invite or invoke, at a kind of “first level,” an aesthetic experience that is by its nature resistant to restatement in more formalized, theoretical or generalizing language. This response can certainly be enriched by knowledge of context and history, but the objects express a first-person or subjective view of human concerns that is falsified if wholly transposed to a more “sideways on” or third person view.
  • such works also can directly deliver a kind of practical knowledge and self-understanding not available from a third person or more general formulation of such knowledge. There is no reason to think that such knowledge — exemplified in what Aristotle said about the practically wise man (the phronimos) or in what Pascal meant by the difference between l’esprit géometrique and l’esprit de finesse — is any less knowledge because it cannot be so formalized or even taught as such.
Javier E

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P... - 0 views

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. — By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. — They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free — — and therefore I’m responsible for everything I do, a dizzying fact which causes — an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. — Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. — Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating. —
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • way of this interpretation. Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their view to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.
cvanderloo

Germany Says Not Enough Vaccine Available To Stop Its 3rd Wave : Coronavirus Updates : NPR - 1 views

  • German Health Minister Jens Spahn is telling Germans to diligently follow coronavirus safety rules, warning that vaccines won't arrive quickly enough to prevent a third wave of the COVID-19 pandemic.
  • "Even if the deliveries from EU orders come reliably, it will still take a few weeks until the risk groups are fully vaccinated."
  • Germany's infection rate is rising at a pace not seen since the record spike it endured in December and January.
  • ...4 more annotations...
  • With Germany set for a four-day-weekend in early April due to the Easter holiday, Spahn said the country isn't ready to relax travel and physical distancing rules. In fact, he said, Germans should be prepared to revert to tighter restrictions.
  • "Health experts are calling on the German government to order a third lockdown to prevent hospitals from being overrun," NPR's Rob Schmitz reports from Berlin.
  • As it tries to boost its vaccine campaign, Germany is also moving ahead with talks to acquire Russia's Sputnik V vaccine — with or without the rest of the EU's involvement.
  • Spahn said Friday that he believes a deal with Russia could be reached quickly once a delivery amount is agreed upon.
Javier E

Opinion | The 1619 Chronicles - The New York Times - 0 views

  • The 1619 Project introduced a date, previously obscure to most Americans, that ought always to have been thought of as seminal — and probably now will. It offered fresh reminders of the extent to which Black freedom was a victory gained by courageous Black Americans, and not just a gift obtained from benevolent whites.
  • in a point missed by many of the 1619 Project’s critics, it does not reject American values. As Nikole Hannah-Jones, its creator and leading voice, concluded in her essay for the project, “I wish, now, that I could go back to the younger me and tell her that her people’s ancestry started here, on these lands, and to boldly, proudly, draw the stars and those stripes of the American flag.” It’s an unabashedly patriotic thought.
  • ambition can be double-edged. Journalists are, most often, in the business of writing the first rough draft of history, not trying to have the last word on it. We are best when we try to tell truths with a lowercase t, following evidence in directions unseen, not the capital-T truth of a pre-established narrative in which inconvenient facts get discarded
  • ...25 more annotations...
  • on these points — and for all of its virtues, buzz, spinoffs and a Pulitzer Prize — the 1619 Project has failed.
  • That doesn’t mean that the project seeks to erase the Declaration of Independence from history. But it does mean that it seeks to dethrone the Fourth of July by treating American history as a story of Black struggle against white supremacy — of which the Declaration is, for all of its high-flown rhetoric, supposed to be merely a part.
  • The deleted assertions went to the core of the project’s most controversial goal, “to reframe American history by considering what it would mean to regard 1619 as our nation’s birth year.”
  • She then challenged me to find any instance in which the project stated that “using 1776 as our country’s birth date is wrong,” that it “should not be taught to schoolchildren,” and that the only one “that should be taught” was 1619. “Good luck unearthing any of us arguing that,” she added.
  • I emailed her to ask if she could point to any instances before this controversy in which she had acknowledged that her claims about 1619 as “our true founding” had been merely metaphorical. Her answer was that the idea of treating the 1619 date metaphorically should have been so obvious that it went without saying.
  • “1619. It is not a year that most Americans know as a notable date in our country’s history. Those who do are at most a tiny fraction of those who can tell you that 1776 is the year of our nation’s birth. What if, however, we were to tell you that this fact, which is taught in our schools and unanimously celebrated every Fourth of July, is wrong, and that the country’s true birth date, the moment that its defining contradictions first came into the world, was in late August of 1619?”
  • Here is an excerpt from the introductory essay to the project by The New York Times Magazine’s editor, Jake Silverstein, as it appeared in print in August 2019 (italics added):
  • In his introduction, Silverstein argues that America’s “defining contradictions” were born in August 1619, when a ship carrying 20 to 30 enslaved Africans from what is present-day Angola arrived in Point Comfort, in the English colony of Virginia. And the title page of Hannah-Jones’s essay for the project insists that “our founding ideals of liberty and equality were false when they were written.”
  • What was surprising was that in 1776 a politically formidable “defining contradiction” — “that all men are created equal” — came into existence through the Declaration of Independence. As Abraham Lincoln wrote in 1859, that foundational document would forever serve as a “rebuke and stumbling block to the very harbingers of reappearing tyranny and oppression.”
  • As for the notion that the Declaration’s principles were “false” in 1776, ideals aren’t false merely because they are unrealized, much less because many of the men who championed them, and the nation they created, hypocritically failed to live up to them.
  • These two flaws led to a third, conceptual, error. “Out of slavery — and the anti-Black racism it required — grew nearly everything that has truly made America exceptional,” writes Silverstein.
  • Nearly everything? What about, say, the ideas contained by the First Amendment? Or the spirit of openness that brought millions of immigrants through places like Ellis Island? Or the enlightened worldview of the Marshall Plan and the Berlin airlift? Or the spirit of scientific genius and discovery exemplified by the polio vaccine and the moon landing?
  • On the opposite side of the moral ledger, to what extent does anti-Black racism figure in American disgraces such as the brutalization of Native Americans, the Chinese Exclusion Act or the internment of Japanese-Americans in World War II?
  • The world is complex. So are people and their motives. The job of journalism is to take account of that complexity, not simplify it out of existence through the adoption of some ideological orthodoxy.
  • This mistake goes far to explain the 1619 Project’s subsequent scholarly and journalistic entanglements. It should have been enough to make strong yet nuanced claims about the role of slavery and racism in American history. Instead, it issued categorical and totalizing assertions that are difficult to defend on close examination.
  • It should have been enough for the project to serve as curator for a range of erudite and interesting voices, with ample room for contrary takes. Instead, virtually every writer in the project seems to sing from the same song sheet, alienating other potential supporters of the project and polarizing national debate.
  • James McPherson, the Pulitzer Prize-winning author of “Battle Cry of Freedom” and a past president of the American Historical Association. He was withering: “Almost from the outset,” McPherson told the World Socialist Web Site, “I was disturbed by what seemed like a very unbalanced, one-sided account, which lacked context and perspective.”
  • In particular, McPherson objected to Hannah-Jones’s suggestion that the struggle against slavery and racism and for civil rights and democracy was, if not exclusively then mostly, a Black one. As she wrote in her essay: “The truth is that as much democracy as this nation has today, it has been borne on the backs of Black resistance.”
  • McPherson demurs: “From the Quakers in the 18th century, on through the abolitionists in the antebellum, to the Radical Republicans in the Civil War and Reconstruction, to the N.A.A.C.P., which was an interracial organization founded in 1909, down through the civil rights movements of the 1950s and 1960s, there have been a lot of whites who have fought against slavery and racial discrimination, and against racism,” he said. “And that’s what’s missing from this perspective.”
  • Wilentz’s catalog of the project’s mistakes is extensive. Hannah-Jones’s essay claimed that by 1776 Britain was “deeply conflicted” over its role in slavery. But despite the landmark Somerset v. Stewart court ruling in 1772, which held that slavery was not supported by English common law, it remained deeply embedded in the practices of the British Empire. The essay claimed that, among Londoners, “there were growing calls to abolish the slave trade” by 1776. But the movement to abolish the British slave trade only began about a decade later — inspired, in part, Wilentz notes, by American antislavery agitation that had started in the 1760s and 1770s.
  • Leslie M. Harris, an expert on pre-Civil War African-American life and slavery. “On Aug. 19 of last year,” Harris wrote, “I listened in stunned silence as Nikole Hannah-Jones … repeated an idea that I had vigorously argued against with her fact checker: that the patriots fought the American Revolution in large part to preserve slavery in North America.”
  • The larger problem is that The Times’s editors, however much background reading they might have done, are not in a position to adjudicate historical disputes. That should have been an additional reason for the 1619 Project to seek input from, and include contributions by, an intellectually diverse range of scholarly voices. Yet not only does the project choose a side, it also brooks no doubt.
  • “It is finally time to tell our story truthfully,” the magazine declares on its 1619 cover page. Finally? Truthfully? Is The Times suggesting that distinguished historians, like the ones who have seriously disputed aspects of the project, had previously been telling half-truths or falsehoods?
  • unlike other dates, 1776 uniquely marries letter and spirit, politics and principle: The declaration that something new is born, combined with the expression of an ideal that — because we continue to believe in it even as we struggle to live up to it — binds us to the date.
  • On the other, the 1619 Project has become, partly by its design and partly because of avoidable mistakes, a focal point of the kind of intense national debate that columnists are supposed to cover, and that is being widely written about outside The Times. To avoid writing about it on account of the first scruple is to be derelict in our responsibility toward the second.
knudsenlu

How World War II Spurred a Veteran's Ambition - The Atlantic - 0 views

  • My first 18 months of military service were uninspiring. Donning the uniform did not fill me with pride, nor did the experience alter my perspective on life. What basic training had taught me was that the best way to get by was to stay out of sight.
  • Accordingly, my detachment expanded its mission: all males—soldiers and others of military serviceable age, no matter where encountered or whether in uniform—were to be taken prisoner unless they had persuasive evidence of having been either exempted or discharged from military service. Anyone without such proof was considered a potential guerrilla. A sweep of the countryside would yield scores of German “civilians,” among them soldiers who had simply shed their uniforms, or party activists suspected of organizing a resistance, usually with cover stories that I had to break. Not only did I become very adept at this task, but it also gave me some great insights into postwar German mentalities—insights that would later inspire me to revisit my views on higher education.
  • Wilke and his family had emigrated from Germany in the early 1920s, and he still spoke German with a genuine Saxon dialect.
  • As soon as I was released from the hospital, I worked my way step-by-step from France back to Berlin, the capital of the four occupying powers. After some stumbling blocks and a job I despised, I found a position as a research analyst in the intelligence branch of the military government’s information-control division.
  • This job turned out to be extremely challenging, but that also made it a real blessing. I wrote drafts on a wide range of political topics, including the identification of potential political leaders not yet recognized, a catalog of the rumors circulating among the population, incidents indicative of how people felt about American troops, and the dominant mood among German youth. We gleaned information from reports compiled by field representatives stationed in roughly a dozen communities throughout the American occupation zone, supplemented with details from German newspapers—and, in my case, with insights based on contacts and conversations I had whenever I passed myself off as a German civilian.
  • That is how my war and post-war service induced me, in the fall of 1947, after an interlude of more than five years, to enroll in the University of Chicago as a freshman. I stayed for six years, left with a Ph.D., and ever since enjoyed a long academic career as a sociologist, first specializing in the military and then studying propaganda and the effect of television on politics.
  • World War II spurred my ambition by teaching me how to navigate the army. Those lessons led me to confront the society I had once known so well, and to study politics and people living in a time of upheaval.
tongoscar

Turkey, Russia and the Libyan conundrum | Middle East | Al Jazeera - 0 views

  • The UAE, Egypt and to a certain extent France and Russia have all been supporting renegade military commander Khalifa Haftar, who since 2014 has been trying to take power in Libya through military force. The military operation he launched in April 2019 to take over the capital, Tripoli, has further complicated the situation.
  • Currently, Turkey and Russia seem to be trying to work together to resolve the conflict, as they have become the two key international players in Libya. The two managed to broker a ceasefire between the GNA and Haftar's forces which came into effect on January 12.
  • Whether Italy and France can use the Berlin conference to seize back the initiative in the Libyan conflict from Turkey and Russia remains to be seen. For now, the shift in dynamics on the ground has established Ankara and Moscow as the main powerbrokers.
  • The Turkish military presence in Libya will mean that the GNA is in a much stronger position to survive the onslaught by Haftar.
  • The deal struck between Erdogan and Putin ultimately means that both sides in Libya will need to compromise and realise that there is no way for either side to achieve a victory through military means.
tongoscar

Germans divided over plans for Tesla electric car factory | World news | The Guardian - 0 views

  • German environmentalists and political leaders are at loggerheads over a proposed Tesla electric car factory in woodland outside Berlin, with the government warning over the future of a project seen as key for its support of green technologies and regeneration in the east of the country.
  • The environmentalist group Grüne Liga Brandenburg (Green League Brandenburg) said the US company was being given “preferential treatment” and should not be allowed to start felling trees until it had been granted full building permission.
  • “To fell half of the forest when many aspects of this process are yet to be clarified seems fairly problematic, which is why we have asked the court to deal with it,” said Heinz-Herwig Mascher of Grüne Liga. “It is not that we have something as such against Tesla as a company or its objectives. But we are concerned the preferential treatment they’re being given could set a precedent.”
  • The project’s future is far from certain. It was hailed as an economic breakthrough for the underdeveloped region when it was announced by Tesla’s chief executive, Elon Musk, in November.
  • Grünheide residents are divided over the issue. The town, home in the 1920s to a Bakelite plastics factory, has repeatedly missed out on any big economic success in the decades since reunification.
  • “Of course you always have to weigh up the economic interests with those of environmental protection,” he told Die Zeit. “It’s important to involve the locals in the discussion from an early stage, and that simply did not happen here. The people are supposed to feel blessed by the fact that Tesla is coming and bringing many jobs with it.”