Home / TOK Friends / Group items tagged immersion

Javier E

Why Silicon Valley can't fix itself | News | The Guardian - 1 views

  • After decades of rarely apologising for anything, Silicon Valley suddenly seems to be apologising for everything. They are sorry about the trolls. They are sorry about the bots. They are sorry about the fake news and the Russians, and the cartoons that are terrifying your kids on YouTube. But they are especially sorry about our brains.
  • Sean Parker, the former president of Facebook – who was played by Justin Timberlake in The Social Network – has publicly lamented the “unintended consequences” of the platform he helped create: “God only knows what it’s doing to our children’s brains.”
  • Parker, Rosenstein and the other insiders now talking about the harms of smartphones and social media belong to an informal yet influential current of tech critics emerging within Silicon Valley. You could call them the “tech humanists”. Amid rising public concern about the power of the industry, they argue that the primary problem with its products is that they threaten our health and our humanity.
  • ...52 more annotations...
  • It is clear that these products are designed to be maximally addictive, in order to harvest as much of our attention as they can. Tech humanists say this business model is both unhealthy and inhumane – that it damages our psychological well-being and conditions us to behave in ways that diminish our humanity
  • The main solution that they propose is better design. By redesigning technology to be less addictive and less manipulative, they believe we can make it healthier – we can realign technology with our humanity and build products that don’t “hijack” our minds.
  • its most prominent spokesman is executive director Tristan Harris, a former “design ethicist” at Google who has been hailed by the Atlantic magazine as “the closest thing Silicon Valley has to a conscience”. Harris has spent years trying to persuade the industry of the dangers of tech addiction.
  • In February, Pierre Omidyar, the billionaire founder of eBay, launched a related initiative: the Tech and Society Solutions Lab, which aims to “maximise the tech industry’s contributions to a healthy society”.
  • the tech humanists are making a bid to become tech’s loyal opposition. They are using their insider credentials to promote a particular diagnosis of where tech went wrong and of how to get it back on track
  • The real reason tech humanism matters is because some of the most powerful people in the industry are starting to speak its idiom. Snap CEO Evan Spiegel has warned about social media’s role in encouraging “mindless scrambles for friends or unworthy distractions”,
  • In short, the effort to humanise computing produced the very situation that the tech humanists now consider dehumanising: a wilderness of screens where digital devices chase every last instant of our attention.
  • After years of ignoring their critics, industry leaders are finally acknowledging that problems exist. Tech humanists deserve credit for drawing attention to one of those problems – the manipulative design decisions made by Silicon Valley.
  • these decisions are only symptoms of a larger issue: the fact that the digital infrastructures that increasingly shape our personal, social and civic lives are owned and controlled by a few billionaires
  • Because it ignores the question of power, the tech-humanist diagnosis is incomplete – and could even help the industry evade meaningful reform
  • Taken up by leaders such as Zuckerberg, tech humanism is likely to result in only superficial changes
  • they will not address the origin of that anger. If anything, they will make Silicon Valley even more powerful.
  • To the litany of problems caused by “technology that extracts attention and erodes society”, the text asserts that “humane design is the solution”. Drawing on the rhetoric of the “design thinking” philosophy that has long suffused Silicon Valley, the website explains that humane design “starts by understanding our most vulnerable human instincts so we can design compassionately”
  • this language is not foreign to Silicon Valley. On the contrary, “humanising” technology has long been its central ambition and the source of its power. It was precisely by developing a “humanised” form of computing that entrepreneurs such as Steve Jobs brought computing into millions of users’ everyday lives
  • Facebook had a new priority: maximising “time well spent” on the platform, rather than total time spent. By “time well spent”, Zuckerberg means time spent interacting with “friends” rather than businesses, brands or media sources. He said the News Feed algorithm was already prioritising these “more meaningful” activities.
  • They believe we can use better design to make technology serve human nature rather than exploit and corrupt it. But this idea is drawn from the same tradition that created the world that tech humanists believe is distracting and damaging us.
  • Tech humanists say they want to align humanity and technology. But this project is based on a deep misunderstanding of the relationship between humanity and technology: namely, the fantasy that these two entities could ever exist in separation.
  • The story of our species began when we began to make tools
  • All of which is to say: humanity and technology are not only entangled, they constantly change together.
  • This is not just a metaphor. Recent research suggests that the human hand evolved to manipulate the stone tools that our ancestors used
  • The ways our bodies and brains change in conjunction with the tools we make have long inspired anxieties that “we” are losing some essential qualities
  • Yet as we lose certain capacities, we gain new ones.
  • The nature of human nature is that it changes. It cannot, therefore, serve as a stable basis for evaluating the impact of technology
  • Yet the assumption that it doesn’t change serves a useful purpose. Treating human nature as something static, pure and essential elevates the speaker into a position of power. Claiming to tell us who we are, they tell us how we should be.
  • Holding humanity and technology separate clears the way for a small group of humans to determine the proper alignment between them
  • Harris and his fellow tech humanists also frequently invoke the language of public health. The Center for Humane Technology’s Roger McNamee has gone so far as to call public health “the root of the whole thing”, and Harris has compared using Snapchat to smoking cigarettes
  • The public-health framing casts the tech humanists in a paternalistic role. Resolving a public health crisis requires public health expertise. It also precludes the possibility of democratic debate. You don’t put the question of how to treat a disease up for a vote – you call a doctor.
  • They also remain confined to the personal level, aiming to redesign how the individual user interacts with technology rather than tackling the industry’s structural failures. Tech humanism fails to address the root cause of the tech backlash: the fact that a small handful of corporations own our digital lives and strip-mine them for profit.
  • This is a fundamentally political and collective issue. But by framing the problem in terms of health and humanity, and the solution in terms of design, the tech humanists personalise and depoliticise it.
  • Far from challenging Silicon Valley, tech humanism offers Silicon Valley a useful way to pacify public concerns without surrendering any of its enormous wealth and power.
  • these principles could make Facebook even more profitable and powerful, by opening up new business opportunities. That seems to be exactly what Facebook has planned.
  • reported that total time spent on the platform had dropped by around 5%, or about 50m hours per day. But, Zuckerberg said, this was by design: in particular, it was in response to tweaks to the News Feed that prioritised “meaningful” interactions with “friends” rather than consuming “public content” like video and news. This would ensure that “Facebook isn’t just fun, but also good for people’s well-being”
  • Zuckerberg said he expected those changes would continue to decrease total time spent – but “the time you do spend on Facebook will be more valuable”. This may describe what users find valuable – but it also refers to what Facebook finds valuable
  • not all data is created equal. One of the most valuable sources of data to Facebook is used to inform a metric called “coefficient”. This measures the strength of a connection between two users – Zuckerberg once called it “an index for each relationship”
  • Facebook records every interaction you have with another user – from liking a friend’s post or viewing their profile, to sending them a message. These activities provide Facebook with a sense of how close you are to another person, and different activities are weighted differently.
  • Messaging, for instance, is considered the strongest signal. It’s reasonable to assume that you’re closer to somebody you exchange messages with than somebody whose post you once liked.
  • Why is coefficient so valuable? Because Facebook uses it to create a Facebook they think you will like: it guides algorithmic decisions about what content you see and the order in which you see it. It also helps improve ad targeting, by showing you ads for things liked by friends with whom you often interact
  • emphasising time well spent means creating a Facebook that prioritises data-rich personal interactions that Facebook can use to make a more engaging platform.
  • “time well spent” means Facebook can monetise more efficiently. It can prioritise the intensity of data extraction over its extensiveness. This is a wise business move, disguised as a concession to critics
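The "coefficient" metric described in the annotations above — a per-pair affinity score built from differently weighted interactions, with messaging as the strongest signal — can be sketched as a simple weighted sum. The interaction types and weights below are illustrative assumptions for the sake of the example, not Facebook's actual values:

```python
# Hypothetical sketch of a "coefficient"-style affinity score between two
# users. Each interaction type carries a weight; messaging is weighted
# highest, a single like lowest. All names and numbers are assumptions.
INTERACTION_WEIGHTS = {
    "message": 5.0,
    "comment": 3.0,
    "profile_view": 1.5,
    "like": 1.0,
}

def coefficient(interactions):
    """Sum weighted interactions between a pair of users.

    `interactions` is a list of (interaction_type, count) pairs.
    Unknown interaction types contribute nothing.
    """
    return sum(
        INTERACTION_WEIGHTS.get(kind, 0.0) * count
        for kind, count in interactions
    )

# A pair who message often outscores a pair who only trade likes,
# which is the sense in which "intensity" of data beats "extensiveness".
close = coefficient([("message", 10), ("like", 3)])   # 53.0
casual = coefficient([("like", 8)])                   # 8.0
```

The point of the sketch is only the shape of the mechanism: data-rich personal interactions move the score far more than passive consumption does, which is why prioritising them serves both "time well spent" and ad targeting.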
  • industrialists had to find ways to make the time of the worker more valuable – to extract more money from each moment rather than adding more moments. They did this by making industrial production more efficient: developing new technologies and techniques that squeezed more value out of the worker and stretched that value further than ever before.
  • there is another way of thinking about how to live with technology – one that is both truer to the history of our species and useful for building a more democratic future. This tradition does not address “humanity” in the abstract, but as distinct human beings, whose capacities are shaped by the tools they use.
  • It sees us as hybrids of animal and machine – as “cyborgs”, to quote the biologist and philosopher of science Donna Haraway.
  • The cyborg way of thinking, by contrast, tells us that our species is essentially technological. We change as we change our tools, and our tools change us. But even though our continuous co-evolution with our machines is inevitable, the way it unfolds is not. Rather, it is determined by who owns and runs those machines. It is a question of power
  • The various scandals that have stoked the tech backlash all share a single source. Surveillance, fake news and the miserable working conditions in Amazon’s warehouses are profitable. If they were not, they would not exist. They are symptoms of a profound democratic deficit inflicted by a system that prioritises the wealth of the few over the needs and desires of the many.
  • If being technological is a feature of being human, then the power to shape how we live with technology should be a fundamental human right
  • The decisions that most affect our technological lives are far too important to be left to Mark Zuckerberg, rich investors or a handful of “humane designers”. They should be made by everyone, together.
  • Rather than trying to humanise technology, then, we should be trying to democratise it. We should be demanding that society as a whole gets to decide how we live with technology
  • What does this mean in practice? First, it requires limiting and eroding Silicon Valley’s power.
  • Antitrust laws and tax policy offer useful ways to claw back the fortunes Big Tech has built on common resources
  • democratic governments should be making rules about how those firms are allowed to behave – rules that restrict how they can collect and use our personal data, for instance, like the General Data Protection Regulation
  • This means developing publicly and co-operatively owned alternatives that empower workers, users and citizens to determine how they are run.
  • we might demand that tech firms pay for the privilege of extracting our data, so that we can collectively benefit from a resource we collectively create.
Javier E

'Our minds can be hijacked': the tech insiders who fear a smartphone dystopia | Technol... - 0 views

  • Rosenstein belongs to a small but growing band of Silicon Valley heretics who complain about the rise of the so-called “attention economy”: an internet shaped around the demands of an advertising economy.
  • “It is very common,” Rosenstein says, “for humans to develop things with the best of intentions and for them to have unintended, negative consequences.”
  • most concerned about the psychological effects on people who, research shows, touch, swipe or tap their phone 2,617 times a day.
  • ...43 more annotations...
  • There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity – even when the device is turned off. “Everyone is distracted,” Rosenstein says. “All of the time.”
  • Drawing a straight line between addiction to social media and political earthquakes like Brexit and the rise of Donald Trump, they contend that digital forces have completely upended the political system and, left unchecked, could even render democracy as we know it obsolete.
  • Without irony, Eyal finished his talk with some personal tips for resisting the lure of technology. He told his audience he uses a Chrome extension, called DF YouTube, “which scrubs out a lot of those external triggers” he writes about in his book, and recommended an app called Pocket Points that “rewards you for staying off your phone when you need to focus”.
  • “One reason I think it is particularly important for us to talk about this now is that we may be the last generation that can remember life before,” Rosenstein says. It may or may not be relevant that Rosenstein, Pearlman and most of the tech insiders questioning today’s attention economy are in their 30s, members of the last generation that can remember a world in which telephones were plugged into walls.
  • One morning in April this year, designers, programmers and tech entrepreneurs from across the world gathered at a conference centre on the shore of the San Francisco Bay. They had each paid up to $1,700 to learn how to manipulate people into habitual use of their products, on a course curated by conference organiser Nir Eyal.
  • Eyal, 39, the author of Hooked: How to Build Habit-Forming Products, has spent several years consulting for the tech industry, teaching techniques he developed by closely studying how the Silicon Valley giants operate.
  • “The technologies we use have turned into compulsions, if not full-fledged addictions,” Eyal writes. “It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.” None of this is an accident, he writes. It is all “just as their designers intended”
  • He explains the subtle psychological tricks that can be used to make people develop habits, such as varying the rewards people receive to create “a craving”, or exploiting negative emotions that can act as “triggers”. “Feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation,” Eyal writes.
  • The most seductive design, Harris explains, exploits the same psychological susceptibility that makes gambling so compulsive: variable rewards. When we tap those apps with red icons, we don’t know whether we’ll discover an interesting email, an avalanche of “likes”, or nothing at all. It is the possibility of disappointment that makes it so compulsive.
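The variable-reward mechanic Harris describes — tap the red icon and you might find an interesting email, an avalanche of "likes", or nothing at all — can be illustrated with a toy random-outcome check. The outcomes and probabilities below are invented for the example; the unpredictability itself is the point:

```python
import random

# Toy "slot machine" notification check: each refresh yields an
# unpredictable outcome, which is what makes the behaviour compulsive.
# Outcomes and probabilities are made up for illustration.
OUTCOMES = [
    ("nothing at all", 0.6),
    ("an interesting email", 0.25),
    ("an avalanche of likes", 0.15),
]

def check_notifications(rng=random.random):
    """Return a random outcome, weighted by the probabilities above.

    `rng` is a zero-argument callable returning a float in [0, 1);
    it is a parameter so the behaviour can be made deterministic.
    """
    roll = rng()
    cumulative = 0.0
    for outcome, probability in OUTCOMES:
        cumulative += probability
        if roll < cumulative:
            return outcome
    return OUTCOMES[-1][0]
```

On most checks the user gets nothing, but the occasional intermittent payoff keeps the checking habit alive — the same variable-ratio schedule that makes gambling compulsive.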
  • Finally, Eyal confided the lengths he goes to protect his own family. He has installed in his house an outlet timer connected to a router that cuts off access to the internet at a set time every day. “The idea is to remember that we are not powerless,” he said. “We are in control.
  • But are we? If the people who built these technologies are taking such radical steps to wean themselves free, can the rest of us reasonably be expected to exercise our free will?
  • Not according to Tristan Harris, a 33-year-old former Google employee turned vocal critic of the tech industry. “All of us are jacked into this system,” he says. “All of our minds can be hijacked. Our choices are not as free as we think they are.”
  • Harris, who has been branded “the closest thing Silicon Valley has to a conscience”, insists that billions of people have little choice over whether they use these now ubiquitous technologies, and are largely unaware of the invisible ways in which a small number of people in Silicon Valley are shaping their lives.
  • “I don’t know a more urgent problem than this,” Harris says. “It’s changing our democracy, and it’s changing our ability to have the conversations and relationships that we want with each other.” Harris went public – giving talks, writing papers, meeting lawmakers and campaigning for reform after three years struggling to effect change inside Google’s Mountain View headquarters.
  • He explored how LinkedIn exploits a need for social reciprocity to widen its network; how YouTube and Netflix autoplay videos and next episodes, depriving users of a choice about whether or not they want to keep watching; how Snapchat created its addictive Snapstreaks feature, encouraging near-constant communication between its mostly teenage users.
  • The techniques these companies use are not always generic: they can be algorithmically tailored to each person. An internal Facebook report leaked this year, for example, revealed that the company can identify when teens feel “insecure”, “worthless” and “need a confidence boost”. Such granular information, Harris adds, is “a perfect model of what buttons you can push in a particular person”.
  • Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. “There’s no ethics,” he says. A company paying Facebook to use its levers of persuasion could be a car business targeting tailored advertisements to different types of users who want a new vehicle. Or it could be a Moscow-based troll farm seeking to turn voters in a swing county in Wisconsin.
  • It was Rosenstein’s colleague, Leah Pearlman, then a product manager at Facebook and on the team that created the Facebook “like”, who announced the feature in a 2009 blogpost. Now 35 and an illustrator, Pearlman confirmed via email that she, too, has grown disaffected with Facebook “likes” and other addictive feedback loops. She has installed a web browser plug-in to eradicate her Facebook news feed, and hired a social media manager to monitor her Facebook page so that she doesn’t have to.
  • Harris believes that tech companies never deliberately set out to make their products addictive. They were responding to the incentives of an advertising economy, experimenting with techniques that might capture people’s attention, even stumbling across highly effective design by accident.
  • It’s this that explains how the pull-to-refresh mechanism, whereby users swipe down, pause and wait to see what content appears, rapidly became one of the most addictive and ubiquitous design features in modern technology. “Each time you’re swiping down, it’s like a slot machine,” Harris says. “You don’t know what’s coming next. Sometimes it’s a beautiful photo. Sometimes it’s just an ad.”
  • The reality TV star’s campaign, he said, had heralded a watershed in which “the new, digitally supercharged dynamics of the attention economy have finally crossed a threshold and become manifest in the political realm”.
  • “Smartphones are useful tools,” he says. “But they’re addictive. Pull-to-refresh is addictive. Twitter is addictive. These are not good things. When I was working on them, it was not something I was mature enough to think about. I’m not saying I’m mature now, but I’m a little bit more mature, and I regret the downsides.”
  • All of it, he says, is reward-based behaviour that activates the brain’s dopamine pathways. He sometimes finds himself clicking on the red icons beside his apps “to make them go away”, but is conflicted about the ethics of exploiting people’s psychological vulnerabilities. “It is not inherently evil to bring people back to your product,” he says. “It’s capitalism.”
  • He identifies the advent of the smartphone as a turning point, raising the stakes in an arms race for people’s attention. “Facebook and Google assert with merit that they are giving users what they want,” McNamee says. “The same can be said about tobacco companies and drug dealers.”
  • McNamee chooses his words carefully. “The people who run Facebook and Google are good people, whose well-intentioned strategies have led to horrific unintended consequences,” he says. “The problem is that there is nothing the companies can do to address the harm unless they abandon their current advertising models.”
  • But how can Google and Facebook be forced to abandon the business models that have transformed them into two of the most profitable companies on the planet?
  • McNamee believes the companies he invested in should be subjected to greater regulation, including new anti-monopoly rules. In Washington, there is growing appetite, on both sides of the political divide, to rein in Silicon Valley. But McNamee worries the behemoths he helped build may already be too big to curtail.
  • Rosenstein, the Facebook “like” co-creator, believes there may be a case for state regulation of “psychologically manipulative advertising”, saying the moral impetus is comparable to taking action against fossil fuel or tobacco companies. “If we only care about profit maximisation,” he says, “we will go rapidly into dystopia.”
  • James Williams does not believe talk of dystopia is far-fetched. The ex-Google strategist who built the metrics system for the company’s global search advertising business, he has had a front-row view of an industry he describes as the “largest, most standardised and most centralised form of attentional control in human history”.
  • It is a journey that has led him to question whether democracy can survive the new technological age.
  • He says his epiphany came a few years ago, when he noticed he was surrounded by technology that was inhibiting him from concentrating on the things he wanted to focus on. “It was that kind of individual, existential realisation: what’s going on?” he says. “Isn’t technology supposed to be doing the complete opposite of this?
  • That discomfort was compounded during a moment at work, when he glanced at one of Google’s dashboards, a multicoloured display showing how much of people’s attention the company had commandeered for advertisers. “I realised: this is literally a million people that we’ve sort of nudged or persuaded to do this thing that they weren’t going to otherwise do,” he recalls.
  • Williams and Harris left Google around the same time, and co-founded an advocacy group, Time Well Spent, that seeks to build public momentum for a change in the way big tech companies think about design. Williams finds it hard to comprehend why this issue is not “on the front page of every newspaper every day.
  • “Eighty-seven percent of people wake up and go to sleep with their smartphones,” he says. The entire world now has a new prism through which to understand politics, and Williams worries the consequences are profound.
  • “The attention economy incentivises the design of technologies that grab our attention,” he says. “In so doing, it privileges our impulses over our intentions.”
  • That means privileging what is sensational over what is nuanced, appealing to emotion, anger and outrage. The news media is increasingly working in service to tech companies, Williams adds, and must play by the rules of the attention economy to “sensationalise, bait and entertain in order to survive”.
  • It is not just shady or bad actors who were exploiting the internet to change public opinion. The attention economy itself is set up to promote a phenomenon like Trump, who is masterly at grabbing and retaining the attention of supporters and critics alike, often by exploiting or creating outrage.
  • All of which has left Brichter, who has put his design work on the backburner while he focuses on building a house in New Jersey, questioning his legacy. “I’ve spent many hours and weeks and months and years thinking about whether anything I’ve done has made a net positive impact on society or humanity at all,” he says. He has blocked certain websites, turned off push notifications, restricted his use of the Telegram app to message only with his wife and two close friends, and tried to wean himself off Twitter. “I still waste time on it,” he confesses, “just reading stupid news I already know about.” He charges his phone in the kitchen, plugging it in at 7pm and not touching it until the next morning.
  • He stresses these dynamics are by no means isolated to the political right: they also play a role, he believes, in the unexpected popularity of leftwing politicians such as Bernie Sanders and Jeremy Corbyn, and the frequent outbreaks of internet outrage over issues that ignite fury among progressives.
  • All of which, Williams says, is not only distorting the way we view politics but, over time, may be changing the way we think, making us less rational and more impulsive. “We’ve habituated ourselves into a perpetual cognitive style of outrage, by internalising the dynamics of the medium,” he says.
  • It was another English science fiction writer, Aldous Huxley, who provided the more prescient observation when he warned that Orwellian-style coercion was less of a threat to democracy than the more subtle power of psychological manipulation, and “man’s almost infinite appetite for distractions”.
  • If the attention economy erodes our ability to remember, to reason, to make decisions for ourselves – faculties that are essential to self-governance – what hope is there for democracy itself?
  • “The dynamics of the attention economy are structurally set up to undermine the human will,” he says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.”
Javier E

Accelerationism: how a fringe philosophy predicted the future we live in | World news |... - 1 views

  • Roger Zelazny, published his third novel. In many ways, Lord of Light was of its time, shaggy with imported Hindu mythology and cosmic dialogue. Yet there were also glints of something more forward-looking and political.
  • accelerationism has gradually solidified from a fictional device into an actual intellectual movement: a new way of thinking about the contemporary world and its potential.
  • Accelerationists argue that technology, particularly computer technology, and capitalism, particularly the most aggressive, global variety, should be massively sped up and intensified – either because this is the best way forward for humanity, or because there is no alternative.
  • ...31 more annotations...
  • Accelerationists favour automation. They favour the further merging of the digital and the human. They often favour the deregulation of business, and drastically scaled-back government. They believe that people should stop deluding themselves that economic and technological progress can be controlled.
  • Accelerationism, therefore, goes against conservatism, traditional socialism, social democracy, environmentalism, protectionism, populism, nationalism, localism and all the other ideologies that have sought to moderate or reverse the already hugely disruptive, seemingly runaway pace of change in the modern world
  • Robin Mackay and Armen Avanessian in their introduction to #Accelerate: The Accelerationist Reader, a sometimes baffling, sometimes exhilarating book, published in 2014, which remains the only proper guide to the movement in existence.
  • “We all live in an operating system set up by the accelerating triad of war, capitalism and emergent AI,” says Steve Goodman, a British accelerationist
  • A century ago, the writers and artists of the Italian futurist movement fell in love with the machines of the industrial era and their apparent ability to invigorate society. Many futurists followed this fascination into war-mongering and fascism.
  • One of the central figures of accelerationism is the British philosopher Nick Land, who taught at Warwick University in the 1990s
  • Land has published prolifically on the internet, not always under his own name, about the supposed obsolescence of western democracy; he has also written approvingly about “human biodiversity” and “capitalistic human sorting” – the pseudoscientific idea, currently popular on the far right, that different races “naturally” fare differently in the modern world; and about the supposedly inevitable “disintegration of the human species” when artificial intelligence improves sufficiently.
  • In our politically febrile times, the impatient, intemperate, possibly revolutionary ideas of accelerationism feel relevant, or at least intriguing, as never before. Noys says: “Accelerationists always seem to have an answer. If capitalism is going fast, they say it needs to go faster. If capitalism hits a bump in the road, and slows down” – as it has since the 2008 financial crisis – “they say it needs to be kickstarted.”
  • On alt-right blogs, Land in particular has become a name to conjure with. Commenters have excitedly noted the connections between some of his ideas and the thinking of both the libertarian Silicon Valley billionaire Peter Thiel and Trump’s iconoclastic strategist Steve Bannon.
  • “In Silicon Valley,” says Fred Turner, a leading historian of America’s digital industries, “accelerationism is part of a whole movement which is saying, we don’t need [conventional] politics any more, we can get rid of ‘left’ and ‘right’, if we just get technology right. Accelerationism also fits with how electronic devices are marketed – the promise that, finally, they will help us leave the material world, all the mess of the physical, far behind.”
  • In 1972, the philosopher Gilles Deleuze and the psychoanalyst Félix Guattari published Anti-Oedipus. It was a restless, sprawling, appealingly ambiguous book, which suggested that, rather than simply oppose capitalism, the left should acknowledge its ability to liberate as well as oppress people, and should seek to strengthen these anarchic tendencies, “to go still further … in the movement of the market … to ‘accelerate the process’”.
  • By the early 90s Land had distilled his reading, which included Deleuze and Guattari and Lyotard, into a set of ideas and a writing style that, to his students at least, were visionary and thrillingly dangerous. Land wrote in 1992 that capitalism had never been properly unleashed, but instead had always been held back by politics, “the last great sentimental indulgence of mankind”. He dismissed Europe as a sclerotic, increasingly marginal place, “the racial trash-can of Asia”. And he saw civilisation everywhere accelerating towards an apocalypse: “Disorder must increase... Any [human] organisation is ... a mere ... detour in the inexorable death-flow.”
  • With the internet becoming part of everyday life for the first time, and capitalism seemingly triumphant after the collapse of communism in 1989, a belief that the future would be almost entirely shaped by computers and globalisation – the accelerated “movement of the market” that Deleuze and Guattari had called for two decades earlier – spread across British and American academia and politics during the 90s. The Warwick accelerationists were in the vanguard.
  • In the US, confident, rainbow-coloured magazines such as Wired promoted what became known as “the Californian ideology”: the optimistic claim that human potential would be unlocked everywhere by digital technology. In Britain, this optimism influenced New Labour
  • The Warwick accelerationists saw themselves as participants, not traditional academic observers
  • The CCRU gang formed reading groups and set up conferences and journals. They squeezed into the narrow CCRU room in the philosophy department and gave each other impromptu seminars.
  • The main result of the CCRU’s frantic, promiscuous research was a conveyor belt of cryptic articles, crammed with invented terms, sometimes speculative to the point of being fiction.
  • At Warwick, however, the prophecies were darker. “One of our motives,” says Plant, “was precisely to undermine the cheery utopianism of the 90s, much of which seemed very conservative” – an old-fashioned male desire for salvation through gadgets, in her view.
  • K-punk was written by Mark Fisher, formerly of the CCRU. The blog retained some Warwick traits, such as quoting reverently from Deleuze and Guattari, but it gradually shed the CCRU’s aggressive rhetoric and pro-capitalist politics for a more forgiving, more left-leaning take on modernity. Fisher increasingly felt that capitalism was a disappointment to accelerationists, with its cautious, entrenched corporations and endless cycles of essentially the same products. But he was also impatient with the left, which he thought was ignoring new technology
  • In 2013, Nick Srnicek and Alex Williams co-wrote a Manifesto for an Accelerationist Politics. “Capitalism has begun to constrain the productive forces of technology,” they wrote. “[Our version of] accelerationism is the basic belief that these capacities can and should be let loose … repurposed towards common ends … towards an alternative modernity.”
  • What that “alternative modernity” might be was barely, but seductively, sketched out, with fleeting references to reduced working hours, to technology being used to reduce social conflict rather than exacerbate it, and to humanity moving “beyond the limitations of the earth and our own immediate bodily forms”. On politics and philosophy blogs from Britain to the US and Italy, the notion spread that Srnicek and Williams had founded a new political philosophy: “left accelerationism”.
  • Two years later, in 2015, they expanded the manifesto into a slightly more concrete book, Inventing the Future. It argued for an economy based as far as possible on automation, with the jobs, working hours and wages lost replaced by a universal basic income. The book attracted more attention than a speculative leftwing work had for years, with interest and praise from intellectually curious leftists
  • Even the thinking of the arch-accelerationist Nick Land, who is 55 now, may be slowing down. Since 2013, he has become a guru for the US-based far-right movement neoreaction, or NRx as it often calls itself. Neoreactionaries believe in the replacement of modern nation-states, democracy and government bureaucracies by authoritarian city states, which on neoreaction blogs sound as much like idealised medieval kingdoms as they do modern enclaves such as Singapore.
  • Land argues now that neoreaction, like Trump and Brexit, is something that accelerationists should support, in order to hasten the end of the status quo.
  • In 1970, the American writer Alvin Toffler, an exponent of accelerationism’s more playful intellectual cousin, futurology, published Future Shock, a book about the possibilities and dangers of new technology. Toffler predicted the imminent arrival of artificial intelligence, cryonics, cloning and robots working behind airline check-in desks
  • Land left Britain. He moved to Taiwan “early in the new millennium”, he told me, then to Shanghai “a couple of years later”. He still lives there now.
  • In a 2004 article for the Shanghai Star, an English-language paper, he described the modern Chinese fusion of Marxism and capitalism as “the greatest political engine of social and economic development the world has ever known”
  • Once he lived there, Land told me, he realised that “to a massive degree” China was already an accelerationist society: fixated by the future and changing at speed. Presented with the sweeping projects of the Chinese state, his previous, libertarian contempt for the capabilities of governments fell away
  • Without a dynamic capitalism to feed off, as Deleuze and Guattari had in the early 70s, and the Warwick philosophers had in the 90s, it may be that accelerationism just races up blind alleys. In his 2014 book about the movement, Malign Velocities, Benjamin Noys accuses it of offering “false” solutions to current technological and economic dilemmas. With accelerationism, he writes, a breakthrough to a better future is “always promised and always just out of reach”.
  • “The pace of change accelerates,” concluded a documentary version of the book, with a slightly hammy voiceover by Orson Welles. “We are living through one of the greatest revolutions in history – the birth of a new civilisation.”
  • Shortly afterwards, the 1973 oil crisis struck. World capitalism did not accelerate again for almost a decade. For much of the “new civilisation” Toffler promised, we are still waiting
blythewallick

With 250 babies born each minute, how many people can the Earth sustain? | Lucy Lamble ... - 0 views

  • but UN data suggests there were about a billion people in 1800, 2 billion in 1927, 5 billion in 1987 and just over 7.5 billion today.
  • Since the 1960s, more boys than girls have been born every year. About 117 million women are believed to be “missing” in Asia and eastern Europe – due to discriminatory son preference and gender-biased sex selection.
  • Over the last 30 years, some regions have seen up to 25% more male births than female births, reflecting the persistent low status of women and girls.
  • ...13 more annotations...
  • Experts like Paul Ehrlich argue that the population of the world has long since surpassed optimal levels, though critics counter that consumption is as important as population levels.
  • one-third of all people – almost 4 billion – will be African.
  • By that year, there will be more Nigerians than Americans.
  • If birthrates have fallen so far, why is the population still rising fast?
  • In the pre-modern era, fertility rates of 4.5 to 7 children per woman were common. At that time, high mortality rates of young people kept population growth low.
  • The level of education in a society – of women in particular – is one of the most important predictors for the number of children families have.
  • People are living longer
  • which means that one child in 13 dies before their fifth birthday.
  • This compares with six for every 1,000 in Europe and northern America and four for every 1,000 in Australia and New Zealand.
  • A consequence of falling child mortality but continuing high fertility is a “youth bulge” – a high population of young people.
  • But ageing populations can be a cause for celebration. It means development has taken place.
  • What next?
  • Family planning organisations are learning that to survive political shifts and budget cuts, they need to diversify their sources of funding. This means seeing family planning as not just a public health concern but also about development and a clear return on investment.
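The milestones quoted above (about a billion people in 1800, 2 billion in 1927, 5 billion in 1987, just over 7.5 billion today) can be turned into compound annual growth rates with a few lines of Python. This is a back-of-envelope sketch; taking “today” as 2017 is an assumption for illustration:

```python
# UN milestones quoted above: (year, population in billions).
# Treating "today" as 2017 is an assumption for illustration.
milestones = [(1800, 1.0), (1927, 2.0), (1987, 5.0), (2017, 7.5)]

# Compound annual growth rate between each pair of milestones.
for (y0, p0), (y1, p1) in zip(milestones, milestones[1:]):
    rate = (p1 / p0) ** (1 / (y1 - y0)) - 1
    print(f"{y0}-{y1}: {rate:.2%} per year")
```

The rates climb from roughly half a percent a year in the 19th century to about 1.5% by the late 20th, then ease back – which is why the population keeps rising even as birthrates fall: growth has slowed, but from a far larger base.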
Javier E

The Psychopath Makeover - The Chronicle Review - The Chronicle of Higher Education - 0 views

  • The eminent criminal psychologist and creator of the widely used Psychopathy Checklist paused before answering. "I think, in general, yes, society is becoming more psychopathic," he said. "I mean, there's stuff going on nowadays that we wouldn't have seen 20, even 10 years ago. Kids are becoming anesthetized to normal sexual behavior by early exposure to pornography on the Internet. Rent-a-friend sites are getting more popular on the Web, because folks are either too busy or too techy to make real ones. ... The recent hike in female criminality is particularly revealing. And don't even get me started on Wall Street."
  • in a survey that has so far tested 14,000 volunteers, Sara Konrath and her team at the University of Michigan's Institute for Social Research have found that college students' self-reported empathy levels (as measured by the Interpersonal Reactivity Index, a standardized questionnaire containing such items as "I often have tender, concerned feelings for people less fortunate than me" and "I try to look at everybody's side of a disagreement before I make a decision") have been in steady decline over the past three decades—since the inauguration of the scale, in fact, back in 1979. A particularly pronounced slump has been observed over the past 10 years. "College kids today are about 40 percent lower in empathy than their counterparts of 20 or 30 years ago," Konrath reports.
  • Imagining, it would seem, really does make it so. Whenever we read a story, our level of engagement is such that we "mentally simulate each new situation encountered in a narrative," according to one of the researchers, Nicole Speer. Our brains then interweave these newly encountered situations with knowledge and experience gleaned from our own lives to create an organic mosaic of dynamic mental syntheses.
  • ...16 more annotations...
  • during this same period, students' self-reported narcissism levels have shot through the roof. "Many people see the current group of college students, sometimes called 'Generation Me,' " Konrath continues, "as one of the most self-centered, narcissistic, competitive, confident, and individualistic in recent history."
  • Reading a book carves brand-new neural pathways into the ancient cortical bedrock of our brains. It transforms the way we see the world—makes us, as Nicholas Carr puts it in his recent essay, "The Dreams of Readers," "more alert to the inner lives of others." We become vampires without being bitten—in other words, more empathic. Books make us see in a way that casual immersion in the Internet, and the quicksilver virtual world it offers, doesn't.
  • if society really is becoming more psychopathic, it's not all doom and gloom. In the right context, certain psychopathic characteristics can actually be very constructive. A neurosurgeon I spoke with (who rated high on the psychopathic spectrum) described the mind-set he enters before taking on a difficult operation as "an intoxication that sharpens rather than dulls the senses." In fact, in any kind of crisis, the most effective individuals are often those who stay calm—who are able to respond to the exigencies of the moment while at the same time maintaining the requisite degree of detachment.
  • mental toughness isn't the only characteristic that Special Forces soldiers have in common with psychopaths. There's also fearlessness.
  • I ask Andy whether he ever felt any regret over anything he'd done. Over the lives he'd taken on his numerous secret missions around the world. "No," he replies matter-of-factly, his arctic-blue eyes showing not the slightest trace of emotion. "You seriously don't think twice about it. When you're in a hostile situation, the primary objective is to pull the trigger before the other guy pulls the trigger. And when you pull it, you move on. Simple as that. Why stand there, dwelling on what you've done? Go down that route and chances are the last thing that goes through your head will be a bullet from an M16. "The regiment's motto is 'Who Dares Wins.' But sometimes it can be shortened to 'F--- It.' "
  • one of the things that we know about psychopaths is that the light switches of their brains aren't wired up in quite the same way as the rest of ours are—and that one area particularly affected is the amygdala, a peanut-size structure located right at the center of the circuit board. The amygdala is the brain's emotion-control tower. It polices our emotional airspace and is responsible for the way we feel about things. But in psychopaths, a section of this airspace, the part that corresponds to fear, is empty.
  • Turn down the signals to the amygdala, of course, and you're well on the way to giving someone a psychopath makeover. Indeed, Liane Young and her team in Boston have since kicked things up a notch and demonstrated that applying TMS to the right temporoparietal junction—a neural ZIP code within that neighborhood—has significant effects not just on lying ability but also on moral-reasoning ability: in particular, ascribing intentionality to others' actions.
  • at an undisclosed moment sometime within the next 60 seconds, the image you see at the present time will change, and images of a different nature will appear on the screen. These images will be violent. And nauseating. And of a graphic and disturbing nature. "As you view these images, changes in your heart rate, skin conductance, and EEG activity will be monitored and compared with the resting levels that are currently being recorded
  • "OK," says Nick. "Let's get the show on the road." He disappears behind us, leaving Andy and me merrily soaking up the incontinence ad. Results reveal later that, at this point, as we wait for something to happen, our physiological output readings are actually pretty similar. Our pulse rates are significantly higher than our normal resting levels, in anticipation of what's to come. But with the change of scene, an override switch flips somewhere in Andy's brain. And the ice-cold Special Forces soldier suddenly swings into action. As vivid, florid images of dismemberment, mutilation, torture, and execution flash up on the screen in front of us (so vivid, in fact, that Andy later confesses to actually being able to "smell" the blood: a "kind of sickly-sweet smell that you never, ever forget"), accompanied not by the ambient spa music of before but by blaring sirens and hissing white noise, his physiological readings start slipping into reverse. His pulse rate begins to slow. His GSR begins to drop, his EEG to quickly and dramatically attenuate. In fact, by the time the show is over, all three of Andy's physiological output measures are pooling below his baseline.
  • Nick has seen nothing like it. "It's almost as if he was gearing himself up for the challenge," he says. "And then, when the challenge eventually presented itself, his brain suddenly responded by injecting liquid nitrogen into his veins. Suddenly implemented a blanket neural cull of all surplus feral emotion. Suddenly locked down into a hypnotically deep code red of extreme and ruthless focus." He shakes his head, nonplused. "If I hadn't recorded those readings myself, I'm not sure I would have believed them," he continues. "OK, I've never tested Special Forces before. And maybe you'd expect a slight attenuation in response. But this guy was in total and utter control of the situation. So tuned in, it looked like he'd completely tuned out."
  • My physiological output readings, in contrast, went through the roof. Exactly like Andy's, they were well above baseline as I'd waited for the carnage to commence. But that's where the similarity ended. Rather than go down in the heat of battle, in the midst of the blood and guts, mine had appreciated exponentially. "At least it shows that the equipment is working properly," comments Nick. "And that you're a normal human being."
  • TMS can't penetrate far enough into the brain to reach the emotion and moral-reasoning precincts directly. But by damping down or turning up the regions of the cerebral cortex that have links with such areas, it can simulate the effects of deeper, more incursive influence.
  • Before the experiment, I'd been curious about the time scale: how long it would take me to begin to feel the rush. Now I had the answer: about 10 to 15 minutes. The same amount of time, I guess, that it would take most people to get a buzz out of a beer or a glass of wine.
  • The effects aren't entirely dissimilar. An easy, airy confidence. A transcendental loosening of inhibition. The inchoate stirrings of a subjective moral swagger: the encroaching, and somehow strangely spiritual, realization that hell, who gives a s---, anyway? There is, however, one notable exception. One glaring, unmistakable difference between this and the effects of alcohol. That's the lack of attendant sluggishness. The enhancement of attentional acuity and sharpness. An insuperable feeling of heightened, polished awareness. Sure, my conscience certainly feels like it's on ice, and my anxieties drowned with a half-dozen shots of transcranial magnetic Jack Daniel's. But, at the same time, my whole way of being feels as if it's been sumptuously spring-cleaned with light. My soul, or whatever you want to call it, immersed in a spiritual dishwasher.
  • So this, I think to myself, is how it feels to be a psychopath. To cruise through life knowing that no matter what you say or do, guilt, remorse, shame, pity, fear—all those familiar, everyday warning signals that might normally light up on your psychological dashboard—no longer trouble you.
  • I suddenly get a flash of insight. We talk about gender. We talk about class. We talk about color. And intelligence. And creed. But the most fundamental difference between one individual and another must surely be that of the presence, or absence, of conscience. Conscience is what hurts when everything else feels good. But what if it's as tough as old boots? What if one's conscience has an infinite, unlimited pain threshold and doesn't bat an eye when others are screaming in agony?
Javier E

What's the secret to learning a second language? - Salon.com - 0 views

  • “Arabic is a language of memorization,” he said. “You just have to drill the words into your head, which unfortunately takes a lot of time.” He thought, “How can I maximize the number of words I learn in the minimum amount of time?”
  • Siebert started studying the science of memory and second-language acquisition and found two concepts that went hand in hand to make learning easier: selective learning and spaced repetition. With selective learning, you spend more time on the things you don’t know, rather than on the things you already do
  • Siebert designed his software to use spaced repetition. If you get cup right, the program will make the interval between seeing the word cup longer and longer, but it will cycle cup back in just when you’re about to forget it. If you’ve forgotten cup entirely, the cycle starts again. This system moves the words from your brain’s short-term memory into long-term memory and maximizes the number of words you can learn effectively in a period. You don’t have to cram
  • ...8 more annotations...
  • ARABIC IS ONE of the languages the U.S. Department of State dubs “extremely hard.” Chinese, Japanese, and Korean are the others. These languages’ structures are vastly different from that of English, and they are memorization-driven.
  • To help meet its language-learning goals, in 2003 the Department of Defense established the University of Maryland Center for Advanced Study of Language.
  • MICHAEL GEISLER, a vice president at Middlebury College, which runs the foremost language-immersion school in the country, was blunt: “The drill-and-kill approach we used 20 years ago doesn’t work.” He added, “The typical approach that most programs take these days—Rosetta Stone is one example—is scripted dialogue and picture association. You have a picture of the Eiffel Tower, and you have a sentence to go with it. But that’s not going to teach you the language.”
  • According to Geisler, you need four things to learn a language. First, you have to use it. Second, you have to use it for a purpose. Research shows that doing something while learning a language—preparing a cooking demonstration, creating an art project, putting on a play—stimulates an exchange of meaning that goes beyond using the language for the sake of learning it. Third, you have to use the language in context. This is where Geisler says all programs have fallen short.
  • Fourth, you have to use language in interaction with others. In a 2009 study led by Andrew Meltzoff at the University of Washington, researchers found that young children easily learned a second language from live human interaction while playing and reading books. But audio and DVD approaches with the same material, without the live interaction, fostered no learning progress at all. Two people in conversation constantly give each other feedback that can be used to make changes in how they respond.
  • “Our research shows that the ideal model is a blended one,” one that blends technology and a teacher. “Our latest research shows that with the proper use of technology and cognitive neuroscience, we can make language learning more efficient.”
  • The school released its first two online programs, for French and Spanish, last year. The new courses use computer avatars for virtual collaboration; rich video of authentic, unscripted conversations with native speakers; and 3-D role-playing games in which students explore life in a city square, acting as servers and taking orders from customers in a café setting. The goal at the end of the day, as Geisler put it, is for you to “actually be able to interact with a native speaker in his own language and have him understand you, understand him, and, critically, negotiate when you don’t understand what he is saying.” 
  • The program includes the usual vocabulary lists and lessons in how to conjugate verbs, but students are also consistently immersed in images, audio, and video of people from different countries speaking with different accents. Access to actual teachers is another critical component.
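The selective-learning and spaced-repetition scheme Siebert describes lends itself to a short sketch. This is a minimal illustration under assumptions of my own – the doubling interval and the `Card` fields and function names are invented for the example, not taken from his software:

```python
from dataclasses import dataclass

@dataclass
class Card:
    word: str
    interval: int = 1  # days until the next review
    due: int = 0       # day index when the card comes up again

def review(card: Card, correct: bool, today: int) -> None:
    # A correct answer stretches the gap; forgetting restarts the cycle.
    card.interval = card.interval * 2 if correct else 1
    card.due = today + card.interval

def due_today(deck, today):
    # Selective learning: drill only the words you are about to forget.
    return [card for card in deck if card.due <= today]

cup = Card("cup")
review(cup, correct=True, today=0)   # next seen on day 2
review(cup, correct=True, today=2)   # then on day 6
review(cup, correct=False, today=6)  # forgotten entirely: back to day 7
```

Each correct recall pushes `cup` further into the future, moving it toward long-term memory; a miss resets the cycle, exactly the behaviour described in the annotation above.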
Javier E

The Leadership Revival - NYTimes.com - 0 views

  • take a reality bath. Go off and become a stranger in a strange land. Go off to some alien part of this country or the world. Immerse yourself in the habits and daily patterns of that existence and stay there long enough to get acculturated. Stay there long enough so that you forget the herd mentality of our partisan culture.
  • When you return home, you will look at your own place with foreign eyes. You’ll see the contours of your own reality more clearly. When you return to native ground, you’re more likely to possess the sort of perceptiveness that Isaiah Berlin says is the basis of political judgment.
  • This sort of wisdom consists of “a special sensitiveness to the contours of the circumstances in which we happen to be placed; it is a capacity for living without falling foul of some permanent condition or factor which cannot be either altered, or even fully described.” This wisdom is based on a tactile awareness of your country and its people — what they want, how they react. You don’t think this awareness. You feel it. You experience a visceral oneness with culture and circumstance — the smell of the street, tinges of anger and hope and aspiration. The irony is that you are more likely to come into union with your own home culture after you have been away from it.
Javier E

Big Think Interview With Nicholas Carr | Nicholas Carr | Big Think - 0 views

  • Neurologically, how does our brain adapt itself to new technologies? Nicholas Carr: A couple of types of adaptations take place in your brain. One is a strengthening of the synaptical connections between the neurons involved in using that instrument, in using that tool. And basically these are chemical – neural chemical – changes. So you know, cells in our brain communicate by transmitting electrical signals between them, and those electrical signals are actually activated by the exchange of chemicals, neurotransmitters, in our synapses. And so when you begin to use a tool, for instance, you have much stronger electrochemical signals being processed through those synaptical connections. And then the second, and even more interesting, adaptation is in actual physical changes, anatomical changes. You may grow new neurons that are then recruited into these circuits, or your existing neurons may grow new synaptical terminals. And again, that also serves to strengthen the activity in those particular pathways that are being used – new pathways. On the other hand, you know, the brain likes to be efficient, and so even as it's strengthening the pathways you're exercising, it's weakening the connections between the cells that supported old ways of thinking or working or behaving that you're not exercising so much.
  • And it was only in around the year 800 or 900 that we saw the introduction of word spaces. And suddenly reading became, in a sense, easier, and suddenly you had the arrival of silent reading, which changed the act of reading from just a transcription of speech to something that every individual did on their own. And suddenly you had this whole ideal of the silent, solitary reader who was improving their mind, expanding their horizons, and so forth. And when Gutenberg invented the printing press around 1450, what that served to do was take this new very attentive, very deep form of reading, which had been limited to just, you know, monasteries and universities, and, by making books much cheaper and much more available, spread that way of reading out to a much larger audience. And so we saw, for the last 500 years or so, that one of the central facts of culture was deep, solitary reading.
  • What the book does as a technology is shield us from distraction. The only thing going on is, you know, the progression of words and sentences across page after page, and so suddenly we see this immersive, very attentive kind of thinking, whether you are paying attention to a story or to an argument, or whatever. And what we know about the brain is that the brain adapts to these types of tools.
  • ...12 more annotations...
  • we adapt to the environment of the internet, which is an environment of kind of constant immersion and information and constant distractions, interruptions, juggling lots of messages, lots of bits of information.
  • Because it’s no longer just a matter of personal choice, of personal discipline, though obviously those things are always important. What we’re seeing – and we see this over and over again in the history of technology – is that the technology of the web, the technology of digital media, gets entwined very, very deeply into social processes, into expectations. So more and more, for instance, in our work lives. You know, if our boss and all our colleagues are constantly exchanging messages, constantly checking email on their Blackberry or iPhone or their Droid or whatever, then it becomes very difficult to say, I’m not going to be as connected, because you feel like your career is going to take a hit.
  • With the transfer now of text more and more onto screens, we see, I think, a new and in some ways more primitive way of reading. In order to take in information off a screen, when you are also being bombarded with all sorts of other information, and when there are links in the text where you have to think, even for just a fraction of a second, you know, do I click on this link or not – suddenly reading again becomes a more cognitively intensive act, the way it was back when there were no spaces between words.
  • If all your friends are planning their social lives through texts and Facebook and Twitter and so forth, then to back away from that means to feel socially isolated. And of course for all people, particularly for young people, there’s kind of nothing worse than feeling socially isolated – that your friends are, you know, having these conversations and you’re not involved. So it’s easy to say the solution, which is to, you know, become a little bit more disconnected. What’s hard is actually doing that.
  • if you want to change your brain, you change your habits. You change your habits of thinking. And that means, you know, setting aside time to engage in more contemplative, more reflective ways of thinking, to be – to screen out distractions. And that means retreating from digital media and from the web and from Smart Phones and texting and Facebook and Tweeting and everything else.
  • The Thinker was, you know, in a contemplative pose and was concentrating deeply, and wasn’t, you know, multi-tasking. And because that is something that, until recently anyway, people always thought was the deepest and most distinctly human way of thinking.
  • we may end up finding that those are actually the most valuable ways of thinking that are available to us as human beings.
  • the ability to pay attention also is very important for our ability to build memories, to transfer information from our short-term memory to our long-term memory. And only when we do that do we weave new information into everything else we have stored in our brains. All the other facts we’ve learned, all the other experiences we’ve had, emotions we’ve felt. And that’s how you build, I think, a rich intellect and a rich intellectual life.
  • On the other hand, there is a cost. We lose – we begin to lose – the faculties that we don’t exercise. So adaptation has a very, very positive side, but also a potentially negative side, because ultimately our brain is qualitatively neutral. It doesn’t care what it’s strengthening or what it’s weakening; it just responds to the way we’re exercising our mind.
  • the book in some ways is the most interesting from our own present standpoint, particularly when we want to think about the way the internet is changing us. It’s interesting to think about how the book changed us.
  • So we become, after the arrival of the printing press in general, more attentive more attuned to contemplative ways of thinking. And that’s a very unnatural way of using our mind. You know, paying attention, filtering out distractions.
  • what we lose is the ability to pay deep attention to one thing for a sustained period of time, to filter out distractions.
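Carr's use-it-or-lose-it account of plasticity – exercised pathways strengthen while neglected ones fade – can be caricatured in a toy model. The gain and decay numbers here are made up for illustration; this is a metaphor in code, not neuroscience:

```python
def update_weights(weights, used, gain=0.10, decay=0.02):
    # Strengthen the pathways exercised this session; let idle ones weaken.
    return {path: w * (1 + gain if path in used else 1 - decay)
            for path, w in weights.items()}

weights = {"deep_reading": 1.0, "skimming": 1.0}
for _ in range(50):  # fifty sessions of distracted browsing
    weights = update_weights(weights, used={"skimming"})
# "skimming" is now strongly reinforced; "deep_reading" has quietly atrophied
```

The point of the sketch is the asymmetry Carr stresses: the decay happens passively, simply because one mode of attention is never exercised.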
Javier E

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P... - 0 views

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • ...97 more annotations...
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. — By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. — They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free — — and therefore I’m responsible for everything I do, a dizzying fact which causes — an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. — Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. — Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating. —
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their views to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations.
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.
Javier E

The State of the Short Story - 0 views

  • stories demand the focus of bedtime. They're meant to be read straight through from beginning to end. You can’t read a story and multitask. And they tend to arise from nighttime concerns—either the four o’clock in the morning awareness that you’ve taken a wrong turn, and will spend your life paying the price (see Chekhov, passim) or else the lure of some dangerous path, which holds the promise of salvation and the risk of ruin. Nightmare stuff, dream stuff
  • Short stories bring you up short. They demand a wakeful attention; a good one keeps you thinking when it’s over. They take the subjects of the night and expose them to the bright light of day. They run counter to our yearnings for immersion, companionship, distraction … and for all of these reasons, in my mind they’ve come to stand for a kind of difficulty, emotional difficulty, that we are in danger of losing when we fetishize the charms of the long novel.
  • There is a time for multi-tasking and a time for losing yourself. The short story offers something else: a chance to pay close attention -- and have that attention rewarded because, for once, every little plot twist, every sentence, counts.
Javier E

Welcome to Google Island | Gadget Lab | Wired.com - 0 views

  • As soon as you hit Google’s territorial waters, you came under our jurisdiction, our terms of service. Our laws–or lack thereof–apply here. By boarding our self-driving boat you granted us the right to all feedback you provide during your journey. This includes the chemical composition of your sweat.
  • Unified logins let us get to know our audience in ways we never could before. They gave us their locations so that we might better tell them if it was raining outside. They told us where they lived and where they wanted to go so that we could deliver a more immersive map that better anticipated what they wanted to do–it let us very literally tell people what they should do today. As people began to see how very useful Google Now was, they began to give us even more information. They told us to dig through their e-mail for their boarding passes–Imagine if you had to find it on your own!–they finally gave us permission to track and store their search and web history so that we could give them better and better Cards. And then there is the imaging. They gave us tens of thousands of pictures of themselves so that we could pick the best ones–yes we appealed to their vanity to do this: We’ll make you look better and assure you present a smiling, wrinkle-free face to the world–but it allowed us to also stitch together three-dimensional representations. Hangout chats let us know who everybody’s friends were, and what they had to say to them. Verbal searches gave us our users’ voices. These were intermediary steps. But it let us know where people were at all times, what they thought, what they said, and of course how they looked. Sure, Google Now could tell you what to do.
  • “We learned so much about regulation with Google Health. It turns out, the government has rules about health records, and that people care about these rules for some reason. So we began looking around for ways to avoid regulation. For example, government regulation meant it was much easier to experiment with white space in Kenya than in the United States. So we started thinking: What if the entire world looked more like Kenya? Or, even better, Somalia? Places where there are no laws. We haven’t adapted mechanisms to deal with some of our old institutions like the law. We aren’t keeping up with the rate of change we caused through technology. If you look at the laws we have, they’re very old. A law can’t be right if it’s 50 years old. Like, it’s before the Internet
  • I don’t want this,” I stammered, removing the glasses. “Sure you do, you just aren’t aware of that yet. For many years now, we’ve looked at everything you’ve looked at online. Everything. We know what you want, and when you want it, down to the time of day. Why wait for you to request it? And in fact, why wait for you to discover that you even want to request it? We can just serve it to you.”
  • “These are Google Spiders. They’ve crawled the entire island, and now we’re ready to release them globally. We’re sending them everywhere, so that we can make a 3D representation of the entire planet, and everyone on it. We aren’t just going to recreate the planet, though–we’re going to make it better.” “Governments are too focused on democracy and rule of law. On Google Island, we’ve found those things to be distractions. If democracy worked so well, if a majority public opinion made something right, we would still have Jim Crow laws and Google Reader. We believe we can fix the world’s problems with better math. We can tear down the old and rebuild it with the new. Imagine Minecraft. Now imagine it photorealistic, and now imagine yourself living there, or at least, your Google Being living there. We already have the information. All we need is an invitation. This is the inevitable and logical end point of Google Island: a new Google Earth.”
Javier E

Vikings' Struggles Come to Life in History Channel's Series - NYTimes.com - 0 views

  • Propelled by the tale of the legendary Norse adventurer Ragnar Lothbrok, his family and his band of followers, the lushly produced, effects-enhanced series dazzles with evocative scenery and dynamic displays of superherolike derring-do and physical stamina.
  • Mr. Hirst immersed himself in what had been written about Viking culture — basically documentation by outside observers since theirs was an illiterate society. He found the material limited and biased.
  • “They’re always the guys who break in through the door, slash up your house and rape and pillage for no good reason, except that they enjoy the violence,” he said. “I wanted to tell the story from the Vikings’ point of view, because their history was written by Christian monks, basically, whose job it was to exaggerate their violence.”
  • Despite History’s mantle of preserving and purveying an accurate picture of the past, hewing to the letter of historical accuracy wasn’t possible in the case of a dramatic series based on fragmented documentation, hence a large degree of dramatic license was employed.
  • “I especially had to take liberties with ‘Vikings’ because no one knows for sure what happened in the Dark Ages,” Mr. Hirst said. “Very little was written then.” The bottom line, he explained, was: “We want people to watch it. A historical account of the Vikings would reach hundreds, occasionally thousands, of people. Here we’ve got to reach millions.”
  • he was hard put to replicate authentic fabrics and woods. One of the biggest challenges he faced, he added, was improvising lighting sources for Viking homes and halls, which had no windows, making engaging photography of a strictly realistic interior setting impossible.
Emily Horwitz

Ah, Wilderness! Nature Hike Could Unlock Your Imagination : Shots - Health News : NPR - 0 views

  • Want to be more creative? Drop that iPad and head to the great outdoors.
  • David Strayer, a cognitive neuroscientist in Utah
  • sent students out into nature with computers, to test their attention spans. "It was an abysmal failure."
  • students didn't want to be anywhere near the computers
  • pencil-and-paper creativity quiz, the so-called Remote Associates Test, which asks people to identify word associations that aren't immediately obvious. Four days into the trip, they took the test again
  • 45 percent improvement.
  • hardly scientific.
  • took the test four days into the wilderness did 50 percent better than those who were still immersed in modern life
  • Outward Bound is notoriously strict about bringing artifacts of modern life into the wilderness.
  • Half of the 56 hikers took the test before going backpacking in the wilderness, and the other half took the RAT test on the fourth day of their trip. The groups went into the wild in Alaska, Colorado, Maine and Washington.
  • the hikers had already taken the test once.
  • exposure to nature over a number of days, which has been shown in other studies to improve thinking
  • exercise
  • abandoning electronic devices
  • constant texting and checking in on Facebook are not making us think more clearly.
  • An interesting connection between being in nature and being creative and mentally present.
Javier E

How to Fall in Love With Math - NYTimes.com - 3 views

  • EACH time I hear someone say, “Do the math,” I grit my teeth.
  • Imagine, if you will, using “Do the lit” as an exhortation to spell correctly.
  • my field is really about ideas above anything else. Ideas that inform our existence, that permeate our universe and beyond, that can surprise and enthrall.
  • Think of it this way: you can appreciate art without acquiring the ability to paint, or enjoy a symphony without being able to read music. Math also deserves to be enjoyed for its own sake, without being constantly subjected to the question, “When will I use this?”
  • In schools, as I’ve heard several teachers lament, the opportunity to immerse students in interesting mathematical ideas is usually jettisoned to make more time for testing and arithmetic drills.
  • Keith Devlin argues in his book “The Math Gene,” human beings are wired for mathematics. At some level, perhaps we all crave it.
  • So what math ideas can be appreciated without calculation or formulas? One candidate that I’ve found intrigues people is the origin of numbers. Think of it as a magic trick: harnessing emptiness to create the number zero, then demonstrating how from any whole number, one can create its successor.
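The "magic trick" in the last annotation, building numbers out of pure emptiness, matches the standard von Neumann construction: zero is the empty set, and each successor is the set of everything built so far. A minimal sketch (the construction is textbook set theory; the code is only an illustration):

```python
# Von Neumann construction: 0 = {} and successor(n) = n ∪ {n}.
# frozenset is used because set elements must be hashable.
def zero():
    return frozenset()

def successor(n):
    return n | {n}  # n ∪ {n}

# Build 0, 1, 2, 3 from "emptiness" alone.
numbers = [zero()]
for _ in range(3):
    numbers.append(successor(numbers[-1]))

# Each number, viewed as a set, contains exactly that many elements.
assert [len(n) for n in numbers] == [0, 1, 2, 3]
```

No calculation or formula is needed to appreciate the idea, which is exactly the article's point: every whole number unfolds from nothing but the empty set and one repeatable step.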
Javier E

What Is College For? (Part 2) - NYTimes.com - 0 views

  • How, exactly, does college prepare students for the workplace? For most jobs, it provides basic intellectual skills: the ability to understand relatively complex instructions, to write and speak clearly and cogently, to evaluate options critically. Beyond these intellectual skills, earning a college degree shows that you have the “moral qualities” needed for most jobs: you have (to put it a bit cynically), for a period of four years and with relatively little supervision, deferred to authority, met deadlines and carried out difficult tasks even when you found them pointless and boring.
  • This sort of intellectual and moral training, however, does not require studying with experts doing cutting-edge work on, say, Homeric poetry, elementary particle theory or the philosophy of Kant. It does not, that is, require the immersion in the world of intellectual culture that a college faculty is designed to provide. It is, rather, the sort of training that ought to result from good elementary and high school education.
  • students graduating from high school should, to cite one plausible model, be able to read with understanding classic literature (from, say, Austen and Browning to Whitman and Hemingway) and write well-organized and grammatically sound essays; they should know the basic outlines of American and European history, have a good beginner’s grasp of at least two natural sciences as well as pre-calculus mathematics, along with a grounding in a foreign language.
  • Is it really possible to improve grade school and high school teaching to the level I’m suggesting? Yes, provided we employ the same sort of selection criteria for pre-college teachers as we do for other professionals such as doctors, lawyers and college professors. In contrast to other professions, teaching is not now the domain of the most successful students — quite the contrary. I’ve known many very bright students who had an initial interest in such teaching but soon realized that there is no comparison in terms of salary, prestige and working conditions.
  • Given this transformation in pre-college education, we could expect it to provide basic job-training for most students. At that point, we would still face a fundamental choice regarding higher education. We could see it as a highly restricted enterprise, educating only professionals who require advanced specialized skills. Correspondingly, only such professionals would have access to higher education as a locus of intellectual culture.
  • On the other hand, we could — as I would urge — see college as the entrée to intellectual culture for everyone who is capable of and interested in working at that level of intellectual engagement
  • Raising high school to the level I am proposing and opening college to everyone who will profit from it would be an expensive enterprise. We would need significant government support to ensure that all students receive an education commensurate with their abilities and aspirations, regardless of family resources. But the intellectual culture of our citizens should be a primary part of our national well-being, not just the predilection of an eccentric elite. As such, it should be among our highest priorities.
Emily Freilich

What Happens When A Language's Last Monolingual Speaker Dies? : Code Switch : NPR - 1 views

  • “This is a sad day for all Chickasaw people because we have lost a cherished member of our Chickasaw family and an unequaled source of knowledge about our language and culture.”
  • Dickerson didn't learn another language because, Hinson says, she didn't need English. She was from a traditional community, Kali-Homma', and didn't work in a wage economy.
  • Experts say the rest of the 65 Chickasaw speakers, all of whom are bilingual, might be a big enough pool to preserve the language.
  • "What's important in Chickasaw is quite different than [what's important] in English. ... For her, she saw a world from a Chickasaw worldview, without the interference of English at all."
  • Hinson's program tries to counter further erosion of Chickasaw by offering language immersion programs — for both kids and adults. Tools, including an iPhone app and a stream of videos, make the language accessible to anyone,
Javier E

Grand Tour of the Self - NYTimes.com - 1 views

  • selfie sticks, the latest and most obnoxious tool in the kit of digital narcissism.
  • viewing the world through a selfie stick is like skiing in that artificial snow park in Dubai. It further isolates and cocoons the visitor inside a zone of self-projected experience.
  • these elongated facial recorders are all the rage among travelers. “Like it or not,” a recent post on BuzzFeed reported, “everyone is going to be wielding a selfie stick.”
  • when technology changes the travel experience itself — from immersion and surprise to documentary one-upmanship — it defeats the point of the journey. We travel to freshen senses dulled by routine. We travel for discovery and reinvention.
  • a park ranger in Washington State told me about a group of kids trying to get a fix on 500-year-old trees at the lower elevation of Mount Rainier. They could not fully fathom what they were experiencing, he said, until they could filter it through their phones — as pictures or Wikipedia definitions. Nature deficit disorder, so called, is a symptom of being connected to everything, while being unable to connect to anything.
Javier E

College for Grown-Ups - NYTimes.com - 0 views

  • If we were starting from zero, we probably wouldn’t design colleges as age-segregated playgrounds in which teenagers and very young adults are given free rein to spend their time more or less as they choose. Yet this is the reality.
  • Rethinking the expectation that applicants to selective colleges be fresh out of high school would go far in reducing risk for young people while better protecting everyone’s college investment. Some of this rethinking is already underway. Temporarily delaying college for a year or two after high school is now becoming respectable among the admissions gatekeepers at top schools. Massive online open courses (MOOCs) and other forms of online learning make it possible to experience fragments of an elite education at little or no cost.
  • people are tinkering further with conventional campus models. The Minerva Project, a San Francisco start-up with offices two blocks from Twitter, offers classic seminar-style college courses via a sophisticated interactive online learning platform and accompanies them with residencies in cities all over the world. Nearby in the SoMa district, Dev Bootcamp, a 19-week immersive curriculum that trains people of all ages for jobs in the tech industry, is a popular alternative. Some successfully employed graduates brag of bypassing college altogether.
  • At Stanford, where I teach, an idea still in the concept phase developed by a student-led team in the university’s Hasso Plattner Institute of Design calls for the replacement of four consecutive college years in young adulthood with multiple residencies distributed over a lifetime. What the designers call Open Loop University would grant students admitted to Stanford multiple years to spend on campus, along with advisers to help them time those years strategically in light of their personal development and career ambitions. Today’s arbitrarily segregated world of teenagers and young adults would become an ever-replenished intergenerational community of purposeful learners.
  • the status quo is not sustainable. Unrelenting demand for better-educated workers, rapidly developing technological capacity to support learning digitally and the soaring costs of conventional campus life are driving us toward substantial change.
Javier E

How to Be French - NYTimes.com - 0 views

  • I’m pursuing French citizenship. The whole procedure can take years. Amid repeated requests for new documents, some would-be French people just give up.
  • This may be by design. “The difficulty of the ordeal seems a means of testing the authenticity of his/her commitment to the project of becoming French,” the sociologists Didier Fassin and Sarah Mazouz concluded in their 2009 paper “What Is It to Become French?” Officials can reject an applicant because he hasn’t adopted French values, or merely because his request isn’t “opportune.”
  • There’s a long tradition of Frenchification here. Napoleon Bonaparte was born Napoleone di Buonaparte and spoke French with a thick Corsican accent. He and others spent the 19th century transforming France from a nation with a patchwork of regional languages and dialects to one where practically everyone spoke proper French.
  • Schools were their main instrument. French schools follow a national curriculum that includes arduous surveys of French philosophy and literature. Frenchmen then spend the rest of their lives quoting Proust to one another, with hardly anyone else catching the references.
  • Even the rituals of friendship are different here. The Canadian writer Jean-Benoît Nadeau, who just spent a year in Paris, says there are clues that a French person wants to befriend you: She tells you about her family; she uses self-deprecating humor; and she admits that she likes her job. There’s also the fact that she speaks to you at all. Unlike North Americans, “the French have no compunction about not talking to you.”
  • Apparently, being a Parisian woman has its own requirements. The new book “How to Be Parisian Wherever You Are” says Parisiennes are “imperfect, vague, unreliable and full of paradoxes” and have “that typically French enthusiasm for transforming life into fiction.” I need to cultivate an “air of fragility,” too.
  • Apparently nobody expects me to achieve a state of inner Frenchness. At a naturalization ceremony that the two sociologists observed, an official told new citizens that they were granted French nationality because they had assimilated “not to the point where you entirely resemble native French people, yet enough so that you feel at ease among us.”
Javier E

Opinion | What Do We Actually Know About the Economy? (Wonkish) - The New York Times - 0 views

  • Among economists more generally, a lot of the criticism seems to amount to the view that macroeconomics is bunk, and that we should stick to microeconomics, which is the real, solid stuff. As I’ll explain in a moment, that’s all wrong
  • in an important sense the past decade has been a huge validation for textbook macroeconomics; meanwhile, the exaltation of micro as the only “real” economics both gives microeconomics too much credit and is largely responsible for the ways macroeconomic theory has gone wrong.
  • Finally, many outsiders and some insiders have concluded from the crisis that economic theory in general is bunk, that we should take guidance from people immersed in the real world – say, business leaders — and/or concentrate on empirical results and skip the models
  • And while empirical evidence is important and we need more of it, the data almost never speak for themselves – a point amply illustrated by recent monetary events.
  • Schwinger, as I remember the story, was never seen to use a Feynman diagram. But he had a locked room in his house, and the rumor was that that room was where he kept the Feynman diagrams he used in secret.
  • What’s the equivalent of Feynman diagrams? Something like IS-LM, which is the simplest model you can write down of how interest rates and output are jointly determined, and is how most practicing macroeconomists actually think about short-run economic fluctuations. It’s also how they talk about macroeconomics to each other. But it’s not what they put in their papers, because the journals demand that your model have “microfoundations.”
  • The Bernanke Fed massively expanded the monetary base, by a factor of almost five. There were dire warnings that this would cause inflation and “debase the dollar.” But prices went nowhere, and not much happened to broader monetary aggregates (a result that, weirdly, some economists seemed to find deeply puzzling even though it was exactly what should have been expected.)
  • What about fiscal policy? Traditional macro said that at the zero lower bound there would be no crowding out – that deficits wouldn’t drive up interest rates, and that fiscal multipliers would be larger than under normal conditions. The first of these predictions was obviously borne out, as rates stayed low even when deficits were very large. The second prediction is a bit harder to test, for reasons I’ll get into when I talk about the limits of empiricism. But the evidence does indeed suggest large positive multipliers.
  • The overall story, then, is one of overwhelming predictive success. Basic, old-fashioned macroeconomics didn’t fail in the crisis – it worked extremely well
  • In fact, it’s hard to think of any other example of economic models working this well – making predictions that most non-economists (and some economists) refused to believe, indeed found implausible, but which came true. Where, for example, can you find any comparable successes in microeconomics?
  • Meanwhile, the demand that macro become ever more rigorous in the narrow, misguided sense that it look like micro led to useful approaches being locked up in Schwinger’s back room, and in all too many cases forgotten. When the crisis struck, it was amazing how many successful academics turned out not to know things every economist would have known in 1970, and indeed resurrected 1930-vintage fallacies in the belief that they were profound insights.
  • mainly I think it reflected the general unwillingness of human beings (a category that includes many though not necessarily all economists) to believe that so many people can be so wrong about something so big.
  • To normal human beings the study of international trade and that of international macroeconomics might sound like pretty much the same thing. In reality, however, the two fields used very different models, had very different intellectual cultures, and tended to look down on each other. Trade people tended to consider international macro people semi-charlatans, doing ad hoc stuff devoid of rigor. International macro people considered trade people boring, obsessed with proving theorems and offering little of real-world use.
  • does microeconomics really deserve its reputation of moral and intellectual superiority? No
  • Even before the rise of behavioral economics, any halfway self-aware economist realized that utility maximization – indeed, the very concept of utility – wasn't a fact about the world; it was more of a thought experiment, whose conclusions should always have been stated in the subjunctive.
  • Kahneman and Tversky and Thaler and so on deserved all the honors they received for helping to document the specific ways in which utility maximization falls short, but even before their work we should never have expected perfect maximization to be a good description of reality.
  • True, a model doesn’t have to be perfect to provide hugely important insights. But here’s my question: where are the examples of microeconomic theory providing strong, counterintuitive, successful predictions on the same order as the success of IS-LM macroeconomics after 2008? Maybe there are some, but I can’t come up with any.
  • The point is not that micro theory is useless and we should stop doing it. But it doesn’t deserve to be seen as superior to macro modeling.
  • And the effort to make macro more and more like micro – to ground everything in rational behavior – has to be seen now as destructive. True, that effort did lead to some strong predictions: e.g., only unanticipated money should affect real output, transitory income changes shouldn’t affect consumer spending, government spending should crowd out private demand, etc. But all of those predictions have turned out to be wrong.
  • But, you say, we didn’t see the Great Recession coming. Well, what do you mean “we,” white man? OK, what’s true is that few economists realized that there was a huge housing bubble
  • But data never speak for themselves, for a couple of reasons. One, which is familiar, is that economists don’t get to do many experiments, and natural experiments are rare
  • The other problem is that even when we do get something like natural experiments, they often took place under economic regimes that aren’t relevant to current problems.
  • Both of these problems were extremely relevant in the years following the 2008 crisis.
  • you might be tempted to conclude that the empirical evidence is that monetary expansion is inflationary, indeed roughly one-for-one.
  • But the question, as the Fed embarked on quantitative easing, was what effect this would have on an economy at the zero lower bound. And while there were many historical examples of big monetary expansion, examples at the ZLB were much rarer – in fact, basically two: the U.S. in the 1930s and Japan in the early 2000s
  • These examples told a very different story: that expansion would not, in fact, be inflationary – which is exactly how it worked out.
  • The point is that empirical evidence can only do certain things. It can certainly prove that your theory is wrong! And it can also make a theory much more persuasive in those cases where the theory makes surprising predictions, which the data bear out. But the data can never absolve you from the necessity of having theories.
  • Over this past decade, I’ve watched a number of economists try to argue from authority: I am a famous professor, therefore you should believe what I say. This never ends well. I’ve also seen a lot of nihilism: economists don’t know anything, and we should tear the field down and start over.
  • Obviously I differ with both views. Economists haven’t earned the right to be snooty and superior, especially if their reputation comes from the ability to do hard math: hard math has been remarkably little help lately, if ever.
  • On the other hand, economists do turn out to know quite a lot: they do have some extremely useful models, usually pretty simple ones, that have stood up well in the face of evidence and events. And they definitely shouldn't defer to important and/or rich people on policy.
  • Compare Janet Yellen's macroeconomic track record with that of the multiple billionaires who warned that Bernanke would debase the dollar. Or take my favorite Business Week headline from 2010: "Krugman or [John] Paulson: Who You Gonna Bet On?" Um. The important thing is to be aware of what we do know, and why.
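The macro claims threaded through these excerpts – monetary expansion raises output away from the zero lower bound but is impotent at it, while fiscal policy gets a full multiplier with no crowding out there – can be illustrated with a toy IS-LM model. This is a hedged sketch, not the article's model: the functional forms and every parameter value (`c0`, `c1`, `I0`, `b`, `k`, `h`) are invented for illustration.

```python
def equilibrium(M, G, c0=20.0, c1=0.6, I0=30.0, b=40.0, k=0.5, h=100.0, T=0.0):
    """Jointly solve a textbook IS curve, Y = c0 + c1*(Y - T) + I0 - b*r + G,
    and LM curve, M = k*Y - h*r, imposing the zero lower bound r >= 0.
    Returns the equilibrium (Y, r). All parameters are illustrative."""
    # Interior solution: substitute r = (k*Y - M)/h from LM into IS and solve for Y.
    Y = (c0 - c1 * T + I0 + G + b * M / h) / (1 - c1 + b * k / h)
    r = (k * Y - M) / h
    if r < 0:
        # ZLB binds: money demand is satiated at r = 0, the LM curve drops out,
        # and output is pinned down by IS alone - the full-multiplier case.
        r = 0.0
        Y = (c0 - c1 * T + I0 + G) / (1 - c1)
    return Y, r

# Away from the ZLB, more money lowers r and raises Y.
Y_low, r_low = equilibrium(M=10, G=0)    # r is positive here
Y_mid, _ = equilibrium(M=20, G=0)        # Y_mid > Y_low

# At the ZLB, doubling the money supply changes nothing...
Y_zlb, r_zlb = equilibrium(M=100, G=0)   # r_zlb == 0
Y_zlb2, _ = equilibrium(M=200, G=0)      # Y_zlb2 == Y_zlb

# ...but government spending raises Y by the full multiplier 1/(1 - c1) = 2.5.
Y_fiscal, _ = equilibrium(M=100, G=10)   # Y_fiscal - Y_zlb = 25
```

With these numbers the fiscal multiplier away from the ZLB is 1/(1 - c1 + b*k/h) ≈ 1.67 (partial crowding out through a rising interest rate), but at the ZLB it rises to 1/(1 - c1) = 2.5 – the qualitative pattern the excerpts attribute to basic, old-fashioned macroeconomics.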